Combining Simulation Tools for End-to-End Trajectory Optimization
NASA Technical Reports Server (NTRS)
Whitley, Ryan; Gutkowski, Jeffrey; Craig, Scott; Dawn, Tim; Williams, Jacob; Stein, William B.; Litton, Daniel; Lugo, Rafael; Qu, Min
2015-01-01
Trajectory simulations with advanced optimization algorithms are invaluable tools in the process of designing spacecraft. Due to the need for complex models, simulations are often highly tailored to the needs of the particular program or mission. NASA's Orion and SLS programs are no exception. While independent analyses are valuable for assessing individual spacecraft capabilities, a complete end-to-end trajectory from launch to splashdown maximizes potential performance and ensures a continuous solution. To obtain this end-to-end capability, Orion's in-space tool (Copernicus) was made to interface directly with the SLS's ascent tool (POST2), and a new tool was created to optimize the full problem by operating both simulations simultaneously.
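A minimal sketch of that coupling pattern, assuming a generic gradient-based optimizer and two stand-in simulator calls; run_ascent and run_inspace are hypothetical placeholders for the POST2 and Copernicus interfaces, and the toy dynamics are fabricated only so the script runs:

```python
# Toy co-simulation optimizer: one objective runs a stand-in ascent simulation,
# hands its injection state to a stand-in in-space simulation, and sums the
# costs. These are NOT the real POST2/Copernicus interfaces.
import numpy as np
from scipy.optimize import minimize

def run_ascent(x):
    """Stand-in ascent sim: design vars -> (injection state, propellant cost)."""
    pitch_rate, stage_time = x[0], x[1]
    state = np.array([stage_time * 7.5, pitch_rate * 100.0])   # toy "dynamics"
    return state, 0.1 * pitch_rate**2 + stage_time

def run_inspace(state, x):
    """Stand-in in-space sim: injection state + burn var -> delta-v cost."""
    burn_angle = x[2]
    return np.linalg.norm(state - np.array([200.0, 30.0])) + burn_angle**2

def total_cost(x):
    state, prop = run_ascent(x)          # both sims run on every iteration,
    return prop + run_inspace(state, x)  # so staging is traded end to end

x0 = np.array([0.5, 20.0, 0.0])          # pitch rate, staging time, burn angle
result = minimize(total_cost, x0, method="SLSQP")
print(result.x, round(result.fun, 3))
```

The point of the design is that the optimizer sees a single objective that threads the staging state from the ascent run into the in-space run, so the two phases are traded against each other simultaneously rather than optimized in isolation.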
Mars Smart Lander Simulations for Entry, Descent, and Landing
NASA Technical Reports Server (NTRS)
Striepe, S. A.; Way, D. W.; Balaram, J.
2002-01-01
Two primary simulations have been developed and are being updated for Mars Smart Lander Entry, Descent, and Landing (EDL): a high-fidelity engineering end-to-end EDL simulation based on NASA Langley's Program to Optimize Simulated Trajectories (POST), and an end-to-end real-time, hardware-in-the-loop simulation testbed based on NASA JPL's (Jet Propulsion Laboratory) Dynamics Simulator for Entry, Descent and Surface landing (DSENDS). This paper presents the current status of these Mars Smart Lander EDL end-to-end simulations. Various models and capabilities, as well as validation and verification for these simulations, are discussed.
Modeling and simulation of satellite subsystems for end-to-end spacecraft modeling
NASA Astrophysics Data System (ADS)
Schum, William K.; Doolittle, Christina M.; Boyarko, George A.
2006-05-01
During the past ten years, the Air Force Research Laboratory (AFRL) has been simultaneously developing high-fidelity spacecraft payload models as well as a robust distributed simulation environment for modeling spacecraft subsystems. Much of this research has occurred in the Distributed Architecture Simulation Laboratory (DASL). AFRL developers working in the DASL have effectively combined satellite power, attitude pointing, and communication link analysis subsystem models with robust satellite sensor models to create a first-order end-to-end satellite simulation capability. The merging of these two simulation areas has advanced the field of spacecraft simulation, design, and analysis, and enabled more in-depth mission and satellite utility analyses. A core capability of the DASL is the support of a variety of modeling and analysis efforts, ranging from physics and engineering-level modeling to mission and campaign-level analysis. The flexibility and agility of this simulation architecture will be used to support space mission analysis, military utility analysis, and various integrated exercises with other military and space organizations via direct integration, or through DOD standards such as Distributed Interaction Simulation. This paper discusses the results and lessons learned in modeling satellite communication link analysis, power, and attitude control subsystems for an end-to-end satellite simulation. It also discusses how these spacecraft subsystem simulations feed into and support military utility and space mission analyses.
Advanced radiometric and interferometric millimeter-wave scene simulations
NASA Technical Reports Server (NTRS)
Hauss, B. I.; Moffa, P. J.; Steele, W. G.; Agravante, H.; Davidheiser, R.; Samec, T.; Young, S. K.
1993-01-01
Smart munitions and weapons utilize various imaging sensors (including passive IR, active and passive millimeter-wave, and visible wavebands) to detect/identify targets at short standoff ranges and in varied terrain backgrounds. In order to design and evaluate these sensors under a variety of conditions, a high-fidelity scene simulation capability is necessary. Such a capability for passive millimeter-wave scene simulation exists at TRW. TRW's Advanced Radiometric Millimeter-Wave Scene Simulation (ARMSS) code is a rigorous, benchmarked, end-to-end passive millimeter-wave scene simulation code for interpreting millimeter-wave data, establishing scene signatures and evaluating sensor performance. In passive millimeter-wave imaging, resolution is limited due to wavelength and aperture size. Where high resolution is required, the utility of passive millimeter-wave imaging is confined to short ranges. Recent developments in interferometry have made possible high resolution applications on military platforms. Interferometry or synthetic aperture radiometry allows the creation of a high resolution image with a sparsely filled aperture. Borrowing from research work in radio astronomy, we have developed and tested at TRW scene reconstruction algorithms that allow the recovery of the scene from a relatively small number of spatial frequency components. In this paper, the TRW modeling capability is described and numerical results are presented.
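As a toy illustration of the sparse-aperture idea (a zero-filled inverse FFT, not TRW's reconstruction algorithm, which the abstract does not detail), one can sample a scene's spatial-frequency plane sparsely and invert; the 64x64 scene below is synthetic:

```python
# Zero-filled sparse-aperture reconstruction demo: sample ~15% of the scene's
# spatial-frequency (u-v) plane and invert. Synthetic scene; not TRW's algorithm.
import numpy as np

rng = np.random.default_rng(0)
scene = np.zeros((64, 64))
scene[20:30, 25:40] = 1.0                  # bright rectangular "target"

F = np.fft.fft2(scene)                     # full visibility plane
mask = rng.random(F.shape) < 0.15          # sparse u-v sampling
recon = np.fft.ifft2(F * mask).real        # zero-filled inverse transform

err = np.linalg.norm(recon - scene) / np.linalg.norm(scene)
print(f"relative reconstruction error: {err:.2f}")
```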
NASA Technical Reports Server (NTRS)
Fisher, Jody L.; Striepe, Scott A.
2007-01-01
The Program to Optimize Simulated Trajectories II (POST2) is used as a basis for an end-to-end descent and landing trajectory simulation that is essential in determining the design and performance capability of lunar descent and landing system models and lunar environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. This POST2-based ALHAT simulation provides descent and landing simulation capability by integrating lunar environment and lander system models (including terrain, sensor, guidance, navigation, and control models), along with the data necessary to design and operate a landing system for robotic, human, and cargo lunar-landing success. This paper presents the current and planned development and model validation of the POST2-based end-to-end trajectory simulation used for the testing and performance evaluation of ALHAT project systems and models.
Advances in POST2 End-to-End Descent and Landing Simulation for the ALHAT Project
NASA Technical Reports Server (NTRS)
Davis, Jody L.; Striepe, Scott A.; Maddock, Robert W.; Hines, Glenn D.; Paschall, Stephen, II; Cohanim, Babak E.; Fill, Thomas; Johnson, Michael C.; Bishop, Robert H.; DeMars, Kyle J.
2008-01-01
Program to Optimize Simulated Trajectories II (POST2) is used as a basis for an end-to-end descent and landing trajectory simulation that is essential in determining design and integration capability and system performance of the lunar descent and landing system and environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. The POST2 simulation provides a six degree-of-freedom capability necessary to test, design and operate a descent and landing system for successful lunar landing. This paper presents advances in the development and model-implementation of the POST2 simulation, as well as preliminary system performance analysis, used for the testing and evaluation of ALHAT project system models.
A Distributed Simulation Software System for Multi-Spacecraft Missions
NASA Technical Reports Server (NTRS)
Burns, Richard; Davis, George; Cary, Everett
2003-01-01
The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture, and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements, with functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archival and distribution addressed.
High-End Computing Challenges in Aerospace Design and Engineering
NASA Technical Reports Server (NTRS)
Bailey, F. Ronald
2004-01-01
High-End Computing (HEC) has had a significant impact on aerospace design and engineering and is poised to have an even greater impact in the future. In this paper we describe four aerospace design and engineering challenges: Digital Flight, Launch Simulation, Rocket Fuel System, and Digital Astronaut. The paper discusses the modeling capabilities needed for each challenge and presents projections of future near- and far-term HEC computing requirements. NASA's HEC Project Columbia is described, and programming strategies necessary to achieve high sustained performance are presented.
A Hardware-in-the-Loop Testbed for Spacecraft Formation Flying Applications
NASA Technical Reports Server (NTRS)
Leitner, Jesse; Bauer, Frank H. (Technical Monitor)
2001-01-01
The Formation Flying Test Bed (FFTB) at NASA Goddard Space Flight Center (GSFC) is being developed as a modular, hybrid dynamic simulation facility employed for end-to-end guidance, navigation, and control (GN&C) analysis and design for formation flying clusters and constellations of satellites. The FFTB will support critical hardware and software technology development to enable current and future missions for NASA, other government agencies, and external customers for a wide range of missions, particularly those involving distributed spacecraft operations. The initial capabilities of the FFTB are based upon an integration of high fidelity hardware and software simulation, emulation, and test platforms developed at GSFC in recent years, including a high-fidelity GPS simulator which has been a fundamental component of the Guidance, Navigation, and Control Center's GPS Test Facility. The FFTB will be continuously evolving over the next several years from a tool with initial capabilities in GPS navigation hardware/software-in-the-loop analysis and closed-loop GPS-based orbit control algorithm assessment to one with cross-link communications and relative navigation analysis and simulation capability. Eventually the FFTB will provide full capability to support all aspects of multi-sensor, absolute and relative position determination and control, in all (attitude and orbit) degrees of freedom, as well as information management for satellite clusters and constellations. In this paper we focus on the architecture of the FFTB as a general GN&C analysis environment for the spacecraft formation flying community inside and outside of NASA GSFC, and we briefly reference some current and future activities which will drive the requirements and development.
NASA Technical Reports Server (NTRS)
Chung, Victoria I.; Crues, Edwin Z.; Blum, Mike G.; Alofs, Cathy; Busto, Juan
2007-01-01
This paper describes the architecture and implementation of a distributed launch and ascent simulation of NASA's Orion spacecraft and Ares I launch vehicle. This simulation is one segment of the Distributed Space Exploration Simulation (DSES) Project. The DSES project is a research and development collaboration between NASA centers which investigates technologies and processes for distributed simulation of complex space systems in support of NASA's Exploration Initiative. DSES is developing an integrated end-to-end simulation capability to support NASA development and deployment of new exploration spacecraft and missions. This paper describes the first in a collection of simulation capabilities that DSES will support.
End-to-end plasma bubble PIC simulations on GPUs
NASA Astrophysics Data System (ADS)
Germaschewski, Kai; Fox, William; Matteucci, Jackson; Bhattacharjee, Amitava
2017-10-01
Accelerator technologies play a crucial role in eventually achieving exascale computing capabilities. The current and upcoming leadership machines at ORNL (Titan and Summit) employ Nvidia GPUs, which provide vast computational power but also need specifically adapted computational kernels to fully exploit them. In this work, we will show end-to-end particle-in-cell simulations of the formation, evolution and coalescence of laser-generated plasma bubbles. This work showcases the GPU capabilities of the PSC particle-in-cell code, which has been adapted for this problem to support particle injection, a heating operator and a collision operator on GPUs.
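For readers unfamiliar with the method, the stage structure that a PIC code offloads to GPUs looks roughly like the 1D electrostatic NumPy sketch below (normalized units, q = m = 1); the real PSC code is far more elaborate and runs these stages on the device:

```python
# 1D electrostatic PIC sketch: deposit charge, solve Poisson spectrally,
# gather the field, and leapfrog-push the particles. Illustration only.
import numpy as np

ng, n_p, L, dt = 64, 10000, 1.0, 0.01
dx = L / ng
rng = np.random.default_rng(1)
x = rng.random(n_p) * L                     # particle positions
v = rng.normal(0.0, 0.1, n_p)               # particle velocities

for _ in range(100):
    # charge deposition with linear (cloud-in-cell) weighting
    cell = (x / dx).astype(int) % ng
    frac = x / dx - np.floor(x / dx)
    rho = (np.bincount(cell, 1.0 - frac, ng)
           + np.bincount((cell + 1) % ng, frac, ng))
    rho = rho / rho.mean() - 1.0            # neutralizing background
    # spectral field solve: dE/dx = rho  ->  E_k = rho_k / (i k)
    k = 2.0 * np.pi * np.fft.fftfreq(ng, dx)
    k[0] = 1.0                              # avoid divide-by-zero at k = 0
    E_k = np.fft.fft(rho) / (1j * k)
    E_k[0] = 0.0                            # no mean field
    E = np.fft.ifft(E_k).real
    # gather field to particles and push
    v += dt * ((1.0 - frac) * E[cell] + frac * E[(cell + 1) % ng])
    x = (x + dt * v) % L

print(f"thermal spread after push loop: {v.std():.4f}")
```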
DKIST Adaptive Optics System: Simulation Results
NASA Astrophysics Data System (ADS)
Marino, Jose; Schmidt, Dirk
2016-05-01
The 4 m class Daniel K. Inouye Solar Telescope (DKIST), currently under construction, will be equipped with an ultra-high-order solar adaptive optics (AO) system. The requirements and capabilities of such a solar AO system are beyond those of any other solar AO system currently in operation. We must rely on solar AO simulations to estimate and quantify its performance. We present performance estimation results of the DKIST AO system obtained with a new solar AO simulation tool. This simulation tool is a flexible and fast end-to-end solar AO simulator which produces accurate solar AO simulations while taking advantage of current multi-core computer technology. It relies on full imaging simulations of the extended-field Shack-Hartmann wavefront sensor (WFS), which directly includes important secondary effects such as field-dependent distortions and varying contrast of the WFS sub-aperture images.
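One core ingredient of such an extended-scene WFS simulation is estimating each sub-aperture image's shift against a reference by cross-correlation; a self-contained sketch, with a random texture standing in for the solar granulation scenes the DKIST simulator renders:

```python
# Shift estimation for one sub-aperture by FFT cross-correlation against a
# reference image; the correlation peak location gives the local tilt.
import numpy as np

rng = np.random.default_rng(2)
ref = rng.random((32, 32))                       # reference sub-aperture image
img = np.roll(ref, (3, -2), axis=(0, 1))         # "tilted" copy, shifted (3, -2)

corr = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
peak = np.unravel_index(np.argmax(corr), corr.shape)
shift = [((p + s // 2) % s) - s // 2 for p, s in zip(peak, corr.shape)]
print(shift)                                     # expect [3, -2]
```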
NASA Astrophysics Data System (ADS)
Rohde, Mitchell M.; Crawford, Justin; Toschlog, Matthew; Iagnemma, Karl D.; Kewlani, Guarav; Cummins, Christopher L.; Jones, Randolph A.; Horner, David A.
2009-05-01
It is widely recognized that simulation is pivotal to vehicle development, whether manned or unmanned. There are few dedicated choices, however, for those wishing to perform realistic, end-to-end simulations of unmanned ground vehicles (UGVs). The Virtual Autonomous Navigation Environment (VANE), under development by US Army Engineer Research and Development Center (ERDC), provides such capabilities but utilizes a High Performance Computing (HPC) Computational Testbed (CTB) and is not intended for on-line, real-time performance. A product of the VANE HPC research is a real-time desktop simulation application under development by the authors that provides a portal into the HPC environment as well as interaction with wider-scope semi-automated force simulations (e.g. OneSAF). This VANE desktop application, dubbed the Autonomous Navigation Virtual Environment Laboratory (ANVEL), enables analysis and testing of autonomous vehicle dynamics and terrain/obstacle interaction in real-time with the capability to interact within the HPC constructive geo-environmental CTB for high fidelity sensor evaluations. ANVEL leverages rigorous physics-based vehicle and vehicle-terrain interaction models in conjunction with high-quality, multimedia visualization techniques to form an intuitive, accurate engineering tool. The system provides an adaptable and customizable simulation platform that allows developers a controlled, repeatable testbed for advanced simulations. ANVEL leverages several key technologies not common to traditional engineering simulators, including techniques from the commercial video-game industry. These enable ANVEL to run on inexpensive commercial, off-the-shelf (COTS) hardware. In this paper, the authors describe key aspects of ANVEL and its development, as well as several initial applications of the system.
Multibody Modeling and Simulation for the Mars Phoenix Lander Entry, Descent and Landing
NASA Technical Reports Server (NTRS)
Queen, Eric M.; Prince, Jill L.; Desai, Prasun N.
2008-01-01
A multi-body flight simulation for the Phoenix Mars Lander has been developed that includes high fidelity six degree-of-freedom rigid-body models for the parachute and lander system. The simulation provides attitude and rate history predictions of all bodies throughout the flight, as well as loads on each of the connecting lines. In so doing, a realistic behavior of the descending parachute/lander system dynamics can be simulated that allows assessment of the Phoenix descent performance and identification of potential sensitivities for landing. This simulation provides a complete end-to-end capability of modeling the entire entry, descent, and landing sequence for the mission. Time histories of the parachute and lander aerodynamic angles are presented. The response of the lander system to various wind models and wind shears is shown to be acceptable. Monte Carlo simulation results are also presented.
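The abstract does not give the connecting-line model, but a common choice in multibody EDL simulations is a tension-only spring-damper for each line; a hedged sketch with placeholder stiffness and damping values, not the Phoenix flight parameters:

```python
# Tension-only spring-damper model for one parachute-to-lander line: the line
# pulls when stretched past its natural length and carries no compression.
import numpy as np

def line_force_on_a(r_a, r_b, v_a, v_b, l0, k, c):
    """Force (N) on body A from a line between attach points r_a and r_b (m)."""
    d = r_b - r_a
    length = np.linalg.norm(d)
    u = d / length                        # unit vector from A toward B
    stretch = length - l0
    if stretch <= 0.0:                    # slack lines carry no compression
        return np.zeros(3)
    rate = np.dot(v_b - v_a, u)           # elongation rate for the damping term
    return (k * stretch + c * rate) * u   # pulls A toward B while taut

f = line_force_on_a(np.zeros(3), np.array([0.0, 0.0, 12.0]),
                    np.zeros(3), np.array([0.0, 0.0, -1.0]),
                    l0=10.0, k=5.0e4, c=200.0)
print(f)                                  # tension directed along +z
```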
The Development of Design Tools for Fault Tolerant Quantum Dot Cellular Automata Based Logic
NASA Technical Reports Server (NTRS)
Armstrong, Curtis D.; Humphreys, William M.
2003-01-01
We are developing software to explore the fault tolerance of quantum dot cellular automata gate architectures in the presence of manufacturing variations and device defects. The Topology Optimization Methodology using Applied Statistics (TOMAS) framework extends the capabilities of AQUINAS (A Quantum Interconnected Network Array Simulator) by adding front-end and back-end software and creating an environment that integrates all of these components. The front-end tools establish all simulation parameters, configure the simulation system, automate the Monte Carlo generation of simulation files, and execute the simulation of these files. The back-end tools perform automated data parsing, statistical analysis and report generation.
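The Monte Carlo front-end pattern described here can be sketched as below; the parameter names, file format, and tolerance values are invented for illustration and are not TOMAS's actual inputs:

```python
# Front-end sketch: draw perturbed device parameters, write one input file per
# Monte Carlo case, and collect the paths for a run queue.
import json
import pathlib
import numpy as np

rng = np.random.default_rng(3)
nominal = {"dot_spacing_nm": 20.0, "dot_diameter_nm": 5.0, "defect_rate": 0.0}
outdir = pathlib.Path("mc_cases")
outdir.mkdir(exist_ok=True)

cases = []
for i in range(100):
    case = dict(nominal)
    # manufacturing variation: 2% (1-sigma) geometric tolerance, random defects
    case["dot_spacing_nm"] *= 1 + rng.normal(0, 0.02)
    case["dot_diameter_nm"] *= 1 + rng.normal(0, 0.02)
    case["defect_rate"] = rng.uniform(0, 0.05)
    path = outdir / f"case_{i:04d}.json"
    path.write_text(json.dumps(case, indent=2))
    cases.append(path)

print(f"wrote {len(cases)} simulation input files to {outdir}/")
```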
Impact of the Columbia Supercomputer on NASA Space and Exploration Mission
NASA Technical Reports Server (NTRS)
Biswas, Rupak; Kwak, Dochan; Kiris, Cetin; Lawrence, Scott
2006-01-01
NASA's 10,240-processor Columbia supercomputer gained worldwide recognition in 2004 for increasing the space agency's computing capability ten-fold, and enabling U.S. scientists and engineers to perform significant, breakthrough simulations. Columbia has amply demonstrated its capability to accelerate NASA's key missions, including space operations, exploration systems, science, and aeronautics. Columbia is part of an integrated high-end computing (HEC) environment comprised of massive storage and archive systems, high-speed networking, high-fidelity modeling and simulation tools, application performance optimization, and advanced data analysis and visualization. In this paper, we illustrate the impact Columbia is having on NASA's numerous space and exploration applications, such as the development of the Crew Exploration and Launch Vehicles (CEV/CLV), effects of long-duration human presence in space, and damage assessment and repair recommendations for remaining shuttle flights. We conclude by discussing HEC challenges that must be overcome to solve space-related science problems in the future.
Hurricane Intensity Forecasts with a Global Mesoscale Model on the NASA Columbia Supercomputer
NASA Technical Reports Server (NTRS)
Shen, Bo-Wen; Tao, Wei-Kuo; Atlas, Robert
2006-01-01
It is known that General Circulation Models (GCMs) have insufficient resolution to accurately simulate hurricane near-eye structure and intensity. The increasing capabilities of high-end computers (e.g., the NASA Columbia Supercomputer) have changed this. In 2004, the finite-volume General Circulation Model at a 1/4 degree resolution, double the resolution used by most operational NWP centers at that time, was implemented and run to obtain promising landfall predictions for major hurricanes (e.g., Charley, Frances, Ivan, and Jeanne). In 2005, we successfully implemented the 1/8 degree version and demonstrated its performance on intensity forecasts with hurricane Katrina (2005). It is found that the 1/8 degree model is capable of simulating the radius of maximum wind and near-eye wind structure, thereby producing promising intensity forecasts. In this study, we further evaluate the model's performance on intensity forecasts of hurricanes Ivan, Jeanne, and Karl in 2004. Suggestions for further model development are made at the end.
Computational Simulations of the NASA Langley HyMETS Arc-Jet Facility
NASA Technical Reports Server (NTRS)
Brune, A. J.; Bruce, W. E., III; Glass, D. E.; Splinter, S. C.
2017-01-01
The Hypersonic Materials Environmental Test System (HyMETS) arc-jet facility located at the NASA Langley Research Center in Hampton, Virginia, is primarily used for the research, development, and evaluation of high-temperature thermal protection systems for hypersonic vehicles and reentry systems. In order to improve testing capabilities and knowledge of the test article environment, an effort is underway to computationally simulate the flow-field using computational fluid dynamics (CFD). A detailed three-dimensional model of the arc-jet nozzle and free-jet portion of the flow-field has been developed and compared to calibration probe Pitot pressure and stagnation-point heat flux for three test conditions at low, medium, and high enthalpy. The CFD model takes into account uniform pressure and non-uniform enthalpy profiles at the nozzle inlet as well as catalytic recombination efficiency effects at the probe surface. Comparing the CFD results and test data indicates an effectively fully-catalytic copper surface on the heat flux probe, with about 10% recombination efficiency, and a 2-3 kPa pressure drop from the arc heater bore, where the pressure is measured, to the plenum section prior to the nozzle. With these assumptions, the CFD results are well within the uncertainty of the stagnation pressure and heat flux measurements. The conditions at the nozzle exit were also compared with radial and axial velocimetry. This simulation capability will be used to evaluate various three-dimensional models that are tested in the HyMETS facility. An end-to-end aerothermal and thermal simulation of HyMETS test articles will follow this work to provide a better understanding of the test environment, test results, and to aid in test planning. Additional flow-field diagnostic measurements will also be considered to improve the modeling capability.
Design and simulation of EVA tools and robot end effectors for servicing missions of the HST
NASA Technical Reports Server (NTRS)
Naik, Dipak; Dehoff, P. H.
1995-01-01
The Hubble Space Telescope (HST) was launched into near-earth orbit by the Space Shuttle Discovery on April 24, 1990. The payload of two cameras, two spectrographs, and a high-speed photometer is supplemented by three fine-guidance sensors that can be used for astronomy as well as for star tracking. A widely reported spherical aberration in the primary mirror causes HST to produce images of much lower quality than intended. A Space Shuttle repair mission in January 1994 installed small corrective mirrors that restored the full intended optical capability of the HST. A Second Servicing Mission (SM2) scheduled in 1997 will involve considerable Extra Vehicular Activity (EVA). To reduce EVA time, the addition of robotic capability in the remaining servicing missions has been proposed. Toward that end, two concept designs for a general purpose end effector for robots are presented in this report.
High-Fidelity Buckling Analysis of Composite Cylinders Using the STAGS Finite Element Code
NASA Technical Reports Server (NTRS)
Hilburger, Mark W.
2014-01-01
Results from previous shell buckling studies are presented that illustrate some of the unique and powerful capabilities in the STAGS finite element analysis code that have made it an indispensable tool in structures research at NASA over the past few decades. In particular, prototypical results from the development and validation of high-fidelity buckling simulations are presented for several unstiffened thin-walled compression-loaded graphite-epoxy cylindrical shells along with a discussion on the specific methods and user-defined subroutines in STAGS that are used to carry out the high-fidelity simulations. These simulations accurately account for the effects of geometric shell-wall imperfections, shell-wall thickness variations, local shell-wall ply-gaps associated with the fabrication process, shell-end geometric imperfections, nonuniform applied end loads, and elastic boundary conditions. The analysis procedure uses a combination of nonlinear quasi-static and transient dynamic solution algorithms to predict the prebuckling and unstable collapse response characteristics of the cylinders. Finally, the use of high-fidelity models in the development of analysis-based shell-buckling knockdown (design) factors is demonstrated.
NASA Technical Reports Server (NTRS)
Pamadi, Bandu N.; Toniolo, Matthew D.; Tartabini, Paul V.; Roithmayr, Carlos M.; Albertson, Cindy W.; Karlgaard, Christopher D.
2016-01-01
The objective of this report is to develop and implement a physics based method for analysis and simulation of multi-body dynamics including launch vehicle stage separation. The constraint force equation (CFE) methodology discussed in this report provides such a framework for modeling constraint forces and moments acting at joints when the vehicles are still connected. Several stand-alone test cases involving various types of joints were developed to validate the CFE methodology. The results were compared with ADAMS (registered trademark) and Autolev, two different industry standard benchmark codes for multi-body dynamic analysis and simulations. However, these two codes are not designed for aerospace flight trajectory simulations. After this validation exercise, the CFE algorithm was implemented in Program to Optimize Simulated Trajectories II (POST2) to provide a capability to simulate end-to-end trajectories of launch vehicles including stage separation. The POST2/CFE methodology was applied to the STS-1 Space Shuttle solid rocket booster (SRB) separation and Hyper-X Research Vehicle (HXRV) separation from the Pegasus booster as a further test and validation for its application to launch vehicle stage separation problems. Finally, to demonstrate end-to-end simulation capability, POST2/CFE was applied to the ascent, orbit insertion, and booster return of a reusable two-stage-to-orbit (TSTO) vehicle concept. With these validation exercises, POST2/CFE software can be used for performing conceptual level end-to-end simulations, including launch vehicle stage separation, for problems similar to those discussed in this report.
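For orientation, constraint-force methods of this kind are usually written in the standard Lagrange-multiplier form below (our paraphrase of the textbook formulation, not necessarily the report's exact notation); solving the augmented system for lambda yields the joint forces and moments while the vehicles remain connected:

```latex
% Constrained multibody dynamics in Lagrange-multiplier form (generic statement):
\begin{aligned}
M\,\dot{v} &= F_{\mathrm{ext}} + G^{\top}\lambda,\\
g(q) &= 0 \quad\Longrightarrow\quad G\,\dot{v} = -\dot{G}\,v,
\qquad G \equiv \partial g / \partial q .
\end{aligned}
```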
A laboratory breadboard system for dual-arm teleoperation
NASA Technical Reports Server (NTRS)
Bejczy, A. K.; Szakaly, Z.; Kim, W. S.
1990-01-01
The computing architecture of a novel dual-arm teleoperation system is described. The novelty of this system is that: (1) the master arm is not a replica of the slave arm; it is unspecific to any manipulator and can be used for the control of various robot arms with software modifications; and (2) the force feedback to the general purpose master arm is derived from force-torque sensor data originating from the slave hand. The computing architecture of this breadboard system is a fully synchronized pipeline with unique methods for data handling, communication and mathematical transformations. The computing system is modular, thus inherently extendable. The local control loops at both sites operate at 100 Hz rate, and the end-to-end bilateral (force-reflecting) control loop operates at 200 Hz rate, each loop without interpolation. This provides high-fidelity control. This end-to-end system elevates teleoperation to a new level of capabilities via the use of sensors, microprocessors, novel electronics, and real-time graphics displays. A description is given of a graphic simulation system connected to the dual-arm teleoperation breadboard system. High-fidelity graphic simulation of a telerobot (called Phantom Robot) is used for preview and predictive displays for planning and for real-time control under several seconds communication time delay conditions. High fidelity graphic simulation is obtained by using appropriate calibration techniques.
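Conceptually, one cycle of the bilateral loop can be sketched as follows; the mapping, gains, and six-vector conventions are illustrative assumptions, not the breadboard's actual control law:

```python
# One conceptual cycle of a bilateral (force-reflecting) pipeline: master pose
# drives the slave setpoint, slave wrist force-torque data is reflected back.
import numpy as np

POS_SCALE = 1.0        # master-to-slave workspace scaling (assumed)
FORCE_SCALE = 0.3      # fraction of slave force felt at the master (assumed)
DT = 1.0 / 200.0       # the abstract's 200 Hz end-to-end loop period

def bilateral_cycle(master_pose, slave_wrench):
    """Return (slave setpoint, master force command) for one 5 ms cycle."""
    slave_setpoint = POS_SCALE * master_pose          # kinematic mapping
    master_force_cmd = FORCE_SCALE * slave_wrench     # force reflection
    return slave_setpoint, master_force_cmd

pose = np.array([0.10, -0.05, 0.30, 0.0, 0.0, 0.0])   # x, y, z, roll, pitch, yaw
wrench = np.array([2.0, 0.0, -5.0, 0.0, 0.1, 0.0])    # N and N*m at the wrist
print(bilateral_cycle(pose, wrench))
```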
SPoRT - An End-to-End R2O Activity
NASA Technical Reports Server (NTRS)
Jedlovec, Gary J.
2009-01-01
Established in 2002 to demonstrate the weather and forecasting application of real-time EOS measurements, the Short-term Prediction Research and Transition (SPoRT) program has grown to be an end-to-end research-to-operations activity focused on the use of advanced NASA modeling and data assimilation approaches, nowcasting techniques, and unique high-resolution multispectral observational data applications from EOS satellites to improve short-term weather forecasts on a regional and local scale. SPoRT currently partners with several universities and other government agencies for access to real-time data and products, and works collaboratively with them and operational end users at 13 WFOs to develop and test the new products and capabilities in a "test-bed" mode. The test-bed simulates key aspects of the operational environment without putting constraints on the forecaster workload. Products and capabilities which show utility in the test-bed environment are then transitioned experimentally into the operational environment for further evaluation and assessment. SPoRT focuses on a suite of data and products from MODIS, AMSR-E, and AIRS on the NASA Terra and Aqua satellites, and total lightning measurements from ground-based networks. Some of the observations are assimilated into or used with various versions of the WRF model to provide supplemental forecast guidance to operational end users. SPoRT is enhancing partnerships with NOAA / NESDIS for new product development and data access to exploit the remote sensing capabilities of instruments on the NPOESS satellites to address short term weather forecasting problems. The VIIRS and CrIS instruments on the NPP and follow-on NPOESS satellites provide similar observing capabilities to the MODIS and AIRS instruments on Terra and Aqua. SPoRT will be transitioning existing and new capabilities into the AWIPS II environment to maintain the continuity of its activities.
Large-Scale NASA Science Applications on the Columbia Supercluster
NASA Technical Reports Server (NTRS)
Brooks, Walter
2005-01-01
Columbia, NASA's newest 61 teraflops supercomputer that became operational late last year, is a highly integrated Altix cluster of 10,240 processors, and was named to honor the crew of the Space Shuttle lost in early 2003. Constructed in just four months, Columbia increased NASA's computing capability ten-fold, and revitalized the Agency's high-end computing efforts. Significant cutting-edge science and engineering simulations in the areas of space and Earth sciences, as well as aeronautics and space operations, are already occurring on this largest operational Linux supercomputer, demonstrating its capacity and capability to accelerate NASA's space exploration vision. The presentation will describe how an integrated environment consisting not only of next-generation systems, but also modeling and simulation, high-speed networking, parallel performance optimization, and advanced data analysis and visualization, is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions. The talk will conclude by discussing how NAS partnered with various NASA centers, other government agencies, computer industry, and academia, to create a national resource in large-scale modeling and simulation.
GNC Architecture Design for ARES Simulation, Revision 3.0
NASA Technical Reports Server (NTRS)
Gay, Robert
2006-01-01
The purpose of this document is to describe the GNC architecture and associated interfaces for all ARES simulations. Establishing a common architecture facilitates development across the ARES simulations and provides an efficient mechanism for creating an end-to-end simulation capability. In general, the GNC architecture is the framework in which all GNC development takes place, including sensor and effector models. All GNC software applications have a standard location within the architecture, making integration easier and thus more efficient.
NASA Technical Reports Server (NTRS)
Gott, Charles; Galicki, Peter; Shores, David
1990-01-01
The Helmet Mounted Display system and Part Task Trainer are two projects currently underway that are closely related to the in-flight crew training concept. The first project is a training simulator and an engineering analysis tool. The simulator's unique helmet mounted display actually projects the wearer into the simulated environment of 3-D space. Miniature monitors are mounted in front of the wearer's eyes. The Part Task Trainer is a kinematic simulator for the Shuttle Remote Manipulator System. The simulator consists of a high-end graphics workstation with a high resolution color screen and a number of input peripherals that create a functional equivalent of the RMS control panel in the back of the Orbiter. It is being used in the training cycle for Shuttle crew members. Activities are underway to expand the capability of the Helmet Mounted Display system and the Part Task Trainer.
NASA HPCC Technology for Aerospace Analysis and Design
NASA Technical Reports Server (NTRS)
Schulbach, Catherine H.
1999-01-01
The Computational Aerosciences (CAS) Project is part of NASA's High Performance Computing and Communications Program. Its primary goal is to accelerate the availability of high-performance computing technology to the US aerospace community-thus providing the US aerospace community with key tools necessary to reduce design cycle times and increase fidelity in order to improve safety, efficiency and capability of future aerospace vehicles. A complementary goal is to hasten the emergence of a viable commercial market within the aerospace community for the advantage of the domestic computer hardware and software industry. The CAS Project selects representative aerospace problems (especially design) and uses them to focus efforts on advancing aerospace algorithms and applications, systems software, and computing machinery to demonstrate vast improvements in system performance and capability over the life of the program. Recent demonstrations have served to assess the benefits of possible performance improvements while reducing the risk of adopting high-performance computing technology. This talk will discuss past accomplishments in providing technology to the aerospace community, present efforts, and future goals. For example, the times to do full combustor and compressor simulations (of aircraft engines) have been reduced by factors of 320:1 and 400:1 respectively. While this has enabled new capabilities in engine simulation, the goal of an overnight, dynamic, multi-disciplinary, 3-dimensional simulation of an aircraft engine is still years away and will require new generations of high-end technology.
Simulation-Based Analysis of Reentry Dynamics for the Sharp Atmospheric Entry Vehicle
NASA Technical Reports Server (NTRS)
Tillier, Clemens Emmanuel
1998-01-01
This thesis describes the analysis of the reentry dynamics of a high-performance lifting atmospheric entry vehicle through numerical simulation tools. The vehicle, named SHARP, is currently being developed by the Thermal Protection Materials and Systems branch of NASA Ames Research Center, Moffett Field, California. The goal of this project is to provide insight into trajectory tradeoffs and vehicle dynamics using simulation tools that are powerful, flexible, user-friendly and inexpensive. Implemented using MATLAB and Simulink, these tools are developed with an eye towards further use in the conceptual design of the SHARP vehicle's trajectory and flight control systems. A trajectory simulator is used to quantify the entry capabilities of the vehicle subject to various operational constraints. Using an aerodynamic database computed by NASA and a model of the Earth, the simulator generates the vehicle trajectory in three-dimensional space based on aerodynamic angle inputs. Requirements for entry along the SHARP aerothermal performance constraint are evaluated for different control strategies. The effect of vehicle mass on entry parameters is investigated, and the cross range capability of the vehicle is evaluated. Trajectory results are presented and interpreted. A six degree of freedom simulator builds on the trajectory simulator and provides attitude simulation for future entry controls development. A Newtonian aerodynamic model including control surfaces and a mass model are developed. A visualization tool for interpreting simulation results is described. Control surfaces are roughly sized. A simple controller is developed to fly the vehicle along its aerothermal performance constraint using aerodynamic flaps for control. This end-to-end demonstration proves the suitability of the 6-DOF simulator for future flight control system development. Finally, issues surrounding real-time simulation with hardware in the loop are discussed.
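A stripped-down analogue of such a trajectory simulator, assuming a planar point-mass (3-DOF) entry over a non-rotating spherical Earth with fixed L/D; the vehicle numbers are placeholders, not SHARP data:

```python
# Planar 3-DOF entry sketch: exponential atmosphere, fixed L/D, terminated
# when the vehicle descends through 20 km altitude.
import numpy as np
from scipy.integrate import solve_ivp

RE, MU = 6.378e6, 3.986e14          # Earth radius (m), gravitational parameter
RHO0, HSCALE = 1.225, 7200.0        # exponential-atmosphere constants
M, S, CD, LOD = 1000.0, 10.0, 1.0, 1.5   # placeholder vehicle properties

def eom(t, y):
    r, v, gamma = y                 # radius, speed, flight-path angle
    rho = RHO0 * np.exp(-(r - RE) / HSCALE)
    drag = 0.5 * rho * v**2 * S * CD / M
    lift = LOD * drag
    g = MU / r**2
    return [v * np.sin(gamma),
            -drag - g * np.sin(gamma),
            (lift + (v**2 / r - g) * np.cos(gamma)) / v]

def hit_20km(t, y):                 # event: stop at 20 km altitude
    return y[0] - (RE + 20e3)
hit_20km.terminal = True

y0 = [RE + 80e3, 7500.0, np.radians(-2.0)]
sol = solve_ivp(eom, [0.0, 2000.0], y0, max_step=1.0, events=hit_20km)
print(f"t = {sol.t[-1]:.0f} s, v = {sol.y[1, -1]:.0f} m/s at 20 km")
```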
High-power graphic computers for visual simulation: a real-time--rendering revolution
NASA Technical Reports Server (NTRS)
Kaiser, M. K.
1996-01-01
Advances in high-end graphics computers in the past decade have made it possible to render visual scenes of incredible complexity and realism in real time. These new capabilities make it possible to manipulate and investigate the interactions of observers with their visual world in ways once only dreamed of. This paper reviews how these developments have affected two preexisting domains of behavioral research (flight simulation and motion perception) and have created a new domain (virtual environment research) which provides tools and challenges for the perceptual psychologist. Finally, the current limitations of these technologies are considered, with an eye toward how perceptual psychologists might shape future developments.
Modeling a Hall Thruster from Anode to Plume Far Field
2005-01-01
This paper describes a Hall thruster simulation capability that begins with propellant injection at the thruster anode and ends in the plume far field. The development of a comprehensive simulation capability is critical for a number of reasons. The main motivation stems from the need to directly couple simulation of the plasma discharge processes inside the thruster and the transport of the plasma to the plume far field. The simulation strategy will employ two existing codes, one for the Hall thruster device and one for the plume. The coupling will take place in the plume
Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, Michel; Archer, Bill; Matzen, M. Keith
2014-09-16
The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.
NASA Technical Reports Server (NTRS)
Gillian, Ronnie E.; Lotts, Christine G.
1988-01-01
The Computational Structural Mechanics (CSM) Activity at Langley Research Center is developing methods for structural analysis on modern computers. To facilitate that research effort, an applications development environment has been constructed to insulate the researcher from the many computer operating systems of a widely distributed computer network. The CSM Testbed development system was ported to the Numerical Aerodynamic Simulator (NAS) Cray-2 at the Ames Research Center to provide a high-end computational capability. This paper describes the implementation experiences, the resulting capability, and the future directions for the Testbed on supercomputers.
An extensive coronagraphic simulation applied to LBT
NASA Astrophysics Data System (ADS)
Vassallo, D.; Carolo, E.; Farinato, J.; Bergomi, M.; Bonavita, M.; Carlotti, A.; D'Orazi, V.; Greggio, D.; Magrin, D.; Mesa, D.; Pinna, E.; Puglisi, A.; Stangalini, M.; Verinaud, C.; Viotto, V.
2016-08-01
In this article we report the results of a comprehensive simulation program aimed at investigating the coronagraphic capabilities of SHARK-NIR, a camera selected to proceed to the final design phase at the Large Binocular Telescope. For this purpose, we developed a dedicated simulation tool based on physical optics propagation. The code propagates wavefronts through the SHARK optical train in an end-to-end fashion and can implement any kind of coronagraph. Detection limits can be finally computed, exploring a wide range of Strehl values and observing conditions.
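The flavor of such end-to-end physical-optics propagation can be conveyed with a bare-bones Lyot-style coronagraph sketch (idealized circular pupil, opaque focal-plane spot, undersized Lyot stop); all sizes below are arbitrary demo values, unrelated to SHARK-NIR's actual design:

```python
# Fraunhofer-regime coronagraph sketch: pupil -> focal-plane occulting spot ->
# undersized Lyot stop -> final image.
import numpy as np

n = 512
yy, xx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
r = np.hypot(xx, yy)

def prop(field):
    """Centered FFT propagation between pupil and focal planes."""
    return np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field)))

def iprop(field):
    return np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(field)))

pupil = (r < 64).astype(complex)        # unaberrated circular pupil
focal = prop(pupil) * (r > 4)           # opaque spot removes the PSF core
lyot = iprop(focal) * (r < 54)          # undersized Lyot stop
image = np.abs(prop(lyot))**2

peak0 = np.abs(prop(pupil)).max()**2    # unocculted PSF peak for reference
print(f"residual peak / unocculted peak = {image.max() / peak0:.1e}")
```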
Integration of Modelling and Graphics to Create an Infrared Signal Processing Test Bed
NASA Astrophysics Data System (ADS)
Sethi, H. R.; Ralph, John E.
1989-03-01
The work reported in this paper was carried out as part of a contract with MoD (PE) UK. It considers the problems associated with realistic modelling of a passive infrared system in an operational environment. Ideally all aspects of the system and environment should be integrated into a complete end-to-end simulation, but in the past limited computing power has prevented this. Recent developments in workstation technology and the increasing availability of parallel processing techniques make the end-to-end simulation possible. However, the complexity and speed of such simulations means difficulties for the operator in controlling the software and understanding the results. These difficulties can be greatly reduced by providing an extremely user friendly interface and a very flexible, high power, high resolution colour graphics capability. Most system modelling is based on separate software simulation of the individual components of the system itself and its environment. These component models may have their own characteristic inbuilt assumptions and approximations, may be written in the language favoured by the originator, and may have a wide variety of input and output conventions and requirements. The models and their limitations need to be matched to the range of conditions appropriate to the operational scenario. A comprehensive set of data bases needs to be generated by the component models and these data bases must be made readily available to the investigator. Performance measures need to be defined and displayed in some convenient graphics form. Some options are presented for combining available hardware and software to create an environment within which the models can be integrated, and which provides the required man-machine interface, graphics and computing power. The impact of massively parallel processing and artificial intelligence is discussed. Parallel processing will make real time end-to-end simulation possible and will greatly improve the graphical visualisation of the model output data. Artificial intelligence should help to enhance the man-machine interface.
Nested high-resolution large-eddy simulations in WRF to support wind power
NASA Astrophysics Data System (ADS)
Mirocha, J.; Kirkil, G.; Kosovic, B.; Lundquist, J. K.
2009-12-01
The WRF model's grid nesting capability provides a potentially powerful framework for simulating flow over a wide range of scales. One such application is computation of realistic inflow boundary conditions for large eddy simulations (LES) by nesting LES domains within mesoscale domains. While nesting has been widely and successfully applied at GCM to mesoscale resolutions, the WRF model's nesting behavior at the high-resolution (Δx < 1000 m) end of the spectrum is less well understood. Nesting LES within mesoscale domains can significantly improve turbulent flow prediction at the scale of a wind park, providing a basis for superior site characterization, or for improved simulation of turbulent inflows encountered by turbines. We investigate WRF's grid nesting capability at high mesh resolutions using nested mesoscale and large-eddy simulations. We examine the spatial scales required for flow structures to equilibrate to the finer mesh as flow enters a nest, and how the process depends on several parameters, including grid resolution, turbulence subfilter stress models, relaxation zones at nest interfaces, flow velocities, surface roughnesses, terrain complexity and atmospheric stability. Guidance on appropriate domain sizes and turbulence models for LES in light of these results is provided. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 LLNL-ABS-416482
Simulation and Flight Test Capability for Testing Prototype Sense and Avoid System Elements
NASA Technical Reports Server (NTRS)
Howell, Charles T.; Stock, Todd M.; Verstynen, Harry A.; Wehner, Paul J.
2012-01-01
NASA Langley Research Center (LaRC) and The MITRE Corporation (MITRE) have developed, and successfully demonstrated, an integrated simulation-to-flight capability for evaluating sense and avoid (SAA) system elements. This integrated capability consists of a MITRE developed fast-time computer simulation for evaluating SAA algorithms, and a NASA LaRC surrogate unmanned aircraft system (UAS) equipped to support hardware and software in-the-loop evaluation of SAA system elements (e.g., algorithms, sensors, architecture, communications, autonomous systems), concepts, and procedures. The fast-time computer simulation subjects algorithms to simulated flight encounters/conditions and generates a fitness report that records strengths, weaknesses, and overall performance. Reviewed algorithms (and their fitness report) are then transferred to NASA LaRC where additional (joint) airworthiness evaluations are performed on the candidate SAA system-element configurations, concepts, and/or procedures of interest; software and hardware components are integrated into the Surrogate UAS research systems; and flight safety and mission planning activities are completed. Onboard the Surrogate UAS, candidate SAA system element configurations, concepts, and/or procedures are subjected to flight evaluations and in-flight performance is monitored. The Surrogate UAS, which can be controlled remotely via generic Ground Station uplink or automatically via onboard systems, operates with a NASA Safety Pilot/Pilot in Command onboard to permit safe operations in mixed airspace with manned aircraft. An end-to-end demonstration of a typical application of the capability was performed in non-exclusionary airspace in October 2011; additional research, development, flight testing, and evaluation efforts using this integrated capability are planned throughout fiscal years 2012 and 2013.
Medical Robotic and Telesurgical Simulation Education Research
2017-05-01
training exercises, DVSS = 40, dVT = 65, and RoSS = 52 for skills development. All three offer 3D visual images but use different display technologies...capabilities with an emphasis on their educational skills. They offer unique advantages and capabilities in training robotic surgeons. Each device has been...evaluate the transfer of training effect of each simulator. Collectively, this work will offer end users and potential buyers a comparison of the value
Xyce Parallel Electronic Simulator: users' guide, version 2.0.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoekstra, Robert John; Waters, Lon J.; Rankin, Eric Lamont
2004-06-01
This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator capable of simulating electrical circuits at a variety of abstraction levels. Primarily, Xyce has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving upon the current state-of-the-art in the following areas:
- Capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors). Note that this includes support for most popular parallel and serial computers.
- Improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques.
- Device models which are specifically tailored to meet Sandia's needs, including many radiation-aware devices.
- A client-server or multi-tiered operating model wherein the numerical kernel can operate independently of the graphical user interface (GUI).
- Object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future.
Xyce is a parallel code in the most general sense of the phrase: a message-passing implementation, which allows it to run efficiently on the widest possible number of computing platforms, including serial, shared-memory and distributed-memory parallel as well as heterogeneous platforms. Careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. One feature required by designers is the ability to add device models, many specific to the needs of Sandia, to the code. To this end, the device package in the Xyce Parallel Electronic Simulator is designed to support a variety of device model inputs. These input formats include standard analytical models, behavioral models, look-up tables, and mesh-level PDE device models. Combined with this flexible interface is an architectural design that greatly simplifies the addition of circuit models. One of the most important features of Xyce is in providing a platform for computational research and development aimed specifically at the needs of the Laboratory. With Xyce, Sandia now has an 'in-house' capability with which both new electrical (e.g., device model development) and algorithmic (e.g., faster time-integration methods) research and development can be performed. Ultimately, these capabilities are migrated to end users.
Investigation of a compact coaxially fed switched oscillator.
Wang, Yuwei; Chen, Dongqun; Zhang, Jiande; Cao, Shengguang; Li, Da; Liu, Chebo
2013-09-01
To generate relatively high-frequency mesoband microwaves, a compact coaxially fed transmission line switched oscillator with high voltage capability is investigated. The characteristic impedance and voltage capability of the low impedance transmission line (LITL) have been analyzed. It is shown that the working voltage of the oscillator can reach up to 200 kV when it is filled by pressurized nitrogen and charged by a nanosecond driving source. By utilizing a commercial electromagnetic simulation code, the transient performance of the switched oscillator with a lumped resistance load is simulated. It is illustrated that the center frequency of the output signal reaches up to ~0.6 GHz when the spark gap practically closes with a single channel. In addition, the influence of the closing mode and rapidity of the spark gap, the permittivity of the insulator at the output end of the LITL, and the load impedance on the transient performance of the designed oscillator has been analyzed quantitatively. Finally, the good transient performance of the switched oscillator has been preliminarily confirmed by experiment.
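The reported ~0.6 GHz center frequency is consistent with the usual quarter-wave relation for a line oscillator; as a hedged back-of-the-envelope check (the 12.5 cm length is inferred here for illustration, not taken from the paper; pressurized nitrogen has relative permittivity near 1):

```latex
f \approx \frac{c}{4\,\ell\,\sqrt{\varepsilon_r}}
\quad\Longrightarrow\quad
\ell \approx \frac{3\times 10^{8}\ \mathrm{m/s}}{4\,(0.6\times 10^{9}\ \mathrm{Hz})\sqrt{1}}
\approx 0.125\ \mathrm{m}.
```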
Application of CFE/POST2 for Simulation of Launch Vehicle Stage Separation
NASA Technical Reports Server (NTRS)
Pamadi, Bandu N.; Tartabini, Paul V.; Toniolo, Matthew D.; Roithmayr, Carlos M.; Karlgaard, Christopher D.; Samareh, Jamshid A.
2009-01-01
The constraint force equation (CFE) methodology provides a framework for modeling constraint forces and moments acting at joints that connect multiple vehicles. With implementation in Program to Optimize Simulated Trajectories II (POST2), the CFE provides a capability to simulate end-to-end trajectories of launch vehicles, including stage separation. In this paper, the CFE/POST2 methodology is applied to the Shuttle-SRB separation problem as a test and validation case. The CFE/POST2 results are compared with STS-1 flight test data.
High-order continuum kinetic method for modeling plasma dynamics in phase space
Vogman, G. V.; Colella, P.; Shumlak, U.
2014-12-15
Continuum methods offer a high-fidelity means of simulating plasma kinetics. While computationally intensive, these methods are advantageous because they can be cast in conservation-law form, are not susceptible to noise, and can be implemented using high-order numerical methods. Advances in continuum method capabilities for modeling kinetic phenomena in plasmas require the development of validation tools in higher dimensional phase space and an ability to handle non-Cartesian geometries. To that end, a new benchmark for validating Vlasov-Poisson simulations in 3D (x, v_x, v_y) is presented. The benchmark is based on the Dory-Guest-Harris instability and is successfully used to validate a continuum finite volume algorithm. To address challenges associated with non-Cartesian geometries, unique features of cylindrical phase space coordinates are described. Preliminary results of continuum kinetic simulations in 4D (r, z, v_r, v_z) phase space are presented.
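For reference, the system being solved in such benchmarks is the standard Vlasov-Poisson system, written here with the static applied magnetic field the Dory-Guest-Harris configuration assumes (this is the textbook form, not a statement of the paper's exact normalization):

```latex
\begin{aligned}
&\frac{\partial f_s}{\partial t}
 + \mathbf{v}\cdot\nabla_{\mathbf{x}} f_s
 + \frac{q_s}{m_s}\left(\mathbf{E} + \mathbf{v}\times\mathbf{B}_0\right)\cdot\nabla_{\mathbf{v}} f_s = 0,\\
&\nabla^{2}\phi = -\frac{1}{\varepsilon_0}\sum_s q_s \int f_s\, d\mathbf{v},
\qquad \mathbf{E} = -\nabla\phi .
\end{aligned}
```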
NASA GRC's High Pressure Burner Rig Facility and Materials Test Capabilities
NASA Technical Reports Server (NTRS)
Robinson, R. Craig
1999-01-01
The High Pressure Burner Rig (HPBR) at NASA Glenn Research Center is a high-velocity, pressurized combustion test rig used for high-temperature environmental durability studies of advanced materials and components. The facility burns jet fuel and air in controlled ratios, simulating combustion gas chemistries and temperatures that are representative of those in gas turbine engines. In addition, the test section is capable of simulating the pressures and gas velocities representative of today's aircraft. The HPBR provides a relatively inexpensive, yet sophisticated means for researchers to study the high-temperature oxidation of advanced materials. The facility has the unique capability of operating under both fuel-lean and fuel-rich gas mixtures, using a fume incinerator to eliminate any harmful byproduct emissions (CO, H2S) of rich-burn operation. Test samples are easily accessible for ongoing inspection and documentation of weight change, thickness, cracking, and other metrics. Temperature measurement is available in the form of both thermocouples and optical pyrometry, and the facility is equipped with quartz windows for observation and videotaping. Operating conditions include: (1) 1.0 kg/sec (2.0 lbm/sec) combustion and secondary cooling airflow capability; (2) equivalence ratios of 0.5-1.0 (lean) to 1.5-2.0 (rich), with typically 10% H2O vapor pressure; (3) gas temperatures ranging 700-1650 C (1300-3000 F); (4) test pressures ranging 4-12 atmospheres; (5) gas flow velocities ranging 10-30 m/s (50-100 ft/sec); and (6) cyclic and steady-state exposure capabilities. The facility has historically been used to test coupon-size materials, including metals and ceramics. However, complex-shaped components have also been tested, including cylinders, airfoils, and film-cooled end walls. The facility has also been used to develop thin-film temperature measurement sensors.
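Since the rig's operating envelope is quoted in equivalence ratios, here is a minimal sketch of that quantity; the stoichiometric fuel/air ratio for jet fuel is an assumed textbook value, not a facility number.

```python
# Hedged sketch: computing the equivalence ratio (phi) quoted in the HPBR
# operating conditions. The stoichiometric fuel/air mass ratio for jet fuel
# (~0.068 for Jet-A) is an assumption on my part, not a facility value.
JET_A_STOICH_FA = 0.068  # approximate stoichiometric fuel/air mass ratio

def equivalence_ratio(fuel_flow_kg_s: float, air_flow_kg_s: float,
                      stoich_fa: float = JET_A_STOICH_FA) -> float:
    """phi = (F/A)_actual / (F/A)_stoichiometric.

    phi < 1 is fuel-lean operation, phi > 1 is fuel-rich operation.
    """
    return (fuel_flow_kg_s / air_flow_kg_s) / stoich_fa

# Example: 1.0 kg/s of air (the rig's quoted airflow) with 0.05 kg/s of fuel
print(f"phi = {equivalence_ratio(0.05, 1.0):.2f}")  # ~0.74, fuel-lean
```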
NASA Astrophysics Data System (ADS)
Carrico, T.; Langster, T.; Carrico, J.; Alfano, S.; Loucks, M.; Vallado, D.
The authors present several spacecraft rendezvous and close proximity maneuvering techniques modeled with a high-precision numerical integrator using full force models and closed-loop control with a Fuzzy Logic intelligent controller to command the engines. The authors document and compare the maneuvers, fuel use, and other parameters. This paper presents an innovative application of an existing capability to design, simulate, and analyze proximity maneuvers, already in use for operational satellites performing other maneuvers. The system has been extended to demonstrate the capability to develop closed-loop control laws to maneuver spacecraft in close proximity to one another, including stand-off, docking, lunar landing, and other operations applicable to space situational awareness, space-based surveillance, and operational satellite modeling. The fully integrated end-to-end trajectory ephemerides are available from the authors in electronic ASCII text by request. The benefits of this system include: (1) a realistic physics-based simulation for the development and validation of control laws; (2) a collaborative engineering environment for the design, development, and tuning of spacecraft control law parameters, sizing of actuators (i.e., rocket engines), and sensor suite selection; (3) an accurate simulation and visualization to communicate the complexity, criticality, and risk of spacecraft operations; (4) a precise mathematical environment for research and development of future spacecraft maneuvering engineering tasks, operational planning, and forensic analysis; and (5) a closed-loop, knowledge-based control example for proximity operations. This proximity operations modeling and simulation environment will provide a valuable adjunct to programs in military space control, space situational awareness, and civil space exploration engineering and decision-making processes.
Evolution of Software-Only-Simulation at NASA IV and V
NASA Technical Reports Server (NTRS)
McCarty, Justin; Morris, Justin; Zemerick, Scott
2014-01-01
Software-Only-Simulations are an emerging and quickly developing field of study throughout NASA. The NASA Independent Verification & Validation (IV&V) Independent Test Capability (ITC) team has been rapidly building a collection of simulators for a wide range of NASA missions. ITC specializes in full end-to-end simulations that enable developers, V&V personnel, and operators to test-as-you-fly. In four years, the team has delivered a wide variety of spacecraft simulations that have ranged from low complexity science missions such as the Global Precipitation Measurement (GPM) satellite and the Deep Space Climate Observatory (DSCOVR), to extremely complex missions such as the James Webb Space Telescope (JWST) and the Space Launch System (SLS). This paper describes the evolution of ITC's technologies and processes that have been utilized to design, implement, and deploy end-to-end simulation environments for various NASA missions. A comparison of mission simulators is presented, with focus on technology and lessons learned in complexity, hardware modeling, and continuous integration. The paper also describes the methods for executing the missions' unmodified flight software binaries (not cross-compiled) for verification and validation activities.
NASA Astrophysics Data System (ADS)
Wilms, Joern; Guenther, H. Moritz; Dauser, Thomas; Huenemoerder, David P.; Ptak, Andrew; Smith, Randall; Arcus Team
2018-01-01
We present an overview of the end-to-end simulation environment that we are implementing as part of the Arcus Phase A study. With the Arcus simulator, we aim to model the imaging, detection, and event reconstruction properties of the spectrometer. The simulator uses a Monte Carlo ray-trace approach, projecting photons onto the Arcus focal plane from the silicon pore optic mirrors and critical-angle transmission gratings. We simulate the detection and read-out of the photons in the focal plane CCDs with software originally written for the eROSITA and Athena-WFI detectors; we include all relevant detector physics, such as charge splitting, and effects of the detector read-out, such as out-of-time events. The output of the simulation chain is an event list that closely resembles the data expected during flight. This event list is processed using a prototype event reconstruction chain for the order separation, wavelength calibration, and effective area calibration. The output is compatible with standard X-ray astronomical analysis software. During Phase A, the end-to-end simulation approach is used to demonstrate the overall performance of the mission, including a full simulation of the calibration effort. Continued development during later phases of the mission will ensure that the simulator remains a faithful representation of the true mission capabilities, and will ultimately be used as the Arcus calibration model.
Simulation of Forward and Inverse X-ray Scattering From Shocked Materials
NASA Astrophysics Data System (ADS)
Barber, John; Marksteiner, Quinn; Barnes, Cris
2012-02-01
The next generation of high-intensity, coherent light sources should generate sufficient brilliance to perform in-situ coherent x-ray diffraction imaging (CXDI) of shocked materials. In this work, we present beginning-to-end simulations of this process. This includes the calculation of the partially-coherent intensity profiles of self-amplified stimulated emission (SASE) x-ray free electron lasers (XFELs), as well as the use of simulated, shocked molecular-dynamics-based samples to predict the evolution of the resulting diffraction patterns. In addition, we will explore the corresponding inverse problem by performing iterative phase retrieval to generate reconstructed images of the simulated sample. The development of these methods in the context of materials under extreme conditions should provide crucial insights into the design and capabilities of shocked in-situ imaging experiments.
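As a hedged sketch of the inverse step described above, the error-reduction variant of iterative phase retrieval can be written in a few lines of numpy; this is the generic textbook algorithm, not the authors' code.

```python
# Hedged sketch of iterative phase retrieval of the kind used for CXDI
# reconstructions (Fienup-style error reduction); generic algorithm only.
import numpy as np

def error_reduction(fourier_magnitude: np.ndarray,
                    support: np.ndarray,
                    n_iter: int = 200,
                    seed: int = 0) -> np.ndarray:
    """Recover a real, non-negative object from its Fourier magnitude.

    fourier_magnitude: measured |F(object)| (same shape as the object grid)
    support: boolean mask of pixels where the object may be nonzero
    """
    rng = np.random.default_rng(seed)
    # Start from random phases consistent with the measured magnitude.
    phase = np.exp(2j * np.pi * rng.random(fourier_magnitude.shape))
    g = np.fft.ifft2(fourier_magnitude * phase).real
    for _ in range(n_iter):
        G = np.fft.fft2(g)
        # Fourier-domain constraint: keep phases, impose measured magnitude.
        G = fourier_magnitude * np.exp(1j * np.angle(G))
        g = np.fft.ifft2(G).real
        # Object-domain constraints: support and non-negativity.
        g = np.where(support & (g > 0), g, 0.0)
    return g
```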
A virtual reality system for the training of volunteers involved in health emergency situations.
De Leo, Gianluca; Ponder, Michal; Molet, Tom; Fato, Marco; Thalmann, Daniel; Magnenat-Thalmann, Nadia; Bermano, Francesco; Beltrame, Francesco
2003-06-01
In order to guarantee an effective and punctual medical intervention for injured people involved in health emergency situations, where usually both professional and non-professional health operators are involved, fast and accurate treatment has to be carried out. In case of catastrophic or very critical situations, non-professional operators who did not receive proper training (volunteers are among them) could be affected by psychological inhibitions. Their performance could slow down in such a way that would affect the quality of the treatment and increase both direct and indirect costs. Our virtual reality system, which is currently in use at the health care emergency center of San Martino Hospital in Genoa, Italy, has been designed and developed to check health emergency operators' capabilities to adopt correct decision-making procedures, to make optimal use of new technological equipment, and to overcome psychological barriers. Our system is composed of (1) a high-end simulation PC, whose main functions are execution of the main software module, rendering of 3D scenes in stereo mode, rendering of sound, and control of data transmission from/to VR devices; (2) a low-end control PC, which controls the VR simulation running on the simulation PC, manages medical emergency simulation scenarios, introduces unexpected events to the simulation, and controls the simulation difficulty level; (3) a magnetic-based motion tracking device used for head and hand tracking; (4) a wireless pair of shutter glasses together with a cathode ray tube wall projector; and (5) a high-end surround sound system. The expected benefits have been verified through the design and implementation of controlled clinical trials.
Multi-threaded ATLAS simulation on Intel Knights Landing processors
NASA Astrophysics Data System (ADS)
Farrell, Steven; Calafiura, Paolo; Leggett, Charles; Tsulaia, Vakhtang; Dotti, Andrea; ATLAS Collaboration
2017-10-01
The Knights Landing (KNL) release of the Intel Many Integrated Core (MIC) Xeon Phi line of processors is a potential game changer for HEP computing. With 72 cores and deep vector registers, the KNL cards promise significant performance benefits for highly-parallel, compute-heavy applications. Cori, the newest supercomputer at the National Energy Research Scientific Computing Center (NERSC), was delivered to its users in two phases with the first phase online at the end of 2015 and the second phase now online at the end of 2016. Cori Phase 2 is based on the KNL architecture and contains over 9000 compute nodes with 96GB DDR4 memory. ATLAS simulation with the multithreaded Athena Framework (AthenaMT) is a good potential use-case for the KNL architecture and supercomputers like Cori. ATLAS simulation jobs have a high ratio of CPU computation to disk I/O and have been shown to scale well in multi-threading and across many nodes. In this paper we will give an overview of the ATLAS simulation application with details on its multi-threaded design. Then, we will present a performance analysis of the application on KNL devices and compare it to a traditional x86 platform to demonstrate the capabilities of the architecture and evaluate the benefits of utilizing KNL platforms like Cori for ATLAS production.
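A hedged aside on why thread scaling dominates this kind of evaluation: Amdahl's law bounds the speedup a 72-core KNL part can deliver, and even small serial fractions matter. The parallel fractions below are illustrative, not ATLAS measurements.

```python
# Hedged aside: a quick Amdahl's-law estimate of the thread scaling a
# compute-heavy simulation job can hope for on a many-core device such as
# KNL. The parallel fractions are illustrative, not ATLAS numbers.
def amdahl_speedup(parallel_fraction: float, n_threads: int) -> float:
    """Ideal speedup when `parallel_fraction` of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_threads)

for p in (0.95, 0.99):
    print(f"p={p}: 72 threads -> {amdahl_speedup(p, 72):.1f}x")
# p=0.95 yields ~15.8x on 72 cores (asymptotic cap 20x); p=0.99 reaches
# ~42x, which is why minimizing serial sections matters so much on KNL.
```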
An efficient group multicast routing for multimedia communication
NASA Astrophysics Data System (ADS)
Wang, Yanlin; Sun, Yugen; Yan, Xinfang
2004-04-01
Group multicasting is a kind of communication mechanism whereby each member of a group sends messages to all the other members of the same group. Group multicast routing algorithms capable of satisfying quality of service (QoS) requirements of multimedia applications are essential for high-speed networks. We present a heuristic algorithm for group multicast routing with an end-to-end delay constraint. Source-specific routing trees for each member are generated by our algorithm, which satisfy each member's bandwidth and end-to-end delay requirements. Simulations over random networks were carried out to compare the proposed algorithm's performance with that of Low and Song's. The experimental results show that our proposed algorithm performs better in terms of network cost and ability to construct feasible multicast trees for group members. Moreover, our algorithm achieves good performance in balancing traffic, which can avoid link blocking and enhance network behavior efficiently.
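A hedged sketch of the kind of delay-constrained routing step such heuristics build on follows; it is a generic formulation (not Low and Song's algorithm or the authors' own), choosing the least-cost path when it meets the delay bound and falling back to the least-delay path otherwise.

```python
# Hedged, generic delay-constrained routing step. Edges carry (cost, delay);
# take the least-cost path that respects the end-to-end delay bound, else
# fall back to the least-delay path.
import heapq

def shortest_path(graph, src, dst, weight):
    """Dijkstra over graph[node] = [(neighbor, cost, delay), ...]."""
    dist, best = {src: 0.0}, {}
    heap = [(0.0, src, None)]
    while heap:
        d, node, parent = heapq.heappop(heap)
        if node in best:
            continue
        best[node] = parent
        if node == dst:
            break
        for nbr, cost, delay in graph.get(node, ()):
            w = d + weight(cost, delay)
            if w < dist.get(nbr, float("inf")):
                dist[nbr] = w
                heapq.heappush(heap, (w, nbr, node))
    if dst not in best:
        return None
    path, node = [], dst
    while node is not None:
        path.append(node)
        node = best.get(node)
    return path[::-1]

def path_delay(graph, path):
    lookup = {(u, v): dly for u in graph for v, _, dly in graph[u]}
    return sum(lookup[(u, v)] for u, v in zip(path, path[1:]))

def delay_constrained_route(graph, src, dst, max_delay):
    path = shortest_path(graph, src, dst, weight=lambda c, d: c)
    if path and path_delay(graph, path) <= max_delay:
        return path
    return shortest_path(graph, src, dst, weight=lambda c, d: d)
```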
NASA Astrophysics Data System (ADS)
Kolb, Kimberly E.; Choi, Hee-sue S.; Kaur, Balvinder; Olson, Jeffrey T.; Hill, Clayton F.; Hutchinson, James A.
2016-05-01
The US Army's Communications Electronics Research, Development and Engineering Center (CERDEC) Night Vision and Electronic Sensors Directorate (referred to as NVESD) is developing a virtual detection, recognition, and identification (DRI) testing methodology using simulated imagery as a means of augmenting the field testing component of sensor performance evaluation, which is expensive, resource intensive, time consuming, and limited to the available target(s) and existing atmospheric visibility and environmental conditions at the time of testing. Existing simulation capabilities such as the Digital Imaging Remote Sensing Image Generator (DIRSIG) and NVESD's Integrated Performance Model Image Generator (NVIPM-IG) can be combined with existing detection algorithms to reduce cost/time, minimize testing risk, and allow virtual/simulated testing using full spectral and thermal object signatures, as well as those collected in the field. NVESD has developed an end-to-end capability to demonstrate the feasibility of this approach. Simple detection algorithms have been used on the degraded images generated by NVIPM-IG to determine the relative performance of the algorithms on both DIRSIG-simulated and collected images. Evaluating the degree to which the algorithm performance agrees between simulated versus field collected imagery is the first step in validating the simulated imagery procedure.
Dynamic Emulation of NASA Missions for IVandV: A Case Study of JWST and SLS
NASA Technical Reports Server (NTRS)
Yokum, Steve
2015-01-01
Software-Only-Simulations are an emerging but quickly developing field of study throughout NASA. The NASA Independent Verification & Validation (IV&V) Independent Test Capability (ITC) team has been rapidly building a collection of simulators for a wide range of NASA missions. ITC specializes in full end-to-end simulations that enable developers, V&V personnel, and operators to test-as-you-fly. In four years, the team has delivered a wide variety of spacecraft simulations ranging from low complexity science missions such as the Global Precipitation Measurement (GPM) satellite and the Deep Space Climate Observatory (DSCOVR), to extremely complex missions such as the James Webb Space Telescope (JWST) and Space Launch System (SLS).
An Overview of the Distributed Space Exploration Simulation (DSES) Project
NASA Technical Reports Server (NTRS)
Crues, Edwin Z.; Chung, Victoria I.; Blum, Michael G.; Bowman, James D.
2007-01-01
This paper describes the Distributed Space Exploration Simulation (DSES) Project, a research and development collaboration between NASA centers which investigates technologies and processes related to integrated, distributed simulation of complex space systems in support of NASA's Exploration Initiative. In particular, it describes the three major components of DSES: network infrastructure, software infrastructure, and simulation development. With regard to network infrastructure, DSES is developing a Distributed Simulation Network for use by all NASA centers. With regard to software, DSES is developing software models, tools, and procedures that streamline distributed simulation development and provide an interoperable infrastructure for agency-wide integrated simulation. Finally, with regard to simulation development, DSES is developing an integrated end-to-end simulation capability to support NASA development of new exploration spacecraft and missions. This paper presents the current status and plans for these three areas, including examples of specific simulations.
SIMNET: an insider's perspective
NASA Astrophysics Data System (ADS)
Cosby, L. Neale
1995-04-01
Simulator Networking (SIMNET) began with a young scientist's idea but has ended up changing an entire industry and the way the military does business. And the story isn't over yet. SIMNET began as an advanced research project aimed at developing a core technology for networking hundreds of affordable simulators worldwide in real time to practice joint collective warfighting skills and to develop better acquisition practices. It was a daring project that proved the Advanced Research Projects Agency (ARPA) mission of doing "what cannot be done." It was a serious threat to the existing simulation industry. As it turned out, the government got what it wanted—a low-cost, high-performance virtual simulation capability that could be proliferated like consumer electronics. This paper provides an insider's view of the program history, identifies some possible lessons for future developers, and opines future growth for SIMNET technology.
Investigation of a compact coaxially fed switched oscillator
NASA Astrophysics Data System (ADS)
Wang, Yuwei; Chen, Dongqun; Zhang, Jiande; Cao, Shengguang; Li, Da; Liu, Chebo
2013-09-01
To generate a relatively high-frequency mesoband microwave, a compact coaxially fed transmission line switched oscillator with high voltage capability is investigated. The characteristic impedance and voltage capability of the low impedance transmission line (LITL) have been analyzed. It is shown that the working voltage of the oscillator can reach up to 200 kV when it is filled with pressurized nitrogen and charged by a nanosecond driving source. By utilizing a commercial electromagnetic simulation code, the transient performance of the switched oscillator with a lumped resistance load is simulated. It is shown that the center frequency of the output signal reaches up to ˜0.6 GHz when the spark gap closes with a single channel. In addition, the influence of the closing mode and speed of the spark gap, the permittivity of the insulator at the output end of the LITL, and the load impedance on the transient performance of the designed oscillator has been analyzed quantitatively. Finally, the good transient performance of the switched oscillator has been preliminarily verified by experiment.
NASA Technical Reports Server (NTRS)
Salmasi, A. B. (Editor); Springett, J. C.; Sumida, J. T.; Richter, P. H.
1984-01-01
The design and implementation of the Land Mobile Satellite Service (LMSS) channel simulator as a facility for end-to-end hardware simulation of the LMSS communications links, primarily with respect to the mobile terminal, is described. A number of studies are reported which show the applications of the channel simulator as a facility for validation and assessment of the LMSS design requirements and capabilities, by performing quantitative measurements and qualitative audio evaluations for various link design parameters and channel impairments under simulated LMSS operating conditions. As a first application, the LMSS channel simulator was used in the evaluation of a system based on the voice processing and modulation (e.g., NBFM with 30 kHz of channel spacing and a 2 kHz rms frequency deviation for average talkers) selected for the Bell System's Advanced Mobile Phone Service (AMPS). The various details of the hardware design, qualitative audio evaluation techniques, signal-to-channel-impairment measurement techniques, the justifications for the selection of different parameters with regard to the voice processing and modulation methods, and the results of a number of parametric studies are further described.
End-To-End Simulation of Launch Vehicle Trajectories Including Stage Separation Dynamics
NASA Technical Reports Server (NTRS)
Albertson, Cindy W.; Tartabini, Paul V.; Pamadi, Bandu N.
2012-01-01
The development of methodologies, techniques, and tools for analysis and simulation of stage separation dynamics is critically needed for successful design and operation of multistage reusable launch vehicles. As a part of this activity, the Constraint Force Equation (CFE) methodology was developed and implemented in the Program to Optimize Simulated Trajectories II (POST2). The objective of this paper is to demonstrate the capability of POST2/CFE to simulate a complete end-to-end mission. The vehicle configuration selected was the Two-Stage-To-Orbit (TSTO) Langley Glide Back Booster (LGBB) bimese configuration, an in-house concept consisting of a reusable booster and an orbiter having identical outer mold lines. The proximity and isolated aerodynamic databases used for the simulation were assembled using wind-tunnel test data for this vehicle. POST2/CFE simulation results are presented for the entire mission, from lift-off, through stage separation, orbiter ascent to orbit, and booster glide back to the launch site. Additionally, POST2/CFE stage separation simulation results are compared with results from industry standard commercial software used for solving dynamics problems involving multiple bodies connected by joints.
A system for the simulation and evaluation of satellite communication networks
NASA Technical Reports Server (NTRS)
Bagwell, J. W.
1983-01-01
With the emergence of a new era in satellite communications brought about by NASA's thrust into the Ka band with multibeam and onboard processing technologies, new and innovative techniques for evaluating these concepts and systems are required. To this end, NASA, in conjunction with its extensive program for advanced communications technology development, has undertaken to develop a concept for the simulation and evaluation of a complete communications network. Incorporated in this network will be proof-of-concept models of the latest technologies proposed for future satellite communications systems. These include low noise receivers, matrix switches, baseband processors, and solid state and tube type high power amplifiers. To accomplish this, numerous supporting technologies must be added to those aforementioned proof-of-concept models. These include controllers for synchronization, order wire, and resource allocation, gain compensation, signal leveling, power augmentation, and rain fade and range delay simulation. Taken together, these will be assembled to comprise a system capable of addressing numerous design and performance questions. The simulation and evaluation system as planned will be modular in design and implementation, capable of modification and updating to track and evaluate a continuum of emerging concepts and technologies.
The TeraShake Computational Platform for Large-Scale Earthquake Simulations
NASA Astrophysics Data System (ADS)
Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas
Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for larger and more complex problems. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
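As a hedged illustration of the numerical core of such a code, reduced far below TS-AWP's 3D anelastic physics, a 1D second-order finite-difference wave propagation loop looks like this:

```python
# Hedged sketch: the core update of a finite-difference wave propagation
# code, reduced to a 1D acoustic wave equation with a simple second-order
# scheme (illustrative only; the real code is 3D, anelastic, and parallel).
import numpy as np

nx, nt = 400, 800
dx, dt, c = 10.0, 1e-3, 3000.0       # grid spacing (m), step (s), speed (m/s)
assert c * dt / dx <= 1.0            # CFL stability condition

u_prev = np.zeros(nx)                # wavefield at t - dt
u_curr = np.zeros(nx)                # wavefield at t
u_curr[nx // 2] = 1.0                # impulsive source in the middle

r2 = (c * dt / dx) ** 2
for _ in range(nt):
    u_next = np.zeros(nx)
    # Interior update: u_tt = c^2 u_xx discretized with central differences.
    u_next[1:-1] = (2 * u_curr[1:-1] - u_prev[1:-1]
                    + r2 * (u_curr[2:] - 2 * u_curr[1:-1] + u_curr[:-2]))
    u_prev, u_curr = u_curr, u_next  # fixed (Dirichlet) boundaries stay zero
print(f"peak amplitude after {nt} steps: {np.abs(u_curr).max():.3e}")
```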
Simulating optoelectronic systems for remote sensing with SENSOR
NASA Astrophysics Data System (ADS)
Boerner, Anko
2003-04-01
The consistent end-to-end simulation of airborne and spaceborne remote sensing systems is an important task and sometimes the only way for the adaptation and optimization of a sensor and its observation conditions, the choice and test of algorithms for data processing, error estimation and the evaluation of the capabilities of the whole sensor system. The presented software simulator SENSOR (Software ENvironment for the Simulation of Optical Remote sensing systems) includes a full model of the sensor hardware, the observed scene, and the atmosphere in between. It allows the simulation of a wide range of optoelectronic systems for remote sensing. The simulator consists of three parts. The first part describes the geometrical relations between scene, sun, and the remote sensing system using a ray tracing algorithm. The second part of the simulation environment considers the radiometry. It calculates the at-sensor radiance using a pre-calculated multidimensional lookup-table taking the atmospheric influence on the radiation into account. Part three consists of an optical and an electronic sensor model for the generation of digital images. Using SENSOR for an optimization requires the additional application of task-specific data processing algorithms. The principle of the end-to-end-simulation approach is explained, all relevant concepts of SENSOR are discussed, and examples of its use are given. The verification of SENSOR is demonstrated.
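A minimal sketch of the lookup-table radiometry step (part two above), assuming a 1D slice of the table for a fixed atmosphere and geometry; the values are invented placeholders, and the real SENSOR table is multidimensional:

```python
# Hedged sketch of the lookup-table radiometry step: the at-sensor radiance
# is interpolated from a pre-calculated table rather than re-running a
# radiative transfer code per pixel. Table values are invented placeholders.
import numpy as np

# Pre-calculated axis: surface reflectance -> at-sensor radiance
# (W m^-2 sr^-1 um^-1) for one fixed atmosphere/geometry/wavelength.
reflectance_grid = np.array([0.0, 0.1, 0.2, 0.4, 0.6, 0.8])
radiance_grid = np.array([8.1, 19.5, 30.8, 53.2, 75.4, 97.3])

def at_sensor_radiance(reflectance: np.ndarray) -> np.ndarray:
    """Linear interpolation into the 1D slice of the lookup table."""
    return np.interp(reflectance, reflectance_grid, radiance_grid)

scene = np.random.default_rng(1).uniform(0.05, 0.5, size=(4, 4))
print(at_sensor_radiance(scene))
```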
The Distributed Space Exploration Simulation (DSES)
NASA Technical Reports Server (NTRS)
Crues, Edwin Z.; Chung, Victoria I.; Blum, Mike G.; Bowman, James D.
2007-01-01
The paper describes the Distributed Space Exploration Simulation (DSES) Project, a research and development collaboration between NASA centers which focuses on the investigation and development of technologies, processes, and integrated simulations related to the collaborative distributed simulation of complex space systems in support of NASA's Exploration Initiative. This paper describes the three major components of DSES: network infrastructure, software infrastructure, and simulation development. In the network work area, DSES is developing a Distributed Simulation Network that will provide agency-wide support for distributed simulation between all NASA centers. In the software work area, DSES is developing a collection of software models, tools, and procedures that ease the burden of developing distributed simulations and provide a consistent interoperability infrastructure for agency-wide participation in integrated simulation. Finally, for simulation development, DSES is developing an integrated end-to-end simulation capability to support NASA development of new exploration spacecraft and missions. This paper will present current status and plans for each of these work areas with specific examples of simulations that support NASA's exploration initiatives.
Space Power Facility-Capabilities for Space Environmental Testing Within a Single Facility
NASA Technical Reports Server (NTRS)
Sorge, Richard N.
2013-01-01
The purpose of this paper is to describe the current and near-term environmental test capabilities of the NASA Glenn Research Center's Space Power Facility (SPF) located at Sandusky, Ohio. The paper will present current and near-term capabilities for conducting electromagnetic interference and compatibility testing, base-shake sinusoidal vibration testing, reverberant acoustic testing, and thermal-vacuum testing. The paper will also present modes of transportation, handling, ambient environments, and operations within the facility to conduct those tests. The SPF is in the midst of completing and activating new or refurbished capabilities which, when completed, will provide the ability to conduct most or all required full-scale end-assembly space simulation tests at a single test location. It is envisioned that the capabilities will allow a customer to perform a wide range of space simulation tests in one facility at reasonable cost.
V&V Of CFD Modeling Of The Argonne Bubble Experiment: FY15 Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoyt, Nathaniel C.; Wardle, Kent E.; Bailey, James L.
2015-09-30
In support of the development of accelerator-driven production of the fission product Mo-99, computational fluid dynamics (CFD) simulations of an electron-beam irradiated, experimental-scale bubble chamber have been conducted in order to aid in interpretation of existing experimental results, provide additional insights into the physical phenomena, and develop predictive thermal hydraulic capabilities that can be applied to full-scale target solution vessels. Toward that end, a custom hybrid Eulerian-Eulerian-Lagrangian multiphase solver was developed, and simulations have been performed on high-resolution meshes. Good agreement between experiments and simulations has been achieved, especially with respect to the prediction of the maximum temperature of the uranyl sulfate solution in the experimental vessel. These positive results suggest that the simulation methodology that has been developed will prove to be suitable to assist in the development of full-scale production hardware.
Recent Developments in Hardware-in-the-Loop Formation Navigation and Control
NASA Technical Reports Server (NTRS)
Mitchell, Jason W.; Luquette, Richard J.
2005-01-01
The Formation Flying Test-Bed (FFTB) at NASA Goddard Space Flight Center (GSFC) provides a hardware-in-the-loop test environment for formation navigation and control. The facility is evolving as a modular, hybrid, dynamic simulation facility for end-to-end guidance, navigation, and control (GN&C) design and analysis of formation flying spacecraft. The core capabilities of the FFTB, as a platform for testing critical hardware and software algorithms in-the-loop, are reviewed with a focus on many recent improvements. Two significant upgrades to the FFTB are a message-oriented middleware (MOM) architecture and a software crosslink for inter-spacecraft ranging. The MOM architecture provides a common messaging bus for software agents, easing integration and supporting the GSFC Mission Services Evolution Center (GMSEC) architecture via a software bridge. Additionally, the FFTB's hardware capabilities are expanding. Recently, two Low-Power Transceivers (LPTs) with ranging capability have been introduced into the FFTB. The LPT crosslinks will be connected to a modified Crosslink Channel Simulator (CCS), which applies realistic space-environment effects to the Radio Frequency (RF) signals produced by the LPTs.
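As a hedged sketch of the effects a crosslink channel simulator must apply, the snippet below imposes light-time delay, free-space path loss, and Doppler on complex baseband samples; the sample-based model and all numbers are illustrative assumptions, not the CCS design.

```python
# Hedged sketch of RF path emulation: light-time delay, free-space path
# loss, and Doppler applied to complex baseband samples. Illustrative only.
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def emulate_path(samples: np.ndarray, fs: float, f_carrier: float,
                 range_m: float, range_rate_mps: float) -> np.ndarray:
    """Apply delay, free-space loss, and Doppler to complex baseband samples."""
    # Free-space path loss (power) expressed as an amplitude scaling.
    wavelength = C / f_carrier
    fspl_amplitude = wavelength / (4.0 * np.pi * range_m)
    # Integer-sample approximation of the light-time delay.
    delay_samples = int(round(range_m / C * fs))
    delayed = np.concatenate([np.zeros(delay_samples, dtype=complex),
                              samples])[:len(samples)]
    # Doppler shift from the range rate (receding -> negative shift).
    f_doppler = -range_rate_mps / C * f_carrier
    t = np.arange(len(delayed)) / fs
    return fspl_amplitude * delayed * np.exp(2j * np.pi * f_doppler * t)

fs = 1e6
tone = np.exp(2j * np.pi * 10e3 * np.arange(10000) / fs)  # 10 kHz test tone
out = emulate_path(tone, fs, f_carrier=2.2e9, range_m=50e3, range_rate_mps=10)
```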
Development of the NTF-117S Semi-Span Balance
NASA Technical Reports Server (NTRS)
Lynn, Keith C.
2010-01-01
A new high-capacity semi-span force and moment balance has recently been developed for use at the National Transonic Facility (NTF) at the NASA Langley Research Center. This new semi-span balance provides the NTF a new measurement capability that will support testing of semi-span models in transonic high-lift testing regimes. Future testing utilizing this new balance capability will include active circulation control and propulsion simulation testing of semi-span transonic wing models. The NTF has recently implemented a new high-pressure air delivery station that will provide both high and low mass flow pressure lines, routed out to the semi-span models via a set of high/low pressure bellows indirectly linked to the metric end of the NTF-117S balance. A new check-load stand is currently being developed to provide the NTF with an in-house capability for performing check-loads on the NTF-117S balance in order to determine the pressure tare effects on the overall performance of the balance. An experimental design is being developed that will allow for experimentally assessing the static pressure tare effects on the balance performance.
Simulation capability for dynamics of two-body flexible satellites
NASA Technical Reports Server (NTRS)
Austin, F.; Zetkov, G.
1973-01-01
An analysis and computer program were prepared to realistically simulate the dynamic behavior of a class of satellites consisting of two end bodies separated by a connecting structure. The shape and mass distribution of the flexible end bodies are arbitrary; the connecting structure is flexible but massless and is capable of deployment and retraction. Fluid flowing in a piping system and rigid moving masses, representing a cargo elevator or crew members, have been modeled. Connecting structure characteristics, control systems, and externally applied loads are modeled in easily replaced subroutines. Subroutines currently available include a telescopic beam-type connecting structure as well as attitude, deployment, spin and wobble control. In addition, a unique mass balance control system was developed to sense and balance mass shifts due to the motion of a cargo elevator. The mass of the cargo may vary through a large range. Numerical results are discussed for various types of runs.
Simulator Evaluation of Runway Incursion Prevention Technology for General Aviation Operations
NASA Technical Reports Server (NTRS)
Jones, Denise R.; Prinzel, Lawrence J., III
2011-01-01
A Runway Incursion Prevention System (RIPS) has been designed under previous research to enhance airport surface operations situation awareness and provide cockpit alerts of potential runway conflict, during transport aircraft category operations, in order to prevent runway incidents while also improving operations capability. This study investigated an adaptation of RIPS for low-end general aviation operations using a fixed-based simulator at the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC). The purpose of the study was to evaluate modified RIPS aircraft-based incursion detection algorithms and associated alerting and airport surface display concepts for low-end general aviation operations. This paper gives an overview of the system, simulation study, and test results.
Time-Dependent Simulations of Turbopump Flows
NASA Technical Reports Server (NTRS)
Kiris, Cetin; Kwak, Dochan; Chan, William; Williams, Robert
2002-01-01
Unsteady flow simulations of the RLV (Reusable Launch Vehicle) 2nd Generation baseline turbopump for one and a half impeller rotations have been completed using a 34.3-million-grid-point model. MLP (Multi-Level Parallelism) shared memory parallelism has been implemented in INS3D and benchmarked. Code optimization for cache-based platforms will be completed by the end of September 2001. Moving boundary capability is obtained by using the DCF module. Scripting capability from CAD (computer-aided design) geometry to solution has been developed. Data compression is applied to reduce data size in post-processing. Fluid/structure coupling has been initiated.
Scarlata, Simone; Palermo, Patrizio; Candoli, Piero; Tofani, Ariela; Petitti, Tommasangelo; Corbetta, Lorenzo
2017-04-01
Linear endobronchial ultrasound transbronchial needle aspiration (EBUS-TBNA) represents a pivotal innovation in interventional pulmonology; determining the best approach to guarantee systematic and efficient training is expected to become a main issue in the forthcoming years. Virtual reality simulators have been proposed as potential EBUS-TBNA training instruments, to avoid unskilled beginners practicing directly in real-life settings. A validated and perfected simulation program could be used before allowing beginners to practice on patients. Our goal was to test the reliability of the EBUS-Skills and Task Assessment Tool (STAT) and its subscores for measuring the competence of experienced bronchoscopists approaching EBUS-guided TBNA, using only the virtual reality simulator as both a training and an assessment tool. Fifteen experienced bronchoscopists, with poor or no experience in EBUS-TBNA, participated in this study. They were all administered the Italian version of the EBUS-STAT evaluation tool, during a high-fidelity virtual reality simulation. This was followed by a single 7-hour theoretical and practical (on simulators) session on EBUS-TBNA, at the end of which their skills were reassessed by EBUS-STAT. An overall, significant improvement in EBUS-TBNA skills was observed, thereby confirming that (a) virtual reality simulation can facilitate practical learning among practitioners, and (b) EBUS-STAT is capable of detecting these improvements. The test's overall ability to detect differences was negatively influenced by the minimal variation of the scores relating to items 1 and 2, was not influenced by the training, and improved significantly when the 2 items were not considered. Apart from these 2 items, all the remaining subscores were equally capable of revealing improvements in the learner. Lastly, we found that trainees with presimulation EBUS-STAT scores above 79 did not show any significant improvement after virtual reality training, suggesting that this score represents a cutoff value capable of predicting the likelihood that simulation can be beneficial. Virtual reality simulation is capable of providing a practical learning tool for practitioners with previous experience in flexible bronchoscopy, and the EBUS-STAT questionnaire is capable of detecting these changes. A pretraining EBUS-STAT score below 79 is a good indicator of those candidates who will benefit from the simulation training. Further studies are needed to verify whether a modified version of the questionnaire would be capable of improving its performance among experienced bronchoscopists.
Rigorous vector wave propagation for arbitrary flat media
NASA Astrophysics Data System (ADS)
Bos, Steven P.; Haffert, Sebastiaan Y.; Keller, Christoph U.
2017-08-01
Precise modelling of the (off-axis) point spread function (PSF) to identify geometrical and polarization aberrations is important for many optical systems. In order to characterise the PSF of the system in all Stokes parameters, an end-to-end simulation of the system has to be performed in which Maxwell's equations are rigorously solved. We present the first results of a Python code that we are developing to perform multiscale end-to-end wave propagation simulations that include all relevant physics. Currently we can handle plane-parallel near- and far-field vector diffraction effects of propagating waves in homogeneous isotropic and anisotropic materials, refraction and reflection of flat parallel surfaces, interference effects in thin films and unpolarized light. We show that the code has a numerical precision on the order of 10^-16 for non-absorbing isotropic and anisotropic materials. For absorbing materials the precision is on the order of 10^-8. The capabilities of the code are demonstrated by simulating a converging beam reflecting from a flat aluminium mirror at normal incidence.
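For the flat-interface physics mentioned above, the Fresnel coefficients (with complex indices for absorbing media) can be checked in a few lines; the aluminium index used below is an approximate literature value, not the paper's:

```python
# Hedged sketch of flat-interface reflection: complex Fresnel coefficients
# for s- and p-polarization, valid for absorbing media via complex indices.
import numpy as np

def fresnel(n1: complex, n2: complex, theta_i: float):
    """Return (r_s, r_p) for a plane wave hitting a flat interface."""
    cos_i = np.cos(theta_i)
    # Snell's law with complex indices; cos(theta_t) via the principal root.
    sin_t = n1 * np.sin(theta_i) / n2
    cos_t = np.sqrt(1.0 - sin_t**2)
    r_s = (n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t)
    r_p = (n2 * cos_i - n1 * cos_t) / (n2 * cos_i + n1 * cos_t)
    return r_s, r_p

# Normal-incidence reflectance of aluminium at ~633 nm (n ~ 1.4 + 7.6j,
# an approximate literature value).
r_s, r_p = fresnel(1.0, 1.4 + 7.6j, theta_i=0.0)
print(f"R = {abs(r_s)**2:.3f}")  # ~0.91, close to aluminium's known ~91%
```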
The Kepler End-to-End Model: Creating High-Fidelity Simulations to Test Kepler Ground Processing
NASA Technical Reports Server (NTRS)
Bryson, Stephen T.; Jenkins, Jon M.; Peters, Dan J.; Tenenbaum, Peter P.; Klaus, Todd C.; Gunter, Jay P.; Cote, Miles T.; Caldwell, Douglas A.
2010-01-01
The Kepler mission is designed to detect the transit of Earth-like planets around Sun-like stars by observing 100,000 stellar targets. Developing and testing the Kepler ground-segment processing system, in particular the data analysis pipeline, requires high-fidelity simulated data. This simulated data is provided by the Kepler End-to-End Model (ETEM). ETEM simulates the astrophysics of planetary transits and other phenomena, properties of the Kepler spacecraft and the format of the downlinked data. Major challenges addressed by ETEM include the rapid production of large amounts of simulated data, extensibility and maintainability.
NASA Astrophysics Data System (ADS)
Kumar, S.; Peters-Lidard, C. D.; Harrison, K.; Santanello, J. A.; Bach Kirschbaum, D.
2014-12-01
Observing System Simulation Experiments (OSSEs) are often conducted to evaluate the worth of existing data and data yet to be collected from proposed new missions. As missions increasingly require a broader "Earth systems" focus, it is important that the OSSEs capture the potential benefits of the observations on end-use applications. Towards this end, the results from the OSSEs must also be evaluated with a suite of metrics that capture the value, uncertainty, and information content of the observations while factoring in both science and societal impacts. In this presentation, we present the development of an end-to-end, end-use application oriented OSSE platform using the capabilities of the NASA Land Information System (LIS) developed for terrestrial hydrology. Four case studies that demonstrate the capabilities of the system will be presented: (1) a soil moisture OSSE that employs simulated L-band measurements and examines their impacts on applications such as floods and droughts; the experiment also uses a decision-theory based analysis to assess the economic utility of observations towards improving drought and flood risk estimates; (2) a GPM-relevant study that quantifies the impact of improved precipitation retrievals from GPM on landslide forecasts; (3) a case study that examines the utility of passive microwave soil moisture observations for weather prediction; and (4) OSSEs used for developing science requirements for the GRACE-2 mission. These experiments also demonstrate the value of a comprehensive modeling environment such as LIS for conducting end-to-end OSSEs by linking satellite observations, physical models, data assimilation algorithms, and end-use application models in a single integrated framework.
Innovative Educational Aerospace Research at the Northeast High School Space Research Center
NASA Technical Reports Server (NTRS)
Luyet, Audra; Matarazzo, Anthony; Folta, David
1997-01-01
Northeast High Magnet School of Philadelphia, Pennsylvania is a proud sponsor of the Space Research Center (SPARC). SPARC, a model program of the Medical, Engineering, and Aerospace Magnet school, provides talented students the capability to successfully exercise full simulations of NASA manned missions. These simulations included low-Earth Shuttle missions and Apollo lunar missions in the past, and will focus on a planetary mission to Mars this year. At the end of each scholastic year, a simulated mission, lasting between one and eight days, is performed involving 75 students as specialists in seven teams. The teams are comprised of Flight Management, Spacecraft Communications (SatCom), Computer Networking, Spacecraft Design and Engineering, Electronics, Rocketry, Robotics, and Medical teams, in either the mission operations center or onboard the spacecraft. Software development activities are also required in support of these simulations. The objective of this paper is to present the accomplishments, technology innovations, interactions, and an overview of SPARC, with an emphasis on how the program's educational activities parallel NASA mission support and how this education is preparing students for the space frontier.
Enhanced modeling and simulation of EO/IR sensor systems
NASA Astrophysics Data System (ADS)
Hixson, Jonathan G.; Miller, Brian; May, Christopher
2015-05-01
The testing and evaluation process developed by the Night Vision and Electronic Sensors Directorate (NVESD) Modeling and Simulation Division (MSD) provides end-to-end systems evaluation, testing, and training of EO/IR sensors. By combining NV-LabCap, the Night Vision Integrated Performance Model (NV-IPM), One Semi-Automated Forces (OneSAF) input sensor file generation, and the Night Vision Image Generator (NVIG) capabilities, NVESD provides confidence to the M&S community that EO/IR sensor developmental and operational testing and evaluation are accurately represented throughout the lifecycle of an EO/IR system. This new process allows for both theoretical and actual sensor testing. A sensor can be theoretically designed and modeled in NV-IPM, and then seamlessly input into wargames for operational analysis. After theoretical design, prototype sensors can be measured using NV-LabCap, then modeled in NV-IPM and input into wargames for further evaluation. The measurement-to-high-fidelity modeling and simulation process can then be repeated again and again throughout the entire life cycle of an EO/IR sensor as needed, to include LRIP, full rate production, and even after depot-level maintenance. This is a prototypical example of how an engineering-level model and higher-level simulations can share models to mutual benefit.
Improved Load Alleviation Capability for the KC-135
1997-09-01
software, such as Matlab, Mathematica, Simulink, and the Robotica Front End for Mathematica available in the simulation laboratory ... This thesis report is ... outlined in Spong's text in order to utilize the Robotica system development software, which automates the process of calculating the kinematic and ... kinematic and dynamic equations can be accomplished using a computer tool called Robotica Front End (RFE) [15], developed by Doctor Spong.
NASA Astrophysics Data System (ADS)
Srimannarayana, K.; Vengal Rao, P.; Sai Shankar, M.; Kishore, P.
2014-05-01
A temperature-independent, highly sensitive pressure sensing system using a fiber Bragg grating (FBG) and a 'C'-shaped Bourdon tube (CBT) is demonstrated. The sensor is configured by firmly fixing the FBG (FBG1) between the free and fixed ends of the CBT. An additional FBG (FBG2) in line with FBG1 is introduced; it is shielded from the external pressure and measures only the ambient temperature fluctuations. The CBT has an elliptical cross section; its free end is sealed and its fixed end is open for admitting the liquid or gas whose pressure is to be measured. With the application of pressure, the free end of the CBT tends to straighten out, resulting in an axial strain in FBG1 that causes a red shift in the Bragg wavelength. The pressure can be determined by measuring the shift of the Bragg wavelength. The experimental pressure sensitivity is found to be 66.9 pm/psi over a range of 0 to 100 psi. The test results show that the Bragg wavelength shift is linear with respect to the applied pressure and agrees well with the simulated results. This simple and highly sensitive design is capable of measuring static/dynamic pressure and temperature simultaneously, which suits industrial applications.
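A minimal sketch of the two-grating readout logic, using the paper's measured 66.9 pm/psi sensitivity; the example shift values are invented for illustration:

```python
# Hedged sketch of the two-FBG readout described above: FBG1 sees pressure
# plus temperature, FBG2 (shielded) sees temperature only, so subtracting
# the two Bragg shifts isolates pressure. The 66.9 pm/psi sensitivity is
# the paper's measured value; the example shifts are invented.
PRESSURE_SENS_PM_PER_PSI = 66.9   # measured sensitivity of FBG1 (paper)

def pressure_psi(shift_fbg1_pm: float, shift_fbg2_pm: float) -> float:
    """Temperature-compensated pressure from the two Bragg wavelength shifts.

    Both FBGs share the ambient temperature response, so the difference
    of their shifts is (to first order) the pressure-induced shift alone.
    """
    pressure_shift_pm = shift_fbg1_pm - shift_fbg2_pm
    return pressure_shift_pm / PRESSURE_SENS_PM_PER_PSI

# Example: FBG1 shifts +3,400 pm while FBG2 shifts +55 pm from temperature.
print(f"{pressure_psi(3400.0, 55.0):.1f} psi")  # ~50.0 psi
```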
NASA Technical Reports Server (NTRS)
Feinberg, Lee; Bolcar, Matt; Liu, Alice; Guyon, Olivier; Stark, Chris; Arenberg, Jon
2016-01-01
Key challenges of a future large-aperture, segmented Ultraviolet Optical Infrared (UVOIR) telescope capable of performing a spectroscopic survey of hundreds of exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high-yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high-throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage-based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance.
SENSOR: a tool for the simulation of hyperspectral remote sensing systems
NASA Astrophysics Data System (ADS)
Börner, Anko; Wiest, Lorenz; Keller, Peter; Reulke, Ralf; Richter, Rolf; Schaepman, Michael; Schläpfer, Daniel
The consistent end-to-end simulation of airborne and spaceborne earth remote sensing systems is an important task, and sometimes the only way for the adaptation and optimisation of a sensor and its observation conditions, the choice and test of algorithms for data processing, error estimation and the evaluation of the capabilities of the whole sensor system. The presented software simulator SENSOR (Software Environment for the Simulation of Optical Remote sensing systems) includes a full model of the sensor hardware, the observed scene, and the atmosphere in between. The simulator consists of three parts. The first part describes the geometrical relations between scene, sun, and the remote sensing system using a ray-tracing algorithm. The second part of the simulation environment considers the radiometry. It calculates the at-sensor radiance using a pre-calculated multidimensional lookup-table taking the atmospheric influence on the radiation into account. The third part consists of an optical and an electronic sensor model for the generation of digital images. Using SENSOR for an optimisation requires the additional application of task-specific data processing algorithms. The principle of the end-to-end-simulation approach is explained, all relevant concepts of SENSOR are discussed, and first examples of its use are given. The verification of SENSOR is demonstrated. This work is closely related to the Airborne PRISM Experiment (APEX), an airborne imaging spectrometer funded by the European Space Agency.
NASA Technical Reports Server (NTRS)
Orifici, Adrian C.; Krueger, Ronald
2010-01-01
With capabilities for simulating delamination growth in composite materials becoming available, the need for benchmarking and assessing these capabilities is critical. In this study, benchmark analyses were performed to assess the delamination propagation simulation capabilities of the VCCT implementations in Marc(TM) and MD Nastran(TM). Benchmark delamination growth results for Double Cantilever Beam, Single Leg Bending, and End Notched Flexure specimens were generated using a numerical approach. This numerical approach was developed previously, and involves comparing results from a series of analyses at different delamination lengths to a single analysis with automatic crack propagation. Specimens were analyzed with three-dimensional and two-dimensional models, and compared with previous analyses using Abaqus. The results demonstrated that the VCCT implementation in Marc(TM) and MD Nastran(TM) was capable of accurately replicating the benchmark delamination growth results, and that the use of the numerical benchmarks offers advantages over benchmarking using experimental and analytical results.
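For context, the quantity these benchmarks exercise is the VCCT energy release rate at the crack tip; a generic textbook sketch (not the Marc or MD Nastran implementation) follows:

```python
# Hedged sketch of the Virtual Crack Closure Technique quantity these
# benchmarks exercise: the mode I energy release rate at a crack-tip node
# pair, G_I = (F_z * dw) / (2 * da * b). Generic textbook VCCT; inputs
# are illustrative, not from the paper.
def vcct_mode_I(force_z: float, opening_dw: float,
                element_length_da: float, width_b: float) -> float:
    """Mode I energy release rate from nodal force and opening displacement.

    force_z: crack-tip nodal force normal to the crack plane (N)
    opening_dw: relative opening displacement one element behind the tip (m)
    element_length_da: length of the crack-tip element (m)
    width_b: width associated with the node pair (m)
    Returns G_I in J/m^2.
    """
    return (force_z * opening_dw) / (2.0 * element_length_da * width_b)

# Example values for a DCB-like configuration (illustrative only):
G_I = vcct_mode_I(force_z=12.0, opening_dw=2.0e-5,
                  element_length_da=0.5e-3, width_b=1.0e-3)
print(f"G_I = {G_I:.1f} J/m^2")  # 240.0 J/m^2
```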
Parachute Models Used in the Mars Science Laboratory Entry, Descent, and Landing Simulation
NASA Technical Reports Server (NTRS)
Cruz, Juan R.; Way, David W.; Shidner, Jeremy D.; Davis, Jody L.; Powell, Richard W.; Kipp, Devin M.; Adams, Douglas S.; Witkowski, Al; Kandis, Mike
2013-01-01
An end-to-end simulation of the Mars Science Laboratory (MSL) entry, descent, and landing (EDL) sequence was created at the NASA Langley Research Center using the Program to Optimize Simulated Trajectories II (POST2). This simulation is capable of providing numerous MSL system and flight software responses, including Monte Carlo-derived statistics of these responses. The MSL POST2 simulation includes models of EDL system elements, including those related to the parachute system. Among these there are models for the parachute geometry, mass properties, deployment, inflation, opening force, area oscillations, aerodynamic coefficients, apparent mass, interaction with the main landing engines, and off-loading. These models were kept as simple as possible, considering the overall objectives of the simulation. The main purpose of this paper is to describe these parachute system models to the extent necessary to understand how they work and some of their limitations. A list of lessons learned during the development of the models and simulation is provided. Future improvements to the parachute system models are proposed.
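As a hedged illustration of the simplest relations underlying such parachute models, steady drag and an infinite-mass opening-force estimate are sketched below; the relations are generic and all numbers are illustrative, not MSL values:

```python
# Hedged sketch of basic parachute relations: steady drag and, with an
# opening-load factor, a rough peak opening force. Generic formulas with
# illustrative magnitudes; not the simulation's actual models or values.
def parachute_drag_N(rho: float, velocity: float, cd_s0: float) -> float:
    """Drag force F = 0.5 * rho * v^2 * (C_D * S0)."""
    return 0.5 * rho * velocity**2 * cd_s0

def peak_opening_force_N(rho: float, velocity: float, cd_s0: float,
                         opening_load_factor: float = 1.3) -> float:
    """Infinite-mass approximation: peak force = C_x * steady drag at deploy."""
    return opening_load_factor * parachute_drag_N(rho, velocity, cd_s0)

# Illustrative supersonic deployment in thin Martian air (assumed values):
rho_mars = 0.006    # kg/m^3, rough density at deployment altitude
v_deploy = 450.0    # m/s
cd_s0 = 230.0       # m^2, drag area C_D*S0
print(f"{peak_opening_force_N(rho_mars, v_deploy, cd_s0)/1e3:.0f} kN")
```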
Flight code validation simulator
NASA Astrophysics Data System (ADS)
Sims, Brent A.
1996-05-01
An End-To-End Simulation capability for software development and validation of missile flight software on the actual embedded computer has been developed utilizing a 486 PC, i860 DSP coprocessor, embedded flight computer and custom dual port memory interface hardware. This system allows real-time interrupt driven embedded flight software development and checkout. The flight software runs in a Sandia Digital Airborne Computer and reads and writes actual hardware sensor locations in which Inertial Measurement Unit data resides. The simulator provides six degree of freedom real-time dynamic simulation, accurate real-time discrete sensor data and acts on commands and discretes from the flight computer. This system was utilized in the development and validation of the successful premier flight of the Digital Miniature Attitude Reference System in January of 1995 at the White Sands Missile Range on a two stage attitude controlled sounding rocket.
Simulation and analysis of tape spring for deployed space structures
NASA Astrophysics Data System (ADS)
Chang, Wei; Cao, DongJing; Lian, MinLong
2018-03-01
The tape spring is an open cylindrical shell, and the mechanical properties of the structure are significantly affected by changes in its geometrical parameters. There are few studies on the influence of geometrical parameters on the mechanical properties of the tape spring. The bending process of a single tape spring was simulated using simulation software. The variations of critical moment, unfolding moment, and maximum strain energy during bending were investigated, and the effects of the section radius angle, thickness, and length on the driving capability of the simple tape spring were studied. Results show that the driving capability and disturbance-resistance capacity grow with increasing section radius angle during bending of the single tape spring. On the other hand, these capabilities decrease with increasing length of the single tape spring. Finally, the driving capability and disturbance-resistance capacity grow with increasing thickness. The research has a certain reference value for improving the kinematic accuracy and reliability of deployable structures.
University Research in Support of TREAT Modeling and Simulation, FY 2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeHart, Mark David
Idaho National Laboratory is currently evolving the modeling and simulation (M&S) capability that will enable improved core operation as well as design and analysis of TREAT experiments. This M&S capability primarily uses MAMMOTH, a reactor physics application being developed under the Multi-physics Object Oriented Simulation Environment (MOOSE) framework. MAMMOTH allows the coupling of a number of other MOOSE-based applications. In support of this research, INL is working with four universities to explore advanced solution methods that will complement or augment capabilities in MAMMOTH. This report consists of a collection of year-end summaries of research from the universities performed in support of TREAT modeling and simulation. This research was led by Prof. Sedat Goluoglu at the University of Florida, Profs. Jim Morel and Jean Ragusa at Texas A&M University, Profs. Benoit Forget and Kord Smith at Massachusetts Institute of Technology, Prof. Leslie Kerby of Idaho State University, and Prof. Barry Ganapol of the University of Arizona. A significant number of students were supported at various levels through the projects and, for some, also as interns at INL.
Extreme Scale Computing to Secure the Nation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, D L; McGraw, J R; Johnson, J R
2009-11-10
Since the dawn of modern electronic computing in the mid 1940's, U.S. national security programs have been dominant users of every new generation of high-performance computer. Indeed, the first general-purpose electronic computer, ENIAC (the Electronic Numerical Integrator and Computer), was used to calculate the expected explosive yield of early thermonuclear weapons designs. Even the U.S. numerical weather prediction program, another early application for high-performance computing, was initially funded jointly by sponsors that included the U.S. Air Force and Navy, agencies interested in accurate weather predictions to support U.S. military operations. For the decades of the cold war, national security requirements continued to drive the development of high performance computing (HPC), including advancement of the computing hardware and development of sophisticated simulation codes to support weapons and military aircraft design, numerical weather prediction, as well as data-intensive applications such as cryptography and cybersecurity. U.S. national security concerns continue to drive the development of high-performance computers and software in the U.S., and in fact, events following the end of the cold war have driven an increase in the growth rate of computer performance at the high-end of the market. This mainly derives from our nation's observance of a moratorium on underground nuclear testing beginning in 1992, followed by our voluntary adherence to the Comprehensive Test Ban Treaty (CTBT) beginning in 1995. The CTBT prohibits further underground nuclear tests, which in the past had been a key component of the nation's science-based program for assuring the reliability, performance and safety of U.S. nuclear weapons. In response to this change, the U.S. Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship (SBSS) program in response to the Fiscal Year 1994 National Defense Authorization Act, which requires, 'in the absence of nuclear testing, a program to: (1) Support a focused, multifaceted program to increase the understanding of the enduring stockpile; (2) Predict, detect, and evaluate potential problems of the aging of the stockpile; (3) Refurbish and re-manufacture weapons and components, as required; and (4) Maintain the science and engineering institutions needed to support the nation's nuclear deterrent, now and in the future'. This program continues to fulfill its national security mission by adding significant new capabilities for producing scientific results through large-scale computational simulation coupled with careful experimentation, including sub-critical nuclear experiments permitted under the CTBT. To develop the computational science and the computational horsepower needed to support its mission, SBSS initiated the Accelerated Strategic Computing Initiative, later renamed the Advanced Simulation & Computing (ASC) program (sidebar: 'History of ASC Computing Program Computing Capability'). The modern 3D computational simulation capability of the ASC program supports the assessment and certification of the current nuclear stockpile through calibration with past underground test (UGT) data. While an impressive accomplishment, continued evolution of national security mission requirements will demand computing resources at a significantly greater scale than we have today.
In particular, continued observance and potential Senate ratification of the Comprehensive Test Ban Treaty (CTBT), together with the U.S. administration's promise of a significant reduction in the size of the stockpile and the inexorable aging and consequent refurbishment of the stockpile, all demand increasing refinement of our computational simulation capabilities. Assessment of the present and future stockpile with increased confidence in its safety and reliability, without reliance upon calibration with past or future test data, is a long-term goal of the ASC program. This will be accomplished through significant increases in the scientific bases that underlie the computational tools. Computer codes must be developed that replace phenomenology with increased levels of scientific understanding, together with an accompanying quantification of uncertainty. These advanced codes will place significantly higher demands on the computing infrastructure than do the current 3D ASC codes. This article discusses not only the need for a future computing capability at the exascale for the SBSS program, but also considers high-performance computing requirements for broader national security questions. For example, the increasing concern over potential nuclear terrorist threats demands a capability to assess threats and potential disablement technologies, as well as a rapid forensic capability for determining a nuclear weapon's design from post-detonation evidence (nuclear counterterrorism).
NASA Technical Reports Server (NTRS)
Mitchell, Jason W.; Baldwin, Philip J.; Kurichh, Rishi; Naasz, Bo J.; Luquette, Richard J.
2007-01-01
The Formation Flying Testbed (FFTB) at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) provides a hardware-in-the-loop test environment for formation navigation and control. The facility is evolving as a modular, hybrid, dynamic simulation facility for end-to-end guidance, navigation and control (GN&C) design and analysis of formation flying spacecraft. The core capabilities of the FFTB, as a platform for testing critical hardware and software algorithms in-the-loop, have expanded to include S-band Radio Frequency (RF) modems for interspacecraft communication and ranging. To enable realistic simulations that require RF ranging sensors for relative navigation, a mechanism is needed to buffer the RF signals exchanged between spacecraft that accurately emulates the dynamic environment through which the RF signals travel, including the effects of the medium, moving platforms, and radiated power. The Path Emulator for Radio Frequency Signals (PERFS), currently under development at NASA GSFC, provides this capability. The function and performance of a prototype device are presented.
Simulation of APEX data: the SENSOR approach
NASA Astrophysics Data System (ADS)
Boerner, Anko; Schaepman, Michael E.; Schlaepfer, Daniel; Wiest, Lorenz; Reulke, Ralf
1999-10-01
The consistent simulation of airborne and spaceborne hyperspectral data is an important task and sometimes the only way to adapt and optimize a sensor and its observing conditions, to select and test algorithms for data processing, to estimate errors, and to evaluate the capabilities of the whole sensor system. The integration of three approaches is suggested for the data simulation of APEX (Airborne Prism Experiment): (1) a spectrally consistent approach (e.g. using AVIRIS data), (2) a geometrically consistent approach (e.g. using CASI data), and (3) an end-to-end simulation of the sensor system. In this paper, the last approach is discussed in detail. Such a technique should be used if there is no simple deterministic relation between input and output parameters. The simulation environment SENSOR (Software Environment for the Simulation of Optical Remote Sensing Systems) presented here includes a full model of the sensor system, the observed object, and the atmosphere. The simulator consists of three parts. The first part describes the geometrical relations between object, sun, and sensor using a ray tracing algorithm. The second part of the simulation environment considers the radiometry. It calculates the at-sensor radiance using a pre-calculated multidimensional lookup table for the atmospheric boundary conditions and bi-directional reflectances. Part three consists of an optical and an electronic sensor model for the generation of digital images. Application-specific algorithms for data processing must be considered additionally. The benefit of using an end-to-end simulation approach is demonstrated, an example of a simulated APEX data cube is given, and preliminary steps of evaluation of SENSOR are carried out.
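To make the three-part architecture concrete, here is a minimal sketch chaining the same stages (geometry output, radiometric lookup table, optical/electronic sensor model) in Python. Everything in it — the toy scene, the LUT values, the gain and noise figures — is an invented illustration, not the SENSOR implementation.

```python
# Minimal sketch of a three-stage end-to-end sensor simulation chain
# (geometry -> radiometry -> sensor model); all values are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

# Stage 1 (geometry): assume a ray tracer has produced, per pixel, the
# surface reflectance seen by the sensor; here we fabricate a toy scene.
reflectance = rng.uniform(0.05, 0.6, size=(128, 128))

# Stage 2 (radiometry): convert reflectance to at-sensor radiance via a
# precomputed lookup table (stand-in for the full atmospheric/BRDF table).
lut_refl = np.linspace(0.0, 1.0, 51)
lut_radiance = 120.0 * lut_refl + 8.0            # toy LUT [W m^-2 sr^-1 um^-1]
radiance = np.interp(reflectance, lut_refl, lut_radiance)

# Stage 3 (sensor model): optical blur, photon (shot) noise, quantization.
blurred = gaussian_filter(radiance, sigma=1.2)   # optics PSF
electrons = rng.poisson(blurred * 50.0)          # toy radiance-to-electrons gain
dn = np.clip(electrons // 16, 0, 4095)           # 12-bit digital numbers

print(dn.mean(), dn.std())
```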
The end-to-end simulator for the E-ELT HIRES high resolution spectrograph
NASA Astrophysics Data System (ADS)
Genoni, M.; Landoni, M.; Riva, M.; Pariani, G.; Mason, E.; Di Marcantonio, P.; Disseau, K.; Di Varano, I.; Gonzalez, O.; Huke, P.; Korhonen, H.; Li Causi, Gianluca
2017-06-01
We present the design, architecture and results of the End-to-End simulator model of the high resolution spectrograph HIRES for the European Extremely Large Telescope (E-ELT). This system can be used as a tool to characterize the spectrograph by both engineers and scientists. The model makes it possible to simulate the behavior of photons from the scientific object (modeled bearing in mind the main science drivers) to the detector, also considering calibration light sources, and to evaluate the different parameters of the spectrograph design. In this paper, we detail the architecture of the simulator and the computational model, which are strongly characterized by the modularity and flexibility that will be crucial in next-generation astronomical observation projects like the E-ELT, given their high complexity and long design and development times. Finally, we present synthetic images obtained with the current version of the End-to-End simulator based on the E-ELT HIRES requirements (especially high radial velocity accuracy). Once ingested in the Data Reduction Software (DRS), they will allow verification that the instrument design can achieve the radial velocity accuracy needed by the HIRES science cases.
Remote sensing of the low-latitude daytime ionosphere: ICON simulations and retrievals
NASA Astrophysics Data System (ADS)
Stephan, A. W.; Korpela, E.; England, S.; Immel, T. J.
2016-12-01
The Ionospheric Connection Explorer (ICON) sensor suite includes a spectrograph that will provide altitude profiles of the OII 61.7 and 83.4 nm airglow features, from which the daytime F-region ionosphere can be inferred. To make the connection between these extreme-ultraviolet (EUV) airglow emissions and ionospheric densities, ICON will use a method that has matured significantly in the last decade with the analysis of data from the Remote Atmospheric and Ionospheric Detection System (RAIDS) on the International Space Station, and the Special Sensor Ultraviolet Limb Imager (SSULI) sensors on the Defense Meteorological Satellite Program (DMSP) series of satellites. We will present end-to-end simulations of ICON EUV airglow measurements and data inversion for the expected viewing geometry and sensor capabilities, including noise. While we will focus on the performance of the algorithm for ICON within the context of the current state of knowledge, we will also identify areas where fundamental information can be gained from the high-sensitivity ICON measurements that could be used as feedback to directly improve the overall performance of the algorithm itself.
NASA Technical Reports Server (NTRS)
Jennings, Esther H.; Nguyen, Sam P.; Wang, Shin-Ywan; Woo, Simon S.
2008-01-01
NASA's planned Lunar missions will involve multiple NASA centers where each participating center has a specific role and specialization. In this vision, the Constellation program (CxP)'s Distributed System Integration Laboratories (DSIL) architecture consists of multiple System Integration Labs (SILs), with simulators, emulators, testlabs and control centers interacting with each other over a broadband network to perform test and verification for mission scenarios. To support the end-to-end simulation and emulation effort of NASA's exploration initiatives, different NASA centers are interconnected to participate in distributed simulations. Currently, DSIL has interconnections among the following NASA centers: Johnson Space Center (JSC), Kennedy Space Center (KSC), Marshall Space Flight Center (MSFC) and Jet Propulsion Laboratory (JPL). Through interconnections and interactions among different NASA centers, critical resources and data can be shared, while independent simulations can be performed simultaneously at different NASA locations, to effectively utilize the simulation and emulation capabilities at each center. Furthermore, the development of DSIL can maximally leverage the existing project simulation and testing plans. In this work, we describe the specific role and development activities at JPL for the Space Communications and Navigation Network (SCaN) simulator, using the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) tool to simulate communications effects among mission assets. Using MACHETE, different space network configurations among spacecraft and ground systems, with various parameter sets, can be simulated. Data necessary for tracking, navigation, and guidance of spacecraft such as the Crew Exploration Vehicle (CEV), Crew Launch Vehicle (CLV), and Lunar Relay Satellite (LRS), along with orbit calculation data, are disseminated to different NASA centers and updated periodically using the High Level Architecture (HLA). In addition, the performance of DSIL under different traffic loads with different mixes of data and priorities is evaluated.
An effective approach for road asset management through the FDTD simulation of the GPR signal
NASA Astrophysics Data System (ADS)
Benedetto, Andrea; Pajewski, Lara; Adabi, Saba; Kusayanagi, Wolfgang; Tosti, Fabio
2015-04-01
Ground-penetrating radar is a non-destructive tool widely used in many fields of application, including pavement engineering surveys. Over the last decade, the need for further breakthroughs capable of assisting end-users and practitioners as decision-support systems in more effective road asset management has been growing. Despite the high potential and the consolidated results obtained over the years by this non-destructive tool, pavement distress manuals are still based on visual inspections, so that only the effects and not the causes of faults are generally taken into account. In this framework, the use of simulation can represent an effective solution for supporting engineers and decision-makers in understanding the deep responses of both revealed and unrevealed damage. In this study, the potential of finite-difference time-domain simulation of the ground-penetrating radar signal is analyzed by simulating several types of flexible pavement at the different center frequencies typically used for road surveys. For these purposes, the numerical simulator GprMax2D, implementing the finite-difference time-domain method, was used, proving to be a highly effective tool for detecting road faults. Comparisons with simplified undisturbed modelled pavement sections showed promising agreement with theoretical expectations, and good prospects for detecting the shape of damage were demonstrated. Therefore, electromagnetic modelling has proved to represent a valuable support system in diagnosing the causes of damage, even for early or unrevealed faults. Further perspectives of this research will focus on the modelling of more complex scenarios capable of representing more accurately the real boundary conditions of road cross-sections. Acknowledgements - This work has benefited from networking activities carried out within the EU funded COST Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar".
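As a hedged illustration of the underlying method (not GprMax2D itself), the following one-dimensional Yee-scheme FDTD sketch shows how a GPR-style trace records an echo from a buried dielectric interface; the grid, timing, and permittivity values are invented for the example.

```python
# Toy 1D FDTD: a pulse launched at the antenna reflects off a dielectric
# interface (e.g. a pavement layer boundary) and appears as a late echo.
import numpy as np

nz, nt = 600, 700
c0, dz = 3.0e8, 0.005                    # 5 mm cells
dt = 0.5 * dz / c0                       # Courant number 0.5 (stable)
eps0, mu0 = 8.854e-12, 4e-7 * np.pi

eps = np.ones(nz)
eps[300:] = 6.0                          # buried layer boundary at cell 300

Ex, Hy = np.zeros(nz), np.zeros(nz)
trace = np.zeros(nt)                     # "A-scan" recorded at the antenna
src = 200

for n in range(nt):
    Hy[:-1] += dt / (mu0 * dz) * (Ex[1:] - Ex[:-1])
    Ex[1:] += dt / (eps0 * eps[1:] * dz) * (Hy[1:] - Hy[:-1])
    Ex[src] += np.exp(-((n - 60) / 15.0) ** 2)   # Gaussian pulse source
    trace[n] = Ex[src]

# After the direct pulse dies away, the strongest feature in the trace is
# the reflection from the interface (two-way travel of ~400 steps here).
print(np.argmax(np.abs(trace[200:])) + 200)
```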
PSPICE controlled-source models of analogous circuit for Langevin type piezoelectric transducer
NASA Astrophysics Data System (ADS)
Chen, Yeongchin; Wu, Menqjiun; Liu, Weikuo
2007-02-01
The design and construction of wide-band, high-efficiency acoustical projectors has long been considered an art beyond the capabilities of many smaller groups. Langevin-type piezoelectric transducers have been the leading candidates for sonar array systems applied in underwater communication. The transducers are fabricated by bolting a head mass and a tail mass to the ends of a stack of piezoelectric ceramic, to satisfy multiple, conflicting design requirements for high-power transmitting capability. The aim of this research is to study the characteristics of Langevin-type piezoelectric transducers under different metal loadings. First, the Mason equivalent circuit is used to model the segmented piezoelectric ceramic; then, the impedance network of the tail and head masses is deduced from Newton's laws of motion. To obtain the optimal solution to a specific design formulation, PSPICE controlled-source programming techniques can be applied. A valid example of the application of PSPICE models for Langevin-type transducer analysis is presented, and the simulation results are in good agreement with the experimental measurements.
Low-cost, high-speed back-end processing system for high-frequency ultrasound B-mode imaging.
Chang, Jin Ho; Sun, Lei; Yen, Jesse T; Shung, K Kirk
2009-07-01
For real-time visualization of the mouse heart (6 to 13 beats per second), a back-end processing system involving high-speed signal processing functions to form and display images has been developed. This back-end system was designed with new signal processing algorithms to achieve a frame rate of more than 400 images per second. These algorithms were implemented in a simple and cost-effective manner with a single field-programmable gate array (FPGA) and software programs written in C++. The operating speed of the back-end system was investigated by recording the time required for transferring an image to a personal computer. Experimental results showed that the back-end system is capable of producing 433 images per second. To evaluate the imaging performance of the back-end system, a complete imaging system was built. This imaging system, which consisted of a recently reported high-speed mechanical sector scanner assembled with the back-end system, was tested by imaging a wire phantom, a pig eye (in vitro), and a mouse heart (in vivo). It was shown that this system is capable of providing high spatial resolution images with fast temporal resolution.
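For readers unfamiliar with the back-end steps, the following sketch reproduces the two core B-mode operations (envelope detection and log compression) in software on a synthetic RF line. The sampling and transducer parameters are invented, and the sketch stands in for the FPGA implementation described above.

```python
# B-mode back-end essentials on one toy RF line: envelope via the analytic
# signal, then log compression to a display gray level.
import numpy as np
from scipy.signal import hilbert

fs, f0 = 200e6, 40e6                    # sampling rate and center frequency
t = np.arange(0, 5e-6, 1 / fs)
rf = np.sin(2 * np.pi * f0 * t) * np.exp(-((t - 2e-6) / 3e-7) ** 2)  # toy echo

envelope = np.abs(hilbert(rf))          # envelope detection (analytic signal)
env_db = 20 * np.log10(envelope / envelope.max() + 1e-6)  # log compression
gray = np.clip((env_db + 50.0) * (255.0 / 50.0), 0, 255).astype(np.uint8)
print(gray.max(), int(gray.argmax()))   # brightest sample sits at the echo
```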
XIMPOL: a new x-ray polarimetry observation-simulation and analysis framework
NASA Astrophysics Data System (ADS)
Omodei, Nicola; Baldini, Luca; Pesce-Rollins, Melissa; di Lalla, Niccolò
2017-08-01
We present a new simulation framework, XIMPOL, based on the Python programming language and the SciPy stack, specifically developed for X-ray polarimetric applications. XIMPOL is not tied to any specific mission or instrument design and is meant to produce fast and yet realistic observation-simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework has the capability of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating physical systems, but also for developing and testing end-to-end analysis chains.
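The statistical heart of such a polarimetric simulation can be sketched in a few lines: draw photoelectron azimuths from a modulated distribution and recover the polarization with the standard Stokes estimators. This is an illustration of the principle, not XIMPOL code; the modulation factor and polarization values are assumed.

```python
# Simulate modulated photoelectron azimuths and recover polarization.
import numpy as np

rng = np.random.default_rng(1)
mu, P, phi0 = 0.3, 0.5, np.deg2rad(30)   # modulation factor, pol. degree/angle

# Rejection-sample azimuths from pdf ~ 1 + mu*P*cos(2*(phi - phi0)).
n = 200_000
phi = rng.uniform(0, 2 * np.pi, 3 * n)
keep = rng.uniform(0, 1 + mu * P, phi.size) < 1 + mu * P * np.cos(2 * (phi - phi0))
phi = phi[keep][:n]

# Unbinned Stokes estimators, standard for photoelectric polarimeters.
Q = 2 * np.mean(np.cos(2 * phi))
U = 2 * np.mean(np.sin(2 * phi))
P_hat = np.hypot(Q, U) / mu
phi_hat = 0.5 * np.arctan2(U, Q)
print(P_hat, np.rad2deg(phi_hat))        # ~0.5 and ~30 degrees
```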
Modeling and analysis of hybrid pixel detector deficiencies for scientific applications
NASA Astrophysics Data System (ADS)
Fahim, Farah; Deptuch, Grzegorz W.; Hoff, James R.; Mohseni, Hooman
2015-08-01
Semiconductor hybrid pixel detectors often consist of a pixelated sensor layer bump bonded to a matching pixelated readout integrated circuit (ROIC). The sensor can range from high-resistivity Si to III-V materials, whereas a Si CMOS process is typically used to manufacture the ROIC. Independent device physics and electronic design automation (EDA) tools are used to determine sensor characteristics and to verify functional performance of ROICs, respectively, with significantly different solvers. Some physics solvers provide the capability of transferring data to the EDA tool. However, single-pixel transient simulations are either not feasible due to convergence difficulties or are prohibitively long. A simplified sensor model, which includes a current pulse in parallel with the detector equivalent capacitor, is often used; even then, SPICE-type top-level (entire array) simulations range from days to weeks. In order to analyze detector deficiencies for a particular scientific application, accurately defined transient behavioral models of all the functional blocks are required. Furthermore, various simulations of the entire array, such as transient, noise, Monte Carlo, and inter-pixel effects, need to be performed within a reasonable time frame without trading off accuracy. The sensor and the analog front-end can be modeled using a real-number modeling language, as complex mathematical functions, or detailed data can be saved to text files for further top-level digital simulations. Parasitically aware digital timing is extracted in a standard delay format (SDF) from the pixel digital back-end layout as well as the periphery of the ROIC. For any given input, detector-level worst-case and best-case simulations are performed using a Verilog simulation environment to determine the output. Each top-level transient simulation takes no more than 10-15 minutes. The impact of changing key parameters such as sensor Poissonian shot noise, analog front-end bandwidth, and jitter due to clock distribution can be accurately analyzed to determine ROIC architectural viability and bottlenecks. Hence the impact of the detector parameters on the scientific application can be studied.
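A hedged sketch of what a real-number behavioral model of the analog front-end might look like: a charge pulse shaped by a CR-RC response, then a discriminator extracting arrival time and time over threshold. All constants are illustrative assumptions, not the authors' models.

```python
# Real-number-style behavioral model of a pixel analog front-end.
import numpy as np

tau = 50e-9                              # shaper time constant [s]
dt = 1e-9
t = np.arange(0, 1e-6, dt)
q = 1.6e-15                              # injected charge [C] (~10 ke-)
shaper = (t / tau) * np.exp(1 - t / tau) # normalized CR-RC impulse response
v = (q / 1e-14) * shaper                 # toy charge-to-voltage gain

thr = 0.05                               # discriminator threshold [V]
above = v > thr
if above.any():
    t_arr = t[above.argmax()]            # discriminator firing time
    t_ot = above.sum() * dt              # time over threshold
    print(t_arr, t_ot)
```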
Blood Pump Development Using Rocket Engine Flow Simulation Technology
NASA Technical Reports Server (NTRS)
Kwak, Dochan; Kiris, Cetin
2001-01-01
This paper reports the progress made towards developing a complete blood flow simulation capability in humans, especially in the presence of artificial devices such as valves and ventricular assist devices. Device modeling poses unique challenges different from computing the blood flow in natural hearts and arteries. Many elements are needed to quantify the flow in these devices, such as flow solvers, geometry modeling including flexible walls, moving boundary procedures, and physiological characterization of blood. As a first step, computational technology developed for aerospace applications was extended to the analysis and development of a ventricular assist device (VAD), i.e., a blood pump. The blood flow in a VAD is practically incompressible and Newtonian, and thus an incompressible Navier-Stokes solution procedure can be applied. A primitive variable formulation is used in conjunction with the overset grid approach to handle complex moving geometry. The primary purpose of developing the incompressible flow analysis capability was to quantify the flow in advanced turbopumps for space propulsion systems. The same procedure has been extended to the development of the NASA-DeBakey VAD, which is based on an axial blood pump. Due to massive computing requirements, high-end computing is necessary for simulating three-dimensional flow in these pumps. Computational, experimental, and clinical results are presented.
A Coupled Surface Nudging Scheme for use in Retrospective ...
A surface analysis nudging scheme coupling atmospheric and land surface thermodynamic parameters has been implemented into WRF v3.8 (the latest version) for use with retrospective weather and climate simulations, as well as for applications in air quality, hydrology, and ecosystem modeling. This scheme is known as the flux-adjusting surface data assimilation system (FASDAS), developed by Alapaty et al. (2008). The scheme provides continuous adjustments for soil moisture and temperature (via indirect nudging) and for surface air temperature and water vapor mixing ratio (via direct nudging). The simultaneous application of indirect and direct nudging maintains greater consistency between the soil temperature–moisture and the atmospheric surface layer mass-field variables. The new method, FASDAS, consistently improved the accuracy of the model simulations at weather prediction scales for different horizontal grid resolutions, as well as for high resolution regional climate predictions. This new capability has been released in WRF Version 3.8 as option grid_sfdda = 2. It increases the accuracy of atmospheric inputs for use in air quality, hydrology, and ecosystem modeling research, improving the accuracy of the respective end-point research outcomes. IMPACT: A new method, FASDAS, was implemented into the WRF model to consistently improve the accuracy of the model simulations at weather prediction scales for different horizontal grid resolutions, as well as for high resolution regional climate predictions.
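The direct-nudging part of such a scheme reduces to relaxing a model variable toward an analysis value; the sketch below shows that update for a 2-m temperature, with an invented nudging coefficient. It is a conceptual stand-in, not the WRF/FASDAS code.

```python
# Direct nudging: relax the model state toward an analysis with coefficient G.
G = 3.0e-4           # nudging coefficient [1/s], illustrative
dt = 60.0            # model time step [s]
t_model = 288.0      # model 2-m temperature [K]
t_analysis = 289.2   # surface analysis value [K]

for _ in range(240):                              # four hours of time steps
    t_model += dt * G * (t_analysis - t_model)    # physics tendencies omitted

print(round(t_model, 3))                          # relaxes toward the analysis
```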
NASA Technical Reports Server (NTRS)
Lehmer, R.; Ingram, C.; Jovic, S.; Alderete, J.; Brown, D.; Carpenter, D.; LaForce, S.; Panda, R.; Walker, J.; Chaplin, P.;
2006-01-01
The Virtual Airspace Simulation Technology - Real-Time (VAST-RT) Project, an element of NASA's Virtual Airspace Modeling and Simulation (VAMS) Project, has been developing a distributed simulation capability that supports an extensible and expandable real-time, human-in-the-loop airspace simulation environment. The VAST-RT system architecture is based on the DoD High Level Architecture (HLA) and the VAST-RT HLA Toolbox, a common interface implementation that incorporates a number of novel design features. The scope of the initial VAST-RT integration activity (Capability 1) included the high-fidelity human-in-the-loop simulation facilities located at NASA/Ames Research Center and medium-fidelity pseudo-piloted target generators, such as the Airspace Traffic Generator (ATG) being developed as part of VAST-RT, as well as other real-time tools. This capability has been demonstrated in a gate-to-gate simulation. VAST-RT's Capability 2A has recently been completed, and this paper discusses the improved integration of the real-time assets into VAST-RT, including the development of tools to integrate data collected across the simulation environment into a single data set for the researcher. Current plans for the completion of the VAST-RT distributed simulation environment (Capability 2B) and its use to evaluate future airspace capacity-enhancing concepts being developed by VAMS are discussed. Additionally, the simulation environment's application to other airspace and airport research projects is addressed.
Multiple Monochromatic Imaging (MMI) Status and Plans for LANL Campaigns on Omega and NIF
NASA Astrophysics Data System (ADS)
Wysocki, F. J.; Hsu, S. C.; Tregillis, I. L.; Schmitt, M. J.; Kyrala, G. A.; Martinson, D. D.; Murphy, T. J.; Mancini, R. C.; Nagayama, T.
2011-10-01
LANL's DIME (Defect Implosion Experiment) campaigns on Omega and NIF are aimed at obtaining improved understanding of defect-induced mix via experiments and simulations of directly driven high-Z doped plastic capsules with DD or DT gas fill. To this end, the MMI diagnostic has been identified as a key diagnostic for providing space- and time-resolved density, temperature, and mix profiles. The high-Z shell dopants used on Omega are Ti and V, and those to be used on NIF are Ge and Se. This poster will discuss the following four areas of MMI-related work at LANL, in collaboration with UNR: (1) data and preliminary analysis of MMI data from FY11 Omega campaigns, (2) development of a capability to generate simulated MMI data from radiation-hydrodynamic simulations of ICF implosions, (3) design of an MMI instrument for NIF that will cover the photon energy range 9.5-16.9 keV, which includes the Ge/Se H-like/He-like α/β lines, and (4) the development of MMI data post-processing and spectroscopic analysis tools. Supported by DOE NNSA.
Constraint Force Equation Methodology for Modeling Multi-Body Stage Separation Dynamics
NASA Technical Reports Server (NTRS)
Toniolo, Matthew D.; Tartabini, Paul V.; Pamadi, Bandu N.; Hotchko, Nathaniel
2008-01-01
This paper discusses a generalized approach to multi-body separation problems in a launch vehicle staging environment, based on constraint force methodology and its implementation into the Program to Optimize Simulated Trajectories II (POST2), a widely used trajectory design and optimization tool. This development facilitates the inclusion of stage separation analysis in POST2 for seamless end-to-end simulations of launch vehicle trajectories, thus simplifying the overall implementation and providing a range of modeling and optimization capabilities that are standard features in POST2. Analysis and results are presented for two test cases that validate the constraint force equation methodology in a stand-alone mode and its implementation in POST2.
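The core of the constraint force idea can be sketched compactly: treat the bodies as free, then solve for the Lagrange-multiplier force that enforces the kinematic constraint at the acceleration level. The toy below does this for a point mass on a rigid rod; it illustrates the mathematics only and is not the POST2 implementation.

```python
# Constraint force via a Lagrange multiplier: M a = F + G^T lam, G a = gamma.
import numpy as np

m, L, g = 2.0, 1.5, 9.81
q = np.array([L, 0.0])          # position on the rod
v = np.array([0.0, 2.0])        # velocity tangent to the constraint

M = m * np.eye(2)
F = np.array([0.0, -m * g])     # applied force (gravity)

G = q.reshape(1, 2)             # Jacobian of phi = (x^2 + y^2 - L^2)/2
gamma = -np.array([v @ v])      # acceleration-level RHS: G a = -v.v

# lam = (G M^-1 G^T)^-1 (gamma - G M^-1 F); then a = M^-1 (F + G^T lam)
Minv = np.linalg.inv(M)
lam = np.linalg.solve(G @ Minv @ G.T, gamma - G @ Minv @ F)
a = Minv @ (F + G.T @ lam)
print(lam, a, G @ a - gamma)    # constraint residual ~ 0
```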
NASA Technical Reports Server (NTRS)
Peille, Philippe; Ceballos, Maria Teresa; Cobo, Beatriz; Wilms, Joern; Bandler, Simon; Smith, Stephen J.; Dauser, Thomas; Brand, Thorsten; den Hartog, Roland; de Plaa, Jelle;
2016-01-01
The X-ray Integral Field Unit (X-IFU) microcalorimeter on board Athena, with its focal plane comprising 3840 Transition Edge Sensors (TESs) operating at 90 mK, will provide unprecedented spectral-imaging capability in the 0.2-12 keV energy range. It will rely on on-board digital processing of the current pulses induced by the heat deposited in the TES absorbers, so as to recover the energy of each individual event. Assessing the capabilities of the pulse reconstruction is required to understand the overall scientific performance of the X-IFU, notably in terms of energy resolution degradation with both increasing energies and count rates. Using synthetic data streams generated by the X-IFU End-to-End simulator, we present here a comprehensive benchmark of various pulse reconstruction techniques, ranging from standard optimal filtering to more advanced algorithms based on noise covariance matrices. Besides deriving the spectral resolution achieved by the different algorithms, a first assessment of the computing power and ground calibration needs is presented. Overall, all methods show similar performances, with the reconstruction based on noise covariance matrices showing the best improvement with respect to the standard optimal filtering technique. Due to prohibitive calibration needs, this method might however not be applicable to the X-IFU, and the best compromise currently appears to be the so-called resistance space analysis, which also features very promising high count rate capabilities.
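As a reference point for the baseline technique named above, here is a minimal sketch of optimal-filter pulse-height estimation with a diagonal (white) noise covariance; the template, noise level, and energy are invented values, not X-IFU parameters.

```python
# Optimal filtering: fit a known pulse template to a noisy record,
# weighting samples by inverse noise variance.
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(2048) * 1e-5                        # 10 us samples
template = np.exp(-t / 2e-3) - np.exp(-t / 2e-4)  # two-exponential pulse shape
template /= template.max()

true_energy = 5.9                                 # keV, illustrative
record = true_energy * template + rng.normal(0, 0.05, t.size)

noise_var = 0.05**2 * np.ones(t.size)             # diagonal noise covariance
w = template / noise_var                          # optimal filter weights
e_hat = (w @ record) / (w @ template)             # fitted amplitude = energy
print(round(e_hat, 3))                            # ~5.9 keV
```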
Human and Robotic Space Mission Use Cases for High-Performance Spaceflight Computing
NASA Technical Reports Server (NTRS)
Doyle, Richard; Bergman, Larry; Some, Raphael; Whitaker, William; Powell, Wesley; Johnson, Michael; Goforth, Montgomery; Lowry, Michael
2013-01-01
Spaceflight computing is a key resource in NASA space missions and a core determining factor of spacecraft capability, with ripple effects throughout the spacecraft, end-to-end system, and the mission; it can be aptly viewed as a "technology multiplier" in that advances in onboard computing provide dramatic improvements in flight functions and capabilities across the NASA mission classes, and will enable new flight capabilities and mission scenarios, increasing science and exploration return per mission-dollar.
Gpu Implementation of a Viscous Flow Solver on Unstructured Grids
NASA Astrophysics Data System (ADS)
Xu, Tianhao; Chen, Long
2016-06-01
Graphics processing units have gained popularity in scientific computing over the past several years due to their outstanding parallel computing capability. Computational fluid dynamics applications involve large amounts of calculation, so a recent GPU card, whose peak computing performance and memory bandwidth far exceed those of a contemporary high-end CPU, is preferable. We herein focus on the detailed implementation of our GPU-targeted Reynolds-averaged Navier-Stokes solver based on the finite-volume method. The solver employs a vertex-centered scheme on unstructured grids so that it can handle complex topologies. Multiple optimizations are carried out to improve the memory access performance and kernel utilization. Both steady and unsteady flow simulation cases are carried out using an explicit Runge-Kutta scheme. The GPU-accelerated solver presented in this paper is demonstrated to have competitive advantages over the CPU-targeted one.
RELAP-7 Software Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling
This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL's modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's capability and extends the analysis capability for all reactor system simulation scenarios.
Modular, high power, variable R dynamic electrical load simulator
NASA Technical Reports Server (NTRS)
Joncas, K. P.
1974-01-01
The design of a previously developed basic variable R load simulator was extended to increase its power dissipation and transient handling capabilities. The delivered units satisfy all design requirements and provide a high-power, modular simulation capability uniquely suited to the simulation of complex load responses. In addition to presenting conclusions, recommendations, and pertinent background information, the report covers program accomplishments; describes the simulator basic circuits, transfer characteristic, protective features, assembly, and specifications; indicates the results of simulator evaluation, including burn-in and acceptance testing; provides acceptance test data; and summarizes the monthly progress reports.
NASA Astrophysics Data System (ADS)
Mallick, Rajnish; Ganguli, Ranjan; Kumar, Ravi
2017-05-01
The optimized design of a smart post-buckled beam actuator (PBA) is performed in this study. A smart-material-based piezoceramic stack actuator is used as a prime mover to drive the buckled beam actuator. Piezoceramic actuators are high-force, small-displacement devices; they possess high energy density and high bandwidth. In this study, benchtop experiments are conducted to investigate the angular tip deflections due to the PBA. A new design of a linear-to-linear motion amplification device (LX-4) is developed to circumvent the small-displacement handicap of piezoceramic stack actuators. LX-4 enhances the piezoceramic actuator's mechanical leverage by a factor of four. The PBA model is based on dynamic elastic stability and is analyzed using the Mathieu-Hill equation. A formal optimization is carried out using a newly developed meta-heuristic nature-inspired algorithm, named the bat algorithm (BA), which utilizes the echolocation capability of bats. An optimized PBA in conjunction with LX-4 generates end rotations of the order of 15° at the output end. The optimized PBA design is lightweight and induces large end rotations, which will be useful in the development of various mechanical and aerospace devices, such as helicopter trailing edge flaps, micro and nano aerial vehicles, and other robotic systems.
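For readers unfamiliar with the bat algorithm, the following compact sketch shows its main loop (frequency-tuned global moves toward the best solution, occasional local random walks, loudness-gated acceptance) on a stand-in quadratic objective rather than the actual PBA design problem.

```python
# Minimal bat algorithm (Yang's echolocation-inspired metaheuristic).
import numpy as np

rng = np.random.default_rng(3)
f = lambda x: np.sum((x - 1.0) ** 2, axis=-1)    # stand-in objective

n, dim, iters = 20, 4, 200
fmin_, fmax_ = 0.0, 2.0                          # pulse frequency range
A, r = 0.9, 0.1                                  # loudness, pulse emission rate
x = rng.uniform(-5, 5, (n, dim))
v = np.zeros((n, dim))
fit = f(x)
best = x[fit.argmin()].copy()

for it in range(iters):
    freq = fmin_ + (fmax_ - fmin_) * rng.uniform(size=(n, 1))
    v += (x - best) * freq                       # global move pulled to best
    xn = x + v
    walk = rng.uniform(size=n) > r               # occasional local random walk
    xn[walk] = best + 0.01 * A * rng.normal(size=(walk.sum(), dim))
    fn = f(xn)
    accept = (fn < fit) & (rng.uniform(size=n) < A)
    x[accept], fit[accept] = xn[accept], fn[accept]
    best = x[fit.argmin()].copy()
    A *= 0.995                                   # loudness decays over time

print(best, fit.min())                           # converges near x = [1,1,1,1]
```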
Modern gyrokinetic particle-in-cell simulation of fusion plasmas on top supercomputers
Wang, Bei; Ethier, Stephane; Tang, William; ...
2017-06-29
The Gyrokinetic Toroidal Code at Princeton (GTC-P) is a highly scalable and portable particle-in-cell (PIC) code. It solves the 5D Vlasov-Poisson equation featuring efficient utilization of modern parallel computer architectures at the petascale and beyond. Motivated by the goal of developing a modern code capable of dealing with the physics challenge of increasing problem size with sufficient resolution, new thread-level optimizations have been introduced as well as a key additional domain decomposition. GTC-P's multiple levels of parallelism, including inter-node 2D domain decomposition and particle decomposition, as well as intra-node shared memory partition and vectorization have enabled pushing the scalability of the PIC method to extreme computational scales. In this paper, we describe the methods developed to build a highly parallelized PIC code across a broad range of supercomputer designs. This particularly includes implementations on heterogeneous systems using NVIDIA GPU accelerators and Intel Xeon Phi (MIC) co-processors and performance comparisons with state-of-the-art homogeneous HPC systems such as Blue Gene/Q. New discovery science capabilities in the magnetic fusion energy application domain are enabled, including investigations of Ion-Temperature-Gradient (ITG) driven turbulence simulations with unprecedented spatial resolution and long temporal duration. Performance studies with realistic fusion experimental parameters are carried out on multiple supercomputing systems spanning a wide range of cache capacities, cache-sharing configurations, memory bandwidth, interconnects and network topologies. These performance comparisons using a realistic discovery-science-capable domain application code provide valuable insights on optimization techniques across one of the broadest sets of current high-end computing platforms worldwide.
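The kernel structure that such codes decompose and optimize is the particle-in-cell cycle itself; a deliberately simplified 1D electrostatic version (not GTC-P, and not gyrokinetic) is sketched below.

```python
# Toy 1D electrostatic PIC cycle: deposit -> field solve -> push.
import numpy as np

rng = np.random.default_rng(4)
ng, npart, L, dt = 64, 10000, 2 * np.pi, 0.1
dx = L / ng
xp = rng.uniform(0, L, npart)              # electron positions
vp = rng.normal(0.0, 1.0, npart)           # electron velocities

for step in range(50):
    # 1) charge deposition (nearest grid point here; CIC in real codes)
    cells = (xp / dx).astype(int) % ng
    ne = np.bincount(cells, minlength=ng) * (ng / npart)
    rho = 1.0 - ne                         # uniform ion background - electrons
    # 2) field solve: Poisson in k-space, then E = -dphi/dx
    k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
    rhok = np.fft.fft(rho)
    Ek = np.where(k != 0, -1j * rhok / k, 0.0)
    E = np.real(np.fft.ifft(Ek))
    # 3) push electrons (charge -1, mass 1) and apply periodic wrap
    vp -= E[cells] * dt
    xp = (xp + vp * dt) % L

print(np.mean(0.5 * vp ** 2))              # kinetic-energy diagnostic
```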
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rudkevich, Aleksandr; Goldis, Evgeniy
This research, conducted by the Newton Energy Group, LLC (NEG), is dedicated to the development of pCloud: a cloud-based power market simulation environment. pCloud offers power industry stakeholders the capability to model electricity markets and is organized around the Software as a Service (SaaS) concept -- a software application delivery model in which software is centrally hosted and provided to many users via the internet. During Phase I of this project, NEG developed a prototype design for pCloud as a SaaS-based commercial service offering and a system architecture supporting that design, ensured the feasibility of the architecture's key elements, formed technological partnerships and negotiated commercial agreements with partners, conducted market research and other related activities, and secured funding for continued development of pCloud between the end of Phase I and the beginning of Phase II, if awarded. Based on the results of Phase I activities, NEG has established that the development of a cloud-based power market simulation environment within the Windows Azure platform is technologically feasible and can be accomplished within the budget and timeframe available through the Phase II SBIR award with additional external funding. NEG believes that pCloud has the potential to become a game-changing technology for the modeling and analysis of electricity markets. This potential is due to the following critical advantages of pCloud over its competition: - Standardized access to advanced and proven power market simulators offered by third parties. - Automated parallelization of simulations and dynamic provisioning of computing resources on the cloud. This combination of automation and scalability dramatically reduces turn-around time while offering the capability to increase the number of analyzed scenarios by a factor of 10, 100 or even 1000. - Access to ready-to-use data and to cloud-based resources, leading to a reduction in software, hardware, and IT costs. - A competitive pricing structure, which will make high-volume usage of simulation services affordable. - Availability and affordability of high-quality power simulators, which presently only large corporate clients can afford, will level the playing field in developing regional energy policies, determining prudent cost recovery mechanisms, and assuring just and reasonable rates to consumers. - Users that presently do not have the resources to internally maintain modeling capabilities will now be able to run simulations. This will invite more players into the industry, ultimately leading to more transparent and liquid power markets.
NASA Astrophysics Data System (ADS)
Rhoads, James
Central objectives: WFIRST-AFTA has tremendous potential for studying the epoch of "Cosmic Dawn", the period encompassing the formation of the first galaxies and quasars and their impact on the surrounding universe through cosmological reionization. Our goal is to ensure that this potential is realized through the middle stages of mission planning, culminating in designs for both WFIRST and its core surveys that meet the core objectives in dark energy and exoplanet science while maximizing the complementary Cosmic Dawn science. Methods: We will consider a combined approach to studying Cosmic Dawn using a judicious mixture of guest investigator data analysis of the primary WFIRST surveys and a specifically designed Guest Observer program to complement those surveys. The Guest Observer program will serve primarily to obtain deep field observations, with particular attention to the capabilities of WFIRST for spectroscopic deep fields using the WFI grism. We will bring to bear our years of experience with slitless spectroscopy on the Hubble Space Telescope, along with anticipated JWST slitless grism spectroscopy. We will use this experience to examine the implications of WFIRST's grism resolution and wavelength coverage for deep field observations and, if appropriate, to suggest potential modifications of these parameters to optimize the science return of WFIRST. We have assembled a team of experts specializing in (1) Lyman break galaxies at redshifts higher than 7, (2) quasars at high redshifts, (3) Lyman-alpha galaxies as probes of reionization, (4) theoretical simulations of high-redshift galaxies, (5) simulations of grism observations, (6) post-processing analysis to find emission line galaxies and high-redshift galaxies, and (7) JWST observations and calibrations. With this team we intend to perform end-to-end simulations, starting with halo populations and expected spectra of high-redshift galaxies, and finally extract what we can learn about (a) reionization using the Lyman-alpha test, (b) the sources of reionization - both galaxies and AGN - and (c) how to optimize WFIRST-AFTA surveys to maximize the scientific output of this mission. Along the way, we will simulate the galaxy and AGN populations expected beyond redshift 7, and will simulate observations and data analysis of these populations with WFIRST. Significance of work: Cosmic Dawn is one of the central pillars of the "New Worlds, New Horizons" decadal survey. WFIRST's highly sensitive and wide-field near-infrared capabilities offer a natural tool to obtain statistically useful samples of faint galaxies and AGN beyond redshift 7. Thus, we expect Cosmic Dawn observations will constitute a major component of the GO program ultimately executed by WFIRST. By supporting our Science Investigation Team to consider the interplay between the mission parameters and the ultimate harvest of Cosmic Dawn science, NASA will help ensure the success of WFIRST as a broadly focused flagship mission.
A Testbed Environment for Buildings-to-Grid Cyber Resilience Research and Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sridhar, Siddharth; Ashok, Aditya; Mylrea, Michael E.
The Smart Grid is characterized by the proliferation of advanced digital controllers at all levels of its operational hierarchy, from generation to end consumption. Such controllers within modern residential and commercial buildings enable grid operators to exercise fine-grained control over energy consumption through several emerging Buildings-to-Grid (B2G) applications. Though this capability promises significant benefits in terms of operational economics and improved reliability, cybersecurity weaknesses in the supporting infrastructure could be exploited to cause a detrimental effect, and this necessitates focused research efforts on two fronts: first, understanding how, and to what extent, cyber attacks in the B2G space could impact grid reliability; second, the development and validation of cyber-physical, application-specific countermeasures that are complementary to traditional infrastructure cybersecurity mechanisms for enhanced cyber attack detection and mitigation. The PNNL B2G testbed is currently being developed to address these core research needs. Specifically, the B2G testbed combines high-fidelity buildings+grid simulators and industry-grade building automation and Supervisory Control and Data Acquisition (SCADA) systems in an integrated, realistic, and reconfigurable environment capable of supporting attack-impact-detection-mitigation experimentation. In this paper, we articulate the need for research testbeds to model various B2G applications broadly by looking at the end-to-end operational hierarchy of the Smart Grid. Finally, the paper not only describes the architecture of the B2G testbed in detail, but also addresses the broad spectrum of B2G resilience research it is capable of supporting based on the Smart Grid operational hierarchy identified earlier.
Pulse shaping and energy storage capabilities of angularly multiplexed KrF laser fusion drivers
NASA Astrophysics Data System (ADS)
Lehmberg, R. H.; Giuliani, J. L.; Schmitt, A. J.
2009-07-01
This paper describes a rep-rated multibeam KrF laser driver design for the 500 kJ Inertial Fusion Test Facility (FTF) recently proposed by NRL, then models its optical pulse shaping capabilities using the ORESTES laser kinetics code. It describes a stable and reliable iteration technique for calculating the precompensated input pulse shape that will achieve the desired output shape, even when the amplifiers are heavily saturated. It also describes how this precompensation technique could be experimentally implemented in real time on a rep-rated laser system. The simulations show that this multibeam system can achieve high-fidelity pulse shaping, even for a high-gain shock ignition pulse whose final spike requires output intensities much higher than the ~4 MW/cm2 saturation levels associated with quasi-cw operation; i.e., they show that KrF can act as a storage medium even for pulse widths of ~1 ns. For the chosen pulse, which gives a predicted fusion energy gain of ~120, the simulations predict the FTF can deliver a total on-target energy of 428 kJ, a peak spike power of 385 TW, and amplified spontaneous emission prepulse contrast ratios of I_ASE/I < 3×10^-7 in intensity and F_ASE/F < 1.5×10^-5 in fluence. Finally, the paper proposes a front-end pulse shaping technique that combines an optical Kerr gate with cw 248 nm light and a 1 μm control beam shaped by advanced fiber optic technology, such as that used in the National Ignition Facility (NIF) laser.
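The precompensation iteration described above can be illustrated with a generic saturating-gain stand-in for the amplifier (not ORESTES): repeatedly rescale the input shape by the ratio of desired to simulated output.

```python
# Fixed-point precompensation of an input pulse through a saturating gain.
import numpy as np

t = np.linspace(0.0, 1.0, 200)
desired = 1.0 + 9.0 * (t > 0.8)                  # low foot plus final spike

def amplifier(inp, g0=30.0, sat=20.0):
    """Toy saturating gain stage: small-signal gain g0, output limit sat."""
    return inp * g0 / (1.0 + inp * g0 / sat)

inp = desired / 30.0                             # small-signal first guess
for _ in range(40):
    out = amplifier(inp)
    inp *= desired / np.maximum(out, 1e-12)      # multiplicative correction

err = np.max(np.abs(amplifier(inp) - desired) / desired)
print(err)                                       # converges to ~0
```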
NASA Astrophysics Data System (ADS)
Ingraham, Patrick Jon
This thesis determines the capability of detecting faint companions in the presence of speckle noise when performing space-based high-contrast imaging through spectral differential imaging (SDI) using a low-order Fabry-Perot etalon as a tunable filter. The performance of such a tunable filter is illustrated through the Tunable Filter Imager (TFI), an instrument designed for the James Webb Space Telescope (JWST). Using a TFI prototype etalon and a custom-designed test bed, the etalon's ability to perform speckle suppression through SDI is demonstrated experimentally. Improvements in contrast vary with separation, ranging from a factor of ~10 at working angles greater than 11 λ/D and increasing up to a factor of ~60 at 5 λ/D. These measurements are consistent with a Fresnel optical propagation model which shows the speckle suppression capability is limited by the test bed and not the etalon. This result demonstrates that a tunable filter is an attractive option for high-contrast imaging through SDI. To explore the capability of space-based SDI using an etalon, we perform an end-to-end Fresnel propagation of JWST and TFI. Using this simulation, a contrast improvement ranging from a factor of ~7 to ~100 is predicted, depending on the instrument's configuration. The performance of roll subtraction is simulated and compared to that of SDI. The SDI capability of the Near-Infrared Imager and Slitless Spectrograph (NIRISS), the science instrument module that replaced TFI in the JWST Fine Guidance Sensor, is also determined. Using low-resolution, multi-band (0.85-2.4 μm) multi-object spectroscopy, 104 objects towards the central region of the Orion Nebula Cluster have been assigned spectral types, including 7 new brown dwarfs and 4 new planetary-mass candidates. These objects are useful for determining the substellar initial mass function and for testing evolutionary and atmospheric models of young stellar and substellar objects. Using the measured H-band magnitudes, combined with our determined extinction values, the classified objects are used to create a Hertzsprung-Russell diagram for the cluster. Our results indicate a single epoch of star formation beginning ~1 Myr ago. The initial mass function of the cluster is derived and found to be consistent with the values determined for other young clusters and the Galactic disk.
Experiments evaluating compliance and force feedback effect on manipulator performance
NASA Technical Reports Server (NTRS)
Kugath, D. A.
1972-01-01
The performance capability of operators using manipulator systems with varied compliance and force feedback was assessed in simulated space tasks. Two manipulators were used: the E-2, an electromechanical man-equivalent (in force, reach, etc.) master-slave system, and a modified CAM 1400, a hydraulic master-slave with 100 lb force capability at reaches of 24 ft. The CAM 1400 was further modified to operate without its normal force feedback. Several experiments and simulations were performed. The first two involved the E-2 absorbing the energy of a moving mass and, secondly, guiding a mass through a maze; thus, both work-paced and self-paced tasks were studied as servo compliance was varied. Three simulations were run with the E-2 mounted on the CAM 1400 to evaluate the concept of a dexterous manipulator as an end effector of a boom-manipulator. Finally, the CAM 1400 performed a maze test and also simulated the capture of a large mass as the servo compliance was varied and with force feedback included and removed.
High fidelity studies of exploding foil initiator bridges, Part 3: ALEGRA MHD simulations
NASA Astrophysics Data System (ADS)
Neal, William; Garasi, Christopher
2017-01-01
Simulations of high-voltage detonators, such as Exploding Bridgewire (EBW) detonators and Exploding Foil Initiators (EFI), have historically relied on simple, often empirical, one-dimensional models capable of predicting parameters such as current, voltage, and, in the case of EFIs, flyer velocity. Experimental methods have generally been limited to the same parameters. With the advent of complex, first-principles magnetohydrodynamic codes such as ALEGRA and ALE-MHD, it is now possible to simulate these components in three dimensions and predict a much greater range of parameters than before. A significant improvement in experimental capability was therefore required to ensure these simulations could be adequately verified. In this third paper of a three-part study, the experimental results presented in Part 2 are compared against 3-dimensional MHD simulations. This improved experimental capability, along with advanced simulations, offers an opportunity to gain a greater understanding of the processes behind the functioning of EBW and EFI detonators.
Simulation Modeling and Performance Evaluation of Space Networks
NASA Technical Reports Server (NTRS)
Jennings, Esther H.; Segui, John
2006-01-01
In space exploration missions, the coordinated use of spacecraft as communication relays increases the efficiency of the endeavors. To conduct trade-off studies of the performance and resource usage of different communication protocols and network designs, JPL designed a comprehensive extendable tool, the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE). The design and development of MACHETE began in 2000, and the tool is constantly evolving. Currently, MACHETE contains Consultative Committee for Space Data Systems (CCSDS) protocol standards such as Proximity-1, Advanced Orbiting Systems (AOS), Packet Telemetry/Telecommand, Space Communications Protocol Specification (SCPS), and the CCSDS File Delivery Protocol (CFDP). MACHETE uses the Aerospace Corporation's Satellite Orbital Analysis Program (SOAP) to generate the orbital geometry information and contact opportunities. Matlab scripts provide the link characteristics. At the core of MACHETE is a discrete event simulator, QualNet. Delay Tolerant Networking (DTN) is an end-to-end architecture providing communication in and/or through highly stressed networking environments. Stressed networking environments include those with intermittent connectivity, large and/or variable delays, and high bit error rates. To provide its services, the DTN protocols reside at the application layer of the constituent internets, forming a store-and-forward overlay network. The key capabilities of the bundling protocols include custody-based reliability, the ability to cope with intermittent connectivity, the ability to take advantage of scheduled and opportunistic connectivity, and late binding of names to addresses. In this presentation, we report on the addition of MACHETE models needed to support DTN, namely the Bundle Protocol (BP) model. To illustrate the use of MACHETE with the additional DTN model, we provide an example simulation to benchmark its performance. We demonstrate the use of the DTN protocol and discuss statistics gathered concerning the total time needed to simulate numerous bundle transmissions.
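The store-and-forward behavior that the Bundle Protocol model captures can be illustrated with a toy custody-transfer simulation over scheduled contact windows; the topology and contact times below are invented, and this is not MACHETE/QualNet code.

```python
# DTN-style store-and-forward: a bundle waits in custody at each hop until
# the next contact window opens and is long enough to finish transmission.

# (start, end) contact windows, in seconds, for hops A->B and B->C
contacts = {("A", "B"): [(10, 20), (60, 70)], ("B", "C"): [(30, 40), (90, 100)]}
route = ["A", "B", "C"]
transmit_time = 5.0

def next_contact(hop, t):
    """Earliest time >= t at which the hop can complete a transmission."""
    for start, end in contacts[hop]:
        if max(t, start) + transmit_time <= end:
            return max(t, start)
    raise RuntimeError("no usable contact")

t = 0.0
for src, dst in zip(route, route[1:]):
    t = next_contact((src, dst), t) + transmit_time   # custody transfer at dst
    print(f"bundle arrives at {dst} at t={t:.0f}s")
# End-to-end delivery time is dominated by waiting for contacts, not by
# link speed -- the behavior DTN simulations are built to quantify.
```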
The Framework for 0-D Atmospheric Modeling (F0AM) v3.1
NASA Technical Reports Server (NTRS)
Wolfe, Glenn M.; Marvin, Margaret R.; Roberts, Sandra J.; Travis, Katherine R.; Liao, Jin
2016-01-01
The Framework for 0-D Atmospheric Modeling (F0AM) is a flexible and user-friendly MATLAB-based platform for simulation of atmospheric chemistry systems. The F0AM interface incorporates front-end configuration of observational constraints and model setups, making it readily adaptable to simulation of photochemical chambers, Lagrangian plumes, and steady-state or time-evolving solar cycles. Six different chemical mechanisms and three options for calculation of photolysis frequencies are currently available. Example simulations are presented to illustrate model capabilities and, more generally, highlight some of the advantages and challenges of 0-D box modeling.
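F0AM itself is MATLAB-based; as a language-neutral illustration of what a 0-D box-model integration does, the following Python sketch integrates the NO/NO2/O3 photostationary-state system. The rate coefficients are rounded textbook values, not values from F0AM's mechanisms.

```python
# 0-D (box) chemistry sketch: NO2 photolysis vs. NO + O3 titration.
import numpy as np
from scipy.integrate import solve_ivp

j_no2 = 8.0e-3      # NO2 photolysis frequency, s^-1 (assumed midday value)
k_no_o3 = 1.8e-14   # NO + O3 rate constant, cm^3 molec^-1 s^-1 (approximate)

def rhs(t, y):
    no, no2, o3 = y
    p = j_no2 * no2        # NO2 + hv -> NO + O (O + O2 -> O3 assumed instant)
    l = k_no_o3 * no * o3  # NO + O3 -> NO2 + O2
    return [p - l, l - p, p - l]

y0 = [2.5e10, 2.5e10, 7.5e11]   # molec cm^-3 (~1, 1, 30 ppbv near the surface)
sol = solve_ivp(rhs, (0.0, 600.0), y0, max_step=1.0)
print("steady-state NO2/NO ratio:", sol.y[1, -1] / sol.y[0, -1])
print("Leighton ratio k[O3]/j:", k_no_o3 * sol.y[2, -1] / j_no2)
```

At steady state the two printed quantities agree, which is the photostationary-state (Leighton) relationship a box model of this kind reproduces.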
Linear fixed-field multipass arcs for recirculating linear accelerators
Morozov, V. S.; Bogacz, S. A.; Roblin, Y. R.; ...
2012-06-14
Recirculating Linear Accelerators (RLA's) provide a compact and efficient way of accelerating particle beams to medium and high energies by reusing the same linac for multiple passes. In the conventional scheme, after each pass, the different energy beams coming out of the linac are separated and directed into appropriate arcs for recirculation, with each pass requiring a separate fixed-energy arc. In this paper we present a concept of an RLA return arc based on linear combined-function magnets, in which two and potentially more consecutive passes with very different energies are transported through the same string of magnets. By adjusting the dipole and quadrupole components of the constituting linear combined-function magnets, the arc is designed to be achromatic and to have zero initial and final reference orbit offsets for all transported beam energies. We demonstrate the concept by developing a design for a droplet-shaped return arc for a dog-bone RLA capable of transporting two beam passes with momenta different by a factor of two. Finally, we present the results of tracking simulations of the two passes and lay out the path to end-to-end design and simulation of a complete dog-bone RLA.
Space Logistics: Launch Capabilities
NASA Technical Reports Server (NTRS)
Furnas, Randall B.
1989-01-01
The current maximum launch capability for the United States is shown. The predicted Earth-to-orbit requirements for the United States are presented. Contrasting the two indicates the strong national need for a major increase in Earth-to-orbit lift capability. Approximate weights for planned payloads are shown. NASA is studying the following options to meet the need for a new heavy-lift capability by the mid-to-late 1990s: (1) Shuttle-C for the near term (including growth versions); and (2) the Advanced Launch System (ALS) for the long term. The current baseline two-engine Shuttle-C has a 15 x 82 ft payload bay and an expected lift capability of 82,000 lb to Low Earth Orbit. Several options are being considered which have expanded-diameter payload bays. A three-engine Shuttle-C with an expected lift of 145,000 lb to LEO is being evaluated as well. The Advanced Launch System (ALS) is a potential joint development between the Air Force and NASA. This program is focused toward long-term launch requirements, specifically beyond the year 2000. The basic approach is to develop a family of vehicles with the same high reliability as the Shuttle system, yet offering a much greater lift capability at a greatly reduced cost (per pound of payload). The ALS unmanned family of vehicles will provide a low-end lift capability equivalent to Titan IV, and a high-end lift capability greater than the Soviet Energia, if requirements for such a high-end vehicle are defined. In conclusion, the planning of the next generation space telescope should not be constrained to the current launch vehicles. New vehicle designs will be driven by the needs of anticipated heavy users.
Stability Analysis of Intertank Formed Skin/Stringer Compression Panel with Simulated Damage
NASA Technical Reports Server (NTRS)
Harper, David W.; Wingate, Robert J.
2012-01-01
The External Tank (ET) is a component of the Space Shuttle launch vehicle that contains the fuel and oxidizer. During launch, the ET supplies the Space Shuttle main engines with liquid hydrogen and liquid oxygen. In addition to supplying fuel and oxidizer, it is the backbone structural component of the Space Shuttle. It comprises a liquid hydrogen (LH2) tank and a liquid oxygen (LOX) tank, which are separated by an Intertank. The Intertank is a stringer-stiffened cylindrical structure with hat-section stringers that are roll formed from aluminum-lithium alloy Al-2090. Cracks in the Intertank stringers of the STS-133 ET were noticed after a November 5, 2010 launch attempt. The cracks were approximately nine inches long, occurred on the forward end of the Intertank (near the LOX tank) along the fastener line, and were believed to have formed while loading the ET with the cryogenic propellants. These cracks generated questions about the structural integrity of the Intertank. In order to determine the structural capability of the Intertank with varying degrees of damage, a finite element model (FEM) simulating a 1995 compression panel test was analyzed and correlated to test data. Varying degrees of damage were simulated in the FEM, and nonlinear stability analyses were performed. The high degree of similarity between the compression panel and the Intertank provided confidence that the ET Intertank would have similar capabilities.
Securing Sensitive Flight and Engine Simulation Data Using Smart Card Technology
NASA Technical Reports Server (NTRS)
Blaser, Tammy M.
2003-01-01
NASA Glenn Research Center has developed a smart card prototype capable of encrypting and decrypting disk files required to run a distributed aerospace propulsion simulation. Triple Data Encryption Standard (3DES) encryption is used to secure the sensitive intellectual property on disk before, during, and after simulation execution. The prototype operates as a secure system and maintains its authorized state by safely storing and permanently retaining the encryption keys only on the smart card. The prototype is capable of authenticating a single smart card user and includes pre-simulation and post-simulation tools for analysis and training purposes. The prototype's design is highly generic and can be used to protect any sensitive disk files, with growth capability to run multiple simulations. The NASA computer engineer developed the prototype in an interoperable programming environment to enable porting to other Numerical Propulsion System Simulation (NPSS) capable operating system environments.
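A minimal sketch of 3DES file protection in the spirit of the prototype, using the PyCryptodome library. The prototype's actual key management lives on the smart card; the in-memory key, file names, and CBC-mode framing here are illustrative assumptions only.

```python
# Hedged sketch: 3DES (CBC mode) encryption/decryption of a disk file.
from Crypto.Cipher import DES3
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

# In the prototype the key never leaves the smart card; generated here instead.
key = DES3.adjust_key_parity(get_random_bytes(24))

def encrypt_file(path_in, path_out):
    data = open(path_in, "rb").read()
    iv = get_random_bytes(8)                     # DES3 block size is 8 bytes
    cipher = DES3.new(key, DES3.MODE_CBC, iv)
    open(path_out, "wb").write(iv + cipher.encrypt(pad(data, DES3.block_size)))

def decrypt_file(path_in, path_out):
    blob = open(path_in, "rb").read()
    cipher = DES3.new(key, DES3.MODE_CBC, blob[:8])   # IV stored as file header
    open(path_out, "wb").write(unpad(cipher.decrypt(blob[8:]), DES3.block_size))
```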
Asynchronous transfer mode link performance over ground networks
NASA Technical Reports Server (NTRS)
Chow, E. T.; Markley, R. W.
1993-01-01
The results of an experiment to determine the feasibility of using asynchronous transfer mode (ATM) technology to support advanced spacecraft missions that require high-rate ground communications and, in particular, full-motion video are reported. Potential nodes in such a ground network include Deep Space Network (DSN) antenna stations, the Jet Propulsion Laboratory, and a set of national and international end users. The experiment simulated a lunar microrover, lunar lander, the DSN ground communications system, and distributed science users. The users were equipped with video-capable workstations. A key feature was an optical fiber link between two high-performance workstations equipped with ATM interfaces. Video was also transmitted through JPL's institutional network to a user 8 km from the experiment. Variations in video performance depending on the networks and computers were observed, and the results are reported.
Building Airport Surface HITL Simulation Capability
NASA Technical Reports Server (NTRS)
Chinn, Fay Cherie
2016-01-01
FutureFlight Central (FFC) is a high-fidelity, real-time simulator designed to study surface operations and automation. As an air traffic control tower simulator, FFC allows stakeholders such as the FAA, controllers, pilots, airports, and airlines to develop and test advanced surface and terminal-area concepts and automation, including NextGen and beyond automation concepts and tools. These technologies will improve the safety, capacity, and environmental issues facing the National Airspace System. FFC also has extensive video streaming capabilities, which, combined with its 3-D database capability, make the facility ideal for any research needing an immersive virtual and/or video environment. FutureFlight Central allows human-in-the-loop testing, which accommodates human interactions and errors, giving a more complete picture than fast-time simulations. This presentation describes FFC's capabilities and the components necessary to build an airport surface human-in-the-loop simulation capability.
NASA Astrophysics Data System (ADS)
Hartmann, H. C.; Pagano, T. C.; Sorooshian, S.; Bales, R.
2002-12-01
Expectations for hydroclimatic research are evolving as changes in the contract between science and society require researchers to provide "usable science" that can improve resource management policies and practices. However, decision makers have a broad range of abilities to access, interpret, and apply scientific research. "High-end users" have technical capabilities and operational flexibility capable of readily exploiting new information and products. "Low-end users" have fewer resources and are less likely to change their decision making processes without clear demonstration of benefits by influential early adopters (i.e., high-end users). Should research programs aim for efficiency, targeting high-end users? Should they aim for impact, targeting decisions with high economic value or great influence (e.g., state or national agencies)? Or should they focus on equity, whereby outcomes benefit groups across a range of capabilities? In this case study, we focus on hydroclimatic variability and forecasts. Agencies and individuals responsible for resource management decisions have varying perspectives about hydroclimatic variability and opportunities for using forecasts to improve decision outcomes. Improper interpretation of forecasts is widespread and many individuals find it difficult to place forecasts in an appropriate regional historical context. In addressing these issues, we attempted to mitigate traditional inequities in the scope, communication, and accessibility of hydroclimatic research results. High-end users were important in prioritizing information needs, while low-end users were important in determining how information should be communicated. For example, high-end users expressed hesitancy to use seasonal forecasts in the absence of quantitative performance evaluations. Our subsequently developed forecast evaluation framework and research products, however, were guided by the need for a continuum of evaluation measures and interpretive materials to enable low-end users to increase their understanding of probabilistic forecasts, credibility concepts, and implications for decision making. We also developed an interactive forecast assessment tool accessible over the Internet, to support resource decisions by individuals as well as agencies. The tool provides tutorials for guiding forecast interpretation, including quizzes that allow users to test their forecast interpretation skills. Users can monitor recent and historical observations for selected regions, communicated using terminology consistent with available forecast products. The tool also allows users to evaluate forecast performance for the regions, seasons, forecast lead times, and performance criteria relevant to their specific decision making situations. Using consistent product formats, the evaluation component allows individuals to use results at the level they are capable of understanding, while offering opportunity to shift to more sophisticated criteria. Recognizing that many individuals lack Internet access, the forecast assessment webtool design also includes capabilities for customized report generation so extension agents or other trusted information intermediaries can provide material to decision makers at meetings or site visits.
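As one concrete example of the kind of quantitative performance evaluation the high-end users above asked for, the sketch below computes a Brier score and a skill score against climatology for probabilistic forecasts. All forecast and outcome values are invented for illustration.

```python
# Brier score: a standard accuracy measure for probabilistic forecasts.
def brier_score(probs, outcomes):
    """Mean squared error of forecast probabilities against 0/1 outcomes."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

forecasts = [0.7, 0.4, 0.9, 0.2, 0.6]   # forecast P(above-median seasonal flow)
observed  = [1,   0,   1,   0,   0]     # what actually happened (hypothetical)
climatology = [0.5] * 5                 # naive reference forecast

bs = brier_score(forecasts, observed)
bs_ref = brier_score(climatology, observed)
print(f"Brier score {bs:.3f}; skill vs climatology {1 - bs / bs_ref:.2f}")
```

A skill score above zero means the forecasts beat the climatological reference, which is the sort of interpretive anchor low-end users need before trusting probabilistic products.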
DOE Office of Scientific and Technical Information (OSTI.GOV)
Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.
2011-02-01
This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.
NASA Astrophysics Data System (ADS)
Burtovoi, A.; Zampieri, L.; Giuliani, A.; Bigongiari, C.; Di Pierro, F.; Stamerra, A.
2017-01-01
The development and construction of the Cherenkov Telescope Array (CTA) opens up new opportunities for the study of very high energy (VHE, E > 100 GeV) sources. As part of CTA, the ASTRI project, led by INAF, has as one of its main goals the development of one of the mini-arrays of CTA pre-production telescopes, proposed to be installed at the CTA southern site. Thanks to the innovative dual-mirror optical design of its small-sized telescopes, the ASTRI mini-array will be characterized by a large field of view, an excellent angular resolution, and a good sensitivity up to energies of several tens of TeV. Pulsar wind nebulae, along with supernova remnants, are among the most abundant sources that will be identified and investigated, with the ultimate goal of moving significantly closer to an understanding of the origin of cosmic rays (CR). As part of the ongoing effort to investigate the scientific capabilities of both CTA as a whole and the ASTRI mini-array, we performed simulations of the Vela X region. We simulated its extended VHE γ-ray emission using the results of the detailed H.E.S.S. analysis of this source. We estimated the capability to resolve the diffuse emission and the detection significance of the pulsar with both CTA as a whole and the ASTRI mini-array. Moreover, with these instruments it will be possible to observe the high-energy end of the SNR spectrum, searching for particles with energies near the cosmic-ray "knee" (E ~ 10^15 eV). We simulated a set of ASTRI mini-array observations for one young and one evolved SNR in order to test the capabilities of this instrument to discover and study PeVatrons on the Galactic plane.
Evolutionary online behaviour learning and adaptation in real robots.
Silva, Fernando; Correia, Luís; Christensen, Anders Lyhne
2017-07-01
Online evolution of behavioural control on real robots is an open-ended approach to autonomous learning and adaptation: robots have the potential to automatically learn new tasks and to adapt to changes in environmental conditions, or to failures in sensors and/or actuators. However, studies have so far almost exclusively been carried out in simulation because evolution in real hardware has required several days or weeks to produce capable robots. In this article, we successfully evolve neural network-based controllers in real robotic hardware to solve two single-robot tasks and one collective robotics task. Controllers are evolved either from random solutions or from solutions pre-evolved in simulation. In all cases, capable solutions are found in a timely manner (1 h or less). Results show that more accurate simulations may lead to higher-performing controllers, and that completing the optimization process in real robots is meaningful, even if solutions found in simulation differ from solutions in reality. We furthermore demonstrate for the first time the adaptive capabilities of online evolution in real robotic hardware, including robots able to overcome faults injected in the motors of multiple units simultaneously, and to modify their behaviour in response to changes in the task requirements. We conclude by assessing the contribution of each algorithmic component on the performance of the underlying evolutionary algorithm.
NASA Technical Reports Server (NTRS)
Goodrich, Kenneth H.; McManus, John W.; Chappell, Alan R.
1992-01-01
A batch air combat simulation environment known as the Tactical Maneuvering Simulator (TMS) is presented. The TMS serves as a tool for developing and evaluating tactical maneuvering logics. The environment can also be used to evaluate the tactical implications of perturbations to aircraft performance or supporting systems. The TMS is capable of simulating air combat between any number of engagement participants, with practical limits imposed by computer memory and processing power. Aircraft are modeled using equations of motion, control laws, aerodynamics and propulsive characteristics equivalent to those used in high-fidelity piloted simulation. Databases representative of a modern high-performance aircraft with and without thrust-vectoring capability are included. To simplify the task of developing and implementing maneuvering logics in the TMS, an outer-loop control system known as the Tactical Autopilot (TA) is implemented in the aircraft simulation model. The TA converts guidance commands issued by computerized maneuvering logics in the form of desired angle-of-attack and wind axis-bank angle into inputs to the inner-loop control augmentation system of the aircraft. This report describes the capabilities and operation of the TMS.
Integrated Simulation Design Challenges to Support TPS Repair Operations
NASA Technical Reports Server (NTRS)
Quiocho, Leslie J.; Crues, Edwin Z.; Huynh, An; Nguyen, Hung T.; MacLean, John
2005-01-01
During the Orbiter Repair Maneuver (ORM) operations planned for Return to Flight (RTF), the Shuttle Remote Manipulator System (SRMS) must grapple the International Space Station (ISS), undock the Orbiter, maneuver it through a long-duration trajectory, and orient it to an EVA crewman poised at the end of the Space Station Remote Manipulator System (SSRMS) to facilitate the repair of the Thermal Protection System (TPS). Once repair has been completed and confirmed, the SRMS proceeds back through the trajectory to dock the Orbiter to the Orbiter Docking System. In order to support analysis of the complex dynamic interactions of the integrated system formed by the Orbiter, ISS, SRMS, and SSRMS during the ORM, simulation tools used for previous 'nominal' mission support required substantial enhancements. These upgrades were necessary to provide analysts with the capabilities needed to study integrated system performance. This paper discusses the simulation design challenges encountered while developing simulation capabilities to mirror the ORM operations. The paper also describes the incremental build approach that was utilized, starting with the subsystem simulation elements and integrating them into increasingly complex simulations until the resulting ORM worksite dynamics simulation had been assembled. Furthermore, the paper presents an overall integrated simulation V&V methodology based upon subsystem-level testing, integrated comparisons, and phased checkout.
Opportunities for leveraging OS virtualization in high-end supercomputing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bridges, Patrick G.; Pedretti, Kevin Thomas Tauke
2010-11-01
This paper examines potential motivations for incorporating virtualization support in the system software stacks of high-end capability supercomputers. We advocate that this will increase the flexibility of these platforms significantly and enable new capabilities that are not possible with current fixed software stacks. Our results indicate that compute, virtual memory, and I/O virtualization overheads are low and can be further mitigated by utilizing well-known techniques such as large paging and VMM bypass. Furthermore, since the addition of virtualization support does not affect the performance of applications using the traditional native environment, there is essentially no disadvantage to its addition.
NASA Technical Reports Server (NTRS)
Feinberg, Lee; Rioux, Norman; Bolcar, Matthew; Liu, Alice; Guyon, Oliver; Stark, Chris; Arenberg, Jon
2016-01-01
Key challenges of a future large-aperture, segmented Ultraviolet Optical Infrared (UVOIR) telescope capable of performing a spectroscopic survey of hundreds of exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high-yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high-throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage-based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance. These efforts are combined through integrated modeling, coronagraph evaluations, and Exo-Earth yield calculations to assess the potential performance of the selected architecture. In addition, we discuss the scalability of this architecture to larger apertures and the technological tall poles to enabling it.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rutland, Christopher J.
2009-04-26
The Terascale High-Fidelity Simulations of Turbulent Combustion (TSTC) project is a multi-university collaborative effort to develop a high-fidelity turbulent reacting flow simulation capability utilizing terascale, massively parallel computer technology. The main paradigm of the approach is direct numerical simulation (DNS) featuring the highest temporal and spatial accuracy, allowing quantitative observations of the fine-scale physics found in turbulent reacting flows as well as providing a useful tool for development of sub-models needed in device-level simulations. Under this component of the TSTC program the simulation code named S3D, developed and shared with coworkers at Sandia National Laboratories, has been enhanced with new numerical algorithms and physical models to provide predictive capabilities for turbulent liquid fuel spray dynamics. Major accomplishments include improved fundamental understanding of mixing and auto-ignition in multi-phase turbulent reactant mixtures and turbulent fuel injection spray jets.
Podolsky, Dale J; Fisher, David M; Wong Riff, Karen W; Szasz, Peter; Looi, Thomas; Drake, James M; Forrest, Christopher R
2018-06-01
This study assessed technical performance in cleft palate repair using a newly developed assessment tool and a high-fidelity cleft palate simulator through a longitudinal simulation training exercise. Three residents performed five, and one resident performed nine, consecutive endoscopically recorded cleft palate repairs using a cleft palate simulator. Two fellows in pediatric plastic surgery and two expert cleft surgeons also performed recorded simulated repairs. The Cleft Palate Objective Structured Assessment of Technical Skill (CLOSATS) and end-product scales were developed to assess performance. Two blinded cleft surgeons assessed the recordings and the final repairs using the CLOSATS, the end-product scale, and a previously developed global rating scale. The average procedure-specific (CLOSATS), global rating, and end-product scores increased logarithmically after each successive simulation session for the residents. Reliability of the CLOSATS (average item intraclass correlation coefficient (ICC), 0.85 ± 0.093) and of the global ratings (average item ICC, 0.91 ± 0.02) among the raters was high. Reliability of the end-product assessments was lower (average item ICC, 0.66 ± 0.15). Standard-setting linear regression using an overall cutoff score of 7 of 10 corresponded to pass scores for the CLOSATS and the global score of 44 (maximum, 60) and 23 (maximum, 30), respectively. Using logarithmic best-fit curves, 6.3 simulation sessions are required to reach the minimum standard. A high-fidelity cleft palate simulator has been developed that improves technical performance in cleft palate repair. The simulator and technical assessment scores can be used to determine performance before operating on patients.
High-fidelity large eddy simulation for supersonic jet noise prediction
NASA Astrophysics Data System (ADS)
Aikens, Kurt M.
The problem of intense sound radiation from supersonic jets is a concern for both civil and military applications. As a result, many experimental and computational efforts are focused at evaluating possible noise suppression techniques. Large-eddy simulation (LES) is utilized in many computational studies to simulate the turbulent jet flowfield. Integral methods such as the Ffowcs Williams-Hawkings (FWH) method are then used for propagation of the sound waves to the farfield. Improving the accuracy of this two-step methodology and evaluating beveled converging-diverging nozzles for noise suppression are the main tasks of this work. First, a series of numerical experiments are undertaken to ensure adequate numerical accuracy of the FWH methodology. This includes an analysis of different treatments for the downstream integration surface: with or without including an end-cap, averaging over multiple end-caps, and including an approximate surface integral correction term. Secondly, shock-capturing methods based on characteristic filtering and adaptive spatial filtering are used to extend a highly-parallelizable multiblock subsonic LES code to enable simulations of supersonic jets. The code is based on high-order numerical methods for accurate prediction of the acoustic sources and propagation of the sound waves. Furthermore, this new code is more efficient than the legacy version, allows cylindrical multiblock topologies, and is capable of simulating nozzles with resolved turbulent boundary layers when coupled with an approximate turbulent inflow boundary condition. Even though such wall-resolved simulations are more physically accurate, their expense is often prohibitive. To make simulations more economical, a wall model is developed and implemented. The wall modeling methodology is validated for turbulent quasi-incompressible and compressible zero pressure gradient flat plate boundary layers, and for subsonic and supersonic jets. The supersonic code additions and the wall model treatment are then utilized to simulate military-style nozzles with and without beveling of the nozzle exit plane. Experiments of beveled converging-diverging nozzles have found reduced noise levels for some observer locations. Predicting the noise for these geometries provides a good initial test of the overall methodology for a more complex nozzle. The jet flowfield and acoustic data are analyzed and compared to similar experiments and excellent agreement is found. Potential areas of improvement are discussed for future research.
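To illustrate why averaging over multiple end-caps helps (one of the FWH surface treatments examined above): the physical acoustic signal is common to all closing surfaces, while the spurious contribution of turbulence crossing each cap is largely uncorrelated between caps, so averaging suppresses it. The sketch below demonstrates the statistics with synthetic stand-in signals, not LES data.

```python
# Conceptual sketch: averaging far-field signals from several end-cap
# locations cancels uncorrelated spurious noise but keeps the shared signal.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 2000)
true_acoustic = np.sin(2 * np.pi * 40 * t)        # common physical signal
# Eight hypothetical end-cap placements, each with uncorrelated contamination.
caps = [true_acoustic + 0.8 * rng.standard_normal(t.size) for _ in range(8)]

p_avg = np.mean(caps, axis=0)                     # average over end-caps
err_one = np.std(caps[0] - true_acoustic)
err_avg = np.std(p_avg - true_acoustic)
print(f"spurious rms: single cap {err_one:.2f}, 8-cap average {err_avg:.2f}")
```

The averaged error falls roughly as the square root of the number of caps, which is the rationale for the multiple end-cap treatment.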
2001-07-01
[Conference proceedings front matter; only fragments are recoverable. Major General A C Figgures, Capability Manager (Manoeuvre), UK MOD, provided the conference with a fitting end message encouraging the SE and M&S community. The remainder is table-of-contents residue: a welcoming address ('Synthetic Environments - Managing the Breakout', M. Markin), an opening address for the NATO M&S Conference (G. Sürsal), a keynote address (G.J. Burrows), 'Industry's Role' (M. Mansell), 'The RMCS SSEL' (J.R. Searle), and Session 1: Policy, Strategy & Management.]
Modelling the transient behaviour of pulsed current tungsten-inert-gas weldpools
NASA Astrophysics Data System (ADS)
Wu, C. S.; Zheng, W.; Wu, L.
1999-01-01
A three-dimensional model is established to simulate the pulsed current tungsten-inert-gas (TIG) welding process. The goal is to analyse the cyclic variation of fluid flow and heat transfer in weldpools under periodic arc heat input. To this end, an algorithm is developed that is capable of handling the transience, nonlinearity, multiphase behaviour and strong coupling encountered in this work. The numerical simulations demonstrate the transient behaviour of weldpools under pulsed current. Experimental data are compared with numerical results to show the effectiveness of the developed model.
Passivated diamond film temperature sensing probe and measuring system employing same
Young, Jack P.; Mamantov, Gleb
1998-01-01
A high temperature sensing probe includes an optical fiber or rod having a distal end and a proximal end. The optical fiber or rod has a coating secured to the distal end thereof, wherein the coating is capable of producing a Raman spectrum when exposed to an exciting radiation source.
Simulation study of a new inverse-pinch high Coulomb transfer switch
NASA Technical Reports Server (NTRS)
Choi, S. H.
1984-01-01
A simulation study of a simplified model of a high-coulomb transfer switch is performed. The switch operates in an inverse-pinch geometry formed by an all-metal chamber, which greatly reduces hot-spot formation on the electrode surfaces. Advantages of the switch over conventional switches are longer useful life, higher current capability, and lower inductance, which improve the characteristics required for a high-repetition-rate switch. The simulation determines the design parameters by analytical computation and comparison with the experimentally measured risetime, current-handling capability, electrode damage, and hold-off voltages. The parameters of an initial switch design can thus be determined for the anticipated switch performance. The results are in agreement with experimental measurements. Although the model is simplified, switch characteristics such as risetime, current-handling capability, electrode damage, and hold-off voltages are accurately determined.
Laser-driven x-ray and neutron source development for industrial applications of plasma accelerators
NASA Astrophysics Data System (ADS)
Brenner, C. M.; Mirfayzi, S. R.; Rusby, D. R.; Armstrong, C.; Alejo, A.; Wilson, L. A.; Clarke, R.; Ahmed, H.; Butler, N. M. H.; Haddock, D.; Higginson, A.; McClymont, A.; Murphy, C.; Notley, M.; Oliver, P.; Allott, R.; Hernandez-Gomez, C.; Kar, S.; McKenna, P.; Neely, D.
2016-01-01
Pulsed beams of energetic x-rays and neutrons from intense laser interactions with solid foils are promising for applications where bright, small emission area sources, capable of multi-modal delivery are ideal. Possible end users of laser-driven multi-modal sources are those requiring advanced non-destructive inspection techniques in industry sectors of high value commerce such as aerospace, nuclear and advanced manufacturing. We report on experimental work that demonstrates multi-modal operation of high power laser-solid interactions for neutron and x-ray beam generation. Measurements and Monte Carlo radiation transport simulations show that neutron yield is increased by a factor ~2 when a 1 mm copper foil is placed behind a 2 mm lithium foil, compared to using a 2 cm block of lithium only. We explore x-ray generation with a 10 picosecond drive pulse in order to tailor the spectral content for radiography with medium density alloy metals. The impact of using >1 ps pulse duration on laser-accelerated electron beam generation and transport is discussed alongside the optimisation of subsequent bremsstrahlung emission in thin, high atomic number target foils. X-ray spectra are deconvolved from spectrometer measurements and simulation data generated using the GEANT4 Monte Carlo code. We also demonstrate the unique capability of laser-driven x-rays in being able to deliver single pulse high spatial resolution projection imaging of thick metallic objects. Active detector radiographic imaging of industrially relevant sample objects with a 10 ps drive pulse is presented for the first time, demonstrating that features of 200 μm size are resolved when projected at high magnification.
NASA Astrophysics Data System (ADS)
Ito, Mikiko; Lee, Jae Sung; Park, Min-Jae; Sim, Kwang-Souk; Jong Hong, Seong
2010-07-01
PET detectors with depth-of-interaction (DOI) encoding capability allow high spatial resolution and high sensitivity to be achieved simultaneously. To obtain DOI information from a mono-layer array of scintillation crystals using a single-ended readout, the authors devised a method based on light spreading within a crystal array and performed Monte Carlo simulations with individual scintillation photon tracking to prove the concept. A scintillation crystal array model was constructed using a grid method. Conventional grids are constructed using comb-shaped reflector strips with rectangular teeth to isolate scintillation crystals optically. However, the authors propose the use of triangularly shaped teeth, such that scintillation photons spread only in the x-direction in the upper halves of crystals and in the y-direction in lower halves. DOI positions can be estimated by considering the extent of two-dimensional light dispersion, which can be determined from the multiple anode outputs of a position-sensitive PMT placed under the crystal array. In the main simulation, a crystal block consisting of a 29 × 29 array of 1.5 mm × 1.5 mm × 20 mm crystals and a multi-anode PMT with 16 × 16 pixels were used. The effects of crystal size and non-uniform PMT output gain were also explored by simulation. The DOI resolution estimated for 1.5 × 1.5 × 20 mm3 crystals was 2.16 mm on average. Although the flood map was depth dependent, each crystal was well identified at all depths when a corner of the crystal array was irradiated with 511 keV gamma rays (peak-to-valley ratio ~9:1). DOI resolution was better than 3 mm up to a crystal length of 28 mm with a 1.5 × 1.5 mm2 or 2.0 × 2.0 mm2 crystal surface area. The devised light-sharing method allowed excellent DOI resolutions to be obtained without the use of dual-ended readout or multiple crystal arrays.
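A toy illustration (not the authors' reconstruction code) of the light-sharing idea described above: because the triangular reflector teeth confine light spread to x in the upper half of each crystal and to y in the lower half, the relative x/y dispersion of the anode image encodes interaction depth. The mapping below is a deliberately crude stand-in; the geometry-to-depth calibration in the paper comes from the Monte Carlo photon tracking.

```python
# Toy DOI estimate from the x/y light dispersion on a 16x16 anode image.
import numpy as np

def spread(weights, coords):
    """RMS width of a 1-D light distribution."""
    mean = np.average(coords, weights=weights)
    return np.sqrt(np.average((coords - mean) ** 2, weights=weights))

def doi_estimate(anode, length=20.0):
    """anode: 16x16 array of PMT pixel amplitudes -> depth in mm (toy model)."""
    axis = np.arange(16.0)
    sx = spread(anode.sum(axis=0), axis)   # dispersion along x
    sy = spread(anode.sum(axis=1), axis)   # dispersion along y
    return length * sx / (sx + sy)         # x-dominated spread -> one crystal end

# Crude demo: a Gaussian light spot wider in x than in y.
xx, yy = np.meshgrid(np.arange(16.0), np.arange(16.0))
anode = np.exp(-((xx - 8) ** 2) / 18.0 - ((yy - 8) ** 2) / 4.0)
print(f"estimated depth: {doi_estimate(anode):.1f} mm")
```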
Cybersecurity in Hospitals: A Systematic, Organizational Perspective
Kaiser, Jessica P
2018-01-01
Background Cybersecurity incidents are a growing threat to the health care industry in general and hospitals in particular. The health care industry has lagged behind other industries in protecting its main stakeholder (ie, patients), and now hospitals must invest considerable capital and effort in protecting their systems. However, this is easier said than done because hospitals are extraordinarily technology-saturated, complex organizations with high end point complexity, internal politics, and regulatory pressures. Objective The purpose of this study was to develop a systematic and organizational perspective for studying (1) the dynamics of cybersecurity capability development at hospitals and (2) how these internal organizational dynamics interact to form a system of hospital cybersecurity in the United States. Methods We conducted interviews with hospital chief information officers, chief information security officers, and health care cybersecurity experts; analyzed the interview data; and developed a system dynamics model that unravels the mechanisms by which hospitals build cybersecurity capabilities. We then use simulation analysis to examine how changes to variables within the model affect the likelihood of cyberattacks across both individual hospitals and a system of hospitals. Results We discuss several key mechanisms that hospitals use to reduce the likelihood of cybercriminal activity. The variable that most influences the risk of cyberattack in a hospital is end point complexity, followed by internal stakeholder alignment. Although resource availability is important in fueling efforts to close cybersecurity capability gaps, low levels of resources could be compensated for by setting a high target level of cybersecurity. Conclusions To enhance cybersecurity capabilities at hospitals, the main focus of chief information officers and chief information security officers should be on reducing end point complexity and improving internal stakeholder alignment. These strategies can solve cybersecurity problems more effectively than blindly pursuing more resources. On a macro level, the cyber vulnerability of a country’s hospital infrastructure is affected by the vulnerabilities of all individual hospitals. In this large system, reducing variation in resource availability makes the whole system less vulnerable—a few hospitals with low resources for cybersecurity threaten the entire infrastructure of health care. In other words, hospitals need to move forward together to make the industry less attractive to cybercriminals. Moreover, although compliance is essential, it does not equal security. Hospitals should set their target level of cybersecurity beyond the requirements of current regulations and policies. As of today, policies mostly address data privacy, not data security. Thus, policy makers need to introduce policies that not only raise the target level of cybersecurity capabilities but also reduce the variability in resource availability across the entire health care system. PMID:29807882
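A minimal stock-and-flow sketch in the spirit of the system dynamics model described above; the structure and every coefficient are illustrative placeholders, not the authors' calibrated model.

```python
# Toy system dynamics: a cybersecurity-capability stock chases a target level;
# attack risk rises with end point complexity and with the capability shortfall.
def simulate(years=10.0, dt=0.25, target=0.9, resources=0.3, complexity=0.6):
    capability, t, attack_risk = 0.2, 0.0, 1.0
    while t < years:
        gap = max(target - capability, 0.0)
        # capability-building inflow (limited by resources) minus obsolescence
        capability += dt * (resources * gap - 0.05 * capability)
        attack_risk = complexity * (1.0 - capability)
        t += dt
    return capability, attack_risk

for res in (0.1, 0.3, 0.6):
    cap, risk = simulate(resources=res)
    print(f"resources={res:.1f} -> capability {cap:.2f}, attack risk {risk:.2f}")
```

Even in this caricature, the qualitative findings above are visible: a high target compensates for modest resources, and no resource level eliminates the risk driven by end point complexity.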
High fidelity studies of exploding foil initiator bridges, Part 1: Experimental method
NASA Astrophysics Data System (ADS)
Bowden, Mike; Neal, William
2017-01-01
Simulations of high voltage detonators, such as Exploding Bridgewire (EBW) and Exploding Foil Initiators (EFI), have historically been simple, often empirical, one-dimensional models capable of predicting parameters such as current, voltage and, in the case of EFIs, flyer velocity. Correspondingly, experimental methods have in general been limited to the same parameters. With the advent of complex, first-principles magnetohydrodynamic codes such as ALEGRA and ALE-MHD, it is now possible to simulate these components in three dimensions, predicting a much greater range of parameters than before. A significant improvement in experimental capability was therefore required to ensure these simulations could be adequately validated. In this first paper of a three-part study, the experimental method for determining the current, voltage, flyer velocity and multi-dimensional profile of detonator components is presented. This improved capability, along with high-fidelity simulations, offers an opportunity to gain a greater understanding of the processes behind the functioning of EBW and EFI detonators.
The Osseus platform: a prototype for advanced web-based distributed simulation
NASA Astrophysics Data System (ADS)
Franceschini, Derrick; Riecken, Mark
2016-05-01
Recent technological advances in web-based distributed computing and database technology have made possible a deeper and more transparent integration of some modeling and simulation applications. Despite these advances towards true integration of capabilities, disparate systems, architectures, and protocols will remain in the inventory for some time to come. These disparities present interoperability challenges for distributed modeling and simulation whether the application is training, experimentation, or analysis. Traditional approaches call for building gateways to bridge between disparate protocols and retaining interoperability specialists. Challenges in reconciling data models also persist. These challenges and their traditional mitigation approaches directly contribute to higher costs, schedule delays, and frustration for the end users. Osseus is a prototype software platform originally funded as a research project by the Defense Modeling & Simulation Coordination Office (DMSCO) to examine interoperability alternatives using modern, web-based technology and taking inspiration from the commercial sector. Osseus provides tools and services for nonexpert users to connect simulations, targeting the time and skillset needed to successfully connect disparate systems. The Osseus platform presents a web services interface to allow simulation applications to exchange data using modern techniques efficiently over Local or Wide Area Networks. Further, it provides Service Oriented Architecture capabilities such that finer granularity components such as individual models can contribute to simulation with minimal effort.
End-to-End simulations for the MICADO-MAORY SCAO mode
NASA Astrophysics Data System (ADS)
Vidal, Fabrice; Ferreira, Florian; Déo, Vincent; Sevin, Arnaud; Gendron, Eric; Clénet, Yann; Durand, Sébastien; Gratadour, Damien; Doucet, Nicolas; Rousset, Gérard; Davies, Richard
2018-04-01
MICADO is an E-ELT first-light near-infrared imager. It will work at the diffraction limit of the telescope thanks to multi-conjugate adaptive optics (MCAO) and single-conjugate adaptive optics (SCAO) modes provided inside the MAORY AO module. The SCAO capability is a joint development by the MICADO and MAORY consortia, led by MICADO, and is motivated by scientific programs for which SCAO will deliver the best AO performance (e.g. exoplanets, solar system science, bright AGNs, etc.). Shack-Hartmann (SH) and Pyramid WFS were both envisioned for the wavefront measurement of the SCAO mode. In addition to WFS design considerations, numerical simulations are therefore needed to trade off between these two WFS. COMPASS (COMputing Platform for Adaptive optics SyStems) is a GPU-based adaptive optics end-to-end simulation platform allowing us to perform numerical simulations in various modes (SCAO, LTAO, MOAO, MCAO...). COMPASS was originally bound to Yorick for its user interface; a major upgrade was recently performed to bind it to Python, allowing better long-term support for the community. Thanks to the computational speed of COMPASS, we were able to quickly span a very large parameter space at the E-ELT scale. We present the results of the trade study between WFS choice (SH or Pyramid), WFS parameters (detector noise, guide star magnitude, number of subapertures, number of controlled modes...), turbulence conditions, and AO controls for the MICADO-MAORY SCAO mode.
Modeling Martian Dust Using Mars-GRAM
NASA Technical Reports Server (NTRS)
Justh, Hilary L.; Justus, C. G.
2010-01-01
Mars-GRAM is an engineering-level atmospheric model widely used for diverse mission applications. Mars-GRAM's perturbation modeling capability is commonly used, in a Monte Carlo mode, to perform high-fidelity engineering end-to-end simulations for entry, descent, and landing (EDL). From the surface to 80 km altitude, Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM). Mars-GRAM and MGCM use surface topography from the Mars Global Surveyor Mars Orbiter Laser Altimeter (MOLA), with altitudes referenced to the MOLA areoid, or constant-potential surface. Traditional Mars-GRAM options for representing the mean atmosphere along entry corridors include: (1) TES Mapping Years 1 and 2, with Mars-GRAM data coming from MGCM model results driven by observed TES dust optical depth; and (2) TES Mapping Year 0, with user-controlled dust optical depth and Mars-GRAM data interpolated from MGCM model results driven by selected values of globally uniform dust optical depth. Mars-GRAM 2005 has been validated against Radio Science data, and against both nadir and limb data from the Thermal Emission Spectrometer (TES).
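A hedged sketch of how a perturbation model of this kind feeds Monte Carlo EDL analyses: each trajectory sample draws a dispersed density profile about the mean atmosphere. The exponential mean profile, 5% perturbation scale, and vertical correlation below are illustrative stand-ins, not Mars-GRAM values.

```python
# Monte Carlo atmosphere dispersion sketch for EDL trajectory sampling.
import numpy as np

rng = np.random.default_rng(1)
z = np.linspace(0.0, 80.0, 81)            # altitude, km
rho_mean = 0.015 * np.exp(-z / 11.1)      # rough mean density, kg/m^3 (assumed)

def dispersed_profile(scale=0.05, corr=0.9):
    """One random density profile with vertically correlated perturbations."""
    pert = np.zeros_like(z)
    for i in range(1, z.size):            # first-order autoregressive chain
        pert[i] = corr * pert[i - 1] + np.sqrt(1 - corr**2) * rng.standard_normal()
    return rho_mean * (1.0 + scale * pert)

samples = np.array([dispersed_profile() for _ in range(1000)])
print("1-sigma density dispersion at 40 km: "
      f"{samples[:, 40].std() / rho_mean[40] * 100:.1f}%")
```

Each sampled profile would drive one entry trajectory in the Monte Carlo set, so that landing-footprint and heating statistics reflect atmospheric uncertainty.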
NASA Astrophysics Data System (ADS)
Scaff, L.; Li, Y.; Prein, A. F.; Liu, C.; Rasmussen, R.; Ikeda, K.
2017-12-01
A better representation of the diurnal cycle of convective precipitation is essential for the analysis of the energy balance and of water budget components such as runoff, evaporation, and infiltration. Convection-permitting regional climate modeling (CPM) has been shown to improve model performance for summer precipitation, allowing us (1) to simulate the mesoscale processes in more detail and (2) to provide more insight into future changes in convective precipitation under climate change. In this work we investigate the skill of the Weather Research and Forecasting (WRF) model in simulating the summer precipitation diurnal cycle over most of North America. We use 4 km horizontal grid spacing in 13-year-long current and future periods. The future scenario assumes no significant changes in large-scale weather patterns and aims to answer how the weather of the current climate would change if it reoccurred at the end of the century under a high-end emission scenario (Pseudo Global Warming, PGW). We focus on a region centered on the lee side of the Canadian Rocky Mountains, where the summer precipitation amount shows a regional maximum. The historical simulations correctly represent the diurnal cycle. On the lee side of the Canadian Rockies, the increase in convective available potential energy, as well as pronounced low-level moisture flux from the southeast Prairies, explains the local maximum in summer precipitation. The PGW scenario shows an increase in summer precipitation amount and intensity in this region, consistent with a stronger source of moisture and convective energy.
A complete equation of state for non-ideal condensed phase explosives
NASA Astrophysics Data System (ADS)
Wilkinson, S. D.; Braithwaite, M.; Nikiforakis, N.; Michael, L.
2017-12-01
The objective of this work is to improve the robustness and accuracy of numerical simulations of both ideal and non-ideal explosives by introducing temperature dependence in mechanical equations of state for reactants and products. To this end, we modify existing mechanical equations of state to appropriately approximate the temperature in the reaction zone. Mechanical equations of state of the Mie-Grüneisen form are developed with extensions, which allow the temperature to be evaluated appropriately and the temperature equilibrium condition to be applied robustly. Furthermore, the snow plow model is used to capture the effect of porosity on the reactant equation of state. We apply the methodology to predict the velocity of compliantly confined detonation waves. Once reaction rates are calibrated for unconfined detonation velocities, simulations of confined rate sticks and slabs are performed, and the experimental detonation velocities are matched without further parameter alteration, demonstrating the predictive capability of our simulations. We apply the same methodology to both ideal (PBX9502, a high explosive with principal ingredient TATB) and non-ideal (EM120D, an ANE or ammonium nitrate based emulsion) explosives.
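For reference, a generic Mie-Grüneisen-form equation of state ties pressure to a density-dependent reference curve, and a simple constant-specific-heat closure of the kind the temperature extensions address recovers temperature from the energy offset. These are generic textbook forms, not the paper's calibrated extensions:

```latex
p(\rho, e) = p_{\mathrm{ref}}(\rho) + \rho\,\Gamma(\rho)\,\bigl(e - e_{\mathrm{ref}}(\rho)\bigr),
\qquad
T(\rho, e) \approx T_{\mathrm{ref}}(\rho) + \frac{e - e_{\mathrm{ref}}(\rho)}{c_v},
```

where $p_{\mathrm{ref}}$, $e_{\mathrm{ref}}$, and $T_{\mathrm{ref}}$ are evaluated on the reference curve, $\Gamma(\rho)$ is the Grüneisen parameter, and $c_v$ is the specific heat at constant volume. Making $T$ available in this way is what allows the temperature-equilibrium condition between reactants and products to be applied robustly.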
Process Capability of High Speed Micro End-Milling of Inconel 718 with Minimum Quantity Lubrication
NASA Astrophysics Data System (ADS)
Rahman, Mohamed Abd; Yeakub Ali, Mohammad; Rahman Shah Rosli, Abdul; Banu, Asfana
2017-03-01
The demand for micro-parts is expected to grow, and micro-machining has been shown to be a viable manufacturing process to produce them. These micro-products may be produced from hard-to-machine materials such as superalloys, under little or no metal cutting fluid to reduce machining cost or the drawbacks associated with health and the environment. This project investigates the capability of the micro end-milling process for Inconel 718 with minimum quantity lubrication (MQL). A Microtools DT-110 multi-process micro machine was used to machine 10 micro-channels with MQL and 10 more under dry conditions while maintaining the same machining parameters. The width of the micro-channels was measured using a digital microscope and used to determine the process capability indices, Cp and Cpk. QI Macros SPC for Excel was used to analyze the resulting machining data. The results indicated that the micro end-milling process for Inconel 718 was not capable under either MQL or dry cutting conditions, as indicated by Cp values of less than 1.0. However, the use of MQL helped the process to be more stable and capable: the results showed that process variation was greatly reduced by using MQL in micro end-milling of Inconel 718.
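For reference, the process capability indices used above are conventionally defined in terms of the specification limits (here on channel width) and the process mean and standard deviation:

```latex
C_p = \frac{\mathrm{USL} - \mathrm{LSL}}{6\sigma},
\qquad
C_{pk} = \min\!\left(\frac{\mathrm{USL} - \mu}{3\sigma},\; \frac{\mu - \mathrm{LSL}}{3\sigma}\right),
```

where $\mathrm{USL}$ and $\mathrm{LSL}$ are the upper and lower specification limits, $\mu$ is the process mean, and $\sigma$ is the process standard deviation. $C_p$ measures spread alone, $C_{pk}$ additionally penalizes off-center processes, and values below 1.0, as reported above, mean the process spread exceeds the tolerance band.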
GLORIA observations of de-/nitrification during the Arctic winter 2015/16 POLSTRACC campaign
NASA Astrophysics Data System (ADS)
Braun, Marleen; Woiwode, Wolfgang; Höpfner, Michael; Johansson, Sören; Friedl-Vallon, Felix; Oelhaf, Hermann; Preusse, Peter; Ungermann, Jörn; Grooß, Jens-Uwe; Jurkat, Tina; Khosrawi, Farahnaz; Kirner, Ole; Marsing, Andreas; Sinnhuber, Björn-Martin; Voigt, Christiane; Ziereis, Helmut; Orphal, Johannes
2017-04-01
Denitrification, the condensation and sedimentation of HNO3-containing particles in the winter stratosphere at high latitudes, is an important process affecting the deactivation of ozone-depleting halogen species. It modulates the vertical partitioning of chemically active NOy and the vertical redistribution of HNO3 can affect low stratospheric altitudes under sufficiently cold conditions. The capability of associated nitrification to disturb the NOy budget of the climate-relevant lowermost stratosphere (LMS) has hardly been investigated in detail and represents a challenge for model simulations. The Arctic winter 2015/16 was characterized by exceptionally cold stratospheric temperatures and widespread polar stratospheric clouds (PSCs) that were observed from mid-December 2015 until the end of February 2016 down to the LMS. Observations by the GLORIA (Gimballed Limb Observer for Radiance Imaging of the Atmosphere) spectrometer during the POLSTRACC (Polar Stratosphere in a Changing Climate) aircraft mission allow us to study the development of nitrification of the Arctic LMS during and after the 2015/16 PSC period with high vertical resolution. The vertical cross-sections of HNO3 distribution along the HALO (High Altitude and LOng range research aircraft) flight tracks derived from GLORIA observations show the result of significant vertical redistribution of NOy with strong nitrification of up to 6 ppbv in the LMS. We compare the results of the GLORIA observations with simulations by the state-of-the-art chemical-transport model CLaMS and the climate-chemistry model EMAC and discuss the capability of these models to reproduce nitrification of the Arctic LMS.
Perkins, Stephen J; Wright, David W; Zhang, Hailiang; Brookes, Emre H; Chen, Jianhan; Irving, Thomas C; Krueger, Susan; Barlow, David J; Edler, Karen J; Scott, David J; Terrill, Nicholas J; King, Stephen M; Butler, Paul D; Curtis, Joseph E
2016-12-01
The capabilities of current computer simulations provide a unique opportunity to model small-angle scattering (SAS) data at the atomistic level, and to include other structural constraints ranging from molecular and atomistic energetics to crystallography, electron microscopy and NMR. This extends the capabilities of solution scattering and provides deeper insights into the physics and chemistry of the systems studied. Realizing this potential, however, requires integrating the experimental data with a new generation of modelling software. To achieve this, the CCP-SAS collaboration (http://www.ccpsas.org/) is developing open-source, high-throughput and user-friendly software for the atomistic and coarse-grained molecular modelling of scattering data. Robust state-of-the-art molecular simulation engines and molecular dynamics and Monte Carlo force fields provide constraints to the solution structure inferred from the small-angle scattering data, which incorporates the known physical chemistry of the system. The implementation of this software suite involves a tiered approach in which GenApp provides the deployment infrastructure for running applications on both standard and high-performance computing hardware, and SASSIE provides a workflow framework into which modules can be plugged to prepare structures, carry out simulations, calculate theoretical scattering data and compare results with experimental data. GenApp produces the accessible web-based front end termed SASSIE-web, and GenApp and SASSIE also make community SAS codes available. Applications are illustrated by case studies: (i) inter-domain flexibility in two- to six-domain proteins as exemplified by HIV-1 Gag, MASP and ubiquitin; (ii) the hinge conformation in human IgG2 and IgA1 antibodies; (iii) the complex formed between a hexameric protein Hfq and mRNA; and (iv) synthetic 'bottlebrush' polymers.
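The core of "calculate theoretical scattering data" for a bead or atomistic model is often the Debye formula; a minimal sketch with unit scattering lengths follows. Production codes such as SASSIE additionally handle atomic form factors, solvent exclusion, and hydration layers, none of which are modeled here.

```python
# Debye formula: isotropic SAS intensity of a bead model,
# I(q) = sum_ij sin(q r_ij) / (q r_ij), with unit scattering lengths.
import numpy as np

def debye_intensity(coords, q_values):
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    intensity = []
    for q in q_values:
        qr = q * d
        with np.errstate(invalid="ignore", divide="ignore"):
            sinc = np.where(qr == 0.0, 1.0, np.sin(qr) / qr)  # sinc(0) = 1
        intensity.append(sinc.sum())
    return np.array(intensity)

# Toy bead model: 200 random beads in a ~20 Angstrom blob (hypothetical).
coords = np.random.default_rng(2).normal(size=(200, 3)) * 20.0
q = np.linspace(0.01, 0.3, 30)   # inverse Angstroms
print(debye_intensity(coords, q)[:5])
```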
An Efficient Framework Model for Optimizing Routing Performance in VANETs.
Al-Kharasani, Nori M; Zulkarnain, Zuriati Ahmad; Subramaniam, Shamala; Hanapi, Zurina Mohd
2018-02-15
Routing in Vehicular Ad hoc Networks (VANETs) is complicated by the highly dynamic mobility of the nodes. The efficiency of a routing protocol is influenced by a number of factors such as network density, bandwidth constraints, traffic load, and mobility patterns, resulting in frequent changes in network topology. Therefore, Quality of Service (QoS) support is strongly needed to enhance the capability of the routing protocol and improve overall network performance. In this paper, we introduce a statistical framework model to address the problem of optimizing routing configuration parameters in Vehicle-to-Vehicle (V2V) communication. Our framework solution is based on the utilization of network resources to reflect the current state of the network and to balance the trade-off between frequent changes in network topology and the QoS requirements. It consists of three stages: a network simulation stage used to execute different urban scenarios; a function stage used as a competitive approach to aggregate the weighted cost of the factors into a single value; and an optimization stage used to evaluate the communication cost and to obtain the optimal configuration based on the competitive cost. The simulation results show significant performance improvement in terms of Packet Delivery Ratio (PDR), Normalized Routing Load (NRL), Packet Loss (PL), and End-to-End Delay (E2ED).
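A hedged sketch of the aggregation and optimization stages just described: normalized metrics from each simulated configuration are collapsed into one weighted cost, and the minimizing configuration wins. The metric models, weights, and parameter grid below are invented placeholders, not the paper's calibration.

```python
# Weighted-cost aggregation over a grid of candidate routing configurations.
from itertools import product

weights = {"pdr": -0.4, "delay": 0.3, "nrl": 0.2, "loss": 0.1}  # assumed weights

def cost(metrics):
    """Weighted sum; PDR enters negatively because higher delivery is better."""
    return sum(w * metrics[k] for k, w in weights.items())

def run_scenario(hello_interval, ttl):
    """Stand-in for a network simulation returning normalized metrics."""
    return {"pdr": 0.9 - 0.02 * hello_interval, "delay": 0.1 * ttl,
            "nrl": 0.5 / hello_interval, "loss": 0.05 * hello_interval}

grid = product([1, 2, 4], [16, 32, 64])   # hypothetical configuration parameters
best = min(grid, key=lambda cfg: cost(run_scenario(*cfg)))
print("best (hello_interval, ttl):", best)
```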
On the Feasibility of Intense Radial Velocity Surveys for Earth-twin Discoveries
NASA Astrophysics Data System (ADS)
Hall, Richard D.; Thompson, Samantha J.; Handley, Will; Queloz, Didier
2018-06-01
This work assesses the potential capability of the next generation of high-precision Radial Velocity (RV) instruments for Earth-twin exoplanet detection. From the perspective of the importance of data sampling, the Terra Hunting Experiment aims to do this through an intense series of nightly RV observations over a long baseline on a carefully selected target list, via the brand-new instrument HARPS3. This paper describes an end-to-end simulation of generating and processing such data to help us better understand the impact of uncharacterised stellar noise on the recovery of Earth-mass planets with orbital periods of the order of many months. We consider full Keplerian systems, realistic simulated stellar noise, instrument white noise, and location-specific weather patterns for our observation schedules. We use Bayesian statistics to assess various planetary models fitted to the synthetic data, and compare the successful planet recovery of the Terra Hunting Experiment schedule with a typical reference survey. We find that the Terra Hunting Experiment can detect Earth-twins in the habitable zones of solar-type stars, in single and multi-planet systems, and in the presence of stellar signals. We also find that it outperforms a typical reference survey in the accuracy of recovered parameters, and that it performs comparably to an uninterrupted space-based schedule.
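The model-comparison step can be illustrated with a much-simplified stand-in for the paper's Bayesian analysis: generate a sinusoidal (circular-orbit) RV series with white noise, fit planet and no-planet models, and compare them with the Bayesian information criterion rather than a full evidence integral. The period, semi-amplitude, and noise level below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 1000.0, 300))   # observation epochs (days)
P, K = 365.0, 0.1                            # period (d), semi-amplitude (m/s)
sigma = 0.3                                  # instrument white noise (m/s)
rv = K * np.sin(2 * np.pi * t / P) + rng.normal(0.0, sigma, t.size)

def bic(chi2, n_params, n_points):
    return chi2 + n_params * np.log(n_points)

# Model 0: constant velocity. Model 1: sinusoid at a trial period plus offset.
chi2_flat = np.sum((rv - rv.mean()) ** 2) / sigma**2
design = np.column_stack([np.sin(2 * np.pi * t / P),
                          np.cos(2 * np.pi * t / P),
                          np.ones_like(t)])
coef, *_ = np.linalg.lstsq(design, rv, rcond=None)
chi2_planet = np.sum((rv - design @ coef) ** 2) / sigma**2

# Positive Delta BIC favors the planet model.
print("Delta BIC:", bic(chi2_flat, 1, t.size) - bic(chi2_planet, 3, t.size))
```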
NASA Technical Reports Server (NTRS)
1979-01-01
The requirements for a new research aircraft to provide in-flight V/STOL simulation were reviewed. The required capabilities were based on known limitations of ground-based simulation and on past and current experience with V/STOL in-flight simulation. Results indicate that a V/STOL in-flight simulation capability is needed to aid in the design and development of high-performance V/STOL aircraft. Although a new research V/STOL aircraft is preferred, an interim solution can be provided by use of the X-22A, the CH-47B, or the 4AV-8B aircraft modified for control/display flight research.
pysimm: A Python Package for Simulation of Molecular Systems
NASA Astrophysics Data System (ADS)
Fortunato, Michael; Colina, Coray
pysimm, short for python simulation interface for molecular modeling, is a python package designed to facilitate the structure generation and simulation of molecular systems through convenient and programmatic access to object-oriented representations of molecular system data. This poster presents core features of pysimm and design philosophies that highlight a generalized methodology for incorporation of third-party software packages through API interfaces. The integration with the LAMMPS simulation package is explained to demonstrate this methodology. pysimm began as a back-end python library that powered a cloud-based application on nanohub.org for amorphous polymer simulation. The extension from a specific application library to general purpose simulation interface is explained. Additionally, this poster highlights the rapid development of new applications to construct polymer chains capable of controlling chain morphology such as molecular weight distribution and monomer composition.
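The object-oriented representation of molecular system data that the abstract describes can be sketched as follows. The class and method names here are hypothetical illustrations of the design philosophy, NOT the actual pysimm API.

```python
from dataclasses import dataclass, field

@dataclass
class Particle:
    element: str
    x: float
    y: float
    z: float

@dataclass
class System:
    """Hypothetical container for molecular system data (not pysimm's System)."""
    particles: list = field(default_factory=list)
    bonds: list = field(default_factory=list)   # pairs of particle indices

    def add_particle(self, p):
        self.particles.append(p)
        return len(self.particles) - 1

    def bond(self, i, j):
        self.bonds.append((i, j))

# Build a toy diatomic; in a real workflow an object like this would be handed
# to a simulation-engine interface (e.g., a LAMMPS wrapper) for execution.
s = System()
i = s.add_particle(Particle("C", 0.00, 0.0, 0.0))
j = s.add_particle(Particle("C", 1.54, 0.0, 0.0))
s.bond(i, j)
print(len(s.particles), "particles,", len(s.bonds), "bond")
```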
CAPS Simulation Environment Development
NASA Technical Reports Server (NTRS)
Murphy, Douglas G.; Hoffman, James A.
2005-01-01
The final design for an effective Comet/Asteroid Protection System (CAPS) will likely come after a number of competing designs have been simulated and evaluated. Because of the large number of design parameters involved in a system capable of detecting an object, accurately determining its orbit, and diverting the impact threat, a comprehensive simulation environment will be an extremely valuable tool for the CAPS designers. A successful simulation/design tool will aid the user in identifying the critical parameters in the system and eventually allow for automatic optimization of the design once the relationships of the key parameters are understood. A CAPS configuration will consist of space-based detectors whose purpose is to scan the celestial sphere in search of objects likely to make a close approach to Earth and to determine with the greatest possible accuracy the orbits of those objects. Other components of a CAPS configuration may include systems for modifying the orbits of approaching objects, either for the purpose of preventing a collision or for positioning the object into an orbit where it can be studied or used as a mineral resource. The Synergistic Engineering Environment (SEE) is a space-systems design, evaluation, and visualization software tool being leveraged to simulate these aspects of the CAPS study. The long-term goal of the SEE is to provide capabilities to allow the user to build and compare various CAPS designs by running end-to-end simulations that encompass the scanning phase, the orbit determination phase, and the orbit modification phase of a given scenario. Herein, a brief description of the expected simulation phases is provided, the current status and available features of the SEE software system are reported, and examples are shown of how the system is used to build and evaluate a CAPS detection design. Conclusions and the roadmap for future development of the SEE are also presented.
Binary processing and display concepts for low-cost Omega receivers. [airborne systems simulation]
NASA Technical Reports Server (NTRS)
Lilley, R. W.
1974-01-01
A description is given of concepts related to plans for developing a low-cost, all-digital Omega receiver capable of offering to the small-aircraft pilot a reliable and accurate navigation aid. The receiver base considered includes a receiver front-end module, a receiver control module, a memory-aided phase-locked loop module, a housekeeping timer module, and a synthesizer module.
Flexible Environments for Grand-Challenge Simulation in Climate Science
NASA Astrophysics Data System (ADS)
Pierrehumbert, R.; Tobis, M.; Lin, J.; Dieterich, C.; Caballero, R.
2004-12-01
Current climate models are monolithic codes, generally in Fortran, aimed at high-performance simulation of the modern climate. Though they adequately serve their designated purpose, they present major barriers to application in other problems. Tailoring them to paleoclimate or planetary simulations, for instance, takes months of work. Theoretical studies, where one may want to remove selected processes or break feedback loops, are similarly hindered. Further, current climate models are of little value in education, since the implementation of textbook concepts and equations in the code is obscured by technical detail. The Climate Systems Center at the University of Chicago seeks to overcome these limitations by bringing modern object-oriented design into the business of climate modeling. Our ultimate goal is to produce an end-to-end modeling environment capable of configuring anything from a simple single-column radiative-convective model to a full 3-D coupled climate model using a uniform, flexible interface. Technically, the modeling environment is implemented as a Python-based software component toolkit: key number-crunching procedures are implemented as discrete, compiled-language components 'glued' together and co-ordinated by Python, combining the high performance of compiled languages with the flexibility and extensibility of Python. We are incrementally working towards this final objective along a series of distinct, complementary lines. We will present an overview of these activities, including PyOM, a Python-based finite-difference ocean model allowing run-time selection of different Arakawa grids and physical parameterizations; CliMT, an atmospheric modeling toolkit providing a library of 'legacy' radiative, convective and dynamical modules which can be knitted into dynamical models; and PyCCSM, a version of NCAR's Community Climate System Model in which the coupler and run-control architecture are re-implemented in Python, augmenting its flexibility and adaptability.
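The component-coupling pattern described above can be sketched in pure Python: a driver knits independently developed modules (which in practice would wrap compiled Fortran or C kernels) behind a uniform step interface, so individual processes can be swapped out or removed to break feedback loops. All module names and physics here are illustrative toys, not CliMT or PyOM code.

```python
class RadiationModule:
    def step(self, state, dt):
        # Relax temperature toward an assumed emission temperature of 255 K.
        state["T"] += dt * (-0.01 * (state["T"] - 255.0))
        return state

class ConvectionModule:
    def step(self, state, dt):
        # Crude convective adjustment: cap the temperature at 300 K.
        if state["T"] > 300.0:
            state["T"] = 300.0
        return state

def run(modules, state, dt, n_steps):
    for _ in range(n_steps):
        for module in modules:      # remove a module here to "break a feedback"
            state = module.step(state, dt)
    return state

print(run([RadiationModule(), ConvectionModule()], {"T": 320.0}, 1.0, 100))
```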
Evolutionary online behaviour learning and adaptation in real robots
Correia, Luís; Christensen, Anders Lyhne
2017-01-01
Online evolution of behavioural control on real robots is an open-ended approach to autonomous learning and adaptation: robots have the potential to automatically learn new tasks and to adapt to changes in environmental conditions, or to failures in sensors and/or actuators. However, studies have so far almost exclusively been carried out in simulation because evolution in real hardware has required several days or weeks to produce capable robots. In this article, we successfully evolve neural network-based controllers in real robotic hardware to solve two single-robot tasks and one collective robotics task. Controllers are evolved either from random solutions or from solutions pre-evolved in simulation. In all cases, capable solutions are found in a timely manner (1 h or less). Results show that more accurate simulations may lead to higher-performing controllers, and that completing the optimization process in real robots is meaningful, even if solutions found in simulation differ from solutions in reality. We furthermore demonstrate for the first time the adaptive capabilities of online evolution in real robotic hardware, including robots able to overcome faults injected in the motors of multiple units simultaneously, and to modify their behaviour in response to changes in the task requirements. We conclude by assessing the contribution of each algorithmic component on the performance of the underlying evolutionary algorithm. PMID:28791130
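A minimal sketch of the kind of neuroevolution loop involved is below: a (1+1) evolution strategy mutates the weights of a tiny feedforward controller and keeps a child if it scores no worse. The fitness function here (driving actuator outputs toward a setpoint) is an illustrative stand-in for a real robot evaluation, and the network size and mutation rate are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def controller(weights, sensors):
    w = weights.reshape(2, 3)                 # 3 sensor inputs -> 2 actuator outputs
    return np.tanh(w @ sensors)

def fitness(weights):
    sensors = np.array([0.5, -0.2, 0.8])      # toy sensor reading
    target = np.array([0.3, -0.6])            # desired actuator response
    return -np.sum((controller(weights, sensors) - target) ** 2)

genome = rng.normal(0.0, 0.1, 6)
for generation in range(200):
    child = genome + rng.normal(0.0, 0.05, genome.shape)   # Gaussian mutation
    if fitness(child) >= fitness(genome):                  # keep child if no worse
        genome = child

print("best fitness:", fitness(genome))
```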
Prediction of topographic and bathymetric measurement performance of airborne low-SNR lidar systems
NASA Astrophysics Data System (ADS)
Cossio, Tristan
Low signal-to-noise ratio (LSNR) lidar (light detection and ranging) is an alternative paradigm to traditional lidar based on the detection of return signals at the single photoelectron level. The objective of this work was to predict low altitude (600 m) LSNR lidar system performance with regard to elevation measurement and target detection capability in topographic (dry land) and bathymetric (shallow water) scenarios. A modular numerical sensor model has been developed to provide data for further analysis due to the dearth of operational low altitude LSNR lidar systems. This simulator tool is described in detail, with consideration given to atmospheric effects, surface conditions, and the effects of laser phenomenology. Measurement performance analysis of the simulated topographic data showed results comparable to commercially available lidar systems, with a standard deviation of less than 12 cm for calculated elevation values. Bathymetric results, although dependent largely on water turbidity, were indicative of meter-scale horizontal data spacing for sea depths less than 5 m. The high prevalence of noise in LSNR lidar data introduces significant difficulties in data analysis. Novel algorithms to reduce noise are described, with particular focus on their integration into an end-to-end target detection classifier for both dry and submerged targets (cube blocks, 0.5 m to 1.0 m on a side). The key characteristic exploited to discriminate signal and noise is the temporal coherence of signal events versus the random distribution of noise events. Target detection performance over dry earth was observed to be robust, reliably detecting over 90% of targets with a minimal false alarm rate. Comparable results were observed in waters of high clarity, where the investigated system was generally able to detect more than 70% of targets to a depth of 5 m. The results of the study show that CATS, the University of Florida's LSNR lidar prototype, is capable of high fidelity (decimeter-scale) coverage of the topographic zone with limited applicability to shallow waters less than 5 m deep. To increase the spatial-temporal contrast between signal and noise events, laser pulse rate is the optimal system characteristic to improve in future LSNR lidar units.
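The temporal-coherence idea can be illustrated simply: photon events from a real surface cluster in time-of-flight, while noise counts arrive uniformly, so histogram bins whose counts exceed a Poisson-based threshold are kept as signal. The event rates, bin width, and threshold below are illustrative assumptions, not the dissertation's algorithm parameters.

```python
import numpy as np

rng = np.random.default_rng(2)
noise = rng.uniform(0.0, 1000.0, 500)        # uniformly distributed noise events (ns)
signal = rng.normal(412.0, 1.5, 40)          # temporally coherent surface return (ns)
events = np.concatenate([noise, signal])

bin_width = 5.0                              # ns
counts, edges = np.histogram(events, bins=np.arange(0.0, 1000.0 + bin_width, bin_width))

lam = len(noise) * bin_width / 1000.0        # expected noise counts per bin (Poisson mean)
threshold = lam + 4.0 * np.sqrt(lam)         # keep bins ~4 sigma above the noise floor
hits = edges[:-1][counts > threshold]
print("candidate surface bins (ns):", hits)
```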
NASA Technical Reports Server (NTRS)
Wong, Yen F.; Kegege, Obadiah; Schaire, Scott H.; Bussey, George; Altunc, Serhat; Zhang, Yuwen; Patel, Chitra
2016-01-01
National Aeronautics and Space Administration (NASA) CubeSat missions are expected to grow rapidly in the next decade. Higher data rate CubeSats are transitioning away from Amateur Radio bands to higher frequency bands. A high-level communication architecture for future space-to-ground CubeSat communication was proposed within NASA Goddard Space Flight Center. This architecture addresses CubeSat direct-to-ground communication, CubeSat to Tracking Data Relay Satellite System (TDRSS) communication, CubeSat constellation with mothership direct-to-ground communication, and CubeSat constellation with mothership communication through K-Band Single Access (KSA). A study has been performed to explore this communication architecture, through simulations, analyses, and technology identification, to develop the optimum communication concepts for CubeSat communications. This paper presents details of the simulation and analysis that include CubeSat swarm, daughter ship/mother ship constellation, Near Earth Network (NEN) S- and X-band direct-to-ground links, TDRSS Multiple Access (MA) array vs Single Access mode, notional transceiver/antenna configurations, ground asset configurations, and Code Division Multiple Access (CDMA) signal trades for the daughter ship/mother ship CubeSat constellation inter-satellite cross link. Results of a space science X-band 10 MHz maximum achievable data rate study are summarized. CubeSat NEN Ka-Band end-to-end communication analysis is provided. Current CubeSat communication technology capabilities are presented. Compatibility testing of the CubeSat transceiver through the NEN and the Space Network (SN) is discussed. Based on the analyses, signal trade studies, and technology assessments, the desired CubeSat transceiver features and operation concepts for future CubeSat end-to-end communications are derived.
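Behind each of these link trades sits a standard link-budget calculation. The sketch below computes a received Eb/N0 for a notional CubeSat X-band downlink; every parameter value (transmit power, antenna gains, range, data rate, system temperature, losses) is an illustrative placeholder, not a figure from the NASA study.

```python
import math

def eb_n0_db(p_tx_dbw, g_tx_dbi, g_rx_dbi, freq_hz, range_m, rate_bps,
             sys_temp_k, losses_db=3.0):
    c = 299792458.0
    # Free-space path loss for the slant range at the carrier frequency.
    fspl_db = 20 * math.log10(4 * math.pi * range_m * freq_hz / c)
    p_rx_dbw = p_tx_dbw + g_tx_dbi + g_rx_dbi - fspl_db - losses_db
    # Noise spectral density N0 = k * T_sys, in dBW/Hz.
    n0_dbw_hz = 10 * math.log10(1.380649e-23 * sys_temp_k)
    return p_rx_dbw - n0_dbw_hz - 10 * math.log10(rate_bps)

# 1 W transmitter, 6 dBi CubeSat antenna, 47 dBi ground antenna, 8.4 GHz,
# 2000 km slant range, 1 Mbps, 200 K system temperature (all assumed values).
print(f"Eb/N0 = {eb_n0_db(0.0, 6.0, 47.0, 8.4e9, 2000e3, 1e6, 200.0):.1f} dB")
```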
Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing
2011-01-01
Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779
Simulation and Analyses of Multi-Body Separation in Launch Vehicle Staging Environment
NASA Technical Reports Server (NTRS)
Pamadi, Bandu N.; Hotchko, Nathaniel J.; Samareh, Jamshid; Covell, Peter F.; Tartabini, Paul V.
2006-01-01
The development of methodologies, techniques, and tools for analysis and simulation of multi-body separation is critically needed for successful design and operation of next generation launch vehicles. As a part of this activity, the ConSep simulation tool is being developed. ConSep is a generic MATLAB-based front- and back-end to the commercially available ADAMS solver, an industry-standard package for solving multi-body dynamics problems. This paper discusses the 3-body separation capability in ConSep and its application to the separation of the Shuttle Solid Rocket Boosters (SRBs) from the External Tank (ET) and the Orbiter. The results are compared with STS-1 flight data.
Simulation of Ground Winds Time Series
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
2008-01-01
A simulation process has been developed for generation of the longitudinal and lateral components of ground wind atmospheric turbulence as a function of mean wind speed, elevation, temporal frequency range, and distance between locations. The distance between locations influences the spectral coherence between the simulated series at adjacent locations. Short distances reduce correlation only at high frequencies; as distances increase, correlation is reduced over a wider range of frequencies. The choice of values for the constants d1 and d3 in the PSD model is the subject of work in progress. An improved knowledge of the values for z0 as a function of wind direction at the ARES-1 launch pads is necessary for the definition of d1. Results of other studies at other locations may be helpful, as summarized in Fichtl's recent correspondence. Ideally, further research is needed based on measurements of ground wind turbulence with high-resolution anemometers at a number of altitudes at a new KSC tower located closer to the ARES-1 launch pad. The proposed research would be based on turbulence measurements that may be influenced by surface terrain roughness significantly different from the roughness prior to 1970 in Fichtl's measurements. Significant improvements in instrumentation, data storage, and processing will greatly enhance the capability to model ground wind profiles and ground wind turbulence.
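One standard recipe behind such simulations is to synthesize a turbulence time series from a one-sided PSD by assigning random phases to each frequency bin and inverse-transforming. The sketch below uses an illustrative Kaimal-like PSD shape; it is not the report's model, and the constants have nothing to do with its d1 and d3.

```python
import numpy as np

rng = np.random.default_rng(3)
n, dt = 4096, 0.1                              # samples, sample period (s)
f = np.fft.rfftfreq(n, dt)[1:]                 # positive frequencies (Hz)
psd = 4.0 / (1.0 + 6.0 * f) ** (5.0 / 3.0)     # illustrative one-sided PSD ((m/s)^2/Hz)

# Bin amplitudes chosen so the series variance matches the integrated PSD.
amplitude = np.sqrt(psd * n / (2.0 * dt))
phases = rng.uniform(0.0, 2.0 * np.pi, f.size)
spectrum = np.concatenate([[0.0], amplitude * np.exp(1j * phases)])

u = np.fft.irfft(spectrum, n)                  # zero-mean turbulence series (m/s)
print("simulated standard deviation:", u.std())
```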
NASA Technical Reports Server (NTRS)
Phillips, Veronica J.
2017-01-01
The Ames Engineering Directorate is the principal engineering organization supporting aerospace systems and spaceflight projects at NASA's Ames Research Center in California's Silicon Valley. The Directorate supports all phases of engineering and project management for flight and mission projects, from R&D to close-out, by leveraging the capabilities of multiple divisions and facilities. The Mission Design Center (MDC) has full end-to-end mission design capability with sophisticated analysis and simulation tools in a collaborative concurrent design environment. Services include concept maturity level (CML) maturation, spacecraft design and trades, scientific instrument selection, feasibility assessments, and proposal support and partnerships. The Engineering Systems Division provides robust project management support as well as systems engineering, mechanical and electrical analysis and design, technical authority, and project integration support to a variety of programs and projects across NASA centers. The Applied Manufacturing Division turns abstract ideas into tangible hardware for aeronautics, spaceflight, and science applications, specializing in fabrication methods and management of complex fabrication projects. The Engineering Evaluation Lab (EEL) provides full satellite or payload environmental testing services, including vibration, temperature, humidity, immersion, pressure/altitude, vacuum, high-G centrifuge, and shock impact testing, as well as the Flight Processing Center (FPC), which includes cleanrooms, bonded stores, and flight preparation resources. The Multi-Mission Operations Center (MMOC) comprises the facilities, networks, IT equipment, software, and support services needed by flight projects to effectively and efficiently perform all mission functions, including planning, scheduling, command, telemetry processing, and science analysis.
Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method
NASA Technical Reports Server (NTRS)
Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.
2016-01-01
Reliable noise prediction capabilities are essential to enable novel fuel-efficient open rotor designs that can meet community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity such that they are frequently employed for specific real-world applications within NASA. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by highly complex geometries. This is a first-of-its-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing the peculiarities of applying the immersed boundary method to this moving boundary problem, we provide a detailed aeroacoustic analysis of the noise generation mechanisms encountered in the open rotor flow. The simulation data are compared to available experimental data and to other computational results employing more conventional CFD methods. The noise generation mechanisms are analyzed employing spectral analysis, proper orthogonal decomposition, and the causality method.
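The proper orthogonal decomposition step mentioned above can be sketched compactly: snapshots of a fluctuating field are stacked as columns and factored with an SVD, and the leading left-singular vectors are the energetically dominant spatial modes. The synthetic two-mode field below is illustrative, not open-rotor CFD output.

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0.0, 1.0, 200)                 # spatial coordinate
t = np.linspace(0.0, 10.0, 80)                 # snapshot times

# Synthetic data: two space-time modes plus a little noise.
snapshots = (np.outer(np.sin(2 * np.pi * x), np.cos(2 * np.pi * t))
             + 0.3 * np.outer(np.sin(6 * np.pi * x), np.sin(5 * np.pi * t))
             + 0.01 * rng.normal(size=(x.size, t.size)))

mean = snapshots.mean(axis=1, keepdims=True)   # POD acts on the fluctuations
modes, sing_vals, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
energy = sing_vals**2 / np.sum(sing_vals**2)
print("energy captured by first two modes:", energy[:2].sum())
```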
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clouse, C. J.; Edwards, M. J.; McCoy, M. G.
2015-07-07
Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world-leading numerical simulation capability for the National HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition, the ASC effort provides the high-performance computing platform capabilities upon which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help ensure that numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.
Modeling and Simulation of Phased Array Antennas to Support Next-Generation Satellite Design
NASA Technical Reports Server (NTRS)
Tchorowski, Nicole; Murawski, Robert; Manning, Robert; Fuentes, Michael
2016-01-01
Developing enhanced simulation capabilities has become a significant priority for the Space Communications and Navigation (SCaN) project at NASA as new space communications technologies are proposed to replace aging NASA communications assets, such as the Tracking and Data Relay Satellite System (TDRSS). When developing the architecture for these new space communications assets, it is important to develop updated modeling and simulation methodologies, such that competing architectures can be weighed against one another and the optimal path forward can be determined. Many simulation tools have been developed at NASA for the simulation of single RF link budgets, or for the modeling and simulation of an entire network of spacecraft and their supporting SCaN network elements. However, the modeling capabilities are never fully complete and, as new technologies are proposed, gaps are identified. One such gap is the ability to rapidly develop high-fidelity simulation models of electronically steerable phased array systems. As future relay satellite architectures are proposed that include optical communications links, electronically steerable antennas will become more desirable due to the reduction in platform vibration compared to mechanically steerable devices. In this research, we investigate how modeling of these antennas can be introduced into our overall simulation and modeling structure. The ultimate goal of this research is two-fold: first, to enable NASA engineers to model various proposed simulation architectures and determine which proposed architecture meets the given architectural requirements; second, given a set of communications link requirements for a proposed satellite architecture, to determine the optimal configuration for a phased array antenna. A variety of tools is available for modeling phased array antennas. To meet our stated goals, the first objective of this research is to compare the subset of tools available to us, trading off the modeling fidelity of each tool against its simulation performance. When comparing several proposed architectures, higher-fidelity modeling may be desirable; however, when iterating a proposed set of communication link requirements across ranges of phased array configuration parameters, performance becomes a significant practical requirement. In either case, a minimum simulation fidelity must be met, regardless of performance considerations, as discussed in this research. Given a suitable set of phased array modeling tools, this research then focuses on integration with current SCaN modeling and simulation tools. While properly modeling the antenna elements of a system is vital, this is only a small part of the end-to-end communication path between a satellite and the supporting ground station and/or relay satellite assets. To properly model a proposed simulation architecture, this toolset must be integrated with other commercial and government development tools, such that the overall architecture can be examined in terms of communications, reliability, and cost. In this research, integration with previously developed communication tools is investigated.
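One core computation any such tool must get right is the array factor of an electronically steered aperture. The sketch below evaluates the normalized array factor of a uniform linear array steered by a progressive phase shift; element count, spacing, and steering angle are illustrative parameters, not values from the SCaN study.

```python
import numpy as np

def array_factor_db(n_elem, d_over_lambda, steer_deg, theta_deg):
    """Normalized array factor (dB) of a uniform linear array."""
    theta = np.radians(theta_deg)
    steer = np.radians(steer_deg)
    k_d = 2 * np.pi * d_over_lambda
    # Progressive phase shift steers the main beam to the commanded angle.
    psi = k_d * (np.sin(theta) - np.sin(steer))
    af = np.exp(1j * np.outer(np.arange(n_elem), psi)).sum(axis=0)
    af = np.abs(af) / n_elem                   # unity at the beam peak
    return 20 * np.log10(np.maximum(af, 1e-6))

angles = np.linspace(-90.0, 90.0, 721)
pattern = array_factor_db(16, 0.5, 20.0, angles)   # 16 elements, half-wave spacing
print("peak at", angles[np.argmax(pattern)], "deg")  # near the 20 deg steer command
```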
Akuna: An Open Source User Environment for Managing Subsurface Simulation Workflows
NASA Astrophysics Data System (ADS)
Freedman, V. L.; Agarwal, D.; Bensema, K.; Finsterle, S.; Gable, C. W.; Keating, E. H.; Krishnan, H.; Lansing, C.; Moeglein, W.; Pau, G. S. H.; Porter, E.; Scheibe, T. D.
2014-12-01
The U.S. Department of Energy (DOE) is investing in development of a numerical modeling toolset called ASCEM (Advanced Simulation Capability for Environmental Management) to support modeling analyses at legacy waste sites. ASCEM is an open source and modular computing framework that incorporates new advances and tools for predicting contaminant fate and transport in natural and engineered systems. The ASCEM toolset includes both a Platform with Integrated Toolsets (called Akuna) and a High-Performance Computing multi-process simulator (called Amanzi). The focus of this presentation is on Akuna, an open-source user environment that manages subsurface simulation workflows and associated data and metadata. In this presentation, key elements of Akuna are demonstrated, which includes toolsets for model setup, database management, sensitivity analysis, parameter estimation, uncertainty quantification, and visualization of both model setup and simulation results. A key component of the workflow is in the automated job launching and monitoring capabilities, which allow a user to submit and monitor simulation runs on high-performance, parallel computers. Visualization of large outputs can also be performed without moving data back to local resources. These capabilities make high-performance computing accessible to the users who might not be familiar with batch queue systems and usage protocols on different supercomputers and clusters.
Sustainable Human Presence on the Moon using In Situ Resources
NASA Technical Reports Server (NTRS)
McLemore, Carol A.; Fikes, John C.; McCarley, Kevin S.; Darby, Charles A.; Curreri, Peter A.; Kennedy, James P.; Good, James E.; Gilley, Scott D.
2008-01-01
New capabilities, technologies and infrastructure must be developed to enable a sustained human presence on the moon and beyond. The key to having this permanent presence is the utilization of in situ resources. To this end, NASA is investigating how in situ resources can be utilized to improve mission success by reducing up-mass, improving safety, reducing risk, and bringing down costs for the overall mission. To ensure that this capability is available when needed, technology development is required now. NASA/Marshall Space Flight Center (MSFC) is supporting this endeavor, along with other NASA centers, by exploring how lunar regolith can be mined for uses such as construction, life support, propulsion, power, and fabrication. Efforts at MSFC include development of lunar regolith simulant for hardware testing and development, extraction of oxygen and other materials from the lunar regolith, production of parts and tools on the moon from local materials or from provisioned feedstocks, and capabilities to show that produced parts are "ready for use". This paper discusses the lunar regolith, how the regolith is being replicated in the development of simulants and possible uses of the regolith.
Environmental impact analysis with the airspace concept evaluation system
NASA Technical Reports Server (NTRS)
Augustine, Stephen; Capozzi, Brian; DiFelici, John; Graham, Michael; Thompson, Terry; Miraflor, Raymond M. C.
2005-01-01
The National Aeronautics and Space Administration (NASA) Ames Research Center has developed the Airspace Concept Evaluation System (ACES), which is a fast-time simulation tool for evaluating Air Traffic Management (ATM) systems. This paper describes linking a capability to ACES which can analyze the environmental impact of proposed future ATM systems. This provides the ability to quickly evaluate metrics associated with environmental impacts of aviation for inclusion in multi-dimensional cost-benefit analysis of concepts for evolution of the National Airspace System (NAS) over the next several decades. The methodology used here may be summarized as follows: 1) Standard Federal Aviation Administration (FAA) noise and emissions-inventory models, the Noise Impact Routing System (NIRS) and the Emissions and Dispersion Modeling System (EDMS), respectively, are linked to ACES simulation outputs; 2) appropriate modifications are made to ACES outputs to incorporate all information needed by the environmental models (e.g., specific airframe and engine data); 3) noise and emissions calculations are performed for all traffic and airports in the study area for each of several scenarios, as simulated by ACES; and 4) impacts of future scenarios are compared to the current NAS baseline scenario. This paper also provides the results of initial end-to-end, proof-of-concept runs of the integrated ACES and environmental-modeling capability. These preliminary results demonstrate the integrated capability and help indicate whether projected NAS growth is likely to be impeded by significant environmental impacts that could negatively affect communities throughout the nation.
Global Broadcast Service (GBS)
2013-12-01
as to be unusable by smaller and more mobile units. To this end, GBS currently uses broadcast payloads on two Ultra-High Frequency Follow-On (UFO) satellites ... operational on UFO satellites 8, 9, 10.
- Full Satellite Broadcast Manager capability.
- Field 20% of JPO Receive Suites (19 units).
- Personnel training ... capabilities.
- Augment UFO GBS with leased commercial satellite services to cover gaps over CONUS.
- Demonstrate smart push and user pull capability
NASA Technical Reports Server (NTRS)
Holbrook, Mark; Pitts, Robert Lee; Gifford, Kevin K.; Jenkins, Andrew; Kuzminsky, Sebastian
2010-01-01
The International Space Station (ISS) is in an operational configuration and nearing final assembly. With its maturity and diverse payloads onboard, the opportunity exists to extend the orbital lab into a facility to exercise and demonstrate Delay/Disruption Tolerant Networking (DTN). DTN is an end-to-end network service providing communications through environments characterized by intermittent connectivity, variable delays, high bit error rates, asymmetric links and simplex links. The DTN protocols, also known as bundle protocols, provide a store-and-forward capability to accommodate end-to-end network services. Key capabilities of the bundling protocols include the ability to cope with intermittent connectivity, the ability to take advantage of scheduled and opportunistic connectivity (in addition to always-up connectivity), custody transfer, and end-to-end security. The University of Colorado at Boulder and the Huntsville Operational Support Center (HOSC) have been developing a DTN capability utilizing the Commercial Generic Bioprocessing Apparatus (CGBA) payload resources onboard the ISS, at the Boulder Payload Operations Center (POC) and at the HOSC. The DTN capability is in parallel with, and is designed to augment, current capabilities. The architecture consists of DTN endpoint nodes on the ISS and at the Boulder POC, and a DTN node at the HOSC. The DTN network is composed of two implementations: the Interplanetary Overlay Network (ION) and the open source DTN2 implementation. This paper presents the architecture, implementation, and lessons learned. By being able to handle the types of environments described above, the DTN technology will be instrumental in extending networks into deep space to support future missions to other planets and other solar system points of interest. Thus, this paper also discusses how this technology will be applicable to these types of deep space exploration missions.
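The store-and-forward idea behind the bundle protocols can be illustrated with a toy sketch: bundles are held in custody until a contact is available rather than dropped when the link is down. This illustrates the concept only; it is in no way the ION or DTN2 implementation, and the class and node names are hypothetical.

```python
from collections import deque

class DtnNode:
    def __init__(self, name):
        self.name = name
        self.custody = deque()          # bundles held until a contact opens

    def receive(self, bundle):
        self.custody.append(bundle)     # take custody; survives link outages

    def forward(self, link_up, next_hop):
        # Transfer custody downstream only while a contact is available.
        while link_up and self.custody:
            next_hop.receive(self.custody.popleft())

iss, ground = DtnNode("ISS"), DtnNode("HOSC")
iss.receive({"id": 1, "payload": "telemetry"})
iss.forward(link_up=False, next_hop=ground)   # outage: bundle stays in custody
iss.forward(link_up=True, next_hop=ground)    # contact: bundle delivered
print(len(ground.custody), "bundle(s) delivered")
```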
High Fidelity Ion Beam Simulation of High Dose Neutron Irradiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Was, Gary; Wirth, Brian; Motta, Arthur
The objective of this proposal is to demonstrate the capability to predict the evolution of microstructure and properties of structural materials in-reactor and at high doses, using ion irradiation as a surrogate for reactor irradiations. "Properties" includes both physical properties (irradiated microstructure) and the mechanical properties of the material. Demonstration of the capability to predict properties has two components. One is ion irradiation of a set of alloys to yield an irradiated microstructure and corresponding mechanical behavior that are substantially the same as results from neutron exposure in the appropriate reactor environment. Second is the capability to predict the irradiated microstructure and corresponding mechanical behavior on the basis of improved models, validated against both ion and reactor irradiations and verified against ion irradiations. Taken together, achievement of these objectives will yield an enhanced capability for simulating the behavior of materials in reactor irradiations.
Data preservation at the Fermilab Tevatron
NASA Astrophysics Data System (ADS)
Boyd, J.; Herner, K.; Jayatilaka, B.; Roser, R.; Sakumoto, W.
2015-12-01
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and DO experiments each have nearly 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 or beyond. To achieve this, we are implementing a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology as well as leveraging resources available from currently-running experiments at Fermilab. These efforts will provide useful lessons in ensuring long-term data access for numerous experiments throughout high-energy physics, and provide a roadmap for high-quality scientific output for years to come.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turinsky, Paul J., E-mail: turinsky@ncsu.edu; Kothe, Douglas B., E-mail: kothe@ornl.gov
The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, all while assuring nuclear safety. Accomplishing this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long-standing issues of the nuclear power industry that M&S can assist in addressing. To date, CASL has developed a multi-physics "core simulator" based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement in closure models for subcooled boiling and bubbly flow, and the formulation of robust numerical solution algorithms. For multiphysics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. Industry adoption of CASL's evolving M&S capabilities, which is in progress, will assist in addressing long-standing and future operational and safety challenges of the nuclear industry. Highlights: • The complexity of physics-based modeling of light water reactor cores is being addressed. • Capabilities have been developed to help address problems that have long challenged the nuclear power industry. • Simulation capabilities that take advantage of high performance computing have been developed.
NASA Astrophysics Data System (ADS)
Cota, Stephen A.; Lomheim, Terrence S.; Florio, Christopher J.; Harbold, Jeffrey M.; Muto, B. Michael; Schoolar, Richard B.; Wintz, Daniel T.; Keller, Robert A.
2011-10-01
In a previous paper in this series, we described how The Aerospace Corporation's Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) tool may be used to model space and airborne imaging systems operating in the visible to near-infrared (VISNIR). PICASSO is a systems-level tool, representative of a class of such tools used throughout the remote sensing community. It is capable of modeling systems over a wide range of fidelity, anywhere from the conceptual design level (where it can serve as an integral part of the systems engineering process) to as-built hardware (where it can serve as part of the verification process). In the present paper, we extend the discussion of PICASSO to the modeling of Thermal Infrared (TIR) remote sensing systems, presenting the equations and methods necessary for modeling in that regime.
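The basic radiometry underlying any TIR image-chain model starts from the Planck function for scene spectral radiance. The sketch below evaluates it at a typical longwave-IR band; the scene temperature and wavelength are illustrative choices, not parameters from the PICASSO paper.

```python
import math

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance, W / (m^2 sr m)."""
    h, c, kb = 6.62607015e-34, 299792458.0, 1.380649e-23
    return (2.0 * h * c**2 / wavelength_m**5
            / (math.exp(h * c / (wavelength_m * kb * temp_k)) - 1.0))

# 10 um band, 300 K scene; convert to the more common W / (m^2 sr um).
L = planck_radiance(10e-6, 300.0) * 1e-6
print(f"L(10 um, 300 K) = {L:.2f} W m^-2 sr^-1 um^-1")   # roughly 9.9
```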
Surgical scissors extension adds the 7th axis of force feedback to the Freedom 6S.
Powers, Marilyn J; Sinclair, Ian P W; Brouwer, Iman; Laroche, Denis
2007-01-01
A virtual reality surgical simulator ideally allows seamless transition between the real and virtual worlds. To that end, all of a surgeon's motions and tools must be simulated. Until now, researchers have been limited to using a pen-like tool with six degrees of freedom. This paper presents the addition of haptically enabled scissors to the end effector of a 6-DOF haptic device, the Freedom 6S. The scissors can apply a maximum pinch torque of 460 mN·m with low inertia and low back-drive friction. The device is balanced so that the user feels as though they are holding ordinary scissors, albeit with some added inertia at the load end. The system is interchangeable between the 6-DOF and 7-DOF configurations to allow tools to be switched quickly.
NASA Technical Reports Server (NTRS)
Simpson, W. R.
1994-01-01
An advanced sensor test capability is now operational at the Air Force Arnold Engineering Development Center (AEDC) for calibration and performance characterization of infrared sensors. This facility, known as the 7V, is part of a broad range of test capabilities under development at AEDC to provide complete ground test support to the sensor community for large-aperture surveillance sensors and kinetic kill interceptors. The 7V is a state-of-the-art cryo/vacuum facility providing calibration and mission simulation against space backgrounds. Key features of the facility include high-fidelity scene simulation with precision track accuracy and in-situ target monitoring, diffraction limited optical system, NIST traceable broadband and spectral radiometric calibration, outstanding jitter control, environmental systems for 20 K, high-vacuum, low-background simulation, and an advanced data acquisition system.
Kepler Mission: End-to-End System Demonstration
NASA Technical Reports Server (NTRS)
Borucki, William; Koch, D.; Dunham, E.; Jenkins, J.; Witteborn, F.; Updike, T.; DeVincenzi, Donald L. (Technical Monitor)
2000-01-01
A test facility has been constructed to demonstrate the capability of differential ensemble photometry to detect transits of Earth-size planets orbiting solar-like stars. The main objective is to determine the effects of various noise sources on the capability of a CCD photometer to maintain a system relative precision of $1 \times 10^{-5}$ for $m_v = 12$ stars in the presence of system-induced noise sources. The facility includes a simulated star field, fast optics to simulate the telescope, a thinned back-illuminated CCD similar to those to be used on the spacecraft, and computers to perform the onboard control, data processing and extraction. The test structure is thermally and mechanically isolated so that each source of noise can be introduced in a controlled fashion and evaluated for its contribution to the total noise budget. The effects of pointing errors or a changing thermal environment are imposed by piezo-electric devices. Transits are injected by heating small wires crossing apertures in the star plate. Signals as small as those from terrestrial-size transits of solar-like stars are introduced to demonstrate that such planets can be detected under realistic noise conditions. Examples of imposing several noise sources and the resulting detectabilities are presented. These show that a CCD photometer using a differential ensemble photometric approach can readily detect signals associated with Earth-size transits.
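The principle of differential ensemble photometry is easy to sketch: dividing the target's flux by the mean flux of many comparison stars cancels common-mode systematics, leaving the transit dip. The toy numbers below (transit depth, noise level, drift amplitude) are illustrative, not the test facility's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(5)
n_frames, n_comps = 500, 20
# Common-mode systematic drift shared by every star on the detector.
systematic = 1.0 + 0.01 * np.sin(np.linspace(0, 6 * np.pi, n_frames))

transit = np.ones(n_frames)
transit[240:260] -= 8.4e-5                 # Earth-size depth on a Sun-like star
target = systematic * transit * (1 + rng.normal(0, 2e-5, n_frames))
ensemble = systematic[None, :] * (1 + rng.normal(0, 2e-5, (n_comps, n_frames)))

relative = target / ensemble.mean(axis=0)  # common-mode drift divides out
depth = 1.0 - relative[240:260].mean() / relative[:200].mean()
print(f"recovered depth ~ {depth:.1e}")
```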
Hot zero power reactor calculations using the Insilico code
Hamilton, Steven P.; Evans, Thomas M.; Davidson, Gregory G.; ...
2016-03-18
In this paper we describe the reactor physics simulation capabilities of the Insilico code. A description of the various capabilities of the code is provided, including a detailed discussion of the geometry, meshing, cross section processing, and neutron transport options. Numerical results demonstrate that the Insilico SPN solver with pin-homogenized cross section generation is capable of delivering highly accurate full-core simulations of various PWR problems. Comparison to both Monte Carlo calculations and measured plant data is provided.
NASA Astrophysics Data System (ADS)
Rose, D. V.; Welch, D. R.; Clark, R. E.; Thoma, C.; Zimmerman, W. R.; Bruner, N.; Rambo, P. K.; Atherton, B. W.
2011-09-01
Streamer and leader formation in high-pressure devices is a dynamic process involving a broad range of physical phenomena. These include elastic and inelastic particle collisions in the gas, radiation generation, transport and absorption, and electrode interactions. Accurate modeling of these physical processes is essential for a number of applications, including high-current, laser-triggered gas switches. Towards this end, we present a new 3D implicit particle-in-cell simulation model of gas breakdown leading to streamer formation in electronegative gases. The model uses a Monte Carlo treatment for all particle interactions and includes discrete photon generation, transport, and absorption for ultra-violet and soft x-ray radiation. Central to the realization of this fully kinetic particle treatment is an algorithm that manages the total particle count by species while preserving the local momentum distribution functions and conserving charge [D. R. Welch, T. C. Genoni, R. E. Clark, and D. V. Rose, J. Comput. Phys. 227, 143 (2007)]. The simulation model is fully electromagnetic, making it capable of following, for example, the evolution of a gas switch from the point of laser-induced localized breakdown of the gas between electrodes through the successive stages of streamer propagation, initial electrode current connection, and high-current conduction channel evolution, where self-magnetic field effects are likely to be important. We describe the model details and underlying assumptions used and present sample results from 3D simulations of streamer formation and propagation in SF6.
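To make the particle-count-management idea concrete, the hedged sketch below merges macro-particle pairs so that total weight (charge) and momentum are conserved exactly, with energy conserved only approximately. It illustrates the general concept of down-sampling a particle population, not the specific algorithm of Welch et al. cited above.

```python
import numpy as np

def merge_pairs(weights, velocities):
    """Merge consecutive particle pairs; conserves total weight and momentum."""
    w1, w2 = weights[0::2], weights[1::2]
    v1, v2 = velocities[0::2], velocities[1::2]
    w = w1 + w2
    # Weight-averaged velocity preserves momentum exactly (energy only roughly).
    v = (w1[:, None] * v1 + w2[:, None] * v2) / w[:, None]
    return w, v

rng = np.random.default_rng(6)
weights = rng.uniform(0.5, 1.5, 1000)              # macro-particle weights
velocities = rng.normal(0.0, 1.0, (1000, 3))       # 3D velocities
w_new, v_new = merge_pairs(weights, velocities)

# Verify total momentum is unchanged after halving the particle count.
print(np.allclose((weights[:, None] * velocities).sum(0),
                  (w_new[:, None] * v_new).sum(0)))
```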
A Software Framework for Aircraft Simulation
NASA Technical Reports Server (NTRS)
Curlett, Brian P.
2008-01-01
The National Aeronautics and Space Administration Dryden Flight Research Center has a long history in developing simulations of experimental fixed-wing aircraft from gliders to suborbital vehicles on platforms ranging from desktop simulators to pilot-in-the-loop/aircraft-in-the-loop simulators. Regardless of the aircraft or simulator hardware, much of the software framework is common to all NASA Dryden simulators. Some of this software has withstood the test of time, but in recent years the push toward high-fidelity user-friendly simulations has resulted in some significant changes. This report presents an overview of the current NASA Dryden simulation software framework and capabilities with an emphasis on the new features that have permitted NASA to develop more capable simulations while maintaining the same staffing levels.
Providing a parallel and distributed capability for JMASS using SPEEDES
NASA Astrophysics Data System (ADS)
Valinski, Maria; Driscoll, Jonathan; McGraw, Robert M.; Meyer, Bob
2002-07-01
The Joint Modeling And Simulation System (JMASS) is a Tri-Service simulation environment that supports engineering and engagement-level simulations. As JMASS is expanded to support other Tri-Service domains, the current set of modeling services must be expanded for High Performance Computing (HPC) applications by adding support for advanced time-management algorithms, parallel and distributed topologies, and high-speed communications. By providing support for these services, JMASS can better address modeling domains requiring parallel, computationally intensive calculations, such as clutter, vulnerability, and lethality calculations, and underwater-based scenarios. A risk reduction effort implementing some HPC services for JMASS using the SPEEDES (Synchronous Parallel Environment for Emulation and Discrete Event Simulation) Simulation Framework has recently concluded. As an artifact of the JMASS-SPEEDES integration, not only can HPC functionality be brought to the JMASS program through SPEEDES, but an additional HLA-based capability can be demonstrated that further addresses interoperability issues. The JMASS-SPEEDES integration provided a means of adding HLA capability to preexisting JMASS scenarios through an implementation of the standard JMASS port communication mechanism that allows players to communicate.
NASA Technical Reports Server (NTRS)
Slafer, Loren I.
1989-01-01
Real-time simulation and hardware-in-the-loop testing are being used extensively in all phases of the design, development, and testing of the attitude control system (ACS) for the new Hughes HS601 satellite bus. Real-time, hardware-in-the-loop simulation, integrated with traditional analysis and pure simulation activities, is shown to provide a highly efficient and productive overall development program. Implementation of high-fidelity simulations of the satellite dynamics and control system algorithms, capable of real-time execution (using Applied Dynamics International's System 100), provides a tool that can be integrated with the critical flight microprocessor to create a mixed simulation test (MST). The MST creates a highly accurate, detailed simulated on-orbit test environment, capable of open- and closed-loop ACS testing, in which the ACS design can be validated. The MST is shown to provide a valuable extension of traditional test methods. A description of the MST configuration is presented, including the spacecraft dynamics simulation model, sensor and actuator emulators, and the test support system. Overall system performance parameters are presented. MST applications are discussed: supporting ACS design, developing on-orbit system performance predictions, flight software development and qualification testing (augmenting traditional software-based testing), mission planning, and cost-effective subsystem-level acceptance testing. The MST is shown to provide an ideal tool with which the ACS designer can fly the spacecraft on the ground.
High End Computer Network Testbedding at NASA Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Gary, James Patrick
1998-01-01
The Earth & Space Data Computing (ESDC) Division at the Goddard Space Flight Center is involved in developing and demonstrating various high-end computer networking capabilities. The ESDC operates several high-end supercomputers, which are used to (1) run computer simulations of the climate system; (2) support the Earth and Space Sciences (ESS) project; and (3) support Grand Challenge (GC) science aimed at understanding turbulent convection and dynamos in stars. GC research occurs at many sites throughout the country and is enabled, in part, by multiple high-performance network interconnections. The application drivers for high-end computer networking use distributed supercomputing to support virtual reality applications such as TerraVision (a three-dimensional browser of remotely accessed data) and Cave Automatic Virtual Environments (CAVEs). Workstations can access and display data from multiple CAVEs with video servers, which allows for group/project collaborations using a combination of video, data, voice, and shared whiteboarding. The ESDC is also developing and demonstrating a high degree of interoperability between satellite and terrestrial networks; to this end, it is conducting research and evaluations of new computer networking protocols and related technologies that improve this interoperability. The ESDC is also involved in the Security Proof of Concept Keystone (SPOCK) program sponsored by the National Security Agency (NSA). The SPOCK activity provides a forum for government users and security technology providers to share information on security requirements, emerging technologies, and new product developments. In addition, the ESDC is involved in the Trans-Pacific Digital Library Experiment, which aims to demonstrate and evaluate the use of high-performance satellite communications and advanced data communications protocols to enable interactive digital library data access among the U.S. Library of Congress, the National Library of Japan, and other digital library sites at 155 megabits per second. The ESDC's participation in this program is the Trans-Pacific access to GLOBE visualizations in real time. The ESDC is also participating in the Department of Defense's ATDNet with the Multiwavelength Optical Network (MONET), a fully switched wavelength-division networking testbed. This presentation is in viewgraph format.
NASA Astrophysics Data System (ADS)
Hantry, Francois; Papazoglou, Mike; van den Heuvel, Willem-Jan; Haque, Rafique; Whelan, Eoin; Carroll, Noel; Karastoyanova, Dimka; Leymann, Frank; Nikolaou, Christos; Lammersdorf, Winfried; Hacid, Mohand-Said
Business process management is one of the core drivers of business innovation; it rests on strategic technology capable of creating and successfully executing end-to-end business processes. The trend will be to move from relatively stable, organization-specific applications to more dynamic, high-value ones, where business process interactions and trends are examined closely to understand an application's requirements more accurately. Such collaborative, complex end-to-end service interactions give rise to the concept of Service Networks (SNs).
Idea Paper: The Lifecycle of Software for Scientific Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dubey, Anshu; McInnes, Lois C.
The software lifecycle is a well-researched topic that has produced many models to meet the needs of different types of software projects. However, one class of projects, software development for scientific computing, has received relatively little attention from lifecycle researchers. In particular, software for end-to-end computations for obtaining scientific results has received few lifecycle proposals and no formalization of a development model. An examination of development approaches employed by the teams implementing large multicomponent codes reveals a great deal of similarity in their strategies. This idea paper formalizes these related approaches into a lifecycle model for end-to-end scientific application software, featuring loose coupling between submodels for development of infrastructure and scientific capability. We also invite input from stakeholders to converge on a model that captures the complexity of this development process and provides needed lifecycle guidance to the scientific software community.
NASA Advanced Supercomputing Facility Expansion
NASA Technical Reports Server (NTRS)
Thigpen, William W.
2017-01-01
The NASA Advanced Supercomputing (NAS) Division enables advances in high-end computing technologies and in modeling and simulation methods to tackle some of the toughest science and engineering challenges facing NASA today. The name "NAS" has long been associated with leadership and innovation throughout the high-end computing (HEC) community. We play a significant role in shaping HEC standards and paradigms, and provide leadership in the areas of large-scale InfiniBand fabrics, Lustre open-source filesystems, and hyperwall technologies. We provide an integrated high-end computing environment to accelerate NASA missions and make revolutionary advances in science. Pleiades, a petaflop-scale supercomputer, is used by scientists throughout the U.S. to support NASA missions, and is ranked among the most powerful systems in the world. One of our key focus areas is in modeling and simulation to support NASA's real-world engineering applications and make fundamental advances in modeling and simulation methods.
NASA Technical Reports Server (NTRS)
Barre, Jerome; Edwards, David; Worden, Helen; Da Silva, Arlindo; Lahoz, William
2015-01-01
By the end of the current decade, there are plans to deploy several geostationary Earth orbit (GEO) satellite missions for atmospheric composition over North America, East Asia and Europe with additional missions proposed. Together, these present the possibility of a constellation of geostationary platforms to achieve continuous time-resolved high-density observations over continental domains for mapping pollutant sources and variability at diurnal and local scales. In this paper, we use a novel approach to sample a very high global resolution model (GEOS-5 at 7 km horizontal resolution) to produce a dataset of synthetic carbon monoxide pollution observations representative of those potentially obtainable from a GEO satellite constellation with predicted measurement sensitivities based on current remote sensing capabilities. Part 1 of this study focuses on the production of simulated synthetic measurements for air quality OSSEs (Observing System Simulation Experiments). We simulate carbon monoxide nadir retrievals using a technique that provides realistic measurements with very low computational cost. We discuss the sampling methodology: the projection of footprints and areas of regard for geostationary geometries over each of the North America, East Asia and Europe regions; the regression method to simulate measurement sensitivity; and the measurement error simulation. A detailed analysis of the simulated observation sensitivity is performed, and limitations of the method are discussed. We also describe impacts from clouds, showing that the efficiency of an instrument making atmospheric composition measurements on a geostationary platform is dependent on the dominant weather regime over a given region and the pixel size resolution. These results demonstrate the viability of the "instrument simulator" step for an OSSE to assess the performance of a constellation of geostationary satellites for air quality measurements.
Overview of artificial neural networks.
Zou, Jinming; Han, Yi; So, Sung-Sau
2008-01-01
The artificial neural network (ANN), or simply neural network, is a machine learning method evolved from the idea of simulating the human brain. The data explosion in modern drug discovery research requires sophisticated analysis methods to uncover the hidden causal relationships between single or multiple responses and a large set of properties. The ANN is one of many versatile tools to meet the demand in drug discovery modeling. Compared to a traditional regression approach, the ANN is capable of modeling complex nonlinear relationships. The ANN also has excellent fault tolerance and is fast and highly scalable with parallel processing. This chapter introduces the background of ANN development and outlines the basic concepts crucially important for understanding more sophisticated ANNs. Several commonly used learning methods and network setups are discussed briefly at the end of the chapter.
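To make the nonlinear-modeling point above concrete, here is a minimal sketch that trains a one-hidden-layer feedforward network with plain gradient descent on a toy regression; the layer sizes, learning rate, and synthetic "descriptor" data are illustrative assumptions, not material from the chapter.

```python
# Minimal one-hidden-layer neural network for nonlinear regression (illustrative
# sketch only; the toy data and layer sizes are hypothetical).
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(200, 3))            # 200 samples, 3 "descriptors"
y = np.sin(X[:, 0]) + X[:, 1] * X[:, 2]          # nonlinear target response

W1 = rng.normal(0, 0.5, (3, 16)); b1 = np.zeros(16)   # input -> hidden
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)    # hidden -> output
lr = 0.05

for epoch in range(2000):
    h = np.tanh(X @ W1 + b1)                     # hidden activations
    pred = (h @ W2 + b2).ravel()                 # linear output unit
    err = pred - y                               # residuals
    # Backpropagation of the mean-squared-error gradient
    g_out = (2.0 / len(y)) * err[:, None]        # dLoss/dpred
    gW2 = h.T @ g_out; gb2 = g_out.sum(0)
    g_h = (g_out @ W2.T) * (1 - h**2)            # through the tanh nonlinearity
    gW1 = X.T @ g_h; gb1 = g_h.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print("final MSE:", np.mean(err**2))
```

A plain linear regression on the same data would miss the multiplicative X[:, 1] * X[:, 2] term entirely; the hidden tanh layer is what captures it.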
NASA Technical Reports Server (NTRS)
Smith, David A.; Hojnicki, Jeffrey S.; Sjauw, Waldy K.
2014-01-01
Recent NASA interest in utilizing solar electric propulsion (SEP) technology to transfer payloads, e.g., from low-Earth orbit (LEO) to higher energy geostationary Earth orbit (GEO) or to Earth escape, has necessitated the development of high fidelity SEP vehicle models and simulations. These models and simulations need to be capable of capturing vehicle dynamics and sub-system interactions experienced during the transfer trajectories, which are typically accomplished with continuous-burn (potentially interrupted by solar eclipse), long duration "spiral out" maneuvers taking several months or more to complete. This paper presents details of an integrated simulation approach achieved by combining a high fidelity vehicle simulation code with a detailed solar array model. The combined simulation tool gives researchers the functionality to study the integrated effects of various vehicle sub-systems (e.g., vehicle guidance, navigation and control (GN&C), electric propulsion (EP) system) with time varying power production. Results from a simulation model of a vehicle with a 50 kW class SEP system using the integrated tool are presented and compared to the results from another simulation model employing a 50 kW end-of-life (EOL) fixed power level assumption. These models simulate a vehicle under three-degree-of-freedom dynamics (i.e., translational dynamics only) and include the effects of a targeting guidance algorithm (providing a "near optimal" transfer) during a LEO to near Earth escape (C3 = -2.0 km²/s²) spiral trajectory. The presented results include the impact of the fully integrated, time-varying solar array model (e.g., cumulative array degradation from traversing the Van Allen belts, impact of solar eclipses on the vehicle and the related temperature responses in the solar arrays due to operating in the Earth's thermal environment, high fidelity array power module, etc.); these are used to assess the impact on vehicle performance (i.e., propellant consumption) and transit times.
High-fidelity simulation capability for virtual testing of seismic and acoustic sensors
NASA Astrophysics Data System (ADS)
Wilson, D. Keith; Moran, Mark L.; Ketcham, Stephen A.; Lacombe, James; Anderson, Thomas S.; Symons, Neill P.; Aldridge, David F.; Marlin, David H.; Collier, Sandra L.; Ostashev, Vladimir E.
2005-05-01
This paper describes development and application of a high-fidelity, seismic/acoustic simulation capability for battlefield sensors. The purpose is to provide simulated sensor data so realistic that they cannot be distinguished by experts from actual field data. This emerging capability provides rapid, low-cost trade studies of unattended ground sensor network configurations, data processing and fusion strategies, and signatures emitted by prototype vehicles. There are three essential components to the modeling: (1) detailed mechanical signature models for vehicles and walkers, (2) high-resolution characterization of the subsurface and atmospheric environments, and (3) state-of-the-art seismic/acoustic models for propagating moving-vehicle signatures through realistic, complex environments. With regard to the first of these components, dynamic models of wheeled and tracked vehicles have been developed to generate ground force inputs to seismic propagation models. Vehicle models range from simple, 2D representations to highly detailed, 3D representations of entire linked-track suspension systems. Similarly detailed models of acoustic emissions from vehicle engines are under development. The propagation calculations for both the seismics and acoustics are based on finite-difference, time-domain (FDTD) methodologies capable of handling complex environmental features such as heterogeneous geologies, urban structures, surface vegetation, and dynamic atmospheric turbulence. Any number of dynamic sources and virtual sensors may be incorporated into the FDTD model. The computational demands of 3D FDTD simulation over tactical distances require massively parallel computers. Several example calculations of seismic/acoustic wave propagation through complex atmospheric and terrain environments are shown.
NASA Astrophysics Data System (ADS)
Lin, S. J.
2015-12-01
The NOAA Geophysical Fluid Dynamics Laboratory has been developing a unified regional-global modeling system with variable-resolution capabilities that can be used for severe weather predictions (e.g., tornado outbreak events and category-5 hurricanes) and ultra-high-resolution (1-km) regional climate simulations within a consistent global modeling framework. The foundation of this flexible regional-global modeling system is the non-hydrostatic extension of the vertically Lagrangian dynamical core (Lin 2004, Monthly Weather Review), known in the community as FV3 (finite-volume on the cubed-sphere). Because of its flexibility and computational efficiency, the FV3 is one of the final candidates for NOAA's Next Generation Global Prediction System (NGGPS). We have built into the modeling system a stretched (single) grid capability, a two-way (regional-global) multiple nested grid capability, and the combination of the stretched and two-way nests, so as to make convection-resolving regional climate simulation within a consistent global modeling system feasible using today's high-performance computing systems. One of our main scientific goals is to enable simulations of high-impact weather phenomena (such as tornadoes, thunderstorms, and category-5 hurricanes) within an IPCC-class climate modeling system, previously regarded as impossible. In this presentation I will demonstrate that it is computationally feasible to simulate not only super-cell thunderstorms, but also the subsequent genesis of tornadoes, using a global model that was originally designed for century-long climate simulations. As a unified weather-climate modeling system, we evaluated the performance of the model with horizontal resolution ranging from 1 km to as low as 200 km. In particular, for downscaling studies, we have developed various tests to ensure that the large-scale circulation within the global variable-resolution system is well simulated while small-scale features are accurately captured within the targeted high-resolution region.
Cybersecurity in Hospitals: A Systematic, Organizational Perspective.
Jalali, Mohammad S; Kaiser, Jessica P
2018-05-28
Cybersecurity incidents are a growing threat to the health care industry in general and hospitals in particular. The health care industry has lagged behind other industries in protecting its main stakeholder (ie, patients), and now hospitals must invest considerable capital and effort in protecting their systems. However, this is easier said than done because hospitals are extraordinarily technology-saturated, complex organizations with high end point complexity, internal politics, and regulatory pressures. The purpose of this study was to develop a systematic and organizational perspective for studying (1) the dynamics of cybersecurity capability development at hospitals and (2) how these internal organizational dynamics interact to form a system of hospital cybersecurity in the United States. We conducted interviews with hospital chief information officers, chief information security officers, and health care cybersecurity experts; analyzed the interview data; and developed a system dynamics model that unravels the mechanisms by which hospitals build cybersecurity capabilities. We then used simulation analysis to examine how changes to variables within the model affect the likelihood of cyberattacks across both individual hospitals and a system of hospitals. We discuss several key mechanisms that hospitals use to reduce the likelihood of cybercriminal activity. The variable that most influences the risk of cyberattack in a hospital is end point complexity, followed by internal stakeholder alignment. Although resource availability is important in fueling efforts to close cybersecurity capability gaps, low levels of resources could be compensated for by setting a high target level of cybersecurity. To enhance cybersecurity capabilities at hospitals, the main focus of chief information officers and chief information security officers should be on reducing end point complexity and improving internal stakeholder alignment. These strategies can solve cybersecurity problems more effectively than blindly pursuing more resources. On a macro level, the cyber vulnerability of a country's hospital infrastructure is affected by the vulnerabilities of all individual hospitals. In this large system, reducing variation in resource availability makes the whole system less vulnerable: a few hospitals with low resources for cybersecurity threaten the entire infrastructure of health care. In other words, hospitals need to move forward together to make the industry less attractive to cybercriminals. Moreover, although compliance is essential, it does not equal security. Hospitals should set their target level of cybersecurity beyond the requirements of current regulations and policies. As of today, policies mostly address data privacy, not data security. Thus, policy makers need to introduce policies that not only raise the target level of cybersecurity capabilities but also reduce the variability in resource availability across the entire health care system.
Narrowband thermal radiation from closed-end microcavities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kohiyama, Asaka; Shimizu, Makoto; Iguchi, Fumitada
2015-10-07
High spectral selectivity of thermal radiation is important for achieving high-efficiency energy systems. In this study, intense, narrowband, and low-directional absorption/radiation was observed in a closed-end microcavity, which is a conventional open-end microcavity covered by a semi-transparent thin metal film. The quality factor (Q factor) of the optical absorption band strongly depended on the electrical conductivity of the film. An asymmetric and narrow absorption band with a Q factor of 25 at 1.28 μm was obtained for a 6-nm-thick Au film. Numerical simulations suggest that the formation of a fixed-end mode at the cavity aperture contributes to the narrowband optical absorption. According to numerical simulation, the closed-end microcavity filled with SiO₂ exhibits intense and isotropic thermal radiation over a wide solid angle. The narrow and asymmetric absorption spectrum was experimentally confirmed in a model of the closed-end microcavity.
NASA Astrophysics Data System (ADS)
Li, Jing; Wu, Huayi; Yang, Chaowei; Wong, David W.; Xie, Jibo
2011-09-01
Geoscientists build dynamic models to simulate various natural phenomena for a better understanding of our planet. Interactive visualizations of these geoscience models and their outputs through virtual globes on the Internet can help the public understand the dynamic phenomena related to the Earth more intuitively. However, challenges arise when the volume of four-dimensional (4D) data, 3D in space plus time, is too large to render. Datasets loaded from geographically distributed data servers require synchronization between ingesting and rendering data. Also, the visualization capability of display clients varies significantly in such an online visualization environment; some may not have high-end graphics cards. To enhance the efficiency of visualizing dynamic volumetric data in virtual globes, this paper proposes a systematic framework, in which an octree-based multiresolution data structure is implemented to organize time series 3D geospatial data to be used in virtual globe environments. This framework includes a view-dependent continuous level of detail (LOD) strategy formulated as a synchronized part of the virtual globe rendering process. Through the octree-based data retrieval process, the LOD strategy enables the rendering of the 4D simulation at a consistent and acceptable frame rate. To demonstrate the capabilities of this framework, data of a simulated dust storm event are rendered in World Wind, an open source virtual globe. The rendering performances with and without the octree-based LOD strategy are compared. The experimental results show that using the proposed data structure and processing strategy significantly enhances the visualization performance when rendering dynamic geospatial phenomena in virtual globes.
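A view-dependent, octree-based LOD strategy of the kind described can be sketched as a recursive traversal that refines a node only while its projected detail exceeds an error budget. The node layout, the distance-based error proxy, and the threshold below are assumptions for illustration, not the paper's implementation.

```python
# Illustrative sketch of view-dependent level-of-detail selection over an octree
# of volumetric cells; the error metric and threshold are hypothetical.
import math
from dataclasses import dataclass, field

@dataclass
class OctreeNode:
    center: tuple          # (x, y, z) in world units
    size: float            # edge length of this cube
    depth: int
    children: list = field(default_factory=list)

def select_lod(node, camera, tau=0.01, max_depth=6):
    """Return the nodes to render: refine while projected detail exceeds tau."""
    dist = max(1e-6, math.dist(node.center, camera))
    # Simple screen-space-error proxy: cell size over viewing distance.
    if node.size / dist < tau or node.depth >= max_depth or not node.children:
        return [node]
    selected = []
    for child in node.children:
        selected.extend(select_lod(child, camera, tau, max_depth))
    return selected

# Two-level toy tree: octants near the camera refine first.
root = OctreeNode((0, 0, 0), 100.0, 0)
root.children = [OctreeNode((dx, dy, dz), 50.0, 1)
                 for dx in (-25, 25) for dy in (-25, 25) for dz in (-25, 25)]
print(len(select_lod(root, camera=(30, 0, 0))))
```

Running the traversal once per frame (or per camera move) keeps the set of fetched bricks bounded, which is what lets the frame rate stay roughly constant as the viewpoint changes.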
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siranosian, Antranik Antonio; Schembri, Philip Edward; Miller, Nathan Andrew
The Benchmark Extensible Tractable Testbed Engineering Resource (BETTER) is proposed as a family of modular test bodies that are intended to support engineering capability development by helping to identify weaknesses and needs. Weapon systems, subassemblies, and components are often complex and difficult to test and analyze, resulting in low confidence and high uncertainties in experimental and simulated results. The complexities make it difficult to distinguish between inherent uncertainties and errors due to insufficient capabilities. BETTER test bodies will first use simplified geometries and materials such that testing, data collection, modeling and simulation can be accomplished with high confidence and low uncertainty. Modifications and combinations of simple and well-characterized BETTER test bodies can then be used to increase complexity in order to reproduce relevant mechanics and identify weaknesses. BETTER can provide both immediate and long-term improvements in testing and simulation capabilities. This document presents the motivation, concept, benefits and examples for BETTER.
Development of a Test Facility for Air Revitalization Technology Evaluation
NASA Technical Reports Server (NTRS)
Lu, Sao-Dung; Lin, Amy; Campbell, Melissa; Smith, Frederick; Curley, Su
2007-01-01
Development of new air revitalization system (ARS) technology can initially be performed in a subscale laboratory environment, but in order to advance the maturity level, the technology must be tested in an end-to-end integrated environment. The Air Revitalization Technology Evaluation Facility (ARTEF) at the NASA Johnson Space Center serves as a ground test bed for evaluating emerging ARS technologies in an environment representative of spacecraft atmospheres. At the center of the ARTEF is a hypobaric chamber which serves as a sealed atmospheric chamber for closed loop testing. A Human Metabolic Simulator (HMS) was custom-built to simulate the consumption of oxygen, and production of carbon dioxide, moisture and heat of up to eight persons. A multitude of gas analyzers and dew point sensors are used to monitor the chamber atmosphere upstream and downstream of a test article. A robust vacuum system is needed to simulate the vacuum of space. A reliable data acquisition and control system is required to connect all the subsystems together. This paper presents the capabilities of the integrated test facility and some of the issues encountered during the integration.
Study on Dissemination Patterns in Location-Aware Gossiping Networks
NASA Astrophysics Data System (ADS)
Kami, Nobuharu; Baba, Teruyuki; Yoshikawa, Takashi; Morikawa, Hiroyuki
We study the properties of information dissemination over location-aware gossiping networks leveraging location-based real-time communication applications. Gossiping is a promising method for quickly disseminating messages in a large-scale system, but in its application to information dissemination for location-aware applications, it is important to consider the network topology and patterns of spatial dissemination over the network in order to achieve effective delivery of messages to potentially interested users. To this end, we propose a continuous-space network model extended from Kleinberg's small-world model applicable to actual location-based applications. Analytical and simulation-based study shows that the proposed network achieves high dissemination efficiency resulting from geographically neutral dissemination patterns as well as selective dissemination to proximate users. We have designed a highly scalable location management method capable of promptly updating the network topology in response to node movement and have implemented a distributed simulator to perform dynamic target pursuit experiments as one example of applications that are the most sensitive to message forwarding delay. The experimental results show that the proposed network surpasses other types of networks in pursuit efficiency and achieves the desirable dissemination patterns.
The GOCE end-to-end system simulator
NASA Astrophysics Data System (ADS)
Catastini, G.; Cesare, S.; de Sanctis, S.; Detoma, E.; Dumontel, M.; Floberghagen, R.; Parisch, M.; Sechi, G.; Anselmi, A.
2003-04-01
The idea of an end-to-end simulator was conceived in the early stages of the GOCE programme as an essential tool for assessing the satellite system performance, which cannot be fully tested on the ground. The simulator in its present form has been under development at Alenia Spazio for ESA since the beginning of Phase B and is being used for checking the consistency of the spacecraft and of the payload specifications with the overall system requirements, supporting trade-off, sensitivity and worst-case analyses, and preparing and testing the on-ground and in-flight calibration concepts. The software simulates the GOCE flight along an orbit resulting from the application of Earth's gravity field, non-conservative environmental disturbances (atmospheric drag, coupling with Earth's magnetic field, etc.) and control forces/torques. The drag-free control forces as well as the attitude control torques are generated by the current design of the dedicated algorithms. Realistic sensor models (star tracker, GPS receiver and gravity gradiometer) feed the control algorithms, and the commanded forces are applied through realistic thruster models. The output of this stage of the simulator is a time series of Level-0 data, namely the gradiometer raw measurements and spacecraft ancillary data. The next stage of the simulator transforms Level-0 data into Level-1b (gravity gradient tensor) data by implementing the following steps: (1) transformation of the raw measurements of each pair of accelerometers into common and differential accelerations; (2) calibration of the common and differential accelerations; (3) application of the post-facto algorithm to rectify the phase of the accelerations and to estimate the GOCE angular velocity and attitude; (4) computation of the Level-1b gravity gradient tensor from the calibrated accelerations and estimated angular velocity in different reference frames (orbital, inertial, Earth-fixed), including computation of the spectral density of the error of the tensor diagonal components (measured gravity gradient minus input gravity gradient), to verify the 4 mE/sqrt(Hz) requirement on the gravity gradient error within the gradiometer measurement bandwidth (5 to 100 mHz), and of the spectral density of the tensor trace, to verify the 4 sqrt(3) mE/sqrt(Hz) requirement within the measurement bandwidth; and (5) processing of GPS observations for orbit reconstruction within the required 10 m accuracy and for gradiometer measurement geolocation. The current version of the end-to-end simulator, essentially focusing on the gradiometer payload, is undergoing detailed testing based on a time span of 10 days of simulated flight. This testing phase, ending in January 2003, will verify the current implementation and conclude the assessment of numerical stability and precision. Following that, the exercise will be repeated on a longer-duration simulated flight, and the lessons learnt so far will be exploited to further improve the simulator's fidelity. The paper will describe the simulator's current status and will illustrate its capabilities for supporting the assessment of the quality of the scientific products resulting from the current spacecraft and payload design.
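The common/differential-mode step in the Level-0 to Level-1b chain can be illustrated schematically for a single accelerometer pair: the half-sum isolates the non-gravitational (drag-like) acceleration, while the half-difference carries the gradient signal. The toy signals, the 0.5 m baseline, and the neglect of angular-rate terms below are simplifying assumptions, not the GOCE processing algorithms.

```python
# Schematic sketch of the common/differential-mode step for one accelerometer
# pair on a baseline of length L; signals and conventions are simplified.
import numpy as np

L = 0.5                                   # assumed arm length of the pair [m]
t = np.linspace(0, 600, 6001)             # 10 minutes sampled at 10 Hz
a1 = 1e-9 * np.sin(2 * np.pi * 0.02 * t)  # accelerometer 1 (toy signal) [m/s^2]
a2 = -a1 + 2e-10                          # accelerometer 2: opposite gradient + drag

common = 0.5 * (a1 + a2)                  # non-gravitational (drag-like) part
differential = 0.5 * (a1 - a2)            # gradient-bearing part
# In-line gravity gradient estimate, angular-rate terms neglected [1/s^2]:
gradient = 2.0 * differential / L
print("drag proxy [m/s^2]:", common.mean())
print("peak gradient [E]:", 1e9 * gradient.max())   # 1 Eotvos = 1e-9 s^-2
```

The calibration and post-facto phase-rectification steps then act on exactly these common and differential channels before the tensor is assembled.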
DEF: an automated dead-end filling approach based on quasi-endosymbiosis.
Liu, Lili; Zhang, Zijun; Sheng, Taotao; Chen, Ming
2017-02-01
Gap filling for the reconstruction of metabolic networks aims to restore the connectivity of metabolites by finding high-confidence reactions that may be missing in the target organism. Current methods for gap filling either rely on network topology alone or have limited capability in finding missing reactions that are indirectly related to dead-end metabolites but of biological importance to the target model. We present an automated dead-end filling (DEF) approach, derived from the wisdom of the endosymbiosis theory, which fills gaps by finding the most efficient dead-end utilization paths in a constructed quasi-endosymbiosis model. The recalls of reactions and of dead ends by DEF reach around 73% and 86%, respectively. This method is capable of finding indirectly dead-end-related reactions of biological importance for the target organism and is applicable to any given metabolic model. In the E. coli iJR904 model, for instance, about 42% of the dead-end metabolites were fixed by our proposed method. DEF is publicly available at http://bis.zju.edu.cn/DEF/. Contact: mchen@zju.edu.cn. Supplementary data are available at Bioinformatics online.
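As a minimal illustration of the problem DEF addresses, the sketch below flags dead-end metabolites in a toy stoichiometric matrix, namely those never produced or never consumed under irreversible reactions. This is only the detection pre-step that any gap-filling method starts from, not the quasi-endosymbiosis filling method itself.

```python
# Flag dead-end metabolites in a toy stoichiometric matrix
# (rows = metabolites, columns = reactions); irreversibility is assumed.
import numpy as np

metabolites = ["A", "B", "C", "D"]
# Reactions: R1: A -> B, R2: B -> C (D appears in neither reaction)
S = np.array([[-1,  0],    # A: consumed only -> dead end (never produced)
              [ 1, -1],    # B: produced and consumed -> connected
              [ 0,  1],    # C: produced only -> dead end (never consumed)
              [ 0,  0]])   # D: orphan -> dead end

produced = (S > 0).any(axis=1)
consumed = (S < 0).any(axis=1)
dead_ends = [m for m, p, c in zip(metabolites, produced, consumed)
             if not (p and c)]
print(dead_ends)   # ['A', 'C', 'D']
```

A gap filler then searches a universal reaction database for paths that connect such metabolites back into the network; DEF's contribution is how it ranks and selects those utilization paths.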
Observability analysis of DVL/PS aided INS for a maneuvering AUV.
Klein, Itzik; Diamant, Roee
2015-10-22
Recently, ocean exploration has increased considerably through the use of autonomous underwater vehicles (AUV). A key enabling technology is the precision of the AUV navigation capability. In this paper, we focus on understanding the limitation of the AUV navigation system. That is, what are the observable error-states for different maneuvering types of the AUV? Since analyzing the performance of an underwater navigation system is highly complex, to answer the above question, current approaches use simulations. This, of course, limits the conclusions to the emulated type of vehicle used and to the simulation setup. For this reason, we take a different approach and analyze the system observability for different types of vehicle dynamics by finding the set of observable and unobservable states. To that end, we apply the observability Gramian approach, previously used only for terrestrial applications. We demonstrate our analysis for an underwater inertial navigation system aided by a Doppler velocity logger or by a pressure sensor. The result is a first prediction of the performance of an AUV standing, rotating at a position and turning at a constant speed. Our conclusions of the observable and unobservable navigation error states for different dynamics are supported by extensive numerical simulation.
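The observability Gramian test used in this analysis can be sketched for a discrete-time linear pair (A, C): accumulate the Gramian over a horizon and inspect its rank and null space. The two-state system below, with a velocity-only (DVL-like) measurement, is a toy stand-in for the full INS error model and deliberately leaves position unobservable.

```python
# Minimal observability-Gramian rank test for a discrete linear pair (A, C);
# the 2-state system is a toy example, not the paper's INS error model.
import numpy as np

A = np.array([[1.0, 0.1],
              [0.0, 1.0]])        # position-velocity kinematics, dt = 0.1 s
C = np.array([[0.0, 1.0]])        # velocity-only measurement (DVL-like)

def observability_gramian(A, C, n_steps=50):
    W = np.zeros((A.shape[0], A.shape[0]))
    Ak = np.eye(A.shape[0])
    for _ in range(n_steps):
        W += Ak.T @ C.T @ C @ Ak   # sum of (A^k)^T C^T C A^k
        Ak = A @ Ak
    return W

W = observability_gramian(A, C)
print("Gramian rank:", np.linalg.matrix_rank(W), "of", A.shape[0])
# Unobservable directions span the Gramian's (near-)null space; here the
# rank is 1, reflecting that position never reaches the velocity output.
```

Repeating the test with A matrices corresponding to different maneuvers (standing, rotating, constant-speed turns) is exactly the kind of comparison the paper carries out on the full error-state model.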
An Efficient Framework Model for Optimizing Routing Performance in VANETs
Zulkarnain, Zuriati Ahmad; Subramaniam, Shamala
2018-01-01
Routing in Vehicular Ad hoc Networks (VANETs) is complicated by the highly dynamic mobility of the nodes. The efficiency of a routing protocol is influenced by a number of factors such as network density, bandwidth constraints, traffic load, and mobility patterns, resulting in frequent changes in network topology. Therefore, Quality of Service (QoS) is strongly needed to enhance the capability of the routing protocol and improve the overall network performance. In this paper, we introduce a statistical framework model to address the problem of optimizing routing configuration parameters in Vehicle-to-Vehicle (V2V) communication. Our framework solution is based on the utilization of the network resources to reflect the current state of the network and to balance the trade-off between frequent changes in network topology and the QoS requirements. It consists of three stages: a simulation network stage used to execute different urban scenarios, a function stage used as a competitive approach to aggregate the weighted cost of the factors into a single value, and an optimization stage used to evaluate the communication cost and to obtain the optimal configuration based on the competitive cost. The simulation results show significant performance improvement in terms of the Packet Delivery Ratio (PDR), Normalized Routing Load (NRL), Packet Loss (PL), and End-to-End Delay (E2ED). PMID:29462884
A New High-Speed Oil-Free Turbine Engine Rotordynamic Simulator Test Rig
NASA Technical Reports Server (NTRS)
Howard, Samuel A.
2007-01-01
A new test rig has been developed for simulating high-speed turbomachinery rotor systems using Oil-Free foil air bearing technology. Foil air bearings have been used in turbomachinery, primarily air cycle machines, for the past four decades to eliminate the need for oil lubrication. The goal of applying this bearing technology to other classes of turbomachinery has prompted the fabrication of this test rig. The facility gives bearing designers the capability to test potential bearing designs with shafts that simulate the rotating components of a target machine without the high cost of building "make-and-break" hardware. The data collected from this rig can be used to make design changes to the shaft and bearings in subsequent design iterations. This paper describes the new test rig and demonstrates its capabilities through the initial run with a simulated shaft system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu Na; Chen Shuo; Wang Hongtao
2008-10-15
A tetracycline hydrochloride (TC) molecularly imprinted polymer (MIP) modified TiO₂ nanotube array electrode was prepared via surface molecular imprinting. Its surface was structured with surface voids, and the nanotubes were open at the top end with an average diameter of approximately 50 nm. The anatase phase of the MIP-modified TiO₂ nanotube array was identified by XRD, and a distinguishable red shift in the absorption spectrum was observed. The MIP-modified electrode also exhibited a high adsorption capacity for TC because its high surface area provides imprinted sites. Photocurrent was generated on the MIP-modified photoanode under the simulated solar spectrum and increased with increasing positive bias potential. Under simulated solar light irradiation, the MIP-modified TiO₂ nanotube array electrode exhibited enhanced photoelectrocatalytic (PEC) activity, with an apparent first-order rate constant 1.2-fold that of the unmodified TiO₂ nanotube array electrode. The effect of the thickness of the MIP layer on the PEC activity was also evaluated. Graphical abstract: A tetracycline hydrochloride molecularly imprinted polymer modified TiO₂ nanotube array electrode was prepared via surface molecular imprinting. It showed improved response to simulated solar light and higher adsorption capability for tetracycline hydrochloride, thereby exhibiting increased PEC activity under simulated solar light irradiation. The apparent first-order rate constant was 1.2-fold of that on the TiO₂ nanotube array electrode.
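An apparent first-order rate constant such as the one quoted is commonly extracted by regressing ln(C0/C) against irradiation time; the concentrations in this sketch are invented illustration data, not measurements from the study.

```python
# Extracting an apparent first-order rate constant from a degradation run:
# fit ln(C0/C) versus time; a straight line confirms first-order kinetics.
# The data below are made-up illustration values.
import numpy as np

t = np.array([0, 10, 20, 30, 40, 50], dtype=float)   # irradiation time [min]
C = np.array([20.0, 15.3, 11.7, 9.0, 6.9, 5.3])      # TC concentration [mg/L]

y = np.log(C[0] / C)                  # first-order linearization
k, intercept = np.polyfit(t, y, 1)    # slope = apparent rate constant
print(f"apparent rate constant k = {k:.4f} min^-1")
```

Comparing the fitted k for the MIP-modified and unmodified electrodes under identical irradiation gives the kind of 1.2-fold ratio reported above.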
Simulating energy cascade of shock wave formation process in a resonator by gas kinetic scheme
NASA Astrophysics Data System (ADS)
Qu, Chengwu; Zhang, Xiaoqing; Feng, Heying
2017-12-01
The temporal-spatial evolution of gas oscillation was simulated by a gas kinetic scheme (GKS) in a cylindrical resonator driven by a piston at one end and rigidly closed at the other end. Periodic shock waves propagating back and forth were observed in the resonator under finite-amplitude gas oscillation. The results demonstrated that the acoustic pressure is a saw-tooth waveform and the oscillatory velocity is a square waveform at the central position of the resonant tube. Moreover, harmonic analysis found no obvious pressure node in such a typical standing-wave resonator, and the distribution of the acoustic fields displayed a one-dimensional form for the acoustic pressure but a quasi-one-dimensional form for the oscillatory velocity, which demonstrates the nonlinear effects. The simulated axial distribution of acoustic intensity showed good consistency with experimental data published in the open literature, which verifies the effectiveness of the proposed GKS model. The influence of the displacement amplitude of the driving piston on the formation of shock waves was numerically investigated, and the simulated results revealed the cascade of harmonic wave energy from the fundamental wave to higher harmonics. In addition, this study found that the acoustic intensity at the driving end of the resonant tube increases linearly with the displacement amplitude of the piston due to nonlinear effects, rather than following the exponential variation predicted by linear theory. This research demonstrates that the GKS model is well capable of simulating nonlinear acoustic problems.
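The cascade of energy into higher harmonics can be made visible by Fourier-decomposing a saw-tooth pressure record like the one described. In the sketch below the fundamental frequency and the ideal waveform are assumptions; for an ideal saw-tooth the harmonic amplitudes fall off roughly as 1/n, which is the spectral signature of the steepened shock front.

```python
# Harmonic content of an ideal saw-tooth pressure record (toy waveform; the
# resonator's actual amplitudes and frequency are not reproduced here).
import numpy as np

f0 = 50.0                                   # assumed fundamental frequency [Hz]
fs = 51200.0                                # sampling rate [Hz]
t = np.arange(0, 1.0, 1 / fs)
saw = 2.0 * (t * f0 - np.floor(0.5 + t * f0))   # ideal saw-tooth in [-1, 1]

spec = 2.0 * np.abs(np.fft.rfft(saw)) / len(saw)   # sinusoid amplitudes
freqs = np.fft.rfftfreq(len(saw), 1 / fs)
for n in range(1, 6):                       # first five harmonics
    idx = np.argmin(np.abs(freqs - n * f0))
    print(f"harmonic {n}: amplitude {spec[idx]:.3f}")   # decays roughly as 1/n
```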
NiftySim: A GPU-based nonlinear finite element package for simulation of soft tissue biomechanics.
Johnsen, Stian F; Taylor, Zeike A; Clarkson, Matthew J; Hipwell, John; Modat, Marc; Eiben, Bjoern; Han, Lianghao; Hu, Yipeng; Mertzanidou, Thomy; Hawkes, David J; Ourselin, Sebastien
2015-07-01
NiftySim, an open-source finite element toolkit, has been designed to allow incorporation of high-performance soft tissue simulation capabilities into biomedical applications. The toolkit provides the option of execution on fast graphics processing unit (GPU) hardware, numerous constitutive models and solid-element options, membrane and shell elements, and contact modelling facilities, in a simple-to-use library. The toolkit is founded on the total Lagrangian explicit dynamics (TLED) algorithm, which has been shown to be efficient and accurate for simulation of soft tissues. The base code is written in C++, and GPU execution is achieved using the nVidia CUDA framework. In most cases, interaction with the underlying solvers can be achieved through a single Simulator class, which may be embedded directly in third-party applications such as surgical guidance systems. Advanced capabilities such as contact modelling and nonlinear constitutive models are also provided, as are more experimental technologies like reduced order modelling. A consistent description of the underlying solution algorithm, its implementation with a focus on GPU execution, and examples of the toolkit's usage in biomedical applications are provided. Efficient mapping of the TLED algorithm to parallel hardware results in very high computational performance, far exceeding that available in commercial packages. The NiftySim toolkit provides high-performance soft tissue simulation capabilities using GPU technology for biomechanical simulation research applications in medical image computing, surgical simulation, and surgical guidance applications.
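The heart of a TLED-style solver is the explicit central-difference update u_{n+1} = 2u_n - u_{n-1} + dt² M⁻¹ (f_ext - f_int(u_n)). In the sketch below a 1D chain of masses and linear springs stands in for the finite-element internal-force evaluation; it shows the time-stepping pattern only, not NiftySim's element kernels or GPU path.

```python
# Explicit central-difference (TLED-style) time stepping on a toy 1D chain;
# the spring chain replaces the real element internal-force computation.
import numpy as np

n, k, m, dt = 20, 100.0, 0.01, 1e-4        # nodes, stiffness, nodal mass, step
u_prev = np.zeros(n); u = np.zeros(n)      # displacements at steps n-1 and n
f_ext = np.zeros(n); f_ext[-1] = 0.5       # constant pull on the free end [N]

for step in range(20000):
    d = np.diff(u)                         # elongation of the n-1 springs
    f_int = np.zeros(n)
    f_int[:-1] -= k * d                    # reaction on left node of each spring
    f_int[1:]  += k * d                    # reaction on right node
    # Central-difference update; TLED applies the same explicit form per node.
    u_next = 2 * u - u_prev + dt**2 * (f_ext - f_int) / m
    u_next[0] = 0.0                        # clamped end (essential boundary)
    u_prev, u = u, u_next

# With no damping the tip oscillates about the static value 0.5*(n-1)/k = 0.095 m.
print("tip displacement:", u[-1])
```

Because each node's update depends only on local forces from the previous step, the loop body maps one-thread-per-node onto a GPU, which is where the reported speedups come from.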
NASA Technical Reports Server (NTRS)
Justh, Hilary L.; Justus, C. G.
2008-01-01
Mars-GRAM is an engineering-level atmospheric model widely used for diverse mission applications. Mars-GRAM's perturbation modeling capability is commonly used, in a Monte-Carlo mode, to perform high fidelity engineering end-to-end simulations for entry, descent, and landing (EDL). Traditional Mars-GRAM options for representing the mean atmosphere along entry corridors include: (a) TES Mapping Years 1 and 2, with Mars-GRAM data coming from MGCM model results driven by observed TES dust optical depth; and (b) TES Mapping Year 0, with user-controlled dust optical depth and Mars-GRAM data interpolated from MGCM model results driven by selected values of globally-uniform dust optical depth. From the surface to 80 km altitude, Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM). Mars-GRAM and MGCM use surface topography from the Mars Global Surveyor Mars Orbiter Laser Altimeter (MOLA), with altitudes referenced to the MOLA areoid, or constant potential surface. Mars-GRAM 2005 has been validated against Radio Science data, and against both nadir and limb data from the Thermal Emission Spectrometer (TES).
NASA Astrophysics Data System (ADS)
Jedlovec, G.; Molthan, A.; Zavodsky, B.; Case, J.; Lafontaine, F.
2010-12-01
The NASA Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique observations and research capabilities to the operational weather community, with a goal of improving short-term forecasts on a regional scale. Advances in research computing have led to “Climate in a Box” systems, with hardware configurations capable of producing high resolution, near real-time weather forecasts, but with footprints, power, and cooling requirements that are comparable to desktop systems. The SPoRT Center has developed several capabilities for incorporating unique NASA research capabilities and observations with real-time weather forecasts. Planned utilization includes the development of a fully-cycled data assimilation system used to drive 36-48 hour forecasts produced by the NASA Unified version of the Weather Research and Forecasting (WRF) model (NU-WRF). The horsepower provided by the “Climate in a Box” system is expected to facilitate the assimilation of vertical profiles of temperature and moisture provided by the Atmospheric Infrared Sounder (AIRS) aboard the NASA Aqua satellite. In addition, the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA’s Aqua and Terra satellites provide high-resolution sea surface temperatures and vegetation characteristics. The development of MODIS normalized difference vegetation index (NDVI) composites for use within the NASA Land Information System (LIS) will assist in the characterization of vegetation, and subsequently the surface albedo and processes related to soil moisture. Through application of satellite simulators, NASA satellite instruments can be used to examine forecast model errors in cloud cover and other characteristics. Through the aforementioned application of the “Climate in a Box” system and NU-WRF capabilities, an end goal is the establishment of a real-time forecast system that fully integrates modeling and analysis capabilities developed within the NASA SPoRT Center, with benefits provided to the operational forecasting community.
CFD simulation and experimental validation of a GM type double inlet pulse tube refrigerator
NASA Astrophysics Data System (ADS)
Banjare, Y. P.; Sahoo, R. K.; Sarangi, S. K.
2010-04-01
The pulse tube refrigerator has the advantages of long life and low vibration over conventional cryocoolers, such as GM and Stirling coolers, because of the absence of moving parts at low temperature. This paper performs a three-dimensional computational fluid dynamics (CFD) simulation of a vertically aligned GM-type double-inlet pulse tube refrigerator (DIPTR) operating under a variety of thermal boundary conditions. A commercial CFD software package, Fluent 6.1, is used to model the oscillating flow inside the pulse tube refrigerator. The simulation represents a fully coupled system operating in steady-periodic mode. The externally imposed boundary conditions are a sinusoidal pressure inlet, applied through a user-defined function at one end of the tube, and constant temperature or heat flux boundaries at the external walls of the cold-end heat exchangers. Evaluating the optimum parameters of a DIPTR experimentally is difficult; on the other hand, developing a computer code for CFD analysis is equally complex. The objectives of the present investigation are to ascertain the suitability of the CFD-based commercial package Fluent for the study of energy and fluid flow in a DIPTR and to validate the CFD simulation results against available experimental data. General results, such as the cool-down behaviour of the system, the phase relation between mass flow rate and pressure at the cold end, the temperature profile along the wall of the cooler, and the refrigeration load, are presented for different boundary conditions of the system. The results confirm that CFD-based Fluent simulations are capable of elucidating the complex periodic processes in a DIPTR. The results also show excellent agreement between the CFD simulation results and the experimental results.
Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckers, Koenraad J; McCabe, Kevin
This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal Energy for Production of Heat and electricity (IR) Economically Simulated). GEOPHIRES combines reservoir, wellbore, surface plant, and economic models to estimate the capital and operation and maintenance costs, lifetime energy production, and overall levelized cost of energy of a geothermal plant. The available end-use options are electricity, direct-use heat, and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to couple to an external reservoir simulator, updated cost correlations, and more flexibility in selecting the time step and the number of injection and production wells. An overview of all the updates and two case studies illustrating the tool's new capabilities are provided in this paper.
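The levelized-cost calculation at the core of a tool like GEOPHIRES can be sketched as discounted lifetime cost divided by discounted lifetime energy; every input below is a hypothetical placeholder rather than a GEOPHIRES default.

```python
# Sketch of a levelized-cost-of-energy (LCOE) calculation: discounted lifetime
# costs over discounted lifetime energy. All inputs are illustrative placeholders.
capex = 30e6            # capital cost [$]
opex = 1.2e6            # annual O&M cost [$/yr]
energy = 70e6           # annual net electricity [kWh/yr]
r, years = 0.07, 30     # discount rate, plant lifetime

disc_cost = capex + sum(opex / (1 + r) ** t for t in range(1, years + 1))
disc_energy = sum(energy / (1 + r) ** t for t in range(1, years + 1))
print(f"LCOE = {100 * disc_cost / disc_energy:.2f} cents/kWh")
```

In a full tool the annual energy term comes from the coupled reservoir and wellbore models (declining with reservoir drawdown) rather than being a constant, which is precisely why the external-reservoir-simulator coupling matters.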
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flueck, Alex
The “High Fidelity, Faster than Real-Time Simulator for Predicting Power System Dynamic Behavior” was designed and developed by Illinois Institute of Technology with critical contributions from Electrocon International, Argonne National Laboratory, Alstom Grid and McCoy Energy. Also essential to the project were our two utility partners: Commonwealth Edison and AltaLink. The project was a success due to several major breakthroughs in the area of large-scale power system dynamics simulation, including (1) a validated faster-than-real-time simulation of both stable and unstable transient dynamics in a large-scale positive sequence transmission grid model, (2) a three-phase unbalanced simulation platform for modeling new grid devices, such as independently controlled single-phase static var compensators (SVCs), (3) the world’s first high fidelity three-phase unbalanced dynamics and protection simulator based on Electrocon’s CAPE program, and (4) a first-of-its-kind implementation of a single-phase induction motor model with stall capability. The simulator results will aid power grid operators in their true time of need, when there is a significant risk of cascading outages. The simulator will accelerate performance and enhance accuracy of dynamics simulations, enabling operators to maintain reliability and steer clear of blackouts. In the long term, the simulator will form the backbone of the newly conceived hybrid real-time protection and control architecture that will coordinate local controls, wide-area measurements, wide-area controls and advanced real-time prediction capabilities. The nation’s citizens will benefit in several ways, including (1) less down time from power outages due to the faster-than-real-time simulator’s predictive capability, (2) higher levels of reliability due to the detailed dynamics plus protection simulation capability, and (3) more resiliency due to the three-phase unbalanced simulator’s ability to model three-phase and single-phase networks and devices.
Li, Guibing; Yang, Jikuang; Simms, Ciaran
2016-07-03
The purpose of this study is to define a computationally efficient virtual test system (VTS) to assess the aggressivity of vehicle front-end designs to pedestrians, accounting for the distribution of pedestrian impact configurations, for use in future vehicle front-end optimization. The VTS should represent real-world impact configurations in terms of the distribution of vehicle impact speeds, pedestrian walking speeds, pedestrian gait, and pedestrian height. The distribution of injuries as a function of body region, vehicle impact speed, and pedestrian size produced using this VTS should match the distribution of injuries observed in the accident data. The VTS should have the predictive ability to distinguish the aggressivity of different vehicle front-end designs to pedestrians. The proposed VTS includes 2 parts: a simulation test sample (STS) and an injury weighting system (IWS). The STS was defined based on MADYMO multibody vehicle-to-pedestrian impact simulations accounting for the range of vehicle impact speeds, pedestrian heights, pedestrian gait, and walking speed to represent real-world impact configurations using the Pedestrian Crash Data Study (PCDS) and anthropometric data. In total, 1,300 impact configurations were accounted for in the STS. Three vehicle shapes were then tested using the STS. The IWS was developed to weight the predicted injuries in the STS using the estimated proportion of each impact configuration in the PCDS accident data. A weighted injury number (WIN) was defined as the resulting output of the VTS. The WIN is the weighted number of average Abbreviated Injury Scale (AIS) 2+ injuries recorded per impact simulation in the STS. The predictive capability of the VTS was then evaluated by comparing the distributions of AIS 2+ injuries to different pedestrian body regions and heights, as well as vehicle types and impact speeds, with those from the PCDS database. Further, a parametric analysis was performed with the VTS to assess the sensitivity of the injury predictions to changes in vehicle shape (type) and stiffness, to establish the potential for using the VTS for future vehicle front-end optimization. An STS of 1,300 multibody simulations and an IWS based on the distribution of impact speed, pedestrian height, gait stance, and walking speed is broadly capable of predicting the distribution of pedestrian injuries observed in the PCDS database when the same vehicle type distribution as the accident data is employed. The sensitivity study shows significant variations in the WIN when either vehicle type or stiffness is altered. Injury predictions derived from the VTS give a good representation of the distribution of injuries observed in the PCDS and can distinguish the aggressivity of different vehicle front-end designs to pedestrians. The VTS can be considered an effective approach for assessing the pedestrian safety performance of vehicle front-end designs at a generalized level. However, the absolute injury number is substantially underpredicted by the VTS, and this needs further development.
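The WIN aggregation described above reduces, in sketch form, to a weighted sum of per-configuration injury counts, with weights given by each configuration's share in the accident data; the counts and weights below are invented for illustration.

```python
# Sketch of the weighted-injury-number (WIN) aggregation: each simulated impact
# configuration contributes its predicted AIS 2+ injury count, weighted by that
# configuration's share in the accident data. All values are invented.
import numpy as np

rng = np.random.default_rng(1)
n_cfg = 1300                                       # size of the simulation test sample
injuries = rng.poisson(0.8, n_cfg).astype(float)   # AIS 2+ injuries per run (toy)
raw_w = rng.random(n_cfg)                          # accident-data configuration shares
weights = raw_w / raw_w.sum()                      # normalize to a distribution

win = float(np.sum(weights * injuries))            # weighted injuries per simulation
print(f"WIN = {win:.3f} weighted AIS 2+ injuries per simulation")
```

Comparing the WIN across candidate front-end geometries, holding the weights fixed, is what makes the metric usable as an optimization objective.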
Grace: A cross-platform micromagnetic simulator on graphics processing units
NASA Astrophysics Data System (ADS)
Zhu, Ru
2015-12-01
A micromagnetic simulator running on graphics processing units (GPUs) is presented. Unlike the GPU implementations of other research groups, which predominantly run on NVidia's CUDA platform, this simulator is developed with C++ Accelerated Massive Parallelism (C++ AMP) and is hardware-platform independent. It runs on GPUs from vendors including NVidia, AMD and Intel, and achieves a significant performance boost compared to previous central processing unit (CPU) simulators, up to two orders of magnitude. The simulator paves the way for running large micromagnetic simulations on both high-end workstations with dedicated graphics cards and low-end personal computers with integrated graphics cards, and is freely available to download.
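The core update in any micromagnetic stepper is integration of the Landau-Lifshitz-Gilbert equation; the sketch below relaxes a single macrospin toward a static field with an explicit Heun step in reduced units. It is a plain-Python illustration of the physics, not Grace's C++ AMP GPU kernels.

```python
# Landau-Lifshitz-Gilbert relaxation of a single macrospin (reduced units);
# a scalar sketch of the update a micromagnetic solver applies per cell.
import numpy as np

alpha, dt = 0.1, 1e-3                       # Gilbert damping, reduced time step
H = np.array([0.0, 0.0, 1.0])               # applied field direction
m = np.array([1.0, 0.0, 0.0])               # initial magnetization

def llg_rhs(m, H, alpha):
    prec = -np.cross(m, H)                        # precession term
    damp = -alpha * np.cross(m, np.cross(m, H))   # damping term
    return (prec + damp) / (1 + alpha**2)

for _ in range(100000):
    k1 = llg_rhs(m, H, alpha)
    k2 = llg_rhs(m + dt * k1, H, alpha)     # Heun predictor-corrector step
    m = m + 0.5 * dt * (k1 + k2)
    m /= np.linalg.norm(m)                  # renormalize |m| = 1

print("relaxed m:", np.round(m, 4))         # aligns with H: ~[0, 0, 1]
```

In a full simulator the same update runs for millions of cells with H replaced by an effective field (exchange, demagnetizing, anisotropy terms), which is the data-parallel workload that maps naturally onto GPUs.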
Benefits of a Unified LaSRS++ Simulation for NAS-Wide and High-Fidelity Modeling
NASA Technical Reports Server (NTRS)
Glaab, Patricia; Madden, Michael
2014-01-01
The LaSRS++ high-fidelity vehicle simulation was extended in 2012 to support a NAS-wide simulation mode. Since the initial proof-of-concept, the LaSRS++ NAS-wide simulation is maturing into a research-ready tool. A primary benefit of this new capability is the consolidation of the two modeling paradigms under a single framework to save cost, facilitate iterative concept testing between the two tools, and to promote communication and model sharing between user communities at Langley. Specific benefits of each type of modeling are discussed along with the expected benefits of the unified framework. Current capability details of the LaSRS++ NAS-wide simulations are provided, including the visualization tool, live data interface, trajectory generators, terminal routing for arrivals and departures, maneuvering, re-routing, navigation, winds, and turbulence. The plan for future development is also described.
Transverse Diode Pumping of Solid-State Lasers
1992-05-29
more common apertures (laser rod end and cavity end mirror) leads to a thin-film coating damage issue. The transverse pumped geometry avoids the... proprietary one-half-inch-square cooler developed for high-power adaptive optics mirror applications. The laser performance observed, with up to 35 watts of... including the development of active mirrors capable of sustaining high power loadings. As part of those efforts, TTC has developed a small (one-half inch
NASA Technical Reports Server (NTRS)
Goodrich, Kenneth H.
1993-01-01
A batch air combat simulation environment, the tactical maneuvering simulator (TMS), is presented. The TMS is a tool for developing and evaluating tactical maneuvering logics, but it can also be used to evaluate the tactical implications of perturbations to aircraft performance or supporting systems. The TMS can simulate air combat between any number of engagement participants, with practical limits imposed by computer memory and processing power. Aircraft are modeled using equations of motion, control laws, aerodynamics, and propulsive characteristics equivalent to those used in high-fidelity piloted simulations. Databases representative of a modern high-performance aircraft with and without thrust-vectoring capability are included. To simplify the task of developing and implementing maneuvering logics in the TMS, an outer-loop control system, the tactical autopilot (TA), is implemented in the aircraft simulation model. The TA converts the desired angle-of-attack and wind-axis bank-angle commands generated by computerized maneuvering logics into inputs for the aircraft's inner-loop control augmentation system. The capabilities and operation of the TMS and the TA are described.
An end-to-end communications architecture for condition-based maintenance applications
NASA Astrophysics Data System (ADS)
Kroculick, Joseph
2014-06-01
This paper explores challenges in implementing an end-to-end communications architecture for Condition-Based Maintenance Plus (CBM+) data transmission that aligns with the Army's Network Modernization Strategy. The Army's Network Modernization Strategy is based on rolling out network capabilities which connect the smallest unit and Soldier level to enterprise systems. CBM+ is a continuous improvement initiative over the life cycle of a weapon system or equipment to improve the reliability and maintenance effectiveness of Department of Defense (DoD) systems. CBM+ depends on the collection, processing and transport of large volumes of data. An important capability that enables CBM+ is an end-to-end network architecture that enables data to be uploaded from the platform at the tactical level to enterprise data analysis tools. To connect end-to-end maintenance processes in the Army's supply chain, a CBM+ network capability can be developed from available network capabilities.
Hagland, Mark
2010-03-01
CIOs must ensure the creation of a technology foundation underlying the implementation of new applications, in order to guarantee continuous computing and other essential characteristics of IT service for end-users, going forward. Focusing on the needs of end-users will be essential to creating that foundation. End-user expectations are already outstripping technological capabilities, putting pressure on CIOs to carefully balance the offering of highly desired applications with the creation of a strong tech foundation to undergird those apps.
Autonomous Commanding of the WIRE Spacecraft
NASA Technical Reports Server (NTRS)
Prior, Mike; Walyus, Keith; Saylor, Rick
1999-01-01
This paper presents the end-to-end design architecture for an autonomous commanding capability to be used on the Wide Field Infrared Explorer (WIRE) mission for the uplink of command loads during unattended station contacts. The WIRE mission is the fifth and final mission of NASA's Goddard Space Flight Center Small Explorer (SMEX) series to be launched in March of 1999. Its primary mission is the targeting of deep space fields using an ultra-cooled infrared telescope. Due to its mission design, WIRE command loads are large (approximately 40 Kbytes per 24 hours) and must be performed daily. To reduce the cost of mission operations support that would be required in order to uplink command loads, the WIRE Flight Operations Team has implemented an autonomous command loading capability. This capability allows completely unattended operations over a typical two-day weekend period. The key factors driving design and implementation of this capability were: 1) Integration with already existing ground system autonomous capabilities and systems, 2) The desire to evolve autonomous operations capabilities based upon previous SMEX operations experience, 3) Integration with ground station operations - both autonomous and man-tended, 4) Low cost and quick implementation, and 5) End-to-end system robustness. A trade-off study was performed to examine these factors in light of the low-cost, higher-risk SMEX mission philosophy. The study concluded that a STOL (Spacecraft Test and Operations Language) based script, highly integrated with other scripts used to perform autonomous operations, was best suited given the budget and goals of the mission. Each of these factors is discussed to provide an overview of the autonomous operations capabilities implemented for the mission. The capabilities implemented on the WIRE mission are an example of a low-cost, robust, and efficient method for autonomous command loading when implemented with other autonomous features of the ground system. They can be used as a design and implementation template by other small satellite missions interested in evolving toward autonomous and lower cost operations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zou, Ling; Berry, R. A.; Martineau, R. C.
The RELAP-7 code is the next generation nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). The code is based on the INL's modern scientific software development framework, MOOSE (Multi-Physics Object Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's and TRACE's capabilities and extends their analysis capabilities for all reactor system simulation scenarios. The RELAP-7 code utilizes the well-posed 7-equation two-phase flow model for compressible two-phase flow. Closure models used in the TRACE code have been reviewed and selected to reflect the progress made during the past decades and to provide a basis for the closure correlations implemented in the RELAP-7 code. This document provides a summary of the closure correlations that are currently implemented in the RELAP-7 code. The closure correlations include sub-grid models that describe interactions between the fluids and the flow channel, and interactions between the two phases.
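For context, a 7-equation two-phase model of this class comprises phasic mass, momentum, and energy balances for each phase plus a volume-fraction transport equation. A representative Baer-Nunziato-type form is sketched below in LaTeX; the exact RELAP-7 formulation and its interfacial closures (p_i, u_i, Gamma_k, F_k, Q_k, mu) may differ from this illustration.

\partial_t(\alpha_k \rho_k) + \nabla \cdot (\alpha_k \rho_k \mathbf{u}_k) = \Gamma_k
\partial_t(\alpha_k \rho_k \mathbf{u}_k) + \nabla \cdot (\alpha_k \rho_k \mathbf{u}_k \otimes \mathbf{u}_k) + \nabla(\alpha_k p_k) = p_i \nabla \alpha_k + \mathbf{F}_k
\partial_t(\alpha_k \rho_k E_k) + \nabla \cdot \big( \alpha_k (\rho_k E_k + p_k) \mathbf{u}_k \big) = p_i \mathbf{u}_i \cdot \nabla \alpha_k + Q_k, \quad k \in \{\ell, g\}
\partial_t \alpha_g + \mathbf{u}_i \cdot \nabla \alpha_g = \mu (p_g - p_\ell)

The six phasic balances (two phases, three conservation laws each) plus the volume-fraction equation give the seven equations; pressure disequilibrium between the phases is relaxed at the rate mu.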
Teaching End-of-Life Care Using Interprofessional Simulation.
Gannon, Jane; Motycka, Carol; Egelund, Eric; Kraemer, Dale F; Smith, W Thomas; Solomon, Kathleen
2017-04-01
Competency in end-of-life (EOL) care is a growing expectation for health professions students. This study assessed the impact of four EOL care scenarios, using high-fidelity simulation, on the perceived learning needs and attitudes of pharmacy and nursing students. On three campuses, pharmacy students (N = 158) were exposed to standard paper EOL case scenarios, while a fourth campus exposed eight graduate nursing and 37 graduate pharmacy students to simulated versions of the same cases. The paper-based groups produced similar pre-post changes on the End of Life Professional Caregiver Survey. Results were pooled and compared with the simulation-only group, revealing significantly higher pre-post score changes for the simulation group. Students participating in the simulation group showed some significant differences in attitudes toward EOL care, compared with students in the classroom setting. [J Nurs Educ. 2017;56(4):205-210.] Copyright 2017, SLACK Incorporated.
Human and Robotic Space Mission Use Cases for High-Performance Spaceflight Computing
NASA Technical Reports Server (NTRS)
Some, Raphael; Doyle, Richard; Bergman, Larry; Whitaker, William; Powell, Wesley; Johnson, Michael; Goforth, Montgomery; Lowry, Michael
2013-01-01
Spaceflight computing is a key resource in NASA space missions and a core determining factor of spacecraft capability, with ripple effects throughout the spacecraft, end-to-end system, and mission. Onboard computing can be aptly viewed as a "technology multiplier" in that advances provide direct dramatic improvements in flight functions and capabilities across the NASA mission classes, and enable new flight capabilities and mission scenarios, increasing science and exploration return. Space-qualified computing technology, however, has not advanced significantly in well over ten years and the current state of the practice fails to meet the near- to mid-term needs of NASA missions. Recognizing this gap, the NASA Game Changing Development Program (GCDP), under the auspices of the NASA Space Technology Mission Directorate, commissioned a study on space-based computing needs, looking out 15-20 years. The study resulted in a recommendation to pursue high-performance spaceflight computing (HPSC) for next-generation missions, and a decision to partner with the Air Force Research Lab (AFRL) in this development.
Development of a High Dynamic Range Pixel Array Detector for Synchrotrons and XFELs
NASA Astrophysics Data System (ADS)
Weiss, Joel Todd
Advances in synchrotron radiation light source technology have opened new lines of inquiry in material science, biology, and everything in between. However, x-ray detector capabilities must advance in concert with light source technology to fully realize experimental possibilities. X-ray free electron lasers (XFELs) place particularly large demands on the capabilities of detectors, and developments towards diffraction-limited storage ring sources also necessitate detectors capable of measuring very high flux [1-3]. The detector described herein builds on the Mixed Mode Pixel Array Detector (MM-PAD) framework, developed previously by our group to perform high dynamic range imaging, and the Adaptive Gain Integrating Pixel Detector (AGIPD) developed for the European XFEL by a collaboration between Deutsches Elektronen-Synchrotron (DESY), the Paul-Scherrer-Institute (PSI), the University of Hamburg, and the University of Bonn, led by Heinz Graafsma [4, 5]. The feasibility of combining adaptive gain with charge removal techniques to increase dynamic range in XFEL experiments is assessed by simulating XFEL scatter with a pulsed infrared laser. The strategy is incorporated into pixel prototypes which are evaluated with direct current injection to simulate very high incident x-ray flux. A fully functional 16x16 pixel hybrid integrating x-ray detector featuring several different pixel architectures based on the prototypes was developed. This dissertation describes its operation and characterization. To extend dynamic range, charge is removed from the integration node of the front-end amplifier without interrupting integration. The number of times this process occurs is recorded by a digital counter in the pixel. The parameter limiting full well is thereby shifted from the size of an integration capacitor to the depth of a digital counter. The result is similar to that achieved by counting pixel array detectors, but the integrators presented here are designed to tolerate a sustained flux >10^11 x-rays/pixel/second. In addition, digitization of residual analog signals allows sensitivity for single x-rays or low flux signals. Pixel high flux linearity is evaluated by direct exposure to an unattenuated synchrotron source x-ray beam and flux measurements of more than 10^10 9.52 keV x-rays/pixel/s are made. Detector sensitivity to small signals is evaluated and dominant sources of error are identified. These new pixels boast multiple orders of magnitude improvement in maximum sustained flux over the MM-PAD, which is capable of measuring a sustained flux in excess of 10^8 x-rays/pixel/second while maintaining sensitivity to smaller signals, down to single x-rays.
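The dynamic-range bookkeeping described above reduces to one relation (the notation here is assumed for illustration): if the in-pixel counter records n charge-removal events of a fixed quantum Q_r, and the residual integrated charge Q_res is digitized at readout, then

S_{\text{meas}} = n\,Q_r + Q_{\text{res}}, \qquad S_{\text{max}} \approx 2^{b}\,Q_r

where b is the counter depth. The saturation limit is therefore set by the width of the digital counter rather than by the size of the integration capacitor, which is the mechanism the abstract credits for the extended full well.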
NASA Technical Reports Server (NTRS)
Grabbe, Shon R.
2017-01-01
This presentation provides a high-level overview of NASA's Future ATM Concepts Evaluation Tool (FACET), including a description of the system's inputs and outputs. It is designed to support the joint simulations that NASA and the Chinese Aeronautical Establishment (CAE) will conduct under an existing Memorandum of Understanding.
Improving the Analysis Capabilities of the Synthetic Theater Operations Research Model (STORM)
2014-09-01
course of action CSG carrier strike group DMSO defense modeling and simulation DOD Department of Defense DOE design of experiments ESG ... development of an overall objective or end-state; a ways (courses of action); and a means (available resources). STORM is a campaign analysis tool that ... refers to the courses of action (COA) that are carefully planned out in advance by individuals relevant to a specific campaign (such as N81). For
ERIC Educational Resources Information Center
Sole, Marla A.
2016-01-01
Open-ended questions that can be solved using different strategies help students learn and integrate content, and provide teachers with greater insights into students' unique capabilities and levels of understanding. This article provides a problem that was modified to allow for multiple approaches. Students tended to employ high-powered, complex,…
NASA Astrophysics Data System (ADS)
Riecken, Mark; Lessmann, Kurt; Schillero, David
2016-05-01
The Data Distribution Service (DDS) was started by the Object Management Group (OMG) in 2004. Currently, DDS is one of the contenders to support the Internet of Things (IoT) and the Industrial IoT (IIoT). DDS has also been used as a distributed simulation architecture. Given the anticipated proliferation of IoT and IIoT devices, along with the explosive growth of sensor technology, can we expect this to have an impact on the broader community of distributed simulation? If it does, what is the impact and which distributed simulation domains will be most affected? DDS shares many of the same goals and characteristics of distributed simulation such as the need to support scale and an emphasis on Quality of Service (QoS) that can be tailored to meet the end user's needs. In addition, DDS has some built-in features such as security that are not present in traditional distributed simulation protocols. If the IoT and IIoT realize their potential, we predict a large base of technology to be built around this distributed data paradigm, much of which could be directly beneficial to the distributed M&S community. In this paper we compare some of the perceived gaps and shortfalls of current distributed M&S technology to the emerging capabilities of DDS built around the IoT. Although some trial work has been conducted in this area, we propose a more focused examination of the potential of these new technologies and their applicability to current and future problems in distributed M&S. The Internet of Things (IoT) and its data communications mechanisms such as the Data Distribution Service (DDS) share properties in common with distributed modeling and simulation (M&S) and its protocols such as the High Level Architecture (HLA) and the Test and Training Enabling Architecture (TENA). This paper proposes a framework based on the sensor use case for how the two communities of practice (CoP) can benefit from one another and achieve greater capability in practical distributed computing.
Precipitation information from GNSS Polarimetric Radio Occultation observations
NASA Astrophysics Data System (ADS)
Padulles, R.; Cardellach, E.; Turk, J.; Tomás, S.; Ao, C. O.; de la Torre-Juárez, M.
2017-12-01
There is currently a gap in satellite observations of the moisture structure during heavy precipitation conditions, since infrared and microwave sounders cannot sense water vapor structure near the surface in the presence of intense precipitation. Conversely, Global Navigation Satellite System (GNSS) Radio Occultations (RO) can profile the moisture structure with high precision and vertical resolution, but cannot directly indicate the presence of precipitation. Polarimetric RO (PRO) measurements have been proposed as a method to characterize heavy rain in GNSS RO, by measuring the polarimetric differential phase delay induced by large size hydrometeors. The PRO concept will be tested from space for the first time on board the Spanish PAZ satellite, planned for launch by the end of 2017. Therefore, for the first time ever, GNSS RO measurements will be taken at two polarizations, to exploit the potential capabilities of polarimetric RO for detecting and quantifying heavy precipitation events. If the concept is proved, PAZ will enable a new application of GNSS radio occultation observations, by providing coincident thermodynamic and precipitation information with high vertical resolution within regions with thick clouds. Before the launch, a series of studies have been performed in order to assess the retrieval of precipitation information from the polarimetric observations. These studies have been based on coincident observations from the COSMIC/FORMOSAT-3 RO satellite constellation, and the TRMM and GPM missions. This massive collocation exercise allowed us to build a series of Look Up Tables that relate probabilistically the precipitation intensity to the polarimetric observables. These studies required a prior characterization of the polarimetric observable, since it contains contributions from the ionosphere and the emitting and receiving systems. For this purpose, complete end-to-end simulations have been performed, where information from the ionospheric state, the Earth magnetic field, the along-ray precipitation, the impurities at emission, and the effects introduced by the receiver have been taken into account. The results of the simulations and the expected PRO products (vertical profiles of precipitation information and thermodynamic parameters) will be presented here.
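In simplified form (ignoring the ionospheric and instrumental contributions the abstract says must be characterized and removed), the polarimetric observable accumulates along the ray path in the same way as the specific differential phase used in weather radar polarimetry:

\Delta\Phi = \Phi_H - \Phi_V \approx \int_{\text{ray}} K_{dp}(s)\, \mathrm{d}s

where K_dp (deg/km) grows with the concentration of large, preferentially oriented oblate hydrometeors, so heavier rain along the occultation ray yields a larger differential phase delay between the two polarizations.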
Proto-Flight Manipulator Arm (P-FMA)
NASA Technical Reports Server (NTRS)
Britton, W. R.
1977-01-01
The technical development of the Proto-Flight Manipulator Arm (P-FMA) which is a seven-degree-of-freedom general-purpose arm capable of being remotely operated in an earth orbital environment is discussed. The P-FMA is a unique manipulator, combining the capabilities of significant dexterity, high tip forces, precise motion control, gear backdriveability, high end effector grip forces and torques, and the quality of flightworthiness. The 2.4-meter (8-foot) arm weighs 52.2 kilograms (115 pounds).
NASA Astrophysics Data System (ADS)
Anantharaj, V. G.; Venzke, J.; Lingerfelt, E.; Messer, B.
2015-12-01
Climate model simulations are used to understand the evolution and variability of earth's climate. Unfortunately, high-resolution multi-decadal climate simulations can take days to weeks to complete. Typically, the simulation results are not analyzed until the model runs have ended. During the course of the simulation, the output may be processed periodically to ensure that the model is performing as expected. However, most of the data analytics and visualization are not performed until the simulation is finished. The lengthy time period needed for the completion of the simulation constrains the productivity of climate scientists. Our implementation of near real-time data visualization analytics capabilities allows scientists to monitor the progress of their simulations while the model is running. Our analytics software executes concurrently in a co-scheduling mode, monitoring data production. When new data are generated by the simulation, a co-scheduled data analytics job is submitted to render visualization artifacts of the latest results. These visualization outputs are automatically transferred to Bellerophon's data server located at ORNL's Compute and Data Environment for Science (CADES) where they are processed and archived into Bellerophon's database. During the course of the experiment, climate scientists can then use Bellerophon's graphical user interface to view animated plots and their associated metadata. The quick turnaround from the start of the simulation until the data are analyzed permits research decisions and projections to be made days or sometimes even weeks sooner than otherwise possible. The supercomputer resources used to run the simulation are unaffected by co-scheduling the data visualization jobs, so the model runs continuously while the data are visualized. Our just-in-time data visualization software looks to increase climate scientists' productivity as climate modeling moves into the exascale era of computing.
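A minimal sketch of the co-scheduled monitoring pattern described above follows; the paths, file pattern, and scheduler command are assumptions for illustration, not the Bellerophon implementation.

# Watch the running simulation's output directory and submit a visualization
# job for each new file, so rendering proceeds while the model keeps running.
import subprocess, time
from pathlib import Path

OUTPUT_DIR = Path("/lustre/climate_run/output")   # assumed output location
SEEN: set[Path] = set()

def submit_viz_job(data_file: Path) -> None:
    # Submit a co-scheduled rendering job to the batch system
    # (the script name and use of sbatch are illustrative).
    subprocess.run(["sbatch", "render_viz.sh", str(data_file)], check=True)

while True:
    for f in sorted(OUTPUT_DIR.glob("*.nc")):
        if f not in SEEN:                  # new output from the running model
            submit_viz_job(f)
            SEEN.add(f)
    time.sleep(300)                        # poll every five minutes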
Interactive Correlation Analysis and Visualization of Climate Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Kwan-Liu
The relationship between our ability to analyze and extract insights from visualization of climate model output and the capability of the available resources to make those visualizations has reached a crisis point. The large volume of data currently produced by climate models is overwhelming the current, decades-old visualization workflow. The traditional methods for visualizing climate output also have not kept pace with changes in the types of grids used, the number of variables involved, and the number of different simulations performed with a climate model or the feature-richness of high-resolution simulations. This project has developed new and faster methods for visualization in order to get the most knowledge out of the new generation of high-resolution climate models. While traditional climate images will continue to be useful, there is need for new approaches to visualization and analysis of climate data if we are to gain all the insights available in ultra-large data sets produced by high-resolution model output and ensemble integrations of climate models such as those produced for the Coupled Model Intercomparison Project. Towards that end, we have developed new visualization techniques for performing correlation analysis. We have also introduced highly scalable, parallel rendering methods for visualizing large-scale 3D data. This project was done jointly with climate scientists and visualization researchers at Argonne National Laboratory and NCAR.
Data collection and simulation of high range resolution laser radar for surface mine detection
NASA Astrophysics Data System (ADS)
Steinvall, Ove; Chevalier, Tomas; Larsson, Håkan
2006-05-01
Rapid and efficient detection of surface mines, IEDs (Improvised Explosive Devices) and UXO (Unexploded Ordnance) is of high priority in military conflicts. High range resolution laser radars combined with passive hyper/multispectral sensors offer an interesting concept to help solve this problem. This paper reports on laser radar data collection of various surface mines in different types of terrain. In order to evaluate the capability of 3D imaging for detecting and classifying the objects of interest, a scanning laser radar was used to scan mines and surrounding terrain with high angular and range resolution. These data were then fed into a laser radar model capable of generating range waveforms for a variety of system parameters and combinations of different targets and backgrounds. We can thus simulate a potential system by down sampling to relevant pixel sizes and laser/receiver characteristics. Data, simulations and examples will be presented.
Hagerman, Amy D; Ward, Michael P; Anderson, David P; Looney, J Chris; McCarl, Bruce A
2013-07-01
In this study our aim was to value the benefits of rapid effective trace-back capability, based on a livestock identification system, in the event of a foot and mouth disease (FMD) outbreak. We simulated an FMD outbreak in the Texas High Plains, an area of high livestock concentration, beginning in a large feedlot. Disease spread was simulated under different time-dependent animal tracing scenarios. In the specific scenario modeled (incursion of FMD within a large feedlot, detection within 14 days and 90% effective tracing), simulation suggested that control costs of the outbreak significantly increase if tracing does not occur until day 10 as compared to the baseline of tracing on day 2. In addition, control costs are significantly increased if effectiveness were to drop to 30% as compared to the baseline of 90%. Results suggest potential benefits from rapid effective tracing in terms of reducing government control costs; however, a variety of other scenarios need to be explored before determining in which situations rapid effective trace-back capability is beneficial. Copyright © 2012 Elsevier B.V. All rights reserved.
DIATOM (Data Initialization and Modification) Library Version 7.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, David A.; Schmitt, Robert G.; Hensinger, David M.
DIATOM is a library that provides numerical simulation software with a computational geometry front end that can be used to build up complex problem geometries from collections of simpler shapes. The library provides a parser which allows for application-independent geometry descriptions to be embedded in simulation software input decks. Descriptions take the form of collections of primitive shapes and/or CAD input files and material properties that can be used to describe complex spatial and temporal distributions of numerical quantities (often called "database variables" or "fields") to help define starting conditions for numerical simulations. The capability is designed to be general purpose, robust and computationally efficient. By using a combination of computational geometry and recursive divide-and-conquer approximation techniques, a wide range of primitive shapes are supported to arbitrary degrees of fidelity, controllable through user input and limited only by machine resources. Through the use of call-back functions, numerical simulation software can request the value of a field at any time or location in the problem domain. Typically, this is used only for defining initial conditions, but the capability is not limited to just that use. The most recent version of DIATOM provides the ability to import the solution field from one numerical solution as input for another.
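The shape-composition and call-back pattern the abstract describes can be illustrated with a short sketch; the class and method names below are hypothetical, not the DIATOM API.

# Primitive shapes carry field values; later insertions override earlier
# ones, and the host simulation queries fields at points via a callback.
from dataclasses import dataclass

@dataclass
class Sphere:
    cx: float; cy: float; cz: float; r: float
    def contains(self, x: float, y: float, z: float) -> bool:
        return (x - self.cx)**2 + (y - self.cy)**2 + (z - self.cz)**2 <= self.r**2

@dataclass
class Box:
    x0: float; x1: float; y0: float; y1: float; z0: float; z1: float
    def contains(self, x: float, y: float, z: float) -> bool:
        return (self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1
                and self.z0 <= z <= self.z1)

class Geometry:
    def __init__(self):
        self.inserts = []                 # (shape, {field: value}) in input order

    def insert(self, shape, **fields):
        self.inserts.append((shape, fields))

    def field_at(self, name, x, y, z, default=0.0):
        """Callback used by the host code: the last matching insertion wins."""
        value = default
        for shape, fields in self.inserts:
            if shape.contains(x, y, z) and name in fields:
                value = fields[name]
        return value

# Build a steel plate with a spherical void, then query the initial density.
geom = Geometry()
geom.insert(Box(0, 10, 0, 10, 0, 1), density=7.85)
geom.insert(Sphere(5, 5, 0.5, 0.4), density=0.0)
print(geom.field_at("density", 5.0, 5.0, 0.5))   # -> 0.0 (inside the void)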
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nikkel, Daniel J.; Meisner, Robert
The Advanced Simulation and Computing Campaign, herein referred to as the ASC Program, is a core element of the science-based Stockpile Stewardship Program (SSP), which enables assessment, certification, and maintenance of the safety, security, and reliability of the U.S. nuclear stockpile without the need to resume nuclear testing. The use of advanced parallel computing has transitioned from proof-of-principle to become a critical element for assessing and certifying the stockpile. As the initiative phase of the ASC Program came to an end in the mid-2000s, the National Nuclear Security Administration redirected resources to other urgent priorities, and resulting staff reductions in ASC occurred without the benefit of analysis of the impact on modern stockpile stewardship that is dependent on these new simulation capabilities. Consequently, in mid-2008 the ASC Program management commissioned a study to estimate the essential size and balance needed to sustain advanced simulation as a core component of stockpile stewardship. The ASC Program requires a minimum base staff size of 930 (which includes the number of staff necessary to maintain critical technical disciplines as well as to execute required programmatic tasks) to sustain its essential ongoing role in stockpile stewardship.
Disaster Response Modeling Through Discrete-Event Simulation
NASA Technical Reports Server (NTRS)
Wang, Jeffrey; Gilmer, Graham
2012-01-01
Organizations today are required to plan against a rapidly changing, high-cost environment. This is especially true for first responders to disasters and other incidents, where critical decisions must be made in a timely manner to save lives and resources. Discrete-event simulations enable organizations to make better decisions by visualizing complex processes and the impact of proposed changes before they are implemented. A discrete-event simulation using Simio software has been developed to effectively analyze and quantify the imagery capabilities of domestic aviation resources conducting relief missions. This approach has helped synthesize large amounts of data to better visualize process flows, manage resources, and pinpoint capability gaps and shortfalls in disaster response scenarios. Simulation outputs and results have supported decision makers in the understanding of high risk locations, key resource placement, and the effectiveness of proposed improvements.
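As an illustration of the discrete-event approach, the toy model below orders events in time with a priority queue and seizes and releases aircraft resources as imagery sorties run. It is illustrative only; the study itself used the commercial Simio package, and all rates and counts here are assumptions.

# Toy discrete-event model of imagery sorties in a 72-hour response window.
import heapq, random
from itertools import count

random.seed(1)
tie = count()                 # tie-breaker so the heap never compares payloads
events = []                   # (time_hr, seq, event_name)

def schedule(t, name):
    heapq.heappush(events, (t, next(tie), name))

# Imagery requests arrive randomly over the response window.
t = 0.0
while t < 72.0:
    t += random.expovariate(1 / 2.0)      # mean of 2 h between requests
    schedule(t, "request")

aircraft_free, backlog, served = 3, 0, 0
while events:
    clock, _, name = heapq.heappop(events)
    if name == "request":
        if aircraft_free > 0:             # seize an aircraft, fly the sortie
            aircraft_free -= 1
            schedule(clock + random.uniform(3, 6), "done")
        else:
            backlog += 1                  # capability gap: request must wait
    else:                                 # sortie complete
        served += 1
        if backlog > 0:
            backlog -= 1                  # immediately start a queued request
            schedule(clock + random.uniform(3, 6), "done")
        else:
            aircraft_free += 1

print(f"sorties flown: {served}, requests still queued: {backlog}")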
A secure cluster-based multipath routing protocol for WMSNs.
Almalkawi, Islam T; Zapata, Manel Guerrero; Al-Karaki, Jamal N
2011-01-01
The new characteristics of Wireless Multimedia Sensor Network (WMSN) and its design issues brought by handling different traffic classes of multimedia content (video streams, audio, and still images) as well as scalar data over the network, make the proposed routing protocols for typical WSNs not directly applicable for WMSNs. Handling real-time multimedia data requires both energy efficiency and QoS assurance in order to ensure efficient utility of different capabilities of sensor resources and correct delivery of collected information. In this paper, we propose a Secure Cluster-based Multipath Routing protocol for WMSNs, SCMR, to satisfy the requirements of delivering different data types and support high data rate multimedia traffic. SCMR exploits the hierarchical structure of powerful cluster heads and the optimized multiple paths to support timeliness and reliable high data rate multimedia communication with minimum energy dissipation. Also, we present a light-weight distributed security mechanism of key management in order to secure the communication between sensor nodes and protect the network against different types of attacks. Performance evaluation from simulation results demonstrates a significant performance improvement comparing with existing protocols (which do not even provide any kind of security feature) in terms of average end-to-end delay, network throughput, packet delivery ratio, and energy consumption.
NASA Technical Reports Server (NTRS)
Beyon, Jeffrey Y.; Ng, Tak-Kwong; Davis, Mitchell J.; Adams, James K.; Bowen, Stephen C.; Fay, James J.; Hutchinson, Mark A.
2015-01-01
The project called High-Speed On-Board Data Processing for Science Instruments (HOPS) has been funded by the NASA Earth Science Technology Office (ESTO) Advanced Information Systems Technology (AIST) program since April 2012. The HOPS team recently completed two flight campaigns during the summer of 2014 on two different aircraft with two different science instruments. The first flight campaign was in July 2014, based at NASA Langley Research Center (LaRC) in Hampton, VA, on NASA's HU-25 aircraft. The science instrument that flew with HOPS was the Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) CarbonHawk Experiment Simulator (ACES), funded by NASA's Instrument Incubator Program (IIP). The second campaign was in August 2014, based at NASA Armstrong Flight Research Center (AFRC) in Palmdale, CA, on NASA's DC-8 aircraft. HOPS flew with the Multifunctional Fiber Laser Lidar (MFLL) instrument developed by Exelis Inc. The goal of the campaigns was to perform an end-to-end demonstration of the capabilities of the HOPS prototype system (HOPS COTS) while running the most computationally intensive part of the ASCENDS algorithm real-time on-board. The comparison of the two flight campaigns and the results of the functionality tests of the HOPS COTS are presented in this paper.
Development of Network-based Communications Architectures for Future NASA Missions
NASA Technical Reports Server (NTRS)
Slywczak, Richard A.
2007-01-01
Since the Vision for Space Exploration (VSE) announcement, NASA has been developing a communications infrastructure that combines existing terrestrial techniques with newer concepts and capabilities. The overall goal is to develop a flexible, modular, and extensible architecture that leverages and enhances terrestrial networking technologies that can either be directly applied or modified for the space regime. In addition, where existing technologies leave gaps, new technologies must be developed; an example is dynamic routing that accounts for constrained power and bandwidth environments. Using these enhanced technologies, NASA can develop nodes that provide characteristics such as routing, store-and-forward, and access-on-demand capabilities. But with the development of the new infrastructure, challenges and obstacles will arise. The current communications infrastructure has been developed on a mission-by-mission basis rather than with an end-to-end approach; this has led to a greater ground infrastructure, but has not encouraged communications between space-based assets. This alone presents one of the key challenges that NASA must address. With the development of the new Crew Exploration Vehicle (CEV), NASA has the opportunity to provide an integration path for the new vehicles and provide standards for their development. Some of the newer capabilities these vehicles could include are routing, security, and Software Defined Radios (SDRs). To meet these needs, the NASA Glenn Research Center's (GRC) Network Emulation Laboratory (NEL) has been using both simulation and emulation to study and evaluate these architectures. These techniques provide options to NASA that directly impact architecture development. This paper identifies components of the infrastructure that play a pivotal role in the new NASA architecture, develops a scheme using simulation and emulation for testing these architectures, and demonstrates how NASA can strengthen the new infrastructure by implementing these concepts.
Validation of High-Fidelity CFD Simulations for Rocket Injector Design
NASA Technical Reports Server (NTRS)
Tucker, P. Kevin; Menon, Suresh; Merkle, Charles L.; Oefelein, Joseph C.; Yang, Vigor
2008-01-01
Computational fluid dynamics (CFD) has the potential to improve the historical rocket injector design process by evaluating the sensitivity of performance and injector-driven thermal environments to the details of the injector geometry and key operational parameters. Methodical verification and validation efforts on a range of coaxial injector elements have shown the current production CFD capability must be improved in order to quantitatively impact the injector design process. This paper documents the status of a focused effort to compare and understand the predictive capabilities and computational requirements of a range of CFD methodologies on a set of single element injector model problems. The steady Reynolds-Averaged Navier-Stokes (RANS), unsteady Reynolds-Averaged Navier-Stokes (URANS) and three different approaches using the Large Eddy Simulation (LES) technique were used to simulate the initial model problem, a single element coaxial injector using gaseous oxygen and gaseous hydrogen propellants. While one high-fidelity LES result matches the experimental combustion chamber wall heat flux very well, there is no monotonic convergence to the data with increasing computational tool fidelity. Systematic evaluation of key flow field regions such as the flame zone, the head end recirculation zone and the downstream near wall zone has shed significant, though as of yet incomplete, light on the complex, underlying causes for the performance level of each technique.
Deep Discharge Reconditioning and Shorted Storage of Batteries. [nickel cadmium batteries
NASA Technical Reports Server (NTRS)
Ritterman, P. F.
1982-01-01
The identification and measurement of hydrogen recombination in sealed nickel-cadmium cells makes deep reconditioning on a battery basis safe and feasible. Deep reconditioning improves performance and increases life of nickel-cadmium batteries in geosynchronous orbit applications. The hydrogen mechanism and supporting data are presented. Parametric cell design experiments are described which led to the definition of nickel-cadmium cells capable of high rate overdischarge without detriment to specific energy. Nickel-cadmium cells of identical optimum design were successfully cycled for 7 seasons in simulation of geosynchronous orbit at 75 percent depth-of-discharge with extensive midseason and end-of-season overdischarge at rates varying from C/20 to C/4. Destructive physical analysis and cycling data indicated no deterioration or the development of dangerous pressures as a result of the cycling with overdischarge.
Role of High-End Computing in Meeting NASA's Science and Engineering Challenges
NASA Technical Reports Server (NTRS)
Biswas, Rupak; Tu, Eugene L.; Van Dalsem, William R.
2006-01-01
Two years ago, NASA was on the verge of dramatically increasing its HEC capability and capacity. With the 10,240-processor supercomputer, Columbia, now in production for 18 months, HEC has an even greater impact within the Agency and extending to partner institutions. Advanced science and engineering simulations in space exploration, shuttle operations, Earth sciences, and fundamental aeronautics research are occurring on Columbia, demonstrating its ability to accelerate NASA's exploration vision. This talk describes how the integrated production environment fostered at the NASA Advanced Supercomputing (NAS) facility at Ames Research Center is accelerating scientific discovery, achieving parametric analyses of multiple scenarios, and enhancing safety for NASA missions. We focus on Columbia's impact on two key engineering and science disciplines: Aerospace, and Climate. We also discuss future mission challenges and plans for NASA's next-generation HEC environment.
Data preservation at the Fermilab Tevatron
NASA Astrophysics Data System (ADS)
Amerio, S.; Behari, S.; Boyd, J.; Brochmann, M.; Culbertson, R.; Diesburg, M.; Freeman, J.; Garren, L.; Greenlee, H.; Herner, K.; Illingworth, R.; Jayatilaka, B.; Jonckheere, A.; Li, Q.; Naymola, S.; Oleynik, G.; Sakumoto, W.; Varnes, E.; Vellidis, C.; Watts, G.; White, S.
2017-04-01
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have approximately 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 and beyond. To achieve this goal, we have implemented a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology and leverages resources available from currently-running experiments at Fermilab. These efforts have also provided useful lessons in ensuring long-term data access for numerous experiments, and enable high-quality scientific output for years to come.
Time Dependent Simulation of Turbopump Flows
NASA Technical Reports Server (NTRS)
Kiris, Cetin C.; Kwak, Dochan; Chan, William; Williams, Robert
2001-01-01
The objective of this viewgraph presentation is to enhance incompressible flow simulation capability for developing aerospace vehicle components, especially for unsteady flow phenomena associated with high-speed turbopumps. Unsteady Space Shuttle Main Engine (SSME) rig-1 computations have completed 1 1/2 rotations of the 34.3 million grid point model. The moving boundary capability is obtained by using the DCF module. MLP shared-memory parallelism has been implemented and benchmarked in INS3D. A scripting capability from CAD geometry to solution has been developed. Data compression is applied to reduce data size in post-processing, and fluid/structure coupling has been initiated.
NASA Astrophysics Data System (ADS)
Block, J.; Crawl, D.; Artes, T.; Cowart, C.; de Callafon, R.; DeFanti, T.; Graham, J.; Smarr, L.; Srivas, T.; Altintas, I.
2016-12-01
The NSF-funded WIFIRE project has designed a web-based wildfire modeling simulation and visualization tool called FireMap. The tool executes FARSITE to model fire propagation using dynamic weather and fire data, configuration settings provided by the user, and static topography and fuel datasets already built-in. Using GIS capabilities combined with scalable big data integration and processing, FireMap enables simple execution of the model with options for running ensembles by taking the information uncertainty into account. The results are easily viewable, sharable, repeatable, and can be animated as a time series. From these capabilities, users can model real-time fire behavior, analyze what-if scenarios, and keep a history of model runs over time for sharing with collaborators. FireMap runs FARSITE with national and local sensor networks for real-time weather data ingestion and High-Resolution Rapid Refresh (HRRR) weather for forecasted weather. The HRRR is a NOAA/NCEP operational weather prediction system comprised of a numerical forecast model and an analysis/assimilation system to initialize the model. It is run with a horizontal resolution of 3 km, has 50 vertical levels, and has a temporal resolution of 15 minutes. The HRRR requires an Environmental Data Exchange (EDEX) server to receive the feed and generate secondary products out of it for the modeling. UCSD's EDEX server, funded by NSF, makes high-resolution weather data available to researchers worldwide and enables visualization of weather systems and weather events lasting months or even years. The high-speed server aggregates weather data from the University Consortium for Atmospheric Research by way of a subscription service from the Consortium called the Internet Data Distribution system. These features are part of WIFIRE's long-term goals to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. Although FireMap is a research product of WIFIRE, developed in collaboration with a number of fire departments, the tool is operational in pilot form for providing big data-driven predictive fire spread modeling. Most recently, FireMap was used for situational awareness in the July 2016 Sand Fire by LA City and LA County Fire Departments.
2016-06-16
procedure. The predictive capabilities of the high-resolution computational fluid dynamics (CFD) simulations of urban flow are validated against a very ... turbulence over a 2D building array using high-resolution CFD and a distributed drag force approach
DNS of Flow in a Low-Pressure Turbine Cascade Using a Discontinuous-Galerkin Spectral-Element Method
NASA Technical Reports Server (NTRS)
Garai, Anirban; Diosady, Laslo Tibor; Murman, Scott; Madavan, Nateri
2015-01-01
A new computational capability under development for accurate and efficient high-fidelity direct numerical simulation (DNS) and large eddy simulation (LES) of turbomachinery is described. This capability is based on an entropy-stable Discontinuous-Galerkin spectral-element approach that extends to arbitrarily high orders of spatial and temporal accuracy and is implemented in a computationally efficient manner on a modern high performance computer architecture. A validation study using this method to perform DNS of flow in a low-pressure turbine airfoil cascade is presented. Preliminary results indicate that the method captures the main features of the flow. Discrepancies between the predicted results and the experiments are likely due to the effects of freestream turbulence not being included in the simulation and will be addressed in the final paper.
An energy-efficient transmission scheme for real-time data in wireless sensor networks.
Kim, Jin-Woo; Barrado, José Ramón Ramos; Jeon, Dong-Keun
2015-05-20
The Internet of things (IoT) is a novel paradigm where all things or objects in daily life can communicate with other devices and provide services over the Internet. Things or objects need identifying, sensing, networking and processing capabilities to make the IoT paradigm a reality. The IEEE 802.15.4 standard is one of the main communication protocols proposed for the IoT. The IEEE 802.15.4 standard provides the guaranteed time slot (GTS) mechanism that supports the quality of service (QoS) for the real-time data transmission. In spite of some QoS features in IEEE 802.15.4 standard, the problem of end-to-end delay still remains. In order to solve this problem, we propose a cooperative medium access scheme (MAC) protocol for real-time data transmission. We also evaluate the performance of the proposed scheme through simulation. The simulation results demonstrate that the proposed scheme can improve the network performance.
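For reference, the slot arithmetic behind the GTS mechanism follows directly from the standard's constants; the sketch below assumes the 2.4 GHz O-QPSK PHY (16 microseconds per symbol) and does not reproduce the proposed cooperative MAC scheme itself.

# IEEE 802.15.4 superframe timing from the standard's MAC constants.
A_BASE_SLOT_DURATION = 60          # symbols
A_NUM_SUPERFRAME_SLOTS = 16
A_BASE_SUPERFRAME_DURATION = A_BASE_SLOT_DURATION * A_NUM_SUPERFRAME_SLOTS  # 960 symbols
SYMBOL_US = 16                     # microseconds per symbol at 2.4 GHz

def superframe_timing(beacon_order: int, superframe_order: int):
    """Return (beacon interval, superframe duration, slot length) in ms."""
    assert 0 <= superframe_order <= beacon_order <= 14
    bi = A_BASE_SUPERFRAME_DURATION * (2 ** beacon_order) * SYMBOL_US / 1000
    sd = A_BASE_SUPERFRAME_DURATION * (2 ** superframe_order) * SYMBOL_US / 1000
    slot = sd / A_NUM_SUPERFRAME_SLOTS
    return bi, sd, slot

# Example: BO = 6, SO = 4 gives a ~983 ms beacon interval, a ~246 ms active
# portion, and ~15.4 ms slots; a GTS spans one or more such slots, which is
# why queued real-time traffic can still accumulate end-to-end delay.
print(superframe_timing(6, 4))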
NASA Technical Reports Server (NTRS)
Stephens, Craig A.; Crawford, Michael E.
1990-01-01
Assessments were made of the simulation capabilities of transition models developed at the University of Minnesota, as applied to the Launder-Sharma and Lam-Bremhorst two-equation turbulence models, and at The University of Texas at Austin, as applied to the K. Y. Chien two-equation turbulence model. A major shortcoming in the use of the basic K. Y. Chien turbulence model for low-Reynolds number flows was identified. The problem with the Chien model involved premature start of natural transition and a damped response as the simulation moved to fully turbulent flow at the end of transition. This is in contrast to the other two-equation turbulence models at comparable freestream turbulence conditions. The damping of the transition response of the Chien turbulence model leads to an inaccurate estimate of the start and end of transition for freestream turbulence levels greater than 1.0 percent and to difficulty in calculating proper model constants for the transition model.
Extending BPM Environments of Your Choice with Performance Related Decision Support
NASA Astrophysics Data System (ADS)
Fritzsche, Mathias; Picht, Michael; Gilani, Wasif; Spence, Ivor; Brown, John; Kilpatrick, Peter
What-if simulations have been identified as one solution for business performance related decision support. Such support is especially useful in cases where it can be automatically generated out of Business Process Management (BPM) Environments from the existing business process models and performance parameters monitored from the executed business process instances. Currently, some of the available BPM Environments offer basic-level performance prediction capabilities. However, these functionalities are normally too limited to be generally useful for performance related decision support at business process level. In this paper, an approach is presented which allows the non-intrusive integration of sophisticated tooling for what-if simulations, analytic performance prediction tools, process optimizations, or a combination of such solutions into already existing BPM environments. The approach abstracts from the underlying process modelling technique, which enables automatic decision support for processes spanning numerous BPM Environments. For instance, this enables end-to-end decision support for composite processes modelled with the Business Process Modelling Notation (BPMN) on top of existing Enterprise Resource Planning (ERP) processes modelled with proprietary languages.
The NASA Auralization Framework and Plugin Architecture
NASA Technical Reports Server (NTRS)
Aumann, Aric R.; Tuttle, Brian C.; Chapin, William L.; Rizzi, Stephen A.
2015-01-01
NASA has a long history of investigating human response to aircraft flyover noise and in recent years has developed a capability to fully auralize the noise of aircraft during their design. This capability is particularly useful for unconventional designs with noise signatures significantly different from the current fleet. To that end, a flexible software architecture has been developed to facilitate rapid integration of new simulation techniques for noise source synthesis and propagation, and to foster collaboration amongst researchers through a common releasable code base. The NASA Auralization Framework (NAF) is a skeletal framework written in C++ with basic functionalities and a plugin architecture that allows users to mix and match NAF capabilities with their own methods through the development and use of dynamically linked libraries. This paper presents the NAF software architecture and discusses several advanced auralization techniques that have been implemented as plugins to the framework.
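The NAF itself is C++ with dynamically linked plugin libraries; the pattern can be sketched in a language-neutral way with Python's importlib. The module contract and method names below are hypothetical, not the NAF API.

# A host framework discovers plugins at run time; each plugin module exposes
# a register() hook that adds its synthesis or propagation capability.
import importlib

class AuralizationFramework:
    def __init__(self):
        self.sources, self.propagators = [], []

    def load_plugin(self, module_name: str):
        """Dynamically load a plugin module and let it register itself."""
        mod = importlib.import_module(module_name)
        mod.register(self)     # plugin appends objects to sources/propagators

    def render(self, flight_path):
        # Mix all registered source signals sample by sample, then apply
        # each propagation plugin in turn.
        streams = [src.synthesize(flight_path) for src in self.sources]
        signal = [sum(samples) for samples in zip(*streams)]
        for prop in self.propagators:
            signal = prop.propagate(signal, flight_path)
        return signal

# Usage: framework.load_plugin("jet_broadband_source") would import that
# module, whose register(framework) call mixes in a new synthesis method
# without rebuilding the framework, mirroring the NAF's DLL plugin design.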
NASA Astrophysics Data System (ADS)
Anantharaj, Valentine; Norman, Matthew; Evans, Katherine; Taylor, Mark; Worley, Patrick; Hack, James; Mayer, Benjamin
2014-05-01
During 2013, high-resolution climate model simulations accounted for over 100 million "core hours" using Titan at the Oak Ridge Leadership Computing Facility (OLCF). The suite of climate modeling experiments, primarily using the Community Earth System Model (CESM) at nearly 0.25 degree horizontal resolution, generated over a petabyte of data and nearly 100,000 files, ranging in sizes from 20 MB to over 100 GB. Effective utilization of leadership class resources requires careful planning and preparation. The application software, such as CESM, needs to be ported, optimized and benchmarked for the target platform in order to meet the computational readiness requirements. The model configuration needs to be "tuned and balanced" for the experiments. This can be a complicated and resource intensive process, especially for high-resolution configurations using complex physics. The volume of I/O also increases with resolution, and new strategies may be required to manage I/O, especially for large checkpoint and restart files that may require more frequent output for resiliency. It is also essential to monitor the application performance during the course of the simulation exercises. Finally, the large volume of data needs to be analyzed to derive the scientific results, and appropriate data and information delivered to the stakeholders. Titan is currently the largest supercomputer available for open science. The computational resources, in terms of "titan core hours", are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) and ASCR Leadership Computing Challenge (ALCC) programs, both sponsored by the U.S. Department of Energy (DOE) Office of Science. Titan is a Cray XK7 system capable of a theoretical peak performance of over 27 PFlop/s; it consists of 18,688 compute nodes, with a NVIDIA Kepler K20 GPU and a 16-core AMD Opteron CPU in every node, for a total of 299,008 Opteron cores and 18,688 GPUs offering a cumulative 560,640 equivalent cores. Scientific applications, such as CESM, are also required to demonstrate a "computational readiness capability" to efficiently scale across and utilize 20% of the entire system. The 0.25 deg configuration of the spectral element dynamical core of the Community Atmosphere Model (CAM-SE), the atmospheric component of CESM, has been demonstrated to scale efficiently across more than 5,000 nodes (80,000 CPU cores) on Titan. The tracer transport routines of CAM-SE have also been ported to take advantage of the hybrid many-core architecture of Titan using GPUs [see EGU2014-4233], yielding over 2X speedup when transporting over 100 tracers. The high throughput I/O in CESM, based on the Parallel IO Library (PIO), is being further augmented to support even higher resolutions and enhance resiliency. The application performance of the individual runs is archived in a database and routinely analyzed to identify and rectify performance degradation during the course of the experiments. The various resources available at the OLCF now support a scientific workflow to facilitate high-resolution climate modelling. A high-speed center-wide parallel file system, called ATLAS, capable of 1 TB/s, is available on Titan as well as on the clusters used for analysis (Rhea) and visualization (Lens/EVEREST). Long-term archive is facilitated by the HPSS storage system. The Earth System Grid (ESG), featuring search & discovery, is also used to deliver data.
The end-to-end workflow allows OLCF users to efficiently share data and publish results in a timely manner.
NASA Astrophysics Data System (ADS)
Molthan, A.; Seepersad, J.; Shute, J.; Carriere, L.; Duffy, D.; Tisdale, B.; Kirschbaum, D.; Green, D. S.; Schwizer, L.
2017-12-01
NASA's Earth Science Disasters Program promotes the use of Earth observations to improve the prediction of, preparation for, response to, and recovery from natural and technological disasters. NASA Earth observations and those of domestic and international partners are combined with in situ observations and models by NASA scientists and partners to develop products supporting disaster mitigation, response, and recovery activities among several end-user partners. These products are accompanied by training to ensure proper integration and use of these materials in their organizations. Many products are integrated along with other observations available from other sources in GIS-capable formats to improve situational awareness and response efforts before, during and after a disaster. Large volumes of NASA observations support the generation of disaster response products by NASA field center scientists, partners in academia, and other institutions. For example, a prediction of high streamflows and inundation from a NASA-supported model may provide spatial detail of flood extent that can be combined with GIS information on population density, infrastructure, and land value to facilitate a prediction of who will be affected, and the economic impact. To facilitate the sharing of these outputs in a common framework that can be easily ingested by downstream partners, the NASA Earth Science Disasters Program partnered with Esri and the NASA Center for Climate Simulation (NCCS) to establish a suite of Esri/ArcGIS services to support the dissemination of routine and event-specific products to end users. This capability has been demonstrated to key partners including the Federal Emergency Management Agency using a case-study example of Hurricane Matthew, and will also help to support future domestic and international disaster events. The Earth Science Disasters Program has also established a longer-term vision to leverage scientists' expertise in the development and delivery of end-user training, increase public awareness of NASA's Disasters Program, and facilitate new partnerships with disaster response organizations. Future research and development will foster generation of products that leverage NASA's Earth observations for disaster prediction, preparation and mitigation, response, and recovery.
Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckers, Koenraad J; McCabe, Kevin
This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal energy for Production of Heat and electricity ('IR') Economically Simulated). GEOPHIRES combines engineering models of the reservoir, wellbores, and surface plant facilities of a geothermal plant with an economic model to estimate the capital and operation and maintenance costs, lifetime energy production, and overall levelized cost of energy. The available end-use options are electricity, direct-use heat, and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to import temperature data (e.g., measured data or output from a stand-alone reservoir simulator), updated cost correlations, and more flexibility in selecting the time step and number of injection and production wells. In this paper, we provide an overview of all the updates and two case studies to illustrate the tool's new capabilities.
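For readers unfamiliar with the headline metric, here is a minimal sketch of a discounted levelized-cost-of-energy calculation; GEOPHIRES' internal economic model is more detailed (cost correlations, end-use options, and so on), and every number below is hypothetical:

```python
def lcoe(capex, annual_om, annual_energy_kwh, lifetime_years, discount_rate):
    """Simple discounted LCOE in $/kWh; a sketch, not the GEOPHIRES model."""
    disc = [(1 + discount_rate) ** -t for t in range(1, lifetime_years + 1)]
    lifetime_costs = capex + sum(annual_om * d for d in disc)
    lifetime_energy = sum(annual_energy_kwh * d for d in disc)
    return lifetime_costs / lifetime_energy

# Illustrative plant: $30M capital, $1.2M/yr O&M, 40 GWh/yr, 30 years, 7%.
print(f"LCOE = ${lcoe(30e6, 1.2e6, 40e6, 30, 0.07):.3f}/kWh")
```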
Modeling a maintenance simulation of the geosynchronous platform
NASA Technical Reports Server (NTRS)
Kleiner, A. F., Jr.
1980-01-01
A modeling technique used to conduct a simulation study comparing various maintenance routines for a space platform is discussed. A system model is described and illustrated, the basic concepts of a simulation pass are detailed, and sections on failures and maintenance are included. The operation of the system across time is best modeled by a discrete event approach with two basic events: failure and maintenance of the system. Each overall simulation run consists of introducing a particular model of the physical system, together with a maintenance policy, demand function, and mission lifetime. The system is then run through many passes, each pass corresponding to one mission, and the model is re-initialized before each pass. Statistics are compiled at the end of each pass, and after the last pass a report is printed. Items of interest typically include the time to first maintenance, the total number of maintenance trips for each pass, the average capability of the system, etc.
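The pass structure described above maps naturally onto a few lines of discrete-event code. The sketch below (Python) follows the same run/pass/report pattern; the exponential failure law, threshold maintenance policy, and all parameter values are illustrative assumptions, not taken from the paper:

```python
import random

def run_pass(mission_life, mtbf, maint_threshold):
    """One mission pass: exponential failures, a maintenance trip dispatched
    whenever the count of failed units reaches a threshold."""
    t, failures, trips, first_maint = 0.0, 0, 0, None
    while True:
        t += random.expovariate(1.0 / mtbf)   # next failure event
        if t >= mission_life:
            break
        failures += 1
        if failures >= maint_threshold:       # maintenance event
            trips += 1
            failures = 0
            first_maint = first_maint if first_maint is not None else t
    return first_maint, trips

random.seed(1)
# Many passes, model re-initialized each time, statistics at the end.
results = [run_pass(mission_life=10.0, mtbf=2.0, maint_threshold=3)
           for _ in range(1000)]
trips = [r[1] for r in results]
print("mean maintenance trips per mission:", sum(trips) / len(trips))
```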
Adapter assembly prevents damage to tubing during high pressure tests
NASA Technical Reports Server (NTRS)
Stinett, L. L.
1965-01-01
Portable adapter assembly prevents damage to tubing and injury to personnel when pressurizing a system or during high pressure tests. The assembly is capable of withstanding high pressure. It is securely attached to the tubing stub end and may be removed without brazing, cutting or cleaning the tube.
Integrated simulations for fusion research in the 2030's time frame (white paper outline)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman, Alex; LoDestro, Lynda L.; Parker, Jeffrey B.
This white paper presents the rationale for developing a community-wide capability for whole-device modeling, and advocates for an effort with the expectation of persistence: a long-term programmatic commitment, and support for community efforts. Statement of 2030 goal (two suggestions): (a) Robust integrated simulation tools to aid real-time experimental discharges and reactor designs by employing a hierarchy in fidelity of physics models. (b) To produce by the early 2030s a capability for validated, predictive simulation via integration of a suite of physics models from moderate through high fidelity, to understand and plan full plasma discharges, aid in data interpretation, carry out discovery science, and optimize future machine designs. We can achieve this goal via a focused effort to extend current scientific capabilities and rigorously integrate simulations of disparate physics into a comprehensive set of workflows.
NASA Technical Reports Server (NTRS)
Srivastava, Priyaka; Kraus, Jeff; Murawski, Robert; Golden, Bertsel, Jr.
2015-01-01
NASA's Space Communications and Navigation (SCaN) program manages three active networks: the Near Earth Network, the Space Network, and the Deep Space Network. These networks simultaneously support NASA missions and provide communications services to customers worldwide. To efficiently manage these resources and their capabilities, a team of student interns at the NASA Glenn Research Center is developing a distributed system to model the SCaN networks. Once complete, the system shall provide a platform that enables users to perform capacity modeling of current and prospective missions with finer-grained control of information between several simulation and modeling tools. This will enable the SCaN program to access a holistic view of its networks and simulate the effects of modifications in order to provide NASA with decisional information. The development of this capacity modeling system is managed by NASA's Strategic Center for Education, Networking, Integration, and Communication (SCENIC). Three primary third-party software tools offer their unique abilities in different stages of the simulation process. MagicDraw provides UML/SysML modeling, AGI's Systems Tool Kit simulates the physical transmission parameters and de-conflicts scheduled communication, and Riverbed Modeler (formerly OPNET) simulates communication protocols and packet-based networking. SCENIC developers are building custom software extensions to integrate these components in an end-to-end space communications modeling platform. A central control module acts as the hub for report-based messaging between client wrappers. Backend databases provide information related to mission parameters and ground station configurations, while the end user defines scenario-specific attributes for the model. The eight SCENIC interns are working under the direction of their mentors to complete an initial version of this capacity modeling system during the summer of 2015. The intern team is composed of four students in Computer Science, two in Computer Engineering, one in Electrical Engineering, and one studying Space Systems Engineering.
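The hub-and-wrapper integration pattern described above is a standard one. A minimal sketch follows (Python); the class names, report fields, and example payload are hypothetical, since the actual SCENIC interfaces to MagicDraw, STK, and Riverbed Modeler are not detailed in the abstract:

```python
# Sketch of a central control module routing reports between tool wrappers.
class ControlHub:
    def __init__(self):
        self.wrappers = {}

    def register(self, name, wrapper):
        self.wrappers[name] = wrapper

    def route(self, report):
        # Deliver a report produced by one tool to its consumer.
        self.wrappers[report["to"]].receive(report)

class ToolWrapper:
    def __init__(self, name, hub):
        self.name, self.hub = name, hub
        hub.register(name, self)

    def publish(self, to, payload):
        self.hub.route({"from": self.name, "to": to, "payload": payload})

    def receive(self, report):
        print(f"{self.name} <- {report['from']}: {report['payload']}")

hub = ControlHub()
stk = ToolWrapper("stk", hub)            # link geometry / de-confliction
riverbed = ToolWrapper("riverbed", hub)  # protocol-level simulation
stk.publish("riverbed", {"pass_start_s": 0, "pass_end_s": 600})
```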
Confinement of hydrogen at high pressure in carbon nanotubes
Lassila, David H [Aptos, CA; Bonner, Brian P [Livermore, CA
2011-12-13
A high pressure hydrogen confinement apparatus according to one embodiment includes carbon nanotubes capped at one or both ends thereof with a hydrogen-permeable membrane to enable the high pressure confinement of hydrogen and release of the hydrogen therethrough. A hydrogen confinement apparatus according to another embodiment includes an array of multi-walled carbon nanotubes each having first and second ends, the second ends being capped with palladium (Pd) to enable the high pressure confinement of hydrogen and release of the hydrogen therethrough as a function of palladium temperature, wherein the array of carbon nanotubes is capable of storing hydrogen gas at a pressure of at least 1 GPa for greater than 24 hours. Additional apparatuses and methods are also presented.
NASA Astrophysics Data System (ADS)
Huang, Feng; Sun, Lifeng; Zhong, Yuzhuo
2006-01-01
Robust transmission of live video over ad hoc wireless networks presents new challenges: high bandwidth requirements are coupled with delay constraints; even a single packet loss causes error propagation until a complete video frame is coded in the intra-mode; and ad hoc wireless networks suffer from bursty packet losses that drastically degrade the viewing experience. Accordingly, we propose a novel UMD coder capable of quickly recovering from losses and ensuring continuous playout. It uses 'peg' frames to prevent error propagation in the High-Resolution (HR) description and improve the robustness of key frames. The Low-Resolution (LR) coder works independently of the HR one, but they can also help each other recover from losses. Like many UMD coders, ours is drift-free, disruption-tolerant and able to make good use of the asymmetric available bandwidths of multiple paths. The simulation results under different conditions show that the proposed UMD coder has the highest decoded quality and lowest probability of pause when compared with concurrent UMDC techniques. The coder also has comparable decoded quality, lower startup delay and lower probability of pause than a state-of-the-art FEC-based scheme. To provide robustness for video multicast applications, we propose non-end-to-end UMDC-based video distribution over a multi-tree multicast network. The multiplicity of parents decorrelates losses and the non-end-to-end feature increases the throughput of UMDC video data. We deploy an application-level service of LR description reconstruction in some intermediate nodes of the LR multicast tree. The principle behind this is to reconstruct the disrupted LR frames from the correctly received HR frames. As a result, the viewing experience at the downstream nodes benefits from the reconstruction performed at the upstream nodes.
NASA Astrophysics Data System (ADS)
Booth, B. B. B.; Bernie, D.; McNeall, D.; Hawkins, E.; Caesar, J.; Boulton, C.; Friedlingstein, P.; Sexton, D.
2012-09-01
We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from emission-driven rather than concentration-driven perturbed parameter ensembles of a Global Climate Model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, land carbon cycle, ocean physics and aerosol sulphur cycle processes. We find broader ranges of projected temperature responses arising when considering emission-driven rather than concentration-driven simulations (with 10-90 percentile ranges of 1.7 K for the aggressive mitigation scenario up to 3.9 K for the high-end business as usual scenario). A small minority of simulations resulting from combinations of strong atmospheric feedbacks and carbon cycle responses show temperature increases in excess of 9 K (RCP8.5) and, even under aggressive mitigation (RCP2.6), temperatures in excess of 4 K. While the simulations point to much larger temperature ranges for emission-driven experiments, they do not change existing expectations (based on previous concentration-driven experiments) on the timescale over which different sources of uncertainty are important. The new simulations sample a range of future atmospheric concentrations for each emission scenario. Both in the case of SRES A1B and the Representative Concentration Pathways (RCPs), the concentration pathways used to drive GCM ensembles lie towards the lower end of our simulated distribution. This design decision (a legacy of previous assessments) is likely to lead concentration-driven experiments to under-sample strong feedback responses in concentration-driven projections. Our ensemble of emission-driven simulations spans the global temperature response of other multi-model frameworks except at the low end, where combinations of low climate sensitivity and low carbon cycle feedbacks lead to responses outside our ensemble range. The ensemble simulates a number of high-end responses which lie above the CMIP5 carbon cycle range. These high-end simulations can be linked to sampling a number of stronger carbon cycle feedbacks and to sampling climate sensitivities above 4.5 K. This latter aspect highlights the priority in identifying real-world climate sensitivity constraints which, if achieved, would lead to reductions in the upper bound of projected global mean temperature change. The ensembles of simulations presented here provide a framework to explore relationships between present-day observables and future changes, while the large spread of projected future changes highlights the ongoing need for such work.
Solar Sail Spaceflight Simulation
NASA Technical Reports Server (NTRS)
Lisano, Michael; Evans, James; Ellis, Jordan; Schimmels, John; Roberts, Timothy; Rios-Reyes, Leonel; Scheeres, Daniel; Bladt, Jeff; Lawrence, Dale; Piggott, Scott
2007-01-01
The Solar Sail Spaceflight Simulation Software (S5) toolkit provides solar-sail designers with an integrated environment for designing optimal solar-sail trajectories, and then studying the attitude dynamics/control, navigation, and trajectory control/correction of sails during realistic mission simulations. Unique features include a high-fidelity solar radiation pressure model suitable for arbitrarily-shaped solar sails, a solar-sail trajectory optimizer, capability to develop solar-sail navigation filter simulations, solar-sail attitude control models, and solar-sail high-fidelity force models.
NASA Technical Reports Server (NTRS)
Gill, E. N.
1986-01-01
The requirements are identified for a very high order natural language to be used by crew members on board the Space Station. The hardware facilities, databases, realtime processes, and software support are discussed. The operations and capabilities that will be required in both normal (routine) and abnormal (nonroutine) situations are evaluated. A structure and syntax for an interface (front-end) language to satisfy the above requirements are recommended.
PISCES High Contrast Integral Field Spectrograph Simulations and Data Reduction Pipeline
NASA Technical Reports Server (NTRS)
Llop Sayson, Jorge Domingo; Memarsadeghi, Nargess; McElwain, Michael W.; Gong, Qian; Perrin, Marshall; Brandt, Timothy; Grammer, Bryan; Greeley, Bradford; Hilton, George; Marx, Catherine
2015-01-01
The PISCES (Prototype Imaging Spectrograph for Coronagraphic Exoplanet Studies) is a lenslet-array-based integral field spectrograph (IFS) designed to advance the technology readiness of the WFIRST (Wide Field Infrared Survey Telescope)-AFTA (Astrophysics Focused Telescope Assets) high-contrast Coronagraph Instrument. We present the end-to-end optical simulator and plans for the data reduction pipeline (DRP). The optical simulator was created with a combination of the IDL (Interactive Data Language)-based PROPER optical propagation library and Zemax (driven by a MATLAB script), while the data reduction pipeline is a modified version of the Gemini Planet Imager's (GPI) IDL pipeline. The simulations of the propagation of light through the instrument are based on Fourier transform algorithms. The DRP enables transformation of the PISCES IFS data to calibrated spectral data cubes.
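To illustrate the Fourier-transform propagation such simulators rest on, the toy below (Python/NumPy) performs a bare Fraunhofer far-field step, turning a circular pupil into its Airy-pattern point-spread function; the real PROPER-based model is far more detailed, so treat this only as a structural sketch:

```python
import numpy as np

# Far-field (Fraunhofer) propagation of a circular pupil via FFT.
n = 256
x = np.linspace(-1, 1, n)
xx, yy = np.meshgrid(x, x)
aperture = (xx**2 + yy**2 <= 0.5**2).astype(float)  # circular pupil

field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(aperture)))
psf = np.abs(field) ** 2
psf /= psf.sum()  # normalized point-spread function (Airy pattern)
print("peak of normalized PSF:", psf.max())
```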
Evaluation of haptic interfaces for simulation of drill vibration in virtual temporal bone surgery.
Ghasemloonia, Ahmad; Baxandall, Shalese; Zareinia, Kourosh; Lui, Justin T; Dort, Joseph C; Sutherland, Garnette R; Chan, Sonny
2016-11-01
Surgical training is evolving from an observership model towards a new paradigm that includes virtual-reality (VR) simulation. In otolaryngology, temporal bone dissection has become intimately linked with VR simulation, as the complexity of the anatomy demands a high level of surgeon aptitude and confidence. While an adequate 3D visualization of the surgical site is available in current simulators, the force feedback rendered during haptic interaction does not convey vibrations. This lack of vibration rendering limits the simulation fidelity of a surgical drill such as that used in temporal bone dissection. In order to develop an immersive simulation platform capable of haptic force and vibration feedback, the efficacy of hand controllers for rendering vibration in different drilling circumstances needs to be investigated. In this study, the vibration rendering ability of four different haptic hand controllers was analyzed and compared to find the best commercial haptic hand controller. A test rig was developed to record vibrations encountered during temporal bone dissection, and software was written to render the recorded signals without adding hardware to the system. An accelerometer mounted on the end-effector of each device recorded the rendered vibration signals. The newly recorded vibration signal was compared with the input signal in both time and frequency domains by coherence and cross-correlation analyses to quantitatively measure the fidelity of these devices in terms of rendering vibrotactile drilling feedback in different drilling conditions. This method can be used to assess the vibration rendering ability in VR simulation systems and in the selection of ideal haptic devices.
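The two fidelity metrics named above, coherence and cross-correlation, take only a few lines with standard signal-processing libraries. A sketch with synthetic stand-in signals (Python/SciPy; the 180 Hz tone, sample rate, gain, phase lag, and noise level are all illustrative assumptions):

```python
import numpy as np
from scipy.signal import coherence, correlate

np.random.seed(0)
fs = 10_000                               # sample rate, Hz (illustrative)
t = np.arange(0, 1.0, 1 / fs)
recorded = np.sin(2 * np.pi * 180 * t)    # recorded drill tone
rendered = (0.8 * np.sin(2 * np.pi * 180 * t + 0.3)
            + 0.05 * np.random.randn(t.size))  # device output + noise

f, Cxy = coherence(recorded, rendered, fs=fs, nperseg=1024)
xcorr = correlate(rendered, recorded, mode="full")
lag = xcorr.argmax() - (recorded.size - 1)     # lag at peak correlation

print("mean coherence near 180 Hz:", Cxy[(f > 150) & (f < 210)].mean())
print("peak-correlation lag (samples):", lag)
```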
Warthog: A MOOSE-Based Application for the Direct Code Coupling of BISON and PROTEUS
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCaskey, Alexander J.; Slattery, Stuart; Billings, Jay Jay
The Nuclear Energy Advanced Modeling and Simulation (NEAMS) program from the Department of Energy's Office of Nuclear Energy provides a robust toolkit for the modeling and simulation of current and future advanced nuclear reactor designs. This toolkit provides these technologies organized across product lines: two divisions targeted at fuels and end-to-end reactor modeling, and a third for integration, coupling, and high-level workflow management. The Fuels Product Line and the Reactor Product Line provide advanced computational technologies that serve each respective field well; however, their current lack of integration presents a major impediment to future improvements of simulation solution fidelity. There is a desire for the capability to mix and match tools across Product Lines in an effort to utilize the best from both to improve NEAMS modeling and simulation technologies. This report details a new effort to provide this Product Line interoperability through the development of a new application called Warthog. This application couples the BISON Fuel Performance application from the Fuels Product Line and the PROTEUS Core Neutronics application from the Reactors Product Line in an effort to utilize the best from all parts of the NEAMS toolkit and improve overall solution fidelity of nuclear fuel simulations. To achieve this, Warthog leverages as much prior work from the NEAMS program as possible, and in doing so, enables interoperability between the disparate MOOSE and SHARP frameworks, and the libMesh and MOAB mesh data formats. This report describes this work in full. We begin with a detailed look at the individual NEAMS framework technologies used and developed in the various Product Lines, and the current status of their interoperability. We then introduce the Warthog application: its overall architecture and the ways it leverages the best existing tools from across the NEAMS toolkit to enable BISON-PROTEUS integration. Furthermore, we show how Warthog leverages a tool known as DataTransferKit to seamlessly enable the transfer of solution data between disparate frameworks and mesh formats. To end, we demonstrate tests for the direct software coupling of BISON and PROTEUS using Warthog, and discuss current impediments and solutions to the construction of physically realistic input models for this coupled BISON-PROTEUS system.
Synthesis for Lunar Simulants: Glass, Agglutinate, Plagioclase, Breccia
NASA Technical Reports Server (NTRS)
Weinstein, Michael; Wilson, Stephen A.; Rickman, Douglas L.; Stoeser, Douglas
2012-01-01
The video describes a process for making glass for lunar regolith simulants that was developed from a patented glass-producing technology. Glass composition can be matched to simulant design and specification. Production of glass, pseudo agglutinates, plagioclase, and breccias is demonstrated. The system is capable of producing hundreds of kilograms of high quality glass and simulants per day.
Bidirectional converter for high-efficiency fuel cell powertrain
NASA Astrophysics Data System (ADS)
Fardoun, Abbas A.; Ismail, Esam H.; Sabzali, Ahmad J.; Al-Saffar, Mustafa A.
2014-03-01
In this paper, a new wide conversion ratio step-up and step-down converter is presented. The proposed converter is derived from the conventional Single Ended Primary Inductor Converter (SEPIC) topology and it is integrated with a capacitor-diode voltage multiplier, which offers a simple structure, reduced electromagnetic interference (EMI), and reduced semiconductors' voltage stresses. Other advantages include: continuous input and output current, extended step-up and step-down voltage conversion ratio without extreme low or high duty-cycle, simple control circuitry, and near-zero input and output ripple currents compared to other converter topologies. The low charging/discharging current ripple and wide gain features result in a longer life-span and lower cost of the energy storage battery system. In addition, the "near-zero" ripple capability improves the fuel cell durability. Theoretical analysis results obtained with the proposed structure are compared with other bi-directional converter topologies. Simulation and experimental results are presented to verify the performance of the proposed bi-directional converter.
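For context, the ideal steady-state gain of the conventional SEPIC from which the proposed converter is derived is the familiar buck-boost ratio below; the capacitor-diode multiplier cell extends this ratio, though the abstract does not give the extended expression:

```latex
% Ideal (lossless, continuous-conduction) gain of the conventional
% SEPIC, as a function of duty cycle D:
\[
  M(D) \;=\; \frac{V_{o}}{V_{in}} \;=\; \frac{D}{1-D},
  \qquad 0 < D < 1,
\]
% so M < 1 (step-down) for D < 0.5 and M > 1 (step-up) for D > 0.5.
```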
Recent Progress in the Development of a Multi-Layer Green's Function Code for Ion Beam Transport
NASA Technical Reports Server (NTRS)
Tweed, John; Walker, Steven A.; Wilson, John W.; Tripathi, Ram K.
2008-01-01
To meet the challenge of future deep space programs, an accurate and efficient engineering code for analyzing the shielding requirements against high-energy galactic heavy radiation is needed. To address this need, a new Green's function code capable of simulating high charge and energy ions with either laboratory or space boundary conditions is currently under development. The computational model consists of combinations of physical perturbation expansions based on the scales of atomic interaction, multiple scattering, and nuclear reactive processes with use of the Neumann-asymptotic expansions with non-perturbative corrections. The code contains energy loss due to straggling, nuclear attenuation, nuclear fragmentation with energy dispersion and downshifts. Previous reports show that the new code accurately models the transport of ion beams through a single slab of material. Current research efforts are focused on enabling the code to handle multiple layers of material and the present paper reports on progress made towards that end.
A High Resolution Phoswich Detector: LaBr{sub 3}(Ce) Coupled With LaCl{sub 3}(Ce)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carmona-Gallardo, M.; Borge, M. J. G.; Briz, J. A.
2010-04-26
An innovative solution for the forward end-cap CALIFA calorimeter of R{sup 3}B is under investigation, consisting of two scintillation crystals, LaBr{sub 3} and LaCl{sub 3}, stacked together in a phoswich configuration with a single readout. This device should be capable of a good determination of the energy of protons and gamma radiation. The composite detector allows the initial energy of charged particles to be deduced by DELTAE1 + DELTAE2 identification. For gammas, the simulations show that there is a high probability that the first interaction occurs inside the first scintillator within a few centimeters; in the second layer, the rest of the energy is absorbed, or that layer can be used as a veto in case of no deposition in the first layer. One such detector has been tested at the Centro de MicroAnalisis de Materiales (CMAM) in Madrid. Good resolution and time-signal separation have been achieved.
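In the notation above, the reconstruction for a particle stopping in the second crystal is simply the sum of the two layer deposits; written out (with the ΔE1-vs-ΔE2 pair doubling as a particle-identification observable, as in a conventional ΔE-E telescope):

```latex
% Energy reconstruction for a charged particle stopping in the
% second crystal of the phoswich stack:
\[
  E \;=\; \Delta E_{1}^{\mathrm{LaBr_3}} \;+\; \Delta E_{2}^{\mathrm{LaCl_3}} .
\]
```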
Space Debris Detection on the HPDP, a Coarse-Grained Reconfigurable Array Architecture for Space
NASA Astrophysics Data System (ADS)
Suarez, Diego Andres; Bretz, Daniel; Helfers, Tim; Weidendorfer, Josef; Utzmann, Jens
2016-08-01
Stream processing, widely used in communications and digital signal processing applications, requires high-throughput data processing that is achieved in most cases using Application-Specific Integrated Circuit (ASIC) designs. Lack of programmability is an issue especially in space applications, which use on-board components with long life-cycles requiring applications updates. To this end, the High Performance Data Processor (HPDP) architecture integrates an array of coarse-grained reconfigurable elements to provide both flexible and efficient computational power suitable for stream-based data processing applications in space. In this work the capabilities of the HPDP architecture are demonstrated with the implementation of a real-time image processing algorithm for space debris detection in a space-based space surveillance system. The implementation challenges and alternatives are described making trade-offs to improve performance at the expense of negligible degradation of detection accuracy. The proposed implementation uses over 99% of the available computational resources. Performance estimations based on simulations show that the HPDP can amply match the application requirements.
Modeling and Simulation of High Resolution Optical Remote Sensing Satellite Geometric Chain
NASA Astrophysics Data System (ADS)
Xia, Z.; Cheng, S.; Huang, Q.; Tian, G.
2018-04-01
High-resolution satellites with long focal lengths and large apertures have been widely used in recent years for georeferencing observed scenes. A consistent end-to-end model of the high-resolution remote sensing satellite geometric chain is presented, which consists of the scene, the three-line-array camera, the platform (including attitude and position information), the time system, and the processing algorithm. The integrated design of the camera and the star tracker is considered, and a simulation method for geolocation accuracy is put forward by introducing a new index: the angle between the camera and the star tracker. The model is validated by rigorously simulating geolocation accuracy following the test method used for ZY-3 satellite imagery. The simulation results show that the geolocation accuracy is within 25 m, which is highly consistent with the test results. The geolocation accuracy can be improved by about 7 m through the integrated design. The model, combined with the simulation method, is applicable to estimating geolocation accuracy before satellite launch.
Simulation Higher Order Language Requirements Study.
ERIC Educational Resources Information Center
Goodenough, John B.; Braun, Christine L.
The definitions provided for high order language (HOL) requirements for programming flight training simulators are based on the analysis of programs written for a variety of simulators. Examples drawn from these programs are used to justify the need for certain HOL capabilities. A description of the general structure and organization of the…
JacksonBot - Design, Simulation and Optimal Control of an Action Painting Robot
NASA Astrophysics Data System (ADS)
Raschke, Michael; Mombaur, Katja; Schubert, Alexander
We present the robotics platform JacksonBot, which is capable of producing paintings inspired by the Action Painting style of Jackson Pollock. A dynamically moving robot arm splashes color from a container at the end effector onto the canvas. The paintings produced by this platform rely on a combination of algorithmically generated robot arm motions with the random effects of the splashing color. The robot can be considered a complex and powerful tool to generate art works programmed by a user. Desired end effector motions can be prescribed either by mathematical functions, by point sequences or by data glove motions. We have evaluated the effect of different shapes of input motions on the resulting painting. In order to compute the robot joint trajectories necessary to move along a desired end effector path, we use an optimal control based approach to solve the inverse kinematics problem.
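To make the last step concrete: inverse kinematics can be posed as an optimization over joint angles that drives the forward-kinematics error to zero. The toy below (Python/SciPy) does this for a single pose of a planar two-link arm; the real system optimizes whole trajectories with dynamics, so this is only a structural sketch with made-up link lengths and target:

```python
import numpy as np
from scipy.optimize import minimize

L1 = L2 = 1.0  # hypothetical link lengths

def fk(q):
    """Forward kinematics of a planar two-link arm."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

target = np.array([1.2, 0.8])  # desired end-effector position
res = minimize(lambda q: np.sum((fk(q) - target) ** 2), x0=[0.1, 0.1])
print("joint angles:", res.x, "reached:", fk(res.x))
```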
SPOKES: An end-to-end simulation facility for spectroscopic cosmological surveys
Nord, B.; Amara, A.; Refregier, A.; ...
2016-03-03
The nature of dark matter, dark energy and large-scale gravity pose some of the most pressing questions in cosmology today. These fundamental questions require highly precise measurements, and a number of wide-field spectroscopic survey instruments are being designed to meet this requirement. A key component in these experiments is the development of a simulation tool to forecast science performance, define requirement flow-downs, optimize implementation, demonstrate feasibility, and prepare for exploitation. We present SPOKES (SPectrOscopic KEn Simulation), an end-to-end simulation facility for spectroscopic cosmological surveys designed to address this challenge. SPOKES is based on an integrated infrastructure, modular function organization, coherent data handling and fast data access. These key features allow reproducibility of pipeline runs, enable ease of use and provide flexibility to update functions within the pipeline. The cyclic nature of the pipeline offers the possibility to make the science output an efficient measure for design optimization and feasibility testing. We present the architecture, first science, and computational performance results of the simulation pipeline. The framework is general, but for the benchmark tests, we use the Dark Energy Spectrometer (DESpec), one of the early concepts for the upcoming project, the Dark Energy Spectroscopic Instrument (DESI). Finally, we discuss how the SPOKES framework enables a rigorous process to optimize and exploit spectroscopic survey experiments in order to derive high-precision cosmological measurements optimally.
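The "modular function organization" with coherent, shared data handling can be pictured as a list of interchangeable stages passed over one data store, re-run cycle after cycle. A minimal sketch (Python; the stage names, selection cuts, and toy data are hypothetical, not the actual SPOKES modules):

```python
# Each stage reads from and writes to one shared data store, so stages
# can be swapped or re-ordered and whole-pipeline runs are reproducible.
def select_targets(data):
    data["targets"] = [g for g in data["galaxies"] if g["mag"] < 22.5]
    return data

def simulate_spectra(data):
    data["spectra"] = [{"id": t["id"], "snr": 30 - t["mag"]}
                       for t in data["targets"]]
    return data

def measure_redshifts(data):
    data["redshifts"] = [s for s in data["spectra"] if s["snr"] > 5]
    return data

PIPELINE = [select_targets, simulate_spectra, measure_redshifts]

data = {"galaxies": [{"id": i, "mag": 20 + 0.5 * i} for i in range(10)]}
for stage in PIPELINE:   # one pass; re-run with changed modules to iterate
    data = stage(data)
print(len(data["redshifts"]), "usable redshifts")
```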
CERNBox + EOS: end-user storage for science
NASA Astrophysics Data System (ADS)
Mascetti, L.; Gonzalez Labrador, H.; Lamanna, M.; Mościcki, JT; Peters, AJ
2015-12-01
CERNBox is a cloud synchronisation service for end-users: it allows syncing and sharing files on all major mobile and desktop platforms (Linux, Windows, MacOSX, Android, iOS) aiming to provide offline availability to any data stored in the CERN EOS infrastructure. The successful beta phase of the service confirmed the high demand in the community for an easily accessible cloud storage solution such as CERNBox. Integration of the CERNBox service with the EOS storage back-end is the next step towards providing “sync and share” capabilities for scientific and engineering use-cases. In this report we will present lessons learnt in offering the CERNBox service, key technical aspects of CERNBox/EOS integration and new, emerging usage possibilities. The latter includes the ongoing integration of “sync and share” capabilities with the LHC data analysis tools and transfer services.
NASA Technical Reports Server (NTRS)
Negrut, Dan; Mazhar, Hammad; Melanz, Daniel; Lamb, David; Jayakumar, Paramsothy; Letherwood, Michael; Jain, Abhinandan; Quadrelli, Marco
2012-01-01
This paper is concerned with the physics-based simulation of light tracked vehicles operating on rough deformable terrain. The focus is on small autonomous vehicles, which weigh less than 100 lb and move on deformable and rough terrain that is feature rich and no longer representable using a continuum approach. A scenario of interest is, for instance, the simulation of a reconnaissance mission for a high mobility lightweight robot where objects such as a boulder or a ditch that could otherwise be considered small for a truck or tank, become major obstacles that can impede the mobility of the light autonomous vehicle and negatively impact the success of its mission. Analyzing and gauging the mobility and performance of these light vehicles is accomplished through a modeling and simulation capability called Chrono::Engine. Chrono::Engine relies on parallel execution on Graphics Processing Unit (GPU) cards.
Numerical Study of Solar Storms from the Sun to Earth
NASA Astrophysics Data System (ADS)
Feng, Xueshang; Jiang, Chaowei; Zhou, Yufen
2017-04-01
As solar storms sweep past the Earth, adverse changes occur in the geospace environment. How humans can mitigate and avoid the destructive damage caused by solar storms has become an important frontier issue that we must face in this high-tech era. It is of scientific significance to understand the dynamic processes of solar storms propagating through interplanetary space, and of practical value to conduct physics-based numerical studies of the three-dimensional evolution of solar storms in interplanetary space, with the aid of powerful computing capacity, to predict the arrival times, intensities, and probable geoeffectiveness of solar storms at the Earth. So far, numerical studies based on magnetohydrodynamics (MHD) have gone through the transition from initial qualitative principle studies to systematic quantitative studies of concrete events and numerical predictions. The numerical modeling community has a common goal to develop an end-to-end physics-based modeling system for forecasting the Sun-Earth relationship. The transition of these models to operational use depends on the availability of computational resources at reasonable cost, and the models' prediction capabilities may be improved by incorporating observational findings and constraints into physics-based models, combining observations, empirical models and MHD simulations in organic ways. In this talk, we briefly focus on our recent progress in using solar observations to produce realistic magnetic configurations of CMEs as they leave the Sun, and in coupling data-driven simulations of CMEs to heliospheric simulations that then propagate the CME configuration to 1 AU, and we outline the important numerical issues and their possible solutions in numerical space weather modeling from the Sun to Earth for future research.
Thermodynamic, Transport and Chemical Properties of Reference JP-8
2006-06-01
... external diameter, 0.18 cm internal diameter) that are sealed on one end with a stainless steel plug welded by a clean tungsten-inert-gas (TIG) ... tubing with an internal diameter of 0.02 cm, also TIG welded to the cell. Each cell and valve is capable of withstanding a pressure in excess of 105 ... Each cell is connected to a high-pressure high-temperature valve at the other end with a short length of 0.16 cm diameter 316 stainless steel tubing.
High-rate/high-temperature capability of a single-layer Zircar-separator nickel-hydrogen cell
NASA Technical Reports Server (NTRS)
Wheeler, James R.
1995-01-01
A 50 Ampere-hour nickel-hydrogen cell with a single-layer Zircar separator stack design was fully charged and then discharged at a 2C current rate to an end voltage of 1 volt. This extreme test resulted in high temperatures which were recorded at three locations on the cell, i.e., the cell wall, the boss (barrel of the compression seal), and a terminal. The results provide new information about the high-temperature and high-discharge-rate capabilities of nickel-hydrogen cells. This information also adds to the growing data base for single-layer zirconium-oxide-cloth (Zircar) separator cell designs.
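For reference, the "2C" rate quoted above corresponds, for a 50 Ampere-hour cell, to a 100 A discharge, i.e. a full discharge in roughly half an hour:

```latex
% C-rate arithmetic for the test described above:
\[
  I_{2C} \;=\; 2 \times \frac{50\ \mathrm{Ah}}{1\ \mathrm{h}} \;=\; 100\ \mathrm{A},
  \qquad
  t_{\mathrm{discharge}} \;\approx\; \frac{50\ \mathrm{Ah}}{100\ \mathrm{A}} \;=\; 0.5\ \mathrm{h}.
\]
```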
Overview of Experimental Capabilities - Supersonics
NASA Technical Reports Server (NTRS)
Banks, Daniel W.
2007-01-01
This viewgraph presentation gives an overview of experimental capabilities applicable to the area of supersonic research. The contents include: 1) EC Objectives; 2) SUP.11: Elements; 3) NRA; 4) Advanced Flight Simulator Flexible Aircraft Simulation Studies; 5) Advanced Flight Simulator Flying Qualities Guideline Development for Flexible Supersonic Transport Aircraft; 6) Advanced Flight Simulator Rigid/Flex Flight Control; 7) Advanced Flight Simulator Rapid Sim Model Exchange; 8) Flight Test Capabilities Advanced In-Flight Infrared (IR) Thermography; 9) Flight Test Capabilities In-Flight Schlieren; 10) Flight Test Capabilities CLIP Flow Calibration; 11) Flight Test Capabilities PFTF Flowfield Survey; 12) Ground Test Capabilities Laser-Induced Thermal Acoustics (LITA); 13) Ground Test Capabilities Doppler Global Velocimetry (DGV); 14) Ground Test Capabilities Doppler Global Velocimetry (DGV); and 15) Ground Test Capabilities EDL Optical Measurement Capability (PIV) for Rigid/Flexible Decelerator Models.
NASA Astrophysics Data System (ADS)
Gorzynski, Mark; Derocher, Mike; Mitchell, April Slayden
Research underway at Hewlett-Packard on remote communication resulted in the identification of three important components typically missing in existing systems. These missing components are: group nonverbal communication capabilities, high-resolution interactive data capabilities, and global services. Here we discuss some of the design elements in these three areas as part of the Halo program at HP, a remote communication system shown to be effective to end-users.
Fermentation method producing ethanol
Wang, Daniel I. C.; Dalal, Rajen
1986-01-01
Ethanol is the major end product of an anaerobic, thermophilic fermentation process using a mutant strain of bacterium Clostridium thermosaccharolyticum. This organism is capable of converting hexose and pentose carbohydrates to ethanol, acetic and lactic acids. Mutants of Clostridium thermosaccharolyticum are capable of converting these substrates to ethanol in exceptionally high yield and with increased productivity. Both the mutant organism and the technique for its isolation are provided.
NASA Astrophysics Data System (ADS)
Prasad, K.; Thorpe, A. K.; Duren, R. M.; Thompson, D. R.; Whetstone, J. R.
2016-12-01
The National Institute of Standards and Technology (NIST) has supported the development and demonstration of a measurement capability to accurately locate greenhouse gas sources and measure their flux to the atmosphere over urban domains. However, uncertainties in transport models, which form the basis of all top-down approaches, can significantly affect our capability to attribute sources and predict their flux to the atmosphere. Reducing uncertainties between bottom-up and top-down models will require high-resolution transport models as well as validation and verification of dispersion models over an urban domain. Tracer experiments involving the release of Perfluorocarbon Tracers (PFTs) at known flow rates offer the best approach for validating dispersion and transport models. However, tracer experiments are limited by cost, the ability to make continuous measurements, and environmental concerns. Natural tracer experiments, such as the leak from the Aliso Canyon underground storage facility, offer a unique opportunity to improve and validate high-resolution transport models, test leak hypotheses, and estimate the amount of methane released. High-spatial-resolution (10 m) Large Eddy Simulations (LES) coupled with WRF atmospheric transport models were performed to simulate the dynamics of the Aliso Canyon methane plume and to quantify the source. High-resolution forward simulation results were combined with aircraft- and tower-based in situ measurements as well as data from NASA airborne imaging spectrometers. Comparison of simulation results with measurement data demonstrates the capability of the LES models to accurately model transport and dispersion of methane plumes over urban domains.
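The LES/WRF modeling above is well beyond a snippet; as a stand-in, here is the classic Gaussian plume estimate of concentration downwind of a point source, often the first-order dispersion model in source-attribution work. This is explicitly not the study's method, and every parameter value is illustrative (in practice the dispersion widths depend on downwind distance and atmospheric stability):

```python
import numpy as np

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """q: emission rate (g/s); u: wind speed (m/s); h: release height (m);
    y, z: crosswind and vertical coordinates (m)."""
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - h)**2 / (2 * sigma_z**2))
                + np.exp(-(z + h)**2 / (2 * sigma_z**2)))  # ground reflection
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

c = gaussian_plume(q=1000.0, u=3.0, y=0.0, z=2.0, h=100.0,
                   sigma_y=150.0, sigma_z=70.0)
print(f"centerline concentration ~ {c:.2e} g/m^3")
```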
Health Worker Focused Distributed Simulation for Improving Capability of Health Systems in Liberia.
Gale, Thomas C E; Chatterjee, Arunangsu; Mellor, Nicholas E; Allan, Richard J
2016-04-01
The main goal of this study was to produce an adaptable learning platform using virtual learning and distributed simulation, which can be used to train health care workers across a wide geographical area in key safety messages regarding infection prevention and control (IPC). A situationally responsive agile methodology, Scrum, was used to develop a distributed simulation module using short one-week iterations and continuous synchronous and asynchronous communication including end users and IPC experts. The module contained content related to standard IPC precautions (including handwashing techniques) and was structured into three distinct sections related to donning, doffing, and hazard perception training. Using Scrum methodology, we were able to link concepts applied to best practices in simulation-based medical education (deliberate practice, continuous feedback, self-assessment, and exposure to uncommon events), pedagogic principles related to adult learning (clear goals, contextual awareness, motivational features), and key learning outcomes regarding IPC, as a rapid response initiative to the Ebola outbreak in West Africa. A gamification approach was used to map learning mechanics to enhance user engagement. The developed IPC module demonstrates how high-frequency, low-fidelity simulations can be rapidly designed using Scrum-based agile methodology. Analytics incorporated into the tool can help demonstrate improved confidence and competence of health care workers who are treating patients within an Ebola virus disease outbreak region. These concepts could be used in a range of evolving disasters where rapid development and communication of key learning messages are required.
Advanced Simulation and Computing Fiscal Year 14 Implementation Plan, Rev. 0.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meisner, Robert; McCoy, Michel; Archer, Bill
2013-09-11
The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Moreover, ASC's business model is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools.
Using simulation to improve the capability of undergraduate nursing students in mental health care.
Kunst, Elicia L; Mitchell, Marion; Johnston, Amy N B
2017-03-01
Mental health care is an increasing component of acute patient care, and yet mental health care education can be limited in undergraduate nursing programs. The aim of this study was to establish whether simulation learning can be an effective method of improving undergraduate nurses' capability in mental health care in an acute care environment. Undergraduate nursing students at an Australian university were exposed to several high-fidelity, high-technology simulation activities that incorporated elements of acute emergency nursing practice and acute mental health intervention, scaffolded by theories of learning. This approach provided a safe environment for students to experience clinical practice and develop their skills for dealing with complex clinical challenges. Using a mixed-method approach, the primary domains of interest in this study were student confidence, knowledge and ability. These were self-reported and assessed before and after the simulation activities (intervention) using a pre-validated survey, to gauge the self-rated capacity of students to initiate and complete effective care episodes. Focus group interviews were subsequently held with students who attended placement in the emergency department to explore the impact of the intervention on student performance in this clinical setting. Students who participated in the simulation activity identified and reported significantly increased confidence, knowledge and ability in mental health care post-intervention. They identified that key features of the intervention included the impact of its realism on the quality of learning. There is some evidence to suggest that the intervention had an impact on the performance and reflection of students in the clinical setting. This study provides evidence to support the use of simulation to enhance student nurses' clinical capabilities in providing mental health care in acute care environments. Nursing curriculum development should be based on best evidence to ensure that future nursing graduates have the skills and capability to provide high-quality, holistic care.
A high-throughput, multi-channel photon-counting detector with picosecond timing
NASA Astrophysics Data System (ADS)
Lapington, J. S.; Fraser, G. W.; Miller, G. M.; Ashton, T. J. R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.
2009-06-01
High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies; small pore microchannel plate devices with very high time resolution, and high-speed multi-channel ASIC electronics developed for the LHC at CERN, provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration which uses the multi-channel HPTDC timing ASIC.
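To give a feel for the numbers quoted above, the sketch below (Python/NumPy) bins synthetic photon arrival times at the 20 ps resolution figure, the basic operation behind time-correlated single-photon counting; the decay lifetime and count total are illustrative assumptions:

```python
import numpy as np

# Histogram synthetic exponential-decay photon arrivals into 20 ps bins.
rng = np.random.default_rng(0)
lifetime_ps = 2_500                      # fluorescence lifetime, illustrative
arrivals = rng.exponential(lifetime_ps, size=100_000)

bin_width_ps = 20                        # quoted system time resolution
bins = np.arange(0, 20_000, bin_width_ps)
hist, _ = np.histogram(arrivals, bins=bins)
print("counts in first 20 ps bin:", hist[0])
```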
Post-acceleration of laser driven protons with a compact high field linac
NASA Astrophysics Data System (ADS)
Sinigardi, Stefano; Londrillo, Pasquale; Rossi, Francesco; Turchetti, Giorgio; Bolton, Paul R.
2013-05-01
We present a start-to-end 3D numerical simulation of a hybrid scheme for the acceleration of protons. The scheme is based on a first-stage laser acceleration, followed by a transport line with a solenoid or a multiplet of quadrupoles, and then a post-acceleration section in a compact linac. Our simulations show that from a laser-accelerated proton bunch with energy selection at ~30 MeV, it is possible to obtain a high-quality monochromatic beam of 60 MeV with intensity at the threshold of interest for medical use. In present-day experiments using solid targets, the TNSA mechanism produces accelerated bunches with an exponential energy spectrum up to a cut-off value typically below ~60 MeV and a wide angular distribution. At the cut-off energy, the number of protons that can be collimated and post-accelerated in a hybrid scheme is still too low. We investigate laser-plasma acceleration to improve the quality and number of the injected protons at ~30 MeV in order to assure efficient post-acceleration in the hybrid scheme. The results are obtained with 3D PIC simulations using a code where optical acceleration with over-dense targets, transport and post-acceleration in a linac can all be investigated in an integrated framework. The high-intensity experiments at Nara are taken as reference benchmarks for our virtual laboratory. If experimentally confirmed, a hybrid scheme could be the core of a medium-sized infrastructure for medical research, capable of producing protons for therapy and x-rays for diagnosis, which complements the development of all-optical systems.
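The exponential TNSA spectrum explains why selection near ~30 MeV rather than at the cut-off matters: the population falls off as exp(-E/T). A quick sketch (Python; the temperature, total proton number, and cut-off below are hypothetical, chosen only to illustrate the scaling):

```python
import numpy as np

# Fraction of a truncated-exponential TNSA bunch above a selection energy.
N0, T, E_max = 1e12, 5.0, 60.0   # total protons, temperature (MeV), cut-off (MeV)

def fraction_above(E_sel):
    """Integral of exp(-E/T) from E_sel to E_max, normalized over [0, E_max]."""
    return (np.exp(-E_sel / T) - np.exp(-E_max / T)) / (1 - np.exp(-E_max / T))

for E in (10, 30, 50):
    print(f"fraction above {E} MeV: {fraction_above(E):.2e}")
```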
NASA's New High Intensity Solar Environment Test Capability
NASA Technical Reports Server (NTRS)
Schneider, Todd A.; Vaughn, Jason A.; Wright, Kenneth H.
2012-01-01
Across the world, new spaceflight missions are being designed and executed that will place spacecraft and instruments into challenging environments throughout the solar system. To aid in the successful completion of these new missions, NASA has developed a new flexible space environment test platform. The High Intensity Solar Environment Test (HISET) capability located at NASA's Marshall Space Flight Center provides scientists and engineers with the means to test spacecraft materials and systems in a wide range of solar wind and solar photon environments. Featuring a solar simulator capable of delivering approximately 1 MW/m2 of broad spectrum radiation at maximum power, HISET provides a means to test systems or components that could explore the solar corona. The solar simulator consists of three high-power Xenon arc lamps that can be operated independently over a range of power to meet test requirements; i.e., the lamp power can be greatly reduced to simulate the solar intensity at several AU. Integral to the HISET capability are charged particle sources that can provide a solar wind (electron and proton) environment. Used individually or in combination, the charged particle sources can provide fluxes ranging from a few nA/cm2 to 100s of nA/cm2 over an energy range of 50 eV to 100 keV for electrons and 100 eV to 30 keV for protons. Anchored by a high vacuum facility equipped with a liquid nitrogen cold shroud for radiative cooling scenarios, HISET is able to accommodate samples as large as 1 meter in diameter. In this poster, details of the HISET capability will be presented, including the wide-ranging configurability of the system.
Abdelgaied, Abdellatif; Fisher, John; Jennings, Louise M
2017-07-01
More robust preclinical experimental wear simulation methods are required in order to simulate a wider range of activities, observed in different patient populations such as younger more active patients, as well as to fully meet and be capable of going well beyond the existing requirements of the relevant international standards. A new six-station electromechanically driven simulator (Simulation Solutions, UK) with five fully independently controlled axes of articulation for each station, capable of replicating deep knee bending as well as other adverse conditions, which can be operated in either force or displacement control with improved input kinematic following, has been developed to meet these requirements. This study investigated the wear of a fixed-bearing total knee replacement using this electromechanically driven fully independent knee simulator and compared it to previous data from a predominantly pneumatically controlled simulator in which each station was not fully independently controlled. In addition, the kinematic performance and the repeatability of the simulators have been investigated and compared to the international standard requirements. The wear rates from the electromechanical and pneumatic knee simulators were not significantly different, with wear rates of 2.6 ± 0.9 and 2.7 ± 0.9 mm3/million cycles (MC; mean ± 95% confidence interval, p = 0.99) and 5.4 ± 1.4 and 6.7 ± 1.5 mm3/MC (mean ± 95% confidence interval, p = 0.54) from the electromechanical and pneumatic simulators under intermediate levels (maximum 5 mm) and high levels (maximum 10 mm) of anterior-posterior displacements, respectively. However, the output kinematic profiles of the control system, which drive the motion of the simulator, followed the input kinematic profiles more closely on the electromechanical simulator than the pneumatic simulator. In addition, the electromechanical simulator was capable of following kinematic and loading input cycles within the tolerances of the international standard requirements (ISO 14243-3). The new-generation electromechanical knee simulator with fully independent control has the potential to be used for a much wider range of kinematic conditions, including high-flexion and other severe conditions, due to its improved capability and performance in comparison to the previously used pneumatic-controlled simulators.
Simulation Assessment Validation Environment (SAVE). Software User’s Manual
2000-09-01
requirements and decisions are made. The integration leverages work from other DoD organizations so that high-end results are attainable much faster than ... planning through the modeling and simulation data capture and visualization process. The planners can complete the manufacturing process plan with a high ... technologies. This tool is also used to perform "high-level" factory process simulation prior to full CAD model development and help define feasible
NASA Astrophysics Data System (ADS)
Booth, B. B. B.; Bernie, D.; McNeall, D.; Hawkins, E.; Caesar, J.; Boulton, C.; Friedlingstein, P.; Sexton, D. M. H.
2013-04-01
We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from emission-driven rather than concentration-driven perturbed parameter ensemble of a global climate model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, land carbon cycle, ocean physics and aerosol sulphur cycle processes. We find broader ranges of projected temperature responses arising when considering emission rather than concentration-driven simulations (with 10-90th percentile ranges of 1.7 K for the aggressive mitigation scenario, up to 3.9 K for the high-end, business as usual scenario). A small minority of simulations resulting from combinations of strong atmospheric feedbacks and carbon cycle responses show temperature increases in excess of 9 K (RCP8.5) and even under aggressive mitigation (RCP2.6) temperatures in excess of 4 K. While the simulations point to much larger temperature ranges for emission-driven experiments, they do not change existing expectations (based on previous concentration-driven experiments) on the timescales over which different sources of uncertainty are important. The new simulations sample a range of future atmospheric concentrations for each emission scenario. Both in the case of SRES A1B and the Representative Concentration Pathways (RCPs), the concentration scenarios used to drive GCM ensembles lie towards the lower end of our simulated distribution. This design decision (a legacy of previous assessments) is likely to lead concentration-driven experiments to under-sample strong feedback responses in future projections. Our ensemble of emission-driven simulations spans the global temperature response of the CMIP5 emission-driven simulations, except at the low end. Combinations of low climate sensitivity and low carbon cycle feedbacks cause a number of CMIP5 responses to lie below our ensemble range. The ensemble simulates a number of high-end responses which lie above the CMIP5 carbon cycle range. These high-end simulations can be linked to sampling a number of stronger carbon cycle feedbacks and to sampling climate sensitivities above 4.5 K. This latter aspect highlights the priority in identifying real-world climate-sensitivity constraints which, if achieved, would lead to reductions on the upper bound of projected global mean temperature change. The ensembles of simulations presented here provide a framework to explore relationships between present-day observables and future changes, while the large spread of future-projected changes highlights the ongoing need for such work.
Resistance of Titanium Aluminide to Domestic Object Damage Assessed
NASA Technical Reports Server (NTRS)
Lerch, Bradley A.; Draper, Susan L.; Pereira, J. Michael; Nathal, Michael V.; Austin, Curt
1999-01-01
A team consisting of GE Aircraft Engines, Precision Cast Parts, Oremet, and Chromalloy was awarded a NASA-sponsored Aerospace Industry Technology Program (AITP) to develop a design and manufacturing capability that will lead to the engine test demonstration and eventual implementation of a γ-Ti-47Al-2Nb-2Cr (at. %) titanium aluminide (TiAl) low-pressure turbine blade into commercial service. One of the main technical risks of implementing TiAl low-pressure turbine blades is the poor impact resistance of TiAl in comparison to the currently used nickel-based superalloy. The impact resistance of TiAl is being investigated at the NASA Lewis Research Center as part of the Aerospace Industry Technology Program and the Advanced High Temperature Engine Materials Program (HITEMP). The overall objective of this work is to determine the influence of impact damage on the high cycle fatigue life of simulated TiAl low-pressure turbine blades. To this end, impact specimens were cast to size in a dog-bone configuration and given a typical processing sequence followed by an exposure to 650 degrees Celsius for 20 hours to simulate embrittlement at service conditions. Then, the specimens were impacted at 260 degrees Celsius under a 69-MPa load. Steel projectiles with diameters of 1.6 and 3.2 mm were used to impact the specimens at 90 degrees to the leading edge. Two different impact energies (0.74 and 1.5 joules) were used to simulate fairly severe domestic object damage on a low-pressure turbine blade.
MADNESS: A Multiresolution, Adaptive Numerical Environment for Scientific Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrison, Robert J.; Beylkin, Gregory; Bischoff, Florian A.
2016-01-01
MADNESS (multiresolution adaptive numerical environment for scientific simulation) is a high-level software environment for solving integral and differential equations in many dimensions that uses adaptive and fast harmonic analysis methods with guaranteed precision based on multiresolution analysis and separated representations. Underpinning the numerical capabilities is a powerful petascale parallel programming environment that aims to increase both programmer productivity and code scalability. This paper describes the features and capabilities of MADNESS and briefly discusses some current applications in chemistry and several areas of physics.
Computer-Aided Engineering Tools | Water Power | NREL
energy converters that will provide a full range of simulation capabilities for single devices and arrays. Simulation of water power technologies on high-performance computers enables the study of complex systems and experimentation. Such simulation is critical to accelerating progress in energy programs within the U.S. Department
Corrosion-Resistant Container for Molten-Material Processing
NASA Technical Reports Server (NTRS)
Stern, Theodore G.; McNaul, Eric
2010-01-01
In a carbothermal process, gaseous methane is passed over molten regolith, which is heated past its melting point to a temperature in excess of 1,625 degrees Celsius. At this temperature, materials in contact with the molten regolith (or regolith simulant) corrode and lose their structural properties. As a result, fabricating a crucible to hold the molten material and providing a method of contact heating have been problematic. Alternative containment approaches use a large crucible and limit the heat zone of the material being processed, which is inefficient because of volume and mass constraints. Alternative heating approaches use non-contact heating, such as by laser or concentrated solar energy, which can be inefficient in transferring heat and thus require higher power heat sources to accomplish processing. The innovation is a combination of materials, with a substrate material having high structural strength and stiffness and high-temperature capability, and a coating material with high corrosion resistance and high-temperature capability. The material developed is a molybdenum substrate with an iridium coating. Creating the containment crucible or heater jacket using this material combination requires only that the molybdenum, which is easily processed by conventional methods such as milling, electric discharge machining, or forming and brazing, be fabricated into an appropriate shape, and that the iridium coating be applied to any surfaces that may come in contact with the corrosive molten material. In one engineering application, the molybdenum was fashioned into a container for a heat pipe. Since only the end of the heat pipe is used to heat the regolith, the container has a narrowing end with a nipple in which the heat pipe is snugly fit, and the external area of this nipple, which contacts the regolith to transfer heat into it, is coated with iridium. At the time of this reporting, no single material has been found that can perform the functions of this combination of materials, and other combinations of materials have not proven to be survivable in the corrosiveness of this environment. High-temperature processing of materials with constituencies similar to those of lunar regolith is fairly common. The carbothermal process is commonly used to make metallurgical-grade silicon for the semiconductor and solar-cell industries.
NASA Astrophysics Data System (ADS)
Carrasco, Ana; Semedo, Alvaro; Behrens, Arno; Weisse, Ralf; Breivik, Øyvind; Saetra, Øyvind; Håkon Christensen, Kai
2016-04-01
The global wave-induced current (the Stokes drift, SD) is an important feature of the ocean surface, with mean values close to 10 cm/s along the extra-tropical storm tracks in both hemispheres. Besides the horizontal displacement of large volumes of water, the SD also plays an important role in the ocean mixed-layer turbulence structure, particularly in stormy or high wind speed areas. The role of the wave-induced currents in the ocean mixed layer and in the sea surface temperature (SST) is currently a hot topic of air-sea interaction research, from forecast to climate ranges. The SD is mostly driven by wind-sea waves and is highly sensitive to changes in the overlying wind speed and direction. The impact of climate change on the global wave-induced current climate will be presented. The wave model WAM has been forced by the global climate model (GCM) ECHAM5 wind speed (at 10 m height) and ice, for present-day and potential future climate conditions towards the end of the twenty-first century, represented by the Intergovernmental Panel on Climate Change (IPCC) CMIP3 (Coupled Model Inter-comparison Project phase 3) A1B greenhouse gas emission scenario (usually referred to as a "medium-high emissions" scenario). Several wave parameters were stored as output in the WAM model simulations, including the wave spectra. The wave spectra, at 6-hourly temporal and 0.5°×0.5° spatial resolution, were used to compute the SD global climate of two 32-yr periods, representative of the end of the twentieth (1959-1990) and twenty-first (2069-2100) centuries. Comparisons of the present climate run with the ECMWF (European Centre for Medium-Range Weather Forecasts) ERA-40 reanalysis are used to assess the capability of the WAM-ECHAM5 runs to produce realistic SD results. This study is part of the WCRP-JCOMM COWCLIP (Coordinated Ocean Wave Climate Project) effort.
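To make the quantity concrete: for deep water, the surface Stokes drift follows from a wave frequency spectrum as u_s = (16 pi^3 / g) * integral of f^3 S(f) df. The sketch below evaluates this relation with a Pierson-Moskowitz spectrum as a stand-in for the WAM model spectra used in the study; the wind speed is an arbitrary example value.

```python
# Sketch: deep-water surface Stokes drift from a 1-D wave frequency spectrum.
import numpy as np

g = 9.81
f = np.linspace(0.03, 0.5, 400)                 # frequency grid (Hz)

def pierson_moskowitz(f, u10):
    """Pierson-Moskowitz spectrum S(f) (m^2/Hz) for 10-m wind speed u10 (m/s)."""
    fp = 0.13 * g / u10                         # approximate peak frequency
    return (8.1e-3 * g**2 / (2.0 * np.pi)**4 / f**5
            * np.exp(-1.25 * (fp / f)**4))

S = pierson_moskowitz(f, u10=12.0)
integrand = f**3 * S
integral = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(f))  # trapezoid
u_stokes = 16.0 * np.pi**3 / g * integral       # surface Stokes drift (m/s)
print(f"surface Stokes drift: {100.0 * u_stokes:.1f} cm/s")
```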
Adsorption of hairy particles with mobile ligands: Molecular dynamics and density functional study
NASA Astrophysics Data System (ADS)
Borówko, M.; Sokołowski, S.; Staszewski, T.; Pizio, O.
2018-01-01
We study models of hairy nanoparticles in contact with a hard wall. Each particle is built of a spherical core with a number of ligands attached to it, and each ligand is composed of several spherical, tangentially jointed segments. The number of segments is the same for all ligands. Particular models differ in the numbers of ligands and of segments per ligand, but the total number of segments is constant. Moreover, our model assumes that the ligands are tethered to the core in such a manner that they can "slide" over the core surface. Using molecular dynamics simulations, we investigate the differences in the structure of the system close to the wall. In order to characterize the distribution of the ligands around the core, we have calculated the end-to-end distances of the ligands and the lengths and orientations of the mass dipoles. Additionally, we also employed a density functional approach to obtain the density profiles. We have found that if the number of ligands is not too high, the proposed version of the theory is capable of predicting the structure of the system with reasonable accuracy.
NASA Astrophysics Data System (ADS)
Barré, Jérôme; Edwards, David; Worden, Helen; Da Silva, Arlindo; Lahoz, William
2015-07-01
By the end of the current decade, there are plans to deploy several geostationary Earth orbit (GEO) satellite missions for atmospheric composition over North America, East Asia and Europe, with additional missions proposed. Together, these present the possibility of a constellation of geostationary platforms to achieve continuous time-resolved high-density observations over continental domains for mapping pollutant sources and variability at diurnal and local scales. In this paper, we use a novel approach to sample a very high resolution global model (GEOS-5 at 7 km horizontal resolution) to produce a dataset of synthetic carbon monoxide pollution observations representative of those potentially obtainable from a GEO satellite constellation with predicted measurement sensitivities based on current remote sensing capabilities. Part 1 of this study focuses on the production of simulated synthetic measurements for air quality OSSEs (Observing System Simulation Experiments). We simulate carbon monoxide nadir retrievals using a technique that provides realistic measurements with very low computational cost. We discuss the sampling methodology: the projection of footprints and areas of regard for geostationary geometries over each of the North America, East Asia and Europe regions; the regression method to simulate measurement sensitivity; and the measurement error simulation. A detailed analysis of the simulated observation sensitivity is performed, and limitations of the method are discussed. We also describe impacts from clouds, showing that the efficiency of an instrument making atmospheric composition measurements on a geostationary platform is dependent on the dominant weather regime over a given region and the pixel size resolution. These results demonstrate the viability of the "instrument simulator" step for an OSSE to assess the performance of a constellation of geostationary satellites for air quality measurements. We describe the OSSE results in a follow-up paper (Part 2 of this study).
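The paper simulates measurement sensitivity with a regression method; as a generic stand-in, the sketch below generates synthetic retrievals with a prescribed averaging-kernel operator and additive noise, the common formulation y = x_a + A(x_model - x_a) + noise. All profiles, kernel widths, and noise levels here are assumptions for illustration, not the paper's values.

```python
# Sketch: synthetic nadir CO retrievals from a model profile, OSSE-style.
import numpy as np

rng = np.random.default_rng(1)
nlev = 20
p = np.linspace(1000.0, 100.0, nlev)                        # pressure levels (hPa)
x_true = 100.0 + 60.0 * np.exp(-((p - 900.0) / 150.0)**2)   # "model truth" CO (ppb)
x_prior = np.full(nlev, 90.0)                               # a priori profile (ppb)

# Averaging kernel with Gaussian rows, each scaled to a total sensitivity of
# 0.6 (a crude stand-in for limited nadir vertical sensitivity).
A = np.exp(-((p[:, None] - p[None, :]) / 150.0)**2)
A = 0.6 * A / A.sum(axis=1, keepdims=True)

noise = rng.normal(0.0, 3.0, nlev)                          # retrieval noise (ppb)
x_retrieved = x_prior + A @ (x_true - x_prior) + noise
print(np.round(x_retrieved[:5], 1))
```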
An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.
Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V
2014-07-01
We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely coupled service-oriented architecture that allows analysts to explore various counterfactual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.
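The start/stop/pause/roll-back steering described above can be pictured as a checkpointing wrapper around an opaque simulation state. The sketch below is a minimal, hypothetical illustration of that pattern; the SteerableSimulation class and the toy epidemic step function are inventions for illustration, not the paper's software.

```python
# Sketch: interactive steering via periodic checkpoints with rollback.
import copy

class SteerableSimulation:
    def __init__(self, state, step_fn):
        self.state, self.step_fn = state, step_fn
        self.checkpoints = {0: copy.deepcopy(state)}
        self.t = 0

    def run(self, steps, checkpoint_every=5):
        for _ in range(steps):
            self.state = self.step_fn(self.state)
            self.t += 1
            if self.t % checkpoint_every == 0:
                self.checkpoints[self.t] = copy.deepcopy(self.state)

    def rollback(self, t):
        """Restore the latest checkpoint at or before time t."""
        t0 = max(k for k in self.checkpoints if k <= t)
        self.state, self.t = copy.deepcopy(self.checkpoints[t0]), t0

sim = SteerableSimulation({"infected": 10},
                          lambda s: {"infected": int(s["infected"] * 1.3)})
sim.run(10)
sim.rollback(5)   # analyst inspects state, then applies an intervention
print(sim.t, sim.state)
```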
An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology
Deodhar, Suruchi; Bisset, Keith R.; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V.
2014-01-01
We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely coupled service-oriented architecture that allows analysts to explore various counterfactual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity. PMID:25530914
Data preservation at the Fermilab Tevatron
Amerio, S.; Behari, S.; Boyd, J.; ...
2017-01-22
The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have approximately 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 and beyond. To achieve this goal, we have implemented a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology, and leverages resources available from currently running experiments at Fermilab. These efforts have also provided useful lessons in ensuring long-term data access for numerous experiments, and enable high-quality scientific output for years to come.
Remote sensing and field test capabilities at U.S. Army Dugway Proving Ground
NASA Astrophysics Data System (ADS)
Pearson, James T.; Herron, Joshua P.; Marshall, Martin S.
2011-11-01
U.S. Army Dugway Proving Ground (DPG) is a Major Range and Test Facility Base (MRTFB) with the mission of testing chemical and biological defense systems and materials. DPG facilities include state-of-the-art laboratories, extensive test grids, controlled environment calibration facilities, and a variety of referee instruments for required test measurements. Among these referee instruments, DPG has built up a significant remote sensing capability for both chemical and biological detection. Technologies employed for remote sensing include FTIR spectroscopy, UV spectroscopy, Raman-shifted eye-safe lidar, and other elastic backscatter lidar systems. These systems provide referee data for bio-simulants, chemical simulants, toxic industrial chemicals (TICs), and toxic industrial materials (TIMs). In order to realize a successful large scale open-air test, each type of system requires calibration and characterization. DPG has developed specific calibration facilities to meet this need. These facilities are the Joint Ambient Breeze Tunnel (JABT), and the Active Standoff Chamber (ASC). The JABT and ASC are open ended controlled environment tunnels. Each includes validation instrumentation to characterize simulants that are disseminated. Standoff systems are positioned at typical field test distances to measure characterized simulants within the tunnel. Data from different types of systems can be easily correlated using this method, making later open air test results more meaningful. DPG has a variety of large scale test grids available for field tests. After and during testing, data from the various referee instruments is provided in a visual format to more easily draw conclusions on the results. This presentation provides an overview of DPG's standoff testing facilities and capabilities, as well as example data from different test scenarios.
Remote sensing and field test capabilities at U.S. Army Dugway Proving Ground
NASA Astrophysics Data System (ADS)
Pearson, James T.; Herron, Joshua P.; Marshall, Martin S.
2012-05-01
U.S. Army Dugway Proving Ground (DPG) is a Major Range and Test Facility Base (MRTFB) with the mission of testing chemical and biological defense systems and materials. DPG facilities include state-of-the-art laboratories, extensive test grids, controlled environment calibration facilities, and a variety of referee instruments for required test measurements. Among these referee instruments, DPG has built up a significant remote sensing capability for both chemical and biological detection. Technologies employed for remote sensing include FTIR spectroscopy, UV spectroscopy, Raman-shifted eye-safe lidar, and other elastic backscatter lidar systems. These systems provide referee data for bio-simulants, chemical simulants, toxic industrial chemicals (TICs), and toxic industrial materials (TIMs). In order to realize a successful large scale open-air test, each type of system requires calibration and characterization. DPG has developed specific calibration facilities to meet this need. These facilities are the Joint Ambient Breeze Tunnel (JABT), and the Active Standoff Chamber (ASC). The JABT and ASC are open ended controlled environment tunnels. Each includes validation instrumentation to characterize simulants that are disseminated. Standoff systems are positioned at typical field test distances to measure characterized simulants within the tunnel. Data from different types of systems can be easily correlated using this method, making later open air test results more meaningful. DPG has a variety of large scale test grids available for field tests. After and during testing, data from the various referee instruments is provided in a visual format to more easily draw conclusions on the results. This presentation provides an overview of DPG's standoff testing facilities and capabilities, as well as example data from different test scenarios.
Man-in-the-control-loop simulation of manipulators
NASA Technical Reports Server (NTRS)
Chang, J. L.; Lin, Tsung-Chieh; Yae, K. Harold
1989-01-01
A method to achieve man-in-the-control-loop simulation is presented. Emerging real-time dynamics simulation suggests a potential for creating an interactive design workstation with a human operator in the control loop. The recursive formulation for multibody dynamics simulation is studied to determine requirements for man-in-the-control-loop simulation. High-speed computer graphics techniques provide realistic visual cues for the simulator. Backhoe and robot arm simulations are implemented to demonstrate the capability of man-in-the-control-loop simulation.
NASA Technical Reports Server (NTRS)
Kunz, Robert F.
2014-01-01
This document represents the evolving formal documentation of the NPHASE-PSU computer code. Version 3.15 is being delivered along with the software to NASA in 2013. Significant upgrades to NPHASE-PSU have been made since the first delivery of draft documentation to DARPA and USNRC in 2006. These include a much lighter, faster, and more memory-efficient face-based front end; support for arbitrary polyhedra in the front end, flow solver, and back end; a generalized homogeneous multiphase capability; and several two-fluid modelling and algorithmic elements. Specific capabilities installed for the NASA Gearbox Windage Aerodynamics NRA are included in this version: the Hybrid Immersed Overset Boundary Method (HOIBM) [Noack et al. (2009)]; periodic boundary conditions for multiple frames of reference; a fully generalized immersed boundary method; fully generalized conjugate heat transfer; droplet deposition, bouncing, and splashing models; and film transport and breakup.
High-Performance Modeling and Simulation of Anchoring in Granular Media for NEO Applications
NASA Technical Reports Server (NTRS)
Quadrelli, Marco B.; Jain, Abhinandan; Negrut, Dan; Mazhar, Hammad
2012-01-01
NASA is interested in designing a spacecraft capable of visiting a near-Earth object (NEO), performing experiments, and then returning safely. Certain periods of this mission would require the spacecraft to remain stationary relative to the NEO, in an environment characterized by very low gravity levels; such situations require an anchoring mechanism that is compact, easy to deploy, and, upon mission completion, easy to remove. The design philosophy used in this task relies on the simulation capability of a high-performance multibody dynamics physics engine. On Earth, it is difficult to create low-gravity conditions, and testing in low-gravity environments, whether artificial or in space, can be costly and very difficult to achieve. Through simulation, the effect of gravity can be controlled with great accuracy, making it ideally suited to analyze the problem at hand. Using Chrono::Engine, a simulation package capable of utilizing massively parallel Graphics Processing Unit (GPU) hardware, several validation experiments were performed. Modeling of the regolith interaction was carried out, after which the anchor penetration tests were performed and analyzed. The regolith was modeled by a granular medium composed of very large numbers of convex three-dimensional rigid bodies, subject to microgravity levels and interacting with each other through contact, friction, and cohesive forces. The multibody dynamics simulation approach used for simulating anchors penetrating a soil uses a differential variational inequality (DVI) methodology to solve the contact problem posed as a linear complementarity problem (LCP). Implemented within a GPU processing environment, collision detection is greatly accelerated compared to traditional CPU (central processing unit)-based collision detection. Hence, systems of millions of particles interacting with complex dynamic systems can be efficiently analyzed, and design recommendations can be made in a much shorter time. The figure shows an example of this capability where the Brazil Nut problem is simulated: as the container full of granular material is vibrated, the large ball slowly moves upwards. This capability was expanded to account for anchors of different shapes and penetration velocities, interacting with granular soils.
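To illustrate the algebraic core of a DVI/LCP contact solve, the sketch below runs a projected Gauss-Seidel iteration on a toy LCP of the form w = Mz + q, z >= 0, w >= 0, z^T w = 0; the matrix and vector are illustrative placeholders, not Chrono::Engine's actual contact system or solver.

```python
# Sketch: projected Gauss-Seidel for a small linear complementarity problem.
import numpy as np

def projected_gauss_seidel(M, q, iters=200):
    z = np.zeros_like(q)
    for _ in range(iters):
        for i in range(len(q)):
            r = q[i] + M[i] @ z - M[i, i] * z[i]   # residual excluding z_i
            z[i] = max(0.0, -r / M[i, i])          # solve row i, project z_i >= 0
    return z

M = np.array([[2.0, 1.0], [1.0, 2.0]])             # symmetric positive definite
q = np.array([-1.0, -1.0])
z = projected_gauss_seidel(M, q)
print(z, M @ z + q)                                # z >= 0 and w = Mz + q >= 0
```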
High Fidelity Simulations of Plume Impingement to the International Space Station
NASA Technical Reports Server (NTRS)
Lumpkin, Forrest E., III; Marichalar, Jeremiah; Stewart, Benedicte D.
2012-01-01
With the retirement of the Space Shuttle, the United States now depends on recently developed commercial spacecraft to supply the International Space Station (ISS) with cargo. These new vehicles supplement ones from international partners including the Russian Progress, the European Autonomous Transfer Vehicle (ATV), and the Japanese H-II Transfer Vehicle (HTV). Furthermore, to carry crew to the ISS and supplement the capability currently provided exclusively by the Russian Soyuz, new designs and a refinement of a cargo vehicle design are in work. Many of these designs include features such as nozzle scarfing or the simultaneous firing of multiple thrusters, resulting in a wide variety of complex plumes impinging upon the ISS. Therefore, to ensure safe "proximity operations" near the ISS, the need for accurate and efficient high fidelity simulation of plume impingement to the ISS is as high as ever. A capability combining computational fluid dynamics (CFD) and Direct Simulation Monte Carlo (DSMC) techniques has been developed to properly model the large density variations encountered as the plume expands from the high pressure in the combustion chamber to the near-vacuum conditions at the orbiting altitude of the ISS. Details of the computational tools employed by this method, including recent software enhancements and the best practices needed to achieve accurate simulations, are discussed. Several recent examples of the application of this high fidelity capability are presented. These examples highlight many of the real-world, complex features of plume impingement that occur when "visiting vehicles" operate in the vicinity of the ISS.
NASA's OCA Mirroring System: An Application of Multiagent Systems in Mission Control
NASA Technical Reports Server (NTRS)
Sierhuis, Maarten; Clancey, William J.; vanHoof, Ron J. J.; Seah, Chin H.; Scott, Michael S.; Nado, Robert A.; Blumenberg, Susan F.; Shafto, Michael G.; Anderson, Brian L.; Bruins, Anthony C.;
2009-01-01
Orbital Communications Adaptor (OCA) Flight Controllers, in NASA's International Space Station Mission Control Center, use different computer systems to uplink, downlink, mirror, archive, and deliver files to and from the International Space Station (ISS) in real time. The OCA Mirroring System (OCAMS) is a multiagent software system (MAS) that is operational in NASA's Mission Control Center. This paper presents OCAMS and its workings in an operational setting where flight controllers rely on the system 24x7. We also discuss the return on investment, based on a simulation baseline, six months of 24x7 operations at NASA Johnson Space Center in Houston, Texas, and a projection of future capabilities. This paper ends with a discussion of the value of MAS and future planned functionality and capabilities.
NASA Astrophysics Data System (ADS)
Turinsky, Paul J.; Kothe, Douglas B.
2016-05-01
The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, all while assuring nuclear safety. To accomplish this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance, and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long-standing issues of the nuclear power industry that M&S can assist in addressing. To date CASL has developed a multi-physics "core simulator" based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high-performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement in closure models for subcooled boiling and bubbly flow, and the formulation of robust numerical solution algorithms. For multiphysics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. Industry adoption of CASL's evolving M&S capabilities, which is in progress, will assist in addressing long-standing and future operational and safety challenges of the nuclear industry.
NASA Astrophysics Data System (ADS)
Hu, Sheng; Lv, Jiangtao; Si, Guangyuan
2016-10-01
A numerical model and simulation of an optoelectrofluidic chip are presented in this article. Both the dielectrophoretic and electroosmotic forces attracting nano-sized particles can be studied through the diffusion, convection, and migration equations. For the nano-sized particles, a protein with a radius of 3.6 nm is considered as the target particle. The electroosmosis, which depends on the applied frequency, is calculated over the range from 10² Hz to 10⁸ Hz and provides a much stronger force for enriching proteins than dielectrophoresis (DEP). Meanwhile, the induced light pattern size, which significantly affects the concentration distribution, is simulated. In the end, the concentration curve verifies that the optoelectrofluidic chip is capable of manipulating and assembling suspended submicron particles.
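As context for the frequency dependence noted above, the DEP force on a small sphere scales with the real part of the Clausius-Mossotti factor. The sketch below sweeps that factor over the same 10²-10⁸ Hz range; the medium and particle permittivities and conductivities are generic assumed values, not the paper's protein parameters.

```python
# Sketch: real part of the Clausius-Mossotti factor versus frequency.
import numpy as np

eps0 = 8.854e-12
eps_m, sig_m = 78 * eps0, 1.0e-3     # medium permittivity, conductivity (S/m)
eps_p, sig_p = 2.5 * eps0, 1.0e-2    # particle values (assumed)

f = np.logspace(2, 8, 7)             # 1e2 ... 1e8 Hz
w = 2 * np.pi * f
eps_m_c = eps_m - 1j * sig_m / w     # complex permittivities
eps_p_c = eps_p - 1j * sig_p / w

K = (eps_p_c - eps_m_c) / (eps_p_c + 2 * eps_m_c)   # Clausius-Mossotti factor
for fi, k in zip(f, K.real):
    print(f"{fi:9.0e} Hz  Re[K] = {k:+.3f}")        # >0: positive DEP; <0: negative
```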
Pawlowski, Roger P.; Phipps, Eric T.; Salinger, Andrew G.; ...
2012-01-01
A template-based generic programming approach was presented in Part I of this series of papers [Sci. Program. 20 (2012), 197–219] that separates the development effort of programming a physical model from that of computing additional quantities, such as derivatives, needed for embedded analysis algorithms. In this paper, we describe the implementation details for using the template-based generic programming approach for simulation and analysis of partial differential equations (PDEs). We detail several of the hurdles that we have encountered, and some of the software infrastructure developed to overcome them. We end with a demonstration where we present shape optimization and uncertainty quantification results for a 3D PDE application.
Simulation for analysis and control of superplastic forming. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zacharia, T.; Aramayo, G.A.; Simunovic, S.
1996-08-01
A joint study was conducted by Oak Ridge National Laboratory (ORNL) and the Pacific Northwest Laboratory (PNL) for the U.S. Department of Energy-Lightweight Materials (DOE-LWM) Program. The purpose of the study was to assess and benchmark current modeling capabilities with respect to accuracy of predictions and simulation time. Two simulation platforms were considered in this study: the LS-DYNA3D code installed on ORNL's high-performance computers and the finite element code MARC used at PNL. Both ORNL and PNL performed superplastic forming (SPF) analysis on a standard butter-tray geometry, which was defined by PNL, to better understand the capabilities of the respective models. The specific geometry was selected and formed at PNL, and the experimental results, such as forming time and thickness at specific locations, were provided for comparisons with numerical predictions. Furthermore, comparisons between the ORNL simulation results, using elasto-plastic analysis, and PNL's results, using rigid-plastic flow analysis, were performed.
Ntofon, Okung-Dike; Channegowda, Mayur P; Efstathiou, Nikolaos; Rashidi Fard, Mehdi; Nejabati, Reza; Hunter, David K; Simeonidou, Dimitra
2013-02-25
In this paper, a novel Software-Defined Networking (SDN) architecture is proposed for high-end Ultra High Definition (UHD) media applications. UHD media applications require huge amounts of bandwidth that can only be met with high-capacity optical networks. In addition, there are requirements for control frameworks capable of delivering effective application performance with efficient network utilization. A novel SDN-based Controller that tightly integrates application-awareness with network control and management is proposed for such applications. An OpenFlow-enabled test-bed demonstrator is reported with performance evaluations of advanced online and offline media- and network-aware schedulers.
Outreach/education interface for Cryosphere models using the Virtual Ice Sheet Laboratory
NASA Astrophysics Data System (ADS)
Larour, E. Y.; Halkides, D. J.; Romero, V.; Cheng, D. L.; Perez, G.
2014-12-01
In the past decade, great strides have been made in the development of models capable of projecting the future evolution of glaciers and the polar ice sheets in a changing climate. These models are now capable of replicating some of the trends apparent in satellite observations. However, because this field is just now maturing, very few efforts have been dedicated to adapting these capabilities to education. Technologies that have been used in outreach efforts in atmospheric and oceanic sciences still have not been extended to cryospheric science. We present a cutting-edge, technologically driven virtual laboratory, geared towards outreach and K-12 education, dedicated to the polar ice sheets of Antarctica and Greenland and their role as major contributors to sea level rise in coming decades. VISL (Virtual Ice Sheet Laboratory) relies on state-of-the-art WebGL rendering of polar ice sheets, Android/iPhone and web portability using JavaScript, and C++ simulations (back end) based on the Ice Sheet System Model, the NASA model for simulating the evolution of polar ice sheets. Using VISL, educators and students can have an immersive experience in the world of polar ice sheets while at the same time exercising the capabilities of a state-of-the-art climate model, all of it embedded in an education experience that follows the new STEM standards for education. This work was performed at the California Institute of Technology's Jet Propulsion Laboratory under a contract with the National Aeronautics and Space Administration's Cryosphere Science Program.
Coupled Neutronics Thermal-Hydraulic Solution of a Full-Core PWR Using VERA-CS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clarno, Kevin T; Palmtag, Scott; Davidson, Gregory G
2014-01-01
The Consortium for Advanced Simulation of Light Water Reactors (CASL) is developing a core simulator called VERA-CS to model operating PWR reactors with high resolution. This paper describes how the development of VERA-CS is being driven by a set of progression benchmark problems that specify the delivery of useful capability in discrete steps. As part of this development, this paper will describe the current capability of VERA-CS to perform a multiphysics simulation of an operating PWR at Hot Full Power (HFP) conditions using a set of existing computer codes coupled together in a novel method. Results for several single-assembly cases are shown that demonstrate coupling for different boron concentrations and power levels. Finally, high-resolution results are shown for a full-core PWR reactor modeled in quarter-symmetry.
Weaver, J L; Busquet, M; Colombant, D G; Mostovych, A N; Feldman, U; Klapisch, M; Seely, J F; Brown, C; Holland, G
2005-02-04
Absolutely calibrated, time-resolved spectral intensity measurements of soft-x-ray emission (hnu approximately 0.1-1.0 keV) from laser-irradiated polystyrene targets are compared to radiation-hydrodynamic simulations that include our new postprocessor, Virtual Spectro. This new capability allows a unified, detailed treatment of atomic physics and radiative transfer in nonlocal thermodynamic equilibrium conditions for simple spectra from low-Z materials as well as complex spectra from high-Z materials. The excellent agreement (within a factor of approximately 1.5) demonstrates the powerful predictive capability of the codes for the complex conditions in the ablating plasma. A comparison to data with high spectral resolution (E/deltaE approximately 1000) emphasizes the importance of including radiation coupling in the quantitative simulation of emission spectra.
A case study for cloud based high throughput analysis of NGS data using the globus genomics system
Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; ...
2015-01-01
Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research.
A case study for cloud based high throughput analysis of NGS data using the globus genomics system
Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; Rodriguez, Alex; Madduri, Ravi; Dave, Utpal; Lacinski, Lukasz; Foster, Ian; Gusev, Yuriy; Madhavan, Subha
2014-01-01
Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research. PMID:26925205
A New Approach to Modeling Jupiter's Magnetosphere
NASA Astrophysics Data System (ADS)
Fukazawa, K.; Katoh, Y.; Walker, R. J.; Kimura, T.; Tsuchiya, F.; Murakami, G.; Kita, H.; Tao, C.; Murata, K. T.
2017-12-01
The scales in planetary magnetospheres range from tens of planetary radii to kilometers. For a number of years we have studied the magnetospheres of Jupiter and Saturn by using 3-dimensional magnetohydrodynamic (MHD) simulations. However, we have not been able to reach even the limits of the MHD approximation because of the large amount of computer resources required. Recently, thanks to progress in supercomputer systems, we have obtained the capability to simulate Jupiter's magnetosphere with 1000 times the number of grid points used in our previous simulations. This has allowed us to combine the high-resolution global simulation with a micro-scale simulation of the Jovian magnetosphere. In particular, we can combine a hybrid (kinetic ions and fluid electrons) simulation with the MHD simulation. In addition, the new capability enables us to run multi-parameter survey simulations of the Jupiter-solar wind system. In this study we performed a high-resolution simulation of the Jovian magnetosphere to connect with the hybrid simulation, and lower-resolution simulations under various solar wind conditions to compare with Hisaki and Juno observations. In the high-resolution simulation we used a regular Cartesian grid with 0.15 RJ grid spacing and placed the inner boundary at 7 RJ. From these simulation settings, we provide the magnetic field out to around 20 RJ from Jupiter as a background field for the hybrid simulation. For the first time we have been able to resolve Kelvin-Helmholtz waves on the magnetopause. We have investigated solar wind dynamic pressures between 0.01 and 0.09 nPa for a number of IMF values. The raw data from these simulations are available for registered users to download. We have compared the results of these simulations with Hisaki auroral observations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salko, Robert K; Sung, Yixing; Kucukboyaci, Vefa
The Virtual Environment for Reactor Applications core simulator (VERA-CS) being developed by the Consortium for the Advanced Simulation of Light Water Reactors (CASL) includes coupled neutronics, thermal-hydraulics, and fuel temperature components with an isotopic depletion capability. The neutronics capability employed is based on MPACT, a three-dimensional (3-D) whole core transport code. The thermal-hydraulics and fuel temperature models are provided by the COBRA-TF (CTF) subchannel code. As part of the CASL development program, the VERA-CS (MPACT/CTF) code system was applied to model and simulate reactor core response with respect to departure from nucleate boiling ratio (DNBR) at the limiting time step of a postulated pressurized water reactor (PWR) main steamline break (MSLB) event initiated at the hot zero power (HZP), either with offsite power available and the reactor coolant pumps in operation (high-flow case) or without offsite power where the reactor core is cooled through natural circulation (low-flow case). The VERA-CS simulation was based on core boundary conditions from the RETRAN-02 system transient calculations and STAR-CCM+ computational fluid dynamics (CFD) core inlet distribution calculations. The evaluation indicated that the VERA-CS code system is capable of modeling and simulating quasi-steady state reactor core response under the steamline break (SLB) accident condition, the results are insensitive to uncertainties in the inlet flow distributions from the CFD simulations, and the high-flow case is more DNB limiting than the low-flow case.
Chemical vapor deposition fluid flow simulation modelling tool
NASA Technical Reports Server (NTRS)
Bullister, Edward T.
1992-01-01
Accurate numerical simulation of chemical vapor deposition (CVD) processes requires a general purpose computational fluid dynamics package combined with specialized capabilities for high-temperature chemistry. In this report, we describe the implementation of these specialized capabilities in the spectral element code NEKTON. The thermal expansion of the gases involved is shown to be accurately approximated by the low Mach number perturbation expansion of the incompressible Navier-Stokes equations. The radiative heat transfer between multiple interacting radiating surfaces is shown to be tractable using the method of Gebhart. The disparate rates of reaction and diffusion in CVD processes are calculated via a point-implicit time integration scheme. We demonstrate the use of the above capabilities on prototypical CVD applications.
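As an aside on the point-implicit idea mentioned above, a minimal sketch of a linearized backward-Euler update for a single stiff source term is shown below; this is a generic illustration of the scheme, not NEKTON's implementation, and the rate constant is arbitrary.

```python
# Sketch: point-implicit (linearized implicit) update for a stiff source term.
def point_implicit_step(y, dt, rate, d_rate):
    """y_{n+1} = y_n + dt*S(y_n) / (1 - dt*dS/dy), treating S implicitly."""
    return y + dt * rate(y) / (1.0 - dt * d_rate(y))

k = 1.0e4                        # stiff destruction rate (1/s), arbitrary
rate = lambda y: -k * y          # S(y)
d_rate = lambda y: -k            # dS/dy

y, dt = 1.0, 1.0e-3              # dt >> 1/k: explicit Euler would be unstable
for _ in range(5):
    y = point_implicit_step(y, dt, rate, d_rate)
    print(f"{y:.6f}")            # decays monotonically, y -> y/(1 + dt*k)
```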
Design and Simulations of an Energy Harvesting Capable CMOS Pixel for Implantable Retinal Prosthesis
NASA Astrophysics Data System (ADS)
Ansaripour, Iman; Karami, Mohammad Azim
2017-12-01
A new pixel is designed with the capability of imaging and energy harvesting for the retinal prosthesis implant in 0.18 µm standard Complementary Metal Oxide Semiconductor technology. The pixel conversion gain and dynamic range are 2.05 µV/e⁻ and 63.2 dB, respectively. The power consumption is 53.12 pW per pixel, while the energy harvesting performance is 3.87 nW per pixel at 60 klx of illuminance. These results were obtained using post-layout simulation. In the proposed pixel structure, the high power production capability in energy harvesting mode covers the demanded energy by using all available p-n junction photo-generated currents.
NASA Technical Reports Server (NTRS)
Arneson, Heather; Evans, Antony D.; Li, Jinhua; Wei, Mei Yueh
2017-01-01
Integrated Demand Management (IDM) is a near- to mid-term NASA concept that proposes to address mismatches in air traffic system demand and capacity by using strategic flow management capabilities to pre-condition demand into the more tactical Time-Based Flow Management System (TBFM). This paper describes an automated simulation capability to support IDM concept development. The capability closely mimics existing human-in-the-loop (HITL) capabilities, automating both the human components and collaboration between operational systems, and speeding up the real-time aircraft simulations. Such a capability allows for parametric studies that will inform the HITL simulations, identifying breaking points and parameter values at which significant changes in system behavior occur. This paper also describes the initial validation of individual components of the automated simulation capability, and an example application comparing the performance of the IDM concept under two TBFM scheduling paradigms. The results and conclusions from this simulation compare closely to those from previous HITL simulations using similar scenarios, providing an initial validation of the automated simulation capability.
ARES Modeling of High-foot Implosions (NNSA Milestone #5466)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hurricane, O. A.
ARES “capsule only” simulations demonstrated results of applying an ASC code to a suite of high-foot ICF implosion experiments. While a capability to apply an asymmetric FDS drive to the capsule-only model using add-on Python routines exists, it was not exercised here. The ARES simulation results resemble the results from HYDRA simulations documented in A. Kritcher, et al., Phys. Plasmas, 23, 052709 (2016); namely, 1D simulation and data are in reasonable agreement for the lowest velocity experiments, but diverge from each other at higher velocities.
Solute and heat transport model of the Henry and Hilleke laboratory experiment
Langevin, C.D.; Dausman, A.M.; Sukop, M.C.
2010-01-01
SEAWAT is a coupled version of MODFLOW and MT3DMS designed to simulate variable-density ground water flow and solute transport. The most recent version of SEAWAT, called SEAWAT Version 4, includes new capabilities to represent simultaneous multispecies solute and heat transport. To test the new features in SEAWAT, the laboratory experiment of Henry and Hilleke (1972) was simulated. Henry and Hilleke used warm fresh water to recharge a large sand-filled glass tank. A cold salt water boundary was represented on one side. Adjustable heating pads were used to heat the bottom and left sides of the tank. In the laboratory experiment, Henry and Hilleke observed both salt water and fresh water flow systems separated by a narrow transition zone. After minor tuning of several input parameters with a parameter estimation program, results from the SEAWAT simulation show good agreement with the experiment. SEAWAT results suggest that heat loss to the room was more than expected by Henry and Hilleke, and that multiple thermal convection cells are the likely cause of the widened transition zone near the hot end of the tank. Other computer programs with similar capabilities may benefit from benchmark testing with the Henry and Hilleke laboratory experiment.
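SEAWAT-style codes commonly couple flow, solute, and heat through a fluid density that varies linearly with concentration and temperature. A minimal sketch of such an equation of state is below; the coefficient values are typical textbook numbers, assumed here rather than taken from the Henry and Hilleke model input.

```python
# Sketch: linear equation of state coupling salinity and temperature to density,
# rho = rho_f + (drho/dC)*(C - C0) + (drho/dT)*(T - T0).
RHO_F = 1000.0      # freshwater density (kg/m^3)
DRHO_DC = 0.7       # density change per unit salinity (dimensionless slope)
DRHO_DT = -0.375    # thermal slope near 25 C (kg/m^3 per degree C), assumed
C0, T0 = 0.0, 25.0  # reference concentration (kg/m^3) and temperature (C)

def density(conc, temp):
    """Fluid density from solute concentration (kg/m^3) and temperature (C)."""
    return RHO_F + DRHO_DC * (conc - C0) + DRHO_DT * (temp - T0)

print(density(35.0, 25.0))   # salt water at the reference temperature: ~1024.5
print(density(0.0, 40.0))    # warm fresh recharge water is lighter: ~994.4
```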
Test Capability Enhancements to the NASA Langley 8-Foot High Temperature Tunnel
NASA Technical Reports Server (NTRS)
Harvin, S. F.; Cabell, K. F.; Gallimore, S. D.; Mekkes, G. L.
2006-01-01
The NASA Langley 8-Foot High Temperature Tunnel produces true enthalpy environments simulating flight from Mach 4 to Mach 7, primarily for airbreathing propulsion and aerothermal/thermo-structural testing. Flow conditions are achieved through a methane-air heater and nozzles that produce aerodynamic Mach numbers of 4, 5, or 7 and have exit diameters of 8 feet or 4.5 feet. The 12-ft long free-jet test section, housed inside a 26-ft vacuum sphere, accommodates large test articles. Recently, the facility underwent significant upgrades to support hydrocarbon-fueled scramjet engine testing and to expand flight simulation capability. The upgrades were required to meet engine system development and flight clearance verification requirements originally defined by the joint NASA-Air Force X-43C Hypersonic Flight Demonstrator Project and now the Air Force X-51A Program. Enhancements to the 8-Ft. HTT were made in four areas: 1) hydrocarbon fuel delivery; 2) flight simulation capability; 3) controls and communication; and 4) data acquisition/processing. The upgrades include the addition of systems to supply ethylene and liquid JP-7 to test articles; a Mach 5 nozzle with dynamic pressure simulation capability up to 3200 psf; the addition of a real-time model angle-of-attack system; a new programmable logic controller sub-system to improve process controls and communication with model controls; and the addition of MIL-STD-1553B and high-speed data acquisition systems and a classified data processing environment. These additions represent a significant increase to the already unique test capability and flexibility of the facility, and complement the existing array of test support hardware such as a model injection system, radiant heaters, a six-component force measurement system, and optical flow field visualization hardware. The new systems support complex test programs that require sophisticated test sequences and precise management of process fluids. Furthermore, the new systems, such as the real-time angle-of-attack system and the new programmable logic controller, enhance the test efficiency of the facility. The motivation for the upgrades and the expanded capabilities is described here.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pannala, S; D'Azevedo, E; Zacharia, T
The goal of the radiation modeling effort was to develop and implement a radiation algorithm that is fast and accurate for the underhood environment. As part of this CRADA, a net-radiation model was chosen to simulate radiative heat transfer in the underhood of a car. The assumptions (diffuse-gray and uniform radiative properties in each element) reduce the problem tremendously, and all the view factors for radiation thermal calculations can be calculated once and for all at the beginning of the simulation. The cost for online integration of heat exchanges due to radiation is found to be less than 15% of the baseline CHAD code and thus very manageable. The off-line view factor calculation is constructed to be very modular and has been completely integrated to read CHAD grid files, and the output from this code can be read into the latest version of CHAD. Further integration has to be performed to accomplish the same with STAR-CD. The main outcome of this effort is a highly scalable and portable simulation capability to model view factors for the underhood environment (e.g., a view factor calculation that took 14 hours on a single processor took only 14 minutes on 64 processors). The code has also been validated using a simple test case where analytical solutions are available. This simulation capability gives underhood designers in the automotive companies the ability to account for thermal radiation, which usually is critical in the underhood environment and also turns out to be one of the most computationally expensive components of underhood simulations. This report starts off with the original work plan as elucidated in the proposal in section B. This is followed by the technical work plan to accomplish the goals of the project in section C. In section D, background to the current work is provided with references to the previous efforts this project leverages. The results are discussed in section E. This report ends with conclusions and future scope of work in section F.
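Since net-radiation models of this kind are often built on Gebhart absorption factors computed from the view factors, the sketch below shows the standard Gebhart construction, B = (I - F(I-E))^-1 F E, for a gray-diffuse enclosure; the two-surface geometry and emissivities are toy values, not the CRADA's underhood model.

```python
# Sketch: Gebhart absorption factors for gray-diffuse enclosure radiation.
import numpy as np

F = np.array([[0.0, 1.0],          # view factors: each surface sees the other
              [1.0, 0.0]])
eps = np.array([0.8, 0.4])         # gray-diffuse emissivities (assumed)
E = np.diag(eps)
I = np.eye(2)

B = np.linalg.solve(I - F @ (I - E), F @ E)   # Gebhart factor matrix; rows sum to 1

# Net radiative exchange per unit area between surfaces 1 and 2:
sigma = 5.670e-8
T = np.array([600.0, 300.0])       # surface temperatures (K), illustrative
q1 = eps[0] * sigma * B[0, 1] * (T[0]**4 - T[1]**4)
print(B)
print(f"net flux from surface 1 to 2: {q1:.0f} W/m^2")
```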
NASA Astrophysics Data System (ADS)
Benallou, Amina; Hadri, Baghdad; Martinez-Vega, Juan; El Islam Boukortt, Nour
2018-04-01
The effect of the percolation threshold on the behaviour of the electrical conductivity of insulating polymers at high electric field has been only briefly investigated in the literature. Dead-end links are often not taken into account in studies of the effect of the electric field on electrical properties. In this work, we present a theoretical framework and a Monte Carlo simulation of the behaviour of the electrical conductivity at high electric field, based on percolation theory, using trap energy levels distributed according to uniform, Gaussian, or power-law distribution laws. When a solid insulating material is subjected to a high electric field, the dead ends of traps act during the trapping mechanism to decrease the electrical conductivity, depending on the trap energy levels, the correlation length of the clusters, the length of the dead ends, and the concentration of accessible positions for electrons. Reasonably good agreement is obtained between the simulation results and the theoretical framework.
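For readers new to the percolation construct underlying this model, a minimal Monte Carlo sketch of 2-D site percolation and its spanning transition follows; the lattice size and occupation probabilities are illustrative only and are unrelated to the paper's trap model.

```python
# Sketch: spanning probability of a 2-D site-percolation lattice.
import numpy as np
from scipy.ndimage import label

def spans(p, rng, n=64):
    """True if an occupied 4-connected cluster links the top and bottom rows."""
    grid = rng.random((n, n)) < p
    labels, _ = label(grid)                     # label occupied clusters
    top = set(labels[0][labels[0] > 0])
    bottom = set(labels[-1][labels[-1] > 0])
    return bool(top & bottom)

rng = np.random.default_rng(2)
for p in (0.45, 0.55, 0.65):                    # 2-D site threshold is ~0.593
    rate = np.mean([spans(p, rng) for _ in range(50)])
    print(f"p = {p:.2f}: spanning fraction = {rate:.2f}")
```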
Large eddy simulations and direct numerical simulations of high speed turbulent reacting flows
NASA Technical Reports Server (NTRS)
Givi, Peyman; Madnia, Cyrus K.; Steinberger, Craig J.
1990-01-01
This research involves the implementation of advanced computational schemes based on large eddy simulations (LES) and direct numerical simulations (DNS) to study the phenomenon of mixing and its coupling with chemical reactions in compressible turbulent flows. In the efforts related to LES, a research program was initiated to extend the present capabilities of this method for the treatment of chemically reacting flows. In the DNS efforts, the focus is on detailed investigations of the effects of compressibility, heat release, and non-equilibrium kinetics models in high speed reacting flows. Emphasis was placed on simulations of simple flows, namely homogeneous compressible flows and temporally developing high speed mixing layers.
SLiM 2: Flexible, Interactive Forward Genetic Simulations.
Haller, Benjamin C; Messer, Philipp W
2017-01-01
Modern population genomic datasets hold immense promise for revealing the evolutionary processes operating in natural populations, but a crucial prerequisite for this goal is the ability to model realistic evolutionary scenarios and predict their expected patterns in genomic data. To that end, we present SLiM 2: an evolutionary simulation framework that combines a powerful, fast engine for forward population genetic simulations with the capability of modeling a wide variety of complex evolutionary scenarios. SLiM achieves this flexibility through scriptability, which provides control over most aspects of the simulated evolutionary scenarios with a simple R-like scripting language called Eidos. An example SLiM simulation is presented to illustrate the power of this approach. SLiM 2 also includes a graphical user interface for simulation construction, interactive runtime control, and dynamic visualization of simulation output, facilitating easy and fast model development with quick prototyping and visual debugging. We conclude with a performance comparison between SLiM and two other popular forward genetic simulation packages.
The Deep Impact Network Experiment Operations Center Monitor and Control System
NASA Technical Reports Server (NTRS)
Wang, Shin-Ywan (Cindy); Torgerson, J. Leigh; Schoolcraft, Joshua; Brenman, Yan
2009-01-01
The Interplanetary Overlay Network (ION) software at JPL is an implementation of Delay/Disruption Tolerant Networking (DTN) which has been proposed as an interplanetary protocol to support space communication. The JPL Deep Impact Network (DINET) is a technology development experiment intended to increase the technical readiness of the JPL implemented ION suite. The DINET Experiment Operations Center (EOC) developed by JPL's Protocol Technology Lab (PTL) was critical in accomplishing the experiment. EOC, containing all end nodes of simulated spaces and one administrative node, exercised publish and subscribe functions for payload data among all end nodes to verify the effectiveness of data exchange over ION protocol stacks. A Monitor and Control System was created and installed on the administrative node as a multi-tiered internet-based Web application to support the Deep Impact Network Experiment by allowing monitoring and analysis of the data delivery and statistics from ION. This Monitor and Control System includes the capability of receiving protocol status messages, classifying and storing status messages into a database from the ION simulation network, and providing web interfaces for viewing the live results in addition to interactive database queries.
NASA Technical Reports Server (NTRS)
Justh, H. L.; Justus, C. G.
2007-01-01
The new Mars-GRAM auxiliary profile capability, using data from TES observations, mesoscale model output, or other sources, allows a potentially higher fidelity representation of the atmosphere and a more accurate way of estimating the inherent uncertainty in atmospheric density and winds. Figure 3 indicates that, with the nominal value rpscale=1, Mars-GRAM perturbations would tend to overestimate observed or mesoscale-modeled variability. To better represent TES and mesoscale model density perturbations, rpscale values as low as about 0.4 could be used. Some trajectory model implementations of Mars-GRAM allow the user to dynamically change rpscale and rwscale values with altitude. Figure 4 shows that an rwscale value of about 1.2 would better replicate wind standard deviations from MRAMS or MMM5 simulations at the Gale, Terby, or Melas sites. By adjusting the rpscale and rwscale values in Mars-GRAM based on figures such as Figures 3 and 4, we can provide more accurate end-to-end simulations for EDL at the candidate MSL landing sites.
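A minimal sketch of the altitude-dependent scaling mentioned above (the breakpoint altitudes and scale values here are illustrative assumptions, not the published calibration) is plain linear interpolation over a calibration table:

    import numpy as np

    # Assumed calibration table: altitude (km) -> density perturbation scale rpscale.
    alt_km  = np.array([0.0, 20.0, 40.0, 60.0, 80.0])
    rpscale = np.array([0.4, 0.5, 0.7, 0.9, 1.0])   # values near 0.4 at low altitude per the text

    def rpscale_at(h_km):
        # Piecewise-linear interpolation between calibration breakpoints.
        return np.interp(h_km, alt_km, rpscale)

    for h in (10.0, 35.0, 70.0):
        print(f"h = {h:5.1f} km  ->  rpscale = {rpscale_at(h):.2f}")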
NASA Technical Reports Server (NTRS)
Mitchell, Jason W.; Barbee, Brent W.; Baldwin, Philip J.; Luquette, Richard J.
2007-01-01
The Formation Flying Testbed (FFTB) at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) provides a hardware-in-the-loop test environment for formation navigation and control. The facility continues to evolve as a modular, hybrid, dynamic simulation facility for end-to-end guidance, navigation, and control (GN&C) design and analysis of formation flying spacecraft. The core capabilities of the FFTB, as a platform for testing critical hardware and software algorithms in-the-loop, are reviewed with a focus on recent improvements. With the most recent improvement, in support of Technology Readiness Level (TRL) 6 testing of the Inter-spacecraft Ranging and Alarm System (IRAS) for the Magnetospheric Multiscale (MMS) mission, the FFTB has significantly expanded its ability to perform realistic simulations that require Radio Frequency (RF) ranging sensors for relative navigation with the Path Emulator for RF Signals (PERFS). The PERFS, currently under development at NASA GSFC, modulates RF signals exchanged between spacecraft. The RF signals are modified to accurately reflect the dynamic environment through which they travel, including the effects of medium, moving platforms, and radiated power.
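Path emulation of the kind PERFS performs can be pictured as applying a time-varying light-time delay and a range-dependent amplitude scaling to the inter-spacecraft signal, with the changing delay imparting the Doppler shift; a baseband toy version (all link parameters assumed, not the PERFS implementation) is:

    import numpy as np

    fs = 1e6                              # baseband sample rate (Hz), assumed
    t = np.arange(0, 10e-3, 1 / fs)
    sig = np.cos(2 * np.pi * 10e3 * t)    # 10 kHz test tone standing in for the RF signal

    range_m = 100e3 + 50.0 * t            # 100 km range opening at 50 m/s, assumed
    delay = range_m / 3e8                 # light-time delay (s)
    amp = 100e3 / range_m                 # amplitude scaling relative to the reference range

    # Resampling the signal at the delayed instants applies delay and Doppler together.
    emulated = amp * np.interp(t - delay, t, sig)
    print("rms of emulated signal:", float(np.sqrt(np.mean(emulated**2))))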
VERA Core Simulator methodology for pressurized water reactor cycle depletion
Kochunas, Brendan; Collins, Benjamin; Stimpson, Shane; ...
2017-01-12
This paper describes the methodology developed and implemented in the Virtual Environment for Reactor Applications Core Simulator (VERA-CS) to perform high-fidelity, pressurized water reactor (PWR), multicycle, core physics calculations. Depletion of the core with pin-resolved power and nuclide detail is a significant advance in the state of the art for reactor analysis, providing the level of detail necessary to address the problems of the U.S. Department of Energy Nuclear Reactor Simulation Hub, the Consortium for Advanced Simulation of Light Water Reactors (CASL). VERA-CS has three main components: the neutronics solver MPACT, the thermal-hydraulic (T-H) solver COBRA-TF (CTF), and the nuclide transmutation solver ORIGEN. This paper focuses on MPACT and provides an overview of the resonance self-shielding methods, macroscopic-cross-section calculation, two-dimensional/one-dimensional (2-D/1-D) transport, nuclide depletion, T-H feedback, and other supporting methods representing a minimal set of the capabilities needed to simulate high-fidelity models of a commercial nuclear reactor. Results are presented from the simulation of a model of the first cycle of Watts Bar Unit 1. The simulation is within 16 parts per million boron (ppmB) reactivity for all state points compared to cycle measurements, with an average reactivity bias of <5 ppmB for the entire cycle. Comparisons to cycle 1 flux map data are also provided, and the average 2-D root-mean-square (rms) error during cycle 1 is 1.07%. To demonstrate the multicycle capability, a state point at beginning of cycle (BOC) 2 was also simulated and compared to plant data. The comparison of the cycle 2 BOC state has a reactivity difference of +3 ppmB from measurement, and the 2-D rms of the comparison in the flux maps is 1.77%. Lastly, these results provide confidence in VERA-CS’s capability to perform high-fidelity calculations for practical PWR reactor problems.
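The 2-D rms figures quoted above come from comparing predicted and measured flux maps after a common normalization; a toy version of that comparison metric on synthetic maps (the normalization convention is an assumption) is:

    import numpy as np

    rng = np.random.default_rng(2)
    measured  = rng.uniform(0.8, 1.2, size=(15, 15))        # synthetic core flux map
    predicted = measured * (1 + rng.normal(0, 0.011, measured.shape))

    # Normalize both maps to a mean of 1 before comparing, a common convention.
    measured  /= measured.mean()
    predicted /= predicted.mean()

    rms_pct = 100 * np.sqrt(np.mean((predicted - measured) ** 2))
    print(f"2-D rms difference: {rms_pct:.2f}%")   # on the order of the 1.07% cycle 1 result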
An extension of the OpenModelica compiler for using Modelica models in a discrete event simulation
Nutaro, James
2014-11-03
In this article, a new back-end and run-time system is described for the OpenModelica compiler. This new back-end transforms a Modelica model into a module for the adevs discrete event simulation package, thereby extending adevs to encompass complex, hybrid dynamical systems. The new run-time system built within the adevs simulation package supports models with state-events and time-events, as well as models comprising differential-algebraic systems with high index. Finally, although the procedure for effecting this transformation is based on adevs and the Discrete Event System Specification, it can be adapted to any discrete event simulation package.
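The discrete event style of execution that adevs provides advances the clock from event to event instead of on a fixed step; a generic sketch of that scheduling pattern (a plain priority queue, not the adevs or OpenModelica API) is:

    import heapq

    # (time, kind, action) entries in a priority queue ordered by event time.
    events = []
    heapq.heappush(events, (1.0, "time-event", "sample the output"))
    heapq.heappush(events, (0.4, "state-event", "a variable crosses its threshold"))
    heapq.heappush(events, (2.5, "time-event", "end of simulation"))

    t = 0.0
    while events:
        t, kind, action = heapq.heappop(events)   # jump directly to the next event
        print(f"t = {t:4.1f}: {kind:11s} -> {action}")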
MATLAB/Simulink Pulse-Echo Ultrasound System Simulator Based on Experimentally Validated Models.
Kim, Taehoon; Shin, Sangmin; Lee, Hyongmin; Lee, Hyunsook; Kim, Heewon; Shin, Eunhee; Kim, Suhwan
2016-02-01
A flexible clinical ultrasound system must operate with different transducers, which have characteristic impulse responses and widely varying impedances. The impulse response determines the shape of the high-voltage pulse that is transmitted and the specifications of the front-end electronics that receive the echo; the impedance determines the specification of the matching network through which the transducer is connected. System-level optimization of these subsystems requires accurate modeling of pulse-echo (two-way) response, which in turn demands a unified simulation of the ultrasonics and electronics. In this paper, this is realized by combining MATLAB/Simulink models of the high-voltage transmitter, the transmission interface, the acoustic subsystem which includes wave propagation and reflection, the receiving interface, and the front-end receiver. To demonstrate the effectiveness of our simulator, the models are experimentally validated by comparing the simulation results with the measured data from a commercial ultrasound system. This simulator could be used to quickly provide system-level feedback for an optimized tuning of electronic design parameters.
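The pulse-echo (two-way) response the authors model is, to first order, the excitation pulse convolved with the transducer impulse response on transmit, with the medium reflectivity, and with the impulse response again on receive; a minimal numpy sketch of that signal chain (all waveforms synthetic and assumed) is:

    import numpy as np

    fs = 100e6                                # sample rate (Hz), assumed
    t = np.arange(0, 20e-6, 1 / fs)

    # Transducer impulse response: damped sinusoid at a 5 MHz centre frequency (assumed).
    h = np.exp(-t / 0.2e-6) * np.sin(2 * np.pi * 5e6 * t)
    pulse = np.zeros_like(t)
    pulse[:20] = 1.0                          # idealized high-voltage excitation

    reflectivity = np.zeros_like(t)
    reflectivity[[500, 1200]] = (1.0, 0.4)    # two point scatterers at different depths

    # Two-way response: transmit through h, reflect, receive through h again.
    echo = np.convolve(np.convolve(np.convolve(pulse, h), reflectivity), h)[: t.size]
    print("strongest echo at sample:", int(np.argmax(np.abs(echo))))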
Chemical Modeling for Studies of GeoTRACE Capabilities
NASA Technical Reports Server (NTRS)
2005-01-01
Geostationary measurements of tropospheric pollutants with high spatial and temporal resolution will revolutionize the understanding and predictions of the chemically linked global pollutants aerosols and ozone. However, the capabilities of proposed geostationary instruments, particularly GeoTRACE, have not been thoroughly studied with model simulations. Such model simulations are important to answer the questions and allay the concerns that have been expressed in the atmospheric sciences community about the feasibility of such measurements. We proposed a suite of chemical transport model simulations using the EPA Models 3 chemical transport model, which obtains its meteorology from the MM-5 mesoscale model. The model output consists of gridded abundances of chemical pollutants and meteorological parameters every 30-60 minutes for cases that have occurred in the Eastern United States. This output was intended to be used to test the GeoTRACE capability to retrieve the tropospheric columns of these pollutants.
NASA Technical Reports Server (NTRS)
Konradi, A.; Mccoy, J. E.; Garriott, O. K.
1979-01-01
To simulate the behavior of a high voltage solar cell array in the ionospheric plasma environment, the large (90 ft x 55 ft diameter) vacuum chamber was used to measure the high-voltage plasma interactions of a 3 ft x 30 ft conductive panel. The chamber was filled with nitrogen and argon plasma at electron densities of up to 1,000,000 per cu cm. Measurements of current flow to the plasma were made in three configurations: (a) with one end of the panel grounded, (b) with the whole panel floating while a high bias was applied between the ends of the panel, and (c) with the whole panel at high negative voltage with respect to the chamber walls. The results indicate that a simple model with a constant panel conductivity and plasma resistance can adequately describe the voltage distribution along the panel and the plasma current flow. As expected, when a high potential difference is applied to the panel ends, more than 95% of the panel floats negative with respect to the plasma.
NASA Astrophysics Data System (ADS)
Brigos, Miguel; Perez-Poch, Antoni; Alpiste, Francesc; Torner, Jordi; González Alonso, Daniel Ventura
2014-11-01
We report the results of residual acceleration obtained from initial tests of parabolic flights (more than 100 hours) performed with a small single-engine aerobatic aircraft (CAP10B), and propose a method that improves these figures. Such aircraft have proved capable of providing researchers with periods of up to 8 seconds of reduced gravity in the cockpit, with a gravity quality in the range of 0.1 g0, where g0 is the gravitational acceleration of the Earth. Such parabolas may be of interest to experimenters in the reduced gravity field, when this range of reduced gravity is acceptable for the experiment undertaken. They have also proven to be useful for motivational and educational campaigns. Furthermore, these flights may be of interest to researchers as a test-bed for obtaining a proof-of-concept for subsequent access to parabolic flights with larger aircraft or other microgravity platforms. The limited cost of the operations with these small aircraft allows us to perform them as part of a non-commercial joint venture between the Universitat Politècnica de Catalunya - BarcelonaTech (UPC), the Barcelona cluster BAIE and the Aeroclub Barcelona-Sabadell. Any improvements in the length and quality of reduced gravity would increase the capabilities of these small aircraft. To that end, we have developed a method based on a simulator for training aerobatic pilots. The simulation is performed with the CAD software for mechanical design SolidWorks Motion®, which is widely distributed in industry and in universities. It specifically simulates the parabolic flight manoeuvre for our small aircraft and enables us to improve different aspects of the manoeuvre. The simulator is first validated with experimental data from the test flights. We have conducted an initial intensive period of specific pilot training with the aid of the simulator output. After such initial simulation-aided training, results show that the reduced gravity quality has significantly improved from 0.1 g0 to 0.05 g0. We conclude that single-engine aerobatic aircraft are capable of conducting small hypogravity experiments with the limitations described in the paper.
Simulation of Extreme Surface Winds by Regional Climate Models in the NARCCAP Archive
NASA Astrophysics Data System (ADS)
Hatteberg, R.; Takle, E. S.
2011-12-01
Surface winds play a significant role in many natural processes as well as providing a very important ecological service for many human activities. Surface winds ventilate pollutants and heat from our cities, contribute to pollination for our crops, and regulate the fluxes of heat, moisture, and carbon dioxide from the earth's surface. Many environmental models such as biogeochemical models, crop models, lake models, pollutant transport models, etc., use surface winds as a key variable. Studies of the impacts of climate change and climate variability on a wide range of natural systems and coupled human-natural systems frequently need information on how surface wind speeds will change as greenhouse gas concentrations in the earth's atmosphere change. We have studied the characteristics of extreme winds - both high winds and low winds - created by regional climate models (RCMs) in the NARCCAP archives. We evaluated the capabilities of five RCMs forced by NCEP reanalysis data as well as global climate model (GCM) data for contemporary and future scenario climates to capture the observed statistical distribution of surface winds, both high-wind events and low-wind conditions. Our domain is limited to the Midwest (37°N to 49°N, 82°W to 101°W) with the Great Lakes masked out, which eliminates orographic effects that may contribute to regional circulations. The majority of this study focuses on the warm season in order to examine derechos on the extreme high end and air pollution and plant processes on the low wind speed end. To examine extreme high winds we focus on derechos, which are long-lasting convectively driven extreme wind events that frequently leave a swath of damage extending across multiple states. These events are unusual in that, despite their relatively small spatial scale, they can persist for hours or even days, drawing energy from well-organized larger mesoscale or synoptic scale processes. We examine the ability of NARCCAP RCMs to reproduce these isolated extreme events by assessing their existence, location, magnitude, synoptic linkage, initiation time and duration as compared to the record of observations of derechos in the Midwest and Northeast US. We find that RCMs do reproduce features with close resemblance to derechos although their magnitudes are considerably below those observed (which may be expected given the 50-km grid spacing of the RCM models). Extreme low wind speeds in summer are frequently associated with stagnation conditions leading to high air pollution events in major cities. Low winds also lead to reduced evapotranspiration by crops, which can impact phenological processes (e.g. pollination and seed fertilization, carbon uptake by plants). We evaluate whether RCMs can simulate climatic distributions of low-wind conditions in the northern US. Results show differences among models in their ability to reproduce observed characteristics of low summer-time winds. Only one model reproduces the observed high frequency of calm night-time surface winds in summer, which suggests a need to improve model capabilities for simulating extreme stagnation events.
NASA Astrophysics Data System (ADS)
Kim, Ji-hoon; Ma, Xiangcheng; Grudić, Michael Y.; Hopkins, Philip F.; Hayward, Christopher C.; Wetzel, Andrew; Faucher-Giguère, Claude-André; Kereš, Dušan; Garrison-Kimmel, Shea; Murray, Norman
2018-03-01
Using a state-of-the-art cosmological simulation of merging proto-galaxies at high redshift from the FIRE project, with explicit treatments of star formation and stellar feedback in the interstellar medium, we investigate the formation of star clusters and examine one of the formation hypotheses of present-day metal-poor globular clusters. We find that frequent mergers in high-redshift proto-galaxies could provide a fertile environment to produce long-lasting bound star clusters. The violent merger event disturbs the gravitational potential and pushes a large gas mass of ≳ 10^5-6 M⊙ collectively to high density, at which point it rapidly turns into stars before stellar feedback can stop star formation. The high dynamic range of the reported simulation is critical in realizing such dense star-forming clouds with a small dynamical time-scale, tff ≲ 3 Myr, shorter than most stellar feedback time-scales. Our simulation then allows us to trace how clusters could become virialized and tightly bound to survive for up to ~420 Myr till the end of the simulation. Because the cluster's tightly bound core was formed in one short burst, and the nearby older stars originally grouped with the cluster tend to be preferentially removed, at the end of the simulation the cluster has a small age spread.
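The free-fall time quoted above follows from t_ff = sqrt(3*pi / (32*G*rho)); a quick numerical check (the mean cloud density is an assumption for illustration) confirms the sub-3 Myr scale:

    import numpy as np

    G = 6.674e-11                    # gravitational constant (m^3 kg^-1 s^-2)
    m_H = 1.673e-27                  # hydrogen mass (kg)

    # Assumed mean cloud density: ~1e4 hydrogen atoms per cm^3.
    n = 1e4 * 1e6                    # converted to m^-3
    rho = n * m_H

    t_ff = np.sqrt(3 * np.pi / (32 * G * rho))
    print(f"t_ff = {t_ff / 3.15e13:.2f} Myr")    # 1 Myr = 3.15e13 s; well under 3 Myr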
NASA Astrophysics Data System (ADS)
Petrila, S.; Brabie, G.; Chirita, B.
2016-08-01
The analysis of manufacturing flows within industrial enterprises producing hydrostatic components was performed on a number of factors that influence the smooth running of production, such as the distance between pieces, the waiting time from one operation to the next, the time needed to set up CNC machines, and tool changing in the case of a large number of operations and high manufacturing complexity [2]. To optimize the manufacturing flow, the Tecnomatix software was used. This software is a complete portfolio of digital manufacturing solutions produced by Siemens. It supports innovation by linking all production methods for a product, from process design through process simulation and validation to the manufacturing process itself. Among its many capabilities for creating a wide range of simulations, the program offers various demonstrations of the behaviour of manufacturing cycles. It allows the simulation and optimization of production systems and processes in several areas, such as automotive suppliers, production of industrial equipment, electronics manufacturing, and the design and production of aerospace and defense parts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pointer, William David; Shaver, Dillon; Liu, Yang
The U.S. Department of Energy, Office of Nuclear Energy charges participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with the development of advanced modeling and simulation capabilities that can be used to address design, performance and safety challenges in the development and deployment of advanced reactor technology. The NEAMS has established a high impact problem (HIP) team to demonstrate the applicability of these tools to identification and mitigation of sources of steam generator flow induced vibration (SGFIV). The SGFIV HIP team is working to evaluate vibration sources in an advanced helical coil steam generator using computational fluid dynamics (CFD) simulations of the turbulent primary coolant flow over the outside of the tubes and CFD simulations of the turbulent multiphase boiling secondary coolant flow inside the tubes integrated with high resolution finite element method assessments of the tubes and their associated structural supports. This report summarizes the demonstration of a methodology for the multiphase boiling flow analysis inside the helical coil steam generator tube. A helical coil steam generator configuration has been defined based on the experiments completed by Politecnico di Milano in the SIET helical coil steam generator tube facility. Simulations of the defined problem have been completed using the Eulerian-Eulerian multi-fluid modeling capabilities of the commercial CFD code STAR-CCM+. Simulations suggest that the two phases will quickly stratify in the slightly inclined pipe of the helical coil steam generator. These results have been successfully benchmarked against both empirical correlations for pressure drop and simulations using an alternate CFD methodology, the dispersed phase mixture modeling capabilities of the open source CFD code Nek5000.
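Empirical pressure-drop benchmarks of the kind mentioned above are often correlations of the Lockhart-Martinelli type; a compact sketch of that style of check (Chisholm's C = 20 turbulent-turbulent form, with all flow conditions assumed, not the report's actual correlation) is:

    import numpy as np

    def two_phase_multiplier(x, rho_l, rho_g, mu_l, mu_g, C=20.0):
        # Lockhart-Martinelli parameter X for turbulent-turbulent flow,
        # then Chisholm's two-phase multiplier phi_l^2 = 1 + C/X + 1/X^2.
        X = ((1 - x) / x) ** 0.9 * (rho_g / rho_l) ** 0.5 * (mu_l / mu_g) ** 0.1
        return 1 + C / X + 1 / X**2

    # Assumed steam/water properties near 70 bar, swept over steam quality x.
    for x in (0.1, 0.3, 0.5):
        phi2 = two_phase_multiplier(x, rho_l=740.0, rho_g=36.5, mu_l=9.1e-5, mu_g=1.9e-5)
        print(f"x = {x:.1f}: dP_two-phase / dP_liquid-alone ~ {phi2:.1f}")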
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brajnik, G.; Carrato, S.; Bassanese, S.
At Elettra, the Italian synchrotron light source, an internal project has been started to develop an electron beam position monitor capable of achieving sub-micron resolution with a self-compensation feature. In order to fulfil these requirements, a novel RF front end has been designed. A high isolation coupler combines the input signals with a known pilot tone which is generated by the readout system. This allows the parameters of the four channels to be continuously calibrated, by compensating the different responses of each channel. A similar technique is already known, but for the first time experimental results have shown the improvement in resolution due to this method. The RF chain was coupled with a 4-channel digitizer based on 160 MHz, 16 bits ADCs and an Altera Stratix FPGA. At first, no additional processing was done in the FPGA, collecting only the raw data from the ADCs; the position was calculated through the FFT of each signal. A simulation was also performed to verify the analytic relation between spatial resolution and signal-to-noise ratio; this was very useful to better understand the behaviour of the system with different sources of noise (aperture jitter, thermal noise, etc.). The experimental data were compared with the simulation, showing indeed a perfect agreement with the latter and confirming the capability of the system to reach sub-micrometric accuracy. Therefore, the use of the pilot tone greatly improves the quality of the system, correcting the drifts and increasing the spatial resolution by a factor of 4 in a time window of 24 hours.
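Button BPMs of this kind typically derive the beam position from difference-over-sum ratios of the four channel amplitudes, with the pilot tone providing a per-channel gain correction; a toy version of that arithmetic (the geometry factor, amplitudes, and gains are all assumed) looks like:

    import numpy as np

    k = 10.0                                   # geometry calibration factor (mm), assumed
    A = np.array([1.02, 0.98, 1.01, 0.99])     # button amplitudes at the beam frequency
    g = np.array([1.01, 0.99, 1.00, 1.00])     # per-channel gains measured on the pilot tone

    a, b, c, d = A / g                         # pilot-tone-normalized amplitudes
    s = a + b + c + d
    x = k * ((a + d) - (b + c)) / s            # horizontal difference-over-sum
    y = k * ((a + b) - (c + d)) / s            # vertical difference-over-sum
    print(f"x = {x:+.4f} mm, y = {y:+.4f} mm")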
Scientific Visualization in High Speed Network Environments
NASA Technical Reports Server (NTRS)
Vaziri, Arsi; Kutler, Paul (Technical Monitor)
1997-01-01
In several cases, new visualization techniques have vastly increased the researcher's ability to analyze and comprehend data. Similarly, the role of networks in providing an efficient supercomputing environment has become more critical and continues to grow at a faster rate than the increase in the processing capabilities of supercomputers. A close relationship between scientific visualization and high-speed networks in providing an important link to support efficient supercomputing is identified. The two technologies are driven by the increasing complexity and volume of supercomputer data. The interaction of scientific visualization and high-speed networks in a Computational Fluid Dynamics simulation/visualization environment is described. Current capabilities supported by high-speed networks, supercomputers, and high-performance graphics workstations at the Numerical Aerodynamic Simulation Facility (NAS) at NASA Ames Research Center are described. Applied research in providing a supercomputer visualization environment to support future computational requirements is summarized.
Brydges, Ryan; Carnahan, Heather; Rose, Don; Dubrowski, Adam
2010-08-01
In this paper, we tested the over-arching hypothesis that progressive self-guided learning offers equivalent learning benefit vs. proficiency-based training while limiting the need to set proficiency standards. We have shown that self-guided learning is enhanced when students learn on simulators that progressively increase in fidelity during practice. Proficiency-based training, a current gold-standard training approach, requires achievement of a criterion score before students advance to the next learning level. Baccalaureate nursing students (n = 15/group) practised intravenous catheterization using simulators that differed in fidelity (i.e. students' perceived realism). Data were collected in 2008. Proficiency-based students advanced from low- to mid- to high-fidelity after achieving a proficiency criterion at each level. Progressive students self-guided their progression from low- to mid- to high-fidelity. Yoked control students followed an experimenter-defined progressive practice schedule. Open-ended students moved freely between the simulators. One week after practice, blinded experts evaluated students' skill transfer on a standardized patient simulation. Group differences were examined using analyses of variance. Proficiency-based students scored highest on the high-fidelity post-test (effect size = 1.22). An interaction effect showed that the Progressive and Open-ended groups maintained their performance from post-test to transfer test, whereas the Proficiency-based and Yoked control groups experienced a significant decrease (P < 0.05). Surprisingly, most Open-ended students (73%) chose the progressive practice schedule. Progressive training and proficiency-based training resulted in equivalent transfer test performance, suggesting that progressive students effectively self-guided when to transition between simulators. Students' preference for the progressive practice schedule indicates that educators should consider this sequence for simulation-based training.
High-Fidelity Simulations of Electromagnetic Propagation and RF Communication Systems
2017-05-01
In addition to high-fidelity RF propagation modeling, lower-fidelity models, which are less computationally burdensome, are available via a C++ API. The high-fidelity simulations are expensive to perform, requiring roughly one hour of computer time with 36 available cores and ray tracing performed by a single high-end GPU. (ERDC TR-17-2, Military Engineering Applied Research.)
Understanding product cost vs. performance through an in-depth system Monte Carlo analysis
NASA Astrophysics Data System (ADS)
Sanson, Mark C.
2017-08-01
The manner in which an optical system is toleranced and compensated greatly affects the cost to build it. By having a detailed understanding of different tolerance and compensation methods, the end user can decide on the balance of cost and performance. A detailed, phased-approach Monte Carlo analysis can be used to demonstrate the tradeoffs between cost and performance. In complex high performance optical systems, performance is fine-tuned by making adjustments to the optical systems after they are initially built. This process enables the best overall system performance, without the need to fabricate components to stringent tolerance levels that can often be outside a fabricator's manufacturing capabilities. A good simulation of as-built performance can interrogate different steps of the fabrication and build process. Such a simulation may aid in evaluating whether the measured parameters are within the acceptable range of system performance at that stage of the build process. Finding errors before an optical system progresses further into the build process saves both time and money. Having the appropriate tolerances and compensation strategy tied to a specific performance level will optimize the overall product cost.
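A phased Monte Carlo tolerance analysis of this kind draws random build errors, applies the allowed compensator adjustment, and records the distribution of residual performance; a toy scalar version (error magnitudes and the compensation model are assumptions, not a real lens model) is:

    import numpy as np

    rng = np.random.default_rng(3)
    n_builds = 10_000

    # Phase 1: random as-built errors (e.g., element decenters, in assumed units).
    errors = rng.normal(0.0, 30e-6, size=(n_builds, 4))

    # Raw performance metric: rss of the individual error contributions.
    raw = np.sqrt((errors**2).sum(axis=1))

    # Phase 2: one compensator (e.g., a focus adjustment) removes the largest term.
    compensated = np.sqrt((errors**2).sum(axis=1) - (errors**2).max(axis=1))

    for name, m in (("no compensation", raw), ("with compensator", compensated)):
        print(f"{name:16s}: 95th percentile = {np.percentile(m, 95) * 1e6:.1f} (assumed units)")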
NASA Astrophysics Data System (ADS)
Fuchs, Eran; Tuell, Grady
2010-04-01
The CZMIL system is a new generation airborne bathymetric and topographic remote sensing platform composed of an active lidar, passive hyperspectral imager, high resolution frame camera, navigation system, and storage media running on a Linux-based Gigabit Ethernet network. The lidar is a hybrid scanned-flash system employing a 10 kHz green laser and a novel circular scanner, with a large-aperture (0.20 m) receiver having multiple channels. A PMT-based segmented detector is used on one channel to support simultaneous topographic and bathymetric data collection, and multiple fields-of-view are measured to support bathymetric measurements. The measured laser returns are digitized at 1 GHz to produce the waveforms required for ranging measurements, and unique data compression and storage techniques are used to address the large data volume. Simulated results demonstrate CZMIL's capability to discriminate bottom and surface returns in very shallow water conditions without compromising performance in deep water. Simulated waveforms are compared with measured data from the SHOALS system and show promising expected results. The system's prototype is expected to be completed by the end of 2010, and ready for initial calibration tests in the spring of 2010.
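Discriminating surface and bottom returns in a digitized waveform is, at its simplest, peak detection on the 1 GHz samples, with depth computed from the peak separation and the speed of light in water; a toy sketch on a synthetic waveform (all constants and pulse shapes assumed) is:

    import numpy as np

    fs = 1e9                         # waveform digitizer rate (Hz)
    c_water = 3e8 / 1.33             # speed of light in water (m/s)

    t = np.arange(0, 200e-9, 1 / fs)
    wf = (np.exp(-((t - 40e-9) / 3e-9) ** 2) +        # surface return
          0.3 * np.exp(-((t - 130e-9) / 5e-9) ** 2))  # weaker bottom return

    i_surf = int(np.argmax(wf))
    i_bot = i_surf + 20 + int(np.argmax(wf[i_surf + 20:]))   # search after the surface peak

    depth = 0.5 * (i_bot - i_surf) / fs * c_water            # two-way travel -> one-way depth
    print(f"estimated depth: {depth:.1f} m")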
MADNESS: A Multiresolution, Adaptive Numerical Environment for Scientific Simulation
Harrison, Robert J.; Beylkin, Gregory; Bischoff, Florian A.; ...
2016-01-01
We present MADNESS (multiresolution adaptive numerical environment for scientific simulation), a high-level software environment for solving integral and differential equations in many dimensions using adaptive, fast harmonic analysis methods with guaranteed precision, based on multiresolution analysis and separated representations. Underpinning the numerical capabilities is a powerful petascale parallel programming environment that aims to increase both programmer productivity and code scalability. This paper describes the features and capabilities of MADNESS and briefly discusses some current applications in chemistry and several areas of physics.
Notes on modeling and simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Redondo, Antonio
These notes present a high-level overview of how modeling and simulation are carried out by practitioners. The discussion is of a general nature; no specific techniques are examined but the activities associated with all modeling and simulation approaches are briefly addressed. There is also a discussion of validation and verification and, at the end, a section on why modeling and simulation are useful.
End-to-End QoS for Differentiated Services and ATM Internetworking
NASA Technical Reports Server (NTRS)
Su, Hongjun; Atiquzzaman, Mohammed
2001-01-01
The Internet was initially designed for non-real-time data communications and hence does not provide any Quality of Service (QoS). The next generation Internet will be characterized by high speed and QoS guarantees. The aim of this paper is to develop a prioritized early packet discard (PEPD) scheme for ATM switches to provide service differentiation and QoS guarantees to end applications running over the next generation Internet. The proposed PEPD scheme differs from previous schemes by taking into account the priority of packets generated from different applications. We develop a Markov chain model for the proposed scheme and verify the model with simulation. Numerical results show that the results from the model and computer simulation are in close agreement. Our PEPD scheme provides service differentiation to the end-to-end applications.
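The PEPD idea is that, once the switch buffer fills past a threshold, newly arriving low-priority packets are discarded while high-priority packets are still admitted; a toy queue simulation of that policy (buffer size, threshold, and traffic mix are assumptions, not the paper's parameters) is:

    import numpy as np

    rng = np.random.default_rng(4)
    BUF, THRESH = 100, 70          # buffer size and early-discard threshold, assumed
    queue, dropped = 0, {"low": 0, "high": 0}

    for _ in range(10_000):        # time slots
        queue = max(queue - 1, 0)  # one packet served per slot
        for _ in range(rng.poisson(1.2)):          # offered load exceeds capacity
            prio = "high" if rng.random() < 0.5 else "low"
            if queue >= BUF or (queue >= THRESH and prio == "low"):
                dropped[prio] += 1                 # prioritized early packet discard
            else:
                queue += 1

    print(dropped)   # low-priority packets absorb nearly all of the loss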
NASA Technical Reports Server (NTRS)
Noble, Viveca K.
1994-01-01
When data is transmitted through a noisy channel, errors are produced within the data, rendering it indecipherable. Through the use of error control coding techniques, the bit error rate can be reduced to any desired level without sacrificing the transmission data rate. The Astrionics Laboratory at Marshall Space Flight Center has decided to use a modular, end-to-end telemetry data simulator to simulate the transmission of data from flight to ground and various methods of error control. The simulator includes modules for random data generation, data compression, Consultative Committee for Space Data Systems (CCSDS) transfer frame formation, error correction/detection, error generation and error statistics. The simulator utilizes a concatenated coding scheme which includes the CCSDS standard (255,223) Reed-Solomon (RS) code over GF(2(exp 8)) with interleave depth of 5 as the outermost code, a (7, 1/2) convolutional code as an inner code, and the CCSDS recommended (n, n-16) cyclic redundancy check (CRC) code as the innermost code, where n is the number of information bits plus 16 parity bits. The signal-to-noise ratio required at the receiver for a desired bit error rate is greatly reduced through the use of forward error correction techniques. Even greater coding gain is provided through the use of a concatenated coding scheme. Interleaving/deinterleaving is necessary to randomize burst errors which may appear at the input of the RS decoder. The burst error correction capability is increased in proportion to the interleave depth. The modular nature of the simulator allows for inclusion or exclusion of modules as needed. This paper describes the development and operation of the simulator, the verification of a C-language Reed-Solomon code, and the possibility of using Comdisco SPW(tm) as a tool for determining optimal error control schemes.
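The burst-correction arithmetic behind that interleaving claim is short: an RS(255,223) code corrects up to 16 symbol errors per codeword, so depth-5 block interleaving spreads a burst across five codewords and stretches the correctable burst length five-fold. In numbers:

    # Worked numbers for the concatenated scheme described above (standard CCSDS values).
    n, k = 255, 223          # RS code over GF(2^8): 8-bit symbols
    t = (n - k) // 2         # = 16 correctable symbol errors per codeword
    depth = 5                # interleave depth

    burst_symbols = t * depth          # a burst touches each codeword at most t times
    burst_bits = burst_symbols * 8
    print(f"correctable burst: {burst_symbols} symbols = {burst_bits} bits")  # 80 symbols / 640 bits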
NASA Astrophysics Data System (ADS)
Silvia, Devin W.
2013-12-01
The chemical evolution of the Universe is a complicated process with countless facets that define its properties over the course of time. In the early Universe, the metal-free first stars were responsible for originally introducing metals into the pristine gas left over from the Big Bang. Once these metals became prevalent, they forever altered the thermodynamics of the Universe. Understanding precisely where these metals originated, where they end up, and the conditions they experience along the way is of great interest in the astrophysical community. In this work, I have used numerical simulations as a means of understanding two separate phenomena related to the chemical evolution of the Universe. The first topic focuses on the question as to whether or not core-collapse supernovae in the high-redshift universe are capable of being "dust factories" for the production of galactic dust. To achieve this, I carried out idealized simulations of supernova ejecta clouds being impacted by reverse-shock blast waves. By post-processing the results of these simulations, I was able to estimate the amount of dust destruction that would occur due to thermal sputtering. In the most extreme scenarios, simulated with high relative velocities between the shock and the ejecta cloud and high gas metallicities, I find complete destruction for some grain species and only 44% dust mass survival for even the most robust species. This raises the question as to whether or not high-redshift supernovae can produce dust masses in sufficient excess of the ~1 Msun per event required to match observations of high-z galaxies. The second investigation was driven by the desire to find an answer to the missing baryon problem and a curiosity as to the impact that including a full non-equilibrium treatment of ionization chemistry has on simulations of the intergalactic medium. To address these questions, I have helped to develop Dengo, a new software package for solving complex chemical networks. Once this new package was integrated into Enzo, I carried out a set of cosmological simulations that served as both a test of the new solver and a confirmation that non-equilibrium ionization chemistry produces results that are drastically different from those that assume collisional ionization equilibrium. Although my analysis of these simulations is in its early stages, I find that the observable properties of the intergalactic medium change considerably. Continued efforts to run state-of-the-art simulations of the intergalactic medium using Dengo are warranted.
Tough high performance composite matrix
NASA Technical Reports Server (NTRS)
Pater, Ruth H. (Inventor); Johnston, Norman J. (Inventor)
1994-01-01
This invention is a semi-interpenetrating polymer network which includes a high performance thermosetting polyimide having a nadic end group acting as a crosslinking site and a high performance linear thermoplastic polyimide. Provided is an improved high temperature matrix resin which is capable of performing in the 200 to 300 C range. This resin has significantly improved toughness and microcracking resistance, excellent processability, mechanical performance, and moisture and solvent resistance.
Simulation, design, and testing of a high power collimator for the RDS-112 cyclotron.
Peeples, Johanna L; Stokely, Matthew H; Poorman, Michael C; Bida, Gerald T; Wieland, Bruce W
2015-03-01
A high power [F-18] fluoride target package for the RDS-112 cyclotron has been designed, tested, and commercially deployed. The upgrade includes the CF-1000 target, a 1.3 kW water target with an established commercial history on RDS-111/Eclipse cyclotrons, and a redesigned collimator with improved heat rejection capabilities. Conjugate heat transfer analyses were employed to both evaluate the existing collimator capabilities and design a suitable high current replacement.
Development of an Aerothermoelastic-Acoustics Simulation Capability of Flight Vehicles
NASA Technical Reports Server (NTRS)
Gupta, K. K.; Choi, S. B.; Ibrahim, A.
2010-01-01
A novel numerical, finite element based analysis methodology is presented in this paper suitable for accurate and efficient simulation of practical, complex flight vehicles. An associated computer code, developed in this connection, is also described in some detail. Thermal effects of high speed flow obtained from a heat conduction analysis are incorporated in the modal analysis which in turn affects the unsteady flow arising out of interaction of elastic structures with the air. Numerical examples pertaining to representative problems are given in much detail testifying to the efficacy of the advocated techniques. This is a unique implementation of temperature effects in a finite element CFD based multidisciplinary simulation analysis capability involving large scale computations.
The Modular Aero-Propulsion System Simulation (MAPSS) Users' Guide
NASA Technical Reports Server (NTRS)
Parker, Khary I.; Melcher, Kevin J.
2004-01-01
The Modular Aero-Propulsion System Simulation is a flexible turbofan engine simulation environment that provides the user with a platform to develop advanced control algorithms. It is capable of testing the performance of control designs on a validated and verified generic engine model. In addition, it is able to generate state-space linear models of the engine model to aid in controller design. The engine model used in MAPSS is a generic high-pressure-ratio, dual-spool, low-bypass, military-type, variable cycle turbofan engine with a digital controller. MAPSS is controlled by a graphical user interface (GUI), and this guide explains how to use it to take advantage of the capabilities of MAPSS.
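Generating state-space linear models from a nonlinear engine simulation generally means perturbing states and inputs about a trim point and building the Jacobians by finite differences; a generic sketch of that procedure (toy two-state dynamics, not the MAPSS engine model or its API) is:

    import numpy as np

    def f(x, u):
        # Toy nonlinear two-spool dynamics: x = spool speeds, u = fuel flow (assumed model).
        return np.array([-0.5 * x[0] + 0.2 * x[1] + 0.8 * u[0],
                         0.1 * x[0] - 0.7 * x[1] ** 1.1 + 0.3 * u[0]])

    def linearize(f, x0, u0, eps=1e-6):
        # Central differences give A = df/dx and B = df/du at the trim point (x0, u0).
        n, m = len(x0), len(u0)
        A = np.zeros((n, n)); B = np.zeros((n, m))
        for i in range(n):
            dx = np.zeros(n); dx[i] = eps
            A[:, i] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
        for j in range(m):
            du = np.zeros(m); du[j] = eps
            B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
        return A, B

    A, B = linearize(f, x0=np.array([1.0, 1.0]), u0=np.array([0.5]))
    print("A =\n", A, "\nB =\n", B)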
Additions and improvements to the high energy density physics capabilities in the FLASH code
NASA Astrophysics Data System (ADS)
Lamb, D. Q.; Flocke, N.; Graziani, C.; Tzeferacos, P.; Weide, K.
2016-10-01
FLASH is an open source, finite-volume Eulerian, spatially adaptive radiation magnetohydrodynamics code that has the capabilities to treat a broad range of physical processes. FLASH performs well on a wide range of computer architectures, and has a broad user base. Extensive high energy density physics (HEDP) capabilities have been added to FLASH to make it an open toolset for the academic HEDP community. We summarize these capabilities, emphasizing recent additions and improvements. In particular, we showcase the ability of FLASH to simulate the Faraday Rotation Measure produced by the presence of magnetic fields; and proton radiography, proton self-emission, and Thomson scattering diagnostics with and without the presence of magnetic fields. We also describe several collaborations with the academic HEDP community in which FLASH simulations were used to design and interpret HEDP experiments. This work was supported in part at the University of Chicago by the DOE NNSA ASC through the Argonne Institute for Computing in Science under field work proposal 57789; and the NSF under Grant PHY-0903997.
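The Faraday rotation measure diagnostic mentioned above integrates electron density times the line-of-sight magnetic field, RM = 0.812 * integral(n_e * B_par * dl) in rad m^-2 with n_e in cm^-3, B in microgauss, and dl in pc; a direct numerical version of that integral over simulation cells (synthetic profiles, not FLASH output) is:

    import numpy as np

    # Synthetic line of sight through a simulated plasma (assumed profiles).
    n_cells = 200
    dl_pc = np.full(n_cells, 0.5)                         # path length per cell (pc)
    n_e = 0.03 * np.ones(n_cells)                         # electron density (cm^-3)
    B_par = 2.0 * np.sin(np.linspace(0, np.pi, n_cells))  # line-of-sight B (microgauss)

    RM = 0.812 * np.sum(n_e * B_par * dl_pc)              # rad m^-2
    print(f"rotation measure: {RM:.2f} rad/m^2")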
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bland, Arthur S Buddy; Hack, James J; Baker, Ann E
Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing, supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science. This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools and resources for next-generation systems.
Clark, D. S.; Weber, C. R.; Milovich, J. L.; ...
2016-03-14
In order to achieve the several hundred Gbar stagnation pressures necessary for inertial confinement fusion ignition, implosion experiments on the National Ignition Facility (NIF) [E. I. Moses et al., Phys. Plasmas 16, 041006 (2009)] require the compression of deuterium-tritium fuel layers by a convergence ratio as high as forty. Such high convergence implosions are subject to degradation by a range of perturbations, including the growth of small-scale defects due to hydrodynamic instabilities, as well as longer scale modulations due to radiation flux asymmetries in the enclosing hohlraum. Due to the broad range of scales involved, and also the genuinely three-dimensional (3D) character of the flow, accurately modeling NIF implosions remains at the edge of current simulation capabilities. This study describes the current state of progress of 3D capsule-only simulations of NIF implosions aimed at accurately describing the performance of specific NIF experiments. Current simulations include the effects of hohlraum radiation asymmetries, capsule surface defects, the capsule support tent and fill tube, and use a grid resolution shown to be converged in companion two-dimensional simulations. The results of detailed simulations of low foot implosions from the National Ignition Campaign are contrasted against results for more recent high foot implosions. While the simulations suggest that low foot performance was dominated by ablation front instability growth, especially the defect seeded by the capsule support tent, high foot implosions appear to be dominated by hohlraum flux asymmetries, although the support tent still plays a significant role. Finally, for both implosion types, the simulations show reasonable, though not perfect, agreement with the data and suggest that a reliable predictive capability is developing to guide future implosions toward ignition.
A Summary of Proceedings for the Advanced Deployable Day/Night Simulation Symposium
2009-07-01
The Advanced Deployable Day/Night Simulation (ADDNS) Technology Demonstration Project was initiated to design, develop, and deliver transportable visual simulations that jointly provide night-vision and high-resolution daylight capability. Presenters at the symposium included Dr. Richard Wildes (York University), Mr. Vitaly Zholudev (Department of Computer Science, York University), and Mr. X. Zhu (Neptec Design Group), among others.
High fidelity studies of exploding foil initiator bridges, Part 2: Experimental results
NASA Astrophysics Data System (ADS)
Neal, William; Bowden, Mike
2017-01-01
Simulations of high voltage detonators, such as Exploding Bridgewire (EBW) and Exploding Foil Initiators (EFI), have historically been simple, often empirical, one-dimensional models capable of predicting parameters such as current, voltage, and in the case of EFIs, flyer velocity. Experimental methods have correspondingly generally been limited to the same parameters. With the advent of complex, first principles magnetohydrodynamic codes such as ALEGRA MHD, it is now possible to simulate these components in three dimensions and predict a greater range of parameters than before. A significant improvement in experimental capability was therefore required to ensure these simulations could be adequately verified. In this second paper of a three part study, data is presented from a flexible foil EFI header experiment. This study has shown that there is significant bridge expansion before the time of peak voltage and that heating within the bridge material is spatially affected by the microstructure of the metal foil.
NASA Technical Reports Server (NTRS)
Kikuchi, Hideaki; Kalia, Rajiv K.; Nakano, Aiichiro; Vashishta, Priya; Shimojo, Fuyuki; Saini, Subhash
2003-01-01
Scalability of a low-cost, Intel Xeon-based, multi-Teraflop Linux cluster is tested for two high-end scientific applications: classical atomistic simulation based on the molecular dynamics method, and quantum mechanical calculation based on density functional theory. These scalable parallel applications use space-time multiresolution algorithms and feature computational-space decomposition, wavelet-based adaptive load balancing, and space-filling-curve-based data compression for scalable I/O. Comparative performance tests are performed on a 1,024-processor Linux cluster and a conventional higher-end parallel supercomputer, the 1,184-processor IBM SP4. The results show that the performance of the Linux cluster is comparable to that of the SP4. We also study various effects, such as the sharing of memory and L2 cache among processors, on the performance.
Unified Approach to Modeling and Simulation of Space Communication Networks and Systems
NASA Technical Reports Server (NTRS)
Barritt, Brian; Bhasin, Kul; Eddy, Wesley; Matthews, Seth
2010-01-01
Network simulator software tools are often used to model the behaviors and interactions of applications, protocols, packets, and data links in terrestrial communication networks. Other software tools that model the physics, orbital dynamics, and RF characteristics of space systems have matured to allow for rapid, detailed analysis of space communication links. However, the absence of a unified toolset that integrates the two modeling approaches has encumbered the systems engineers tasked with the design, architecture, and analysis of complex space communication networks and systems. This paper presents the unified approach and describes the motivation, challenges, and our solution - the customization of the network simulator to integrate with astronautical analysis software tools for high-fidelity end-to-end simulation. Keywords: space; communication; systems; networking; simulation; modeling; QualNet; STK; integration; space networks.
NASA Astrophysics Data System (ADS)
Farina, Simone; Thepsonti, Thanongsak; Ceretti, Elisabetta; Özel, Tugrul
2011-05-01
Titanium alloys offer superb properties in strength, corrosion resistance and biocompatibility and are commonly utilized in medical devices and implants. Micro-end milling is a direct and rapid fabrication method for manufacturing medical devices and implants in titanium alloys. Process performance and quality depend upon an understanding of the relationship between cutting parameters, forces, and the resultant tool deflections in order to avoid tool breakage. For this purpose, FE simulations of chip formation during micro-end milling of Ti-6Al-4V alloy with an ultra-fine grain solid carbide two-flute micro-end mill are investigated using DEFORM software. At first, specific forces in the tangential and radial directions of cutting during micro-end milling for varying feed advance and rotational speeds have been determined using designed FE simulations of the chip formation process. Later, these forces are applied to the micro-end mill geometry along the axial depth of cut in a 3D analysis in ABAQUS. Consequently, 3D distributions of tool deflection and von Mises stress are determined. These analyses will aid in establishing integrated multi-physics process models for high performance micro-end milling and represent a leap forward in process improvement.
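The tool-deflection concern above can be bounded with a simple cantilever estimate before any FE run: for a tip force F on a tool of effective diameter d and overhang L, delta = F*L^3 / (3*E*I) with I = pi*d^4 / 64; a quick check (all micro-mill numbers assumed for illustration) is:

    import numpy as np

    F = 2.0                    # resultant cutting force at the tip (N), assumed
    L = 3e-3                   # tool overhang (m), assumed
    d = 0.5e-3                 # effective tool diameter (m), assumed
    E = 600e9                  # Young's modulus of solid carbide (Pa)

    I = np.pi * d**4 / 64      # area moment of inertia of a round shank
    delta = F * L**3 / (3 * E * I)
    print(f"tip deflection ~ {delta * 1e6:.1f} um")   # roughly 10 um for these numbers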
Remote Observations of Reentering Spacecraft Including the Space Shuttle Orbiter
NASA Technical Reports Server (NTRS)
Horvath, Thomas J.; Cagle, Melinda F.; Grinstead, Jay H.; Gibson, David M.
2013-01-01
Flight measurement is a critical phase in the development, validation, and certification processes of technologies destined for future civilian and military operational capabilities. This paper focuses on several recent NASA-sponsored remote observations that have provided unique engineering and scientific insights into reentry vehicle flight phenomenology and performance that could not necessarily be obtained with more traditional instrumentation methods such as onboard discrete surface sensors. The missions highlighted include multiple spatially resolved infrared observations of the NASA Space Shuttle Orbiter during hypersonic reentry from 2009 to 2011, and emission spectroscopy of comparatively small-sized sample return capsules returning from exploration missions. Emphasis has been placed upon identifying the challenges associated with these remote sensing missions, with a focus on end-to-end aspects that include the initial science objective, selection of the appropriate imaging platform and instrumentation suite, target flight path analysis and acquisition strategy, pre-mission simulations to optimize sensor configuration, and logistics and communications during the actual observation. Explored are collaborative opportunities and technology investments required to develop a next-generation quantitative imaging system (i.e., an intelligent sensor and platform) with greater capability, which could more affordably support cross-cutting civilian and military flight test needs.
Wideband monolithically integrated front-end subsystems and components
NASA Astrophysics Data System (ADS)
Mruk, Joseph Rene
This thesis presents the analysis, design, and measurements of passive, monolithically integrated, wideband recta-coax and printed circuit board front-end components. Monolithic fabrication of antennas, impedance transformers, filters, and transitions lowers manufacturing costs by reducing assembly time and enhances performance by removing connectors and cabling between the devices. Computational design, fabrication, and measurements are used to demonstrate the capabilities of these front-end assemblies. Two-arm wideband planar log-periodic antennas fed using a horizontal feed that allows for filters and impedance transformers to be readily fabricated within the radiating region of the antenna are demonstrated. At microwave frequencies, low-cost printed circuit board processes are typically used to produce planar devices. A 1.8 to 11 GHz two-arm planar log-periodic antenna is designed with a monolithically integrated impedance transformer. Band rejection methods based on modifying the antenna aperture, use of an integrated filter, and the application of both methods are investigated, with realized gain suppressions of over 25 dB achieved. The ability of standard circuit board technology to fabricate millimeter-wave devices up to 110 GHz is severely limited. Thin dielectrics are required to prevent the excitation of higher order modes in the microstrip substrate. Fabricating the thin line widths required for the antenna aperture also becomes prohibitively challenging. Surface micro-machining, typically used in the fabrication of MEMS devices, is capable of producing the extremely small features that can be used to fabricate antennas extending through W-band. A directly RF-fed 18 to 110 GHz planar log-periodic antenna is developed. The antenna is fabricated with an integrated impedance transformer and additional transitions for measurement characterization. Singly terminated low-loss wideband millimeter-wave filters operating over V- and W-band are developed. High quality performance of an 18 to 100 GHz front-end is realized by dividing the single instantaneous antenna into two apertures operating from 18 to 50 and 50 to 100 GHz. Each channel features an impedance transformer, low-pass (low-frequency) or band-pass (high-frequency) filter, and grounded CPW launch. This dual-aperture front-end demonstrates that micromachining technology is now capable of fabricating broadband millimeter-wave components with a high degree of integration.
Schmid, Margareta; Zellweger, Ueli; Bosshard, Georg; Bopp, Matthias
2016-01-01
In Switzerland, the prevalence of medical end-of-life practices was assessed at the population level only once, in 2001, until an identical study was conducted in 2013/14. We aimed to compare the results of the 2001 and 2013 studies, with a special focus on shared decision-making and patients' decision-making capacity. Our study encompassed a 21.3% sample of deaths among residents of the German-speaking part of Switzerland aged 1 year or older. Of 4998 mailed questionnaires, 3173 (63.5%) were returned. All data were weighted to adjust for age- and sex-specific differences in response rates. Cases with at least one reported end-of-life practice increased significantly, from 74.5% (2001) to 82.3% (2013) of all deaths eligible for an end-of-life decision (p <0.001). In 51.2% of cases, at least two different end-of-life decisions were combined. Based on discussions with patients or relatives and otherwise expressed preferences of the patient, 76.5% (74.5-78.4%) of all cases with a reported medical end-of-life practice in 2013 (2001: 74.4%) relied on shared decision-making, varying from 79.8% (76.5-82.7%) among patients not at all capable of decision-making to 87.8% (85.0-90.2%) among fully capable patients. In contrast to the generally increasing trend, the prevalence of end-of-life practices discussed with fully capable patients decreased from 79.0% (75.3-82.3%) in 2001 to 73.2% (69.6-76.0%) in 2013 (p = 0.037). Despite the generally high incidence of end-of-life practices in Switzerland, there remains potential for further improvement in shared decision-making. Efforts to motivate physicians to involve patients and relatives may be a win-win situation.
Soleimani, Hamid; Drakakis, Emmanuel M
2017-06-01
Recent studies have demonstrated that calcium is a widespread intracellular ion that controls a wide range of temporal dynamics in the mammalian body. The simulation and validation of such studies against experimental data would benefit from a fast, large-scale simulation and modelling tool. This paper presents a compact and fully reconfigurable cellular calcium model capable of mimicking the Hopf bifurcation phenomenon and various nonlinear responses of biological calcium dynamics. The proposed cellular model is synthesized on a digital platform for a single unit and a network model. Hardware synthesis, physical implementation on FPGA, and theoretical analysis confirm that the proposed cellular model can mimic biological calcium behaviors with considerably low hardware overhead. The approach has the potential to speed up large-scale simulations of slow intracellular dynamics by sharing more cellular units in real time. To this end, various networks constructed by pipelining 10k to 40k cellular calcium units are compared with an equivalent simulation run on a standard PC workstation. Results show that the cellular hardware model is, on average, 83 times faster than the CPU version.
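As a stand-in for the calcium equations (which the abstract does not give), the Hopf normal form below shows the kind of two-variable oscillator, integrated with cheap forward-Euler arithmetic, that maps naturally onto pipelined FPGA units. The equations and parameters are illustrative, not the paper's model.

```python
import numpy as np

def hopf_step(x, y, mu, omega, dt):
    """One forward-Euler step of the Hopf normal form.

    For mu > 0 trajectories settle onto a limit cycle of radius
    sqrt(mu); for mu < 0 they decay to rest, which is the bifurcation
    the calcium model is said to reproduce. (Stand-in equations, not
    the paper's calcium dynamics.)
    """
    r2 = x * x + y * y
    dx = (mu - r2) * x - omega * y
    dy = omega * x + (mu - r2) * y
    return x + dt * dx, y + dt * dy

x, y = 0.1, 0.0
for _ in range(20000):
    x, y = hopf_step(x, y, mu=1.0, omega=2.0, dt=1e-3)
print(f"limit-cycle radius ~ {np.hypot(x, y):.3f} (expect 1.0)")
```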
Accelerating gravitational microlensing simulations using the Xeon Phi coprocessor
NASA Astrophysics Data System (ADS)
Chen, B.; Kantowski, R.; Dai, X.; Baron, E.; Van der Mark, P.
2017-04-01
Recently Graphics Processing Units (GPUs) have been used to speed up very CPU-intensive gravitational microlensing simulations. In this work, we use the Xeon Phi coprocessor to accelerate such simulations and compare its performance on a microlensing code with that of NVIDIA's GPUs. For the selected set of parameters evaluated in our experiment, we find that the speedup by Intel's Knights Corner coprocessor is comparable to that by NVIDIA's Fermi family of GPUs with compute capability 2.0, but less significant than GPUs with higher compute capabilities such as the Kepler. However, the very recently released second generation Xeon Phi, Knights Landing, is about 5.8 times faster than the Knights Corner, and about 2.9 times faster than the Kepler GPU used in our simulations. We conclude that the Xeon Phi is a very promising alternative to GPUs for modern high performance microlensing simulations.
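Microlensing magnification maps are conventionally built by inverse ray shooting, the brute-force, highly parallel kernel that GPUs and Xeon Phis accelerate. The abstract does not detail the code's algorithm, so the sketch below should be read as an illustration of the standard technique, not the authors' implementation.

```python
import numpy as np

def magnification_map(lens_xy, lens_m, n_rays=1000, half_width=5.0, n_bins=100):
    """Toy inverse ray shooting: deflect a grid of image-plane rays by
    point-mass lenses and histogram where they land in the source plane.
    Ray density per source-plane pixel is proportional to magnification.
    """
    s = np.linspace(-half_width, half_width, n_rays)
    x1, x2 = np.meshgrid(s, s)                      # image-plane ray grid
    y1, y2 = x1.copy(), x2.copy()
    for (lx, ly), m in zip(lens_xy, lens_m):
        d1, d2 = x1 - lx, x2 - ly
        r2 = d1 * d1 + d2 * d2 + 1e-12              # avoid divide-by-zero
        y1 -= m * d1 / r2                           # point-mass deflection
        y2 -= m * d2 / r2
    counts, _, _ = np.histogram2d(y1.ravel(), y2.ravel(),
                                  bins=n_bins,
                                  range=[[-2, 2], [-2, 2]])
    return counts

mag = magnification_map([(0.0, 0.0)], [1.0])        # single unit-mass lens
print("peak ray count per source pixel:", int(mag.max()))
```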
NASA Technical Reports Server (NTRS)
Kremic, Tibor; Vento, Dan; Lalli, Nick; Palinski, Timothy
2014-01-01
Science, technology, and planetary mission communities have a growing interest in components and systems that are capable of working in extreme (high) temperature and pressure conditions. Terrestrial applications range from scientific research to aerospace, defense, automotive systems, energy storage and power distribution, deep mining, and others. As the target environments become increasingly extreme, capabilities to develop and test the sensors and systems designed to operate in such environments will be required. An application of particular importance to the planetary science community is the ability for a robotic lander to survive on the Venus surface, where pressures are nearly 100 times that of Earth and temperatures approach 500 °C. The scientific importance and relevance of Venus missions are stated in the current Planetary Decadal Survey. Further, several missions to Venus were proposed in the most recent Discovery call. Despite this interest, the ability to accurately simulate Venus atmospheric and surface conditions at a scale that can test and validate instruments and spacecraft systems has been lacking. This paper discusses and compares the capabilities known to exist within and outside the United States to simulate the extreme environmental conditions found on terrestrial or planetary surfaces, including the Venus atmosphere and surface. The paper then focuses on the recent additional capability found in the NASA Glenn Extreme Environment Rig (GEER). The GEER, located at the NASA Glenn Research Center in Cleveland, Ohio, is designed to simulate not only the temperature and pressure extremes described, but can also accurately reproduce the atmospheric compositions of bodies in the solar system, including those with acidic and hazardous elements. GEER capabilities and characteristics are described, along with operational considerations relevant to potential users. The paper presents initial operating results and concludes with a sampling of investigations or tests that have been requested or are expected.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kochunas, Brendan; Collins, Benjamin; Stimpson, Shane
This paper describes the methodology developed and implemented in the Virtual Environment for Reactor Applications Core Simulator (VERA-CS) to perform high-fidelity, pressurized water reactor (PWR), multicycle, core physics calculations. Depletion of the core with pin-resolved power and nuclide detail is a significant advance in the state of the art for reactor analysis, providing the level of detail necessary to address the problems of the U.S. Department of Energy Nuclear Reactor Simulation Hub, the Consortium for Advanced Simulation of Light Water Reactors (CASL). VERA-CS has three main components: the neutronics solver MPACT, the thermal-hydraulic (T-H) solver COBRA-TF (CTF), and the nuclide transmutation solver ORIGEN. This paper focuses on MPACT and provides an overview of the resonance self-shielding methods, macroscopic-cross-section calculation, two-dimensional/one-dimensional (2-D/1-D) transport, nuclide depletion, T-H feedback, and other supporting methods representing a minimal set of the capabilities needed to simulate high-fidelity models of a commercial nuclear reactor. Results are presented from the simulation of a model of the first cycle of Watts Bar Unit 1. The simulation is within 16 parts per million boron (ppmB) reactivity for all state points compared to cycle measurements, with an average reactivity bias of <5 ppmB for the entire cycle. Comparisons to cycle 1 flux map data are also provided, and the average 2-D root-mean-square (rms) error during cycle 1 is 1.07%. To demonstrate the multicycle capability, a state point at beginning of cycle (BOC) 2 was also simulated and compared to plant data. The comparison of the cycle 2 BOC state has a reactivity difference of +3 ppmB from measurement, and the 2-D rms of the comparison in the flux maps is 1.77%. Lastly, these results provide confidence in VERA-CS's capability to perform high-fidelity calculations for practical PWR reactor problems.
Satellite-based Analysis of CO Variability over the Amazon Basin
NASA Astrophysics Data System (ADS)
Deeter, M. N.; Emmons, L. K.; Martinez-Alonso, S.; Tilmes, S.; Wiedinmyer, C.
2017-12-01
Pyrogenic emissions from the Amazon Basin exert significant influence on both climate and air quality but are highly variable from year to year. The ability of models to simulate the impact of biomass burning emissions on downstream atmospheric concentrations depends on (1) the quality of surface flux estimates (i.e., emissions inventories), (2) model dynamics (e.g., horizontal winds, large-scale convection and mixing) and (3) the representation of atmospheric chemical processes. With an atmospheric lifetime of a few months, carbon monoxide (CO) is a commonly used diagnostic for biomass burning. CO products are available from several satellite instruments and allow analyses of CO variability over extended regions such as the Amazon Basin with useful spatial and temporal sampling characteristics. The MOPITT ('Measurements of Pollution in the Troposphere') instrument was launched on the NASA Terra platform near the end of 1999 and is still operational. MOPITT is uniquely capable of measuring tropospheric CO concentrations using both thermal-infrared and near-infrared observations, resulting in the ability to independently retrieve lower- and upper-troposphere CO concentrations. We exploit the 18-year MOPITT record and related datasets to analyze the variability of CO over the Amazon Basin and evaluate simulations performed with the CAM-chem chemical transport model. We demonstrate that observed differences between MOPITT observations and model simulations provide important clues regarding emissions inventories, convective mixing and long-range transport.
Mothers’ Repartnering after a Nonmarital Birth
Bzostek, Sharon H.; McLanahan, Sara S.; Carlson, Marcia J.
2012-01-01
This paper examines the prevalence, predictors and outcomes of unmarried mothers’ repartnering patterns following a nonmarital birth. Results indicate that, within five years after a birth, approximately two-thirds of unmarried mothers ended their relationship with the focal child’s biological father, and over half of these mothers entered new partnerships. Among those who repartnered, 60 percent of mothers formed unions with men with higher economic capabilities than their former partners, 20 percent formed unions with men with similar capabilities, and 20 percent formed unions with men with lower capabilities. This pattern holds for both nonresidential and coresidential unions. Our findings are consistent with marriage market, learning, and evolutionary biology theories about union formation, and they provide support for qualitative evidence that unmarried mothers have high standards for new partners. While many mothers are able to successfully find new partners with better economic capabilities, many other mothers remain unpartnered, likely due (at least in part) to the limited pool of potential partners with relatively high levels of economic capabilities. PMID:23015762
NASA Technical Reports Server (NTRS)
Collatz, G. James; Kawa, R.
2007-01-01
Progress in better determining CO2 sources and sinks will almost certainly rely on the utilization of more extensive and intensive CO2 and related observations, including those from satellite remote sensing. Use of advanced data requires improved modeling and analysis capability. Under NASA Carbon Cycle Science support, we seek to develop and integrate improved formulations for 1) atmospheric transport, 2) terrestrial uptake and release, 3) biomass and 4) fossil fuel burning, and 5) observational data analysis including inverse calculations. The transport modeling is based on meteorological data assimilation analyses from the Goddard Global Modeling and Assimilation Office. Use of assimilated met data enables model comparison to CO2 and other observations across a wide range of scales of variability. In this presentation we focus on the short end of the temporal variability spectrum: hourly to synoptic to seasonal. Using CO2 fluxes at varying temporal resolution from the SiB2 and CASA biosphere models, we examine the model's ability to simulate CO2 variability in comparison to observations at different times, locations, and altitudes. We find that the model can resolve much of the variability in the observations, although there are limits imposed by the vertical resolution of boundary layer processes. The influence of key process representations is inferred. The high degree of fidelity in these simulations leads us to anticipate the incorporation of real-time, highly resolved observations into a multiscale carbon cycle analysis system that will begin to bridge the gap between top-down and bottom-up flux estimation, a primary focus of NACP.
NASA Technical Reports Server (NTRS)
Follen, Gregory; auBuchon, M.
2000-01-01
Within NASA's High Performance Computing and Communication (HPCC) program, NASA Glenn Research Center is developing an environment for the analysis/design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structures, and heat transfer, along with the concept of numerical zooming between zero-dimensional and one-, two-, and three-dimensional component engine codes. In addition, the NPSS is refining the computing and communication technologies necessary to capture complex physical processes in a timely and cost-effective manner. The vision for NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Of the different technology areas that contribute to the development of the NPSS environment, the subject of this paper is numerical zooming between an NPSS engine simulation and higher fidelity representations of the engine components (fan, compressor, burner, turbines, etc.). What follows is a description of successfully zooming one-dimensional (row-by-row) high-pressure compressor analysis results back to a zero-dimensional NPSS engine simulation, and a discussion of the results illustrated using an advanced data visualization tool. This type of high-fidelity system-level analysis, made possible by the zooming capability of the NPSS, will greatly improve the capability of the engine system simulation and increase the level of virtual testing conducted prior to committing the design to hardware.
iClimate: a climate data and analysis portal
NASA Astrophysics Data System (ADS)
Goodman, P. J.; Russell, J. L.; Merchant, N.; Miller, S. J.; Juneja, A.
2015-12-01
We will describe a new climate data and analysis portal called iClimate that facilitates direct comparisons between available climate observations and climate simulations. Modeled after the successful iPlant Collaborative Discovery Environment (www.iplantcollaborative.org) that allows plant scientists to trade and share environmental, physiological and genetic data and analyses, iClimate provides an easy-to-use platform for large-scale climate research, including the storage, sharing, automated preprocessing, analysis and high-end visualization of large and often disparate observational and model datasets. iClimate will promote data exploration and scientific discovery by providing: efficient and high-speed transfer of data from nodes around the globe (e.g. PCMDI and NASA); standardized and customized data/model metrics; efficient subsampling of datasets based on temporal period, geographical region or variable; and collaboration tools for sharing data, workflows, analysis results, and data visualizations with collaborators or with the community at large. We will present iClimate's capabilities, and demonstrate how it will simplify and enhance the ability to do basic or cutting-edge climate research by professionals, laypeople and students.
Detection and Attribution of Regional Climate Change
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bala, G; Mirin, A
2007-01-19
We developed a high-resolution global coupled modeling capability to perform breakthrough studies of regional climate change. The atmospheric component in our simulation uses a 1° latitude × 1.25° longitude grid, which is the finest resolution ever used for the NCAR coupled climate model CCSM3. Substantial testing and slight retuning were required to obtain an acceptable control simulation. The major accomplishment is the validation of this new high-resolution configuration of CCSM3. There are major improvements in our simulation of the surface wind stress and sea ice thickness distribution in the Arctic. Surface wind stress and ocean circulation in the Antarctic Circumpolar Current are also improved. Our results demonstrate that the FV version of the CCSM coupled model is a state-of-the-art climate model whose simulation capabilities are in the class of those used for IPCC assessments. We have also provided 1000 years of model data to the Scripps Institution of Oceanography to estimate the natural variability of stream flow in California. In the future, our global model simulations will provide boundary data to a high-resolution mesoscale model that will be used at LLNL. The mesoscale model will dynamically downscale the GCM climate to regional scale on climate time scales.
Interplanetary Transit Simulations Using the International Space Station
NASA Technical Reports Server (NTRS)
Charles, J. B.; Arya, Maneesh
2010-01-01
It has been suggested that the International Space Station (ISS) be utilized to simulate the transit portion of long-duration missions to Mars and near-Earth asteroids (NEA). The ISS offers a unique environment for such simulations, providing researchers with a high-fidelity platform to study, enhance, and validate technologies and countermeasures for these long-duration missions. From a space life sciences perspective, two major categories of human research activities have been identified that will harness the various capabilities of the ISS during the proposed simulations. The first category includes studies that require the use of the ISS, typically because of the need for prolonged weightlessness. The ISS is currently the only available platform capable of providing researchers with access to a weightless environment over an extended duration. In addition, the ISS offers high fidelity for other fundamental space environmental factors, such as isolation, distance, and accessibility. The second category includes studies that do not require use of the ISS in the strictest sense, but can exploit its use to maximize their scientific return more efficiently and productively than in ground-based simulations. In addition to conducting Mars and NEA simulations on the ISS, increasing the current increment duration on the ISS from 6 months to a longer duration will provide opportunities for enhanced and focused research relevant to long-duration Mars and NEA missions. Although it is currently believed that increasing the ISS crew increment duration to 9 or even 12 months will pose little additional risk to crewmembers, additional medical monitoring capabilities may be required beyond those currently used for the ISS operations. The use of the ISS to simulate aspects of Mars and NEA missions seems practical, and it is recommended that planning begin soon, in close consultation with all international partners.
High power transcranial beam steering for ultrasonic brain therapy
Pernot, Mathieu; Aubry, Jean-François; Tanter, Mickaël; Thomas, Jean-Louis; Fink, Mathias
2003-01-01
A sparse phased array is specially designed for non-invasive ultrasound transskull brain therapy. The array is made of 200 single elements corresponding to a new generation of high power transducers developed in collaboration with Imasonic (Besançon, France). Each element has a surface of 0.5 cm² and works at a 0.9 MHz central frequency with a maximum 20 W/cm² intensity on the transducer surface. In order to optimize the steering capabilities of the array, several transducer distributions on a spherical surface are simulated: hexagonal, annular, and quasi-random distributions. Using a quasi-random distribution significantly reduces the grating lobes. Furthermore, the simulations show the capability of the quasi-random array to electronically move the focal spot in the vicinity of the geometrical focus (up to ±15 mm). Based on the simulation study, the array is constructed and tested. The skull aberrations are corrected by using a time reversal mirror with amplitude correction achieved thanks to an implantable hydrophone, and a sharp focus is obtained through a human skull. Several lesions are induced in fresh liver and brain samples through human skulls, demonstrating the accuracy and the steering capabilities of the system. PMID:12974575
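Electronic steering of the focal spot amounts to driving each element with a phase that compensates its path length to the desired focus; time-reversal focusing generalizes this by recording and conjugating the field actually propagated through the skull. The sketch below assumes free-space propagation and a planar quasi-random layout, both simplifications relative to the paper's spherical array and aberration correction.

```python
import numpy as np

C = 1500.0          # sound speed in water/tissue, m/s
F = 0.9e6           # driving frequency, Hz (as in the abstract)
K = 2 * np.pi * F / C

rng = np.random.default_rng(0)
# Quasi-random element positions on a plane stand in for the paper's
# quasi-random spherical distribution (a simplification).
elems = rng.uniform(-0.06, 0.06, size=(200, 2))
elems = np.column_stack([elems, np.zeros(200)])     # elements on z = 0

def focus_phases(elements, focus):
    """Phase each element so all contributions arrive in phase at
    `focus`: phi_i = k * |r_i - r_focus| (free-space assumption)."""
    d = np.linalg.norm(elements - focus, axis=1)
    return np.mod(K * d, 2 * np.pi)

# Move the focal spot 15 mm off the geometric axis, as in the abstract.
phases = focus_phases(elems, np.array([0.015, 0.0, 0.12]))
print("first five element phases (rad):", np.round(phases[:5], 2))
```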
Test Results From a Simulated High-Voltage Lunar Power Transmission Line
NASA Technical Reports Server (NTRS)
Birchenough, Arthur; Hervol, David
2008-01-01
The Alternator Test Unit (ATU) in the Lunar Power System Facility (LPSF) located at the NASA Glenn Research Center (GRC) in Cleveland, Ohio was modified to simulate high-voltage transmission capability. The testbed simulated a 1 km transmission cable length from the ATU to the LPSF using resistors and inductors installed between the distribution transformers. Power factor correction circuitry was used to compensate for the reactance of the distribution system to improve the overall power factor. This test demonstrated that a permanent magnet alternator can successfully provide high-frequency ac power to a lunar facility located at a distance.
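The power-factor-correction sizing used in such a testbed follows a standard calculation: find the reactive power to cancel and convert it to a shunt capacitance. The values below are illustrative, not the LPSF's actual parameters.

```python
import math

def correction_capacitance(p_watts, pf_old, pf_new, v_rms, f_hz):
    """Shunt capacitance that raises a load's power factor from pf_old
    to pf_new: Qc = P * (tan(acos(pf_old)) - tan(acos(pf_new))),
    C = Qc / (2*pi*f*V^2). Standard sizing formula; values below are
    illustrative, not the testbed's."""
    q_c = p_watts * (math.tan(math.acos(pf_old)) -
                     math.tan(math.acos(pf_new)))
    return q_c / (2 * math.pi * f_hz * v_rms ** 2)

# e.g. a 2 kW load at 0.70 PF corrected to 0.95 on a 120 V, 1 kHz bus
c = correction_capacitance(2000.0, 0.70, 0.95, 120.0, 1000.0)
print(f"required capacitance: {c*1e6:.1f} uF")
```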
Vortex-flow aerodynamics - An emerging design capability
NASA Technical Reports Server (NTRS)
Campbell, J. F.
1981-01-01
Promising current theoretical and simulation developments in the field of leading-edge vortex-generating delta, arrow, and ogival wings are reported, along with the history of theory and experiment leading to them. The effects of wing slenderness, leading-edge nose radius, Mach number and incidence variations, and planform on the onset of vortex generation and the redistribution of aerodynamic loads are considered. The range of design possibilities in this field is consequential for the future development of strategic aircraft, supersonic transports, and commercial cargo aircraft, which will possess low-speed, high-lift capability by virtue of leading-edge vortex generation and control without recourse to heavy and expensive leading-edge high-lift devices and compound airfoils. Attention is given to interactive graphics simulation devices recently developed.
Global Weather Prediction and High-End Computing at NASA
NASA Technical Reports Server (NTRS)
Lin, Shian-Jiann; Atlas, Robert; Yeh, Kao-San
2003-01-01
We demonstrate current capabilities of the NASA finite-volume General Circulation Model in high-resolution global weather prediction and discuss its development path in the foreseeable future. This model can be regarded as a prototype of a future NASA Earth modeling system intended to unify development activities cutting across various disciplines within the NASA Earth Science Enterprise.
NASA Technical Reports Server (NTRS)
Raiszadeh, Behzad; Queen, Eric M.; Hotchko, Nathaniel J.
2009-01-01
A capability to simulate trajectories of multiple interacting rigid bodies has been developed, tested and validated. This capability uses the Program to Optimize Simulated Trajectories II (POST 2). The standard version of POST 2 allows trajectory simulation of multiple bodies without force interaction. In the current implementation, the force interaction between the parachute and the suspended bodies has been modeled using flexible lines, allowing accurate trajectory simulation of the individual bodies in flight. The POST 2 multibody capability is intended to be general purpose and applicable to any parachute entry trajectory simulation. This research paper explains the motivation for multibody parachute simulation, discusses implementation methods, and presents validation of this capability.
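The flexible-line force interaction can be idealized as a tension-only spring-damper between attachment points. This is a common modeling choice, assumed here for illustration since the abstract does not give POST2's actual line model.

```python
import numpy as np

def line_force(r_a, r_b, v_a, v_b, k, c, rest_len):
    """Tension-only spring-damper force on body A from a flexible line
    to body B. Returns zero when the line is slack (common idealization,
    not necessarily the POST2 formulation)."""
    d = r_b - r_a
    length = np.linalg.norm(d)
    if length <= rest_len:
        return np.zeros(3)                      # slack line carries no load
    u = d / length                              # unit vector from A to B
    stretch_rate = np.dot(v_b - v_a, u)         # rate of elongation
    tension = k * (length - rest_len) + c * stretch_rate
    return max(tension, 0.0) * u                # a line cannot push

# Parachute 12 m above the payload on a 10 m riser (illustrative values):
f = line_force(np.zeros(3), np.array([0.0, 0.0, 12.0]),
               np.zeros(3), np.zeros(3), k=5e3, c=50.0, rest_len=10.0)
print("force on payload (N):", f)
```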
End-to-end System Performance Simulation: A Data-Centric Approach
NASA Astrophysics Data System (ADS)
Guillaume, Arnaud; Laffitte de Petit, Jean-Luc; Auberger, Xavier
2013-08-01
In the early days of the space industry, the feasibility of Earth observation missions was directly driven by what could be achieved by the satellite. It was clear to everyone that the ground segment would be able to deal with the small amount of data sent by the payload. Over the years, the amount of data processed by spacecraft has increased drastically, placing more and more constraints on ground segment performance, in particular on timeliness. Nowadays, many space systems require high data throughputs and short response times, with information coming from multiple sources and involving complex algorithms. It has become necessary to perform thorough end-to-end analyses of the full system in order to optimise its cost and efficiency, and sometimes even to assess the feasibility of the mission. This paper presents a novel framework developed by Astrium Satellites in order to meet these needs of timeliness evaluation and optimisation. This framework, named ETOS (for "End-to-end Timeliness Optimisation of Space systems"), provides a modelling process with associated tools, models and GUIs. These are integrated thanks to a common data model and suitable adapters, with the aim of building suitable space system simulators of the full end-to-end chain. A big challenge of such an environment is to integrate heterogeneous tools (each one being well-adapted to part of the chain) into a relevant timeliness simulation.
Hyperthermal Environments Simulator for Nuclear Rocket Engine Development
NASA Technical Reports Server (NTRS)
Litchford, Ron J.; Foote, John P.; Clifton, W. B.; Hickman, Robert R.; Wang, Ten-See; Dobson, Christopher C.
2011-01-01
An arc-heater driven hyperthermal convective environments simulator was recently developed and commissioned for long duration hot hydrogen exposure of nuclear thermal rocket materials. This newly established non-nuclear testing capability uses a high-power, multi-gas, wall-stabilized constricted arc-heater to produce high-temperature pressurized hydrogen flows representative of nuclear reactor core environments, excepting radiation effects, and is intended to serve as a low-cost facility for supporting non-nuclear developmental testing of high-temperature fissile fuels and structural materials. The resulting reactor environments simulator represents a valuable addition to the available inventory of non-nuclear test facilities and is uniquely capable of investigating and characterizing candidate fuel/structural materials, improving associated processing/fabrication techniques, and simulating reactor thermal hydraulics. This paper summarizes facility design and engineering development efforts and reports baseline operational characteristics as determined from a series of performance mapping and long duration capability demonstration tests. Potential follow-on developmental strategies are also suggested in view of the technical and policy challenges ahead. Keywords: Nuclear Rocket Engine, Reactor Environments, Non-Nuclear Testing, Fissile Fuel Development.
A tough high performance composite matrix
NASA Technical Reports Server (NTRS)
Pater, Ruth H. (Inventor); Johnston, Norman J. (Inventor)
1992-01-01
This invention is a semi-interpenetrating polymer network which includes a high performance thermosetting polyimide having a nadic end group acting as a crosslinking site and a high performance linear thermoplastic polyimide. An improved high temperature matrix resin is provided which is capable of performing in the 200 to 300 °C range. This resin has significantly improved toughness and microcracking resistance, excellent processability, mechanical performance, and moisture and solvent resistance.
Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hale, Richard Edward; Fugate, David L.; Cetiner, Sacit M.
2015-05-01
The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the full power reactor innovative small module design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.
Vector scattering analysis of TPF coronagraph pupil masks
NASA Astrophysics Data System (ADS)
Ceperley, Daniel P.; Neureuther, Andrew R.; Lieber, Michael D.; Kasdin, N. Jeremy; Shih, Ta-Ming
2004-10-01
Rigorous finite-difference time-domain (FDTD) electromagnetic simulation is used to simulate the scattering from prototypical pupil mask cross-section geometries and to quantify the differences from the normally assumed ideal on-off behavior. Shaped pupil plane masks are a promising technology for the TPF coronagraph mission. However, the stringent requirements placed on the optics require that the detailed behavior of the edge-effects of these masks be examined carefully. End-to-end optical system simulation is essential, and an important aspect is the polarization and cross-section dependent edge-effects which are the subject of this paper. Pupil plane masks are similar in many respects to photomasks used in the integrated circuit industry. Simulation capabilities such as the FDTD simulator TEMPEST, developed for analyzing polarization and intensity imbalance effects in nonplanar phase-shifting photomasks, offer a leg-up in analyzing coronagraph masks. However, the accuracy in magnitude and phase required for modeling a coronagraph system is extremely demanding, and previously inconsequential errors may be of the same order of magnitude as the physical phenomena under study. In this paper, the effects of thick masks, finite-conductivity metals, and various cross-section geometries on the transmission of pupil-plane masks are illustrated. Undercutting the edge shape of Cr masks improves the effective opening width to within λ/5 of the actual opening, but TE and TM polarizations require opposite compensations. The deviation from ideal is examined at the reference plane of the mask opening. Numerical errors in TEMPEST, such as numerical dispersion, perfectly matched layer reflections, and source haze are also discussed, along with techniques for mitigating their impacts.
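The core of any FDTD simulator such as TEMPEST is the leapfrog Yee update, in which staggered electric and magnetic fields advance alternately. The 1-D vacuum sketch below (normalized units, "magic" time step) illustrates only the scheme itself; TEMPEST's 3-D implementation, material models, and boundary treatments are far more involved.

```python
import numpy as np

# 1-D vacuum Yee/FDTD leapfrog: E and H live on staggered grids and
# update alternately. Normalized units with the 'magic' step c*dt/dx = 1,
# so a pulse travels one cell per step.
N = 400
ez = np.zeros(N)        # electric field at integer grid points
hy = np.zeros(N - 1)    # magnetic field at half-grid points

for n in range(300):
    hy += ez[1:] - ez[:-1]                       # H update (curl of E)
    ez[1:-1] += hy[1:] - hy[:-1]                 # E update (curl of H)
    ez[50] += np.exp(-((n - 30) / 10.0) ** 2)    # soft Gaussian source

print(f"pulse peak after 300 steps near cell {np.argmax(np.abs(ez))}")
```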
NASA Technical Reports Server (NTRS)
Mikhaylov, Rebecca; Kwack, Eug; Stegman, Matthew; Dawson, Douglas; Hoffman, Pamela
2015-01-01
NASA's SMAP Mission launched in January 2015 into a 685 km near-polar, sun-synchronous orbit. The SMAP instrument architecture incorporates an L-band radar and radiometer which share a common feedhorn and mesh reflector. The instrument rotates about the nadir axis at approximately 15 rpm, thereby providing a conically scanning wide swath antenna beam that is capable of achieving global coverage within three days. The radiometer and its associated electronics have tight thermal stability requirements in order to meet the required surface emittance measurement precision from space. Maintaining the thermal stabilities is quite challenging because the radiometer is located on a spinning platform that can either be in full sunlight or eclipse, and is thus exposed to a highly transient environment. Stability requirements were met by integrating a light-weight Expanded Polystyrene (EPS) radome into the design to prevent solar illumination of the feedhorn interior. The radome was painted white since the thermo-optical properties of bare sunlit EPS degrade rapidly over the three-year mission. Milling of the EPS and solvent within the white paint created cavities on the EPS surface, which may introduce localized hot spots, possibly violating the EPS glass transition temperature of 96 °C and leading to structural integrity concerns. A three-day thermal test was conducted in a vacuum chamber to verify survivability of the radome during a simulated no-spin fault condition at end of mission. A portable solar simulator illuminated the test article, and the beam irradiance was kept nearly constant during the entire 50 hour test, except during the first hour, which simulated the expected 79 °C on-orbit surface temperature of the radome. The test article survived based on the established pass criteria for three separate metrics: dimensional, optical property, and color. If any hot spots existed locally, they did not cause any observable permanent deformation when compared to pre- and post-test images. The test results increase confidence that there is a high probability that the radome will survive the worst-case scenario of a no-spin fault condition at the end of the mission.
The Influence of End-Stop Buffer Characteristics on the Severity of Suspension Seat End-Stop Impacts
NASA Astrophysics Data System (ADS)
Wu, X.; Griffin, M. J.
1998-08-01
Suspension seat end-stop impacts may be a source of increased risk of injury for the drivers of some machines and work vehicles, such as off-road vehicles. Most suspension seats use rubber buffers to reduce the severity of end-stop impacts, but a high magnitude of acceleration is still transmitted to drivers when an end-stop impact occurs. An experimental study has been conducted to investigate the effect of buffer stiffness and buffer damping on the severity of end-stop impacts. The results show that the end-stop impact performance of suspension seats with only bottom buffers can be improved by the use of both top and bottom buffers. The force-deflection characteristics of the rubber buffers had a significant influence on the severity of end-stop impacts. The optimum buffer has a medium, nearly linear stiffness acting over a long deflection, so that it is not compressed into its high-stiffness stage. It is shown theoretically that buffer damping is capable of significantly reducing the severity of end-stop impacts. However, since current rubber materials provide only low damping, alternative materials to those in current use, or passive or active damping devices, are required.
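A minimal way to capture the buffer behavior discussed above is a two-stage (nearly linear, then stiffening) force-deflection curve plus viscous damping. The parameters below are illustrative, not values from the study.

```python
def buffer_force(x, xdot, k_lin=30e3, x_knee=0.02, k_stiff=600e3, c=800.0):
    """Rubber end-stop buffer idealized as a two-stage spring plus
    viscous damping: nearly linear up to a knee deflection, then much
    stiffer once the rubber compresses solid. Illustrative parameters,
    not values from the study.

    x    : buffer compression (m), x >= 0 when in contact
    xdot : compression rate (m/s)
    """
    if x <= 0.0:
        return 0.0                              # not in contact
    if x <= x_knee:
        spring = k_lin * x                      # desirable linear stage
    else:
        spring = k_lin * x_knee + k_stiff * (x - x_knee)  # harsh stage
    return spring + c * max(xdot, 0.0)          # damp compression only

print(buffer_force(0.010, 0.5))   # within the linear stage
print(buffer_force(0.025, 0.5))   # driven into the stiff stage
```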
NASA Technical Reports Server (NTRS)
Arneson, Heather; Evans, Antony D.; Li, Jinhua; Wei, Mei Yueh
2017-01-01
Integrated Demand Management (IDM) is a near- to mid-term NASA concept that proposes to address mismatches in air traffic system demand and capacity by using strategic flow management capabilities to pre-condition demand into the more tactical Time-Based Flow Management System (TBFM). This paper describes an automated simulation capability to support IDM concept development. The capability closely mimics existing human-in-the-loop (HITL) capabilities, while automating both the human components and collaboration between operational systems, and speeding up the real-time aircraft simulations. Such a capability allows for parametric studies to be carried out that can inform the HITL simulations, identifying breaking points and parameter values at which significant changes in system behavior occur. The paper describes the initial validation of the automated simulation capability against results from previous IDM HITL experiments, quantifying the differences. The simulator is then used to explore the performance of the IDM concept under the simple scenario of a capacity constrained airport under a wide range of wind conditions.
SpaceCubeX: A Framework for Evaluating Hybrid Multi-Core CPU FPGA DSP Architectures
NASA Technical Reports Server (NTRS)
Schmidt, Andrew G.; Weisz, Gabriel; French, Matthew; Flatley, Thomas; Villalpando, Carlos Y.
2017-01-01
The SpaceCubeX project is motivated by the need for high-performance, modular, and scalable on-board processing to help scientists answer critical 21st-century questions about global climate change, air quality, ocean health, and ecosystem dynamics, while adding new capabilities such as low-latency data products for extreme event warnings. These goals translate into on-board processing throughput requirements that are on the order of 100 to 1,000 times greater than those of previous Earth Science missions for standard processing, compression, storage, and downlink operations. To study possible future architectures that achieve these performance requirements, the SpaceCubeX project provides an evolvable testbed and framework that enables a focused design space exploration of candidate hybrid CPU/FPGA/DSP processing architectures. The framework includes ArchGen, an architecture generator tool populated with candidate architecture components, performance models, and IP cores, that allows an end user to specify the type, number, and connectivity of a hybrid architecture. The framework requires minimal extensions to integrate new processors, such as the anticipated High Performance Spaceflight Computer (HPSC), reducing the time to initiate benchmarking by months. To evaluate the framework, we leverage a wide suite of high-performance embedded computing benchmarks and Earth science scenarios to ensure robust architecture characterization. We report on the project's Year 1 efforts and demonstrate the capabilities across four simulation testbed models: a baseline SpaceCube 2.0 system, a dual ARM A9 processor system, a hybrid quad ARM A53 and FPGA system, and a hybrid quad ARM A53 and DSP system.
NASA Astrophysics Data System (ADS)
Biercamp, Joachim; Adamidis, Panagiotis; Neumann, Philipp
2017-04-01
With the exascale era approaching, the length and time scales used for climate research on the one hand and numerical weather prediction on the other blend into each other. The Centre of Excellence in Simulation of Weather and Climate in Europe (ESiWACE) represents a European consortium comprising partners from climate, weather, and HPC in their effort to address key scientific challenges that both communities have in common. A particular challenge is to reach global models with spatial resolutions that allow simulating convective clouds and small-scale ocean eddies. Such simulations would produce better predictions of trends and provide much more fidelity in the representation of high-impact regional events. However, running such models in operational mode, i.e., with sufficient throughput in ensemble mode, will clearly require exascale computing and data handling capability. We will discuss the ESiWACE initiative and relate it to work in progress on high-resolution simulations in Europe. We present recent strong-scalability measurements from ESiWACE to demonstrate current computability in weather and climate simulation. A special focus in this talk is on the Icosahedral Nonhydrostatic (ICON) model, used for a comparison of high-resolution regional and global simulations with high-quality observation data. We demonstrate that close-to-optimal parallel efficiency can be achieved in strong-scaling global-resolution experiments, e.g., 94% for 5 km resolution simulations using 36k cores on Mistral (DKRZ). Based on our scalability and high-resolution experiments, we deduce and extrapolate the future capabilities expected for ICON in weather and climate research at exascale.
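The quoted 94% figure is the usual strong-scaling parallel efficiency, E = (N_ref * T_ref) / (N * T). The helper below shows the arithmetic with made-up timings, since the actual DKRZ run times are not given in the abstract.

```python
def strong_scaling_efficiency(n_ref, t_ref, n, t):
    """Parallel efficiency relative to a reference run:
    E = (n_ref * t_ref) / (n * t). E = 1.0 is ideal strong scaling."""
    return (n_ref * t_ref) / (n * t)

# Illustrative numbers only: doubling cores from 18k to 36k while
# running 1.88x faster yields roughly 94% efficiency.
print(f"{strong_scaling_efficiency(18_000, 100.0, 36_000, 53.2):.2%}")
```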
High Fidelity System Simulation of Multiple Components in Support of the UEET Program
NASA Technical Reports Server (NTRS)
Plybon, Ronald C.; VanDeWall, Allan; Sampath, Rajiv; Balasubramaniam, Mahadevan; Mallina, Ramakrishna; Irani, Rohinton
2006-01-01
The High Fidelity System Simulation effort has addressed various important objectives to enable additional capability within the NPSS framework. The scope emphasized the high-pressure turbine and high-pressure compressor components. Initial effort was directed at developing and validating an intermediate-fidelity NPSS model using PD geometry, and was extended to a high-fidelity NPSS model by overlaying detailed geometry to validate CFD against rig data. Both "feed-forward" and "feedback" approaches to analysis zooming were employed to enable system simulation capability in NPSS. These approaches have certain benefits and applicability for specific applications. "Feedback" zooming allows information from a high-fidelity analysis to flow up and update the NPSS model results by forcing the NPSS solver to converge to the high-fidelity analysis predictions. This approach is effective in improving the accuracy of the NPSS model; however, it can only be used in circumstances where there is a clear physics-based strategy to flow up the high-fidelity analysis results to update the NPSS system model. The "feed-forward" zooming approach is more broadly useful in enabling detailed analysis at early stages of design for a specified set of critical operating points and using these analysis results to drive design decisions early in the development process.
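A minimal sketch of the "feedback" zooming idea: iterate a 0-D component parameter until the system model reproduces the high-fidelity prediction. The component map, parameter, and gain below are hypothetical stand-ins, not NPSS internals.

```python
def zero_d_pressure_ratio(efficiency):
    # Stand-in 0-D component map: pressure ratio as a simple function
    # of an efficiency-like parameter (hypothetical, not an NPSS model).
    return 8.0 + 4.0 * efficiency

def feedback_zoom(high_fidelity_pr, eff=0.85, gain=0.1, tol=1e-6):
    """Nudge the 0-D model's parameter until its pressure ratio matches
    the high-fidelity (e.g. row-by-row 1-D) prediction, which is the
    essence of the 'feedback' zooming described in the abstract."""
    for _ in range(1000):
        err = high_fidelity_pr - zero_d_pressure_ratio(eff)
        if abs(err) < tol:
            break
        eff += gain * err           # simple relaxation update
    return eff

print(f"converged efficiency parameter: {feedback_zoom(11.62):.4f}")
```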
NASA Technical Reports Server (NTRS)
Garai, Anirban; Diosady, Laslo T.; Murman, Scott M.; Madavan, Nateri K.
2016-01-01
Recent progress towards developing a new computational capability for accurate and efficient high-fidelity direct numerical simulation (DNS) and large-eddy simulation (LES) of turbomachinery is described. This capability is based on an entropy-stable discontinuous-Galerkin spectral-element approach that extends to arbitrarily high orders of spatial and temporal accuracy, and is implemented in a computationally efficient manner on a modern high-performance computer architecture. An inflow turbulence generation procedure based on a linear forcing approach has been incorporated in this framework, and DNS were conducted to study the effect of inflow turbulence on the suction-side separation bubble in low-pressure turbine (LPT) cascades. The T106 series of airfoil cascades in both lightly loaded (T106A) and highly loaded (T106C) configurations, at exit isentropic Reynolds numbers of 60,000 and 80,000, respectively, are considered. The numerical simulations are performed using 8th-order accurate spatial and 4th-order accurate temporal discretization. The changes in separation bubble topology due to elevated inflow turbulence are captured by the present method, and the physical mechanisms leading to the changes are explained. The present results are in good agreement with prior numerical simulations, but some expected discrepancies with the experimental data for the T106C case are noted and discussed.
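Linear (Lundgren-type) forcing sustains turbulence by adding a source term proportional to the velocity fluctuation, f = A u'. The toy update below illustrates the energy-injection mechanism on a discrete field; it is not the entropy-stable DG implementation used in the paper.

```python
import numpy as np

def apply_linear_forcing(u, a_coeff, dt):
    """Lundgren-style linear forcing: add f = A * u' (u' being the
    velocity fluctuation about the mean) so that turbulent kinetic
    energy is produced at a controlled rate. Toy explicit update, not
    the paper's discontinuous-Galerkin discretization."""
    u_mean = u.mean(axis=(0, 1, 2), keepdims=True)
    return u + dt * a_coeff * (u - u_mean)

rng = np.random.default_rng(1)
u = rng.normal(0.0, 1.0, size=(16, 16, 16, 3))      # toy velocity field
ke0 = 0.5 * np.mean(np.sum(u * u, axis=-1))
u = apply_linear_forcing(u, a_coeff=2.0, dt=0.01)
ke1 = 0.5 * np.mean(np.sum(u * u, axis=-1))
print(f"kinetic energy {ke0:.4f} -> {ke1:.4f} (forcing injects energy)")
```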
A survey of electric and hybrid vehicle simulation programs
NASA Technical Reports Server (NTRS)
Bevan, J.; Heimburger, D. A.; Metcalfe, M. A.
1978-01-01
Results of a survey conducted within the United States to determine the extent of development and the capabilities of automotive performance simulation programs suitable for electric and hybrid vehicle studies are summarized. Altogether, 111 programs were identified as being in a usable state. The complexity of the existing programs spans a range from a page of simple desktop-calculator instructions to 300,000 lines of a high-level programming language. The capability to simulate electric vehicles was most common, heat-engine vehicles second, and hybrid vehicles least common. Batch-operated programs are slightly more common than interactive ones, and one-third can be operated in either mode. The most commonly used language was FORTRAN, the language typically used by engineers. The higher-level simulation languages (e.g., SIMSCRIPT, GPSS, SIMULA) used by "model builders" were conspicuously lacking.
Additions and improvements to the high energy density physics capabilities in the FLASH code
NASA Astrophysics Data System (ADS)
Lamb, D.; Bogale, A.; Feister, S.; Flocke, N.; Graziani, C.; Khiar, B.; Laune, J.; Tzeferacos, P.; Walker, C.; Weide, K.
2017-10-01
FLASH is an open-source, finite-volume Eulerian, spatially adaptive radiation magnetohydrodynamics code that has the capability to treat a broad range of physical processes. FLASH performs well on a wide range of computer architectures and has a broad user base. Extensive high energy density physics (HEDP) capabilities exist in FLASH, which make it a powerful open toolset for the academic HEDP community. We summarize these capabilities, emphasizing recent additions and improvements. We describe several non-ideal MHD capabilities that are being added to FLASH, including the Hall and Nernst effects, implicit resistivity, and a circuit model, which will allow modeling of Z-pinch experiments. We showcase the ability of FLASH to simulate Thomson scattering polarimetry, which measures Faraday rotation due to the presence of magnetic fields, as well as proton radiography, proton self-emission, and Thomson scattering diagnostics. Finally, we describe several collaborations with the academic HEDP community in which FLASH simulations were used to design and interpret HEDP experiments. This work was supported in part at U. Chicago by DOE NNSA ASC through the Argonne Institute for Computing in Science under FWP 57789; DOE NNSA under NLUF Grant DE-NA0002724; DOE SC OFES Grant DE-SC0016566; and NSF Grant PHY-1619573.
NASA Technical Reports Server (NTRS)
Galante, Joseph M.; Eepoel, John Van; Strube, Matt; Gill, Nat; Gonzalez, Marcelo; Hyslop, Andrew; Patrick, Bryan
2012-01-01
Argon is a flight-ready sensor suite with two visual cameras, a flash LIDAR, an on-board flight computer, and associated electronics. Argon was designed to provide sensing capabilities for relative navigation during proximity, rendezvous, and docking operations between spacecraft. A rigorous ground test campaign assessed the performance capability of the Argon navigation suite to measure the relative pose of high-fidelity satellite mock-ups during a variety of simulated rendezvous and proximity maneuvers, facilitated by robot manipulators, in a variety of lighting conditions representative of the orbital environment. A brief description of the Argon suite and test setup is given, as well as an analysis of the performance of the system in simulated proximity and rendezvous operations.
Man-vehicle systems research facility: Design and operating characteristics
NASA Technical Reports Server (NTRS)
1983-01-01
The Man-Vehicle Systems Research Facility (MVSRF) provides the capability of simulating aircraft (two with full crews), en route and terminal air traffic control and aircrew interactions, and advanced cockpit displays (circa 1995) representative of future generations of aircraft, all within a full mission context. The characteristics of this facility derive from research addressing critical human factors issues that pertain to: (1) information requirements for the utilization and integration of advanced electronic display systems, (2) the interaction and distribution of responsibilities between aircrews and ground controllers, and (3) the automation of aircrew functions. This research has emphasized the need for high fidelity in simulations and for the capability to conduct full mission simulations of relevant aircraft operations. This report briefly describes the MVSRF design and operating characteristics.
NASA Technical Reports Server (NTRS)
Taylor, John G.
1990-01-01
An investigation was conducted in the Static Test Facility of the NASA Langley 16-Foot Transonic Tunnel to determine the internal performance of two-dimensional convergent-divergent nozzles designed to have simultaneous pitch and yaw thrust vectoring capability. This concept utilized divergent flap rotation for thrust vectoring in the pitch plane and deflection of flat yaw flaps hinged at the end of the sidewalls for yaw thrust vectoring. The hinge location of the yaw flaps was varied at four positions from the nozzle exit plane to the throat plane. The yaw flaps were designed to contain the flow laterally independent of power setting. In order to eliminate any physical interference between the yaw flap deflected into the exhaust stream and the divergent flaps, the downstream corners of both upper and lower divergent flaps were cut off to allow for up to 30 deg of yaw flap deflection. The impact of varying the nozzle pitch vector angle, throat area, yaw flap hinge location, yaw flap length, and yaw flap deflection angle on nozzle internal performance characteristics was studied. High-pressure air was used to simulate jet exhaust at nozzle pressure ratios up to 7.0. Static results indicate that configurations with the yaw flap hinge located upstream of the exit plane provide relatively high levels of thrust vectoring efficiency without causing large losses in resultant thrust ratio. Therefore, these configurations represent a viable concept for providing simultaneous pitch and yaw thrust vectoring.
IGMS: An Integrated ISO-to-Appliance Scale Grid Modeling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmintier, Bryan; Hale, Elaine; Hansen, Timothy M.
This paper describes the Integrated Grid Modeling System (IGMS), a novel electric power system modeling platform for integrated transmission-distribution analysis that co-simulates off-the-shelf tools on high performance computing (HPC) platforms to offer unprecedented resolution from ISO markets down to appliances and other end uses. Specifically, the system simultaneously models hundreds or thousands of distribution systems in co-simulation with detailed Independent System Operator (ISO) markets and AGC-level reserve deployment. IGMS uses a new MPI-based hierarchical co-simulation framework to connect existing sub-domain models. Our initial efforts integrate open-source tools for wholesale markets (FESTIV), bulk AC power flow (MATPOWER), and full-featured distribution systems, including physics-based end-use and distributed generation models (many instances of GridLAB-D™). The modular IGMS framework enables tool substitution and additions for multi-domain analyses. This paper describes the IGMS tool, characterizes its performance, and demonstrates the impacts of the coupled simulations for analyzing high-penetration solar PV and price responsive load scenarios.
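As a rough illustration of the hierarchical co-simulation pattern described here (the models below are toy stand-ins invented for the sketch; IGMS itself couples FESTIV, MATPOWER, and many GridLAB-D instances over MPI), a minimal loop alternates a bulk-system solve with per-feeder solves, exchanging bus voltages and aggregate loads each step:

    # Minimal co-simulation sketch: a "transmission" model and several
    # "distribution" models exchange boundary conditions each time step.
    # All models here are toy stand-ins, not the actual IGMS components.

    def bulk_solve(total_loads):
        """Toy bulk solve: bus voltage sags linearly with attached load."""
        return {bus: 1.05 - 0.02 * load for bus, load in total_loads.items()}

    def feeder_solve(voltage, base_load):
        """Toy feeder solve: load is mildly voltage-dependent (ZIP-like)."""
        return base_load * (0.8 + 0.2 * voltage)

    feeders = {"bus1": [1.0, 1.2], "bus2": [0.9]}   # base loads per bus (MW)
    voltages = {bus: 1.0 for bus in feeders}

    for step in range(24):                          # e.g. hourly steps
        # Distribution layer: each feeder responds to its bus voltage.
        loads = {bus: sum(feeder_solve(voltages[bus], bl) for bl in base)
                 for bus, base in feeders.items()}
        # Transmission layer: re-solve with the aggregated feeder loads.
        voltages = bulk_solve(loads)
        print(step, {b: round(v, 4) for b, v in voltages.items()})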
Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation
NASA Technical Reports Server (NTRS)
Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael
2011-01-01
Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
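To give a flavor of the accuracy-based and information-theory measures such a toolkit computes, here is a minimal generic sketch in Python (function names and synthetic data are invented for illustration; LVT's own implementation and metric suite are far more extensive):

    import numpy as np

    def rmse(model, obs):
        return np.sqrt(np.mean((model - obs) ** 2))

    def bias(model, obs):
        return np.mean(model - obs)

    def shannon_entropy(x, bins=16):
        """A simple information-theory measure: entropy of a histogram."""
        p, _ = np.histogram(x, bins=bins)
        p = p[p > 0] / p.sum()
        return -np.sum(p * np.log2(p))

    obs = np.random.default_rng(0).normal(290.0, 5.0, 365)   # fake daily obs
    model = obs + np.random.default_rng(1).normal(0.5, 2.0, 365)

    print("RMSE:", rmse(model, obs))
    print("Bias:", bias(model, obs))
    print("Entropy (model):", shannon_entropy(model))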
NASA Astrophysics Data System (ADS)
Liang, Albert K.; Koniczek, Martin; Antonuk, Larry E.; El-Mohri, Youcef; Zhao, Qihua
2016-03-01
Pixelated photon counting detectors with energy discrimination capabilities are of increasing clinical interest for x-ray imaging. Such detectors, presently in clinical use for mammography and under development for breast tomosynthesis and spectral CT, usually employ in-pixel circuits based on crystalline silicon - a semiconductor material that is generally not well-suited for economic manufacture of large-area devices. One interesting alternative semiconductor is polycrystalline silicon (poly-Si), a thin-film technology capable of creating very large-area, monolithic devices. Similar to crystalline silicon, poly-Si allows implementation of the type of fast, complex, in-pixel circuitry required for photon counting - operating at processing speeds that are not possible with amorphous silicon (the material currently used for large-area, active matrix, flat-panel imagers). The pixel circuits of two-dimensional photon counting arrays are generally comprised of four stages: amplifier, comparator, clock generator and counter. The analog front-end (in particular, the amplifier) strongly influences performance and is therefore of interest to study. In this paper, the relationship between incident and output count rate of the analog front-end is explored under diagnostic imaging conditions for a promising poly-Si based design. The input to the amplifier is modeled in the time domain assuming a realistic input x-ray spectrum. Simulations of circuits based on poly-Si thin-film transistors are used to determine the resulting output count rate as a function of input count rate, energy discrimination threshold and operating conditions.
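The incident-versus-output count-rate relationship the authors simulate at the circuit level can be illustrated in highly simplified form with the classic analytic dead-time models (a generic stand-in assuming an effective dead time, not the paper's transistor-level circuit simulation):

    import numpy as np

    tau = 200e-9   # effective dead time per count (s); illustrative value only

    def nonparalyzable(n_in, tau):
        """Output rate when the front-end ignores hits during processing."""
        return n_in / (1.0 + n_in * tau)

    def paralyzable(n_in, tau):
        """Output rate when each hit restarts the processing window."""
        return n_in * np.exp(-n_in * tau)

    for rate in [1e4, 1e5, 1e6, 1e7]:   # incident counts per second
        print(f"{rate:9.0f} -> NP {nonparalyzable(rate, tau):9.0f}"
              f"  P {paralyzable(rate, tau):9.0f}")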
NASA Technical Reports Server (NTRS)
Sinha, Neeraj
2014-01-01
This Phase II project validated a state-of-the-art LES model, coupled with a Ffowcs Williams-Hawkings (FW-H) far-field acoustic solver, to support the development of advanced engine concepts. These concepts include innovative flow control strategies to attenuate jet noise emissions. The end-to-end LES/FW-H noise prediction model was demonstrated and validated by applying it to rectangular nozzle designs with a high aspect ratio. The model also was validated against acoustic and flow-field data from a realistic jet-pylon experiment, thereby significantly advancing the state of the art for LES.
Role of High-End Computing in Meeting NASA's Science and Engineering Challenges
NASA Technical Reports Server (NTRS)
Biswas, Rupak
2006-01-01
High-End Computing (HEC) has always played a major role in meeting the modeling and simulation needs of various NASA missions. With NASA's newest 62 teraflops Columbia supercomputer, HEC is having an even greater impact within the Agency and beyond. Significant cutting-edge science and engineering simulations in the areas of space exploration, Shuttle operations, Earth sciences, and aeronautics research are already occurring on Columbia, demonstrating its ability to accelerate NASA's exploration vision. The talk will describe how the integrated supercomputing production environment is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions.
NASA Technical Reports Server (NTRS)
Verma, Savita Arora; Jung, Yoon Chul
2017-01-01
This presentation provides an overview of the ATD-2 project and the integrated simulation of surface and airspace operations used to evaluate IADS system procedures and surface metering capabilities via a high-fidelity human-in-the-loop (HITL) simulation. Two HITL facilities, Future Flight Central (FFC) and Airspace Operations Laboratory (AOL), are integrated for simulating surface operations of the Charlotte-Douglas International Airport (CLT) and airspace in CLT TRACON and Washington Center.
Modeling of turbulent separated flows for aerodynamic applications
NASA Technical Reports Server (NTRS)
Marvin, J. G.
1983-01-01
Steady, high speed, compressible separated flows modeled through numerical simulations resulting from solutions of the mass-averaged Navier-Stokes equations are reviewed. Emphasis is placed on benchmark flows that represent simplified (but realistic) aerodynamic phenomena. These include impinging shock waves, compression corners, glancing shock waves, trailing edge regions, and supersonic high angle of attack flows. A critical assessment of modeling capabilities is provided by comparing the numerical simulations with experiment. The importance of combining experiment, numerical algorithm, grid, and turbulence model to effectively develop this potentially powerful simulation technique is stressed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, Michel; Archer, Bill; Hendrickson, Bruce
The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, support studies of advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.
LES ARM Symbiotic Simulation and Observation (LASSO) Implementation Strategy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gustafson Jr., WI; Vogelmann, AM
2015-09-01
This document illustrates the design of the Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) workflow to provide a routine, high-resolution modeling capability to augment the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility’s high-density observations. LASSO will create a powerful new capability for furthering ARM’s mission to advance understanding of cloud, radiation, aerosol, and land-surface processes. The combined observational and modeling elements will enable a new level of scientific inquiry by connecting processes and context to observations and providing needed statistics for details that cannot be measured. The result will be improved process understanding that facilitates concomitant improvements in climate model parameterizations. The initial LASSO implementation will be for ARM’s Southern Great Plains site in Oklahoma and will focus on shallow convection, which is poorly simulated by climate models due in part to clouds’ typically small spatial scale compared to model grid spacing, and because the convection involves complicated interactions of microphysical and boundary layer processes.
Self-organization in suspensions of end-functionalized semiflexible polymers under shear flow
NASA Astrophysics Data System (ADS)
Myung, Jin Suk; Winkler, Roland G.; Gompper, Gerhard
2015-12-01
The nonequilibrium dynamical behavior and structure formation of end-functionalized semiflexible polymer suspensions under flow are investigated by mesoscale hydrodynamic simulations. The hybrid simulation approach combines the multiparticle collision dynamics method for the fluid, which accounts for hydrodynamic interactions, with molecular dynamics simulations for the semiflexible polymers. In equilibrium, various kinds of scaffold-like network structures are observed, depending on polymer flexibility and end-attraction strength. We investigate the flow behavior of the polymer networks under shear and analyze their nonequilibrium structural and rheological properties. The scaffold structure breaks up and densified aggregates are formed at low shear rates, while the structural integrity is completely lost at high shear rates. We provide a detailed analysis of the shear-rate-dependent flow-induced structures. The studies provide a deeper understanding of the formation and deformation of network structures in complex materials.
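For readers unfamiliar with the fluid model, a bare-bones 2D sketch of the multiparticle collision dynamics (MPC/SRD) step follows (parameters are illustrative, and the random grid shift that restores Galilean invariance is omitted for brevity): particles stream ballistically, are binned into cells, and their velocities are rotated about the cell-mean velocity:

    import numpy as np

    rng = np.random.default_rng(42)
    N, L, a, dt, alpha = 1000, 10.0, 1.0, 0.1, np.radians(130)

    pos = rng.uniform(0, L, (N, 2))
    vel = rng.normal(0, 1, (N, 2))

    def srd_step(pos, vel):
        # Streaming: ballistic motion with periodic boundaries.
        pos = (pos + vel * dt) % L
        # Collision: bin particles into cells of size a.
        cells = (pos // a).astype(int)
        keys = cells[:, 0] * int(L / a) + cells[:, 1]
        for k in np.unique(keys):
            idx = np.where(keys == k)[0]
            u = vel[idx].mean(axis=0)                 # cell mean velocity
            s = rng.choice([-1.0, 1.0])               # random rotation sense
            c, sn = np.cos(alpha), s * np.sin(alpha)
            R = np.array([[c, -sn], [sn, c]])
            vel[idx] = u + (vel[idx] - u) @ R.T       # rotate relative velocities
        return pos, vel

    for _ in range(10):
        pos, vel = srd_step(pos, vel)
    print("mean speed:", np.linalg.norm(vel, axis=1).mean())

Because only the velocities relative to the cell mean are rotated, mass and momentum are conserved per cell, which is what gives the method its hydrodynamic behavior.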
Simulator technology as a tool for education in cardiac care.
Hravnak, Marilyn; Beach, Michael; Tuite, Patricia
2007-01-01
Assisting nurses in gaining the cognitive and psychomotor skills necessary to safely and effectively care for patients with cardiovascular disease can be challenging for educators. Ideally, nurses would have the opportunity to synthesize and practice these skills in a protected training environment before application in the dynamic clinical setting. Recently, a technology known as high fidelity human simulation was introduced, which permits learners to interact with a simulated patient. The dynamic physiologic parameters and physical assessment capabilities of the simulated patient provide for a realistic learning environment. This article describes the High Fidelity Human Simulation Laboratory at the University of Pittsburgh School of Nursing and presents strategies for using this technology as a tool in teaching complex cardiac nursing care at the basic and advanced practice nursing levels. The advantages and disadvantages of high fidelity human simulation in learning are discussed.
Health and climate benefits of offshore wind facilities in the Mid-Atlantic United States
Buonocore, Jonathan J.; Luckow, Patrick; Fisher, Jeremy; ...
2016-07-14
Electricity from fossil fuels contributes substantially to both climate change and the health burden of air pollution. Renewable energy sources are capable of displacing electricity from fossil fuels, but the quantity of health and climate benefits depends on site-specific attributes that are not often included in quantitative models. Here, we link an electrical grid simulation model to an air pollution health impact assessment model and US regulatory estimates of the impacts of carbon to estimate the health and climate benefits of offshore wind facilities of different sizes in two different locations. We find that offshore wind in the Mid-Atlantic is capable of producing health and climate benefits of between $54 and $120 per MWh of generation, with the largest simulated facility (3000 MW off the coast of New Jersey) producing approximately $690 million in benefits in 2017. The variability in benefits per unit generation is a function of differences in locations (Maryland versus New Jersey), simulated years (2012 versus 2017), and facility generation capacity, given complexities of the electrical grid and differences in which power plants are offset. In the end, this work demonstrates health and climate benefits of offshore wind, provides further evidence of the utility of geographically-refined modeling frameworks, and yields quantitative insights that would allow for inclusion of both climate and public health in benefits assessments of renewable energy.
NASA Technical Reports Server (NTRS)
Garrahan, Steven L.; Tolson, Robert H.; Williams, Robert L., II
1995-01-01
Industrial robots are usually attached to a rigid base. Placing the robot on a compliant base introduces dynamic coupling between the two systems. The Vehicle Emulation System (VES) is a six DOF platform that is capable of modeling this interaction. The VES employs a force-torque sensor as the interface between robot and base. A computer simulation of the VES is presented. Each of the hardware and software components is described and Simulink is used as the programming environment. The simulation performance is compared with experimental results to validate accuracy. A second simulation which models the dynamic interaction of a robot and a flexible base acts as a comparison to the simulated motion of the VES. Results are presented that compare the simulated VES motion with the motion of the VES hardware using the same admittance model. The two computer simulations are compared to determine how well the VES is expected to emulate the desired motion. Simulation results are given for robots mounted to the end effector of the Space Shuttle Remote Manipulator System (SRMS). It is shown that for fast motions of the two robots studied, the SRMS experiences disturbances on the order of centimeters. Larger disturbances are possible if different manipulators are used.
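As a rough sketch of how an admittance model turns a measured interface force into commanded base motion (all coefficients below are invented; the actual VES admittance models are more elaborate), one can integrate a mass-spring-damper law each control step:

    M, B, K = 50.0, 20.0, 400.0        # illustrative admittance parameters
    dt = 0.001                          # control period (s)

    x = v = 0.0                         # platform displacement and velocity
    for step in range(2000):
        F = 10.0 if step < 500 else 0.0      # stand-in force-torque reading (N)
        a = (F - B * v - K * x) / M          # admittance: force -> acceleration
        v += a * dt                          # semi-implicit Euler integration
        x += v * dt
    print("final displacement:", round(x, 5))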
Simulator for concurrent processing data flow architectures
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.; Stoughton, John W.; Mielke, Roland R.
1992-01-01
A software simulator capable of simulating the execution of an algorithm graph on a given system under the Algorithm to Architecture Mapping Model (ATAMM) rules is presented. ATAMM is capable of modeling the execution of large-grained algorithms on distributed data flow architectures. Investigating the behavior and determining the performance of an ATAMM-based system requires the aid of software tools. The ATAMM Simulator presented is capable of determining the performance of a system without having to build a hardware prototype. Case studies are performed on four algorithms to demonstrate the capabilities of the ATAMM Simulator. Simulated results are shown to be comparable to the experimental results of the Advanced Development Model System.
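ATAMM itself defines formal marked-graph execution rules, but the flavor of such a simulator can be conveyed by a toy event-driven execution of an algorithm graph on a fixed pool of processors (graph, compute times, and names are invented for illustration):

    import heapq

    # Toy dataflow graph: node -> (compute time, successors).
    graph = {"read": (1.0, ["filter"]), "filter": (2.0, ["fft", "log"]),
             "fft": (3.0, ["out"]), "log": (0.5, ["out"]), "out": (1.0, [])}
    indeg = {n: 0 for n in graph}
    for _, succs in graph.values():
        for s in succs:
            indeg[s] += 1

    procs, events, done_time = 2, [], {}
    ready = [n for n, d in indeg.items() if d == 0]
    t, busy = 0.0, 0

    while ready or events:
        while ready and busy < procs:          # dispatch onto free processors
            n = ready.pop()
            busy += 1
            heapq.heappush(events, (t + graph[n][0], n))
        t, n = heapq.heappop(events)           # next task completion
        busy -= 1
        done_time[n] = t
        for s in graph[n][1]:                  # release data-ready successors
            indeg[s] -= 1
            if indeg[s] == 0:
                ready.append(s)

    print("makespan:", done_time["out"])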
Modeling and Simulation Tools for Heavy Lift Airships
NASA Technical Reports Server (NTRS)
Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John
2016-01-01
For conventional fixed wing and rotary wing aircraft a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the National laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well researched, open source, high fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.
Variance in binary stellar population synthesis
NASA Astrophysics Data System (ADS)
Breivik, Katelyn; Larson, Shane L.
2016-03-01
In the years preceding LISA, Milky Way compact binary population simulations can be used to inform the science capabilities of the mission. Galactic population simulation efforts generally focus on high fidelity models that require extensive computational power to produce a single simulated population for each model. Each simulated population represents an incomplete sample of the functions governing compact binary evolution, thus introducing variance from one simulation to another. We present a rapid Monte Carlo population simulation technique that can simulate thousands of populations in less than a week, thus allowing a full exploration of the variance associated with a binary stellar evolution model.
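The rapid Monte Carlo idea can be sketched as follows (distributions and parameters are invented for illustration, not those of the authors' binary-evolution model): draw many cheap, independent realizations of a population and examine the spread of a derived statistic across realizations:

    import numpy as np

    rng = np.random.default_rng(7)

    def simulate_population(n=10_000):
        """One toy realization: sample primary masses and mass ratios."""
        m1 = rng.pareto(1.35, n) + 0.8            # stand-in IMF-like draw (Msun)
        q = rng.uniform(0.1, 1.0, n)              # mass ratio
        return m1, q * m1

    # Many cheap realizations -> distribution of a population statistic.
    stats = [np.mean(simulate_population()[1]) for _ in range(200)]
    print("mean secondary mass: %.3f +/- %.3f (variance across realizations)"
          % (np.mean(stats), np.std(stats)))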
Studying Variance in the Galactic Ultra-compact Binary Population
NASA Astrophysics Data System (ADS)
Larson, Shane L.; Breivik, Katelyn
2017-01-01
In the years preceding LISA, Milky Way compact binary population simulations can be used to inform the science capabilities of the mission. Galactic population simulation efforts generally focus on high fidelity models that require extensive computational power to produce a single simulated population for each model. Each simulated population represents an incomplete sample of the functions governing compact binary evolution, thus introducing variance from one simulation to another. We present a rapid Monte Carlo population simulation technique that can simulate thousands of populations on week-long timescales, thus allowing a full exploration of the variance associated with a binary stellar evolution model.
Large-Signal Klystron Simulations Using KLSC
NASA Astrophysics Data System (ADS)
Carlsten, B. E.; Ferguson, P.
1997-05-01
We describe a new, 2-1/2 dimensional, klystron-simulation code, KLSC. This code has a sophisticated input cavity model for calculating the klystron gain with arbitrary input cavity matching and tuning, and is capable of modeling coupled output cavities. We will discuss the input and output cavity models, and present simulation results from a high-power, S-band design. We will use these results to explore tuning issues with coupled output cavities.
A Storm Surge and Inundation Model of the Back River Watershed at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Loftis, Jon Derek; Wang, Harry V.; DeYoung, Russell J.
2013-01-01
This report on a Virginia Institute of Marine Science project demonstrates that the sub-grid modeling technology (now part of the Chesapeake Bay Inundation Prediction System, CIPS) can incorporate high-resolution Lidar measurements provided by NASA Langley Research Center into the sub-grid model framework to resolve detailed topographic features for use as a hydrological transport model for run-off simulations within NASA Langley and Langley Air Force Base. The model sub-grid resolves the ditches/channels in which rainfall accumulating over land collects, and this capability was tested by simulating the run-off induced by heavy precipitation. Possessing both storm surge and run-off simulation capabilities, the CIPS model was then applied to simulate real storm events, starting with Hurricane Isabel in 2003. It is shown that the model can generate highly accurate on-land inundation maps, as demonstrated by excellent comparison of the Langley tidal gauge time series data (CAPABLE.larc.nasa.gov) and spatial patterns of real storm wrack line measurements with the model results simulated during Hurricanes Isabel (2003), Irene (2011), and a 2009 Nor'easter. With confidence built upon the model's performance, sea level rise scenarios from the IPCC (Intergovernmental Panel on Climate Change) were also included in the model scenario runs to simulate future inundation cases.
The mechanical design and simulation of a scaled H⁻ Penning ion source.
Rutter, T; Faircloth, D; Turner, D; Lawrie, S
2016-02-01
The existing ISIS Penning H⁻ source is unable to produce the beam parameters required for the Front End Test Stand, and so a new, high duty factor, high brightness scaled source is being developed. This paper details first the development of an electrically biased aperture plate for the existing ISIS source and second, the design, simulation, and development of a prototype scaled source.
Advanced Ground Systems Maintenance Physics Models For Diagnostics Project
NASA Technical Reports Server (NTRS)
Perotti, Jose M.
2015-01-01
The project will use high-fidelity physics models and simulations to simulate real-time operations of cryogenic and other fluid systems and calculate the status/health of the systems. The project enables the delivery of system health advisories to ground system operators. The capability will also be used to conduct planning and analysis of cryogenic system operations. This project will develop and implement high-fidelity physics-based modeling techniques to simulate the real-time operation of cryogenics and other fluids systems and, when compared to the real-time operation of the actual systems, provide assessment of their state. Physics-model-calculated measurements (called "pseudo-sensors") will be compared to the system real-time data. Comparison results will be utilized to provide systems operators with enhanced monitoring of systems' health and status, identify off-nominal trends, and diagnose system/component failures. This capability can also be used to conduct planning and analysis of cryogenics and other fluid systems designs. This capability will be interfaced with the ground operations command and control system as a part of the Advanced Ground Systems Maintenance (AGSM) project to help assure system availability and mission success. The initial capability will be developed for the Liquid Oxygen (LO2) ground loading systems.
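A minimal illustration of the pseudo-sensor concept (signal shapes and the advisory threshold are invented): compare a physics-model prediction against telemetry and raise an advisory when the residual drifts off-nominal:

    import numpy as np

    rng = np.random.default_rng(3)
    t = np.arange(0.0, 60.0, 1.0)                      # seconds

    model_temp = 90.0 + 0.05 * t                       # pseudo-sensor: modeled LO2 temp (K)
    sensor_temp = model_temp + rng.normal(0, 0.1, t.size)
    sensor_temp[40:] += 0.02 * (t[40:] - 40.0) ** 1.5  # injected off-nominal drift

    residual = sensor_temp - model_temp
    window = 10
    rolling = np.convolve(np.abs(residual), np.ones(window) / window, "valid")

    threshold = 0.5                                    # illustrative advisory limit (K)
    alarms = np.where(rolling > threshold)[0]
    if alarms.size:
        print(f"advisory: residual exceeded {threshold} K "
              f"at t={t[alarms[0] + window - 1]} s")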
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gustafson, William I.; Vogelmann, Andrew M.; Cheng, Xiaoping
The Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility began a pilot project in May 2015 to design a routine, high-resolution modeling capability to complement ARM’s extensive suite of measurements. This modeling capability has been named the Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) project. The initial focus of LASSO is on shallow convection at the ARM Southern Great Plains (SGP) Climate Research Facility. The availability of LES simulations with concurrent observations will serve many purposes. LES helps bridge the scale gap between DOE ARM observations and models, and the use of routine LES adds value to observations. It provides a self-consistent representation of the atmosphere and a dynamical context for the observations. Further, it elucidates unobservable processes and properties. LASSO will generate a simulation library for researchers that enables statistical approaches beyond a single-case mentality. It will also provide tools necessary for modelers to reproduce the LES and conduct their own sensitivity experiments. Many different uses are envisioned for the combined LASSO LES and observational library. For an observationalist, LASSO can help inform instrument remote sensing retrievals, conduct Observation System Simulation Experiments (OSSEs), and test implications of radar scan strategies or flight paths. For a theoretician, LASSO will help calculate estimates of fluxes and co-variability of values, and test relationships without having to run the model oneself. For a modeler, LASSO will help one know ahead of time which days have good forcing, have co-registered observations at high-resolution scales, and have simulation inputs and corresponding outputs to test parameterizations. Further details on the overall LASSO project are available at https://www.arm.gov/capabilities/modeling/lasso.
Enhanced bioactivity of internally functionalized cationic dendrimers with PEG cores
Albertazzi, Lorenzo; Mickler, Frauke M.; Pavan, Giovanni M.; Salomone, Fabrizio; Bardi, Giuseppe; Panniello, Mariangela; Amir, Elizabeth; Kang, Taegon; Killops, Kato L.; Bräuchle, Christoph; Amir, Roey J.; Hawker, Craig J.
2012-01-01
Hybrid dendritic-linear block copolymers based on a 4-arm polyethylene glycol (PEG) core were synthesized using an accelerated AB2/CD2 dendritic growth approach through orthogonal amine/epoxy and thiol-yne chemistries. The biological activity of these 4-arm and the corresponding 2-arm hybrid dendrimers revealed an enhanced, dendritic effect with an exponential increase in cell internalization concomitant with increasing amine end-groups and low cytotoxicity. Furthermore, the ability of these hybrid dendrimers to induce endosomal escape combined with their facile and efficient synthesis makes them attractive platforms for gene transfection. The 4-arm-based dendrimer showed significantly improved DNA binding and gene transfection capabilities in comparison with the 2-arm derivative. These results, combined with the MD simulation, indicate a significant effect of both the topology of the PEG core and the multivalency of these hybrid macromolecules on their DNA binding and delivery capabilities. PMID:23140570
High-temperature seals and lubricants for geothermal rock bits. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendrickson, R.R.; Winzenried, R.W.; Jones, A.H.
1981-04-01
High temperature seals (elastomeric and mechanical) and lubricants were developed specifically for journal-type rock bits to be used in geothermal well drilling. Results at simulated downhole conditions indicate that five selected elastomeric seals (L'Garde No. 267, Utex Nos. 227, 231 and HTCR, and Sandia Glow Discharge Coated Viton) are capable of 288°C (500°F) service. Two prototype mechanical seals did not achieve the life determined for the elastomeric seals. Six lubricants (Pacer PLX-024 oil, PLX-043 oil, PLX-045 oil, Geobond Oil, and Geobond Grease) demonstrated 316°C (600°F) capability. Recommendation is made for full-scale simulated geothermal drilling tests utilizing the improved elastomeric seals and lubricants.
Steady-state capabilities for hydroturbines with OpenFOAM
NASA Astrophysics Data System (ADS)
Page, M.; Beaudoin, M.; Giroux, A. M.
2010-08-01
The availability of a high quality Open Source CFD simulation platform like OpenFOAM offers new R&D opportunities by providing direct access to models and solver implementation details. Efforts have been made by Hydro-Québec to adapt OpenFOAM to hydroturbines for the development of steady-state capabilities. The paper describes the developments that have been made to implement new turbomachinery related capabilities: Multiple Frame of Reference solver, domain coupling interfaces (GGI, cyclicGGI and mixing plane) and specialized boundary conditions. Practical use of the new turbomachinery capabilities are demonstrated for the analysis of a 195-MW Francis hydroturbine.
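For orientation, the Multiple Frame of Reference treatment mentioned above solves, in each rotating zone, the momentum equation written in the rotating frame with additional Coriolis and centrifugal source terms (the standard rotating-frame form, stated here for context rather than taken from the paper):

\[
\frac{\partial \mathbf{u}_R}{\partial t} + (\mathbf{u}_R\cdot\nabla)\mathbf{u}_R + 2\,\boldsymbol{\Omega}\times\mathbf{u}_R + \boldsymbol{\Omega}\times(\boldsymbol{\Omega}\times\mathbf{r}) = -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u}_R,
\]

where \(\mathbf{u}_R\) is the velocity in the frame rotating at angular velocity \(\boldsymbol{\Omega}\); the GGI and mixing-plane interfaces then couple these rotating zones to the stationary domain.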
Highly Automated Arrival Management and Control System Suitable for Early NextGen
NASA Technical Reports Server (NTRS)
Swenson, Harry N.; Jung, Jaewoo
2013-01-01
This is a presentation of previously published work conducted in the development of the Terminal Area Precision Scheduling and Spacing (TAPSS) system. Included are concept and technical descriptions of the TAPSS system and results from human in the loop simulations conducted at Ames Research Center. The Terminal Area Precision Scheduling and Spacing system has demonstrated through research and extensive high-fidelity simulation studies to have benefits in airport arrival throughput, supporting efficient arrival descents, and enabling mixed aircraft navigation capability operations during periods of high congestion. NASA is currently porting the TAPSS system into the FAA TBFM and STARS system prototypes to ensure its ability to operate in the FAA automation Infrastructure. NASA ATM Demonstration Project is using the the TAPSS technologies to provide the ground-based automation tools to enable airborne Interval Management (IM) capabilities. NASA and the FAA have initiated a Research Transition Team to enable potential TAPSS and IM Technology Transfer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, P.; Powers, W.; Hritzay, D.
1959-06-01
The development of an arc wind tunnel capable of stagnation pressures in excess of twenty atmospheres and using as much as fifteen megawatts of electrical power is described. The calibration of this facility shows that it is capable of reproducing the aerodynamic environment encountered by vehicles flying at velocities as great as satellite velocity. Its use as a missile re-entry material test facility is described. The large power capacity of this facility allows one to make material tests on specimens of size sufficient to be useful for material development yet at realistic energy and Reynolds number values. By the addition of a high-capacity vacuum system, this facility can be used to produce the low density, high Mach number environment needed for simulating satellite re-entry, as well as hypersonic flight at extreme altitudes.
NASA Technical Reports Server (NTRS)
Kerczewski, Robert J.; Bhasin, Kul B.; Fabian, Theodore P.; Griner, James H.; Kachmar, Brian A.; Richard, Alan M.
1999-01-01
The continuing technological advances in satellite communications and global networking have resulted in commercial systems that now can potentially provide capabilities for communications with space-based science platforms. This reduces the need for expensive government owned communications infrastructures to support space science missions while simultaneously making available better service to the end users. An interactive, high data rate Internet type connection through commercial space communications networks would enable authorized researchers anywhere to control space-based experiments in near real time and obtain experimental results immediately. A space based communications network architecture consisting of satellite constellations connecting orbiting space science platforms to ground users can be developed to provide this service. The unresolved technical issues presented by this scenario are the subject of research at NASA's Glenn Research Center in Cleveland, Ohio. Assessment of network architectures, identification of required new or improved technologies, and investigation of data communications protocols are being performed through testbed and satellite experiments and laboratory simulations.
Deep Learning for Flow Sculpting: Insights into Efficient Learning using Scientific Simulation Data
NASA Astrophysics Data System (ADS)
Stoecklein, Daniel; Lore, Kin Gwn; Davies, Michael; Sarkar, Soumik; Ganapathysubramanian, Baskar
2017-04-01
A new technique for shaping microfluid flow, known as flow sculpting, offers an unprecedented level of passive fluid flow control, with potential breakthrough applications in advancing manufacturing, biology, and chemistry research at the microscale. However, efficiently solving the inverse problem of designing a flow sculpting device for a desired fluid flow shape remains a challenge. Current approaches struggle with the many-to-one design space, requiring substantial user interaction and the necessity of building intuition, all of which are time and resource intensive. Deep learning has emerged as an efficient function approximation technique for high-dimensional spaces, and presents a fast solution to the inverse problem, yet the science of its implementation in similarly defined problems remains largely unexplored. We propose that deep learning methods can completely outpace current approaches for scientific inverse problems while delivering comparable designs. To this end, we show how intelligent sampling of the design space inputs can make deep learning methods more competitive in accuracy, while illustrating their generalization capability to out-of-sample predictions.
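The benefit of intelligent sampling of the design-space inputs can be sketched with a toy forward model and surrogate (everything below is illustrative; the actual work trains deep networks on flow-sculpting simulations): stratified samples that cover the design space evenly typically yield a more accurate surrogate than naive random draws with the same budget:

    import numpy as np

    rng = np.random.default_rng(0)

    def forward(x):
        """Toy 'flow sculpting' forward model: design scalar -> shape feature."""
        return np.sin(3.0 * x) + 0.3 * x**2

    def fit_surrogate(xs):
        ys = forward(xs)
        return np.polynomial.Polynomial.fit(xs, ys, deg=9)

    x_test = np.linspace(0.0, 2.0, 400)

    # Naive sampling: uniform random draws may cluster and leave gaps.
    naive = fit_surrogate(rng.uniform(0.0, 2.0, 30))
    # "Intelligent" sampling: stratified draws cover the design space evenly.
    edges = np.linspace(0.0, 2.0, 31)
    strat = fit_surrogate(rng.uniform(edges[:-1], edges[1:]))

    for name, s in [("naive", naive), ("stratified", strat)]:
        err = np.sqrt(np.mean((s(x_test) - forward(x_test)) ** 2))
        print(name, "RMSE:", round(err, 4))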
Compilation of Abstracts for SC12 Conference Proceedings
NASA Technical Reports Server (NTRS)
Morello, Gina Francine (Compiler)
2012-01-01
1 A Breakthrough in Rotorcraft Prediction Accuracy Using Detached Eddy Simulation; 2 Adjoint-Based Design for Complex Aerospace Configurations; 3 Simulating Hypersonic Turbulent Combustion for Future Aircraft; 4 From a Roar to a Whisper: Making Modern Aircraft Quieter; 5 Modeling of Extended Formation Flight on High-Performance Computers; 6 Supersonic Retropropulsion for Mars Entry; 7 Validating Water Spray Simulation Models for the SLS Launch Environment; 8 Simulating Moving Valves for Space Launch System Liquid Engines; 9 Innovative Simulations for Modeling the SLS Solid Rocket Booster Ignition; 10 Solid Rocket Booster Ignition Overpressure Simulations for the Space Launch System; 11 CFD Simulations to Support the Next Generation of Launch Pads; 12 Modeling and Simulation Support for NASA's Next-Generation Space Launch System; 13 Simulating Planetary Entry Environments for Space Exploration Vehicles; 14 NASA Center for Climate Simulation Highlights; 15 Ultrascale Climate Data Visualization and Analysis; 16 NASA Climate Simulations and Observations for the IPCC and Beyond; 17 Next-Generation Climate Data Services: MERRA Analytics; 18 Recent Advances in High-Resolution Global Atmospheric Modeling; 19 Causes and Consequences of Turbulence in the Earth's Protective Shield; 20 NASA Earth Exchange (NEX): A Collaborative Supercomputing Platform; 21 Powering Deep Space Missions: Thermoelectric Properties of Complex Materials; 22 Meeting NASA's High-End Computing Goals Through Innovation; 23 Continuous Enhancements to the Pleiades Supercomputer for Maximum Uptime; 24 Live Demonstrations of 100-Gbps File Transfers Across LANs and WANs; 25 Untangling the Computing Landscape for Climate Simulations; 26 Simulating Galaxies and the Universe; 27 The Mysterious Origin of Stellar Masses; 28 Hot-Plasma Geysers on the Sun; 29 Turbulent Life of Kepler Stars; 30 Modeling Weather on the Sun; 31 Weather on Mars: The Meteorology of Gale Crater; 32 Enhancing Performance of NASA's High-End Computing Applications; 33 Designing Curiosity's Perfect Landing on Mars; 34 The Search Continues: Kepler's Quest for Habitable Earth-Sized Planets.
Rapid Automated Aircraft Simulation Model Updating from Flight Data
NASA Technical Reports Server (NTRS)
Brian, Geoff; Morelli, Eugene A.
2011-01-01
Techniques to identify aircraft aerodynamic characteristics from flight measurements and compute corrections to an existing simulation model of a research aircraft were investigated. The purpose of the research was to develop a process enabling rapid automated updating of aircraft simulation models using flight data and apply this capability to all flight regimes, including flight envelope extremes. The process presented has the potential to improve the efficiency of envelope expansion flight testing, revision of control system properties, and the development of high-fidelity simulators for pilot training.
An Implementation of Hydrostatic Boundary Conditions for Variable Density Lattice Boltzmann Methods
NASA Astrophysics Data System (ADS)
Bardsley, K. J.; Thorne, D. T.; Lee, J. S.; Sukop, M. C.
2006-12-01
Lattice Boltzmann Methods (LBMs) have been under development for the last two decades and have become another capable numerical method for simulating fluid flow. Recent advances in lattice Boltzmann applications involve simulation of density-dependent fluid flow in closed (Dixit and Babu, 2006; D'Orazio et al., 2004) or periodic (Guo and Zhao, 2005) domains. However, standard pressure boundary conditions (BCs) are incompatible with concentration-dependent density flow simulations that use a body force for gravity. An implementation of hydrostatic BCs for use under these conditions is proposed here. The basis of this new implementation is an additional term in the pressure BC. It is derived to account for the incorporation of gravity as a body force and the effect of varying concentration in the fluid. The hydrostatic BC expands the potential of density-dependent LBM to simulate domains with boundaries other than the closed or periodic boundaries that have appeared in previous literature on LBM simulations. With this new implementation, LBM will be able to simulate complex concentration-dependent density flows, such as salt water intrusion in the classic Henry and Henry-Hilleke problems. This is demonstrated using various examples, beginning with a closed box system and ending with a system containing two solid walls, one velocity boundary and one pressure boundary, as in the Henry problem. References: Dixit, H. N., V. Babu (2006), Simulation of high Rayleigh number natural convection in a square cavity using the lattice Boltzmann method, Int. J. Heat Mass Transfer, 49, 727-739. D'Orazio, A., M. Corcione, G.P. Celata (2004), Application to natural convection enclosed flows of a lattice Boltzmann BGK model coupled with a general purpose thermal boundary conditions, Int. J. Thermal Sci., 43, 575-586. Guo, Z., T.S. Zhao (2005), Lattice Boltzmann simulation of natural convection with temperature-dependent viscosity in a porous cavity, Numerical Heat Transfer, Part B, 47, 157-177.
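The gist of the added term can be stated compactly (a generic hydrostatic relation consistent with the abstract's description, not the authors' exact discrete form): the pressure imposed at an open boundary must include the body-force contribution integrated over depth, with the fluid density depending on the local concentration,

\[
p(z) = p_0 + g\int_{z}^{z_0} \rho\big(C(z')\big)\,dz',
\]

which in lattice units fixes the boundary-node density through the LBM equation of state \(p = c_s^{2}\rho\). Without the integral term, a constant-pressure boundary is inconsistent with the gravity body force and drives a spurious flow.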
NASA Astrophysics Data System (ADS)
Yan, Hui; Wang, K. G.; Jones, Jim E.
2016-06-01
A parallel algorithm for large-scale three-dimensional phase-field simulations of phase coarsening is developed and implemented on high-performance architectures. From the large-scale simulations, a new kinetics in phase coarsening in the region of ultrahigh volume fraction is found. The parallel implementation is capable of harnessing the greater computer power available from high-performance architectures. The parallelized code enables an increase in three-dimensional simulation system size up to a 512³ grid cube. Through the parallelized code, practical runtime can be achieved for three-dimensional large-scale simulations, and the statistical significance of the results from these high-resolution parallel simulations is greatly improved over that obtainable from serial simulations. A detailed performance analysis on speed-up and scalability is presented, showing good scalability which improves with increasing problem size. In addition, a model for prediction of runtime is developed, which shows good agreement with actual run time from numerical tests.
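A plausible minimal form for such a runtime-prediction model (the functional form, timings, and coefficients below are assumptions for illustration, not the authors' fitted model) splits runtime into a compute term scaling with grid cells per process and a communication term growing with process count:

    import numpy as np

    # Synthetic weak-scaling measurements: (grid cells N, processes P, runtime s).
    runs = np.array([
        (128**3, 8,   410.0),
        (256**3, 64,  520.0),
        (512**3, 512, 640.0),
    ])
    N, P, T = runs.T

    # Least-squares fit of T = a*(N/P) + b*log2(P): compute + communication.
    A = np.column_stack([N / P, np.log2(P)])
    (a, b), *_ = np.linalg.lstsq(A, T, rcond=None)

    predict = lambda n, p: a * n / p + b * np.log2(p)
    print("predicted runtime for 512^3 on 1024 ranks: %.0f s"
          % predict(512**3, 1024))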
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rahnema, Farzad; Garimeela, Srinivas; Ougouag, Abderrafi
2013-11-29
This project will develop a 3D, advanced coarse mesh transport method (COMET-Hex) for steady-state and transient analyses in advanced very high-temperature reactors (VHTRs). The project will lead to a coupled neutronics and thermal hydraulic (T/H) core simulation tool with fuel depletion capability. The computational tool will be developed in hexagonal geometry, based solely on transport theory without (spatial) homogenization in complicated 3D geometries. In addition to the hexagonal geometry extension, collaborators will concurrently develop three additional capabilities to increase the code’s versatility as an advanced and robust core simulator for VHTRs. First, the project team will develop and implement a depletion method within the core simulator. Second, the team will develop an elementary (proof-of-concept) 1D time-dependent transport method for efficient transient analyses. The third capability will be a thermal hydraulic method coupled to the neutronics transport module for VHTRs. Current advancements in reactor core design are pushing VHTRs toward greater core and fuel heterogeneity to pursue higher burn-ups, efficiently transmute used fuel, maximize energy production, and improve plant economics and safety. As a result, an accurate and efficient neutron transport method, with capabilities to treat heterogeneous burnable poison effects, is highly desirable for predicting VHTR neutronics performance. This research project’s primary objective is to advance the state of the art for reactor analysis.
Mars Science Laboratory: Entry, Descent, and Landing System Performance
NASA Technical Reports Server (NTRS)
Way, David W.; Powell, Richard W.; Chen, Allen; SanMartin, A. Miguel; Burkhart, P. Daniel; Mendeck, Gavin F.
2007-01-01
In 2010, the Mars Science Laboratory (MSL) mission will pioneer the next generation of robotic Entry, Descent, and Landing (EDL) systems, by delivering the largest and most capable rover to date to the surface of Mars. To do so, MSL will fly a guided lifting entry at a lift-to-drag ratio in excess of that ever flown at Mars, deploy the largest parachute ever at Mars, and perform a novel Sky Crane maneuver. Through improved altitude capability, increased latitude coverage, and more accurate payload delivery, MSL is allowing the science community to consider the exploration of previously inaccessible regions of the planet. The MSL EDL system is a new EDL architecture based on Viking heritage technologies and designed to meet the challenges of landing increasingly massive payloads on Mars. In accordance with level-1 requirements, the MSL EDL system is being designed to land an 850 kg rover to altitudes as high as 1 km above the Mars Orbiter Laser Altimeter defined areoid within 10 km of the desired landing site. Accordingly, MSL will enter the largest entry mass, fly the largest 70 degree sphere-cone aeroshell, generate the largest hypersonic lift-to-drag ratio, and deploy the largest Disk-Gap-Band supersonic parachute of any previous mission to Mars. Major EDL events include a hypersonic guided entry, supersonic parachute deploy and inflation, subsonic heatshield jettison, terminal descent sensor acquisition, powered descent initiation, sky crane terminal descent, rover touchdown detection, and descent stage flyaway. Key performance metrics, derived from level-1 requirements and tracked by the EDL design team to indicate performance capability and timeline margins, include altitude and range at parachute deploy, time on radar, and propellant use. The MSL EDL system, which will continue to develop over the next three years, will enable a notable extension in the advancement of Mars surface science by delivering more science capability than ever before to the surface of Mars. This paper describes the current MSL EDL system performance as predicted by end-to-end EDL simulations, highlights the sensitivity of this baseline performance to several key environmental assumptions, and discusses some of the challenges faced in delivering such an unprecedented rover payload to the surface of Mars.
Threat radar system simulations
NASA Astrophysics Data System (ADS)
Miller, L.
The capabilities, requirements, and goals of radar emitter simulators are discussed. Simulators are used to evaluate competing receiver designs, to quantify the performance envelope of a radar system, and to model the characteristics of a transmitted signal waveform. A database of candidate threat systems is developed and, in concert with intelligence data on a given weapons system, permits upgrading simulators to new projected threat capabilities. Four currently available simulation techniques are summarized, noting the usefulness of developing modular software for fast controlled-cost upgrades of simulation capabilities.
Evaluating average and atypical response in radiation effects simulations
NASA Astrophysics Data System (ADS)
Weller, R. A.; Sternberg, A. L.; Massengill, L. W.; Schrimpf, R. D.; Fleetwood, D. M.
2003-12-01
We examine the limits of performing single-event simulations using pre-averaged radiation events. Geant4 simulations show the necessity, for future devices, to supplement current methods with ensemble averaging of device-level responses to physically realistic radiation events. Initial Monte Carlo simulations have generated a significant number of extremal events in local energy deposition. These simulations strongly suggest that proton strikes of sufficient energy, even those that initiate purely electronic interactions, can initiate device response capable in principle of producing single event upset or microdose damage in highly scaled devices.
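The difference between responding to a pre-averaged event and ensemble-averaging device responses over physically realistic events can be shown schematically (the heavy-tailed deposition distribution and threshold are invented; the actual study uses Geant4 physics):

    import numpy as np

    rng = np.random.default_rng(1)

    def deposited_energy(n_events):
        """Toy heavy-tailed energy deposition per strike (keV), standing in
        for the event-to-event variability a physics code resolves."""
        return rng.lognormal(mean=1.0, sigma=1.2, size=n_events)

    def device_upsets(E, threshold=40.0):
        """Toy device response: upset only if local deposition tops a threshold."""
        return E > threshold

    E = deposited_energy(100_000)
    print("mean deposition:", round(E.mean(), 2), "keV")
    print("upsets using the mean event only:",
          int(device_upsets(np.array([E.mean()]))[0]))
    print("upsets from the full ensemble:", int(device_upsets(E).sum()))

The mean event sits below threshold and predicts zero upsets, while the tail of the ensemble still produces many, which is the failure mode of pre-averaging the abstract warns about.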
VIPER: Virtual Intelligent Planetary Exploration Rover
NASA Technical Reports Server (NTRS)
Edwards, Laurence; Flueckiger, Lorenzo; Nguyen, Laurent; Washington, Richard
2001-01-01
Simulation and visualization of rover behavior are critical capabilities for scientists and rover operators to construct, test, and validate plans for commanding a remote rover. The VIPER system links these capabilities, using a high-fidelity virtual-reality (VR) environment, a kinematically accurate simulator, and a flexible plan executive to allow users to simulate and visualize possible execution outcomes of a plan under development. This work is part of a larger vision of a science-centered rover control environment, where a scientist may inspect and explore the environment via VR tools, specify science goals, and visualize the expected and actual behavior of the remote rover. The VIPER system is constructed from three generic systems, linked together via a minimal amount of customization into the integrated system. The complete system points out the power of combining plan execution, simulation, and visualization for envisioning rover behavior; it also demonstrates the utility of developing generic technologies, which can be combined in novel and useful ways.
High-Performance Agent-Based Modeling Applied to Vocal Fold Inflammation and Repair.
Seekhao, Nuttiiya; Shung, Caroline; JaJa, Joseph; Mongeau, Luc; Li-Jessen, Nicole Y K
2018-01-01
Fast and accurate computational biology models offer the prospect of accelerating the development of personalized medicine. A tool capable of estimating treatment success can help prevent unnecessary and costly treatments and potentially harmful side effects. A novel high-performance Agent-Based Model (ABM) was adopted to simulate and visualize multi-scale complex biological processes arising in vocal fold inflammation and repair. The computational scheme was designed to organize the 3D ABM sub-tasks to fully utilize the resources available on current heterogeneous platforms consisting of multi-core CPUs and many-core GPUs. Subtasks are further parallelized and convolution-based diffusion is used to enhance the performance of the ABM simulation. The scheme was implemented using a client-server protocol allowing the results of each iteration to be analyzed and visualized on the server (i.e., in situ) while the simulation is running on the same server. The resulting simulation and visualization software enables users to interact with and steer the course of the simulation in real-time as needed. This high-resolution 3D ABM framework was used for a case study of surgical vocal fold injury and repair. The new framework is capable of completing the simulation, visualization and remote result delivery in under 7 s per iteration, where each iteration of the simulation represents 30 min in the real world. The case study model was simulated at the physiological scale of a human vocal fold. This simulation tracks 17 million biological cells as well as a total of 1.7 billion signaling chemical and structural protein data points. The visualization component processes and renders all simulated biological cells and 154 million signaling chemical data points. The proposed high-performance 3D ABM was verified through comparisons with empirical vocal fold data. Representative trends of biomarker predictions in surgically injured vocal folds were observed.
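The convolution-based diffusion used to accelerate the chemical-field update can be sketched as a stencil applied to the whole grid at once (grid size, diffusivity, and time step below are illustrative, and periodic boundaries are assumed for brevity):

    import numpy as np

    def diffuse(field, D=0.1, dt=1.0, steps=10):
        """Explicit diffusion via a 5-point Laplacian stencil, applied to the
        whole concentration grid at once (the 'convolution' update)."""
        for _ in range(steps):
            lap = (np.roll(field, 1, 0) + np.roll(field, -1, 0) +
                   np.roll(field, 1, 1) + np.roll(field, -1, 1) - 4.0 * field)
            field = field + D * dt * lap        # stable for D*dt <= 0.25
        return field

    chem = np.zeros((64, 64))
    chem[32, 32] = 1000.0                       # a point release of signaling chemical
    chem = diffuse(chem)
    print("peak after diffusion:", round(chem.max(), 2),
          "total conserved:", round(chem.sum(), 1))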
NASA Technical Reports Server (NTRS)
Badescu, Mircea; Bonitz, Robert; Kulczycki, Erick; Aisen, Norman; Dandino, Charles M.; Cantrell, Brett S.; Gallagher, William; Shevin, Jesse; Ganino, Anthony; Haddad, Nicolas;
2013-01-01
The 2011 Decadal Survey for planetary science released by the National Research Council of the National Academies identified Comet Surface Sample Return (CSSR) as one of five high-priority potential New Frontiers-class missions in the next decade. The main objectives of the research described in this publication are to: develop a concept for an end-to-end system for collecting and storing a comet sample to be returned to Earth; design, fabricate and test a prototype Dynamic Acquisition and Retrieval Tool (DART) capable of collecting a 500 cc sample in a canister and ejecting the canister at a predetermined speed; and identify a set of simulants whose physical properties at room temperature suitably match the physical properties of the comet surface as it would be sampled. We propose the use of a dart that would be launched from the spacecraft to impact and penetrate the comet surface. After collecting the sample, the sample canister would be ejected at a speed greater than the comet's escape velocity and captured by the spacecraft, packaged into a return capsule, and returned to Earth. The dart would be composed of an inner tube or sample canister, an outer tube, a decelerator, a means of capturing and retaining the sample, and a mechanism to eject the canister with the sample for later rendezvous with the spacecraft. One of the significant unknowns is the physical properties of the comet surface. Based on new findings from the recent Deep Impact comet encounter mission, we have limited our search to sampling materials with 10 to 100 kPa shear strength in loose or consolidated form. As the comet surface temperature is also likely to differ significantly from room temperature, and testing at non-room-temperature conditions can become resource intensive, we sought sample simulants whose room-temperature physical properties are similar to the expected physical properties of the comet surface material. The chosen DART configuration, the efforts to identify test simulants and their properties, and the results of the preliminary testing are described in this paper.
Developing a 3-DOF Compliant Perching Arm for a Free-Flying Robot on the International Space Station
NASA Technical Reports Server (NTRS)
Park, In-Won; Smith, Marion F.; Sanchez, Hugo S.; Wong, Sze Wun; Piacenza, Pedro; Ciocarlie, Matei
2017-01-01
This paper presents the design and control of the 3-DOF compliant perching arm for the free-flying Astrobee robots that will operate inside the International Space Station (ISS). The robots are intended to serve as a flexible platform for future guest scientists to use for zero-gravity robotics research - thus, the arm is designed to support manipulation research. It provides a 1-DOF underactuated tendon-driven gripper capable of enveloping a range of objects of different shapes and sizes. Co-located RGB camera and LIDAR sensors provide perception. The Astrobee robots will be capable of grasping each other in flight, to simulate orbital capture scenarios. The arm's end-effector module is swappable on-orbit, allowing guest scientists to add upgraded grippers, or even additional arm degrees of freedom. The design of the arm balances research capabilities with Astrobee's operational need to perch on ISS handrails to reduce power consumption. Basic arm functioning and grip strength were evaluated using an integrated Astrobee prototype riding on a low-friction air bearing.
NASA Astrophysics Data System (ADS)
Vannitsen, Jordan; Rizzitelli, Federico; Wang, Kaiti; Segret, Boris; Juang, Jyh-Ching; Miau, Jiun-Jih
2017-12-01
This paper presents a Multi-satellite Data Analysis and Simulator Tool (MDAST), developed with the original goal of supporting the science requirements of a Martian 3-Unit CubeSat mission profile named Bleeping Interplanetary Radiation Determination Yo-yo (BIRDY). MDAST was first designed and tested by taking into account the positions, attitudes, instrument fields of view and energetic particle flux measurements from four spacecraft (ACE, MSL, STEREO A, and STEREO B). Second, the simulated positions, attitude and instrument field of view of the BIRDY CubeSat were adapted for input. Finally, this tool can be used for data analysis of the measurements from the four spacecraft mentioned above so as to simulate the instrument trajectory and observation capabilities of the BIRDY CubeSat. The onset, peak and end times of a solar particle event are specifically defined and identified with this tool. It is useful not only for the BIRDY mission but also for analyzing data from the four satellites aforementioned, and it can be utilized for other space weather missions with further customization.
A Procedural Electroencephalogram Simulator for Evaluation of Anesthesia Monitors.
Petersen, Christian Leth; Görges, Matthias; Massey, Roslyn; Dumont, Guy Albert; Ansermino, J Mark
2016-11-01
Recent research and advances in the automation of anesthesia are driving the need to better understand electroencephalogram (EEG)-based anesthesia end points and to test the performance of anesthesia monitors. This effort is currently limited by the need to collect raw EEG data directly from patients. A procedural method to synthesize EEG signals was implemented in a mobile software application. The application is capable of sending the simulated signal to an anesthesia depth-of-hypnosis monitor. Systematic sweeps of the simulator generate functional monitor response profiles reminiscent of how network analyzers are used to test electronic components. Three commercial anesthesia monitors (Entropy, NeuroSENSE, and BIS) were compared with this new technology, and significant response and feature variations between the monitor models were observed; these include reproducible, nonmonotonic apparent multistate behavior and significant hysteresis at light levels of anesthesia. Anesthesia monitor response to a procedural simulator can reveal significant differences in internal signal processing algorithms. The ability to synthesize EEG signals at different anesthetic depths potentially provides a new method for systematically testing EEG-based monitors and automated anesthesia systems with all sensor hardware fully operational before human trials.
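The synthesis algorithm itself is not given in the abstract. The following is a minimal sketch of one plausible procedural approach, with all spectral shapes and band centers assumed: noise whose power distribution slides from beta-dominated (awake) toward delta-dominated (deep) as a depth parameter increases.

```python
import numpy as np

def synth_eeg(depth, fs=128, seconds=10, seed=0):
    """Procedural EEG sketch; depth in [0, 1]: 0 = awake, 1 = deep anesthesia."""
    rng = np.random.default_rng(seed)
    n = fs * seconds
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    awake = np.exp(-((freqs - 20.0) ** 2) / 50.0)   # beta-weighted shape (assumed)
    deep = np.exp(-((freqs - 2.0) ** 2) / 2.0)      # delta-weighted shape (assumed)
    spectrum = (1.0 - depth) * awake + depth * deep
    phases = rng.uniform(0.0, 2.0 * np.pi, len(freqs))
    signal = np.fft.irfft(spectrum * np.exp(1j * phases), n)
    return signal / np.abs(signal).max()

# Sweep depth 0 -> 1 to build a monitor response profile, as the paper does.
sweep = [synth_eeg(d) for d in np.linspace(0.0, 1.0, 11)]
```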
THE VIRTUAL INSTRUMENT: SUPPORT FOR GRID-ENABLED MCELL SIMULATIONS
Casanova, Henri; Berman, Francine; Bartol, Thomas; Gokcay, Erhan; Sejnowski, Terry; Birnbaum, Adam; Dongarra, Jack; Miller, Michelle; Ellisman, Mark; Faerman, Marcio; Obertelli, Graziano; Wolski, Rich; Pomerantz, Stuart; Stiles, Joel
2010-01-01
Ensembles of widely distributed, heterogeneous resources, or Grids, have emerged as popular platforms for large-scale scientific applications. In this paper we present the Virtual Instrument project, which provides an integrated application execution environment that enables end-users to run and interact with running scientific simulations on Grids. This work is performed in the specific context of MCell, a computational biology application. While MCell provides the basis for running simulations, its capabilities are currently limited in terms of scale, ease-of-use, and interactivity. These limitations preclude usage scenarios that are critical for scientific advances. Our goal is to create a scientific “Virtual Instrument” from MCell by allowing its users to transparently access Grid resources while being able to steer running simulations. In this paper, we motivate the Virtual Instrument project and discuss a number of relevant issues and accomplishments in the area of Grid software development and application scheduling. We then describe our software design and report on the current implementation. We verify and evaluate our design via experiments with MCell on a real-world Grid testbed. PMID:20689618
Urban public transit systems modeling capabilities
DOT National Transportation Integrated Search
1995-02-01
Current national transportation policy places increasing emphasis on multi-modal solutions involving public transit and high-occupancy vehicle (HOV) facilities and services. Current traffic simulation/assignment models, however, have only limit...
The Exploration of Mars Launch and Assembly Simulation
NASA Technical Reports Server (NTRS)
Cates, Grant; Stromgren, Chel; Mattfeld, Bryan; Cirillo, William; Goodliff, Kandyce
2016-01-01
Advancing human exploration of space beyond Low Earth Orbit, and ultimately to Mars, is of great interest to NASA, other organizations, and space exploration advocates. Various strategies for getting to Mars have been proposed. These include NASA's Design Reference Architecture 5.0, a near-term flyby of Mars advocated by the group Inspiration Mars, and potential options developed for NASA's Evolvable Mars Campaign. Regardless of which approach is used to get to Mars, all share a need to visualize and analyze the proposed campaign and evaluate the feasibility of its launch and on-orbit assembly segment. The launch and assembly segment starts with flight hardware manufacturing and ends with final departure of a Mars Transfer Vehicle (MTV), or set of MTVs, from an assembly orbit near Earth. This paper describes a discrete-event-simulation-based strategic visualization and analysis tool that can be used to evaluate the launch campaign reliability of any proposed strategy for exploration beyond low Earth orbit. The input to the simulation can be any manifest of multiple launches and their associated transit operations between Earth and the exploration destinations, including Earth orbit, lunar orbit, asteroids, moons of Mars, and ultimately Mars. The simulation output includes expected launch dates and ascent outcomes, i.e., success or failure. Running 1,000 replications of the simulation provides the capability to perform launch campaign reliability analysis to determine the probability that all launches occur in a timely manner to support departure opportunities and to deliver their payloads to the intended orbit. This allows for quantitative comparisons between alternative scenarios, as well as the capability to analyze options for improving launch campaign reliability. Results are presented for representative strategies.
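As a hedged illustration of the replication idea described above (the manifest, slip distributions, and reliabilities below are invented, not program values), a launch-campaign reliability estimate reduces to counting the fraction of replications in which every launch succeeds before the departure window closes:

```python
import random

MANIFEST = [   # (mean days between launches, per-launch ascent reliability)
    (90, 0.98), (90, 0.98), (120, 0.95), (60, 0.98),
]
WINDOW_DAYS = 500   # time available before the departure opportunity closes

def one_replication(rng):
    t = 0.0
    for mean_gap, reliability in MANIFEST:
        t += rng.expovariate(1.0 / mean_gap)   # stochastic launch slip
        if rng.random() > reliability:         # ascent failure ends the campaign
            return False
    return t <= WINDOW_DAYS

rng = random.Random(42)
runs = 1000
successes = sum(one_replication(rng) for _ in range(runs))
print(f"Estimated launch campaign reliability: {successes / runs:.1%}")
```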
Welch, M C; Kwan, P W; Sajeev, A S M
2014-10-01
Agent-based modelling has proven to be a promising approach for developing rich simulations of complex phenomena that provide decision support functions across a broad range of areas, including the biological, social and agricultural sciences. This paper demonstrates how high performance computing technologies, namely General-Purpose Computing on Graphics Processing Units (GPGPU), and commercial Geographic Information Systems (GIS) can be applied to develop a national scale, agent-based simulation of an incursion of Old World Screwworm fly (OWS fly) into the Australian mainland. The development of this simulation model leverages the combination of massively data-parallel processing capabilities supported by NVidia's Compute Unified Device Architecture (CUDA) and the advanced spatial visualisation capabilities of GIS. These technologies have enabled the implementation of an individual-based, stochastic lifecycle and dispersal algorithm for the OWS fly invasion. The simulation model draws upon a wide range of biological data as input to stochastically determine the reproduction and survival of the OWS fly through the different stages of its lifecycle and the dispersal of gravid females. Through this model, a highly efficient computational platform has been developed for studying the effectiveness of control and mitigation strategies and their associated economic impact on livestock industries.
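A minimal sketch of the individual-based, stochastic lifecycle-and-dispersal idea follows. All rates, the grid, and the periodic edges are invented for illustration; a CUDA implementation would evaluate the same per-cell updates in parallel on the GPU.

```python
import numpy as np

rng = np.random.default_rng(1)
eggs, larvae, adults = (np.zeros((64, 64)) for _ in range(3))
adults[32, 32] = 100.0   # point incursion

def step(eggs, larvae, adults, survival=(0.5, 0.6, 0.8), fecundity=5.0):
    laid = rng.poisson(fecundity * adults / 2)   # gravid females lay eggs
    eggs, larvae, adults = (
        rng.binomial(laid, survival[0]).astype(float),
        rng.binomial(eggs.astype(int), survival[1]).astype(float),
        rng.binomial(larvae.astype(int), survival[2]).astype(float)
        + adults * survival[2],
    )
    moved = adults * 0.2                         # adult dispersal, periodic edges
    adults = adults - moved + 0.25 * (
        np.roll(moved, 1, 0) + np.roll(moved, -1, 0)
        + np.roll(moved, 1, 1) + np.roll(moved, -1, 1))
    return eggs, larvae, adults

for _ in range(52):   # one year of weekly steps
    eggs, larvae, adults = step(eggs, larvae, adults)
print("adults on grid after one year:", adults.sum())
```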
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bieniosek, F.M.; Anders, A.; Barnard, J.J.
This effort contains two main components: The new induction-bunching module is expected to deliver higher fluence in the bunched beam, and the new target positioner will enable a significantly enhanced target physics repetition rate. The velocity ramp that bunches the K+ beam in the neutralized drift compression section is established with a bipolar voltage ramp applied to an acceleration gap. An induction acceleration module creates this voltage waveform. The new bunching module (IBM) specially built for NDCX has approximately twice the capability (volt-seconds) of our original IBM. We reported on the beam line design for the best use of the bunching module in our FY08 Q2 report. Based on simulations and theoretical work, we chose to extend the drift compression section and use the additional volt-seconds to extend the pulse duration and keep the peak voltage swing (and velocity excursions) similar to the present module. Simulations showed this approach, which extends the drift section, to be advantageous because it limits the chromatic aberrations in the beam spot on target. To this end, colleagues at PPPL have fabricated the meter-long extension to the ferroelectric plasma source, and it was installed on the beam line with the new IBM in January 2009. Simulation results suggest a factor of two increase in energy deposition from the bunched beam. In the first WDM target run (August-November 2008) the target handling setup required opening the vacuum system to manually replace the target after each shot (which destroys the target). Because of the requirement for careful alignment of each individual target, the target shot repetition rate was no greater than 1 shot per day. Initial results of this run are reported in our FY08 4th Quarter Milestone Report. Based on the valuable experience gained in the initial run, we have designed and installed an improved target alignment and positioning system with the capability to reposition targets remotely. This capability allows us to significantly increase our shot repetition rate, and to take greater advantage of the pinhole/cone arrangement we have developed to localize the beam at final focus. In addition we have improved the capability of the optical diagnostic systems, and we have installed a new beam current transformer downstream of the target to monitor beam current transmitted through the target during an experiment. These improvements will allow us to better exploit the inherent capability of the NDCX facility for high repetition rate and thus to provide more detailed experimental data to assess WDM physics models of target behavior. This milestone has been met by demonstrating highly compressed beams with the new bunching module, which are neutralized in the longer drift compression section by the new ferro-electric plasma sources. The peak uncompressed beam intensity (~600 kW/cm²) is higher than in previous measurements, and the bunched beam current profiles are ~2 ns. We have also demonstrated a large increase in the experimental data acquisition rate for target heating experiments. In the first test of the new remote-controlled target positioning system, we completed three successful target physics shots in less than two hours. Further improvements are expected.
NASA Astrophysics Data System (ADS)
Bel Hadj Kacem, Mohamed Salah
All hydrological processes are affected by the spatial variability of the physical parameters of the watershed, and also by human intervention on the landscape. The water outflow from a watershed strictly depends on the spatial and temporal variability of its physical parameters. It is now apparent that the integration of mathematical models into GISs can benefit both GIS and three-dimensional environmental models: a true modeling capability can help the modeling community bridge the gap between planners, scientists, decision-makers and end-users. The main goal of this research is to design a practical tool to simulate run-off water surface using Geographic Information Systems and to simulate the hydrological behavior of the watershed by the Finite Element Method.
A dynamic simulation based water resources education tool.
Williams, Alison; Lansey, Kevin; Washburne, James
2009-01-01
Educational tools to assist the public in recognizing impacts of water policy in a realistic context are not generally available. This project developed modeling-based educational decision support simulation tools to satisfy this need. The goal of this model is to teach undergraduate students and the general public about the implications of common water management alternatives so that they can better understand or become involved in water policy and make more knowledgeable personal or community decisions. The model is based on Powersim, a dynamic simulation software package capable of producing web-accessible, intuitive, graphic, user-friendly interfaces. Modules are included to represent residential, agricultural, industrial, and turf uses, as well as non-market values, water quality, reservoir, flow, and climate conditions. Supplementary materials emphasize important concepts and lead learners through the model, culminating in an open-ended water management project. The model is used in a University of Arizona undergraduate class and within the Arizona Master Watershed Stewards Program. Evaluation results demonstrated improved understanding of concepts and system interactions, fulfilling the project's objectives.
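A stock-and-flow model of the kind described is straightforward to sketch. The toy below uses invented coefficients (not the Powersim model's) to show how a conservation policy scenario is compared against a baseline:

```python
def simulate(years=30, storage=1000.0, inflow=400.0,
             residential=200.0, agricultural=250.0, conservation=0.0):
    """Yearly reservoir storage under a toy water-balance model (units arbitrary)."""
    history = []
    for _ in range(years):
        demand = (residential + agricultural) * (1.0 - conservation)
        storage = max(0.0, min(2000.0, storage + inflow - demand))  # capped reservoir
        history.append(storage)
    return history

baseline = simulate()
with_policy = simulate(conservation=0.15)   # 15% conservation scenario
print(f"Year-30 storage: {baseline[-1]:.0f} (baseline) vs {with_policy[-1]:.0f} (policy)")
```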
The Integrated Mission Design Center (IMDC) at NASA Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Karpati, Gabriel; Martin, John; Steiner, Mark; Reinhardt, K.
2002-01-01
NASA Goddard has used its Integrated Mission Design Center (IMDC) to perform more than 150 mission concept studies. The IMDC performs rapid development of high-level, end-to-end mission concepts, typically in just 4 days. The approach to the studies varies, depending on whether the proposed mission is near-future using existing technology, mid-future using new technology being actively developed, or far-future using technology which may not yet be clearly defined. The emphasis and level of detail developed during any particular study depends on which timeframe (near-, mid-, or far-future) is involved and the specific needs of the study client. The most effective mission studies are those where mission capabilities required and emerging technology developments can synergistically work together; thus both enhancing mission capabilities and providing impetus for ongoing technology development.
Valente, Virgilio; Dai Jiang; Demosthenous, Andreas
2015-08-01
This paper presents the preliminary design and simulation of a flexible and programmable analog front-end (AFE) circuit with current and voltage readout capabilities for electrical impedance spectroscopy (EIS). The AFE is part of a fully integrated multifrequency EIS platform. The current readout comprises a transimpedance stage and an automatic gain control (AGC) unit designed to accommodate impedance changes of more than three orders of magnitude. The AGC is based on a dynamic peak detector that tracks changes in the input current over time and regulates the gain of a programmable gain amplifier in order to optimise the signal-to-noise ratio. The system works up to 1 MHz. The voltage readout consists of two stages of fully differential current-feedback instrumentation amplifiers, which provide 100 dB of CMRR and a programmable gain of up to 20 V/V per stage with a bandwidth in excess of 10 MHz.
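The abstract describes the AGC behaviorally; the sketch below mirrors that behavior at the sample level, with attack/decay constants, thresholds, and gain steps assumed rather than taken from the chip.

```python
GAINS = [1, 10, 100, 1000]   # > 3 orders of magnitude of programmable gain

def agc_step(sample, peak, gain_idx, attack=0.5, decay=0.99, low=0.1, high=0.8):
    """Process one sample; returns (output, new_peak, new_gain_idx)."""
    out = sample * GAINS[gain_idx]
    x = abs(out)
    peak = peak + attack * (x - peak) if x > peak else peak * decay
    if peak > high and gain_idx > 0:
        gain_idx -= 1            # signal grew: step the gain down
    elif peak < low and gain_idx < len(GAINS) - 1:
        gain_idx += 1            # signal shrank: step the gain up
    return out, peak, gain_idx

# Example: the gain word recovers a 1000x drop in input amplitude.
peak, idx = 0.0, 0
for amp in [1.0] * 500 + [0.001] * 500:
    out, peak, idx = agc_step(amp, peak, idx)
print("final gain setting:", GAINS[idx])
```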
Degree-constrained multicast routing for multimedia communications
NASA Astrophysics Data System (ADS)
Wang, Yanlin; Sun, Yugeng; Li, Guidan
2005-02-01
Multicast services have been increasingly used by many multimedia applications. As one of the key techniques supporting multimedia applications, rational and effective multicast routing algorithms are very important to network performance. When switch nodes in a network have different multicast capabilities, the multicast routing problem is modeled as the degree-constrained Steiner problem. We present two heuristic algorithms, named BMSTA and BSPTA, for the degree-constrained case in multimedia communications. Both algorithms generate degree-constrained multicast trees with bandwidth and end-to-end delay bounds. Simulations over random networks were carried out to compare the performance of the two proposed algorithms. Experimental results show that the proposed algorithms have advantages in traffic load balancing, which can avoid link blocking and enhance network performance efficiently. BMSTA is better than BSPTA at finding unsaturated links and/or unsaturated nodes with which to generate multicast trees. The performance of BMSTA is affected by the variation of the degree constraints.
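The paper's BMSTA and BSPTA algorithms are not reproduced here; the following is a generic sketch in their spirit: grow a cheapest-delay tree toward each destination while refusing to branch from nodes whose degree budget is exhausted.

```python
import heapq
from itertools import count

def dc_multicast_tree(graph, degree_cap, source, destinations):
    """graph: {u: {v: delay}}; returns a set of tree edges, or None if blocked."""
    tree, in_tree = set(), {source}
    degree = {u: 0 for u in graph}
    tick = count()   # tie-breaker so heap entries never compare nodes
    for dest in destinations:
        # Cheapest-delay search from the current tree toward dest.
        pq = [(0.0, next(tick), u, None) for u in in_tree]
        heapq.heapify(pq)
        parent, seen = {}, set()
        while pq:
            d, _, u, p = heapq.heappop(pq)
            if u in seen:
                continue
            seen.add(u)
            parent[u] = p
            if u == dest:
                break
            if degree[u] >= degree_cap[u]:    # saturated switch: cannot branch
                continue
            for v, w in graph[u].items():
                if v not in seen:
                    heapq.heappush(pq, (d + w, next(tick), v, u))
        if dest not in parent:
            return None                       # no degree-feasible route
        node = dest
        while node not in in_tree:            # splice the new path into the tree
            tree.add((parent[node], node))
            degree[parent[node]] += 1
            degree[node] += 1
            in_tree.add(node)
            node = parent[node]
    return tree

net = {0: {1: 1, 2: 4}, 1: {0: 1, 2: 1, 3: 5}, 2: {0: 4, 1: 1, 3: 1}, 3: {1: 5, 2: 1}}
caps = {0: 3, 1: 2, 2: 3, 3: 3}
print(dc_multicast_tree(net, caps, source=0, destinations=[3, 2]))
```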
Reactive transport codes for subsurface environmental simulation
Steefel, C. I.; Appelo, C. A. J.; Arora, B.; ...
2014-09-26
A general description of the mathematical and numerical formulations used in modern numerical reactive transport codes relevant for subsurface environmental simulations is presented. The formulations are followed by short descriptions of commonly used and available subsurface simulators that consider continuum representations of flow, transport, and reactions in porous media. These formulations are applicable to most of the subsurface environmental benchmark problems included in this special issue. The list of codes described briefly here includes PHREEQC, HPx, PHT3D, OpenGeoSys (OGS), HYTEC, ORCHESTRA, TOUGHREACT, eSTOMP, HYDROGEOCHEM, CrunchFlow, MIN3P, and PFLOTRAN. The descriptions include a high-level list of capabilities for each of the codes, along with a selective list of applications that highlight their capabilities and historical development.
Scenario management and automated scenario generation
NASA Astrophysics Data System (ADS)
McKeever, William; Gilmour, Duane; Lehman, Lynn; Stirtzinger, Anthony; Krause, Lee
2006-05-01
The military planning process utilizes simulation to determine the appropriate course of action (COA) that will achieve a campaign end state. However, due to the difficulty of developing and generating simulation-level COAs, only a few COAs are simulated. This may have been appropriate for traditional conflicts, but the evolution of warfare from attrition-based to effects-based strategies, as well as the complexities of 4th-generation warfare and asymmetric adversaries, has placed additional demands on military planners and simulation. To keep pace with this dynamic, changing environment, planners must be able to perform continuous, multiple, "what-if" COA analysis. Scenario management and generation are critical elements to achieving this goal. An effects-based scenario generation research project demonstrated the feasibility of automated scenario generation techniques which support multiple stove-pipe and emerging broad-scope simulations. This paper discusses a case study in which the scenario generation capability was employed to support COA simulations to identify plan effectiveness. The study demonstrated the effectiveness of using multiple simulation runs to evaluate the effectiveness of alternate COAs in achieving the overall campaign (metrics-based) objectives. The paper also discusses how scenario generation technology can be employed to allow military commanders and mission planning staff to understand the impact of command decisions on the battlespace of tomorrow.
Ruano, M V; Ribes, J; Seco, A; Ferrer, J
2011-01-01
This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables easy design of control systems and strategies applied to wastewater treatment plants (WWTPs). Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system on the PC of the full-scale WWTP. The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC server (OLE for Process Control), which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, control system performance can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be applied directly to the full-scale WWTP.
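The tune-by-simulation workflow can be sketched as follows. The surrogate plant model and PI gains here are invented; in the real tool the same controller code would be pointed at the plant SCADA via OPC rather than at the surrogate.

```python
def plant(do, airflow, dt=60.0):
    """Crude dissolved-oxygen surrogate: aeration transfer minus biomass uptake."""
    kla = 5e-5 * airflow                 # oxygen transfer coefficient, 1/s (assumed)
    return do + dt * (kla * (9.0 - do) - 5e-4)

def pi_controller(error, integral, kp=20.0, ki=0.001, dt=60.0):
    integral += error * dt
    airflow = max(0.0, min(100.0, kp * error + ki * integral))  # actuator limits
    return airflow, integral

do, integral, setpoint = 1.0, 0.0, 2.0   # mg/L dissolved oxygen
for _ in range(120):                     # two simulated hours at 1-min steps
    airflow, integral = pi_controller(setpoint - do, integral)
    do = plant(do, airflow)
print(f"DO after 2 h: {do:.2f} mg/L (setpoint {setpoint} mg/L)")
```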
NASA Astrophysics Data System (ADS)
Utzmann, Jens; Flohrer, Tim; Schildknecht, Thomas; Wagner, Axel; Silha, Jiri; Willemsen, Philip; Teston, Frederic
This paper presents the capabilities of a Space-Based Space Surveillance (SBSS) demonstration mission for Space Surveillance and Tracking (SST) based on a micro-satellite platform. The results have been produced in the frame of ESA's "Assessment Study for Space Based Space Surveillance Demonstration Mission" performed by the Airbus Defence and Space consortium. Space Surveillance and Tracking is part of Space Situational Awareness (SSA) and covers the detection, tracking and cataloguing of space debris and satellites. Derived SST services comprise a catalogue of these man-made objects, collision warning, detection and characterisation of in-orbit fragmentations, sub-catalogue debris characterisation, etc. The assessment of SBSS in an SST system architecture has shown that both an operational SBSS and a well-designed space-based demonstrator can provide substantial performance in terms of surveillance and tracking of beyond-LEO objects. Especially the early deployment of a demonstrator, possible by using standard equipment, could boost initial operating capability and create a self-maintained object catalogue. Furthermore, unique statistical information about small-size LEO debris (mm size) can be collected in-situ. Unlike classical technology demonstration missions, the primary goal is the demonstration and optimisation of the functional elements in a complex end-to-end chain (mission planning, observation strategies, data acquisition, processing and fusion, etc.) until the final products can be offered to the users. Also past and current missions by the US (SBV, SBSS) and Canada (Sapphire, NEOSSat) underline the advantages of space-based space surveillance. The presented SBSS system concept takes the ESA SST System Requirements (derived within the ESA SSA Preparatory Program) into account and aims at fulfilling SST core requirements in a stand-alone manner. Additionally, requirements for detection and characterisation of small-sized LEO debris are considered. The evaluation of the concept has shown that such a solution can be implemented with low technological effort and risk. The paper presents details of the system concept, candidate micro-satellite platforms, the observation strategy and the results of performance simulations for space debris coverage and cataloguing accuracy.
Advanced Ground Systems Maintenance Physics Models for Diagnostics Project
NASA Technical Reports Server (NTRS)
Harp, Janicce Leshay
2014-01-01
The project will use high-fidelity physics models and simulations to simulate real-time operations of cryogenic systems and calculate their status/health. The project enables the delivery of system health advisories to ground system operators. The capability will also be used to conduct planning and analysis of cryogenic system operations.
NASA Technical Reports Server (NTRS)
Wong, Yen F.; Kegege, Obadiah; Schaire, Scott H.; Bussey, George; Altunc, Serhat; Zhang, Yuwen; Patel, Chitra
2016-01-01
National Aeronautics and Space Administration (NASA) CubeSat missions are expected to grow rapidly in the next decade. Higher data rate CubeSats are transitioning away from Amateur Radio bands to higher frequency bands. A high-level communication architecture for future space-to-ground CubeSat communication was proposed within NASA Goddard Space Flight Center. This architecture addresses CubeSat direct-to-ground communication, CubeSat to Tracking Data Relay Satellite System (TDRSS) communication, CubeSat constellation with Mothership direct-to-ground communication, and CubeSat constellation with Mothership communication through K-Band Single Access (KSA). A study has been performed to explore this communication architecture, through simulations, analyses, and identifying technologies, to develop the optimum communication concepts for CubeSat communications. This paper will present details of the simulation and analysis that include CubeSat swarm, daughter ship/mother ship constellation, Near Earth Network (NEN) S- and X-band direct-to-ground links, TDRS Multiple Access (MA) array vs. Single Access mode, notional transceiver/antenna configurations, ground asset configurations, and Code Division Multiple Access (CDMA) signal trades for daughter/mother CubeSat constellation inter-satellite crosslinks. Results of the Space Science X-band 10 MHz maximum achievable data rate study will be summarized. An assessment of the Technology Readiness Level (TRL) of current CubeSat communication technology capabilities will be presented. Compatibility testing of CubeSat transceivers through the NEN and Space Network (SN) will be discussed. Based on the analyses, signal trade studies, and technology assessments, the functional design and performance requirements as well as operations concepts for future CubeSat end-to-end communications will be derived.
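A maximum-achievable-data-rate study of the kind summarized above rests on the standard link equation. The sketch below uses notional values, not the study's numbers, for a CubeSat X-band downlink.

```python
BOLTZMANN_DBW = -228.6   # 10*log10(k), dBW/(K*Hz)

def max_data_rate_bps(eirp_dbw, path_loss_db, rx_gt_dbk, req_ebn0_db, margin_db=3.0):
    """Solve Eb/N0 = EIRP - Lpath + G/T - 10log10(k) - 10log10(R) for the rate R."""
    cn0_dbhz = eirp_dbw - path_loss_db + rx_gt_dbk - BOLTZMANN_DBW
    rate_db = cn0_dbhz - req_ebn0_db - margin_db
    return 10.0 ** (rate_db / 10.0)

# Notional X-band CubeSat downlink to a NEN-class ground station.
rate = max_data_rate_bps(eirp_dbw=5.0, path_loss_db=180.0,
                         rx_gt_dbk=31.0, req_ebn0_db=4.0)
print(f"Maximum achievable data rate: {rate / 1e6:.0f} Mbps")
```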
Simulator certification methods and the vertical motion simulator
NASA Technical Reports Server (NTRS)
Showalter, T. W.
1981-01-01
The vertical motion simulator (VMS) is designed to simulate a variety of experimental helicopter and STOL/VTOL aircraft, as well as other kinds of aircraft with special pitch and Z-axis characteristics. The VMS includes a large motion base with extensive vertical and lateral travel capabilities, a computer-generated-image visual system, and a high-speed CDC 7600 computer system, which performs the aero model calculations. Guidelines on how to measure and evaluate VMS performance were developed. A survey of simulation users was conducted to ascertain how they evaluated and certified simulators for use. The results are presented.
NASA Astrophysics Data System (ADS)
Jizhi, Liu; Xingbi, Chen
2009-12-01
A new quasi-three-dimensional (quasi-3D) numeric simulation method for a high-voltage level-shifting circuit structure is proposed. The performances of the 3D structure are analyzed by combining several 2D device structures; the 2D devices lie in two planes perpendicular to each other and to the surface of the semiconductor. In comparison with Davinci, the full 3D device simulation tool, the quasi-3D simulation method gives results for the potential and current distribution of the 3D high-voltage level-shifting circuit structure with appropriate accuracy, and the total CPU time for simulation is significantly reduced. The quasi-3D simulation technique can be used in many cases, with advantages such as saving computing time, making no demands on high-end computing hardware, and being easy to operate.
Catalytic Ignition and Upstream Reaction Propagation in Monolith Reactors
NASA Technical Reports Server (NTRS)
Struk, Peter M.; Dietrich, Daniel L.; Miller, Fletcher J.; T'ien, James S.
2007-01-01
Using numerical simulations, this work demonstrates a concept called back-end ignition for lighting-off and pre-heating a catalytic monolith in a power generation system. In this concept, a downstream heat source (e.g. a flame) or resistive heating in the downstream portion of the monolith initiates a localized catalytic reaction which subsequently propagates upstream and heats the entire monolith. The simulations used a transient numerical model of a single catalytic channel which characterizes the behavior of the entire monolith. The model treats both the gas and solid phases and includes detailed homogeneous and heterogeneous reactions. An important parameter in the model for back-end ignition is upstream heat conduction along the solid. The simulations used both dry and wet CO chemistry as a model fuel for the proof-of-concept calculations; the presence of water vapor can trigger homogeneous reactions, provided that gas-phase temperatures are adequately high and there is sufficient fuel remaining after surface reactions. With sufficiently high inlet equivalence ratio, back-end ignition occurs using the thermophysical properties of both a ceramic and metal monolith (coated with platinum in both cases), with the heat-up times significantly faster for the metal monolith. For lower equivalence ratios, back-end ignition occurs without upstream propagation. Once light-off and propagation occur, the inlet equivalence ratio could be reduced significantly while still maintaining an ignited monolith as demonstrated by calculations using complete monolith heating.
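A toy one-dimensional calculation makes the back-end ignition mechanism concrete: heat the downstream end, let conduction along the solid carry heat upstream, and let any cell above a light-off temperature release reaction heat. All properties below are invented, and the full model's gas-phase chemistry is omitted.

```python
import numpy as np

n, alpha, dt, dx = 100, 1e-5, 0.01, 0.005   # cells, m^2/s, s, m (all invented)
T = np.full(n, 300.0)                        # solid temperature, K
T_IGNITE, Q_REACTION = 600.0, 800.0          # light-off K, effective K/s release

for _ in range(50000):
    lap = np.zeros(n)
    lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
    T += dt * (alpha * lap + Q_REACTION * (T > T_IGNITE))
    T[-1] = max(T[-1], 1000.0)               # downstream ignition source
    T = np.minimum(T, 1400.0)                # crude cap (reactant depletion)

front = int(np.argmax(T > T_IGNITE))
print(f"reaction front has walked upstream to cell {front} of {n}")
```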
NASA Astrophysics Data System (ADS)
Liu, C. L.; Kirchengast, G.; Zhang, K. F.; Norman, R.; Li, Y.; Zhang, S. C.; Carter, B.; Fritzer, J.; Schwaerz, M.; Choy, S. L.; Wu, S. Q.; Tan, Z. X.
2013-09-01
Global Navigation Satellite System (GNSS) radio occultation (RO) is an innovative meteorological remote sensing technique for measuring atmospheric parameters such as refractivity, temperature, water vapour and pressure for the improvement of numerical weather prediction (NWP) and global climate monitoring (GCM). GNSS RO has many unique characteristics, including global coverage, long-term stability of observations, and high accuracy and high vertical resolution of the derived atmospheric profiles. One of the main error sources in GNSS RO observations that significantly affects the accuracy of the derived atmospheric parameters in the stratosphere is the ionospheric error. In order to mitigate the effect of this error, the linear ionospheric correction approach for dual-frequency GNSS RO observations is commonly used. However, the residual ionospheric errors (RIEs) can still be significant, especially when large ionospheric disturbances occur and prevail, such as during periods of active space weather. In this study, the RIEs were investigated under different local time, propagation direction and solar activity conditions, and their effects on RO bending angles were characterised using end-to-end simulations. A three-step simulation study was designed to investigate the characteristics of the RIEs by comparing the bending angles with and without the effects of the RIEs. This research forms an important step forward in improving the accuracy of the atmospheric profiles derived from the GNSS RO technique.
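The linear dual-frequency correction referred to above is the standard frequency-weighted combination of bending angles (the form commonly attributed to Vorob'ev and Krasil'nikova); the RIE studied in this paper is whatever this first-order combination leaves behind.

```python
F1, F2 = 1575.42e6, 1227.60e6   # GPS L1/L2 carrier frequencies, Hz

def corrected_bending_angle(alpha1, alpha2):
    """First-order ionosphere-free combination of dual-frequency bending angles."""
    return (F1**2 * alpha1 - F2**2 * alpha2) / (F1**2 - F2**2)

# Example: the L1/L2 bending angles differ slightly due to ionospheric dispersion.
a1, a2 = 0.02000, 0.02003       # bending angles in radians (illustrative values)
print(corrected_bending_angle(a1, a2))   # ionosphere removed to first order
```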
40 CFR 264.191 - Assessment of existing tank system's integrity.
Code of Federal Regulations, 2010 CFR
2010-07-01
...); and (5) Results of a leak test, internal inspection, or other tank integrity examination such that: (i) For non-enterable underground tanks, the assessment must include a leak test that is capable of taking into account the effects of temperature variations, tank end deflection, vapor pockets, and high water...
DEVELOPMENT AND DEMONSTRATION OF CONCEPTS FOR IMPROVING COKE-OVEN DOOR SEALS
The report discusses the design, laboratory scale tests, construction, and field tests of an improved metal-to-metal seal for coke-oven end doors. Basic features of the seal are: high-strength temperature-resistant steel capable of 3 times the deflection of current seals without ...
The MeqTrees software system and its use for third-generation calibration of radio interferometers
NASA Astrophysics Data System (ADS)
Noordam, J. E.; Smirnov, O. M.
2010-12-01
Context. The formulation of the radio interferometer measurement equation (RIME) for a generic radio telescope by Hamaker et al. has provided us with an elegant mathematical apparatus for better understanding, simulation and calibration of existing and future instruments. The calibration of the new radio telescopes (LOFAR, SKA) would be unthinkable without the RIME formalism, and new software to exploit it. Aims: The MeqTrees software system is designed to implement numerical models, and to solve for arbitrary subsets of their parameters. It may be applied to many problems, but was originally geared towards implementing Measurement Equations in radio astronomy for the purposes of simulation and calibration. The technical goal of MeqTrees is to provide a tool for rapid implementation of such models, while offering performance comparable to hand-written code. We are also pursuing the wider goal of increasing the rate of evolution of radio astronomical software, by offering a tool that facilitates rapid experimentation, and exchange of ideas (and scripts). Methods: MeqTrees is implemented as a Python-based front-end called the meqbrowser, and an efficient (C++-based) computational back-end called the meqserver. Numerical models are defined on the front-end via a Python-based Tree Definition Language (TDL), then rapidly executed on the back-end. The use of TDL facilitates an extremely short turn-around time (hours rather than weeks or months) for experimentation with new ideas. This is also helped by unprecedented visualization capabilities for all final and intermediate results. A flexible data model and a number of important optimizations in the back-end ensures that the numerical performance is comparable to that of hand-written code. Results: MeqTrees is already widely used as the simulation tool for new instruments (LOFAR, SKA) and technologies (focal plane arrays). It has demonstrated that it can achieve a noise-limited dynamic range in excess of a million, on WSRT data. It is the only package that is specifically designed to handle what we propose to call third-generation calibration (3GC), which is needed for the new generation of giant radio telescopes, but can also improve the calibration of existing instruments.
Simulated breeding with QU-GENE graphical user interface.
Hathorn, Adrian; Chapman, Scott; Dieters, Mark
2014-01-01
Comparing the efficiencies of breeding methods with field experiments is a costly, long-term process. QU-GENE is a highly flexible genetic and breeding simulation platform capable of simulating the performance of a range of different breeding strategies across a continuum of genetic models ranging from simple to complex. In this chapter we describe some of the basic mechanics behind the QU-GENE user interface and give a simplified example of how it works.
NASA Astrophysics Data System (ADS)
Dhakal, B.; Nicholson, D. E.; Saleeb, A. F.; Padula, S. A., II; Vaidyanathan, R.
2016-09-01
Shape memory alloy (SMA) actuators often operate under a complex state of stress for an extended number of thermomechanical cycles in many aerospace and engineering applications. Hence, it becomes important to account for multi-axial stress states and deformation characteristics (which evolve with thermomechanical cycling) when calibrating any SMA model for implementation in large-scale simulation of actuators. To this end, the present work is focused on the experimental validation of an SMA model calibrated for the transient and cyclic evolutionary behavior of shape memory Ni49.9Ti50.1, for the actuation of axially loaded helical-coil springs. The approach requires both experimental and computational aspects to appropriately assess the thermomechanical response of these multi-dimensional structures. As such, an instrumented and controlled experimental setup was assembled to obtain temperature, torque, degree of twist and extension, while controlling end constraints during heating and cooling of an SMA spring under a constant externally applied axial load. The computational component assesses the capabilities of a general, multi-axial, SMA material-modeling framework, calibrated for Ni49.9Ti50.1, with regard to its usefulness in the simulation of SMA helical-coil spring actuators. Axial extension, being the primary response, was examined on an axially loaded spring with multiple active coils. Two different conditions of end boundary constraint were investigated in both the numerical simulations and the validation experiments: Case (1), where the loading end is restrained against twist (and the resulting torque measured as the secondary response), and Case (2), where the loading end is free to twist (and the degree of twist measured as the secondary response). The present study focuses on the transient and evolutionary response associated with the initial isothermal loading and the subsequent thermal cycles under applied constant axial load. The experimental results for the helical-coil actuator under the two different boundary conditions are found to be within error of their counterparts in the numerical simulations. The numerical simulation and the experimental validation demonstrate similar transient and evolutionary behavior in the deformation response under the complex, inhomogeneous, multi-axial stress state and large deformations of the helical-coil actuator. This response, although substantially different in magnitude, exhibited evolutionary characteristics similar to those of the simple, uniaxial, homogeneous stress state of the isobaric tensile test results used for model calibration. There was no significant difference in the axial displacement (primary response) magnitudes observed between Cases (1) and (2) for the number of cycles investigated here. The simulated secondary responses of the two cases evolved in a similar manner when compared to the experimental validation of the respective cases.
Rotorcraft Research at the NASA Vertical Motion Simulator
NASA Technical Reports Server (NTRS)
Aponso, Bimal Lalith; Tran, Duc T.; Schroeder, Jeffrey A.
2009-01-01
In the 1970s the role of the military helicopter evolved to encompass more demanding missions, including low-level nap-of-the-earth flight and operation in severely degraded visual environments. The Vertical Motion Simulator (VMS) at the NASA Ames Research Center was built to provide a high-fidelity simulation capability for researching new rotorcraft concepts and technologies that could satisfy these mission requirements. The VMS combines a high-fidelity large-amplitude motion system with an adaptable simulation environment, including interchangeable and configurable cockpits. In almost 30 years of operation, rotorcraft research on the VMS has contributed significantly to the knowledge base on rotorcraft performance, handling qualities, flight control, and guidance and displays. These contributions have directly benefited current rotorcraft programs and flight safety. The high-fidelity motion system in the VMS was also used to research simulation fidelity. This research provided a fundamental understanding of pilot cueing modalities and their effect on simulation fidelity.
NASA Astrophysics Data System (ADS)
Monfort, Samuel S.; Sibley, Ciara M.; Coyne, Joseph T.
2016-05-01
Future unmanned vehicle operations will see more responsibilities distributed among fewer pilots. Current systems typically involve a small team of operators maintaining control over a single aerial platform, but this arrangement results in a suboptimal configuration of operator resources to system demands. Rather than devoting the full-time attention of several operators to a single UAV, the goal should be to distribute the attention of several operators across several UAVs as needed. Under a distributed-responsibility system, operator task load would be continuously monitored, with new tasks assigned based on system needs and operator capabilities. The current paper sought to identify a set of metrics that could be used to assess workload unobtrusively and in near real-time to inform a dynamic tasking algorithm. To this end, we put 20 participants through a variable-difficulty multiple UAV management simulation. We identified a subset of candidate metrics from a larger pool of pupillary and behavioral measures. We then used these metrics as features in a machine learning algorithm to predict workload condition every 60 seconds. This procedure produced an overall classification accuracy of 78%. An automated tasker sensitive to fluctuations in operator workload could be used to efficiently delegate tasks for teams of UAV operators.
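The classification step described above can be sketched as follows. The feature set, the choice of a random forest, and the synthetic placeholder data are all assumptions; with random data the accuracy is at chance, unlike the 78% reported for the real pupillary and behavioral measures.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_windows = 400
# Hypothetical per-window features: mean pupil diameter, pupil variability,
# task response time, interaction rate (aggregated over each 60 s window).
X = rng.normal(size=(n_windows, 4))
y = rng.integers(0, 2, n_windows)   # low vs. high workload condition label

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```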
Pietra, Francesco
2014-12-01
In this work, molecular dynamics (MD) simulations of the permeation of proteins by small gases of biological significance have been extended from gas carrier, sensor, and enzymatic proteins to genetically encoded tags and killer proteins. To this end, miniSOG was taken as an example of current high interest, using a biased form of MD called random-acceleration MD. Various egress gates and binding pockets for dioxygen, as an indistinguishable mimic of singlet dioxygen, were found both above and below the isoalloxazine plane of the flavin mononucleotide cofactor in miniSOG. Of such gates and binding pockets, those lying within two opposite cones, coaxial with a line normal to the isoalloxazine plane and with the vertex at the center of that plane, are the ones most visited by the escaping gas molecule. Of the residues most capable of quenching ¹O₂, Y30, lying near the base of one such cone, and H85, near the base of the opposite cone, are held to be most responsible for the reduced quantum yield of ¹O₂ with folded miniSOG relative to free flavin mononucleotide in solution.
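Random-acceleration MD itself has simple control logic, sketched below on a point particle in a stand-in binding well: apply a randomly oriented force, and redraw its direction whenever the gas molecule fails to advance a minimum distance within a time window. Magnitudes and thresholds are invented, no real force field is used, and the well is made weaker along one axis as a stand-in for the preferred egress cones described above.

```python
import numpy as np

rng = np.random.default_rng(3)
pos = np.zeros(3)                                   # gas molecule position
force_dir = rng.normal(size=3)
force_dir /= np.linalg.norm(force_dir)
accel, dt, window, r_min = 0.5, 0.01, 50, 0.2       # invented RAMD settings
confine = np.array([1.0, 1.0, 0.2])                 # well weaker along +/-z:
                                                    # a stand-in egress "gate"
for step in range(5000):
    if step % window == 0:
        start = pos.copy()                          # reference for this window
    pull = -2.0 * confine * pos * np.exp(-np.dot(pos, pos))
    pos = pos + dt * (accel * force_dir + pull + 0.1 * rng.normal(size=3))
    if (step + 1) % window == 0 and np.linalg.norm(pos - start) < r_min:
        force_dir = rng.normal(size=3)              # stuck: redraw the push
        force_dir /= np.linalg.norm(force_dir)

print("final distance from the pocket:", round(float(np.linalg.norm(pos)), 2))
```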