Sample records for simulation capability utilizing

  1. Communications, Navigation, and Surveillance Models in ACES: Design Implementation and Capabilities

    NASA Technical Reports Server (NTRS)

    Kubat, Greg; Vandrei, Don; Satapathy, Goutam; Kumar, Anil; Khanna, Manu

    2006-01-01

    Presentation objectives include: a) Overview of the ACES/CNS System Models Design and Integration; b) Configuration Capabilities available for Models and Simulations using ACES with CNS Modeling; c) Descriptions of recently added, Enhanced CNS Simulation Capabilities; and d) General Concepts Ideas that Utilize CNS Modeling to Enhance Concept Evaluations.

  2. Engineering design and integration simulation utilization manual

    NASA Technical Reports Server (NTRS)

    Hirsch, G. N.

    1976-01-01

    A description of the Engineering Design Integration (EDIN) Simulation System as it exists at Johnson Space Center is provided. A discussion of the EDIN Simulation System capabilities and applications is presented.

  3. Aircrew Training Devices: Utility and Utilization of Advanced Instructional Features (Phase IV--Summary Report).

    ERIC Educational Resources Information Center

    Polzella, Donald J.; And Others

    Modern aircrew training devices (ATDs) are equipped with sophisticated hardware and software capabilities, known as advanced instructional features (AIFs), that permit a simulator instructor to prepare briefings, manage training, vary task difficulty/fidelity, monitor performance, and provide feedback for flight simulation training missions. The…

  4. Utility fog: A universal physical substance

    NASA Technical Reports Server (NTRS)

    Hall, J. Storrs

    1993-01-01

    Active, polymorphic material ('Utility Fog') can be designed as a conglomeration of 100-micron robotic cells ('foglets'). Such robots could be built with the techniques of molecular nanotechnology. Controllers with processing capabilities of 1000 MIPS per cubic micron, and electric motors with power densities of one milliwatt per cubic micron are assumed. Utility Fog should be capable of simulating most everyday materials, dynamically changing its form and properties, and forms a substrate for an integrated virtual reality and telerobotics.

  5. Overview of ASC Capability Computing System Governance Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott W.

    This document contains a description of the Advanced Simulation and Computing Program's Capability Computing System Governance Model. Objectives of the Governance Model are to ensure that the capability system resources are allocated on a priority-driven basis according to the Program requirements; and to utilize ASC Capability Systems for the large capability jobs for which they were designed and procured.

  6. Simulation and animation of sensor-driven robots.

    PubMed

    Chen, C; Trivedi, M M; Bidlack, C R

    1994-10-01

Most simulation and animation systems utilized in robotics are concerned with simulation of the robot and its environment without simulation of sensors. These systems have difficulty in handling robots that utilize sensory feedback in their operation. In this paper, a new design of an environment for simulation, animation, and visualization of sensor-driven robots is presented. As sensor technology advances, increasing numbers of robots are equipped with various types of sophisticated sensors. The main goal of creating the visualization environment is to aid the automatic robot programming and off-line programming capabilities of sensor-driven robots. The software system will help the users visualize the motion and reaction of the sensor-driven robot under their control program. Therefore, the efficiency of the software development is increased, the reliability of the software and the operation safety of the robot are ensured, and the cost of new software development is reduced. Conventional computer-graphics-based robot simulation and animation software packages lack capabilities for robot sensing simulation. This paper describes a system designed to overcome this deficiency.

  7. WENESSA, Wide Eye-Narrow Eye Space Simulation for Situational Awareness

    NASA Astrophysics Data System (ADS)

    Albarait, O.; Payne, D. M.; LeVan, P. D.; Luu, K. K.; Spillar, E.; Freiwald, W.; Hamada, K.; Houchard, J.

In an effort to achieve timelier indications of anomalous object behaviors in geosynchronous earth orbit, a Planning Capability Concept (PCC) for a “Wide Eye-Narrow Eye” (WE-NE) telescope network has been established. The PCC addresses the problem of providing continuous and operationally robust, layered and cost-effective, Space Situational Awareness (SSA) that is focused on monitoring deep space for anomalous behaviors. It does this by first detecting the anomalies with wide field of regard systems, and then providing reliable handovers for detailed observational follow-up by another optical asset. WENESSA will explore the added value of such a system to the existing Space Surveillance Network (SSN). The study will assess and quantify the degree to which the PCC remedies, or at least reduces, these deep space knowledge deficiencies relative to current operational systems. In order to improve organic simulation capabilities, we will explore options for the federation of diverse community simulation approaches, while evaluating the efficiencies offered by a network of small and larger aperture, ground-based telescopes. Existing Space Modeling and Simulation (M&S) tools designed for evaluating WENESSA-like problems will be taken into consideration as we proceed in defining and developing the tools needed to perform this study, leading to the creation of a unified Space M&S environment for the rapid assessment of new capabilities. The primary goal of this effort is to perform a utility assessment of the WE-NE concept. The assessment will explore the mission utility of various WE-NE concepts in discovering deep space anomalies in concert with the SSN. The secondary goal is to generate an enduring modeling and simulation environment to explore the utility of future proposed concepts and supporting technologies. Ultimately, our validated simulation framework would support the inclusion of other ground- and space-based SSA assets through integrated analysis. Options will be explored using at least two competing simulation capabilities, but emphasis will be placed on reasoned analyses as supported by the simulations.

  8. Modeling and simulation of satellite subsystems for end-to-end spacecraft modeling

    NASA Astrophysics Data System (ADS)

    Schum, William K.; Doolittle, Christina M.; Boyarko, George A.

    2006-05-01

    During the past ten years, the Air Force Research Laboratory (AFRL) has been simultaneously developing high-fidelity spacecraft payload models as well as a robust distributed simulation environment for modeling spacecraft subsystems. Much of this research has occurred in the Distributed Architecture Simulation Laboratory (DASL). AFRL developers working in the DASL have effectively combined satellite power, attitude pointing, and communication link analysis subsystem models with robust satellite sensor models to create a first-order end-to-end satellite simulation capability. The merging of these two simulation areas has advanced the field of spacecraft simulation, design, and analysis, and enabled more in-depth mission and satellite utility analyses. A core capability of the DASL is the support of a variety of modeling and analysis efforts, ranging from physics and engineering-level modeling to mission and campaign-level analysis. The flexibility and agility of this simulation architecture will be used to support space mission analysis, military utility analysis, and various integrated exercises with other military and space organizations via direct integration, or through DOD standards such as Distributed Interaction Simulation. This paper discusses the results and lessons learned in modeling satellite communication link analysis, power, and attitude control subsystems for an end-to-end satellite simulation. It also discusses how these spacecraft subsystem simulations feed into and support military utility and space mission analyses.

  9. Simulation model of stratified thermal energy storage tank using finite difference method

    NASA Astrophysics Data System (ADS)

    Waluyo, Joko

    2016-06-01

Stratified TES tanks are normally used in cogeneration plants. Stratified TES tanks are simple, low cost, and equal or superior in thermal performance. The advantage of a TES tank is that it enables shifting of energy usage from off-peak to on-peak demand. To increase energy utilization in a stratified TES tank, a simulation model is required that is capable of precisely simulating the charging phenomenon in the tank. This paper aims to develop a novel model addressing this problem. The model incorporates the chiller into the charging of the stratified TES tank in a closed system. The model is one-dimensional and accounts for heat transfer. It covers the main factors affecting degradation of the temperature distribution, namely conduction through the tank wall, conduction between cool and warm water, the mixing effect at the initial flow of charging, and heat loss to the surroundings. The simulation model is developed using a finite difference method based on buffer concept theory and is solved with an explicit scheme. Validation of the simulation model is carried out using observed data obtained from an operating stratified TES tank in a cogeneration plant. The simulated temperature distribution is able to represent the S-curve pattern as well as the decrease in charging temperature after reaching the full condition. Coefficient of determination values between the observed data and the model were higher than 0.88, meaning that the model is capable of simulating the charging phenomenon in the stratified TES tank. The model not only generates the temperature distribution but can also be extended to represent transient conditions during charging. This model can be applied to address the temperature limitation that occurs when charging the stratified TES tank with an absorption chiller. Further, the stratified TES tank can be charged with the cooling energy of an absorption chiller that utilizes waste heat from the gas turbine of the cogeneration plant.
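The explicit finite-difference scheme the abstract describes can be sketched minimally as follows (a 1-D tank discretized into layers, conduction between adjacent layers, cool water injected at the bottom; all layer counts, temperatures, and material values here are illustrative assumptions, not the paper's parameters):

```python
import numpy as np

# Illustrative 1-D explicit finite-difference model of charging a
# stratified TES tank: N water layers, conduction between neighbours,
# bottom layer held at the chiller supply temperature.
N = 50               # number of layers (assumed)
H = 10.0             # tank height in metres (assumed)
dz = H / N
alpha = 1.4e-7       # thermal diffusivity of water, m^2/s
dt = 0.4 * dz**2 / alpha   # time step; r = alpha*dt/dz^2 = 0.4 < 0.5 (stable)

T = np.full(N, 12.0)       # initial warm-tank temperature, degC (assumed)
T_charge = 6.0             # chiller supply temperature, degC (assumed)

for step in range(1000):
    Tn = T.copy()
    # interior nodes: explicit conduction update
    Tn[1:-1] = T[1:-1] + alpha * dt / dz**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    Tn[0] = T_charge       # bottom boundary: charging inlet
    Tn[-1] = T[-2]         # top boundary: approximately insulated
    T = Tn
# T now holds a monotone cold-to-warm profile (the S-curve thermocline).
```

The real model in the paper additionally includes wall conduction, inlet mixing, and heat loss to the surroundings; this sketch keeps only the inter-layer conduction term to show the explicit update and its stability constraint.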

  10. Development of a Simulation Capability for the Space Station Active Rack Isolation System

    NASA Technical Reports Server (NTRS)

    Johnson, Terry L.; Tolson, Robert H.

    1998-01-01

    To realize quality microgravity science on the International Space Station, many microgravity facilities will utilize the Active Rack Isolation System (ARIS). Simulation capabilities for ARIS will be needed to predict the microgravity environment. This paper discusses the development of a simulation model for use in predicting the performance of the ARIS in attenuating disturbances with frequency content between 0.01 Hz and 10 Hz. The derivation of the model utilizes an energy-based approach. The complete simulation includes the dynamic model of the ISPR integrated with the model for the ARIS controller so that the entire closed-loop system is simulated. Preliminary performance predictions are made for the ARIS in attenuating both off-board disturbances as well as disturbances from hardware mounted onboard the microgravity facility. These predictions suggest that the ARIS does eliminate resonant behavior detrimental to microgravity experimentation. A limited comparison is made between the simulation predictions of ARIS attenuation of off-board disturbances and results from the ARIS flight test. These comparisons show promise, but further tuning of the simulation is needed.
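The kind of frequency-dependent attenuation described for ARIS can be illustrated with a single-degree-of-freedom isolator transmissibility curve (this is a generic textbook model, not the ARIS controller; the natural frequency and damping ratio are assumed values):

```python
import math

# Transmissibility |T(f)| of a single-DOF isolator: near unity well
# below the isolator natural frequency, strongly attenuating above it.
fn = 0.05      # isolator natural frequency, Hz (assumed)
zeta = 0.1     # damping ratio (assumed)

def transmissibility(f):
    """Ratio of transmitted to input vibration amplitude at frequency f."""
    r = f / fn
    num = 1 + (2 * zeta * r) ** 2
    den = (1 - r ** 2) ** 2 + (2 * zeta * r) ** 2
    return math.sqrt(num / den)

low = transmissibility(0.01)   # near 1: low-frequency disturbances pass
high = transmissibility(10.0)  # much less than 1: strong attenuation
```

This shows why the 0.01 Hz to 10 Hz band matters: an isolator only helps above its natural frequency, so performance must be characterized across the whole band.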

  11. Simulation and animation of sensor-driven robots

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, C.; Trivedi, M.M.; Bidlack, C.R.

    1994-10-01

Most simulation and animation systems utilized in robotics are concerned with simulation of the robot and its environment without simulation of sensors. These systems have difficulty in handling robots that utilize sensory feedback in their operation. In this paper, a new design of an environment for simulation, animation, and visualization of sensor-driven robots is presented. As sensor technology advances, increasing numbers of robots are equipped with various types of sophisticated sensors. The main goal of creating the visualization environment is to aid the automatic robot programming and off-line programming capabilities of sensor-driven robots. The software system will help the users visualize the motion and reaction of the sensor-driven robot under their control program. Therefore, the efficiency of the software development is increased, the reliability of the software and the operation safety of the robot are ensured, and the cost of new software development is reduced. Conventional computer-graphics-based robot simulation and animation software packages lack capabilities for robot sensing simulation. This paper describes a system designed to overcome this deficiency.

  12. Lewis Research Center studies of multiple large wind turbine generators on a utility network

    NASA Technical Reports Server (NTRS)

    Gilbert, L. J.; Triezenberg, D. M.

    1979-01-01

A NASA-Lewis program to study the anticipated performance of a wind turbine generator farm on an electric utility network is surveyed. The paper describes the approach of the Lewis Wind Energy Project Office to developing analysis capabilities in the area of wind turbine generator-utility network computer simulations. Attention is given to areas such as the Lewis-Purdue hybrid simulation, an independent stability study, the DOE multiunit plant study, and the WEST simulator. Also covered are the Lewis Mod-2 simulation, including analog simulation of a two-wind-turbine system and comparison with Boeing simulation results, and the gust response of a two-machine model. Finally, future work is noted, and it is concluded that the study shows little interaction between the generators or between the generators and the bus.

  13. Advanced radiometric and interferometric millimeter-wave scene simulations

    NASA Technical Reports Server (NTRS)

    Hauss, B. I.; Moffa, P. J.; Steele, W. G.; Agravante, H.; Davidheiser, R.; Samec, T.; Young, S. K.

    1993-01-01

    Smart munitions and weapons utilize various imaging sensors (including passive IR, active and passive millimeter-wave, and visible wavebands) to detect/identify targets at short standoff ranges and in varied terrain backgrounds. In order to design and evaluate these sensors under a variety of conditions, a high-fidelity scene simulation capability is necessary. Such a capability for passive millimeter-wave scene simulation exists at TRW. TRW's Advanced Radiometric Millimeter-Wave Scene Simulation (ARMSS) code is a rigorous, benchmarked, end-to-end passive millimeter-wave scene simulation code for interpreting millimeter-wave data, establishing scene signatures and evaluating sensor performance. In passive millimeter-wave imaging, resolution is limited due to wavelength and aperture size. Where high resolution is required, the utility of passive millimeter-wave imaging is confined to short ranges. Recent developments in interferometry have made possible high resolution applications on military platforms. Interferometry or synthetic aperture radiometry allows the creation of a high resolution image with a sparsely filled aperture. Borrowing from research work in radio astronomy, we have developed and tested at TRW scene reconstruction algorithms that allow the recovery of the scene from a relatively small number of spatial frequency components. In this paper, the TRW modeling capability is described and numerical results are presented.

  14. Recent progress in utilization planning for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Bartoe, John-David F.; Thiringer, Peter S.

    1991-01-01

The progress made in utilization planning for the redesigned Space Station Freedom (SSF) concept is described. Consideration is given to the SSF user capabilities, the strategic planning process, the strategic planning organizations, and the Consolidated Operations and Utilization Plan (COUP), which will be released in January 1993, as well as to the COUP development process and implementation. The process by which the COUP will be produced was exercised in the international Multilateral Strategic and Tactical Integration Process (MUSTIP) simulation. The paper describes the MUSTIP simulation and its activities along with MUSTIP findings and recommendations.

  15. Computational Science and Innovation

    NASA Astrophysics Data System (ADS)

    Dean, D. J.

    2011-09-01

    Simulations - utilizing computers to solve complicated science and engineering problems - are a key ingredient of modern science. The U.S. Department of Energy (DOE) is a world leader in the development of high-performance computing (HPC), the development of applied math and algorithms that utilize the full potential of HPC platforms, and the application of computing to science and engineering problems. An interesting general question is whether the DOE can strategically utilize its capability in simulations to advance innovation more broadly. In this article, I will argue that this is certainly possible.

  16. Harnessing atomistic simulations to predict the rate at which dislocations overcome obstacles

    NASA Astrophysics Data System (ADS)

    Saroukhani, S.; Nguyen, L. D.; Leung, K. W. K.; Singh, C. V.; Warner, D. H.

    2016-05-01

    Predicting the rate at which dislocations overcome obstacles is key to understanding the microscopic features that govern the plastic flow of modern alloys. In this spirit, the current manuscript examines the rate at which an edge dislocation overcomes an obstacle in aluminum. Predictions were made using different popular variants of Harmonic Transition State Theory (HTST) and compared to those of direct Molecular Dynamics (MD) simulations. The HTST predictions were found to be grossly inaccurate due to the large entropy barrier associated with the dislocation-obstacle interaction. Considering the importance of finite temperature effects, the utility of the Finite Temperature String (FTS) method was then explored. While this approach was found capable of identifying a prominent reaction tube, it was not capable of computing the free energy profile along the tube. Lastly, the utility of the Transition Interface Sampling (TIS) approach was explored, which does not need a free energy profile and is known to be less reliant on the choice of reaction coordinate. The TIS approach was found capable of accurately predicting the rate, relative to direct MD simulations. This finding was utilized to examine the temperature and load dependence of the dislocation-obstacle interaction in a simple periodic cell configuration. An attractive rate prediction approach combining TST and simple continuum models is identified, and the strain rate sensitivity of individual dislocation obstacle interactions is predicted.
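The Harmonic Transition State Theory baseline the abstract compares against reduces to an Arrhenius-type rate expression, which can be sketched in a few lines (the attempt frequency and barrier height below are illustrative values, not the aluminum dislocation-obstacle parameters from the study):

```python
import math

# Harmonic TST rate estimate: k = nu0 * exp(-dE / (kB * T)).
# Parameters are illustrative assumptions, not the paper's values.
kB = 8.617e-5        # Boltzmann constant, eV/K
nu0 = 1.0e13         # attempt frequency, 1/s (typical phonon scale)
dE = 0.5             # activation energy barrier, eV (assumed)

def htst_rate(T):
    """HTST escape rate (1/s) at absolute temperature T in kelvin."""
    return nu0 * math.exp(-dE / (kB * T))

r300 = htst_rate(300.0)   # rate at room temperature
r600 = htst_rate(600.0)   # rises steeply with temperature (Arrhenius)
```

The paper's point is that this form misses large entropic contributions to the dislocation-obstacle barrier, which is why path-sampling methods such as TIS were needed to match direct MD rates.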

  17. Aircrew Training Devices: Utility and Utilization of Advanced Instructional Features (Phase II-Air Training Command, Military Airlift Command, and Strategic Air Command [and] Phase III-Electronic Warfare Trainers).

    ERIC Educational Resources Information Center

    Polzella, Donald J.; Hubbard, David C.

    This document consists of an interim report and a final report which describe the second and third phases of a project designed to determine the utility and utilization of sophisticated hardware and software capabilities known as advanced instructional features (AIFs). Used with an aircrew training device (ATD), AIFs permit a simulator instructor…

  18. Quantum simulator review

    NASA Astrophysics Data System (ADS)

    Bednar, Earl; Drager, Steven L.

    2007-04-01

Quantum information processing's objective is to utilize revolutionary computing capability based on harnessing the paradigm shift offered by quantum computing to solve classically hard and computationally challenging problems. Some of our computationally challenging problems of interest include: the capability for rapid image processing, rapid optimization of logistics, protecting information, secure distributed simulation, and massively parallel computation. Currently, one important problem with quantum information processing is that the implementation of quantum computers is difficult to realize due to poor scalability and the pervasive presence of errors. Therefore, we have supported the development of Quantum eXpress and QuIDD Pro, two quantum computer simulators running on classical computers for the development and testing of new quantum algorithms and processes. This paper examines the different methods used by these two quantum computing simulators. It reviews both simulators, highlighting each simulator's background, interface, and special features. It also demonstrates the implementation of current quantum algorithms on each simulator. It concludes with summary comments on both simulators.
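The core idea of a classical quantum-computer simulator (the category Quantum eXpress and QuIDD Pro belong to; this sketch is not their API) is to hold the full state vector and apply gates as unitary matrices. A minimal example applies a Hadamard gate to one qubit of a two-qubit register:

```python
import numpy as np

# Minimal state-vector simulation: 2 qubits, Hadamard on qubit 0.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)

state = np.zeros(4)
state[0] = 1.0                  # start in |00>

# With basis index b = 2*b1 + b0, kron(I, H) applies H to the
# least-significant qubit b0.
U = np.kron(I, H)
state = U @ state

probs = np.abs(state) ** 2      # measurement probabilities
# |00> and |01> each occur with probability 1/2
```

Note the exponential cost that motivates structured representations like QuIDD Pro's decision diagrams: the dense state vector has 2^n entries, so this direct approach scales poorly with qubit count.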

  19. Advanced Simulation in Undergraduate Pilot Training (ASUPT) Facility Utilization Plan.

    ERIC Educational Resources Information Center

    Hagin, William V.; Smith, James F.

    The capabilities of a flight simulation research facility located at Williams AFB, Arizona are described. Research philosophy to be applied is discussed. Long range and short range objectives are identified. A time phased plan for long range research accomplishment is described. In addition, some examples of near term research efforts which will…

  20. Novel Use of a Noninvasive Hemodynamic Monitor in a Personalized, Active Learning Simulation

    ERIC Educational Resources Information Center

    Zoller, Jonathan K.; He, Jianghua; Ballew, Angela T.; Orr, Walter N.; Flynn, Brigid C.

    2017-01-01

    The present study furthered the concept of simulation-based medical education by applying a personalized active learning component. We tested this novel approach utilizing a noninvasive hemodynamic monitor with the capability to measure and display in real time numerous hemodynamic parameters in the exercising participant. Changes in medical…

  1. Using multi-disciplinary optimization and numerical simulation on the transiting exoplanet survey satellite

    NASA Astrophysics Data System (ADS)

    Stoeckel, Gerhard P.; Doyle, Keith B.

    2017-08-01

The Transiting Exoplanet Survey Satellite (TESS) is an instrument consisting of four wide field-of-view CCD cameras dedicated to the discovery of exoplanets around the brightest stars, and understanding the diversity of planets and planetary systems in our galaxy. Each camera utilizes a seven-element lens assembly with low-power and low-noise CCD electronics. Advanced multivariable optimization and numerical simulation capabilities accommodating arbitrarily complex objective functions have been added to the internally developed Lincoln Laboratory Integrated Modeling and Analysis Software (LLIMAS) and used to assess system performance. Various optical phenomena are accounted for in these analyses including full dn/dT spatial distributions in lenses and charge diffusion in the CCD electronics. These capabilities are utilized to design CCD shims for thermal vacuum chamber testing and flight, and verify comparable performance in both environments across a range of wavelengths, field points and temperature distributions. Additionally, optimizations and simulations are used for model correlation and robustness optimizations.

  2. Characterizing Wheel-Soil Interaction Loads Using Meshfree Finite Element Methods: A Sensitivity Analysis for Design Trade Studies

    NASA Technical Reports Server (NTRS)

    Contreras, Michael T.; Trease, Brian P.; Bojanowski, Cezary; Kulakx, Ronald F.

    2013-01-01

A wheel experiencing sinkage and slippage events poses a high risk to planetary rover missions as evidenced by the mobility challenges endured by the Mars Exploration Rover (MER) project. Current wheel design practice utilizes loads derived from a series of events in the life cycle of the rover which do not include (1) failure metrics related to wheel sinkage and slippage and (2) performance trade-offs based on grouser placement/orientation. Wheel designs are rigorously tested experimentally through a variety of drive scenarios and simulated soil environments; however, a robust simulation capability is still in development due to the myriad of complex interaction phenomena that contribute to wheel sinkage and slippage conditions such as soil composition, large deformation soil behavior, wheel geometry, nonlinear contact forces, terrain irregularity, etc. For the purposes of modeling wheel sinkage and slippage at an engineering scale, meshfree finite element approaches enable simulations that capture sufficient detail of wheel-soil interaction while remaining computationally feasible. This study implements the JPL wheel-soil benchmark problem in the commercial code environment utilizing the large deformation modeling capability of Smooth Particle Hydrodynamics (SPH) meshfree methods. The nominal, benchmark wheel-soil interaction model that produces numerically stable and physically realistic results is presented and simulations are shown for both wheel traverse and wheel sinkage cases. A sensitivity analysis developing the capability and framework for future flight applications is conducted to illustrate the importance of perturbations to critical material properties and parameters. Implementation of the proposed soil-wheel interaction simulation capability and associated sensitivity framework has the potential to reduce experimentation cost and improve the early-stage wheel design process.

  3. Terascale High-Fidelity Simulations of Turbulent Combustion with Detailed Chemistry: Spray Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutland, Christopher J.

    2009-04-26

The Terascale High-Fidelity Simulations of Turbulent Combustion (TSTC) project is a multi-university collaborative effort to develop a high-fidelity turbulent reacting flow simulation capability utilizing terascale, massively parallel computer technology. The main paradigm of the approach is direct numerical simulation (DNS) featuring the highest temporal and spatial accuracy, allowing quantitative observations of the fine-scale physics found in turbulent reacting flows as well as providing a useful tool for development of sub-models needed in device-level simulations. Under this component of the TSTC program the simulation code named S3D, developed and shared with coworkers at Sandia National Laboratories, has been enhanced with new numerical algorithms and physical models to provide predictive capabilities for turbulent liquid fuel spray dynamics. Major accomplishments include improved fundamental understanding of mixing and auto-ignition in multi-phase turbulent reactant mixtures and turbulent fuel injection spray jets.

  4. Challenges of NDE simulation tool validation, optimization, and utilization for composites

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter

    2016-02-01

    Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and prediction of inspectability for advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large scale, complex geometry composite components/vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described along with examples of simulation validation and optimization challenges that are more broadly applicable to all NDE simulation tools. The paper will also discuss examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and associated challenges in experimentally validating those methods.

  5. Teaching Tip: Utilizing Classroom Simulation to Convey Key Concepts in IT Portfolio Management

    ERIC Educational Resources Information Center

    Larson, Eric C.

    2013-01-01

    Managing a portfolio of IT projects is an important capability for firms and their managers. The classroom simulation described here provides students in an MBA information systems management/strategy course with the opportunity to deepen their understanding of the key concepts that should be considered in managing an IT portfolio and helps…

  6. On the relationship between parallel computation and graph embedding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gupta, A.K.

    1989-01-01

The problem of efficiently simulating an algorithm designed for an n-processor parallel machine G on an m-processor parallel machine H with n > m arises when parallel algorithms designed for an ideal size machine are simulated on existing machines which are of a fixed size. The author studies this problem when every processor of H takes over the function of a number of processors in G, and he phrases the simulation problem as a graph embedding problem. New embeddings presented address relevant issues arising from the parallel computation environment. The main focus centers around embedding complete binary trees into smaller-sized binary trees, butterflies, and hypercubes. He also considers simultaneous embeddings of r source machines into a single hypercube. Constant factors play a crucial role in his embeddings since they are not only important in practice but also lead to interesting theoretical problems. All of his embeddings minimize dilation and load, which are the conventional cost measures in graph embeddings and determine the maximum amount of time required to simulate one step of G on H. His embeddings also optimize a new cost measure called (α,β)-utilization, which characterizes how evenly the processors of H are used by the processors of G. Ideally, the utilization should be balanced (i.e., every processor of H simulates at most (n/m) processors of G), and the (α,β)-utilization measures how far off from a balanced utilization the embedding is. He presents embeddings for the situation when some processors of G have different capabilities (e.g. memory or I/O) than others and the processors with different capabilities are to be distributed uniformly among the processors of H. Placing such conditions on an embedding results in an increase in some of the cost measures.
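The two conventional cost measures mentioned above, dilation and load, are straightforward to compute for a concrete embedding. The sketch below uses a small assumed example (a 7-node complete binary tree embedded into a 2-dimensional hypercube, i.e. a 4-cycle); the mapping `phi` is illustrative, not one of the author's embeddings:

```python
import itertools

def shortest_paths(n, edges):
    """All-pairs shortest path lengths via Floyd-Warshall."""
    INF = float('inf')
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for u, v in edges:
        d[u][v] = d[v][u] = 1
    # k must be the outermost loop; product varies the last index fastest
    for k, i, j in itertools.product(range(n), repeat=3):
        d[i][j] = min(d[i][j], d[i][k] + d[k][j])
    return d

host_edges = [(0, 1), (1, 3), (3, 2), (2, 0)]                   # 2-cube
guest_edges = [(0, 1), (0, 2), (1, 3), (1, 4), (2, 5), (2, 6)]  # binary tree
phi = {0: 0, 1: 1, 2: 2, 3: 1, 4: 3, 5: 0, 6: 2}                # assumed map

dist = shortest_paths(4, host_edges)
# dilation: max host distance between images of adjacent guest nodes
dilation = max(dist[phi[u]][phi[v]] for u, v in guest_edges)
# load: max number of guest nodes mapped onto one host node
load = max(list(phi.values()).count(h) for h in set(phi.values()))
```

Together these bound the slowdown of the simulation: each step of G costs at most on the order of load × dilation steps on H, which is why the embeddings in the abstract minimize both simultaneously.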

  7. Simulation of the hyperspectral data from multispectral data using Python programming language

    NASA Astrophysics Data System (ADS)

    Tiwari, Varun; Kumar, Vinay; Pandey, Kamal; Ranade, Rigved; Agarwal, Shefali

    2016-04-01

Multispectral remote sensing (MRS) sensors have proved their potential in acquiring and retrieving information on Land Use Land Cover (LULC) features over the past few decades. These MRS sensors generally acquire data in a limited number of broad spectral bands, typically 3 to 10. The limited number of bands and broad spectral bandwidth of MRS sensors become a limitation in detailed LULC studies, as they are not capable of distinguishing spectrally similar LULC features. In contrast, the detailed information available in hyperspectral (HRS) data is spectrally overdetermined and able to distinguish spectrally similar materials on the earth's surface. At present, however, the availability of HRS sensors is limited, because the required sensitive detectors and large storage capability make acquisition and processing cumbersome and expensive. There is therefore a need to utilize the available MRS data for detailed LULC studies. The spectral reconstruction approach is one technique used for simulating hyperspectral data from available multispectral data. In the present study, the spectral reconstruction approach is utilized for the simulation of hyperspectral data using EO-1 ALI multispectral data. The technique is implemented in the Python programming language, which is open source and supports advanced image processing libraries and utilities. Overall, 70 bands were simulated and validated using visual interpretation, statistical measures, and a classification approach.
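One common form of spectral reconstruction is a linear map from the broad multispectral bands to the narrow hyperspectral bands, learned by least squares from a spectral library. The sketch below uses synthetic data throughout (the library, the block-averaging spectral response functions, and all sizes are assumptions for illustration, not the EO-1 ALI processing chain of the study):

```python
import numpy as np

# Illustrative least-squares spectral reconstruction: learn a linear map
# from 7 broad bands to 70 narrow bands using a synthetic library.
rng = np.random.default_rng(0)

n_hyper, n_multi, n_lib = 70, 7, 200
library = rng.random((n_lib, n_hyper))   # synthetic reflectance spectra

# Each broad band averages a block of 10 adjacent narrow bands
# (a crude stand-in for real spectral response functions).
srf = np.zeros((n_hyper, n_multi))
for b in range(n_multi):
    srf[b * 10:(b + 1) * 10, b] = 1.0
srf /= srf.sum(axis=0)

multi = library @ srf                    # simulated multispectral pixels
# Solve multi @ W ~= library for the (n_multi, n_hyper) reconstruction map
W, *_ = np.linalg.lstsq(multi, library, rcond=None)

# Simulate 70 narrow bands for a new multispectral observation
pixel = rng.random(n_hyper) @ srf
simulated = pixel @ W
```

The map is heavily underdetermined (7 inputs, 70 outputs), which is why real spectral reconstruction leans on the strong inter-band correlation of natural spectra rather than random data as used here.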

  8. Final report for the endowment of simulator agents with human-like episodic memory LDRD.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Speed, Ann Elizabeth; Lippitt, Carl Edward; Thomas, Edward Victor

    This report documents work undertaken to endow the cognitive framework currently under development at Sandia National Laboratories with a human-like memory for specific life episodes. Capabilities have been demonstrated within the context of three separate problem areas. The first year of the project developed a capability whereby simulated robots were able to utilize a record of shared experience to perform surveillance of a building to detect a source of smoke. The second year focused on simulations of social interactions providing a queriable record of interactions such that a time series of events could be constructed and reconstructed. The third year addressed tools to promote desktop productivity, creating a capability to query episodic logs in real time, allowing the model of a user to build on itself based on observations of the user's behavior.

  9. Research and Technology Capabilities Available for Partnership, 2007-2008

    DTIC Science & Technology

    2010-01-01

    simulated aircraft environment to measure acoustic and/ or IR radiation and signature. Instrumentation is capable of 96 pressure channels and 105...temperature channels. Mobile Aircraft Infrared Measurement System (AIMS) is field deployable and is used to take full-spectrum IR measurements at our CTF...three phase power. The facility is utilized for the development of visible, IR and RF spectrum sensors/seekers, signature measurement collection of

  10. Sustainable Human Presence on the Moon using In Situ Resources

    NASA Technical Reports Server (NTRS)

    McLemore, Carol A.; Fikes, John C.; McCarley, Kevin S.; Darby, Charles A.; Curreri, Peter A.; Kennedy, James P.; Good, James E.; Gilley, Scott D.

    2008-01-01

    New capabilities, technologies and infrastructure must be developed to enable a sustained human presence on the moon and beyond. The key to having this permanent presence is the utilization of in situ resources. To this end, NASA is investigating how in situ resources can be utilized to improve mission success by reducing up-mass, improving safety, reducing risk, and bringing down costs for the overall mission. To ensure that this capability is available when needed, technology development is required now. NASA/Marshall Space Flight Center (MSFC) is supporting this endeavor, along with other NASA centers, by exploring how lunar regolith can be mined for uses such as construction, life support, propulsion, power, and fabrication. Efforts at MSFC include development of lunar regolith simulant for hardware testing and development, extraction of oxygen and other materials from the lunar regolith, production of parts and tools on the moon from local materials or from provisioned feedstocks, and capabilities to show that produced parts are "ready for use". This paper discusses the lunar regolith, how the regolith is being replicated in the development of simulants and possible uses of the regolith.

  11. Learning Reverse Engineering and Simulation with Design Visualization

    NASA Technical Reports Server (NTRS)

    Hemsworth, Paul J.

    2018-01-01

    The Design Visualization (DV) group supports work at the Kennedy Space Center by utilizing metrology data with Computer-Aided Design (CAD) models and simulations to provide accurate visual representations that aid in decision-making. The capability to measure and simulate objects in real time helps to predict and avoid potential problems before they become expensive in addition to facilitating the planning of operations. I had the opportunity to work on existing and new models and simulations in support of DV and NASA’s Exploration Ground Systems (EGS).

  12. Current state of virtual reality simulation in robotic surgery training: a review.

    PubMed

    Bric, Justin D; Lumbard, Derek C; Frelich, Matthew J; Gould, Jon C

    2016-06-01

    Worldwide, the annual number of robotic surgical procedures continues to increase. Robotic surgical skills are unique from those used in either open or laparoscopic surgery. The acquisition of a basic robotic surgical skill set may be best accomplished in the simulation laboratory. We sought to review the current literature pertaining to the use of virtual reality (VR) simulation in the acquisition of robotic surgical skills on the da Vinci Surgical System. A PubMed search was conducted between December 2014 and January 2015 utilizing the following keywords: virtual reality, robotic surgery, da Vinci, da Vinci skills simulator, SimSurgery Educational Platform, Mimic dV-Trainer, and Robotic Surgery Simulator. Articles were included if they were published between 2007 and 2015, utilized VR simulation for the da Vinci Surgical System, and utilized a commercially available VR platform. The initial search criteria returned 227 published articles. After all inclusion and exclusion criteria were applied, a total of 47 peer-reviewed manuscripts were included in the final review. There are many benefits to utilizing VR simulation for robotic skills acquisition. Four commercially available simulators have been demonstrated to be capable of assessing robotic skill. Three of the four simulators demonstrate the ability of a VR training curriculum to improve basic robotic skills, with proficiency-based training being the most effective training style. The skills obtained on a VR training curriculum are comparable with those obtained on dry laboratory simulation. The future of VR simulation includes utilization in assessment for re-credentialing purposes, advanced procedural-based training, and as a warm-up tool prior to surgery.

  13. Improving the capability of an integrated CA-Markov model to simulate spatio-temporal urban growth trends using an Analytical Hierarchy Process and Frequency Ratio

    NASA Astrophysics Data System (ADS)

    Aburas, Maher Milad; Ho, Yuek Ming; Ramli, Mohammad Firuz; Ash'aari, Zulfa Hanan

    2017-07-01

    The creation of an accurate simulation of future urban growth is considered one of the most important challenges in urban studies that involve spatial modeling. The purpose of this study is to improve the simulation capability of an integrated CA-Markov Chain (CA-MC) model using CA-MC based on the Analytical Hierarchy Process (AHP) and CA-MC based on Frequency Ratio (FR), both applied in Seremban, Malaysia, and to compare the performance and accuracy of the traditional and hybrid models. Various physical, socio-economic, utility, and environmental criteria were used as predictors, including elevation, slope, soil texture, population density, distance to commercial area, distance to educational area, distance to residential area, distance to industrial area, distance to roads, distance to highway, distance to railway, distance to power line, distance to stream, and land cover. For calibration, three models were applied to simulate urban growth trends in 2010; the actual data of 2010 were used for model validation utilizing the Relative Operating Characteristic (ROC) and Kappa coefficient methods. Consequently, future urban growth maps for 2020 and 2030 were created. The validation findings confirm that integrating the CA-MC model with the FR model and employing the significant driving forces of urban growth in the simulation process improved the simulation capability of the CA-MC model. This study provides a novel approach for improving the CA-MC model based on FR, which will provide powerful support to planners and decision-makers in the development of future sustainable urban planning.
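The Frequency Ratio weighting used above can be sketched in a few lines. In the standard FR formulation, each class of a driving factor gets the ratio of its share of observed growth pixels to its share of all pixels (FR > 1 favours growth); the toy raster below is invented for illustration.

```python
import numpy as np

def frequency_ratio(factor_classes, growth_mask):
    """FR per class: (share of growth pixels in class) / (class share of all pixels)."""
    fr = {}
    total = factor_classes.size
    growth_total = growth_mask.sum()
    for c in np.unique(factor_classes):
        in_class = factor_classes == c
        pct_growth = (growth_mask & in_class).sum() / growth_total
        pct_class = in_class.sum() / total
        fr[int(c)] = pct_growth / pct_class
    return fr

# Toy 3x3 raster: a factor (e.g. slope class 1-3) and an urban-growth mask.
classes = np.array([[1, 1, 2], [2, 3, 3], [1, 2, 3]])
growth = np.array([[1, 1, 0], [0, 0, 0], [1, 0, 0]], dtype=bool)
fr = frequency_ratio(classes, growth)   # class 1 strongly favours growth
```

In a CA-MC workflow, these per-class ratios become the suitability weights that drive the cellular automaton's transition rules.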

  14. Man-vehicle systems research facility: Design and operating characteristics

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The Man-Vehicle Systems Research Facility (MVSRF) provides the capability of simulating aircraft (two with full crews), en route and terminal air traffic control, aircrew interactions, and advanced cockpit displays (circa 1995) representative of future generations of aircraft, all within a full mission context. The characteristics of this facility derive from research addressing critical human factors issues that pertain to: (1) information requirements for the utilization and integration of advanced electronic display systems, (2) the interaction and distribution of responsibilities between aircrews and ground controllers, and (3) the automation of aircrew functions. This research has emphasized the need for high fidelity in simulations and for the capability to conduct full mission simulations of relevant aircraft operations. This report briefly describes the MVSRF design and operating characteristics.

  15. Exploring JWST's Capability to Constrain Habitability on Simulated Terrestrial TESS Planets

    NASA Astrophysics Data System (ADS)

    Tremblay, Luke; Britt, Amber; Batalha, Natasha; Schwieterman, Edward; Arney, Giada; Domagal-Goldman, Shawn; Mandell, Avi; Planetary Systems Laboratory; Virtual Planetary Laboratory

    2017-01-01

    In the following, we have worked to develop a flexible "observability" scale of biologically relevant molecules in the atmospheres of newly discovered exoplanets for the instruments aboard NASA's next flagship mission, the James Webb Space Telescope (JWST). We sought to create such a scale in order to provide the community with a tool with which to optimize target selection for JWST observations based on detections by the upcoming Transiting Exoplanet Survey Satellite (TESS). Current literature has laid the groundwork for defining both biologically relevant molecules and the characteristics that would make a new world "habitable", but it has so far lacked a cohesive analysis of JWST's capabilities to observe these molecules in exoplanet atmospheres and thereby constrain habitability. In developing our Observability Scale, we utilized a range of hypothetical planets (over planetary radii and stellar insolation) and generated three self-consistent atmospheric models (of different molecular compositions) for each of our simulated planets. With these planets and their corresponding atmospheres, we utilized the most accurate JWST instrument simulator, created specifically to process transiting exoplanet spectra. Through careful analysis of these simulated outputs, we were able to determine the relevant parameters that affected JWST's ability to constrain each individual molecular band with statistical accuracy and therefore generate a scale based on those key parameters. As a preliminary test of our Observability Scale, we have also applied it to the list of TESS candidate stars in order to determine JWST's observational capabilities for any soon-to-be-detected planets in those systems.

  16. Advanced Energy Validated Photovoltaic Inverter Technology at NREL | Energy

    Science.gov Websites

    power hardware-in-the-loop system and megawatt-scale grid simulators. The ESIF's utility-scale power hardware-in-the-loop capability allowed Advanced Energy to loop its

  17. JIMM: the next step for mission-level models

    NASA Astrophysics Data System (ADS)

    Gump, Jamieson; Kurker, Robert G.; Nalepka, Joseph P.

    2001-09-01

    The Simulation Based Acquisition (SBA) process is one in which the planning, design, and test of a weapon system or other product is done through the more effective use of modeling and simulation, information technology, and process improvement. This process results in a product that is produced faster, cheaper, and more reliably than its predecessors. Because the SBA process requires realistic and detailed simulation conditions, it was necessary to develop a simulation tool that would provide a simulation environment acceptable for doing SBA analysis. The Joint Integrated Mission Model (JIMM) was created to help define and meet the analysis, test and evaluation, and training requirements of a Department of Defense program utilizing SBA. Through its generic nature of representing simulation entities, its data analysis capability, and its robust configuration management process, JIMM can be used to support a wide range of simulation applications as both a constructive and a virtual simulation tool. JIMM is a Mission Level Model (MLM). A MLM is capable of evaluating the effectiveness and survivability of a composite force of air and space systems executing operational objectives in a specific scenario against an integrated air and space defense system. Because MLMs are useful for assessing a system's performance in a realistic, integrated, threat environment, they are key to implementing the SBA process. JIMM is a merger of the capabilities of one legacy model, the Suppressor MLM, into another, the Simulated Warfare Environment Generator (SWEG) MLM. By creating a more capable MLM, JIMM will not only be a tool to support the SBA initiative, but could also provide the framework for the next generation of MLMs.

  18. Jdpd: an open java simulation kernel for molecular fragment dissipative particle dynamics.

    PubMed

    van den Broek, Karina; Kuhn, Hubert; Zielesny, Achim

    2018-05-21

    Jdpd is an open Java simulation kernel for Molecular Fragment Dissipative Particle Dynamics with parallelizable force calculation, efficient caching options and fast property calculations. It is characterized by an interface and factory-pattern driven design for simple code changes and may help to avoid problems of polyglot programming. Detailed input/output communication, parallelization and process control as well as internal logging capabilities for debugging purposes are supported. The new kernel may be utilized in different simulation environments ranging from flexible scripting solutions up to fully integrated "all-in-one" simulation systems.
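Jdpd's internals are not reproduced here, but the standard DPD conservative force that dissipative particle dynamics kernels compute pairwise can be sketched as follows (the repulsion parameter a and cutoff rc are illustrative values, not Jdpd defaults):

```python
import numpy as np

def conservative_force(ri, rj, a=25.0, rc=1.0):
    """Standard DPD soft-repulsive conservative force on bead i from bead j:
    F = a * (1 - r/rc) * r_hat for r < rc, zero otherwise."""
    rij = ri - rj
    r = np.linalg.norm(rij)
    if r >= rc or r == 0.0:
        return np.zeros(3)
    return a * (1.0 - r / rc) * (rij / r)

# Two beads separated by half the cutoff along x:
f = conservative_force(np.array([0.5, 0.0, 0.0]), np.zeros(3))
# Magnitude a * (1 - 0.5) = 12.5, directed along +x.
```

A full DPD step adds pairwise dissipative and random forces coupled by the fluctuation-dissipation relation; the conservative term above is the soft repulsion that makes large time steps possible.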

  19. NLS Flight Simulation Laboratory (FSL) documentation

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Flight Simulation Laboratory (FSL) Electronic Documentation System design consists of modification and utilization of the MSFC Integrated Engineering System (IES), translation of the existing FSL documentation to an electronic format, and generation of new drawings to represent the Engine Flight Simulation Laboratory design and implementation. The intent of the electronic documentation is to provide ease of access, local print/plot capabilities, as well as the ability to correct and/or modify the stored data by network users who are authorized to access this information.

  20. Trans-oceanic Remote Power Hardware-in-the-Loop: Multi-site Hardware, Integrated Controller, and Electric Network Co-simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundstrom, Blake R.; Palmintier, Bryan S.; Rowe, Daniel

    Electric system operators are increasingly concerned with the potential system-wide impacts of the large-scale integration of distributed energy resources (DERs) including voltage control, protection coordination, and equipment wear. This prompts a need for new simulation techniques that can simultaneously capture all the components of these large integrated smart grid systems. This paper describes a novel platform that combines three emerging research areas: power systems co-simulation, power hardware in the loop (PHIL) simulation, and lab-lab links. The platform is distributed, real-time capable, allows for easy internet-based connection from geographically-dispersed participants, and is software platform agnostic. We demonstrate its utility by studying real-time PHIL co-simulation of coordinated solar PV firming control of two inverters connected in multiple electric distribution network models, prototypical of U.S. and Australian systems. Here, the novel trans-pacific closed-loop system simulation was conducted in real-time using a power network simulator and physical PV/battery inverter at power at the National Renewable Energy Laboratory in Golden, CO, USA and a physical PV inverter at power at the Commonwealth Scientific and Industrial Research Organisation's Energy Centre in Newcastle, NSW, Australia. This capability enables smart grid researchers throughout the world to leverage their unique simulation capabilities for multi-site collaborations that can effectively simulate and validate emerging smart grid technology solutions.

  1. Trans-oceanic Remote Power Hardware-in-the-Loop: Multi-site Hardware, Integrated Controller, and Electric Network Co-simulation

    DOE PAGES

    Lundstrom, Blake R.; Palmintier, Bryan S.; Rowe, Daniel; ...

    2017-07-24

    Electric system operators are increasingly concerned with the potential system-wide impacts of the large-scale integration of distributed energy resources (DERs) including voltage control, protection coordination, and equipment wear. This prompts a need for new simulation techniques that can simultaneously capture all the components of these large integrated smart grid systems. This paper describes a novel platform that combines three emerging research areas: power systems co-simulation, power hardware in the loop (PHIL) simulation, and lab-lab links. The platform is distributed, real-time capable, allows for easy internet-based connection from geographically-dispersed participants, and is software platform agnostic. We demonstrate its utility by studying real-time PHIL co-simulation of coordinated solar PV firming control of two inverters connected in multiple electric distribution network models, prototypical of U.S. and Australian systems. Here, the novel trans-pacific closed-loop system simulation was conducted in real-time using a power network simulator and physical PV/battery inverter at power at the National Renewable Energy Laboratory in Golden, CO, USA and a physical PV inverter at power at the Commonwealth Scientific and Industrial Research Organisation's Energy Centre in Newcastle, NSW, Australia. This capability enables smart grid researchers throughout the world to leverage their unique simulation capabilities for multi-site collaborations that can effectively simulate and validate emerging smart grid technology solutions.

  2. Flight code validation simulator

    NASA Astrophysics Data System (ADS)

    Sims, Brent A.

    1996-05-01

    An End-To-End Simulation capability for software development and validation of missile flight software on the actual embedded computer has been developed utilizing a 486 PC, i860 DSP coprocessor, embedded flight computer and custom dual port memory interface hardware. This system allows real-time interrupt driven embedded flight software development and checkout. The flight software runs in a Sandia Digital Airborne Computer and reads and writes actual hardware sensor locations in which Inertial Measurement Unit data resides. The simulator provides six degree of freedom real-time dynamic simulation, accurate real-time discrete sensor data and acts on commands and discretes from the flight computer. This system was utilized in the development and validation of the successful premier flight of the Digital Miniature Attitude Reference System in January of 1995 at the White Sands Missile Range on a two stage attitude controlled sounding rocket.

  3. VIPER: Virtual Intelligent Planetary Exploration Rover

    NASA Technical Reports Server (NTRS)

    Edwards, Laurence; Flueckiger, Lorenzo; Nguyen, Laurent; Washington, Richard

    2001-01-01

    Simulation and visualization of rover behavior are critical capabilities for scientists and rover operators to construct, test, and validate plans for commanding a remote rover. The VIPER system links these capabilities, using a high-fidelity virtual-reality (VR) environment, a kinematically accurate simulator, and a flexible plan executive, to allow users to simulate and visualize possible execution outcomes of a plan under development. This work is part of a larger vision of a science-centered rover control environment, where a scientist may inspect and explore the environment via VR tools, specify science goals, and visualize the expected and actual behavior of the remote rover. The VIPER system is constructed from three generic systems, linked together via a minimal amount of customization into the integrated system. The complete system points out the power of combining plan execution, simulation, and visualization for envisioning rover behavior; it also demonstrates the utility of developing generic technologies, which can be combined in novel and useful ways.

  4. Optimization Model for Web Based Multimodal Interactive Simulations.

    PubMed

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2015-07-15

    This paper presents a technique for optimizing the performance of web based multimodal interactive simulations. For such applications where visual quality and the performance of simulations directly influence user experience, overloading of hardware resources may result in unsatisfactory reduction in the quality of the simulation and user satisfaction. However, optimization of simulation performance on individual hardware platforms is not practical. Hence, we present a mixed integer programming model to optimize the performance of graphical rendering and simulation performance while satisfying application specific constraints. Our approach includes three distinct phases: identification, optimization, and update. In the identification phase, the computing and rendering capabilities of the client device are evaluated using an exploratory proxy code. This data is utilized in conjunction with user specified design requirements in the optimization phase to ensure best possible computational resource allocation. The optimum solution is used for rendering (e.g. texture size, canvas resolution) and simulation parameters (e.g. simulation domain) in the update phase. Test results are presented on multiple hardware platforms with diverse computing and graphics capabilities to demonstrate the effectiveness of our approach.

  5. Optimization Model for Web Based Multimodal Interactive Simulations

    PubMed Central

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2015-01-01

    This paper presents a technique for optimizing the performance of web based multimodal interactive simulations. For such applications where visual quality and the performance of simulations directly influence user experience, overloading of hardware resources may result in unsatisfactory reduction in the quality of the simulation and user satisfaction. However, optimization of simulation performance on individual hardware platforms is not practical. Hence, we present a mixed integer programming model to optimize the performance of graphical rendering and simulation performance while satisfying application specific constraints. Our approach includes three distinct phases: identification, optimization and update. In the identification phase, the computing and rendering capabilities of the client device are evaluated using an exploratory proxy code. This data is utilized in conjunction with user specified design requirements in the optimization phase to ensure best possible computational resource allocation. The optimum solution is used for rendering (e.g. texture size, canvas resolution) and simulation parameters (e.g. simulation domain) in the update phase. Test results are presented on multiple hardware platforms with diverse computing and graphics capabilities to demonstrate the effectiveness of our approach. PMID:26085713
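The identification/optimization/update loop described in these two records can be caricatured with a tiny stand-in for the mixed integer program: the per-option frame-time costs below play the role of measurements gathered by the exploratory proxy code, and a brute-force search replaces the MIP solver (all names and numbers are invented for illustration).

```python
from itertools import product

# Hypothetical per-option costs (ms/frame) from the identification phase.
texture_opts = {256: 1.0, 512: 2.5, 1024: 6.0}   # texture size -> cost
canvas_opts = {480: 2.0, 720: 4.0, 1080: 8.0}    # canvas height -> cost

def optimize(budget_ms):
    """Brute-force stand-in for the MIP: maximise a crude 'quality' score
    (texture size + canvas height) subject to the frame-time budget."""
    best, best_quality = None, -1
    for (tex, tc), (canv, cc) in product(texture_opts.items(), canvas_opts.items()):
        if tc + cc <= budget_ms and tex + canv > best_quality:
            best, best_quality = (tex, canv), tex + canv
        # The update phase would then apply the chosen settings to the renderer.
    return best

settings = optimize(budget_ms=8.0)   # highest-quality combination within budget
```

The real model optimizes a principled objective over rendering and simulation parameters jointly; the sketch only shows the structure of measuring, solving under a constraint, and applying the result.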

  6. Study of Dynamic Characteristics of Aeroelastic Systems Utilizing Randomdec Signatures

    NASA Technical Reports Server (NTRS)

    Chang, C. S.

    1975-01-01

    The feasibility of utilizing the random decrement method in conjunction with a signature analysis procedure to determine the dynamic characteristics of an aeroelastic system for the purpose of on-line prediction of potential onset of flutter was examined. Digital computer programs were developed to simulate sampled response signals of a two-mode aeroelastic system. Simulated response data were used to test the random decrement method. A special curve-fit approach was developed for analyzing the resulting signatures. A number of numerical 'experiments' were conducted on the combined processes. The method is capable of determining frequency and damping values accurately from randomdec signatures of carefully selected lengths.
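The random decrement ("randomdec") signature averages fixed-length segments of a response record that start at each crossing of a trigger level, so that random excitation averages out and the free-decay character remains. A minimal sketch (trigger level, segment length, and the synthetic signal are illustrative, not the report's parameters):

```python
import numpy as np

def randomdec(signal, trigger, seg_len):
    """Random decrement signature: average fixed-length segments starting
    at each up-crossing of the trigger level."""
    starts = [i for i in range(1, len(signal) - seg_len)
              if signal[i - 1] < trigger <= signal[i]]
    if not starts:
        raise ValueError("no trigger crossings found")
    return np.mean([signal[s:s + seg_len] for s in starts], axis=0)

# Noisy decaying oscillation standing in for an aeroelastic response record.
rng = np.random.default_rng(0)
t = np.linspace(0, 50, 5000)
x = np.exp(-0.05 * t) * np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(t.size)
sig = randomdec(x, trigger=0.3, seg_len=500)   # decay-like signature
```

Frequency and damping would then be extracted from `sig` by the curve-fit step the abstract mentions.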

  7. A tapered dielectric waveguide solar concentrator for a compound semiconductor photovoltaic cell.

    PubMed

    Park, Minkyu; Oh, Kyunghwan; Kim, Jeong; Shin, Hyun Woo; Oh, Byung Du

    2010-01-18

    A novel tapered dielectric waveguide solar concentrator is proposed for compound semiconductor solar cells, utilizing an optical fiber preform. Its light-collecting capability is numerically simulated and experimentally demonstrated for feasibility and potential assessments. Utilizing the tapered shape of an optical fiber preform with a step-index profile, low-loss guidance was enhanced and the limitation on the acceptance angle of solar radiation was alleviated by an order of magnitude. Using a solar simulator, the device performance was experimentally investigated and discussed in terms of photocurrent improvements. A total acceptance angle exceeding +/- 6 degrees was experimentally achieved while sustaining a high solar flux.

  8. Performance of an MPI-only semiconductor device simulator on a quad socket/quad core InfiniBand platform.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shadid, John Nicolas; Lin, Paul Tinphone

    2009-01-01

    This preliminary study considers the scaling and performance of a finite element (FE) semiconductor device simulator on a capacity cluster with 272 compute nodes based on a homogeneous multicore node architecture utilizing 16 cores. The inter-node communication backbone for this Tri-Lab Linux Capacity Cluster (TLCC) machine is comprised of an InfiniBand interconnect. The nonuniform memory access (NUMA) nodes consist of 2.2 GHz quad socket/quad core AMD Opteron processors. The performance results for this study are obtained with a FE semiconductor device simulation code (Charon) that is based on a fully-coupled Newton-Krylov solver with domain decomposition and multilevel preconditioners. Scaling and multicore performance results are presented for large-scale problems of 100+ million unknowns on up to 4096 cores. A parallel scaling comparison is also presented with the Cray XT3/4 Red Storm capability platform. The results indicate that an MPI-only programming model for utilizing the multicore nodes is reasonably efficient on all 16 cores per compute node. However, the results also indicated that the multilevel preconditioner, which is critical for large-scale capability type simulations, scales better on the Red Storm machine than the TLCC machine.

  9. Computer-Aided System Engineering and Analysis (CASE/A) Programmer's Manual, Version 5.0

    NASA Technical Reports Server (NTRS)

    Knox, J. C.

    1996-01-01

    The Computer Aided System Engineering and Analysis (CASE/A) Version 5.0 Programmer's Manual provides the programmer and user with information regarding the internal structure of the CASE/A 5.0 software system. CASE/A 5.0 is a trade study tool that provides modeling/simulation capabilities for analyzing environmental control and life support systems and active thermal control systems. CASE/A has been successfully used in studies such as the evaluation of carbon dioxide removal in the space station. CASE/A modeling provides a graphical and command-driven interface for the user. This interface allows the user to construct a model by placing equipment components in a graphical layout of the system hardware, then connect the components via flow streams and define their operating parameters. Once the equipment is placed, the simulation time and other control parameters can be set to run the simulation based on the model constructed. After completion of the simulation, graphical plots or text files can be obtained for evaluation of the simulation results over time. Additionally, users have the capability to control the simulation and extract information at various times in the simulation (e.g., control equipment operating parameters over the simulation time or extract plot data) by using "User Operations (OPS) Code." This OPS code is written in FORTRAN with a canned set of utility subroutines for performing common tasks. CASE/A version 5.0 software runs under the VAX VMS(Trademark) environment. It utilizes the Tektronix 4014(Trademark) graphics display system and the VT100(Trademark) text manipulation/display system.

  10. Computer simulation: A modern day crystal ball?

    NASA Technical Reports Server (NTRS)

    Sham, Michael; Siprelle, Andrew

    1994-01-01

    It has long been the desire of managers to be able to look into the future and predict the outcome of decisions. With the advent of computer simulation and the tremendous capability provided by personal computers, that desire can now be realized. This paper presents an overview of computer simulation and modeling, and discusses the capabilities of Extend. Extend is an icon-driven, Macintosh-based software tool that brings the power of simulation to the average computer user. An example of an Extend-based model is presented in the form of the Space Transportation System (STS) Processing Model. The STS Processing Model produces eight shuttle launches per year, yet it takes only about ten minutes to run. In addition, statistical data such as facility utilization, wait times, and processing bottlenecks are produced. The addition or deletion of resources, such as orbiters or facilities, can be easily modeled and their impact analyzed. Through the use of computer simulation, it is possible to look into the future to see the impact of today's decisions.
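A toy version of such a process model can convey how utilization statistics fall out of a simulation. The facility names and durations below are invented, and real flows overlap rather than run strictly serially; this is only a sketch of the bookkeeping:

```python
# Toy serial-flow model: each launch passes through three facilities in turn;
# a facility's utilization is its busy time divided by total simulated time.
facilities = {"OPF": 30, "VAB": 10, "Pad": 20}   # processing days (hypothetical)

def simulate(n_launches):
    busy = {f: 0 for f in facilities}
    clock = 0
    for _ in range(n_launches):
        for f, days in facilities.items():
            busy[f] += days
            clock += days
    return {f: busy[f] / clock for f in facilities}

util = simulate(n_launches=8)   # e.g. the OPF is busy half the simulated time
```

A discrete-event tool like Extend additionally models queuing and parallel resources, which is where wait times and bottleneck statistics come from.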

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, David G.; Cook, Marvin A.

    This report summarizes collaborative efforts between Secure Scalable Microgrid and Korean Institute of Energy Research team members. The efforts aim to advance microgrid research and development towards the efficient utilization of networked microgrids. The collaboration resulted in the identification of experimental and real-time simulation capabilities that may be leveraged for networked microgrids research, development, and demonstration. Additional research was performed to support the demonstration of control techniques within real-time simulation and with hardware in the loop for DC microgrids.

  12. On efficiency of fire simulation realization: parallelization with greater number of computational meshes

    NASA Astrophysics Data System (ADS)

    Valasek, Lukas; Glasa, Jan

    2017-12-01

    Current fire simulation systems are capable of utilizing the advantages of available high-performance computing (HPC) platforms and of modeling fires efficiently in parallel. In this paper, the efficiency of a corridor fire simulation on an HPC cluster is discussed. The parallel MPI version of Fire Dynamics Simulator is used to test the efficiency of selected strategies for allocating the cluster's computational resources when using a greater number of computational cores. Simulation results indicate that if the number of cores used is not a multiple of the number of cores per cluster node, there are allocation strategies which provide more efficient calculations.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Mark D.; McPherson, Brian J.; Grigg, Reid B.

    Numerical simulation is an invaluable analytical tool for scientists and engineers making predictions about the fate of carbon dioxide injected into deep geologic formations for long-term storage. Current numerical simulators for assessing storage in deep saline formations have capabilities for modeling strongly coupled processes involving multifluid flow, heat transfer, chemistry, and rock mechanics in geologic media. Except for moderate pressure conditions, numerical simulators for deep saline formations only require the tracking of two immiscible phases and a limited number of phase components beyond those comprising the geochemical reactive system. The requirements for numerically simulating the utilization and storage of carbon dioxide in partially depleted petroleum reservoirs are more numerous than those for deep saline formations: the minimum number of immiscible phases increases to three, the number of phase components may easily increase fourfold, and the coupled processes of heat transfer, geochemistry, and geomechanics remain. Public and scientific confidence in numerical simulators used for carbon dioxide sequestration in deep saline formations has advanced via a natural progression of the simulators being proven against benchmark problems, code comparisons, laboratory-scale experiments, pilot-scale injections, and commercial-scale injections. This paper describes a new numerical simulator for the scientific investigation of carbon dioxide utilization and storage in partially depleted petroleum reservoirs, with an emphasis on its unique features for scientific investigations. It documents the numerical simulation of the utilization of carbon dioxide for enhanced oil recovery in the western section of the Farnsworth Unit, and represents an early stage in the progression of numerical simulators for carbon utilization and storage in depleted oil reservoirs.

  14. Training Analysis of P-3 Replacement Pilot Training.

    ERIC Educational Resources Information Center

    Browning, Robert F.; And Others

    The report covers an evaluation of current P-3 pilot training programs at the replacement squadron level. It contains detailed discussions concerning training hardware and software that have been supplied. A detailed examination is made of the curriculum and the simulation capabilities and utilization of P-3 operational flight trainers. Concurrent…

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lingerfelt, Eric J; Endeve, Eirik; Hui, Yawei

    Improvements in scientific instrumentation allow imaging at mesoscopic to atomic length scales and in many spectroscopic modes, and now--with the rise of multimodal acquisition systems and the associated processing capability--the era of multidimensional, informationally dense data sets has arrived. Technical issues in these combinatorial scientific fields are exacerbated by computational challenges, best summarized as the need for a drastic improvement in the capability to transfer, store, and analyze large volumes of data. The Bellerophon Environment for Analysis of Materials (BEAM) platform provides materials scientists the capability to directly leverage the integrated computational and analytical power of High Performance Computing (HPC) to perform scalable data analysis and simulation and to manage uploaded data files via an intuitive, cross-platform client user interface. This framework delivers authenticated, "push-button" execution of complex user workflows that deploy data analysis algorithms and computational simulations utilizing compute-and-data cloud infrastructures and HPC environments like Titan at the Oak Ridge Leadership Computing Facility (OLCF).

  16. A comparative analysis and guide to virtual reality robotic surgical simulators.

    PubMed

    Julian, Danielle; Tanaka, Alyssa; Mattingly, Patricia; Truong, Mireille; Perez, Manuela; Smith, Roger

    2018-02-01

    Since the US Food and Drug Administration approved robotically assisted surgical devices for human surgery in 2000, the number of surgeries utilizing this innovative technology has risen. In 2015, approximately 650 000 robot-assisted procedures were performed worldwide. Surgeons must be properly trained to safely transition to using such innovative technology. Multiple virtual reality robotic simulators are now commercially available for educational and training purposes. There is a need for comparative evaluations of these simulators to aid users in selecting an appropriate device for their purposes. We conducted a comparison of the design and capabilities of all dedicated simulators of the da Vinci robot: the da Vinci Skills Simulator (dVSS), dV-Trainer (dVT), Robotic Skills Simulator (RoSS) and the RobotiX Mentor. This paper provides the base specifications of the hardware and software, with an emphasis on the training capabilities of each system. Each simulator contains a large number of training exercises for skills development: dVSS n = 40, dVT n = 65, RoSS n = 52, RobotiX Mentor n = 31. All four offer 3D visual images but use different display technologies. The dVSS leverages the real robotic surgical console to provide visualization, hand controls and foot pedals. The dVT, RoSS and RobotiX Mentor created simulated versions of all of these control systems. Each includes systems management services that allow instructors to collect, export and analyze the scores of students using the simulators. This study provides comparative information on the four simulators' functional capabilities. Each device offers unique advantages and capabilities for training robotic surgeons. Each has been the subject of validation experiments, which have been published in the literature, but those do not provide specific details on the capabilities of the simulators, which are necessary for an understanding sufficient to select the one best suited for an organization's needs. This article provides comparative information to assist with that type of selection. Copyright © 2017 John Wiley & Sons, Ltd.

  17. A Monte Carlo simulation study for the gamma-ray/neutron dual-particle imager using rotational modulation collimator (RMC).

    PubMed

    Kim, Hyun Suk; Choi, Hong Yeop; Lee, Gyemin; Ye, Sung-Joon; Smith, Martin B; Kim, Geehyun

    2018-03-01

    The aim of this work is to develop a gamma-ray/neutron dual-particle imager, based on rotational modulation collimators (RMCs) and pulse shape discrimination (PSD)-capable scintillators, for possible applications for radioactivity monitoring as well as nuclear security and safeguards. A Monte Carlo simulation study was performed to design an RMC system for the dual-particle imaging, and modulation patterns were obtained for gamma-ray and neutron sources in various configurations. We applied an image reconstruction algorithm utilizing the maximum-likelihood expectation-maximization method based on the analytical modeling of source-detector configurations, to the Monte Carlo simulation results. Both gamma-ray and neutron source distributions were reconstructed and evaluated in terms of signal-to-noise ratio, showing the viability of developing an RMC-based gamma-ray/neutron dual-particle imager using PSD-capable scintillators.
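
    The maximum-likelihood expectation-maximization (MLEM) update at the core of such reconstructions is compact enough to sketch. The following is a generic, minimal MLEM iteration; the random system matrix is a hypothetical stand-in for the RMC modulation response, not the analytical source-detector model used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the imaging system (response) matrix A,
# mapping source pixels to detector measurement bins.
n_bins, n_pixels = 64, 16
A = rng.random((n_bins, n_pixels))

# Ground-truth source with two hot pixels; noiseless measurement y = A @ x.
x_true = np.zeros(n_pixels)
x_true[3] = 100.0
x_true[11] = 50.0
y = A @ x_true

# Standard MLEM iteration: x <- x * A^T(y / Ax) / A^T 1
x = np.ones(n_pixels)
sens = A.sum(axis=0)  # sensitivity image A^T 1
for _ in range(200):
    x *= (A.T @ (y / (A @ x))) / sens

print(int(x.argmax()))  # -> 3, the dominant hot pixel is recovered
```

    With noiseless data the iteration concentrates activity on the true source pixels while staying non-negative, the property that makes MLEM attractive for low-count gamma-ray and neutron imaging.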

  18. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular, expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems, as well as to develop system requirements and determine their effect on the sizing of the integrated vehicle. The development of such a framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and the possible state measurements and observations that feed into the simulation implementation structure. These models also allow a common environment for the capture of an increasingly complex operational architecture involving multiple spacecraft, ground stations, and communication networks. In order to address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions amongst spacecraft. This paper describes the development of this framework and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.

  19. Toolbox for Urban Mobility Simulation: High Resolution Population Dynamics for Global Cities

    NASA Astrophysics Data System (ADS)

    Bhaduri, B. L.; Lu, W.; Liu, C.; Thakur, G.; Karthik, R.

    2015-12-01

    In this rapidly urbanizing world, an unprecedented rate of population growth is not only mirrored by increasing demand for energy, food, water, and other natural resources, but has detrimental impacts on environmental and human security. Transportation simulations are frequently used for mobility assessment in urban planning, traffic operation, and emergency management. Previous research, ranging from purely analytical techniques to simulations capturing behavior, has investigated questions and scenarios regarding the relationships among energy, emissions, air quality, and transportation. The primary limitations of past attempts have been the availability of input data, useful "energy and behavior focused" models, validation data, and the computational capability needed to adequately understand the interdependencies of our transportation system. With the increasing availability and quality of traditional and crowdsourced data, we have utilized the OpenStreetMap roads network and integrated high-resolution population data with traffic simulation to create a Toolbox for Urban Mobility Simulations (TUMS) at global scale. TUMS consists of three major components: data processing, traffic simulation models, and Internet-based visualizations. It integrates OpenStreetMap, LandScanTM population, and other open data (Census Transportation Planning Products, National Household Travel Survey, etc.) to generate both normal traffic operation and emergency evacuation scenarios. TUMS integrates TRANSIMS and MITSIM as traffic simulation engines, which are open-source and widely accepted for scalable traffic simulations. A consistent data and simulation platform allows quick adaptation to various geographic areas, as has been demonstrated for multiple cities across the world. We are combining the strengths of geospatial data science, high-performance simulation, transportation planning, and emissions, vehicle, and energy technology development to design and develop a simulation framework that assists decision makers at all levels: local, state, regional, and federal. Using Cleveland, Tennessee as an example, this presentation illustrates how emerging cities could assess the impacts of future land use scenarios on energy and environment utilizing such a capability.

  20. Hardware simulation of fuel cell/gas turbine hybrids

    NASA Astrophysics Data System (ADS)

    Smith, Thomas Paul

    Hybrid solid oxide fuel cell/gas turbine (SOFC/GT) systems offer high efficiency power generation, but face numerous integration and operability challenges. This dissertation addresses the application of hardware-in-the-loop simulation (HILS) to explore the performance of a solid oxide fuel cell stack and gas turbine when combined into a hybrid system. Specifically, this project entailed developing and demonstrating a methodology for coupling a numerical SOFC subsystem model with a gas turbine that has been modified with supplemental process flow and control paths to mimic a hybrid system. This HILS approach was implemented with the U.S. Department of Energy Hybrid Performance Project (HyPer) located at the National Energy Technology Laboratory. By utilizing HILS, the facility provides a cost-effective and capable platform for characterizing the response of hybrid systems to dynamic variations in operating conditions. HILS of a hybrid system was accomplished by first interfacing a numerical model with operating gas turbine hardware. The real-time SOFC stack model responds to operating turbine flow conditions in order to predict the level of thermal effluent from the SOFC stack. This simulated level of heating then dynamically sets the turbine's "firing" rate to reflect the stack output heat rate. Second, a high-speed computer system with data acquisition capabilities was integrated with the existing controls and sensors of the turbine facility. In the future, this will allow for the utilization of high-fidelity fuel cell models that infer cell performance parameters while still computing the simulation in real time. Once the integration of the numeric and the hardware simulation components was completed, HILS experiments were conducted to evaluate hybrid system performance. The testing identified non-intuitive transient responses arising from the large thermal capacitance of the stack that are inherent to hybrid systems. Furthermore, the tests demonstrated the capabilities of HILS as a research tool for investigating the dynamic behavior of SOFC/GT hybrid power generation systems.

  1. The Littoral Combat Ship (LCS) Surface Warfare (SUW) Module: Determining the Surface-To-Surface Missile and Air-To-Surface Missile Mix

    DTIC Science & Technology

    2010-09-01

    agent-based modeling platform known as MANA. The simulation is exercised over a broad range of different weapon system types with their capabilities... Navy B.A., University of Florida, 2004. Submitted in partial fulfillment of the requirements for the degree of Master of Science in Modeling... ...aerial vehicle (UAV) will have. This study uses freely available data to build a simulation utilizing an agent-based modeling platform known as MANA

  2. Iterative repair for scheduling and rescheduling

    NASA Technical Reports Server (NTRS)

    Zweben, Monte; Davis, Eugene; Deale, Michael

    1991-01-01

    An iterative repair search method called constraint-based simulated annealing is described. Simulated annealing is a hill-climbing search technique capable of escaping local minima. The utility of the constraint-based framework is shown by comparing search performance with and without the constraint framework on a suite of randomly generated problems. Results are also shown of applying the technique to the NASA Space Shuttle ground processing problem. These experiments show that the search method scales to complex, real world problems and exhibits interesting anytime behavior.
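
    The iterative-repair idea, reducing constraint violations through local moves while an annealing temperature lets the search escape local minima, can be sketched on a toy slot-scheduling problem. This is a generic illustration, not the constraint-based framework or the Shuttle ground-processing model from the abstract.

```python
import math
import random
from collections import Counter

random.seed(1)

# Toy iterative-repair annealing: each of n_tasks occupies one time slot,
# and a constraint violation is a pair of tasks sharing a slot. A repair
# move reassigns one task; worse moves are accepted with probability
# e^(-delta/T), and the temperature T cools geometrically.
n_tasks, n_slots = 20, 20
slots = [random.randrange(n_slots) for _ in range(n_tasks)]

def conflicts(assignment):
    """Count pairs of tasks assigned to the same slot."""
    return sum(c * (c - 1) // 2 for c in Counter(assignment).values())

cost, T = conflicts(slots), 2.0
for _ in range(20000):
    if cost == 0:
        break                        # feasible schedule found
    i = random.randrange(n_tasks)    # pick a task to repair
    old = slots[i]
    slots[i] = random.randrange(n_slots)
    new_cost = conflicts(slots)
    if new_cost <= cost or random.random() < math.exp((cost - new_cost) / T):
        cost = new_cost              # accept the (possibly worse) repair
    else:
        slots[i] = old               # reject and restore
    T *= 0.999

print(cost)  # 0: all tasks end up in distinct slots
```

    Because worse repairs are occasionally accepted while the temperature is high, the search can back out of assignments that a purely greedy repair strategy could get stuck on.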

  3. Novel use of a noninvasive hemodynamic monitor in a personalized, active learning simulation.

    PubMed

    Zoller, Jonathan K; He, Jianghua; Ballew, Angela T; Orr, Walter N; Flynn, Brigid C

    2017-06-01

    The present study furthered the concept of simulation-based medical education by applying a personalized active learning component. We tested this novel approach utilizing a noninvasive hemodynamic monitor with the capability to measure and display in real time numerous hemodynamic parameters in the exercising participant. Changes in medical knowledge concerning physiology were examined with a pre- and posttest. Simply through observation of one's own hemodynamic variables, the understanding of complex physiological concepts was significantly enhanced. Copyright © 2017 the American Physiological Society.

  4. Image Processing

    NASA Technical Reports Server (NTRS)

    1987-01-01

    A new spinoff product was derived from Geospectra Corporation's expertise in processing LANDSAT data in a software package. Called ATOM (for Automatic Topographic Mapping), it is capable of digitally extracting elevation information from stereo photos taken by spaceborne cameras. ATOM offers a new dimension of realism in applications involving terrain simulations, producing extremely precise maps of an area's elevations at a lower cost than traditional methods. ATOM has a number of applications involving defense training simulations and offers utility in architecture, urban planning, forestry, and petroleum and mineral exploration.

  5. Integration of OpenMC methods into MAMMOTH and Serpent

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerby, Leslie; DeHart, Mark; Tumulak, Aaron

    OpenMC, a Monte Carlo particle transport simulation code focused on neutron criticality calculations, contains several methods we wish to emulate in MAMMOTH and Serpent. First, research coupling OpenMC and the Multiphysics Object-Oriented Simulation Environment (MOOSE) has shown promising results. Second, the utilization of Functional Expansion Tallies (FETs) allows for a more efficient passing of multiphysics data between OpenMC and MOOSE. Both of these capabilities have been preliminarily implemented into Serpent. Results are discussed and future work recommended.
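
    The functional-expansion-tally idea can be illustrated generically: instead of binning Monte Carlo samples into a histogram, the tally accumulates Legendre moments of the sampled coordinate, and a smooth distribution is reconstructed from a few coefficients. The sketch below is an illustration under stated assumptions (beta-distributed samples standing in for collision sites), not OpenMC's or Serpent's implementation.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)

# Stand-in "collision site" samples on [-1, 1].
samples = rng.beta(2.0, 5.0, size=100_000) * 2.0 - 1.0

order = 6
# Functional expansion tally: a_n = (2n+1)/2 * E[P_n(x)]
# (orthogonality-normalized Legendre moment estimates).
coeffs = np.array([(2 * n + 1) / 2.0 * legendre.legval(samples, [0] * n + [1]).mean()
                   for n in range(order + 1)])

x = np.linspace(-1.0, 1.0, 201)
pdf = legendre.legval(x, coeffs)  # smooth reconstructed density

# The expansion preserves low-order moments exactly, e.g. the mean:
print(abs(2.0 / 3.0 * coeffs[1] - samples.mean()) < 1e-9)  # True
```

    Passing a handful of coefficients between codes, rather than values on a fine spatial mesh, is what makes FETs an efficient carrier of multiphysics field data.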

  6. Evaluation and utilization of beam simulation codes for the SNS ion source and low energy beam transport development

    NASA Astrophysics Data System (ADS)

    Han, B. X.; Welton, R. F.; Stockli, M. P.; Luciano, N. P.; Carmichael, J. R.

    2008-02-01

    Beam simulation codes PBGUNS, SIMION, and LORENTZ-3D were evaluated by modeling the well-diagnosed SNS baseline ion source and low energy beam transport (LEBT) system. An investigation was then conducted using these codes to assist our ion source and LEBT development effort, which is directed at meeting SNS operational goals as well as the goals of the power-upgrade project. A high-efficiency H- extraction system, as well as magnetic and electrostatic LEBT configurations capable of transporting up to 100 mA, are studied using these simulation tools.

  7. Periodically poled silicon

    NASA Astrophysics Data System (ADS)

    Hon, Nick K.; Tsia, Kevin K.; Solli, Daniel R.; Jalali, Bahram

    2009-03-01

    We propose a new class of photonic devices based on periodic stress fields in silicon that enable second-order nonlinearity as well as quasi-phase matching. Periodically poled silicon (PePSi) adds the periodic poling capability to silicon photonics and allows the excellent crystal quality and advanced manufacturing capabilities of silicon to be harnessed for devices based on second-order nonlinear effects. As an example of the utility of the PePSi technology, we present simulations showing that midwave infrared radiation can be efficiently generated through difference frequency generation from near-infrared with a conversion efficiency of 50%.

  8. Control Room Training for the Hyper-X Project Utilizing Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Lux-Baumann, Jessica; Dees, Ray; Fratello, David

    2006-01-01

    The NASA Dryden Flight Research Center flew two Hyper-X research vehicles and achieved hypersonic speeds over the Pacific Ocean in March and November 2004. To train the flight and mission control room crew, the NASA Dryden simulation capability was utilized to generate telemetry and radar data, which were used in nominal and emergency mission scenarios. During these control room training sessions, personnel were able to evaluate and refine data displays, flight cards, mission parameter allowable limits, and emergency procedure checklists. Practice in the mission control room ensured that all primary and backup Hyper-X staff were familiar with the nominal mission and knew how to respond to anomalous conditions quickly and successfully. This report describes the technology in the simulation environment and the Mission Control Center, the need for and benefit of control room training, and the rationale and results of specific scenarios unique to the Hyper-X research missions.

  10. Efficient Schmidt number scaling in dissipative particle dynamics

    NASA Astrophysics Data System (ADS)

    Krafnick, Ryan C.; García, Angel E.

    2015-12-01

    Dissipative particle dynamics is a widely used mesoscale technique for the simulation of hydrodynamics (as well as immersed particles) utilizing coarse-grained molecular dynamics. While the method is capable of describing any fluid, the typical choice of the friction coefficient γ and dissipative force cutoff rc yields an unacceptably low Schmidt number Sc for the simulation of liquid water at standard temperature and pressure. There are a variety of ways to raise Sc, such as increasing γ and rc, but the relative cost of modifying each parameter (and the concomitant impact on numerical accuracy) has heretofore remained undetermined. We perform a detailed search over the parameter space, identifying the optimal strategy for the efficient and accuracy-preserving scaling of Sc, using both numerical simulations and theoretical predictions. The composite results recommend a parameter choice that leads to a speed improvement of a factor of three versus previously utilized strategies.
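
    The cost trade-off between raising the friction and raising the cutoff can be made concrete with the standard Groot-Warren kinetic-theory estimates of DPD transport coefficients; these closed forms are approximations, and the parameter values below are illustrative, not the paper's measured optimum.

```python
import math

def schmidt_estimate(gamma, rc, rho=3.0, kT=1.0):
    """Groot-Warren kinetic-theory estimate of the DPD Schmidt number.

    D  ~ 45 kT / (2 pi gamma rho rc^3)                      (self-diffusion)
    nu ~ 45 kT / (4 pi gamma rho rc^3) + 2 pi gamma rho rc^5 / 1575
    so Sc = nu / D = 1/2 + (2 pi gamma rho rc^4)^2 / (70875 kT).
    """
    return 0.5 + (2 * math.pi * gamma * rho * rc**4) ** 2 / (70875 * kT)

# Standard DPD water parameters give Sc of order one -- far below liquid
# water's Sc of several hundred -- which motivates the parameter search.
print(round(schmidt_estimate(4.5, 1.0), 3))  # -> 0.602

# Sc grows as gamma^2 but as rc^8, so enlarging the dissipative cutoff
# raises Sc far faster than stiffening the friction does.
print(round(schmidt_estimate(4.5, 1.5) / schmidt_estimate(4.5, 1.0), 2))
```

    The steep rc^8 scaling comes at a steep neighbor-list cost, which is exactly the kind of accuracy-versus-expense trade-off the parameter search above must weigh.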

  11. Studies with a reconstituted muscle glycolytic system. The anaerobic glycolytic response to simulated tetanic contraction

    PubMed Central

    Scopes, Robert K.

    1974-01-01

    By using a reconstituted glycolytic system and a highly active adenosine triphosphatase (ATPase), the metabolism during muscular tetanic contraction was simulated and observed. With an ATPase activity somewhat greater than can be maintained in muscle tissue, phosphocreatine was rapidly and completely utilized; lactate production commenced about 5 s after the ATPase was added, and after 15 s adenine nucleotides were lost through deamination to IMP. By 40 s, all metabolism had ceased because of complete loss of adenine mononucleotides. With a lower ATPase activity, glycolytic regeneration of ATP was capable of maintaining the ATP concentration at its initial value, and even by 80 s only one-half of the phosphocreatine had been utilized. No deamination occurred in this time. It is suggested that the metabolic events observed in the simulated system are basically the same as those that occur in muscle doing heavy work. PMID:4275706

  12. PowderSim: Lagrangian Discrete and Mesh-Free Continuum Simulation Code for Cohesive Soils

    NASA Technical Reports Server (NTRS)

    Johnson, Scott; Walton, Otis; Settgast, Randolph

    2013-01-01

    PowderSim is a calculation tool that combines a discrete-element method (DEM) module, including calibrated interparticle-interaction relationships, with a mesh-free, continuum, SPH (smoothed-particle hydrodynamics) based module that utilizes enhanced, calibrated, constitutive models capable of mimicking both large deformations and the flow behavior of regolith simulants and lunar regolith under conditions anticipated during in situ resource utilization (ISRU) operations. The major innovation introduced in PowderSim is to use a mesh-free method (SPH-based) with a calibrated and slightly modified critical-state soil mechanics constitutive model to extend the ability of the simulation tool to also address full-scale engineering systems in the continuum sense. The PowderSim software maintains the ability to address particle-scale problems, like size segregation, in selected regions with a traditional DEM module, which has improved contact physics and electrostatic interaction models.

  13. Atmospheric Modeling And Sensor Simulation (AMASS) study

    NASA Technical Reports Server (NTRS)

    Parker, K. G.

    1984-01-01

    The capabilities of the Atmospheric Modeling and Sensor Simulation (AMASS) system were studied with a view to enhancing them. This system is used to process atmospheric measurements for the evaluation of sensor performance, the conduct of design-concept simulation studies, and the modeling of the physical and dynamical nature of atmospheric processes. The study enumerates tasks proposed both to enhance utilization of the AMASS system and to integrate it with other existing equipment, facilitating the analysis of data for modeling and image processing. The following array processors were evaluated for anticipated effectiveness and/or improvements in throughput by attachment of the device to the P-e: (1) Floating Point Systems AP-120B; (2) Floating Point Systems 5000; (3) CSP, Inc. MAP-400; (4) Analogic AP500; (5) Numerix MARS-432; and (6) Star Technologies, Inc. ST-100.

  14. Radar range data signal enhancement tracker

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The design, fabrication, and performance characteristics of two digital data signal enhancement filters are described; the filters are capable of being inserted between the Space Shuttle navigation sensor outputs and the guidance computer. Commonality of interfaces has been stressed so that the filters may be evaluated through operation with simulated sensors or with actual prototype sensor hardware. The filters provide both a smoothed range and a range rate output, and different conceptual approaches are utilized for each filter. The first filter is based on a combination of a low-pass nonrecursive filter and a cascaded simple average smoother for range and range rate, respectively. The second filter is a tracking filter capable of following transient data of the type encountered during burn periods. A test simulator was also designed that generates typical Shuttle navigation sensor data.
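
    The two smoothing stages described for the first filter, a nonrecursive (FIR) low-pass on range followed by a simple average of the differenced range for range rate, can be sketched directly. The window lengths and the 0.1 s sample period are illustrative assumptions, not the report's design values.

```python
# Minimal sketch of the first filter concept: an FIR moving-average
# low-pass for range, with range rate from averaged first differences.

def smooth_range(raw, window=5):
    """Nonrecursive low-pass: symmetric moving average over `window` samples."""
    half = window // 2
    out = []
    for i in range(len(raw)):
        lo, hi = max(0, i - half), min(len(raw), i + half + 1)
        out.append(sum(raw[lo:hi]) / (hi - lo))
    return out

def smoothed_rate(rs, dt=0.1, window=4):
    """Range rate: first differences of smoothed range, then a simple average."""
    diffs = [(b - a) / dt for a, b in zip(rs, rs[1:])]
    out = []
    for i in range(len(diffs)):
        seg = diffs[max(0, i - window + 1):i + 1]
        out.append(sum(seg) / len(seg))
    return out

# Noise-free closing ramp at 2 m/s, sampled every 0.1 s.
raw_range = [100.0 - 2.0 * 0.1 * k for k in range(50)]
sm = smooth_range(raw_range)
rates = smoothed_rate(sm)
print(round(rates[25], 3))  # -> -2.0
```

    On a noise-free ramp the interior estimates reproduce the true range rate exactly; on noisy data the same windows trade lag for variance reduction.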

  15. Utilization of a CRT display light pen in the design of feedback control systems

    NASA Technical Reports Server (NTRS)

    Thompson, J. G.; Young, K. R.

    1972-01-01

    A hierarchical structure of the interlinked programs was developed to provide a flexible computer-aided design tool. A graphical input technique and a data structure are considered which provide the capability of entering the control system model description into the computer in block diagram form. An information storage and retrieval system was developed to keep track of the system description, and analysis and simulation results, and to provide them to the correct routines for further manipulation or display. Error analysis and diagnostic capabilities are discussed, and a technique was developed to reduce a transfer function to a set of nested integrals suitable for digital simulation. A general, automated block diagram reduction procedure was set up to prepare the system description for the analysis routines.

  16. Granular Simulation of NEO Anchoring

    NASA Technical Reports Server (NTRS)

    Mazhar, Hammad

    2011-01-01

    NASA is interested in designing a spacecraft capable of visiting a Near Earth Object (NEO), performing experiments, and then returning safely. Certain periods of this mission will require the spacecraft to remain stationary relative to the NEO. Such situations require an anchoring mechanism that is compact, easy to deploy, and, upon mission completion, easily removed. The design philosophy used in the project relies on the simulation capability of a multibody dynamics physics engine. On Earth it is difficult to create low-gravity conditions, and testing in low-gravity environments, whether artificial or in space, is costly and therefore not feasible. Through simulation, gravity can be controlled with great accuracy, making it ideally suited to analyze the problem at hand. Using Chrono::Engine [1], a simulation package capable of utilizing massively parallel GPU hardware, several validation experiments will be performed. Once there is sufficient confidence, modeling of the NEO regolith interaction will begin, after which the anchor tests will be performed and analyzed. The outcome of this task is a study analyzing several different anchor designs, along with a recommendation on which anchor is best suited to the task, with the anchors tested against a range of parameters relating to soil, environment, and anchor penetration angles and velocities on a NEO.

  17. Utilizing inventory information to calibrate a landscape simulation model

    Treesearch

    Steven R. Shifley; Frank R., III Thompson; David R. Larsen; David J. Mladenoff; Eric J. Gustafson

    2000-01-01

    LANDIS is a spatially explicit model that uses mapped landscape conditions as a starting point and projects the patterns in forest vegetation that will result from alternative harvest practices, alternative fire regimes, and wind events. LANDIS was originally developed for Lake States forests, but it is capable of handling the input, output, bookkeeping, and mapping...

  18. Laboratory Testing of Demand-Response Enabled Household Appliances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sparn, B.; Jin, X.; Earle, L.

    2013-10-01

    With the advent of Advanced Metering Infrastructure (AMI) systems capable of two-way communications between the utility's grid and the building, there has been significant effort in the Automated Home Energy Management (AHEM) industry to develop capabilities that allow residential building systems to respond to utility demand events by temporarily reducing their electricity usage. Major appliance manufacturers are following suit by developing Home Area Network (HAN)-tied appliance suites that can take signals from the home's 'smart meter,' a.k.a. AMI meter, and adjust their run cycles accordingly. There are numerous strategies that can be employed by household appliances to respond to demand-side management opportunities, and they could result in substantial reductions in electricity bills for the residents, depending on the pricing structures used by the utilities to incent these types of responses. The first step to quantifying these end effects is to test these systems and their responses in simulated demand-response (DR) conditions while monitoring energy use and overall system performance.

  20. Integrated Simulation Design Challenges to Support TPS Repair Operations

    NASA Technical Reports Server (NTRS)

    Quiocho, Leslie J.; Crues, Edwin Z.; Huynh, An; Nguyen, Hung T.; MacLean, John

    2005-01-01

    During the Orbiter Repair Maneuver (ORM) operations planned for Return to Flight (RTF), the Shuttle Remote Manipulator System (SRMS) must grapple the International Space Station (ISS), undock the Orbiter, maneuver it through a long duration trajectory, and orient it to an EVA crewman poised at the end of the Space Station Remote Manipulator System (SSRMS) to facilitate the repair of the Thermal Protection System (TPS). Once the repair has been completed and confirmed, the SRMS proceeds back through the trajectory to dock the Orbiter to the Orbiter Docking System. In order to support analysis of the complex dynamic interactions of the integrated system formed by the Orbiter, ISS, SRMS, and SSRMS during the ORM, simulation tools used for previous 'nominal' mission support required substantial enhancements. These upgrades were necessary to provide analysts with the capabilities needed to study integrated system performance. This paper discusses the simulation design challenges encountered while developing simulation capabilities to mirror the ORM operations. The paper also describes the incremental build approach that was utilized, starting with the subsystem simulation elements and integrating them into increasingly complex simulations until the resulting ORM worksite dynamics simulation had been assembled. Furthermore, the paper presents an overall integrated simulation V&V methodology based upon subsystem-level testing, integrated comparisons, and phased checkout.

  1. Chemical Sensing Systems that Utilize Soft Electronics on Thin Elastomeric Substrates with Open Cellular Designs

    PubMed Central

    Lee, Yoon Kyeung; Jang, Kyung-In; Ma, Yinji; Koh, Ahyeon; Chen, Hang; Jung, Han Na; Kim, Yerim; Kwak, Jean Won; Wang, Liang; Xue, Yeguang; Yang, Yiyuan; Tian, Wenlong; Jiang, Yu; Zhang, Yihui; Feng, Xue; Huang, Yonggang

    2017-01-01

    A collection of materials and device architectures are introduced for thin, stretchable arrays of ion sensors that mount on open cellular substrates to facilitate solution exchange for use in biointegrated electronics. The results include integration strategies and studies of fundamental characteristics in chemical sensing and mechanical response. The latter involves experimental measurements and theoretical simulations that establish important considerations in the design of low-modulus, stretchable cellular substrates, and in the realization of advanced capabilities in spatiotemporal mapping of chemical gradients. As the chemical composition of extracellular fluids contains valuable information related to biological function, the concepts introduced here have potential utility across a range of skin- and internal-organ-integrated electronics where soft mechanics, fluidic permeability, and advanced chemical sensing capabilities are key requirements. PMID:28989338

  2. The development of an autonomous rendezvous and docking simulation using rapid integration and prototyping technology

    NASA Technical Reports Server (NTRS)

    Shackelford, John H.; Saugen, John D.; Wurst, Michael J.; Adler, James

    1991-01-01

    A generic planar 3 degree of freedom simulation was developed that supports hardware in the loop simulations, guidance and control analysis, and can directly generate flight software. This simulation was developed in a small amount of time utilizing rapid prototyping techniques. The approach taken to develop this simulation tool, the benefits seen using this approach to development, and on-going efforts to improve and extend this capability are described. The simulation is composed of 3 major elements: (1) Docker dynamics model, (2) Dockee dynamics model, and (3) Docker Control System. The docker and dockee models are based on simple planar orbital dynamics equations using a spherical earth gravity model. The docker control system is based on a phase plane approach to error correction.

  3. High Fidelity, “Faster than Real-Time” Simulator for Predicting Power System Dynamic Behavior - Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flueck, Alex

    The “High Fidelity, Faster than Real-Time Simulator for Predicting Power System Dynamic Behavior” was designed and developed by Illinois Institute of Technology with critical contributions from Electrocon International, Argonne National Laboratory, Alstom Grid and McCoy Energy. Also essential to the project were our two utility partners: Commonwealth Edison and AltaLink. The project was a success due to several major breakthroughs in the area of large-scale power system dynamics simulation, including (1) a validated faster-than-real-time simulation of both stable and unstable transient dynamics in a large-scale positive sequence transmission grid model, (2) a three-phase unbalanced simulation platform for modeling new grid devices, such as independently controlled single-phase static var compensators (SVCs), (3) the world’s first high fidelity three-phase unbalanced dynamics and protection simulator based on Electrocon’s CAPE program, and (4) a first-of-its-kind implementation of a single-phase induction motor model with stall capability. The simulator results will aid power grid operators in their true time of need, when there is a significant risk of cascading outages. The simulator will accelerate performance and enhance accuracy of dynamics simulations, enabling operators to maintain reliability and steer clear of blackouts. In the long term, the simulator will form the backbone of the newly conceived hybrid real-time protection and control architecture that will coordinate local controls, wide-area measurements, wide-area controls and advanced real-time prediction capabilities.
The nation’s citizens will benefit in several ways, including (1) less down time from power outages due to the faster-than-real-time simulator’s predictive capability, (2) higher levels of reliability due to the detailed dynamics plus protection simulation capability, and (3) more resiliency due to the three-phase unbalanced simulator’s ability to model three-phase and single-phase networks and devices.

  4. Improved fault ride through capability of DFIG based wind turbines using synchronous reference frame control based dynamic voltage restorer.

    PubMed

    Rini Ann Jerin, A; Kaliannan, Palanisamy; Subramaniam, Umashankar

    2017-09-01

    Fault ride through (FRT) capability in wind turbines, to maintain grid stability during faults, has become mandatory with the increasing grid penetration of wind energy. The doubly fed induction generator based wind turbine (DFIG-WT) is the most widely utilized type of generator but is highly susceptible to voltage disturbances in the grid. Dynamic voltage restorer (DVR) based external FRT capability improvement is considered, since the DVR can provide fast voltage sag mitigation during faults and maintain nominal operating conditions for the DFIG-WT. The effectiveness of the DVR using synchronous reference frame (SRF) control is investigated for FRT capability in the DFIG-WT during both balanced and unbalanced fault conditions. The operation of the DVR is confirmed using time-domain simulation in MATLAB/Simulink with a 1.5 MW DFIG-WT.
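The SRF control mentioned above rests on the Park transformation, which maps three-phase quantities into a rotating dq frame where a balanced set appears as DC values the controller can regulate. A minimal amplitude-invariant version is sketched below; this is one common convention, and the paper's exact convention is not stated in the abstract.

```python
import math

def abc_to_dq0(va, vb, vc, theta):
    """Amplitude-invariant Park transform: project three-phase quantities
    onto a frame rotating at angle theta (rad). A balanced set aligned
    with the d-axis maps to d = amplitude, q = 0, zero = 0."""
    two_thirds = 2.0 / 3.0
    d = two_thirds * (va * math.cos(theta)
                      + vb * math.cos(theta - 2 * math.pi / 3)
                      + vc * math.cos(theta + 2 * math.pi / 3))
    q = -two_thirds * (va * math.sin(theta)
                       + vb * math.sin(theta - 2 * math.pi / 3)
                       + vc * math.sin(theta + 2 * math.pi / 3))
    zero = (va + vb + vc) / 3.0
    return d, q, zero

# Balanced 1.0 p.u. set sampled at theta = 0.2 rad maps to (1, 0, 0).
theta = 0.2
va = math.cos(theta)
vb = math.cos(theta - 2 * math.pi / 3)
vc = math.cos(theta + 2 * math.pi / 3)
d, q, z = abc_to_dq0(va, vb, vc, theta)
```

In a DVR, controllers regulate the sag-induced deviation of these d and q components, which is what makes the SRF approach fast during faults.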

  5. Design for progressive fracture in composite shell structures

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Murthy, Pappu L. N.

    1992-01-01

    The load carrying capability and structural behavior of composite shell structures and stiffened curved panels are investigated to provide accurate early design loads. An integrated computer code is utilized for the computational simulation of composite structural degradation under practical loading for realistic design. Damage initiation, growth, accumulation, and propagation to structural fracture are included in the simulation. Progressive fracture investigations providing design insight for several classes of composite shells are presented. Results demonstrate the significance of local defects, interfacial regions, and stress concentrations on the structural durability of composite shells.

  6. Rocket nozzle thermal shock tests in an arc heater facility

    NASA Technical Reports Server (NTRS)

    Painter, James H.; Williamson, Ronald A.

    1986-01-01

    A rocket motor nozzle thermal structural test technique that utilizes arc heated nitrogen to simulate a motor burn was developed. The technique was used to test four heavily instrumented full-scale Star 48 rocket motor 2D carbon/carbon segments at conditions simulating the predicted thermal-structural environment. All four nozzles survived the tests without catastrophic or other structural failures. The test technique demonstrated promise as a low cost, controllable alternative to rocket motor firing. The technique includes the capability of rapid termination in the event of failure, allowing post-test analysis.

  7. Letting thoughts take wing.

    PubMed

    Jorgensen, Chuck; Wheeler, Kevin

    2002-03-01

    Recent developments in neuroelectronics are applied to aviation and airplane flight control instruments. Electromyographic control has been applied to flight simulations using the autopilot interface in order to use gestures to give bank and pitch commands to the autopilot. In other demonstrations, direct rate control was used to perform repeated successful landings and the damage-adaptive capability of inner-loop neural and propulsion-based controls was utilized.

  8. New methods and materials for molding and casting ice formations

    NASA Technical Reports Server (NTRS)

    Reehorst, Andrew L.; Richter, G. Paul

    1987-01-01

    This study was designed to find improved materials and techniques for molding and casting natural or simulated ice shapes that could replace the wax and plaster method. By utilizing modern molding and casting materials and techniques, a new methodology was developed that provides excellent reproduction, low-temperature capability, and reasonable turnaround time. The resulting casts are accurate and tough.

  9. [Objective surgery -- advanced robotic devices and simulators used for surgical skill assessment].

    PubMed

    Suhánszki, Norbert; Haidegger, Tamás

    2014-12-01

    Robotic assistance became a leading trend in minimally invasive surgery, building on the global success of laparoscopic surgery. Manual laparoscopy requires advanced skills and capabilities acquired through a tedious learning procedure, while da Vinci type surgical systems offer intuitive control and advanced ergonomics. Nevertheless, in either case, the key issue is the ability to objectively assess the surgeons' skills and capabilities. Robotic devices offer a radically new way to collect data during surgical procedures, opening the way for new forms of skill parameterization. This may be revolutionary in MIS training, enabling new and objective surgical curricula and examination methods. The article reviews currently developed skill assessment techniques for robotic surgery and simulators, thoroughly inspecting their validation procedures and utility. In the coming years, these methods will become the mainstream of Western surgical education.

  10. Vectorization of a particle simulation method for hypersonic rarefied flow

    NASA Technical Reports Server (NTRS)

    Mcdonald, Jeffrey D.; Baganoff, Donald

    1988-01-01

    An efficient particle simulation technique for hypersonic rarefied flows is presented at an algorithmic and implementation level. The implementation is for a vector computer architecture, specifically the Cray-2. The method models an ideal diatomic Maxwell molecule with three translational and two rotational degrees of freedom. Algorithms are designed specifically for compatibility with fine grain parallelism by reducing the number of data dependencies in the computation. By insisting on this compatibility, the method is capable of performing simulation on a much larger scale than previously possible. A two-dimensional simulation of supersonic flow over a wedge is carried out for the near-continuum limit where the gas is in equilibrium and the ideal solution can be used as a check on the accuracy of the gas model employed in the method. Also, a three-dimensional, Mach 8, rarefied flow about a finite-span flat plate at a 45 degree angle of attack was simulated. It utilized over 10^7 particles carried through 400 discrete time steps in less than one hour of Cray-2 CPU time. This problem was chosen to exhibit the capability of the method in handling a large number of particles and a true three-dimensional geometry.
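The key property exploited above, an update with no loop-carried data dependencies, can be illustrated with a toy free-flight (convection) step: every particle advances independently, so the operation maps directly onto vector hardware. This NumPy sketch is schematic, not the paper's Cray-2 implementation.

```python
import numpy as np

def free_flight(pos, vel, dt):
    """Data-parallel convection step: each particle's position update
    depends only on that particle's own state, so there are no
    loop-carried dependencies and the whole array updates at once."""
    return pos + vel * dt

rng = np.random.default_rng(0)
pos = np.zeros((1000, 3))                 # 1000 particles in 3D
vel = rng.standard_normal((1000, 3))      # arbitrary velocities
new_pos = free_flight(pos, vel, 0.1)
```

Collision steps are the harder part, since naive pairwise selection introduces dependencies; the paper's contribution is restructuring those so they, too, vectorize.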

  11. High-Performance Modeling and Simulation of Anchoring in Granular Media for NEO Applications

    NASA Technical Reports Server (NTRS)

    Quadrelli, Marco B.; Jain, Abhinandan; Negrut, Dan; Mazhar, Hammad

    2012-01-01

    NASA is interested in designing a spacecraft capable of visiting a near-Earth object (NEO), performing experiments, and then returning safely. Certain periods of this mission would require the spacecraft to remain stationary relative to the NEO, in an environment characterized by very low gravity levels; such situations require an anchoring mechanism that is compact, easy to deploy, and upon mission completion, easy to remove. The design philosophy used in this task relies on the simulation capability of a high-performance multibody dynamics physics engine. On Earth, it is difficult to create low-gravity conditions, and testing in low-gravity environments, whether artificial or in space, can be costly and very difficult to achieve. Through simulation, the effect of gravity can be controlled with great accuracy, making it ideally suited to analyze the problem at hand. Using Chrono::Engine, a simulation package capable of utilizing massively parallel Graphic Processing Unit (GPU) hardware, several validation experiments were performed. Modeling of the regolith interaction has been carried out, after which the anchor penetration tests were performed and analyzed. The regolith was modeled by a granular medium composed of very large numbers of convex three-dimensional rigid bodies, subject to microgravity levels and interacting with each other with contact, friction, and cohesional forces. The multibody dynamics simulation approach used for simulating anchors penetrating a soil uses a differential variational inequality (DVI) methodology to solve the contact problem posed as a linear complementarity problem (LCP). Implemented within a GPU processing environment, collision detection is greatly accelerated compared to traditional CPU (central processing unit)-based collision detection. Hence, systems of millions of particles interacting with complex dynamic systems can be efficiently analyzed, and design recommendations can be made in a much shorter time. 
The figure shows an example of this capability where the Brazil Nut problem is simulated: as the container full of granular material is vibrated, the large ball slowly moves upwards. This capability was expanded to account for anchors of different shapes and penetration velocities, interacting with granular soils.
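The complementarity formulation mentioned above can be made concrete with projected Gauss-Seidel, a standard iterative solver for LCPs of the form w = Mz + q, w >= 0, z >= 0, z'w = 0 (here z plays the role of contact impulses). The two-contact data below are illustrative, not from the Chrono::Engine runs.

```python
def pgs_lcp(M, q, iters=200):
    """Projected Gauss-Seidel for the LCP: find z >= 0 with w = M z + q >= 0
    and z . w = 0. Each sweep updates one unknown from the current residual,
    then clamps it at zero (the projection), mirroring how per-contact
    impulses are iterated in DVI-based granular dynamics solvers."""
    n = len(q)
    z = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            # residual of row i excluding the diagonal contribution of z[i]
            r = q[i] + sum(M[i][j] * z[j] for j in range(n)) - M[i][i] * z[i]
            z[i] = max(0.0, -r / M[i][i])
    return z

# Two-contact example with a symmetric positive definite M (assumed data).
M = [[2.0, 1.0], [1.0, 2.0]]
q = [-1.0, 1.0]
z = pgs_lcp(M, q)
```

For this data the solution is z = [0.5, 0]: the first contact carries an impulse while the second separates (w2 > 0, z2 = 0), which is exactly the complementarity condition.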

  12. Geology team

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Evaluation of the combined utility of narrowband and multispectral imaging in both the infrared and visible for the lithologic identification of geologic materials, and of the combined utility of multispectral imaging in the visible and infrared for lithologic mapping on a global basis, are near-term recommendations for future imaging capabilities. Long-term recommendations include laboratory research into methods of field sampling and theoretical models of microscale mixing. The utility of improved spatial and spectral resolutions and radiometric sensitivity is also suggested for the long term. Geobotanical remote sensing research should be conducted to (1) separate geological and botanical spectral signatures in individual picture elements; (2) study geobotanical correlations that more fully simulate natural conditions; and (3) use test sites designed to test specific geobotanical hypotheses.

  13. The Trick Simulation Toolkit: A NASA/Opensource Framework for Running Time Based Physics Models

    NASA Technical Reports Server (NTRS)

    Penn, John M.

    2016-01-01

    The Trick Simulation Toolkit is a simulation development environment used to create high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. Its purpose is to generate a simulation executable from a collection of user-supplied models and a simulation definition file. For each Trick-based simulation, Trick automatically provides job scheduling, numerical integration, the ability to write and restore human readable checkpoints, data recording, interactive variable manipulation, a run-time interpreter, and many other commonly needed capabilities. This allows simulation developers to concentrate on their domain expertise and the algorithms and equations of their models. Also included in Trick are tools for plotting recorded data and various other supporting utilities and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX computer operating systems. This paper describes Trick's design and use at NASA Johnson Space Center.

  14. Advanced Ground Systems Maintenance Physics Models For Diagnostics Project

    NASA Technical Reports Server (NTRS)

    Perotti, Jose M.

    2015-01-01

    The project will use high-fidelity physics models and simulations to simulate real-time operations of cryogenic and other fluid systems and calculate the status/health of the systems. The project enables the delivery of system health advisories to ground system operators. The capability will also be used to conduct planning and analysis of cryogenic system operations. This project will develop and implement high-fidelity physics-based modeling techniques to simulate the real-time operation of cryogenics and other fluids systems and, when compared to the real-time operation of the actual systems, provide assessment of their state. Physics-model-calculated measurements (called “pseudo-sensors”) will be compared to the system real-time data. Comparison results will be utilized to provide systems operators with enhanced monitoring of systems' health and status, identify off-nominal trends, and diagnose system/component failures. This capability can also be used to conduct planning and analysis of cryogenics and other fluid systems designs. This capability will be interfaced with the ground operations command and control system as part of the Advanced Ground Systems Maintenance (AGSM) project to help assure system availability and mission success. The initial capability will be developed for the Liquid Oxygen (LO2) ground loading systems.

  15. gadfly: A pandas-based Framework for Analyzing GADGET Simulation Data

    NASA Astrophysics Data System (ADS)

    Hummel, Jacob A.

    2016-11-01

    We present the first public release (v0.1) of the open-source GADGET Dataframe Library: gadfly. The aim of this package is to leverage the capabilities of the broader Python scientific computing ecosystem by providing tools for analyzing simulation data from the astrophysical simulation codes GADGET and GIZMO using pandas, a thoroughly documented, open-source library providing high-performance, easy-to-use data structures that is quickly becoming the standard for data analysis in Python. Gadfly is a framework for analyzing particle-based simulation data stored in the HDF5 format using pandas DataFrames. The package enables efficient memory management, includes utilities for unit handling, coordinate transformations, and parallel batch processing, and provides highly optimized routines for visualizing smoothed-particle hydrodynamics data sets.
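The pandas data model gadfly builds on can be illustrated with a toy particle snapshot. The column names and derived quantity here are illustrative, not gadfly's actual API.

```python
import pandas as pd

# Toy snapshot: particle masses and coordinates held in a DataFrame,
# the tabular model gadfly layers over GADGET/GIZMO HDF5 output.
# (Column names here are illustrative, not gadfly's schema.)
df = pd.DataFrame({
    "mass": [1.0, 2.0, 1.0],
    "x":    [0.0, 3.0, 6.0],
    "y":    [0.0, 0.0, 0.0],
})

# Vectorized derived quantity: mass-weighted center of mass along x.
com_x = (df["mass"] * df["x"]).sum() / df["mass"].sum()
```

The appeal of the DataFrame approach is exactly this: derived quantities become one-line vectorized expressions rather than hand-written loops over raw HDF5 arrays.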

  16. Integrated Medical Model (IMM) 4.0 Enhanced Functionalities

    NASA Technical Reports Server (NTRS)

    Young, M.; Keenan, A. B.; Saile, L.; Boley, L. A.; Walton, M. E.; Shah, R. V.; Kerstman, E. L.; Myers, J. G.

    2015-01-01

    The Integrated Medical Model is a probabilistic simulation model that uses input data on 100 medical conditions to simulate expected medical events, the resources required to treat them, and the resulting impact to the mission for specific crew and mission characteristics. The newest development version of IMM, IMM v4.0, adds capabilities that remove some of the conservative assumptions that underlie the current operational version, IMM v3. While IMM v3 provides the framework to simulate whether a medical event occurred, IMM v4.0 also simulates when the event occurred during a mission timeline. This allows for more accurate estimation of mission time lost and resource utilization. In addition to the mission timeline, IMM v4.0 features two enhancements that address IMM v3 assumptions regarding medical event treatment. Medical events in IMM v3 are assigned the untreated outcome if any resource required to treat the event was unavailable. IMM v4.0 allows for partially treated outcomes that are proportional to the amount of required resources available, thus removing the dichotomous treatment assumption. An additional capability of IMM v4.0 is to use an alternative medical resource when the primary resource assigned to the condition is depleted, more accurately reflecting the real-world system. The additional capabilities defining IMM v4.0 (the mission timeline, partial treatment, and alternate resources) result in more realistic predicted mission outcomes. The primary model outcomes of IMM v4.0 for the ISS6 mission, including mission time lost, probability of evacuation, and probability of loss of crew life, are compared to those produced by the current operational version of IMM to showcase enhanced prediction capabilities.
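The two v4.0 ideas above, sampling *when* events occur on the timeline and scoring partially treated outcomes by resource availability, can be sketched with a toy Monte Carlo. Every rate and resource count below is an illustrative placeholder, not an IMM input.

```python
import random

def simulate_mission(p_event_per_day, mission_days, resources, seed=1):
    """Toy Monte Carlo in the spirit of IMM v4.0: draw when each event
    occurs on the mission timeline, then score treatment as the fraction
    of required resources still on hand (partial treatment). Rates and
    resource counts are illustrative, not IMM data."""
    rng = random.Random(seed)
    events = []
    for day in range(mission_days):
        if rng.random() < p_event_per_day:      # event occurs on this day
            required = 4                        # units needed to fully treat
            available = min(required, resources)
            fraction_treated = available / required
            resources -= available              # consume what was used
            events.append((day, fraction_treated))
    return events

events = simulate_mission(0.05, 180, resources=6)
```

Because the supply depletes as the timeline advances, later events receive smaller treated fractions, which is the dichotomy-removing behavior v4.0 introduces over the all-or-nothing v3 assumption.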

  17. A dynamically reconfigurable multi-functional PLL for SRAM-based FPGA in 65nm CMOS technology

    NASA Astrophysics Data System (ADS)

    Yang, Mingqian; Chen, Lei; Li, Xuewu; Zhang, Yanlong

    2018-04-01

    Phase-locked loops (PLLs) have been widely utilized in FPGAs as an important module for clock management. A PLL with dynamic reconfiguration capability is always welcome in FPGA design, as it is able to decrease power consumption and simultaneously improve flexibility. In this paper, a multi-functional PLL with dynamic reconfiguration capability for a 65 nm SRAM-based FPGA is proposed. Firstly, a configurable charge pump and loop filter are utilized to optimize the loop bandwidth. Secondly, the PLL incorporates a VCO with dual control voltages to accelerate the adjustment of the oscillation frequency. Thirdly, three configurable dividers are presented for flexible frequency synthesis. Lastly, a configuration block with dynamic reconfiguration function is proposed. Simulation results demonstrate that the proposed multi-functional PLL can output clocks with configurable division ratio, phase shift and duty cycle. The PLL can also be dynamically reconfigured without affecting the operation of other parts or halting the FPGA device.

  18. Extremum Seeking Control of Smart Inverters for VAR Compensation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnold, Daniel; Negrete-Pincetic, Matias; Stewart, Emma

    2015-09-04

    Reactive power compensation is used by utilities to ensure customer voltages are within pre-defined tolerances and reduce system resistive losses. While much attention has been paid to model-based control algorithms for reactive power support and Volt Var Optimization (VVO), these strategies typically require relatively large communications capabilities and accurate models. In this work, a non-model-based control strategy for smart inverters is considered for VAR compensation. An Extremum Seeking control algorithm is applied to modulate the reactive power output of inverters based on real power information from the feeder substation, without an explicit feeder model. Simulation results using utility demand information confirm the ability of the control algorithm to inject VARs to minimize feeder head real power consumption. In addition, we show that the algorithm is capable of improving feeder voltage profiles and reducing reactive power supplied by the distribution substation.
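The extremum seeking principle itself is simple enough to sketch: dither the VAR setpoint sinusoidally, demodulate the measured objective against the dither, and integrate the result toward the minimizer, all without a feeder model. The quadratic feeder response and all gains below are illustrative stand-ins, not the paper's utility data.

```python
import math

def extremum_seek(objective, q0=0.0, steps=20000, dt=0.01,
                  a=0.1, omega=5.0, k=2.0, wh=1.0):
    """Sinusoidal-perturbation extremum seeking (gradient-free).
    A first-order low-pass tracks the DC level of the objective so the
    demodulated product approximates the local gradient."""
    q_hat = q0
    j_lp = objective(q0)                           # low-pass (DC) estimate
    for n in range(steps):
        t = n * dt
        q = q_hat + a * math.sin(omega * t)        # dithered VAR setpoint
        j = objective(q)                           # measured objective
        j_lp += wh * (j - j_lp) * dt               # track slow component
        grad_est = (j - j_lp) * math.sin(omega * t)  # demodulation
        q_hat -= k * grad_est * dt                 # descend the estimate
    return q_hat

# Stand-in feeder response: losses minimized at q = 1.5 (assumed model).
q_star = extremum_seek(lambda q: (q - 1.5) ** 2 + 10.0)
```

The loop settles near the minimizer of the measured response even though the controller never sees the model, which is precisely the appeal for feeders where accurate models are unavailable.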

  19. Secure relay selection based on learning with negative externality in wireless networks

    NASA Astrophysics Data System (ADS)

    Zhao, Caidan; Xiao, Liang; Kang, Shan; Chen, Guiquan; Li, Yunzhou; Huang, Lianfen

    2013-12-01

    In this paper, we formulate relay selection into a Chinese restaurant game. A secure relay selection strategy is proposed for a wireless network, where multiple source nodes send messages to their destination nodes via several relay nodes, which have different processing and transmission capabilities as well as security properties. The relay selection utilizes a learning-based algorithm for the source nodes to reach their best responses in the Chinese restaurant game. In particular, the relay selection takes into account the negative externality of relay sharing among the source nodes, which learn the capabilities and security properties of relay nodes according to the current signals and the signal history. Simulation results show that this strategy improves the user utility and the overall security performance in wireless networks. In addition, the relay strategy is robust against the signal errors and deviations of some users from the desired actions.
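The negative externality at the heart of the game, each added user diluting a relay's value, can be seen in a toy best-response dynamic. The relay capacities and user counts are illustrative, and the simple best-response iteration below stands in for (and is not) the paper's learning algorithm.

```python
def best_response_selection(n_sources, relay_capacity, rounds=20):
    """Toy best-response dynamics for relay sharing: each source picks the
    relay offering the highest per-user rate, where sharing divides a
    relay's capacity among its users (the negative externality).
    Capacities are illustrative placeholders."""
    choice = [0] * n_sources                 # everyone starts on relay 0
    for _ in range(rounds):
        for s in range(n_sources):
            def rate(r):
                # users already on relay r (excluding s), plus s itself
                load = sum(1 for i, c in enumerate(choice)
                           if c == r and i != s) + 1
                return relay_capacity[r] / load
            choice[s] = max(range(len(relay_capacity)), key=rate)
    return choice

# Four sources, two relays with capacities 10 and 6 (assumed values).
choice = best_response_selection(4, [10.0, 6.0])
```

The dynamic settles with three sources sharing the large relay (10/3 each) and one alone on the small relay (6): no user can improve by deviating, the equilibrium notion the Chinese restaurant game formalizes.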

  20. Computer simulation of thermal and fluid systems for MIUS integration and subsystems test /MIST/ laboratory. [Modular Integrated Utility System

    NASA Technical Reports Server (NTRS)

    Rochelle, W. C.; Liu, D. K.; Nunnery, W. J., Jr.; Brandli, A. E.

    1975-01-01

    This paper describes the application of the SINDA (systems improved numerical differencing analyzer) computer program to simulate the operation of the NASA/JSC MIUS integration and subsystems test (MIST) laboratory. The MIST laboratory is designed to test the integration capability of the following subsystems of a modular integrated utility system (MIUS): (1) electric power generation, (2) space heating and cooling, (3) solid waste disposal, (4) potable water supply, and (5) waste water treatment. The SINDA/MIST computer model is designed to simulate the response of these subsystems to externally impressed loads. The computer model determines the amount of recovered waste heat from the prime mover exhaust, water jacket and oil/aftercooler and from the incinerator. This recovered waste heat is used in the model to heat potable water, for space heating, absorption air conditioning, waste water sterilization, and to provide for thermal storage. The details of the thermal and fluid simulation of MIST including the system configuration, modes of operation modeled, SINDA model characteristics and the results of several analyses are described.

  1. Total Transfer Capability Assessment Incorporating Corrective Controls for Transient Stability using TSCOPF

    NASA Astrophysics Data System (ADS)

    Hakim, Lukmanul; Kubokawa, Junji; Yorino, Naoto; Zoka, Yoshifumi; Sasaki, Yutaka

    Advancements have been made towards inclusion of both static and dynamic security into transfer capability calculation. However, to the authors' knowledge, work on considering corrective controls in the calculation has not been reported yet. Therefore, we propose a Total Transfer Capability (TTC) assessment considering transient stability corrective controls. The method is based on the Newton interior point method for nonlinear programming; transfer capability is approached as a maximization of power transfer, with both static and transient stability constraints incorporated into our Transient Stability Constrained Optimal Power Flow (TSCOPF) formulation. An interconnected power system is simulated subject to a severe unbalanced 3-phase 4-line to ground fault, and following the fault, generator and load are shed in a pre-defined sequence to mimic actual corrective controls. In a deregulated electricity market, both generator companies and large load customers are encouraged to actively participate in maintaining power system stability as corrective controls, upon agreement of compensation for being shed following a disturbance. Implementation of this proposal in actual power system operation should be carried out by combining it with the existing transient stabilization controller system. Utilization of these corrective controls results in increased TTC, as suggested by our numerical simulation. Because Lagrange multipliers also describe the sensitivity of both inequality and equality constraints to the objective function, selection of which generator or load to shed can be carried out on the basis of the Lagrange multipliers of the respective generator's rotor angle stability and active power balance equations. Hence, the proposal in this paper can be utilized by system operators to assess the maximum TTC for specific loads and network conditions.

  2. Advanced Grid Simulator for Multi-Megawatt Power Converter Testing and Certification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koralewicz, Przemyslaw; Gevorgian, Vahan; Wallen, Robb

    2017-02-16

    Grid integration testing of inverter-coupled renewable energy technologies is an essential step in the qualification of renewable energy and energy storage systems to ensure the stability of the power system. New types of devices must be thoroughly tested and validated for compliance with relevant grid codes and interconnection requirements. For this purpose, highly specialized custom-made testing equipment is needed to emulate various types of realistic grid conditions that are required by certification bodies or for research purposes. For testing multi-megawatt converters, a high power grid simulator capable of creating controlled grid conditions and meeting both power quality and dynamic characteristics is needed. This paper describes the new grid simulator concept based on ABB's medium voltage ACS6000 drive technology that utilizes advanced modulation and control techniques to create a unique testing platform for various multi-megawatt power converter systems. Its performance is demonstrated utilizing the test results obtained during commissioning activities at the National Renewable Energy Laboratory in Colorado, USA.

  3. Interplanetary Transit Simulations Using the International Space Station

    NASA Technical Reports Server (NTRS)

    Charles, J. B.; Arya, Maneesh

    2010-01-01

    It has been suggested that the International Space Station (ISS) be utilized to simulate the transit portion of long-duration missions to Mars and near-Earth asteroids (NEA). The ISS offers a unique environment for such simulations, providing researchers with a high-fidelity platform to study, enhance, and validate technologies and countermeasures for these long-duration missions. From a space life sciences perspective, two major categories of human research activities have been identified that will harness the various capabilities of the ISS during the proposed simulations. The first category includes studies that require the use of the ISS, typically because of the need for prolonged weightlessness. The ISS is currently the only available platform capable of providing researchers with access to a weightless environment over an extended duration. In addition, the ISS offers high fidelity for other fundamental space environmental factors, such as isolation, distance, and accessibility. The second category includes studies that do not require use of the ISS in the strictest sense, but can exploit its use to maximize their scientific return more efficiently and productively than in ground-based simulations. In addition to conducting Mars and NEA simulations on the ISS, increasing the current increment duration on the ISS from 6 months to a longer duration will provide opportunities for enhanced and focused research relevant to long-duration Mars and NEA missions. Although it is currently believed that increasing the ISS crew increment duration to 9 or even 12 months will pose little additional risk to crewmembers, additional medical monitoring capabilities may be required beyond those currently used for the ISS operations. The use of the ISS to simulate aspects of Mars and NEA missions seems practical, and it is recommended that planning begin soon, in close consultation with all international partners.

  4. A Deep Space Orbit Determination Software: Overview and Event Prediction Capability

    NASA Astrophysics Data System (ADS)

    Kim, Youngkwang; Park, Sang-Young; Lee, Eunji; Kim, Minsik

    2017-06-01

    This paper presents an overview of deep space orbit determination software (DSODS), as well as validation and verification results on its event prediction capabilities. DSODS was developed in the MATLAB object-oriented programming environment to support the Korea Pathfinder Lunar Orbiter (KPLO) mission. DSODS has three major capabilities: celestial event prediction for spacecraft, orbit determination with deep space network (DSN) tracking data, and DSN tracking data simulation. To achieve its functionality requirements, DSODS consists of four modules: orbit propagation (OP), event prediction (EP), data simulation (DS), and orbit determination (OD) modules. This paper explains the highest-level data flows between modules in event prediction, orbit determination, and tracking data simulation processes. Furthermore, to address the event prediction capability of DSODS, this paper introduces the OP and EP modules. The role of the OP module is to handle time and coordinate system conversions, to propagate spacecraft trajectories, and to handle the ephemerides of spacecraft and celestial bodies. Currently, the OP module utilizes the General Mission Analysis Tool (GMAT) as a third-party software component for high-fidelity deep space propagation, as well as time and coordinate system conversions. The role of the EP module is to predict celestial events, including eclipses and ground station visibilities, and this paper presents the functionality requirements of the EP module. The validation and verification results show that, for most cases, event prediction errors were less than 10 milliseconds when compared with flight-proven mission analysis tools such as GMAT and Systems Tool Kit (STK). Thus, we conclude that DSODS is capable of predicting events for the KPLO in real mission applications.
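Of the EP module's functions, the simplest to illustrate is an eclipse test. The sketch below uses a cylindrical Earth shadow; production tools such as GMAT and STK model umbra/penumbra cones, and the spherical-Earth radius and units here are assumptions for illustration only.

```python
def in_shadow_cylinder(r_sc, sun_dir):
    """Cylindrical-shadow eclipse test: the spacecraft is eclipsed when it
    lies on the anti-Sun side of Earth and within one Earth radius of the
    Sun-Earth axis. r_sc is the Earth-centered position (km); sun_dir is
    a unit vector from Earth toward the Sun."""
    R_EARTH = 6378.137                      # km; spherical Earth assumed
    along = sum(a * b for a, b in zip(r_sc, sun_dir))  # sunward projection
    if along >= 0.0:                        # sunward hemisphere: sunlit
        return False
    perp_sq = sum(a * a for a in r_sc) - along * along
    return perp_sq < R_EARTH ** 2

# Craft 7000 km directly behind Earth (anti-Sun) vs. off to the side.
eclipsed = in_shadow_cylinder([-7000.0, 0.0, 0.0], (1.0, 0.0, 0.0))
sunlit = in_shadow_cylinder([0.0, 7000.0, 0.0], (1.0, 0.0, 0.0))
```

An event predictor then scans this test along the propagated trajectory and refines entry/exit times by root-finding on the shadow-boundary crossing.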

  5. Adding the Human Element to Ship Manoeuvring Simulations

    NASA Astrophysics Data System (ADS)

    Aarsæther, Karl Gunnar; Moan, Torgeir

Time-domain simulation of ship manoeuvring has been utilized in risk analysis to assess the effect of changes to the ship-lane, development in traffic volume, and the associated risk. The process of ship manoeuvring in a wider socio-technical context consists of the technical systems, operational procedures, the human operators, and support functions. Automated manoeuvring simulations without human operators in the simulation loop have often been preferred in simulation studies due to the low time required for simulations. Automatic control has represented the human element, with little effort devoted to explaining the relationship between the guidance and control algorithms and the human operator they replace. This paper describes the development and application of a model of the human element for autonomous time-domain manoeuvring simulations. The method is applicable in the time domain, modular, and found to be capable of reproducing observed manoeuvre patterns, though it is limited to representing intended behaviour.

  6. An Analysis of the Space Transportation System Launch Rate Capability Utilizing Q-GERT Simulation Techniques.

    DTIC Science & Technology

    1982-12-01

VAPE was modeled to determine this launch rate and to determine the processing times for an Orbiter at VAPE. This information was then used in the ... year (node 79 and activity ?1). ETs are then selected to be sent to either KSC or VAPE (node 80). This decision is made (using UR 8) on the basis of

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romanov, Gennady

Typically, RFQs are designed using Parmteq, DesRFQ, and other similar specialized codes, which produce files containing the field and geometrical parameters for every cell. Beam dynamics simulations with these analytical fields are, of course, ideal realizations of the designed RFQs. New advanced computing capabilities have made it possible to simulate the beam, and even dark current, in realistic 3D electromagnetic fields in RFQs that may reflect cavity tuning, the presence of tuners and couplers, RFQ segmentation, etc. The paper describes the utilization of the full 3D field distribution obtained with CST Studio Suite for beam dynamics simulations using both the PIC solver of CST Particle Studio and the beam dynamics code TRACK.

  8. In-Situ Resource Utilization (ISRU) Capability Roadmap Progress Review

    NASA Technical Reports Server (NTRS)

    Sanders, Gerald B.; Duke, Michael

    2005-01-01

    A progress review on In-Situ Resource Utilization (ISRU) capability is presented. The topics include: 1) In-Situ Resource Utilization (ISRU) Capability Roadmap: Level 1; 2) ISRU Emphasized Architecture Overview; 3) ISRU Capability Elements: Level 2 and below; and 4) ISRU Capability Roadmap Wrap-up.

  9. Computer-aided analysis and design of the shape rolling process for producing turbine engine airfoils

    NASA Technical Reports Server (NTRS)

    Lahoti, G. D.; Akgerman, N.; Altan, T.

    1978-01-01

Mild steel (AISI 1018) was selected as the model cold rolling material, and Ti-6Al-4V and Inconel 718 were selected as typical hot rolling and cold rolling alloys, respectively. The flow stress and workability of these alloys were characterized, and the friction factor at the roll/workpiece interface was determined at their respective working conditions by conducting ring tests. Computer-aided mathematical models for predicting metal flow and stresses, and for simulating the shape rolling process, were developed. These models utilized the upper-bound and slab methods of analysis and were capable of predicting the lateral spread, roll separating force, roll torque, and local stresses, strains, and strain rates. This computer-aided design system was also capable of simulating the actual rolling process, and thereby designing the roll pass schedule for rolling an airfoil or a similar shape.

  10. Status of Computational Aerodynamic Modeling Tools for Aircraft Loss-of-Control

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Murphy, Patrick C.; Atkins, Harold L.; Viken, Sally A.; Petrilli, Justin L.; Gopalarathnam, Ashok; Paul, Ryan C.

    2016-01-01

    A concerted effort has been underway over the past several years to evolve computational capabilities for modeling aircraft loss-of-control under the NASA Aviation Safety Program. A principal goal has been to develop reliable computational tools for predicting and analyzing the non-linear stability & control characteristics of aircraft near stall boundaries affecting safe flight, and for utilizing those predictions for creating augmented flight simulation models that improve pilot training. Pursuing such an ambitious task with limited resources required the forging of close collaborative relationships with a diverse body of computational aerodynamicists and flight simulation experts to leverage their respective research efforts into the creation of NASA tools to meet this goal. Considerable progress has been made and work remains to be done. This paper summarizes the status of the NASA effort to establish computational capabilities for modeling aircraft loss-of-control and offers recommendations for future work.

  11. WE-D-BRF-01: FEATURED PRESENTATION - Investigating Particle Track Structures Using Fluorescent Nuclear Track Detectors and Monte Carlo Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dowdell, S; Paganetti, H; Schuemann, J

Purpose: To report on the efforts funded by the AAPM seed funding grant to develop the basis for fluorescent nuclear track detector (FNTD) based radiobiological experiments in combination with dedicated Monte Carlo simulations (MCS) on the nanometer scale. Methods: Two confocal microscopes were utilized in this study. Two FNTD samples were used to find the optimal microscope settings: one FNTD irradiated with 11.1 MeV/u gold ions and one irradiated with 428.77 MeV/u carbon ions. The first sample provided a brightly luminescent central track, while the latter was used to test the capability to observe secondary electrons. MCS were performed using the TOPAS beta9 version, layered on top of Geant4.9.6p02. Two sets of simulations were performed: one with the Geant4-DNA physics list, approximating the FNTDs by water, and a second set using the Penelope physics list in both a water-approximated FNTD and an aluminum-oxide FNTD. Results: Within the first half of the funding period, we successfully established readout capabilities of FNTDs at our institute. Due to technical limitations, our microscope setup is significantly different from the approach implemented at the DKFZ, Germany. However, we can clearly reconstruct carbon tracks in 3D with an electron track resolution of 200 nm. A second microscope with superior readout capabilities will be tested in the second half of the funding period; we expect an improvement in signal-to-background ratio at the same resolution. We have also successfully simulated tracks in FNTDs. The more accurate Geant4-DNA track simulations can be used to reconstruct the track energy from the size and brightness of the observed tracks. Conclusion: We have achieved the goals set in the seed funding proposal: the setup of FNTD readout and simulation capabilities. We will work on improving the readout resolution to validate our MCS track structures down to the nanometer scale.

  12. Statistical Similarities Between WSA-ENLIL+Cone Model and MAVEN in Situ Observations From November 2014 to March 2016

    NASA Astrophysics Data System (ADS)

    Lentz, C. L.; Baker, D. N.; Jaynes, A. N.; Dewey, R. M.; Lee, C. O.; Halekas, J. S.; Brain, D. A.

    2018-02-01

Normal solar wind flows and intense solar transient events interact directly with the upper Martian atmosphere due to the absence of an intrinsic global planetary magnetic field. Since the launch of the Mars Atmosphere and Volatile EvolutioN (MAVEN) mission, there are now new means to directly observe solar wind parameters at the planet's orbital location for limited time spans. Due to MAVEN's highly elliptical orbit, in situ measurements cannot be taken while MAVEN is inside Mars' magnetosheath. To model solar wind conditions during these atmospheric and magnetospheric passages, this research project utilized the solar wind forecasting capabilities of the WSA-ENLIL+Cone model. The model was used to simulate solar wind parameters, including magnetic field magnitude, plasma particle density, dynamic pressure, proton temperature, and velocity, over a segment four Carrington rotations long. An additional simulation lasting 18 Carrington rotations was then conducted. The precision of each simulation was examined for intervals when MAVEN was in the upstream solar wind, that is, with no exospheric or magnetospheric phenomena altering in situ measurements. It was determined that generalized, extensive simulations have prediction capabilities comparable to those of shorter, more comprehensive simulations. Overall, this study aimed to quantify the loss of detail in long-term simulations and to determine whether extended simulations can provide accurate, continuous upstream solar wind conditions when in situ measurements are lacking.
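    The statistical comparison described above can be illustrated with a minimal sketch (not the authors' code): compute RMSE and Pearson correlation between modeled and observed parameters, skipping epochs without a valid upstream measurement, such as magnetosheath passages:

```python
import math

def comparison_stats(model, observed):
    """RMSE and Pearson correlation between model output and in situ
    observations, skipping epochs with no valid upstream measurement
    (marked None), e.g. while the spacecraft is inside the magnetosheath."""
    pairs = [(m, o) for m, o in zip(model, observed) if o is not None]
    n = len(pairs)
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in pairs) / n)
    mean_m = sum(m for m, _ in pairs) / n
    mean_o = sum(o for _, o in pairs) / n
    cov = sum((m - mean_m) * (o - mean_o) for m, o in pairs)
    sd_m = math.sqrt(sum((m - mean_m) ** 2 for m, _ in pairs))
    sd_o = math.sqrt(sum((o - mean_o) ** 2 for _, o in pairs))
    return rmse, cov / (sd_m * sd_o)

# Hypothetical hourly series; the fourth observation is a magnetosheath gap
rmse, r = comparison_stats([1.0, 2.0, 3.0, 5.0, 4.0],
                           [2.0, 4.0, 6.0, None, 8.0])
```
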

  13. Study on the Potential Development of Rainwater Utilization in the Hilly City of Southern China

    NASA Astrophysics Data System (ADS)

    Fu, Xiaoran; Liu, Jiahong; Shao, Weiwei; Zhang, Haixing

    2017-12-01

To address current urban flooding and the contradiction between water supply and demand in the cities of southern China, comprehensive utilization of Urban Rainwater Resources (URRs) is a significant solution. At present, research on comprehensive urban rainwater utilization systems in China is still immature, and comprehensive methods suited to the combined rainwater and flood resources of the south are particularly lacking. Building on current modes of URR utilization at home and abroad, Fenghuang County in Hunan Province, a typical mountainous city of southern China, was taken as a case study, and the development potential of its URRs was simulated and evaluated by comparing conditions before and after URR exploitation. The reduction of flooding and waterlogging in the ancient city area was analyzed using SWMM. The simulation results show that, under a utilization mode that gives priority to flood prevention and control, the URR development potential in Fenghuang County is remarkable, with an annual development potential of 4.865×10^5 m3, and the flood-control effect of the rainwater utilization measures under this mode is evident. These results can provide theoretical and technical support for enhancing urban water security, water conservation capacity, and urban flood disaster mitigation.
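    The paper's potential estimate comes from SWMM simulation. As a hedged illustration only (not the paper's method), a common first-order screening formula for annually capturable rainwater volume is catchment area times rainfall depth times runoff coefficient times capture efficiency:

```python
def annual_capture_potential(area_m2: float, annual_rain_mm: float,
                             runoff_coeff: float, capture_eff: float) -> float:
    """First-order estimate of annually capturable rainwater volume (m^3):
    catchment area x rainfall depth x runoff coefficient x capture efficiency.
    All parameter values below are hypothetical."""
    return area_m2 * (annual_rain_mm / 1000.0) * runoff_coeff * capture_eff

# Hypothetical 1 km^2 catchment, 1000 mm/yr rainfall, 0.6 runoff, 50% captured
volume_m3 = annual_capture_potential(1_000_000.0, 1000.0, 0.6, 0.5)
```
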

  14. Thermal Properties Measurement Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carmack, Jon; Braase, Lori; Papesch, Cynthia

    2015-08-01

The Thermal Properties Measurement Report summarizes the research, development, installation, and initial use of significant experimental thermal property characterization capabilities at the INL in FY 2015. These new capabilities were used to characterize a U3Si2 (candidate accident-tolerant) fuel sample fabricated at the INL. The ability to perform measurements at various length scales is important and provides additional data not currently available in the literature. However, the real value of the data will be in achieving a phenomenological understanding of thermal conductivity in fuels and its ties to predictive modeling. Thus, the MARMOT advanced modeling and simulation capability was utilized to illustrate how the microstructural data can be modeled and compared with bulk characterization data. A scientific method was established for thermal property measurement on irradiated nuclear fuel samples, a capability which will be installed in the Irradiated Materials Characterization Laboratory (IMCL).

  15. Utilizing NX Advanced Simulation for NASA's New Mobile Launcher for Ares-I

    NASA Technical Reports Server (NTRS)

    Brown, Christopher

    2010-01-01

This slide presentation reviews the use of NX to simulate the new Mobile Launcher (ML) for the Ares-I. It includes: a comparison of the sizes of the Saturn V, the Space Shuttle, the Ares I, and the Ares V, with their heights and payload capabilities; the loads control plan; drawings of the base framing, the underside of the ML, the beam arrangement, and the finished base; and the origin of the 3D CAD data. It also reviews the modeling approach, meshing, the assembly Finite Element Model, the model summary, and beam improvements.

  16. Evaluation of an F100 multivariable control using a real-time engine simulation

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Soeder, J. F.; Skira, C.

    1977-01-01

The control evaluated has been designed for the F100-PW-100 turbofan engine. The F100 engine represents the current state of the art in aircraft gas turbine technology. The control makes use of a multivariable, linear quadratic regulator. The evaluation procedure employed a real-time hybrid computer simulation of the F100 engine and an implementation of the control logic on the NASA LeRC digital computer/controller. The results of the evaluation indicated that the control logic and its implementation will be capable of controlling the engine throughout its operating range.
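    The abstract does not give the regulator design details. As an illustrative sketch only (not the F100 controller itself), a steady-state linear quadratic regulator gain can be obtained by iterating the discrete Riccati recursion, shown here for a scalar system:

```python
def dlqr_gain(a: float, b: float, q: float, r: float, iters: int = 500) -> float:
    """Steady-state LQR feedback gain k (with u = -k*x) for the scalar
    discrete-time system x[n+1] = a*x[n] + b*u[n] and stage cost
    q*x^2 + r*u^2, found by iterating the discrete Riccati recursion."""
    p = q
    for _ in range(iters):
        k = (b * p * a) / (r + b * p * b)
        p = q + a * p * a - a * p * b * k
    return (b * p * a) / (r + b * p * b)

# Marginally stable plant (a = 1): the regulator must add damping
k = dlqr_gain(1.0, 1.0, 1.0, 1.0)
closed_loop_pole = 1.0 - 1.0 * k  # must lie inside the unit circle
```

    A multivariable regulator replaces the scalars with matrices and the division with a matrix inverse, but the fixed-point structure of the recursion is the same.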

  17. Coupled electromagnetic-thermodynamic simulations of microwave heating problems using the FDTD algorithm.

    PubMed

    Kopyt, Paweł; Celuch, Małgorzata

    2007-01-01

A practical implementation of a hybrid simulation system capable of modeling coupled electromagnetic-thermodynamic problems typical in microwave heating is described. The paper presents two approaches to modeling such problems. Both are based on an FDTD-based commercial electromagnetic solver coupled to an external thermodynamic analysis tool required for calculations of heat diffusion. The first approach utilizes a simple FDTD-based thermal solver, while in the second it is replaced by a universal commercial CFD solver. The accuracy of the two modeling systems is verified against original experimental data as well as measurement results available in the literature.
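    The coupling strategy described, an electromagnetic solver supplying dissipated power to a thermal solver that updates the temperature-dependent material state, can be sketched generically as an alternating loop. This is a schematic illustration with hypothetical callables, not the paper's solver:

```python
def coupled_heating(step_em, step_heat, temp_init, n_cycles):
    """Alternate electromagnetic and thermal solves: the EM step returns the
    dissipated power for the current (temperature-dependent) material
    properties, and the thermal step advances the temperature field."""
    temp = temp_init
    for _ in range(n_cycles):
        power = step_em(temp)          # e.g., FDTD solve at current temperature
        temp = step_heat(temp, power)  # heat-diffusion update over one interval
    return temp

# Toy 0-D load: absorbed power drops as the sample heats, so the coupled
# iteration settles toward a steady-state temperature near 100 degrees
final_temp = coupled_heating(lambda T: 10.0 - 0.1 * T,
                             lambda T, P: T + 0.5 * P,
                             20.0, 400)
```

    In the full 3D problem, `temp` and `power` would be fields on the FDTD grid, and the loop period would be chosen so material properties change little within one EM solve.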

  18. Orion Crew Module / Service Module Structural Weight and Center of Gravity Simulator and Vehicle Motion Simulator Hoist Structure for Orion Service Module Umbilical Testing

    NASA Technical Reports Server (NTRS)

    Ascoli, Peter A.; Haddock, Michael H.

    2014-01-01

An Orion Crew Module/Service Module Structural Weight and Center of Gravity Simulator and a Vehicle Motion Simulator hoist structure for Orion Service Module umbilical testing were designed during a summer 2014 internship in Kennedy Space Center's Structures and Mechanisms Design Branch. The simulator is a structure that supports ballast, which will be integrated into an existing Orion mock-up to simulate the mass properties of the Exploration Mission-1 flight vehicle in both fueled and unfueled states. The simulator mimics these configurations through the use of approximately 40,000 lbf of steel and water ballast and a steel support structure. Draining four water tanks, which house the water ballast, transitions the simulator from the fueled to the unfueled mass properties. The Ground Systems Development and Operations organization will utilize the simulator to verify and validate equipment used to maneuver and transport the Orion spacecraft in its fueled and unfueled configurations. The second design comprises a cantilevered tripod hoist structure that provides the capability to position a large Orion Service Module umbilical in proximity to the Vehicle Motion Simulator. The Ground Systems Development and Operations organization will utilize the Vehicle Motion Simulator, with the hoist structure attached, to test the Orion Service Module umbilical for proper operation prior to installation on the Mobile Launcher. Overall, these two designs provide NASA engineers with viable concepts worthy of fabricating and placing into service to prepare for the launch of Orion in 2017.

  19. Status of the Correlation Process of the V-HAB Simulation with Ground Tests and ISS Telemetry Data

    NASA Technical Reports Server (NTRS)

    Ploetner, P.; Roth, C.; Zhukov, A.; Czupalla, M.; Anderson, M.; Ewert, M.

    2013-01-01

    The Virtual Habitat (V-HAB) is a dynamic Life Support System (LSS) simulation, created for investigation of future human spaceflight missions. It provides the capability to optimize LSS during early design phases. The focal point of the paper is the correlation and validation of V-HAB against ground test and flight data. In order to utilize V-HAB to design an Environmental Control and Life Support System (ECLSS) it is important to know the accuracy of simulations, strengths and weaknesses. Therefore, simulations of real systems are essential. The modeling of the International Space Station (ISS) ECLSS in terms of single technologies as well as an integrated system and correlation against ground and flight test data is described. The results of the simulations make it possible to prove the approach taken by V-HAB.

  20. It is not how much you have but how you use it: toward a rational use of simulation to support aviation training.

    PubMed

    Salas, E; Bowers, C A; Rhodenizer, L

    1998-01-01

    One of the most remarkable changes in aviation training over the past few decades is the use of simulation. The capabilities now offered by simulation have created unlimited opportunities for aviation training. In fact, aviation training is now more realistic, safe, cost-effective, and flexible than ever before. However, we believe that a number of misconceptions--or invalid assumptions--exist in the simulation community that prevent us from fully exploiting and utilizing recent scientific advances in a number of related fields in order to further enhance aviation training. These assumptions relate to the overreliance on high-fidelity simulation and to the misuse of simulation to enhance learning of complex skills. The purpose of this article is to discuss these assumptions in the hope of initiating a dialogue between behavioral scientists and engineers.

  1. Application of the Ecosystem Assessment Model to Lake Norman: A cooling lake in North Carolina: Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Porcella, D.B.; Bowie, G.L.; Campbell, C.L.

The Ecosystem Assessment Model (EAM) of the Cooling Lake Assessment Methodology was applied to the extensive ecological field data collected at Lake Norman, North Carolina by Duke Power Company to evaluate its capability to simulate lake ecosystems and the ecological effects of steam electric power plants. The EAM provided simulations over a five-year verification period that behaved as expected based on a one-year calibration. Major state variables of interest to utilities and regulatory agencies are temperature, dissolved oxygen, and fish community variables. In qualitative terms, temperature simulation was very accurate, dissolved oxygen simulation was accurate, and fish prediction was reasonably accurate. The need for more accurate fisheries data collected at monthly intervals and for non-destructive sampling techniques was identified.

  2. Measurement of Nanoplasmonic Field Enhancement with Ultrafast Photoemission.

    PubMed

    Rácz, Péter; Pápa, Zsuzsanna; Márton, István; Budai, Judit; Wróbel, Piotr; Stefaniuk, Tomasz; Prietl, Christine; Krenn, Joachim R; Dombi, Péter

    2017-02-08

    Probing nanooptical near-fields is a major challenge in plasmonics. Here, we demonstrate an experimental method utilizing ultrafast photoemission from plasmonic nanostructures that is capable of probing the maximum nanoplasmonic field enhancement in any metallic surface environment. Directly measured field enhancement values for various samples are in good agreement with detailed finite-difference time-domain simulations. These results establish ultrafast plasmonic photoelectrons as versatile probes for nanoplasmonic near-fields.

  3. Improved Load Alleviation Capability for the KC-135

    DTIC Science & Technology

    1997-09-01

software, such as Matlab, Mathematica, Simulink, and Robotica Front End for Mathematica available in the simulation laboratory ... This thesis report is ... outlined in Spong's text in order to utilize the Robotica system development software, which automates the process of calculating the kinematic and ... kinematic and dynamic equations can be accomplished using a computer tool called Robotica Front End (RFE) [15], developed by Doctor Spong.

  4. Aircraft flight test trajectory control

    NASA Technical Reports Server (NTRS)

    Menon, P. K. A.; Walker, R. A.

    1988-01-01

Two control law design techniques are compared and the performance of the resulting controllers is evaluated. The design requirement is for a flight test trajectory controller (FTTC) capable of closed-loop, outer-loop control of an F-15 aircraft performing high-quality research flight test maneuvers. The maneuver modeling, linearization, and design methodologies utilized in this research are detailed. The results of applying these FTTCs to a nonlinear F-15 simulation are presented.

  5. Filter Media Tests Under Simulated Martian Atmospheric Conditions

    NASA Technical Reports Server (NTRS)

    Agui, Juan H.

    2016-01-01

    Human exploration of Mars will require the optimal utilization of planetary resources. One of its abundant resources is the Martian atmosphere that can be harvested through filtration and chemical processes that purify and separate it into its gaseous and elemental constituents. Effective filtration needs to be part of the suite of resource utilization technologies. A unique testing platform is being used which provides the relevant operational and instrumental capabilities to test articles under the proper simulated Martian conditions. A series of tests were conducted to assess the performance of filter media. Light sheet imaging of the particle flow provided a means of detecting and quantifying particle concentrations to determine capturing efficiencies. The media's efficiency was also evaluated by gravimetric means through a by-layer filter media configuration. These tests will help to establish techniques and methods for measuring capturing efficiency and arrestance of conventional fibrous filter media. This paper will describe initial test results on different filter media.

  6. 2-D and 3-D mixing flow analyses of a scramjet-afterbody configuration

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay; Eleshaky, Mohamed E.; Engelund, Walter C.

    1989-01-01

A cold simulant gas study of propulsion/airframe integration for a hypersonic vehicle powered by a scramjet engine is presented. The specific heat ratio of the hot exhaust gases is matched by utilizing a cold mixture of argon and Freon-12. Solutions are obtained for a hypersonic corner flow and a supersonic rectangular flow in order to provide the upstream boundary conditions. The computational test examples also provide a comparison of this flow with that of air as the expanding supersonic jet, where the specific heats are assumed to be constant. It is shown that the three-dimensional computational fluid dynamics capabilities developed for these types of flow may be utilized to augment conventional wind tunnel studies of scramjet afterbody flows using cold simulant exhaust gases, which in turn can help in the design of a scramjet internal-external nozzle.

  7. Simulation Based Optimization of Complex Monolithic Composite Structures Using Cellular Core Technology

    NASA Astrophysics Data System (ADS)

    Hickmott, Curtis W.

    Cellular core tooling is a new technology which has the capability to manufacture complex integrated monolithic composite structures. This novel tooling method utilizes thermoplastic cellular cores as inner tooling. The semi-rigid nature of the cellular cores makes them convenient for lay-up, and under autoclave temperature and pressure they soften and expand providing uniform compaction on all surfaces including internal features such as ribs and spar tubes. This process has the capability of developing fully optimized aerospace structures by reducing or eliminating assembly using fasteners or bonded joints. The technology is studied in the context of evaluating its capabilities, advantages, and limitations in developing high quality structures. The complex nature of these parts has led to development of a model using the Finite Element Analysis (FEA) software Abaqus and the plug-in COMPRO Common Component Architecture (CCA) provided by Convergent Manufacturing Technologies. This model utilizes a "virtual autoclave" technique to simulate temperature profiles, resin flow paths, and ultimately deformation from residual stress. A model has been developed simulating the temperature profile during curing of composite parts made with the cellular core technology. While modeling of composites has been performed in the past, this project will look to take this existing knowledge and apply it to this new manufacturing method capable of building more complex parts and develop a model designed specifically for building large, complex components with a high degree of accuracy. The model development has been carried out in conjunction with experimental validation. A double box beam structure was chosen for analysis to determine the effects of the technology on internal ribs and joints. Double box beams were manufactured and sectioned into T-joints for characterization. 
The mechanical behavior of the T-joints was evaluated using the T-joint pull-off test and compared to that of joints made with traditional tooling methods. Components made with the cellular core tooling method showed improved strength at the joints. This knowledge is expected to help optimize the processing of complex, integrated structures and to benefit aerospace applications where lighter, structurally efficient components would be advantageous.

  8. Simulation and Analysis of Converging Shock Wave Test Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramsey, Scott D.; Shashkov, Mikhail J.

    2012-06-21

Results and analysis pertaining to the simulation of the Guderley converging shock wave test problem (and associated code verification hydrodynamics test problems involving converging shock waves) in the LANL ASC radiation-hydrodynamics code xRAGE are presented. One-dimensional (1D) spherical and two-dimensional (2D) axisymmetric geometric setups are utilized and evaluated in this study, as is an instantiation of the xRAGE adaptive mesh refinement capability. For the 2D simulations, a 'Surrogate Guderley' test problem is developed and used to obviate subtleties inherent to the true Guderley solution's initialization on a square grid, while still maintaining a high degree of fidelity to the original problem and minimally straining the general credibility of the associated analysis and conclusions.

  9. Distributed environmental control

    NASA Technical Reports Server (NTRS)

    Cleveland, Gary A.

    1992-01-01

    We present an architecture of distributed, independent control agents designed to work with the Computer Aided System Engineering and Analysis (CASE/A) simulation tool. CASE/A simulates behavior of Environmental Control and Life Support Systems (ECLSS). We describe a lattice of agents capable of distributed sensing and overcoming certain sensor and effector failures. We address how the architecture can achieve the coordinating functions of a hierarchical command structure while maintaining the robustness and flexibility of independent agents. These agents work between the time steps of the CASE/A simulation tool to arrive at command decisions based on the state variables maintained by CASE/A. Control is evaluated according to both effectiveness (e.g., how well temperature was maintained) and resource utilization (the amount of power and materials used).
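    The agent scheme described, independent agents reading state variables between simulation time steps and issuing local commands without a central coordinator, can be sketched as follows. This is a hypothetical minimal example, not the CASE/A implementation:

```python
class ThermostatAgent:
    """Independent control agent: between simulation time steps it reads one
    state variable and issues a local command, with no central coordinator."""
    def __init__(self, setpoint: float, band: float):
        self.setpoint = setpoint
        self.band = band

    def decide(self, temperature: float) -> str:
        if temperature > self.setpoint + self.band:
            return "cool"
        if temperature < self.setpoint - self.band:
            return "heat"
        return "hold"

# Two agents with different setpoints each act on the same shared state
# variable; robustness comes from every agent deciding independently
agents = [ThermostatAgent(22.0, 1.0), ThermostatAgent(20.0, 0.5)]
commands = [agent.decide(23.5) for agent in agents]
```

    A lattice of such agents can tolerate the loss of individual sensors or effectors, since each agent's decision depends only on the state variables it can still read.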

  10. Final Technical Report for Contract No. DE-EE0006332, "Integrated Simulation Development and Decision Support Tool-Set for Utility Market and Distributed Solar Power Generation"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cormier, Dallas; Edra, Sherwin; Espinoza, Michael

This project enabled utilities to develop long-term strategic plans that integrate high levels of renewable energy generation, and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. The toolset focused on real-time simulation of distributed power generation within utility grids, with emphasis on potential applications in day-ahead (market) and real-time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing load and generation procedures through a sophisticated DMS. This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high-penetration solar PV on utility operations is not limited to control centers, but extends across many core operations. Benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and have yielded immediate, direct operational cost savings for energy marketing (day-ahead generation commitments), real-time operations, load forecasting (at an aggregate system level for day ahead), demand response, long-term planning (asset management), distribution operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.

  11. Utility of Emulation and Simulation Computer Modeling of Space Station Environmental Control and Life Support Systems

    NASA Technical Reports Server (NTRS)

    Yanosy, James L.

    1988-01-01

Over the years, computer modeling has been used extensively in many disciplines to solve engineering problems. A set of computer program tools is proposed to assist the engineer in the various phases of the Space Station program, from technology selection through flight operations. The development and application of emulation and simulation transient performance modeling tools for life support systems are examined, and the results of the development and the demonstration of the utility of three computer models are presented. The first model is a detailed computer model (emulation) of a solid amine water desorption (SAWD) CO2 removal subsystem combined with much less detailed models (simulations) of a cabin, crew, and heat exchangers. This model was used in parallel with the hardware design and test of the CO2 removal subsystem. The second model is a simulation of an air revitalization system combined with a wastewater processing system, demonstrating the capability to study subsystem integration. The third model is of a Space Station total air revitalization system. The station configuration consists of a habitat module, a lab module, two crews, and four connecting nodes.

  12. Influencing agent group behavior by adjusting cultural trait values.

    PubMed

    Tuli, Gaurav; Hexmoor, Henry

    2010-10-01

    Social reasoning and norms among individuals that share cultural traits are largely fashioned by those traits. We have explored predominant sociological and cultural traits and offer a methodology for parametrically adjusting the relevant ones. This exploratory study heralds a capability to deliberately tune cultural group traits in order to produce a desired group behavior. To validate our methodology, we implemented a prototypical agent-based simulated test bed demonstrating an exemplar intelligence, surveillance, and reconnaissance scenario. A group of simulated agents traverses a hostile territory while a user adjusts their cultural group trait settings. Group and individual utilities are dynamically observed against parametric values for the selected traits. The uncertainty avoidance index and individualism are the cultural traits we examined in depth. Once trained on the correspondence between cultural values and system utilities, users can deliberately produce desired system utilities by issuing changes to trait values. Specific cultural traits are without meaning outside of their context; efficacy and timely application of traits in a given context do yield desirable results. This paper heralds a path for the control of large systems via parametric cultural adjustments.
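
The loop this abstract describes (probe a trait setting, observe group utility, update beliefs about the trait-utility correspondence) can be sketched with a grid-based recursive Bayesian estimator. Everything below is a toy stand-in: the quadratic utility curve, its optimum at 0.7, and the noise level are invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical trait-to-utility mapping: group utility peaks at an
# (unknown to the user) uncertainty-avoidance setting of 0.7, observed noisily.
def observed_utility(u):
    return -(u - 0.7) ** 2 + rng.normal(0, 0.05)

# Recursive Bayesian estimation of the best trait setting over a grid of
# candidate optima, updated after each probe of the simulated group.
grid = np.linspace(0.0, 1.0, 101)
log_post = np.zeros_like(grid)          # flat prior over candidate optima
for _ in range(200):
    u = rng.uniform(0.0, 1.0)           # trait value issued by the user
    y = observed_utility(u)             # group utility reported by the sim
    residual = y + (u - grid) ** 2      # misfit if the optimum sat at each grid point
    log_post += -residual ** 2 / (2 * 0.05 ** 2)

best = grid[np.argmax(log_post)]
print(best)                             # converges near the true optimum, 0.7
```

The same recursive-update structure generalizes to any parametric trait whose effect on a system utility can be observed repeatedly.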

  13. Small Projects Rapid Integration and Test Environment (SPRITE): Application for Increasing Robustness

    NASA Technical Reports Server (NTRS)

    Rakoczy, John; Heater, Daniel; Lee, Ashley

    2013-01-01

    Marshall Space Flight Center's (MSFC) Small Projects Rapid Integration and Test Environment (SPRITE) is a Hardware-In-The-Loop (HWIL) facility that provides rapid development, integration, and testing capabilities for small projects (CubeSats, payloads, spacecraft, and launch vehicles). The facility focuses on efficient processes and modular design to support rapid prototyping, integration, testing, and verification of small projects at an affordable cost, especially compared to larger HWIL facilities. SPRITE (Figure 1) consists of a "core" capability or "plant" simulation platform utilizing a graphical programming environment capable of being rapidly re-configured for any potential test article's space environments, as well as a standard set of interfaces (e.g., MIL-STD-1553, serial, analog, digital). SPRITE also allows this level of interface testing of components and subsystems very early in a program, thereby reducing program risk.

  14. Nonlinear filter based decision feedback equalizer for optical communication systems.

    PubMed

    Han, Xiaoqi; Cheng, Chi-Hao

    2014-04-07

    Nonlinear impairments in optical communication systems have become a major concern of optical engineers. In this paper, we demonstrate that utilizing a nonlinear filter based Decision Feedback Equalizer (DFE) with error detection capability can deliver better performance than the conventional linear filter based DFE. The proposed algorithms are tested in simulation using a coherent 100 Gb/s 16-QAM optical communication system in a legacy optical network setting.

  15. On a simulation study for reliable and secured smart grid communications

    NASA Astrophysics Data System (ADS)

    Mallapuram, Sriharsha; Moulema, Paul; Yu, Wei

    2015-05-01

    Demand response is one of the key smart grid applications; it aims to reduce power generation at peak hours and maintain a balance between supply and demand. With the support of communication networks, energy consumers can become active actors in the energy management process by adjusting or rescheduling their electricity usage during peak hours based on utilities' pricing incentives. Nonetheless, the integration of communication networks exposes the smart grid to cyber-attacks. In this paper, we developed a smart grid simulation test-bed and designed evaluation scenarios. By leveraging the capabilities of the Matlab and ns-3 simulation tools, we conducted a simulation study to evaluate the impact of cyber-attacks on the demand response application. Our data show that cyber-attacks could seriously disrupt smart grid operations, confirming the need for secure and resilient communication networks to support smart grid operations.
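
A minimal stand-in for the kind of experiment such a test-bed supports: if a denial-of-service attack drops demand-response curtailment messages, peak load creeps back toward the uncontrolled baseline. The home count, demand levels, and curtailable fraction below are invented for illustration; the actual study used coupled Matlab/ns-3 models.

```python
import numpy as np

rng = np.random.default_rng(3)
n_homes = 1000
baseline_kw = rng.uniform(1.0, 3.0, n_homes)   # per-home peak demand, kW
curtailable = 0.4                              # fraction each home sheds on request

def peak_load(delivery_rate):
    """Total peak demand (kW) when each curtailment message reaches its home
    with probability `delivery_rate`; an attack drops the remainder."""
    received = rng.random(n_homes) < delivery_rate
    return float(np.sum(np.where(received,
                                 baseline_kw * (1 - curtailable),
                                 baseline_kw)))

print(peak_load(1.0))   # healthy network: full demand-response benefit
print(peak_load(0.5))   # half the messages dropped: peak rises toward baseline
```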

  16. GTKDynamo: a PyMOL plug-in for QC/MM hybrid potential simulations

    PubMed Central

    Bachega, José Fernando R.; Timmers, Luís Fernando S.M.; Assirati, Lucas; Bachega, Leonardo R.; Field, Martin J.; Wymore, Troy

    2014-01-01

    Hybrid quantum chemical (QC)/molecular mechanical (MM) potentials are very powerful tools for molecular simulation. They are especially useful for studying processes in condensed phase systems, such as chemical reactions, that involve a relatively localized change in electronic structure and where the surrounding environment contributes to these changes but can be represented with more computationally efficient functional forms. Despite their utility, however, these potentials are not always straightforward to apply since the extent of significant electronic structure changes occurring in the condensed phase process may not be intuitively obvious. To facilitate their use we have developed an open-source graphical plug-in, GTKDynamo, that links the PyMOL visualization program and the pDynamo QC/MM simulation library. This article describes the implementation of GTKDynamo and its capabilities and illustrates its application to QC/MM simulations. PMID:24137667

  17. Determination of ASPS performance for large payloads in the shuttle orbiter disturbance environment. [digital simulation

    NASA Technical Reports Server (NTRS)

    Keckler, C. R.; Kibler, K. S.; Powell, L. F.

    1979-01-01

    A high fidelity simulation of the annular suspension and pointing system (ASPS), its payload, and the shuttle orbiter was used to define the worst-case orientations of the ASPS and its payload for the various vehicle disturbances, and to determine the performance capability of the ASPS under these conditions. The most demanding and largest proposed payload, the Solar Optical Telescope, was selected for study. It was found that, in all cases, the ASPS more than satisfied the payload's requirements. It is concluded that, to satisfy facility-class payload requirements, the ASPS or a shuttle orbiter free-drift mode (control system off) should be utilized.

  18. Progress on the Multiphysics Capabilities of the Parallel Electromagnetic ACE3P Simulation Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kononenko, Oleksiy

    2015-03-26

    ACE3P is a 3D parallel simulation suite that is being developed at SLAC National Accelerator Laboratory. Effectively utilizing supercomputer resources, ACE3P has become a key tool for the coupled electromagnetic, thermal and mechanical research and design of particle accelerators. Based on the existing finite-element infrastructure, a massively parallel eigensolver is developed for modal analysis of mechanical structures. It complements a set of the multiphysics tools in ACE3P and, in particular, can be used for the comprehensive study of microphonics in accelerating cavities ensuring the operational reliability of a particle accelerator.
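
ACE3P's mechanical eigensolver operates on massively parallel finite-element matrices; the underlying modal-analysis idea can nevertheless be sketched on a toy lumped spring-mass chain, reducing the generalized symmetric eigenproblem K x = ω² M x to a standard one. All numbers here are hypothetical.

```python
import numpy as np

# Toy modal analysis: lumped spring-mass chain fixed at both ends,
# generalized symmetric eigenproblem K x = w^2 M x.
n, k = 5, 1000.0                                   # spring stiffness, N/m
masses = np.array([2.0, 2.0, 1.0, 2.0, 2.0])       # kg, non-uniform on purpose
K = 2 * k * np.eye(n) - k * np.eye(n, k=1) - k * np.eye(n, k=-1)
M = np.diag(masses)

# Reduce to a standard problem: (M^-1/2 K M^-1/2) y = w^2 y, with x = M^-1/2 y
Mih = np.diag(1.0 / np.sqrt(masses))
w2, y = np.linalg.eigh(Mih @ K @ Mih)              # eigenvalues ascending
freqs_hz = np.sqrt(w2) / (2 * np.pi)
print(freqs_hz)                                    # natural frequencies, Hz
```

A production code solves the same eigenproblem with sparse, distributed matrices; the spectrum of mechanical modes is what a microphonics study then compares against cavity drive frequencies.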

  19. Double-Pulse Two-Micron IPDA Lidar Simulation for Airborne Carbon Dioxide Measurements

    NASA Technical Reports Server (NTRS)

    Refaat, Tamer F.; Singh, Upendra N.; Yu, Jirong; Petros, Mulugeta

    2015-01-01

    An advanced double-pulsed 2-micron integrated path differential absorption (IPDA) lidar has been developed at NASA Langley Research Center for measuring atmospheric carbon dioxide. The instrument utilizes a state-of-the-art 2-micron laser transmitter with a tunable on-line wavelength and an advanced receiver. Instrument modeling and airborne simulations are presented in this paper. Focusing on random errors, results demonstrate the instrument's capability to perform precise carbon dioxide differential optical depth measurements with less than 3% random error for single-shot operation from up to 11 km altitude. This study is useful for defining CO2 measurement weighting, instrument settings, validation, and sensitivity trade-offs.
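
The core IPDA retrieval is the energy-normalized log-ratio of off-line to on-line returns, from which the shot-to-shot random error of the differential optical depth can be estimated. A sketch with invented numbers (3% multiplicative shot noise and a one-way differential optical depth of 0.5; not the instrument's actual parameters):

```python
import numpy as np

rng = np.random.default_rng(2)
true_dod = 0.5            # assumed one-way CO2 differential optical depth
n_shots = 1000

# Energy-normalized on/off-line returns with 3% multiplicative shot-to-shot noise
e_on = e_off = 1.0        # monitored pulse energies
p_off = 1.0 * (1.0 + rng.normal(0, 0.03, n_shots))
p_on = np.exp(-2 * true_dod) * (1.0 + rng.normal(0, 0.03, n_shots))

# IPDA retrieval: one-way differential optical depth from the log-ratio
dod = 0.5 * np.log((p_off * e_on) / (p_on * e_off))
print(dod.mean())                  # retrieved optical depth
print(dod.std() / dod.mean())      # shot-to-shot relative random error
```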

  20. Current and anticipated uses of thermal-hydraulic codes in Germany

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teschendorff, V.; Sommer, F.; Depisch, F.

    1997-07-01

    In Germany, one third of the electrical power is generated by nuclear plants. ATHLET and S-RELAP5 are successfully applied for safety analyses of the existing PWR and BWR reactors and possible future reactors, e.g. EPR. Continuous development and assessment of thermal-hydraulic codes are necessary in order to meet present and future needs of licensing organizations, utilities, and vendors. Desired improvements include thermal-hydraulic models, multi-dimensional simulation, computational speed, interfaces to coupled codes, and code architecture. Real-time capability will be essential for application in full-scope simulators. Comprehensive code validation and quantification of uncertainties are prerequisites for future best-estimate analyses.

  1. Thermal bioaerosol cloud tracking with Bayesian classification

    NASA Astrophysics Data System (ADS)

    Smith, Christian W.; Dupuis, Julia R.; Schundler, Elizabeth C.; Marinelli, William J.

    2017-05-01

    The development of a wide area, bioaerosol early warning capability employing existing uncooled thermal imaging systems used for persistent perimeter surveillance is discussed. The capability combines thermal imagers with other available data streams, including meteorological data, and employs a recursive Bayesian classifier to detect, track, and classify observed thermal objects with attributes consistent with a bioaerosol plume. Target detection is achieved based on similarity to a phenomenological model which predicts the scene-dependent thermal signature of bioaerosol plumes. Change detection in thermal sensor data is combined with local meteorological data to locate targets with the appropriate thermal characteristics. Target motion is tracked utilizing a Kalman filter and a nearly constant velocity motion model for cloud state estimation. Track management is performed using a logic-based upkeep system, and data association is accomplished using a combinatorial optimization technique. Bioaerosol threat classification is determined using a recursive Bayesian classifier to quantify the threat probability of each tracked object. The classifier can accept additional inputs from visible imagers, acoustic sensors, and point biological sensors to improve classification confidence. This capability was successfully demonstrated for bioaerosol simulant releases during field testing at Dugway Proving Ground. Standoff detection at a range of 700 m was achieved for as little as 500 g of anthrax simulant. Developmental test results will be reviewed for a range of simulant releases, and future development and transition plans for the bioaerosol early warning platform will be discussed.
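
The cloud-state estimator described above, a Kalman filter with a nearly-constant-velocity motion model fed by per-frame centroid detections, can be sketched as follows. The noise covariances and the simulated drift are hypothetical, not the system's tuned values.

```python
import numpy as np

def make_cv_model(dt, q, r):
    """Nearly-constant-velocity model; state is [x, y, vx, vy]."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], float)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], float)    # centroid (position-only) measurements
    return F, H, q * np.eye(4), r * np.eye(2)

def kalman_step(x, P, z, F, H, Q, R):
    x, P = F @ x, F @ P @ F.T + Q                  # predict
    S = H @ P @ H.T + R                            # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
    x = x + K @ (z - H @ x)                        # update with measurement z
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Track a cloud drifting at (1.0, 0.5) pixels/frame from noisy centroid fixes
rng = np.random.default_rng(0)
F, H, Q, R = make_cv_model(dt=1.0, q=1e-3, r=0.25)
x, P = np.zeros(4), np.eye(4)
truth = np.zeros(2)
for _ in range(50):
    truth = truth + np.array([1.0, 0.5])
    z = truth + rng.normal(0, 0.5, 2)
    x, P = kalman_step(x, P, z, F, H, Q, R)
print(x[2:])    # estimated drift velocity, close to (1.0, 0.5)
```

In the full system the filter output feeds track upkeep and the recursive Bayesian threat classifier.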

  2. Advanced manned space flight simulation and training: An investigation of simulation host computer system concepts

    NASA Technical Reports Server (NTRS)

    Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.

    1989-01-01

    The findings of a preliminary investigation by Southwest Research Institute (SwRI) into simulation host computer concepts are presented. The investigation is designed to aid NASA in evaluating simulation technologies for use in spaceflight training, and focuses on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe-based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time-critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.

  3. Toward a Climate OSSE for NASA Earth Sciences

    NASA Astrophysics Data System (ADS)

    Leroy, S. S.; Collins, W. D.; Feldman, D.; Field, R. D.; Ming, Y.; Pawson, S.; Sanderson, B.; Schmidt, G. A.

    2016-12-01

    In the Continuity Study, the National Academy of Sciences advised that future space missions be rated according to five categories: the importance of a well-defined scientific objective, the utility of the observation in addressing the scientific objective, the quality with which the observation can be made, the probability of the mission's success, and the mission's affordability. The importance, probability, and affordability are evaluated subjectively by scientific consensus, by engineering review panels, and by cost models; however, the utility and quality can be evaluated objectively by a climate observation system simulation experiment (COSSE). A discussion of the philosophical underpinnings of a COSSE for NASA Earth Sciences will be presented. A COSSE is built upon a perturbed physics ensemble of a sophisticated climate model that can simulate a mission's prospective observations and its well-defined quantitative scientific objective and that can capture the uncertainty associated with each. A strong correlation between observation and scientific objective after consideration of physical uncertainty leads to a high quality. Persistence of a high correlation after inclusion of the proposed measurement error leads to a high utility. There are five criteria that govern the nature of a particular COSSE: (1) whether the mission's scientific objective is one of hypothesis testing or climate prediction, (2) whether the mission is empirical or inferential, (3) whether the core climate model captures essential physical uncertainties, (4) the level of detail of the simulated observations, and (5) whether complementarity or redundancy of information is to be valued. Computation of the quality and utility is done using Bayesian statistics, as has been done previously for multi-decadal climate prediction conditioned on existing data. We advocate for a new program within NASA Earth Sciences to establish a COSSE capability.
Creation of a COSSE program within NASA Earth Sciences will require answers from the climate research community to basic questions, such as whether a COSSE capability should be centralized or de-centralized. Most importantly, the quantified scientific objective of a proposed mission must be defined with extreme specificity for a COSSE to be applied.

  4. Modeling and design of light powered biomimicry micropump utilizing transporter proteins

    NASA Astrophysics Data System (ADS)

    Liu, Jin; Sze, Tsun-Kay Jackie; Dutta, Prashanta

    2014-11-01

    The creation of compact micropumps to provide steady flow has been an ongoing challenge in the field of microfluidics. We present a mathematical model for a micropump utilizing Bacteriorhodopsin and sugar transporter proteins. This micropump utilizes transporter proteins as a method to drive fluid flow by converting light energy into chemical potential. The fluid flow through a microchannel is simulated using the Nernst-Planck, Navier-Stokes, and continuity equations. Numerical results show that the micropump is capable of generating usable pressure. Design parameters influencing the performance of the micropump are investigated, including membrane fraction, lipid proton permeability, illumination, and channel height. The results show that there is a substantial membrane-fraction region at which fluid flow is maximized. The use of lipids with low membrane proton permeability allows illumination to be used as a method to turn the pump on and off. This capability allows the micropump to be activated and shut off remotely without bulky support equipment. This modeling work provides new insights on mechanisms potentially useful for fluidic pumping in self-sustained biomimetic microfluidic pumps. This work is supported in part by the National Science Foundation Grant CBET-1250107.
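
The model couples the Nernst-Planck, Navier-Stokes, and continuity equations; the ionic-flux term alone can be illustrated in 1D, where flux is the sum of diffusion and electromigration contributions. The diffusivity, concentration, and gradients below are illustrative values, not the paper's.

```python
import numpy as np

F_CONST = 96485.0   # Faraday constant, C/mol
R_GAS = 8.314       # gas constant, J/(mol K)
T = 298.0           # temperature, K

def nernst_planck_flux(D, z, c, dcdx, dphidx):
    """1D Nernst-Planck flux (mol m^-2 s^-1): diffusion plus electromigration;
    the advective term is omitted in this sketch."""
    return -D * dcdx - D * z * F_CONST / (R_GAS * T) * c * dphidx

# Illustrative proton transport down a 10-um channel (values invented)
D_h = 9.3e-9                 # proton diffusivity in water, m^2/s
c = 1e-4                     # ~pH 7 proton concentration, mol/m^3
dcdx = -c / 10e-6            # concentration decays along the channel
dphidx = -1e3                # potential gradient, V/m
flux = nernst_planck_flux(D_h, 1, c, dcdx, dphidx)
print(flux)                  # positive: net proton flux down the channel
```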

  5. Tolerance design of patient-specific range QA using the DMAIC framework in proton therapy.

    PubMed

    Rah, Jeong-Eun; Shin, Dongho; Manger, Ryan P; Kim, Tae Hyun; Oh, Do Hoon; Kim, Dae Yong; Kim, Gwe-Ya

    2018-02-01

    To demonstrate that DMAIC (Define-Measure-Analyze-Improve-Control) can be used to customize patient-specific QA by designing site-specific range tolerances. The DMAIC framework tools (process flow diagram, cause-and-effect diagram, Pareto chart, control chart, and capability analysis) were utilized to determine the steps that need focus for improving patient-specific QA. The patient-specific range QA plans were selected according to seven treatment site groups, a total of 1437 cases. The process capability index Cpm was used to guide the tolerance design of patient site-specific ranges. For the prostate field, our results suggested that the patient range measurements were capable at the current tolerance level of ±1 mm in clinical proton plans. For other site-specific ranges, the analysis showed that tolerances tend to be overdesigned relative to the insufficient process capability calculated from the patient-specific QA data. Customized tolerances were calculated for each treatment site. Control charts were constructed to simulate the patient QA time before and after the new tolerances were implemented. It was found that the total simulated QA time decreased by approximately 20% on average after establishing the new site-specific range tolerances. We also simulated the financial impact of this project: QA failure across the whole proton therapy process would lead to an approximately 30% increase in total cost. The DMAIC framework can be used to provide effective QA by setting customized tolerances. When tolerance design is customized, quality is reasonably balanced with time and cost demands. © 2017 American Association of Physicists in Medicine.
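
The Taguchi capability index used here, Cpm = (USL − LSL) / (6·√(σ² + (μ − T)²)), penalizes both spread and an off-target mean. A sketch with simulated range-deviation data; the data and the 1.33 rule-of-thumb threshold are illustrative, not the paper's values.

```python
import numpy as np

def cpm(measurements, lsl, usl, target):
    """Taguchi capability index Cpm: penalizes spread and off-target mean."""
    m = np.asarray(measurements, float)
    tau = np.sqrt(np.var(m) + (m.mean() - target) ** 2)
    return (usl - lsl) / (6.0 * tau)

# Simulated range-QA deviations (measured minus planned range, mm); the
# mean offset and spread here are invented, not the paper's data.
rng = np.random.default_rng(1)
deviations = rng.normal(loc=0.05, scale=0.25, size=200)

index = cpm(deviations, lsl=-1.0, usl=1.0, target=0.0)
print(f"Cpm = {index:.2f}")   # compare against a capability goal, e.g. 1.33
```

Inverting the same formula for a desired Cpm gives the customized tolerance half-width for a site.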

  6. Missile airframe simulation testbed: MANPADS (MAST-M) for test and evaluation of aircraft survivability equipment

    NASA Astrophysics Data System (ADS)

    Clements, Jim; Robinson, Richard; Bunt, Leslie; Robinson, Joe

    2011-06-01

    A number of techniques have been utilized to evaluate the performance of Aircraft Survivability Equipment (ASE) against threat Man-Portable Air Defense Systems (MANPADS). These techniques include flying actual threat MANPADS against stationary ASE with simulated aircraft signatures, testing installed ASE systems against simulated threat signatures, and laboratory hardware-in-the-loop (HWIL) testing with simulated aircraft and simulated missile signatures. All of these tests lack the realism of evaluating installed ASE against in-flight MANPADS on a terminal homing intercept path toward the actual ASE-equipped aircraft. This limitation is due primarily to the current inability to perform non-destructive MANPADS/aircraft flight testing. The U.S. Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) is working to overcome this limitation with the development of a recoverable surrogate MANPADS missile system capable of engaging aircraft equipped with ASE while guaranteeing collision avoidance with the test aircraft. Under its Missile Airframe Simulation Testbed - MANPADS (MAST-M) program, the AMRDEC is developing a surrogate missile system which will utilize actual threat MANPADS seeker/guidance sections to control the flight of a surrogate missile which will perform a collision avoidance and recovery maneuver prior to intercept to ensure non-destructive test and evaluation of the ASE and reuse of the MANPADS seeker/guidance section. The remainder of this paper provides an overview of this development program and intended use.

  8. Computational Evaluation of Mg–Salen Compounds as Subsurface Fluid Tracers: Molecular Dynamics Simulations in Toluene–Water Mixtures and Clay Mineral Nanopores

    DOE PAGES

    Greathouse, Jeffery A.; Boyle, Timothy J.; Kemp, Richard A.

    2018-04-11

    Molecular tracers that can be selectively placed underground and uniquely identified at the surface using simple on-site spectroscopic methods would significantly enhance subsurface fluid monitoring capabilities. To ensure their widespread utility, the solubility of these tracers must be easily tuned to oil- or water-wet conditions as well as reducing or eliminating their propensity to adsorb onto subsurface rock and/or mineral phases. In this work, molecular dynamics simulations were used to investigate the relative solubilities and mineral surface adsorption properties of three candidate tracer compounds comprising Mg–salen derivatives of varying degrees of hydrophilic character. Simulations in water–toluene liquid mixtures indicate that the partitioning of each Mg–salen compound relative to the interface is strongly influenced by the degree of hydrophobicity of the compound. Simulations of these complexes in fluid-filled mineral nanopores containing neutral (kaolinite) and negatively charged (montmorillonite) mineral surfaces reveal that adsorption tendencies depend upon a variety of parameters, including tracer chemical properties, mineral surface type, and solvent type (water or toluene). Simulation snapshots and averaged density profiles reveal insight into the solvation and adsorption mechanisms that control the partitioning of these complexes in mixed liquid phases and nanopore environments. As a result, this work demonstrates the utility of molecular simulation in the design and screening of molecular tracers for use in subsurface applications.

  9. Small Propeller and Rotor Testing Capabilities of the NASA Langley Low Speed Aeroacoustic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Zawodny, Nikolas S.; Haskin, Henry H.

    2017-01-01

    The Low Speed Aeroacoustic Wind Tunnel (LSAWT) at NASA Langley Research Center has recently undergone a configuration change. This change incorporates an inlet nozzle extension meant to serve the dual purposes of achieving lower free-stream velocities as well as a larger core flow region. The LSAWT, part of the NASA Langley Jet Noise Laboratory, had historically been utilized to simulate realistic forward flight conditions of commercial and military aircraft engines in an anechoic environment. The facility was modified starting in 2016 in order to expand its capabilities for the aerodynamic and acoustic testing of small propeller and unmanned aircraft system (UAS) rotor configurations. This paper describes the modifications made to the facility, its current aerodynamic and acoustic capabilities, the propeller and UAS rotor-vehicle configurations to be tested, and some preliminary predictions and experimental data for isolated propeller and UAS rotor configurations, respectively. Isolated propeller simulations have been performed spanning a range of advance ratios to identify the theoretical propeller operational limits of the LSAWT. Performance and acoustic measurements of an isolated UAS rotor in hover conditions are found to compare favorably with previously measured data in an anechoic chamber and blade element-based acoustic predictions.

  10. RELAP-7 Closure Correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou, Ling; Berry, R. A.; Martineau, R. C.

    The RELAP-7 code is the next generation nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). The code is based on the INL's modern scientific software development framework, MOOSE (Multi-Physics Object Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's and TRACE's capabilities and extends their analysis capabilities for all reactor system simulation scenarios. The RELAP-7 code utilizes the well-posed 7-equation two-phase flow model for compressible two-phase flow. Closure models used in the TRACE code have been reviewed and selected to reflect the progress made during the past decades and provide a basis for the closure correlations implemented in the RELAP-7 code. This document provides a summary of the closure correlations that are currently implemented in the RELAP-7 code. The closure correlations include sub-grid models that describe interactions between the fluids and the flow channel, and interactions between the two phases.

  11. Pumping Optimization Model for Pump and Treat Systems - 15091

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, S.; Ivarson, Kristine A.; Karanovic, M.

    2015-01-15

    Pump and Treat systems are being utilized to remediate contaminated groundwater in the Hanford 100 Areas adjacent to the Columbia River in Eastern Washington. Design of the systems was supported by a three-dimensional (3D) fate and transport model. This model provided sophisticated simulation capabilities but requires many hours to calculate results for each simulation considered. Many simulations are required to optimize system performance, so a two-dimensional (2D) model was created to reduce run time. The 2D model was developed as an equivalent-property version of the 3D model that derives boundary conditions and aquifer properties from the 3D model. It produces predictions that are very close to the 3D model predictions, allowing it to be used for comparative remedy analyses. Any potential system modifications identified by using the 2D version are verified for use by running the 3D model to confirm performance. The 2D model was incorporated into a comprehensive analysis system (the Pumping Optimization Model, POM) to simplify analysis of multiple simulations. It allows rapid turnaround by utilizing a graphical user interface that (1) allows operators to create hypothetical scenarios for system operation, (2) feeds the input to the 2D fate and transport model, and (3) displays the scenario results to evaluate performance improvement. All of the above is accomplished within the user interface. Complex analyses can be completed within a few hours and multiple simulations can be compared side by side. The POM utilizes standard office computing equipment and established groundwater modeling software.

  12. Numerical simulation of distributed snow processes in complex terrain utilizing triangulated irregular networks (TINs)

    NASA Astrophysics Data System (ADS)

    Rinehart, A. J.; Vivoni, E. R.

    2005-12-01

    Snow processes play a significant role in the hydrologic cycle of mountainous and high-latitude catchments in the western United States. Snowmelt runoff contributes to a large percentage of stream runoff while snow-covered regions remain highly localized to small portions of the catchment area. The appropriate representation of snow dynamics at a given range of spatial and temporal scales is critical for adequately predicting runoff responses in snowmelt-dominated watersheds. In particular, the accurate depiction of snow cover patterns is important, as a range of topographic, land-use, and geographic parameters creates zones of preferential snow accumulation or ablation that significantly affect the timing of a region's snowmelt and the persistence of a snowpack. In this study, we present the development and testing of a distributed snow model designed for simulations over complex terrain. The snow model is developed within the context of the TIN-based Real-time Integrated Basin Simulator (tRIBS), a fully-distributed watershed model capable of continuous simulations of coupled hydrological processes, including unsaturated-saturated zone dynamics, land-atmosphere interactions, and runoff generation via multiple mechanisms. The use of triangulated irregular networks as a domain discretization allows tRIBS to accurately represent topography with a reduced number of computational nodes, as compared to traditional grid-based models. This representation is developed using a Delaunay optimization criterion that causes areas of topographic homogeneity to be represented at larger spatial scales than the original grid, while more heterogeneous areas are represented at higher resolutions. We utilize the TIN-based terrain representation to simulate microscale (10-m to 100-m) snowpack dynamics over a catchment. The model includes processes such as the snowpack energy balance, wind and bulk redistribution, and snow interception by vegetation.
For this study, we present tests from a distributed one-layer energy balance model as applied to a northern New Mexico hillslope in a ponderosa pine forest using both synthetic and real meteorological forcing. We also provide tests of the model's capability to represent spatial patterns within a small watershed in the Jemez Mountain region. Finally, we discuss the interaction of the tested snow process module with existing components in the watershed model and additional applications and capabilities under development.
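
The tRIBS snow module solves a one-layer energy balance on each TIN node; as a much-simplified stand-in, a degree-day scheme captures the per-node accumulation/ablation bookkeeping. The forcing and degree-day factor below are invented for illustration.

```python
# Degree-day stand-in for per-node snowpack mass balance (the tRIBS module
# instead solves a one-layer energy balance; all forcing here is synthetic).
def simulate_swe(precip_mm, temp_c, ddf=3.0, t_snow=0.0, t_melt=0.0):
    """Daily snow water equivalent (mm) for one model node."""
    swe, series = 0.0, []
    for p, t in zip(precip_mm, temp_c):
        if t <= t_snow:
            swe += p                            # precipitation falls as snow
        melt = max(0.0, ddf * (t - t_melt))     # degree-day melt, mm/day
        swe = max(0.0, swe - melt)
        series.append(swe)
    return series

# Synthetic winter-to-spring forcing: 30 cold snowy days, then 30 warm dry days
precip = [5.0] * 30 + [0.0] * 30
temps = [-5.0] * 30 + [4.0] * 30
swe = simulate_swe(precip, temps)
print(max(swe), swe[-1])    # peak accumulation, then complete ablation
```

On a TIN, each node would carry its own forcing (adjusted for elevation, aspect, and vegetation), giving the spatial snow-cover patterns discussed above.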

  13. A Multi-Paradigm Modeling Framework to Simulate Dynamic Reciprocity in a Bioreactor

    PubMed Central

    Kaul, Himanshu; Cui, Zhanfeng; Ventikos, Yiannis

    2013-01-01

    Despite numerous technology advances, bioreactors are still mostly utilized as functional black-boxes where trial and error eventually leads to the desirable cellular outcome. Investigators have applied various computational approaches to understand the impact the internal dynamics of such devices has on overall cell growth, but such models cannot provide a comprehensive perspective regarding the system dynamics, due to limitations inherent to the underlying approaches. In this study, a novel multi-paradigm modeling platform capable of simulating the dynamic bidirectional relationship between cells and their microenvironment is presented. Designing the modeling platform entailed combining and coupling fully an agent-based modeling platform with a transport phenomena computational modeling framework. To demonstrate capability, the platform was used to study the impact of bioreactor parameters on the overall cell population behavior and vice versa. In order to achieve this, virtual bioreactors were constructed and seeded. The virtual cells, guided by a set of rules involving the simulated mass transport inside the bioreactor, as well as cell-related probabilistic parameters, were capable of displaying an array of behaviors such as proliferation, migration, chemotaxis and apoptosis. In this way the platform was shown to capture not only the impact of bioreactor transport processes on cellular behavior but also the influence that cellular activity wields on that very same local mass transport, thereby influencing overall cell growth. The platform was validated by simulating cellular chemotaxis in a virtual direct visualization chamber and comparing the simulation with its experimental analogue. The results presented in this paper are in agreement with published models of similar flavor. The modeling platform can be used as a concept selection tool to optimize bioreactor design specifications. PMID:23555740
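
The dynamic reciprocity idea, discrete agents both responding to and reshaping a continuum transport field, can be sketched by coupling a finite-difference diffusion step with rule-based cellular agents that chemotax and consume nutrient. All parameters are invented for illustration; the platform itself couples a full agent-based framework with a transport-phenomena solver.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 40
nutrient = np.ones((n, n))              # continuum field (transport paradigm)
agents = [tuple(rng.integers(1, n - 1, 2)) for _ in range(30)]  # discrete cells

def diffuse(c, d=0.2):
    """One explicit diffusion step (5-point stencil, interior nodes only)."""
    c2 = c.copy()
    c2[1:-1, 1:-1] += d * (c[2:, 1:-1] + c[:-2, 1:-1] + c[1:-1, 2:]
                           + c[1:-1, :-2] - 4 * c[1:-1, 1:-1])
    return c2

for _ in range(100):
    nutrient = diffuse(nutrient)        # transport updates the field...
    moved = []
    for (i, j) in agents:
        # ...while agents chemotax toward the richest neighboring node
        nbrs = [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 < i + di < n - 1 and 0 < j + dj < n - 1]
        i, j = max(nbrs, key=lambda p: nutrient[p])
        nutrient[i, j] = max(0.0, nutrient[i, j] - 0.05)  # consumption feeds back
        moved.append((i, j))
    agents = moved

print(nutrient.mean())   # field depleted by cellular activity
```

The bidirectional coupling (field shapes agent behavior, agent activity reshapes the field) is the minimal ingredient the platform formalizes at bioreactor scale.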

  14. Multispectral simulation environment for modeling low-light-level sensor systems

    NASA Astrophysics Data System (ADS)

    Ientilucci, Emmett J.; Brown, Scott D.; Schott, John R.; Raqueno, Rolando V.

    1998-11-01

    Image intensifying cameras have been found to be extremely useful in low-light-level (LLL) scenarios including military night vision and civilian rescue operations. These sensors utilize the available visible region photons and an amplification process to produce high contrast imagery. It has been demonstrated that processing techniques can further enhance the quality of this imagery. For example, fusion with matching thermal IR imagery can improve image content when very little visible region contrast is available. To aid in the improvement of current algorithms and the development of new ones, a high fidelity simulation environment capable of producing radiometrically correct multi-band imagery for low-light-level conditions is desired. This paper describes a modeling environment attempting to meet these criteria by addressing the task as two individual components: (1) prediction of a low-light-level radiance field from an arbitrary scene, and (2) simulation of the output from a low-light-level sensor for a given radiance field. The radiance prediction engine utilized in this environment is the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model which is a first-principles-based multi-spectral synthetic image generation model capable of producing an arbitrary number of bands in the 0.28 to 20 micrometer region. The DIRSIG model is utilized to produce high spatial and spectral resolution radiance field images. These images are then processed by a user configurable multi-stage low-light-level sensor model that applies the appropriate noise and modulation transfer function (MTF) at each stage in the image processing chain. This includes the ability to reproduce common intensifying sensor artifacts such as saturation and 'blooming.' Additionally, co-registered imagery in other spectral bands may be simultaneously generated for testing fusion and exploitation algorithms.
This paper discusses specific aspects of the DIRSIG radiance prediction for low-light-level conditions including the incorporation of natural and man-made sources which emphasizes the importance of accurate BRDF. A description of the implementation of each stage in the image processing and capture chain for the LLL model is also presented. Finally, simulated images are presented and qualitatively compared to lab acquired imagery from a commercial system.

  15. Overview of Experimental Capabilities - Supersonics

    NASA Technical Reports Server (NTRS)

    Banks, Daniel W.

    2007-01-01

    This viewgraph presentation gives an overview of experimental capabilities applicable to the area of supersonic research. The contents include: 1) EC Objectives; 2) SUP.11: Elements; 3) NRA; 4) Advanced Flight Simulator Flexible Aircraft Simulation Studies; 5) Advanced Flight Simulator Flying Qualities Guideline Development for Flexible Supersonic Transport Aircraft; 6) Advanced Flight Simulator Rigid/Flex Flight Control; 7) Advanced Flight Simulator Rapid Sim Model Exchange; 8) Flight Test Capabilities Advanced In-Flight Infrared (IR) Thermography; 9) Flight Test Capabilities In-Flight Schlieren; 10) Flight Test Capabilities CLIP Flow Calibration; 11) Flight Test Capabilities PFTF Flowfield Survey; 12) Ground Test Capabilities Laser-Induced Thermal Acoustics (LITA); 13) Ground Test Capabilities Doppler Global Velocimetry (DGV); 14) Ground Test Capabilities Doppler Global Velocimetry (DGV); and 15) Ground Test Capabilities EDL Optical Measurement Capability (PIV) for Rigid/Flexible Decelerator Models.

  16. Interplanetary Transit Simulations Using the International Space Station

    NASA Technical Reports Server (NTRS)

    Charles, John B.; Arya, M.; Kundrot, C. E.

    2010-01-01

    We evaluated the space life sciences utility of the International Space Station (ISS) to simulate the outbound transit portion of missions to Mars and Near Earth Asteroids (NEA) to investigate biomedical and psychological aspects of such transits, to develop and test space operation procedures compatible with communication delays and outages, and to demonstrate and validate technologies and countermeasures. Two major categories of space life sciences activities can capitalize on ISS capabilities. The first includes studies that require ISS (or a comparable facility), typically for access to prolonged weightlessness. The second includes studies that do not strictly require ISS but can exploit it to maximize their scientific return more efficiently and productively than in ground-based simulations. For these studies, ISS offers a high fidelity analog for fundamental factors on future missions, such as crew composition, mission control personnel, operational tasks and workload, real-world risk, and isolation, and can mimic the effects of distance and limited accessibility. In addition to conducting Mars- and NEA-transit simulations on 6-month ISS increments, extending the current ISS increment duration from 6 months to 9 or even 12 months will provide opportunities for enhanced and focused research relevant to long duration Mars and NEA missions. Increasing the crew duration may pose little additional risk to crewmembers beyond that currently accepted on 6-month increments, but additional medical monitoring capabilities will be required beyond those currently used for ISS operations. Finally, while presenting major logistical challenges, such a simulation followed by a post-landing simulation of Mars exploration could provide quantitative evidence of capabilities in an actual mission. Thus, the use of ISS to simulate aspects of Mars and NEA missions seems practical. 
If it were to be implemented without major disruption of on-going ISS activities, then planning should begin soon, in close consultation with all international partners.

  17. GrDHP: a general utility function representation for dual heuristic dynamic programming.

    PubMed

    Ni, Zhen; He, Haibo; Zhao, Dongbin; Xu, Xin; Prokhorov, Danil V

    2015-03-01

    A general utility function representation is proposed to provide the required derivable and adjustable utility function for the dual heuristic dynamic programming (DHP) design. Goal representation DHP (GrDHP) is presented with a goal network placed on top of the traditional DHP design. This goal network provides a general mapping between the system states and the derivatives of the utility function. With this proposed architecture, we can obtain the required derivatives of the utility function directly from the goal network. In addition, instead of a fixed, predefined utility function as in the literature, we conduct an online learning process for the goal network so that the derivatives of the utility function can be adaptively tuned over time. We provide the control performance of both the proposed GrDHP and the traditional DHP approaches under the same environment and parameter settings. The statistical simulation results and the snapshot of the system variables are presented to demonstrate the improved learning and control performance. We also apply both approaches to a power system example to further demonstrate the control capabilities of the GrDHP approach.
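
    The role of the goal network can be illustrated with the standard DHP costate target, in which the critic needs the derivative of the utility with respect to the state; the linear plant, quadratic utility, and stand-in "goal network" weights below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Sketch: where the utility derivative enters a DHP costate target.
# Traditional DHP uses a fixed utility, e.g. U(x) = x^T Q x, so
# dU/dx = 2 Q x; GrDHP replaces this with a learned mapping (the
# "goal network"), shown here as a simple linear map W.
A = np.array([[0.9, 0.1], [0.0, 0.8]])   # invented linear plant x' = A x
Q = np.eye(2)
gamma = 0.95

def dU_fixed(x):                 # fixed, predefined utility derivative
    return 2.0 * Q @ x

W = 2.0 * np.eye(2)              # stand-in goal network weights
def dU_goal(x):                  # goal-network output: adaptable dU/dx
    return W @ x

x = np.array([1.0, -0.5])
lam_next = np.zeros(2)           # critic costate estimate at next state
# DHP costate target: lambda(t) = dU/dx + gamma * (dx'/dx)^T lambda(t+1)
target_fixed = dU_fixed(x) + gamma * A.T @ lam_next
target_goal = dU_goal(x) + gamma * A.T @ lam_next
print(np.allclose(target_fixed, target_goal))
```

    With W initialized to 2Q the two targets coincide; in GrDHP, W would instead be tuned online rather than held fixed.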

  18. Demonstration of coal reburning for cyclone boiler NO{sub x} control. Appendix, Book 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Based on the industry need for a pilot-scale cyclone boiler simulator, Babcock & Wilcox (B&W) designed, fabricated, and installed such a facility at its Alliance Research Center (ARC) in 1985. The project involved conversion of an existing pulverized coal-fired facility to be cyclone-firing capable. Additionally, convective section tube banks were installed in the upper furnace in order to simulate a typical boiler convection pass. The small boiler simulator (SBS) is designed to simulate most fireside aspects of full-size utility boilers, such as combustion and flue gas emissions characteristics, fireside deposition, etc. Prior to the design of the pilot-scale cyclone boiler simulator, the various cyclone boiler types were reviewed in order to identify the inherent cyclone boiler design characteristics which are applicable to the majority of these boilers. The cyclone boiler characteristics that were reviewed include NO{sub x} emissions, furnace exit gas temperature (FEGT), carbon loss, and total furnace residence time. Previous pilot-scale cyclone-fired furnace experience identified the following concerns: (1) Operability of a small cyclone furnace (e.g., continuous slag tapping capability). (2) The optimum cyclone(s) configuration for the pilot-scale unit. (3) Compatibility of NO{sub x} levels, carbon burnout, cyclone ash carryover to the convection pass, cyclone temperature, furnace residence time, and FEGT.

  19. Investigation of a compact coaxially fed switched oscillator.

    PubMed

    Wang, Yuwei; Chen, Dongqun; Zhang, Jiande; Cao, Shengguang; Li, Da; Liu, Chebo

    2013-09-01

    To generate a relatively high-frequency mesoband microwave, a compact coaxially fed transmission line switched oscillator with high voltage capability is investigated. The characteristic impedance and voltage capability of the low impedance transmission line (LITL) have been analyzed. It is shown that the working voltage of the oscillator can reach up to 200 kV when it is filled with pressurized nitrogen and charged by a nanosecond driving source. By utilizing a commercial electromagnetic simulation code, the transient performance of the switched oscillator with a lumped resistance load is simulated. Simulations indicate that the center frequency of the output signal reaches ~0.6 GHz when the spark gap closes with a single channel. In addition, the influence of the closing mode and rapidity of the spark gap, the permittivity of the insulator at the output end of the LITL, and the load impedance on the transient performance of the designed oscillator has been analyzed quantitatively. Finally, the good transient performance of the switched oscillator has been preliminarily confirmed by experiment.
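
    The LITL characteristic impedance mentioned above follows the standard coaxial-line formula Z0 = (η0 / 2π√εr) · ln(b/a); the radii and fill in this sketch are illustrative examples, not the oscillator's actual dimensions.

```python
import math

# Characteristic impedance of a coaxial transmission line
# (standard formula; the dimensions below are invented examples).
def coax_z0(b, a, eps_r=1.0):
    """Z0 = eta0 / (2*pi*sqrt(eps_r)) * ln(b/a), eta0 ~ 376.73 ohm."""
    eta0 = 376.730
    return eta0 / (2 * math.pi * math.sqrt(eps_r)) * math.log(b / a)

# Nitrogen fill (eps_r ~ 1): a low-impedance line needs b/a close to 1.
print(round(coax_z0(b=12e-3, a=10e-3), 2))  # ~10.93 ohm
```

    The formula makes the design trade visible: shrinking the radial gap lowers the impedance but also reduces the hold-off voltage, which is why the high-voltage capability of the LITL needs separate analysis.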

  20. Constraining the Origin of Phobos with the Elpasolite Planetary Ice and Composition Spectrometer (EPICS) - Simulated Performance

    NASA Astrophysics Data System (ADS)

    Nowicki, S. F.; Mesick, K.; Coupland, D. D. S.; Dallmann, N. A.; Feldman, W. C.; Stonehill, L. C.; Hardgrove, C.; Dibb, S.; Gabriel, T. S. J.; West, S.

    2017-12-01

    Elpasolites are a promising new family of inorganic scintillators that can detect both gamma rays and neutrons within a single detector volume, reducing the instrument size, weight, and power (SWaP), all of which are critical for planetary science missions. The ability to distinguish between neutron and gamma events is achieved through pulse shape discrimination (PSD). The Elpasolite Planetary Ice and Composition Spectrometer (EPICS) utilizes elpasolites in a next-generation, highly capable, low-SWaP gamma-ray and neutron spectrometer. We present simulated EPICS sensitivities to neutrons and gamma rays, and demonstrate how EPICS can constrain the origin of Phobos among the following three main hypotheses: 1) accretion after a giant impact with Mars, 2) co-accretion with Mars, and 3) capture of an external body. The MCNP6 code was used to calculate the neutron and gamma-ray fluxes that escape the surface of Phobos, and GEANT4 was used to model the response of the EPICS instrument in orbit around Phobos.
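
    The pulse shape discrimination (PSD) mentioned above is commonly implemented as a charge-comparison ratio: neutron-induced scintillation carries more charge in the slow tail of the pulse. A minimal sketch with synthetic pulses (not EPICS data):

```python
# Charge-comparison PSD: tail-to-total charge ratio separates
# neutron events (larger slow component) from gamma events.
# The pulse samples below are synthetic, invented for illustration.
def psd_ratio(pulse, tail_start):
    total = sum(pulse)
    tail = sum(pulse[tail_start:])
    return tail / total

gamma_pulse = [100, 50, 20, 8, 3, 1, 0, 0]       # fast decay
neutron_pulse = [100, 60, 35, 22, 15, 10, 7, 5]  # slower tail
g = psd_ratio(gamma_pulse, 3)
n = psd_ratio(neutron_pulse, 3)
print(g < n)  # neutron pulses carry a larger tail fraction
```

    In practice a cut on this ratio (versus total energy) defines the neutron and gamma populations within the single elpasolite volume.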

  1. Multiscale molecular dynamics simulations of rotary motor proteins.

    PubMed

    Ekimoto, Toru; Ikeguchi, Mitsunori

    2018-04-01

    Protein functions require specific structures frequently coupled with conformational changes. The scale of the structural dynamics of proteins spans from the atomic to the molecular level. Theoretically, all-atom molecular dynamics (MD) simulation is a powerful tool to investigate protein dynamics because the MD simulation is capable of capturing conformational changes consistent with the proteins' intrinsic structural features. However, to study long-timescale dynamics, efficient sampling techniques and coarse-grained (CG) approaches coupled with all-atom MD simulations, termed multiscale MD simulations, are required to overcome the timescale limitation in all-atom MD simulations. Here, we review two examples of rotary motor proteins examined using free energy landscape (FEL) analysis and CG-MD simulations. In the FEL analysis, FEL is calculated as a function of reaction coordinates, and the long-timescale dynamics corresponding to conformational changes is described as transitions on the FEL surface. Another approach is the utilization of the CG model, in which the CG parameters are tuned using the fluctuation matching methodology with all-atom MD simulations. The long-timescale dynamics are then elucidated straightforwardly by using CG-MD simulations.
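
    The fluctuation-matching idea for tuning CG parameters can be sketched for a single harmonic term: equipartition relates the force constant to the variance observed in all-atom MD, k_CG = kB·T / ⟨δx²⟩. The "all-atom" trajectory below is synthetic Gaussian data with invented parameters, used only to show the matching step.

```python
import random
import statistics

# Fluctuation matching for one harmonic CG bond: choose k_CG so the
# CG model reproduces the positional variance seen in all-atom MD.
random.seed(1)
kB_T = 2.494            # kJ/mol at ~300 K
true_k = 500.0          # kJ/mol/nm^2 (invented reference value)
sigma = (kB_T / true_k) ** 0.5
traj = [random.gauss(0.0, sigma) for _ in range(200000)]  # mock AA data

var = statistics.pvariance(traj)
k_cg = kB_T / var        # equipartition: k = kB*T / <dx^2>
print(abs(k_cg - true_k) / true_k < 0.02)  # recovered within ~2%
```

    Real fluctuation matching iterates this over all CG interaction terms until the CG fluctuations agree with the all-atom reference.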

  2. Manned systems utilization analysis (study 2.1). Volume 5: Program listing for the LOVES computer code

    NASA Technical Reports Server (NTRS)

    Wray, S. T., Jr.

    1975-01-01

    The LOVES computer code developed to investigate the concept of space servicing operational satellites as an alternative to replacing expendable satellites or returning satellites to earth for ground refurbishment is presented. In addition to having the capability to simulate the expendable satellite operation and the ground refurbished satellite operation, the program is designed to simulate the logistics of space servicing satellites using an upper stage vehicle and/or the earth to orbit shuttle. The program not only provides for the initial deployment of the satellite but also simulates the random failure and subsequent replacement of various equipment modules comprising the satellite. The program has been used primarily to conduct trade studies and/or parametric studies of various space program operational philosophies.

  3. A computational approach for hypersonic nonequilibrium radiation utilizing space partition algorithm and Gauss quadrature

    NASA Astrophysics Data System (ADS)

    Shang, J. S.; Andrienko, D. A.; Huang, P. G.; Surzhikov, S. T.

    2014-06-01

    An efficient computational capability for nonequilibrium radiation simulation via the ray tracing technique has been accomplished. The radiative rate equation is iteratively coupled with the aerodynamic conservation laws, including nonequilibrium chemical and chemical-physical kinetic models. The spectral properties along tracing rays are determined by a space partition algorithm of the nearest-neighbor search process, and the numerical accuracy is further enhanced by a local resolution refinement using the Gauss-Lobatto polynomial. The interdisciplinary governing equations are solved by an implicit delta formulation through the diminishing residual approach. The axisymmetric radiating flow fields over the RAM-C II reentry probe have been simulated and verified against flight data and previous solutions by traditional methods. A computational efficiency gain of nearly forty times is realized over existing simulation procedures.
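
    The Gauss-Lobatto refinement mentioned above can be illustrated with the 4-point Gauss-Lobatto rule, whose nodes include the interval endpoints and which is exact for polynomials through degree 2n-3 = 5; this is only a quadrature sketch, not the paper's solver.

```python
import math

# 4-point Gauss-Lobatto rule on [-1, 1]: endpoint nodes make it
# convenient for patching adjacent elements in a local refinement.
nodes = [-1.0, -1.0 / math.sqrt(5.0), 1.0 / math.sqrt(5.0), 1.0]
weights = [1.0 / 6.0, 5.0 / 6.0, 5.0 / 6.0, 1.0 / 6.0]

def lobatto4(f):
    return sum(w * f(x) for w, x in zip(weights, nodes))

# Integral of x^4 over [-1, 1] is 2/5; the rule reproduces it exactly.
print(abs(lobatto4(lambda x: x**4) - 0.4) < 1e-12)
```

    Including the endpoints is the design choice that distinguishes Gauss-Lobatto from plain Gauss-Legendre quadrature, at the cost of two degrees of polynomial exactness.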

  4. Accurate simulations of helium pick-up experiments using a rejection-free Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Dutra, Matthew; Hinde, Robert

    2018-04-01

    In this paper, we present Monte Carlo simulations of helium droplet pick-up experiments with the intention of developing a robust and accurate theoretical approach for interpreting experimental helium droplet calorimetry data. Our approach is capable of capturing the evaporative behavior of helium droplets following dopant acquisition, allowing for a more realistic description of the pick-up process. Furthermore, we circumvent the traditional assumption of bulk helium behavior by utilizing density functional calculations of the size-dependent helium droplet chemical potential. The results of this new Monte Carlo technique are compared to commonly used Poisson pick-up statistics for simulations that reflect a broad range of experimental parameters. We conclude by offering an assessment of both of these theoretical approaches in the context of our observed results.
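
    The Poisson pick-up statistics used as the baseline comparison assign probability P(k) = e^(-m) m^k / k! to a droplet acquiring k dopants, where the mean m is set by the vapor density, capture cross-section, and path length. The mean value below is illustrative, not from the paper.

```python
import math

# Conventional Poisson pick-up statistics for helium droplets:
# P(k) = exp(-m) * m**k / k!, with m = n * sigma * L
# (n: vapor density, sigma: capture cross-section, L: path length).
def poisson_pickup(k, mean):
    return math.exp(-mean) * mean**k / math.factorial(k)

mean = 0.8  # illustrative mean pick-up number
print(round(poisson_pickup(0, mean), 4),
      round(poisson_pickup(1, mean), 4))  # 0.4493 0.3595
```

    The Poisson model assumes the droplet is unchanged by each capture; the paper's Monte Carlo approach relaxes exactly that assumption by letting evaporation shrink the droplet after each pick-up event.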

  5. Potential utilization of the NASA/George C. Marshall Space Flight Center in earthquake engineering research

    NASA Technical Reports Server (NTRS)

    Scholl, R. E. (Editor)

    1979-01-01

    Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near and long term test programs to supplement current earthquake research activities are suggested.

  6. Status Report on NEAMS System Analysis Module Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, R.; Fanning, T. H.; Sumner, T.

    2015-12-01

    Under the Reactor Product Line (RPL) of DOE-NE’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, an advanced SFR System Analysis Module (SAM) is being developed at Argonne National Laboratory. The goal of the SAM development is to provide fast-running, improved-fidelity, whole-plant transient analysis capabilities. SAM utilizes an object-oriented application framework (MOOSE), its underlying meshing and finite-element library (libMesh), and linear and nonlinear solvers (PETSc) to leverage modern advanced software environments and numerical methods. It also incorporates advances in physical and empirical models and seeks closure models based on information from high-fidelity simulations and experiments. This report provides an update on the SAM development and summarizes the activities performed in FY15 and the first quarter of FY16. The tasks include: (1) implement support for 2nd-order finite elements in SAM components for improved accuracy and computational efficiency; (2) improve the conjugate heat transfer modeling and develop pseudo-3-D full-core reactor heat transfer capabilities; (3) perform verification and validation tests as well as demonstration simulations; and (4) develop the coupling requirements for SAS4A/SASSYS-1 and SAM integration.

  7. Distributed Engine Control Empirical/Analytical Verification Tools

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of surviving the harsh engine operating environment while providing decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to easily assemble a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating distributed engine control system (DCS) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as swap in actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the C-MAPSS engine simulator and controlled through Simulink.
The distributed engine control simulation, evaluation, and analysis technology provides unique capabilities to study the effects of a given change to the control system in the context of the distributed paradigm. The simulation tool can support treatment of all components within the control system, both virtual and real; these include the communication data network, smart sensor and actuator nodes, the centralized control system (FADEC, full-authority digital engine control), and the aircraft engine itself. The DECsim tool allows simulation-based prototyping of control laws, control architectures, and decentralization strategies before hardware is integrated into the system. With the configuration specified, the simulator allows a variety of key factors to be systematically assessed. Such factors include control system performance, reliability, weight, and bandwidth utilization.

  8. A novel methodology for litho-to-etch pattern fidelity correction for SADP process

    NASA Astrophysics Data System (ADS)

    Chen, Shr-Jia; Chang, Yu-Cheng; Lin, Arthur; Chang, Yi-Shiang; Lin, Chia-Chi; Lai, Jun-Cheng

    2017-03-01

    For 2x nm node semiconductor devices and beyond, more aggressive resolution enhancement techniques (RETs) such as source-mask co-optimization (SMO), litho-etch-litho-etch (LELE) and self-aligned double patterning (SADP) are utilized for low-k1 lithography processes. In the SADP process, pattern fidelity is extremely critical, since a slight photoresist (PR) top-loss or profile roughness may impact the later core trim process due to its sensitivity to the environment. During the subsequent sidewall formation and core removal processes, the core trim profile weakness may worsen and induce serious defects that affect the final electrical performance. To predict PR top-loss, a rigorous lithography simulation can provide a reference for modifying mask layouts, but it requires a much longer run time and is not capable of full-field mask data preparation. In this paper, we first introduce an algorithm which utilizes multiple intensity levels from conventional aerial image simulation to assess the physical profile through the lithography and core trim etching steps. Subsequently, a novel correction method was utilized to improve the post-etch pattern fidelity without degrading the lithography process window. The results not only matched PR top-loss in rigorous lithography simulation, but also agreed with post-etch wafer data. Furthermore, this methodology can also be incorporated with OPC and post-OPC verification to improve core trim profile and final pattern fidelity at an early stage.
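
    The multi-intensity-level idea can be sketched by sampling an aerial-image cross-section at several thresholds, so that a sloped or top-lossy profile shows up as a narrower width at the higher threshold; the Gaussian intensity profile and threshold set below are invented for illustration, not the paper's model.

```python
import math

# Approximate a resist profile by slicing a simulated aerial-image
# cross-section at multiple intensity levels (synthetic profile).
xs = [i * 2.0 for i in range(-50, 51)]             # position grid, nm
intensity = [math.exp(-(x / 40.0) ** 2) for x in xs]

def width_at(level):
    inside = [x for x, v in zip(xs, intensity) if v >= level]
    return max(inside) - min(inside)

levels = [0.3, 0.5, 0.7]   # invented threshold set
profile = {lv: width_at(lv) for lv in levels}
print(profile[0.3] > profile[0.5] > profile[0.7])
```

    Stacking the widths from low to high threshold gives a cheap proxy for the resist sidewall and top-loss, without rerunning a rigorous lithography simulation per layout.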

  9. Validating Inertial Confinement Fusion (ICF) predictive capability using perturbed capsules

    NASA Astrophysics Data System (ADS)

    Schmitt, Mark; Magelssen, Glenn; Tregillis, Ian; Hsu, Scott; Bradley, Paul; Dodd, Evan; Cobble, James; Flippo, Kirk; Offerman, Dustin; Obrey, Kimberly; Wang, Yi-Ming; Watt, Robert; Wilke, Mark; Wysocki, Frederick; Batha, Steven

    2009-11-01

    Achieving ignition on NIF is a monumental step on the path toward utilizing fusion as a controlled energy source. Obtaining robust ignition requires accurate ICF models to predict the degradation of ignition caused by heterogeneities in capsule construction and irradiation. LANL has embarked on a project to induce controlled defects in capsules to validate our ability to predict their effects on fusion burn. These efforts include the validation of feature-driven hydrodynamics and mix in a convergent geometry. This capability is needed to determine the performance of capsules imploded under less-than-optimum conditions on future IFE facilities. LANL's recently initiated Defect Implosion Experiments (DIME) conducted at Rochester's Omega facility are providing input for these efforts. Recent simulation and experimental results will be shown.

  10. A large scale software system for simulation and design optimization of mechanical systems

    NASA Technical Reports Server (NTRS)

    Dopker, Bernhard; Haug, Edward J.

    1989-01-01

    The concept of an advanced integrated, networked simulation and design system is outlined. Such an advanced system can be developed utilizing existing codes without compromising the integrity and functionality of the system. An example has been used to demonstrate the applicability of the concept of the integrated system outlined here. The development of an integrated system can be done incrementally. Initial capabilities can be developed and implemented without having a detailed design of the global system. Only a conceptual global system must exist. For a fully integrated, user friendly design system, further research is needed in the areas of engineering data bases, distributed data bases, and advanced user interface design.

  11. A Low Loss Microstrip Antenna for Radiometric Applications

    NASA Technical Reports Server (NTRS)

    Wahid, Parveen

    2000-01-01

    The design and analysis of a series-fed, low-loss, inverted microstrip array antenna operating at 1.413 GHz is presented. The antenna is composed of two subarrays. Each subarray consists of an equal number of microstrip patches all connected together with microstrip lines. In the first design, a microstrip array for linear polarization is presented, which incorporates a series feeding technique. The next design, which is capable of dual linear polarization (V-polarization and H-polarization), utilizes a corporate feed network for the V-pol and a series feed arrangement for the H-pol. The first element of each subarray for H-pol is coaxially fed with a 180 deg phase difference. This approach ensures a symmetric radiation pattern on broadside in H-pol. For the V-pol, the two feeds are in phase on the two subarrays, ensuring a broadside beam in V-pol. The designs presented here are simulated using the IE3D code, which utilizes the method of moments. Measured results are compared with simulated results and show good agreement.

  12. Development and Validation of an Automated Simulation Capability in Support of Integrated Demand Management

    NASA Technical Reports Server (NTRS)

    Arneson, Heather; Evans, Antony D.; Li, Jinhua; Wei, Mei Yueh

    2017-01-01

    Integrated Demand Management (IDM) is a near- to mid-term NASA concept that proposes to address mismatches in air traffic system demand and capacity by using strategic flow management capabilities to pre-condition demand into the more tactical Time-Based Flow Management System (TBFM). This paper describes an automated simulation capability to support IDM concept development. The capability closely mimics existing human-in-the-loop (HITL) capabilities, automating both the human components and collaboration between operational systems, and speeding up the real-time aircraft simulations. Such a capability allows for parametric studies that will inform the HITL simulations, identifying breaking points and parameter values at which significant changes in system behavior occur. This paper also describes the initial validation of individual components of the automated simulation capability, and an example application comparing the performance of the IDM concept under two TBFM scheduling paradigms. The results and conclusions from this simulation compare closely to those from previous HITL simulations using similar scenarios, providing an initial validation of the automated simulation capability.

  13. Multicolor Three-Dimensional Tracking for Single-Molecule Fluorescence Resonance Energy Transfer Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keller, Aaron M.; DeVore, Matthew S.; Stich, Dominik G.

    Single-molecule fluorescence resonance energy transfer (smFRET) remains a widely utilized and powerful tool for quantifying heterogeneous interactions and conformational dynamics of biomolecules. However, traditional smFRET experiments either are limited to short observation times (typically less than 1 ms) in the case of “burst” confocal measurements or require surface immobilization which usually has a temporal resolution limited by the camera framing rate. We developed a smFRET 3D tracking microscope that is capable of observing single particles for extended periods of time with high temporal resolution. The confocal tracking microscope utilizes closed-loop feedback to follow the particle in solution by recentering it within two overlapping tetrahedral detection elements, corresponding to donor and acceptor channels. We demonstrated the microscope’s multicolor tracking capability via random walk simulations and experimental tracking of 200 nm fluorescent beads in water with a range of apparent smFRET efficiency values, 0.45-0.69. We also demonstrated the microscope’s capability to track and quantify double-stranded DNA undergoing intramolecular smFRET in a viscous glycerol solution. In future experiments, the smFRET 3D tracking system will be used to study protein conformational dynamics while diffusing in solution and native biological environments with high temporal resolution.
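
    The apparent smFRET efficiency quoted above (0.45-0.69) is the standard proximity ratio computed from donor and acceptor photon counts; the counts in this sketch are made up for illustration.

```python
# Apparent smFRET efficiency (proximity ratio) from channel counts.
def fret_efficiency(i_donor, i_acceptor):
    return i_acceptor / (i_donor + i_acceptor)

# Equal counts in the two channels give E = 0.5; an acceptor-heavy
# split pushes E toward the top of the reported 0.45-0.69 range.
print(fret_efficiency(1000, 1000))  # 0.5
```

    Uncorrected for crosstalk, detection efficiency, and direct excitation, this ratio is "apparent" rather than absolute, which matches the paper's wording.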

  15. Automated Rendezvous and Capture System Development and Simulation for NASA

    NASA Technical Reports Server (NTRS)

    Roe, Fred D.; Howard, Richard T.; Murphy, Leslie

    2004-01-01

    The United States does not have an Automated Rendezvous and Capture/Docking (AR&C) capability and is reliant on manned control for rendezvous and docking of orbiting spacecraft. This reliance on the labor-intensive manned interface for control of rendezvous and docking vehicles has a significant impact on the cost of operating the International Space Station (ISS) and precludes the use of any U.S. expendable launch capabilities for Space Station resupply. The Soviets have the capability to autonomously dock in space, but their system produces a hard docking with excessive force and contact velocity. AR&C has been identified as a key enabling technology for the Space Launch Initiative (SLI) Program, DARPA Orbital Express, and other DOD programs. The development and implementation of an AR&C capability can significantly enhance system flexibility, improve safety, and lower the cost of maintaining, supplying, and operating the International Space Station. The Marshall Space Flight Center (MSFC) has conducted pioneering research in the development of an AR&C system for U.S. space vehicles. This AR&C system was tested extensively using hardware-in-the-loop simulations in the Flight Robotics Laboratory, and a rendezvous sensor, the Video Guidance Sensor, was developed and successfully flown on Space Shuttle flights STS-87 and STS-95, proving the concept of a video-based sensor. Further developments in sensor technology and vehicle and target configuration have led to continued improvements and changes in AR&C system development and simulation. A new Advanced Video Guidance Sensor (AVGS) with target will be utilized on the Demonstration of Autonomous Rendezvous Technologies (DART) flight experiment in 2004.

  16. Development of GENOA Progressive Failure Parallel Processing Software Systems

    NASA Technical Reports Server (NTRS)

    Abdi, Frank; Minnetyan, Levon

    1999-01-01

    A capability consisting of software development and experimental techniques has been developed and is described. The capability is integrated into GENOA-PFA to model polymer matrix composite (PMC) structures. The capability considers the physics and mechanics of composite materials and structures by integrating hierarchical multilevel macro-scale (lamina, laminate, and structure) and micro-scale (fiber, matrix, and interface) simulation analyses. The modeling involves (1) ply layering methodology utilizing FEM elements with through-the-thickness representation; (2) simulation of the effects of material defects and conditions (e.g., voids, fiber waviness, and residual stress) on global static and cyclic fatigue strengths; (3) inclusion of material nonlinearities (by updating properties periodically) and geometrical nonlinearities (by Lagrangian updating); (4) simulation of crack initiation and growth to failure under static, cyclic, creep, and impact loads; (5) progressive fracture analysis to determine durability and damage tolerance; (6) identification of the percent contribution of various possible composite failure modes involved in critical damage events; and (7) determination of the sensitivities of failure modes to design parameters (e.g., fiber volume fraction, ply thickness, fiber orientation, and adhesive-bond thickness). GENOA-PFA progressive failure analysis is now ready for use to investigate the effects on structural responses of PMC material degradation from damage induced by static, cyclic (fatigue), creep, and impact loading in 2D/3D PMC structures subjected to hygrothermal environments. Its use will significantly facilitate targeting design parameter changes that will be most effective in reducing the probability of a given failure mode occurring.
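The progressive-failure loop implied by items (3)-(5) above can be sketched generically: share load by stiffness, fail plies whose stress exceeds strength, degrade their modulus, and redistribute. This toy sketch (all numbers and the max-stress criterion are illustrative assumptions, not GENOA-PFA's actual micro/macro formulation):

```python
def progressive_failure(ply_strengths, ply_moduli, load_steps, degrade=1e-3):
    """Toy progressive failure analysis for a laminate of parallel plies.

    At each load step the total load is shared in proportion to ply
    stiffness; any ply whose stress exceeds its strength is 'failed' by
    degrading its modulus (periodic property update, as in item (3)),
    and the load is redistributed. Returns the ply failure order.
    """
    moduli = list(ply_moduli)
    failed = []
    for load in load_steps:
        changed = True
        while changed:                        # redistribute until stable
            changed = False
            total_e = sum(moduli)
            for i, (s, e) in enumerate(zip(ply_strengths, moduli)):
                if i in failed:
                    continue
                stress = load * e / total_e   # stiffness-proportional share
                if stress > s:                # max-stress failure criterion
                    failed.append(i)
                    moduli[i] = e * degrade   # stiffness degradation
                    changed = True
    return failed

# Three plies: the weakest fails first, then load transfer fails the rest
order = progressive_failure([10.0, 30.0, 60.0], [1.0, 1.0, 1.0],
                            load_steps=[20.0, 40.0, 80.0])
```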

  17. Stochastic simulation of power systems with integrated renewable and utility-scale storage resources

    NASA Astrophysics Data System (ADS)

    Degeilh, Yannick

    The push for a more sustainable electric supply has led various countries to adopt policies advocating the integration of renewable yet variable energy resources, such as wind and solar, into the grid. The challenges of integrating such time-varying, intermittent resources have in turn sparked a growing interest in the implementation of utility-scale energy storage resources (ESRs) with MW-week storage capability. Indeed, storage devices provide flexibility to facilitate the management of power system operations in the presence of uncertain, highly time-varying and intermittent renewable resources. The ability to exploit the potential synergies between renewable resources and ESRs hinges on developing appropriate models, methodologies, tools and policy initiatives. We report on the development of a comprehensive simulation methodology that provides the capability to quantify the impacts of integrated renewable and ESRs on the economics, reliability and emission effects of power systems operating in a market environment. We model the uncertainty in the demands, the available capacity of conventional generation resources and the time-varying, intermittent renewable resources, with their temporal and spatial correlations, as discrete-time random processes. We deploy models of the ESRs to emulate their scheduling and operations in the transmission-constrained hourly day-ahead markets. To this end, we formulate a scheduling optimization problem (SOP) whose solutions determine the operational schedule of the controllable ESRs in coordination with the demands and the conventional/renewable resources. As such, the SOP serves the dual purpose of emulating the clearing of the transmission-constrained day-ahead markets (DAMs) and scheduling the energy storage resource operations. 
We also represent the need for system operators to impose stricter ramping requirements on the conventional generating units so as to maintain the system capability to perform "load following", i.e., respond to quick variations in the loads and renewable resource outputs in a manner that maintains the power balance, by incorporating appropriate ramping requirement constraints in the formulation of the SOP. The simulation approach makes use of Monte Carlo simulation techniques to represent the impacts of the sources of uncertainty on the side-by-side power system and market operations. As such, we systematically sample the "input" random processes -- namely the buyer demands, renewable resource outputs and conventional generation resource available capacities -- to generate the realizations, or sample paths, that we use in the emulation of the transmission-constrained day-ahead markets via the SOP. As a result, we obtain realizations of the market outcomes and storage resource operations that we can use to approximate their statistics. The approach not only has the capability to emulate the side-by-side power system and energy market operations with the explicit representation of the chronology of time-dependent phenomena -- including storage cycles of charge/discharge -- and constraints imposed by the transmission network in terms of deliverability of the energy, but also to provide the figures of merit for all metrics to assess the economics, reliability and the environmental impacts of the performance of those operations. Our efforts to address the implementational aspects of the methodology so as to ensure computational tractability for large-scale systems over longer periods include relaxing the SOP, the use of a "warm-start" technique as well as representative simulation periods, parallelization and variance reduction techniques. Our simulation approach is useful in power system planning, operations and investment analysis. 
There is a broad range of applications of the simulation methodology to resource planning studies, production costing issues, investment analysis, transmission utilization, reliability analysis, environmental assessments, policy formulation and to answer quantitatively various what-if questions. We demonstrate the capabilities of the simulation approach by carrying out various studies on modified IEEE 118- and WECC 240-bus systems. The results of our representative case studies effectively illustrate the synergies among wind and ESRs. Our investigations clearly indicate that energy storage and wind resources tend to complement each other in the reduction of wholesale purchase payments in the DAMs and the improvement of system reliability. In addition, we observe that CO2 emission impacts with energy storage depend on the resource mix characteristics. An important finding is that storage seems to attenuate the "diminishing returns" associated with increased penetration of wind generation. Our studies also evidence the limited ability of integrated ESRs to enhance the wind resource capability to replace conventional resources purely from a system reliability perspective. Some useful insights into the siting of ESRs are obtained and they indicate the potentially significant impacts of such decisions on the network congestion patterns and, consequently, on the LMPs. Simulation results further indicate that the explicit representation of ramping requirements on the conventional units at the DAM level causes the expected total wholesale purchase payments to increase, thereby mitigating the benefits of wind integration. The stricter ramping requirements are also shown to impact the revenues of generators that do not even provide any ramp capability services.
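The SOP described above is a transmission-constrained optimization; stripped of the network, uncertainty, and market-clearing detail, the core storage-scheduling idea reduces to buying energy in cheap hours and selling it in expensive ones within energy and power limits. A drastically simplified greedy sketch (all prices, limits, and the function itself are hypothetical illustrations, not the dissertation's formulation):

```python
def schedule_storage(prices, capacity, power_limit, efficiency=0.9):
    """Greedy day-ahead arbitrage: charge in the cheapest hours,
    discharge in the priciest, within energy and power limits.
    Returns per-hour charge(+)/discharge(-) energy. Ignores the
    network constraints and uncertainty handled by the full SOP,
    and assumes charge and discharge hour sets do not overlap."""
    n_slots = int(capacity / power_limit)           # hours to fill/empty
    order = sorted(range(len(prices)), key=lambda h: prices[h])
    charge_hours = set(order[:n_slots])             # cheapest hours
    discharge_hours = set(order[-n_slots:])         # priciest hours
    plan = []
    for h in range(len(prices)):
        if h in charge_hours:
            plan.append(power_limit)                # buy / charge
        elif h in discharge_hours:
            plan.append(-power_limit * efficiency)  # sell / discharge
        else:
            plan.append(0.0)
    return plan

prices = [20, 15, 10, 18, 35, 40, 30, 25]   # $/MWh, hypothetical DAM prices
plan = schedule_storage(prices, capacity=2.0, power_limit=1.0)
profit = -sum(p * q for p, q in zip(prices, plan))
```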

  18. Assessment of the National Combustion Code

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey; Iannetti, Anthony; Shih, Tsan-Hsing

    2007-01-01

    The advancements made during the last decade in the areas of combustion modeling, numerical simulation, and computing platforms have greatly facilitated the use of CFD-based tools in the development of combustion technology. Further development of verification, validation and uncertainty quantification will have a profound impact on the reliability and utility of these CFD-based tools. The objectives of the present effort are to establish a baseline for the National Combustion Code (NCC) and experimental data, as well as to document current capabilities and identify gaps for further improvements.

  19. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    NASA Technical Reports Server (NTRS)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  20. Sensing and Active Flow Control for Advanced BWB Propulsion-Airframe Integration Concepts

    NASA Technical Reports Server (NTRS)

    Fleming, John; Anderson, Jason; Ng, Wing; Harrison, Neal

    2005-01-01

    In order to realize the substantial performance benefits of serpentine boundary layer ingesting diffusers, this study investigated the use of enabling flow control methods to reduce engine-face flow distortion. Computational methods and novel flow control modeling techniques were utilized that allowed for rapid, accurate analysis of flow control geometries. Results were validated experimentally using the Techsburg Ejector-based wind tunnel facility; this facility is capable of simulating the high-altitude, high subsonic Mach number conditions representative of BWB cruise conditions.

  1. Advances in parameter estimation techniques applied to flexible structures

    NASA Technical Reports Server (NTRS)

    Maben, Egbert; Zimmerman, David C.

    1994-01-01

    In this work, various parameter estimation techniques are investigated in the context of structural system identification utilizing distributed parameter models and 'measured' time-domain data. Distributed parameter models are formulated using the PDEMOD software developed by Taylor. Enhancements made to PDEMOD for this work include the following: (1) a Wittrick-Williams based root solving algorithm; (2) a time simulation capability; and (3) various parameter estimation algorithms. The parameter estimation schemes will be contrasted using the NASA Mini-Mast as the focus structure.
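Time-domain parameter estimation of the kind described here amounts to output-error minimization: simulate the model for candidate parameter values and pick the values whose response best matches the measured data. A minimal sketch using an undamped spring-mass model (the model, grid search, and all numbers are illustrative assumptions, far simpler than PDEMOD's distributed parameter formulation):

```python
import math

def simulate_response(k, m=1.0, x0=1.0, dt=0.01, steps=200):
    """Undamped spring-mass free response x(t) = x0*cos(sqrt(k/m)*t),
    sampled to stand in for 'measured' time-domain data."""
    w = math.sqrt(k / m)
    return [x0 * math.cos(w * i * dt) for i in range(steps)]

def estimate_stiffness(measured, candidates):
    """Output-error estimation: choose the candidate stiffness whose
    simulated response matches the data best in a least-squares sense
    (a toy stand-in for the estimation schemes compared in the paper)."""
    def sse(k):
        sim = simulate_response(k)
        return sum((a - b) ** 2 for a, b in zip(sim, measured))
    return min(candidates, key=sse)

data = simulate_response(k=4.0)                 # 'measurement' from true k
k_hat = estimate_stiffness(data, [1.0, 2.0, 3.0, 4.0, 5.0])
```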

  2. Proceedings of the Technical Conference on the Effects of Helicopter Downwash on Free Projectiles, held at Bridgeton, Missouri, on 12-14 August 1975

    DTIC Science & Technology

    1975-11-01

    limitations. It must conform to severe weight restrictions in order to attain hover and maneuver capability. It is a sensitive, vibrating platform...simulations had to be performed utilizing assumed data generated with standard momentum theory based on the size of the rotor and gross helicopter weight...downwash intersects the rocket's flight path; (c) the weight of the aircraft influences the vertical downwash component almost linearly; and (d) the

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnold H. Kritz

    PTRANSP, which is the predictive version of the TRANSP code, was developed in a collaborative effort involving the Princeton Plasma Physics Laboratory, General Atomics Corporation, Lawrence Livermore National Laboratory, and Lehigh University. The PTRANSP/TRANSP suite of codes is the premier integrated tokamak modeling software in the United States. A production service for PTRANSP/TRANSP simulations is maintained at the Princeton Plasma Physics Laboratory; the server has a simple command line client interface and is subscribed to by about 100 researchers from tokamak projects in the US, Europe, and Asia. This service produced nearly 13000 PTRANSP/TRANSP simulations in the four-year period FY 2005 through FY 2008. Major archives of TRANSP results are maintained at PPPL, MIT, General Atomics, and JET. Recent utilization, counting experimental analysis simulations as well as predictive simulations, more than doubled from slightly over 2000 simulations per year in FY 2005 and FY 2006 to over 4300 simulations per year in FY 2007 and FY 2008. PTRANSP predictive simulations applied to ITER increased eight fold from 30 simulations per year in FY 2005 and FY 2006 to 240 simulations per year in FY 2007 and FY 2008, accounting for more than half of combined PTRANSP/TRANSP service CPU resource utilization in FY 2008. PTRANSP studies focused on ITER played a key role in journal articles. Examples of validation studies carried out for momentum transport in PTRANSP simulations were presented at the 2008 IAEA conference. The increase in number of PTRANSP simulations has continued (more than 7000 TRANSP/PTRANSP simulations in 2010) and results of PTRANSP simulations appear in conference proceedings, for example the 2010 IAEA conference, and in peer reviewed papers. PTRANSP provides a bridge to the Fusion Simulation Program (FSP) and to the future of integrated modeling. 
Through years of widespread usage, each of the many parts of the PTRANSP suite of codes has been thoroughly validated against experimental data and benchmarked against other codes. At the same time, architectural modernizations are improving the modularity of the PTRANSP code base. The NUBEAM neutral beam and fusion products fast ion model, the Plasma State data repository (developed originally in the SWIM SciDAC project and adapted for use in PTRANSP), and other components are already shared with the SWIM, FACETS, and CPES SciDAC FSP prototype projects. Thus, the PTRANSP code is already serving as a bridge between our present integrated modeling capability and future capability. As the Fusion Simulation Program builds toward the facility currently available in the PTRANSP suite of codes, early versions of the FSP core plasma model will need to be benchmarked against the PTRANSP simulations. This will be necessary to build user confidence in FSP, but this benchmarking can only be done if PTRANSP itself is maintained and developed.

  4. Effects of molybdenum on water utilization, antioxidative defense system and osmotic-adjustment ability in winter wheat (Triticum aestivum) under drought stress.

    PubMed

    Wu, Songwei; Hu, Chengxiao; Tan, Qiling; Nie, Zhaojun; Sun, Xuecheng

    2014-10-01

    Molybdenum (Mo), an essential trace element in plants, plays an important role in the abiotic stress tolerance of plants. To obtain a better understanding of the drought tolerance enhanced by Mo, a hydroponic trial was conducted to investigate the effects of Mo on water utilization, antioxidant enzymes, non-enzymatic antioxidants, and osmotic-adjustment products in the Mo-efficient cultivar '97003' and the Mo-inefficient cultivar '97014' under PEG-simulated drought stress. Our results indicate that Mo application significantly enhanced Pn, chlorophyll, dry matter, grain yield, biomass, RWC and WUE and decreased Tr, Gs and water loss of wheat under drought stress, suggesting that Mo application improved the water utilization capacity in wheat. The activities of antioxidant enzymes such as superoxide dismutase, peroxidase, catalase and ascorbate peroxidase and the contents of non-enzymatic antioxidants such as ascorbic acid, reduced glutathione and carotenoid were significantly increased, and malonaldehyde contents were decreased, by Mo application under PEG-simulated drought stress, suggesting that Mo application enhanced the ability to scavenge active oxygen species. Osmotic-adjustment products such as soluble protein, proline and soluble sugar were also increased by Mo application under PEG-simulated drought stress, indicating that Mo improved the osmotic-adjustment ability in wheat. It is hypothesized that Mo application might improve the drought tolerance of wheat by enhancing water utilization capability and the abilities of antioxidative defense and osmotic adjustment. Similarities and differences between the Mo-efficient and Mo-inefficient wheat cultivars in response to Mo under drought stress are discussed. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  5. Using whole disease modeling to inform resource allocation decisions: economic evaluation of a clinical guideline for colorectal cancer using a single model.

    PubMed

    Tappenden, Paul; Chilcott, Jim; Brennan, Alan; Squires, Hazel; Glynne-Jones, Rob; Tappenden, Janine

    2013-06-01

    To assess the feasibility and value of simulating whole disease and treatment pathways within a single model to provide a common economic basis for informing resource allocation decisions. A patient-level simulation model was developed with the intention of being capable of evaluating multiple topics within National Institute for Health and Clinical Excellence's colorectal cancer clinical guideline. The model simulates disease and treatment pathways from preclinical disease through to detection, diagnosis, adjuvant/neoadjuvant treatments, follow-up, curative/palliative treatments for metastases, supportive care, and eventual death. The model parameters were informed by meta-analyses, randomized trials, observational studies, health utility studies, audit data, costing sources, and expert opinion. Unobservable natural history parameters were calibrated against external data using Bayesian Markov chain Monte Carlo methods. Economic analysis was undertaken using conventional cost-utility decision rules within each guideline topic and constrained maximization rules across multiple topics. Under usual processes for guideline development, piecewise economic modeling would have been used to evaluate between one and three topics. The Whole Disease Model was capable of evaluating 11 of 15 guideline topics, ranging from alternative diagnostic technologies through to treatments for metastatic disease. The constrained maximization analysis identified a configuration of colorectal services that is expected to maximize quality-adjusted life-year gains without exceeding current expenditure levels. This study indicates that Whole Disease Model development is feasible and can allow for the economic analysis of most interventions across a disease service within a consistent conceptual and mathematical infrastructure. This disease-level modeling approach may be of particular value in providing an economic basis to support other clinical guidelines. 
Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
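The "constrained maximization rules across multiple topics" mentioned above reduce, in the simplest case, to choosing the subset of interventions that maximizes total QALY gain without exceeding current expenditure. A brute-force sketch (topic names, costs, and QALY gains are entirely hypothetical, not the guideline's actual figures):

```python
from itertools import combinations

def maximize_qalys(interventions, budget):
    """Constrained maximization across guideline topics: choose the
    subset of (name, cost, qaly_gain) interventions that maximizes
    total QALY gain without exceeding the budget. Brute force is fine
    for the handful of topics in a clinical guideline."""
    best, best_qalys = (), 0.0
    for r in range(len(interventions) + 1):
        for subset in combinations(interventions, r):
            cost = sum(c for _, c, _ in subset)
            qalys = sum(q for _, _, q in subset)
            if cost <= budget and qalys > best_qalys:
                best, best_qalys = subset, qalys
    return [name for name, _, _ in best], best_qalys

# Hypothetical topics: (name, cost in currency units, QALY gain)
topics = [("screening", 50, 8.0), ("follow-up", 30, 3.0),
          ("new-chemo", 80, 6.0), ("stoma-care", 20, 2.5)]
chosen, gain = maximize_qalys(topics, budget=100)
```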

  6. Leveraging Mechanism Simplicity and Strategic Averaging to Identify Signals from Highly Heterogeneous Spatial and Temporal Ozone Data

    NASA Astrophysics Data System (ADS)

    Brown-Steiner, B.; Selin, N. E.; Prinn, R. G.; Monier, E.; Garcia-Menendez, F.; Tilmes, S.; Emmons, L. K.; Lamarque, J. F.; Cameron-Smith, P. J.

    2017-12-01

    We summarize two methods to aid in the identification of ozone signals from underlying spatially and temporally heterogeneous data in order to help research communities avoid the sometimes burdensome computational costs of high-resolution high-complexity models. The first method utilizes simplified chemical mechanisms (a Reduced Hydrocarbon Mechanism and a Superfast Mechanism) alongside a more complex mechanism (MOZART-4) within CESM CAM-Chem to extend the number of simulated meteorological years (or add additional members to an ensemble) for a given modeling problem. The Reduced Hydrocarbon mechanism is twice as fast, and the Superfast mechanism is three times faster than the MOZART-4 mechanism. We show that simplified chemical mechanisms are largely capable of simulating surface ozone across the globe as well as the more complex chemical mechanisms, and where they are not capable, a simple standardized anomaly emulation approach can correct for their inadequacies. The second method uses strategic averaging over both temporal and spatial scales to filter out the highly heterogeneous noise that underlies ozone observations and simulations. This method allows for a selection of temporal and spatial averaging scales that match a particular signal strength (between 0.5 and 5 ppbv), and enables the identification of regions where an ozone signal can rise above the ozone noise over a given region and a given period of time. In conjunction, these two methods can be used to "scale down" chemical mechanism complexity and quantitatively determine spatial and temporal scales that could enable research communities to utilize simplified representations of atmospheric chemistry and thereby maximize their productivity and efficiency given computational constraints. While this framework is here applied to ozone data, it could also be applied to a broad range of geospatial data sets (observed or modeled) that have spatial and temporal coverage.
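The second method above relies on the fact that averaging n independent noisy samples shrinks the noise by roughly sqrt(n), letting a small ozone signal rise above much larger noise. A minimal temporal sketch (the 2 ppbv signal, 5 ppbv noise, and block-averaging helper are illustrative assumptions):

```python
import random
import statistics

def averaged_series(series, window):
    """Non-overlapping block averages over `window` samples: the
    temporal analogue of the strategic averaging described above."""
    return [sum(series[i:i + window]) / window
            for i in range(0, len(series) - window + 1, window)]

random.seed(0)
signal = 2.0        # ppbv 'ozone signal' (hypothetical)
noise_sd = 5.0      # heterogeneous 'ozone noise' (hypothetical)
raw = [signal + random.gauss(0, noise_sd) for _ in range(10_000)]

raw_sd = statistics.stdev(raw)
avg_sd = statistics.stdev(averaged_series(raw, window=100))
# Averaging 100 samples shrinks the noise by about sqrt(100) = 10x,
# so the 2 ppbv signal emerges cleanly from the 5 ppbv noise.
```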

  7. An IBM PC-based math model for space station solar array simulation

    NASA Technical Reports Server (NTRS)

    Emanuel, E. M.

    1986-01-01

    This report discusses and documents the design, development, and verification of a microcomputer-based solar cell math model for simulating the Space Station's solar array Initial Operational Capability (IOC) reference configuration. The array model is developed utilizing a linear solar cell dc math model requiring only five input parameters: short circuit current, open circuit voltage, maximum power voltage, maximum power current, and orbit inclination. The accuracy of this model is investigated using actual solar array on orbit electrical data derived from the Solar Array Flight Experiment/Dynamic Augmentation Experiment (SAFE/DAE), conducted during the STS-41D mission. This simulator provides real-time simulated performance data during the steady state portion of the Space Station orbit (i.e., array fully exposed to sunlight). Eclipse to sunlight transients and shadowing effects are not included in the analysis, but are discussed briefly. Integrating the Solar Array Simulator (SAS) into the Power Management and Distribution (PMAD) subsystem is also discussed.
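A linear dc solar-cell model of the kind described pins the I-V curve to the three points the input parameters define: (0, Isc), (Vmp, Imp), and (Voc, 0). A piecewise-linear sketch (the parameter values are hypothetical, and orbit inclination, which scales illumination, is omitted):

```python
def array_current(v, isc, voc, imp, vmp):
    """Piecewise-linear I-V curve through (0, Isc), (Vmp, Imp), and
    (Voc, 0): a sketch of a five-parameter linear dc cell model."""
    if v < 0 or v > voc:
        return 0.0
    if v <= vmp:                               # near-constant-current region
        return isc + (imp - isc) * v / vmp
    return imp * (voc - v) / (voc - vmp)       # drop-off to open circuit

# Hypothetical cell-string parameters
ISC, VOC, IMP, VMP = 2.6, 0.6, 2.4, 0.5
p_mp = array_current(VMP, ISC, VOC, IMP, VMP) * VMP   # max power point
```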

  8. Towards the simulation of molecular collisions with a superconducting quantum computer

    NASA Astrophysics Data System (ADS)

    Geller, Michael

    2013-05-01

    I will discuss the prospects for the use of large-scale, error-corrected quantum computers to simulate complex quantum dynamics such as molecular collisions. This will likely require millions of qubits. I will also discuss an alternative approach [M. R. Geller et al., arXiv:1210.5260] that is ideally suited for today's superconducting circuits, which uses the single-excitation subspace (SES) of a system of n tunably coupled qubits. The SES method allows many operations in the unitary group SU(n) to be implemented in a single step, bypassing the need for elementary gates, thereby making large computations possible without error correction. The method enables universal quantum simulation, including simulation of the time-dependent Schrodinger equation, and we argue that a 1000-qubit SES processor should be capable of achieving quantum speedup relative to a petaflop supercomputer. We speculate on the utility and practicality of such a simulator for atomic and molecular collision physics. Work supported by the US National Science Foundation CDI program.
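The reason the SES approach scales is that restricting n coupled qubits to a single excitation makes the Hamiltonian only n x n. For the smallest case, n = 2 resonantly coupled qubits, the dynamics are exact and closed-form: the excitation Rabi-oscillates between the qubits. A sketch of that textbook solution (hbar = 1; the function is an illustration, not the processor design in the talk):

```python
import math

def ses_excitation_probs(g, t):
    """Single-excitation subspace of two resonantly coupled qubits:
    the 2x2 SES Hamiltonian H = g*sigma_x gives the exact populations
    P1(t) = cos^2(g*t), P2(t) = sin^2(g*t), i.e. the excitation
    oscillates coherently between the two qubits. For n qubits the
    SES Hamiltonian is n x n, which is what keeps direct simulation
    of the subspace tractable."""
    p1 = math.cos(g * t) ** 2
    p2 = math.sin(g * t) ** 2
    return p1, p2

# Quarter Rabi period: the excitation is shared equally
p1, p2 = ses_excitation_probs(g=1.0, t=math.pi / 4)
```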

  9. Stochastic simulation of uranium migration at the Hanford 300 Area.

    PubMed

    Hammond, Glenn E; Lichtner, Peter C; Rockhold, Mark L

    2011-03-01

    This work focuses on the quantification of groundwater flow and subsequent U(VI) transport uncertainty due to heterogeneity in the sediment permeability at the Hanford 300 Area. U(VI) migration at the site is simulated with multiple realizations of stochastically-generated high resolution permeability fields and comparisons are made of cumulative water and U(VI) flux to the Columbia River. The massively parallel reactive flow and transport code PFLOTRAN is employed utilizing 40,960 processor cores on DOE's petascale Jaguar supercomputer to simultaneously execute 10 transient, variably-saturated groundwater flow and U(VI) transport simulations within 3D heterogeneous permeability fields using the code's multi-realization simulation capability. Simulation results demonstrate that the cumulative U(VI) flux to the Columbia River is less responsive to fine scale heterogeneity in permeability and more sensitive to the distribution of permeability within the river hyporheic zone and mean permeability of larger-scale geologic structures at the site. Copyright © 2010 Elsevier B.V. All rights reserved.
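The finding above, that cumulative flux responds to larger-scale structure more than to fine-scale heterogeneity, echoes a basic upscaling fact: for flow crossing layers in series, the effective permeability is the harmonic mean, which is dominated by the lowest-permeability zones. A toy multi-realization sketch (the lognormal field and realization count are illustrative assumptions, not the PFLOTRAN setup):

```python
import random
import statistics

def effective_permeability(layers):
    """Harmonic mean: effective permeability of layers in series
    (flow crossing each layer), dominated by low-k zones."""
    return len(layers) / sum(1.0 / k for k in layers)

random.seed(42)
realizations = []
for _ in range(10):   # multiple stochastic realizations, Monte Carlo style
    field = [random.lognormvariate(mu=0.0, sigma=1.0) for _ in range(100)]
    realizations.append(effective_permeability(field))

k_mean = statistics.mean(realizations)   # ensemble statistic across runs
```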

  10. Control system and method for a universal power conditioning system

    DOEpatents

    Lai, Jih-Sheng; Park, Sung Yeul; Chen, Chien-Liang

    2014-09-02

    A new current loop control method is proposed for a single-phase grid-tie power conditioning system that can be used in either a standalone or a grid-tie mode. This type of inverter utilizes an inductor-capacitor-inductor (LCL) filter as the interface between the inverter and the utility grid. The first inductor-capacitor (LC) set can be used in the standalone mode, and the complete LCL can be used in the grid-tie mode. A new admittance compensation technique is proposed for the controller design to avoid a low stability margin while maintaining sufficient gain at the fundamental frequency. The proposed current loop controller and admittance compensation technique have been simulated and tested. Simulation results indicate that without the admittance path compensation, the current loop controller output duty cycle is largely offset by an undesired admittance path. At the initial simulation cycle, the power flow may be erratically fed back to the inverter, causing catastrophic failure. With admittance path compensation, the output power shows a steady-state offset that matches the design value. Experimental results show that the inverter is capable of both standalone and grid-tie connection modes using the LCL filter configuration.
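One fixed property of any LCL grid interface like the one described is its resonant frequency, f_res = (1/2π)·sqrt((L1 + L2)/(L1·L2·C)), around which the current controller must preserve phase margin while keeping gain at the 50/60 Hz fundamental. A quick calculation sketch (the component values are hypothetical, not from the patent):

```python
import math

def lcl_resonance_hz(l_inv, l_grid, c_filt):
    """Resonant frequency of an LCL grid-tie filter:
    f_res = (1 / (2*pi)) * sqrt((L1 + L2) / (L1 * L2 * C))."""
    return math.sqrt((l_inv + l_grid) / (l_inv * l_grid * c_filt)) / (2 * math.pi)

# Hypothetical filter values: 1 mH inverter-side, 0.5 mH grid-side, 10 uF
f_res = lcl_resonance_hz(1e-3, 0.5e-3, 10e-6)   # about 2.76 kHz
```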

  11. Evolution of Software-Only-Simulation at NASA IV and V

    NASA Technical Reports Server (NTRS)

    McCarty, Justin; Morris, Justin; Zemerick, Scott

    2014-01-01

    Software-only simulations have been an emerging but quickly developing field of study throughout NASA. The NASA Independent Verification & Validation (IV&V) Independent Test Capability (ITC) team has been rapidly building a collection of simulators for a wide range of NASA missions. ITC specializes in full end-to-end simulations that enable developers, V&V personnel, and operators to test-as-you-fly. In four years, the team has delivered a wide variety of spacecraft simulations, ranging from lower-complexity science missions such as the Global Precipitation Measurement (GPM) satellite and the Deep Space Climate Observatory (DSCOVR) to extremely complex missions such as the James Webb Space Telescope (JWST) and the Space Launch System (SLS). This paper describes the evolution of ITC's technologies and processes that have been utilized to design, implement, and deploy end-to-end simulation environments for various NASA missions. A comparison of mission simulators is discussed, with focus on technology and lessons learned in complexity, hardware modeling, and continuous integration. The paper also describes the methods for executing the missions' unmodified flight software binaries (not cross-compiled) for verification and validation activities.

  12. Rover Graphical Simulator

    NASA Technical Reports Server (NTRS)

    Bon, Bruce; Seraji, Homayoun

    2007-01-01

    Rover Graphical Simulator (RGS) is a package of software that generates images of the motion of a wheeled robotic exploratory vehicle (rover) across terrain that includes obstacles and regions of varying traversability. The simulated rover moves autonomously, utilizing reasoning and decision-making capabilities of a fuzzy-logic navigation strategy to choose its path from an initial to a final state. RGS provides a graphical user interface for control and monitoring of simulations. The numerically simulated motion is represented as discrete steps with a constant time interval between updates. At each simulation step, a dot is placed at the old rover position and a graphical symbol representing the rover is redrawn at the new, updated position. The effect is to leave a trail of dots depicting the path traversed by the rover, the distances between dots being proportional to the local speed. Obstacles and regions of low traversability are depicted as filled circles, with buffer zones around them indicated by enclosing circles. The simulated robot is equipped with onboard sensors that can detect regional terrain traversability and local obstacles out to specified ranges. RGS won the NASA Group Achievement Award in 2002.
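The trail-of-dots bookkeeping described above, one dot per constant-interval update with spacing proportional to local speed, can be sketched in one dimension (the real RGS moves in 2-D over terrain under fuzzy-logic navigation; this toy only reproduces the dot trail, and all speeds are made up):

```python
def rover_trail(speeds, dt=1.0):
    """Discrete-step rover motion: one trail dot per update, with dot
    spacing proportional to the local speed, as in the RGS display."""
    dots = [0.0]
    for v in speeds:
        dots.append(dots[-1] + v * dt)   # redraw rover at updated position
    return dots

# Rover slows near an obstacle, then speeds back up
trail = rover_trail([2.0, 2.0, 0.5, 0.5, 2.0])
spacings = [b - a for a, b in zip(trail, trail[1:])]
```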

  13. An interactive physics-based unmanned ground vehicle simulator leveraging open source gaming technology: progress in the development and application of the virtual autonomous navigation environment (VANE) desktop

    NASA Astrophysics Data System (ADS)

    Rohde, Mitchell M.; Crawford, Justin; Toschlog, Matthew; Iagnemma, Karl D.; Kewlani, Guarav; Cummins, Christopher L.; Jones, Randolph A.; Horner, David A.

    2009-05-01

    It is widely recognized that simulation is pivotal to vehicle development, whether manned or unmanned. There are few dedicated choices, however, for those wishing to perform realistic, end-to-end simulations of unmanned ground vehicles (UGVs). The Virtual Autonomous Navigation Environment (VANE), under development by US Army Engineer Research and Development Center (ERDC), provides such capabilities but utilizes a High Performance Computing (HPC) Computational Testbed (CTB) and is not intended for on-line, real-time performance. A product of the VANE HPC research is a real-time desktop simulation application under development by the authors that provides a portal into the HPC environment as well as interaction with wider-scope semi-automated force simulations (e.g. OneSAF). This VANE desktop application, dubbed the Autonomous Navigation Virtual Environment Laboratory (ANVEL), enables analysis and testing of autonomous vehicle dynamics and terrain/obstacle interaction in real-time with the capability to interact within the HPC constructive geo-environmental CTB for high fidelity sensor evaluations. ANVEL leverages rigorous physics-based vehicle and vehicle-terrain interaction models in conjunction with high-quality, multimedia visualization techniques to form an intuitive, accurate engineering tool. The system provides an adaptable and customizable simulation platform that allows developers a controlled, repeatable testbed for advanced simulations. ANVEL leverages several key technologies not common to traditional engineering simulators, including techniques from the commercial video-game industry. These enable ANVEL to run on inexpensive commercial, off-the-shelf (COTS) hardware. In this paper, the authors describe key aspects of ANVEL and its development, as well as several initial applications of the system.

  14. Features of MCNP6 Relevant to Medical Radiation Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, H. Grady III; Goorley, John T.

    2012-08-29

    MCNP (Monte Carlo N-Particle) is a general-purpose Monte Carlo code for simulating the transport of neutrons, photons, electrons, positrons, and more recently other fundamental particles and heavy ions. Over many years MCNP has found a wide range of applications in many different fields, including medical radiation physics. In this presentation we will describe and illustrate a number of significant recently developed features in the current version of the code, MCNP6, having particular utility for medical physics. Among these are major extensions of the ability to simulate large, complex geometries, improvement in memory requirements and speed for large lattices, introduction of mesh-based isotopic reaction tallies, advances in radiography simulation, expanded variance-reduction capabilities, especially for pulse-height tallies, and a large number of enhancements in photon/electron transport.

  15. Current status of robotic simulators in acquisition of robotic surgical skills.

    PubMed

    Kumar, Anup; Smith, Roger; Patel, Vipul R

    2015-03-01

    This article provides an overview of the current status of simulator systems in the robotic surgery training curriculum, focusing on available simulators for training, their comparison, new technologies introduced in simulation, concepts of training, and the existing challenges and future perspectives of simulator training in robotic surgery. The different virtual reality simulators available on the market, such as dVSS, dVT, RoSS, ProMIS and SEP, have shown face, content and construct validity in robotic skills training for novices outside the operating room. Recently, augmented reality simulators like HoST, Maestro AR and RobotiX Mentor have been introduced in robotic training, providing a more realistic operating environment with greater emphasis on procedure-specific robotic training. Further, the Xperience Team Trainer, which provides training to the console surgeon and bed-side assistant simultaneously, has recently been introduced to emphasize the importance of teamwork and proper coordination. Simulator training holds an important place in the current robotic training curriculum of future robotic surgeons. There is a need for more procedure-specific augmented reality simulator training, utilizing advancements in computing and graphical capabilities for new innovations in simulator technology. Further studies are required to establish its cost-benefit ratio along with concurrent and predictive validity.

  16. Verification of bubble tracking method and DNS examinations of single- and two-phase turbulent channel flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tryggvason, Gretar; Bolotnov, Igor; Fang, Jun

    2017-03-30

    Direct numerical simulation (DNS) has been regarded as a reliable data source for the development and validation of turbulence models along with experiments. The realization of DNS usually involves a very fine mesh that should be able to resolve all relevant turbulence scales down to the Kolmogorov scale [1]. As the most computationally expensive approach compared to other CFD techniques, DNS applications used to be limited to flow studies at very low Reynolds numbers. Thanks to the tremendous growth of computing power over the past decades, the simulation capability of DNS has now started overlapping with some of the most challenging engineering problems. One of those examples in nuclear engineering is the turbulent coolant flow inside reactor cores. Coupled with interface tracking methods (ITM), the simulation capability of DNS can be extended to more complicated two-phase flow regimes. Departure from nucleate boiling (DNB) is the limiting critical heat flux phenomenon for the majority of accidents that are postulated to occur in pressurized water reactors (PWR) [2]. As one of the major modeling and simulation (M&S) challenges pursued by CASL, the prediction capability is being developed for the onset of DNB utilizing a multiphase-CFD (M-CFD) approach. DNS (coupled with ITM) can be employed to provide closure law information for multiphase flow modeling at the CFD scale. In the presented work, research groups at NCSU and UND will focus on applying different ITM to different geometries. Higher void fraction flow analysis at reactor prototypical conditions will be performed, and novel analysis methods will be developed, implemented and verified for the challenging flow conditions.
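    The mesh-size argument above can be made concrete: the Kolmogorov length scale sets the smallest eddies, so the grid-point count per direction grows with the ratio of the domain size to that scale. A minimal sketch, with all flow parameters assumed for illustration:

```python
# Rough estimate of DNS grid requirements from the Kolmogorov length scale,
# eta = (nu^3 / eps)^(1/4). The channel dimensions and flow parameters below
# are illustrative, not values from the study described above.

def kolmogorov_length(nu, eps):
    """Kolmogorov length eta = (nu^3 / eps)^(1/4)."""
    return (nu**3 / eps) ** 0.25

def dns_points_estimate(L, nu, eps):
    """Grid points needed per direction to resolve scales down to eta."""
    return L / kolmogorov_length(nu, eps)

nu = 1e-6   # kinematic viscosity of water, m^2/s (assumed)
eps = 1e-2  # mean turbulent dissipation rate, m^2/s^3 (assumed)
L = 0.01    # channel half-height, m (assumed)

eta = kolmogorov_length(nu, eps)
n = dns_points_estimate(L, nu, eps)
print(f"eta = {eta:.2e} m, ~{n:.0f} points per direction")
```

    Since the per-direction count scales roughly as Re^(3/4), the total mesh grows as Re^(9/4), which is why DNS was long restricted to low Reynolds numbers.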

  17. System Advisor Model, SAM 2014.1.14: General Description

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blair, Nate; Dobos, Aron P.; Freeman, Janine

    2014-02-01

    This document describes the capabilities of the U.S. Department of Energy and National Renewable Energy Laboratory's System Advisor Model (SAM), Version 2013.9.20, released on September 9, 2013. SAM is a computer model that calculates performance and financial metrics of renewable energy systems. Project developers, policy makers, equipment manufacturers, and researchers use graphs and tables of SAM results in the process of evaluating financial, technology, and incentive options for renewable energy projects. SAM simulates the performance of photovoltaic, concentrating solar power, solar water heating, wind, geothermal, biomass, and conventional power systems. The financial model can represent financial structures for projects that either buy and sell electricity at retail rates (residential and commercial) or sell electricity at a price determined in a power purchase agreement (utility). SAM's advanced simulation options facilitate parametric and sensitivity analyses, and statistical analysis capabilities are available for Monte Carlo simulation and weather variability (P50/P90) studies. SAM can also read input variables from Microsoft Excel worksheets. For software developers, the SAM software development kit (SDK) makes it possible to use SAM simulation modules in their applications written in C/C++, C#, Java, Python, and MATLAB. NREL provides both SAM and the SDK as free downloads at http://sam.nrel.gov. Technical support and more information about the software are available on the website.

  18. Automated Rendezvous and Capture System Development and Simulation for NASA

    NASA Technical Reports Server (NTRS)

    Roe, Fred D.; Howard, Richard T.; Murphy, Leslie

    2004-01-01

    The United States does not have an Automated Rendezvous and Capture (AR&C) capability and is reliant on manned control for rendezvous and docking of orbiting spacecraft. This reliance on the labor-intensive manned interface for control of rendezvous and docking vehicles has a significant impact on the cost of operating the International Space Station (ISS) and precludes the use of any U.S. expendable launch capabilities for Space Station resupply. The Marshall Space Flight Center (MSFC) has conducted pioneering research in the development of an automated rendezvous and capture (or docking) (AR&C) system for U.S. space vehicles. This AR&C system was tested extensively using hardware-in-the-loop simulations in the Flight Robotics Laboratory, and a rendezvous sensor, the Video Guidance Sensor, was developed and successfully flown on Space Shuttle flights STS-87 and STS-95, proving the concept of a video-based sensor. Further developments in sensor technology and vehicle and target configuration have led to continued improvements and changes in AR&C system development and simulation. A new Advanced Video Guidance Sensor (AVGS) with target will be utilized as the primary navigation sensor on the Demonstration of Autonomous Rendezvous Technologies (DART) flight experiment in 2004. Real-time closed-loop simulations will be performed to validate the improved AR&C systems prior to flight.

  19. Advanced development of atmospheric models. [SEASAT Program support

    NASA Technical Reports Server (NTRS)

    Kesel, P. G.; Langland, R. A.; Stephens, P. L.; Welleck, R. E.; Wolff, P. M.

    1979-01-01

    A set of atmospheric analysis and prediction models was developed in support of the SEASAT Program. Existing objective analysis models, which utilize a 125x125 polar stereographic grid of the Northern Hemisphere, were modified in order to incorporate and assess the impact of (real or simulated) satellite data in the analysis of a two-day meteorological scenario in January 1979. Program/procedural changes included: (1) a provision to utilize winds in the sea level pressure and multi-level height analyses (1000-100 mb); (2) the capability to perform a pre-analysis at two control levels (1000 mb and 250 mb); (3) a greater degree of wind- and mass-field coupling, especially at these control levels; (4) an improved facility to bogus the analyses based on results of the pre-analysis; and (5) a provision to utilize (SIRS) satellite thickness values and cloud motion vectors in the multi-level height analysis.

  20. Distributed Task Offloading in Heterogeneous Vehicular Crowd Sensing

    PubMed Central

    Liu, Yazhi; Wang, Wendong; Ma, Yuekun; Yang, Zhigang; Yu, Fuxing

    2016-01-01

    The ability of road vehicles to efficiently execute different sensing tasks varies because of the heterogeneity in their sensing ability and trajectories. Therefore, the data collection sensing task, which requires tempo-spatial sensing data, becomes a serious problem in vehicular sensing systems, particularly those with limited sensing capabilities. A utility-based sensing task decomposition and offloading algorithm is proposed in this paper. The utility function for a task executed by a certain vehicle is built according to the mobility traces and sensing interfaces of the vehicle, as well as the sensing data type and tempo-spatial coverage requirements of the sensing task. Then, the sensing tasks are decomposed and offloaded to neighboring vehicles according to the utilities of the neighboring vehicles to the decomposed sensing tasks. Real trace-driven simulation shows that the proposed task offloading is able to collect much more comprehensive and uniformly distributed sensing data than other algorithms. PMID:27428967
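    The decomposition-and-offloading step described above can be sketched as a greedy assignment: each decomposed subtask goes to the neighboring vehicle with the highest utility for it. The utility function below (sensor match times coverage overlap) is a simplified stand-in for the trace-based utility in the paper, and all names and numbers are hypothetical:

```python
# Minimal sketch of utility-based task offloading: each decomposed subtask
# is assigned to the neighboring vehicle with the highest utility for it.
# Vehicle sensing interfaces and coverage values are made up for illustration.

def utility(vehicle, subtask):
    """Toy utility: 0 if the vehicle lacks the sensor, else its coverage overlap."""
    if subtask["sensor"] not in vehicle["sensors"]:
        return 0.0
    return vehicle["coverage"].get(subtask["region"], 0.0)

def offload(subtasks, neighbors):
    """Greedy offloading: pick the highest-utility neighbor for each subtask."""
    plan = {}
    for t in subtasks:
        best = max(neighbors, key=lambda v: utility(v, t))
        if utility(best, t) > 0:
            plan[t["id"]] = best["id"]
    return plan

neighbors = [
    {"id": "v1", "sensors": {"camera"}, "coverage": {"A": 0.9, "B": 0.2}},
    {"id": "v2", "sensors": {"camera", "lidar"}, "coverage": {"B": 0.7}},
]
subtasks = [
    {"id": "t1", "sensor": "camera", "region": "A"},
    {"id": "t2", "sensor": "lidar", "region": "B"},
]
print(offload(subtasks, neighbors))  # {'t1': 'v1', 't2': 'v2'}
```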

  1. WE-H-BRA-04: Biological Geometries for the Monte Carlo Simulation Toolkit TOPAS-nBio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNamara, A; Held, K; Paganetti, H

    2016-06-15

    Purpose: New advances in radiation therapy are most likely to come from the complex interface of physics, chemistry and biology. Computational simulations offer a powerful tool for quantitatively investigating radiation interactions with biological tissue and can thus help bridge the gap between physics and biology. The aim of TOPAS-nBio is to provide a comprehensive tool to generate advanced radiobiology simulations. Methods: TOPAS wraps and extends the Geant4 Monte Carlo (MC) simulation toolkit. TOPAS-nBio is an extension to TOPAS which utilizes the physics processes in Geant4-DNA to model biological damage from very low energy secondary electrons. Specialized cell, organelle and molecular geometries were designed for the toolkit. Results: TOPAS-nBio gives the user the capability of simulating biological geometries, ranging from the micron scale (e.g. cells and organelles) to complex nano-scale geometries (e.g. DNA and proteins). The user interacts with TOPAS-nBio through easy-to-use input parameter files. For example, in a simple cell simulation the user can specify the cell type and size as well as the type, number and size of included organelles. For more detailed nuclear simulations, the user can specify chromosome territories containing chromatin fiber loops, the latter comprising nucleosomes on a double helix. The chromatin fibers can be arranged in simple rigid geometries or within fractal globules, mimicking realistic chromosome territories. TOPAS-nBio also provides users with the capability of reading Protein Data Bank 3D structural files to simulate radiation damage to proteins or nucleic acids, e.g. histones or RNA. TOPAS-nBio has been validated by comparing results to other track structure simulation software and published experimental measurements.
    Conclusion: TOPAS-nBio provides users with a comprehensive MC simulation tool for radiobiological simulations, giving users without advanced programming skills the ability to design and run complex simulations.

  2. Status of simulation in health care education: an international survey.

    PubMed

    Qayumi, Karim; Pachev, George; Zheng, Bin; Ziv, Amitai; Koval, Valentyna; Badiei, Sadia; Cheng, Adam

    2014-01-01

    Simulation is rapidly penetrating the terrain of health care education and has gained growing acceptance as an educational method and patient safety tool. Despite this, the state of simulation in health care education has not yet been evaluated on a global scale. In this project, we studied the global status of simulation in health care education by determining the degree of financial support, infrastructure, manpower, information technology capabilities, engagement of groups of learners, and research and scholarly activities, as well as the barriers, strengths, opportunities for growth, and other aspects of simulation in health care education. We utilized a two-stage process, including an online survey and a site visit that included interviews and debriefings. Forty-two simulation centers worldwide participated in this study, the results of which show that despite enormous interest and enthusiasm in the health care community, use of simulation in health care education is limited to specific areas and is not a budgeted item in many institutions. Absence of a sustainable business model, as well as sufficient financial support in terms of budget, infrastructure, manpower, research, and scholarly activities, slows down the movement of simulation. Specific recommendations are made based on current findings to support simulation in the next developmental stages.

  3. Status Report on Modelling and Simulation Capabilities for Nuclear-Renewable Hybrid Energy Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rabiti, C.; Epiney, A.; Talbot, P.

    This report summarizes the current status of the modeling and simulation capabilities developed for the economic assessment of Nuclear-Renewable Hybrid Energy Systems (N-R HES). The increasing penetration of variable renewables is altering the profile of the net demand, with which the other generators on the grid have to cope. N-R HES analyses are being conducted to determine the potential feasibility of mitigating the resultant volatility in the net electricity demand by adding industrial processes that utilize either thermal or electrical energy as stabilizing loads. This coordination of energy generators and users is proposed to mitigate the increase in electricity cost and cost volatility through the production of a saleable commodity. Overall, the financial performance of a system that is comprised of peaking units (i.e., gas turbines), baseload supply (i.e., a nuclear power plant), and an industrial process (e.g., a hydrogen plant) should be optimized under the constraint of satisfying an electricity demand profile with a certain level of variable renewable (wind) penetration. The optimization should entail both the sizing of the components/subsystems that comprise the system and the optimal dispatch strategy (output at any given moment in time from the different subsystems). Some of the capabilities described here have been reported separately in [1, 2, 3]. The purpose of this report is to provide an update on the improvement and extension of those capabilities and to illustrate their integrated application in the economic assessment of N-R HES.
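    The coordination idea above can be illustrated with a toy merit-order dispatch: a fixed nuclear baseload plus a gas peaker meet the net demand (demand minus wind), and any baseload surplus is diverted to the hydrogen plant. This is a minimal sketch with made-up numbers, not the report's optimization framework:

```python
# Illustrative dispatch sketch for an N-R HES: constant nuclear baseload plus
# a gas peaker serve the net demand (demand minus wind); whenever baseload
# exceeds net demand, the surplus is absorbed by a hydrogen plant as a
# stabilizing load. All numbers are made up for illustration.

def dispatch(demand, wind, baseload, peaker_max):
    """Return (gas, hydrogen) schedules per time step, in MW."""
    gas, h2 = [], []
    for d, w in zip(demand, wind):
        net = d - w                    # net electricity demand
        if net >= baseload:
            gas.append(min(net - baseload, peaker_max))
            h2.append(0.0)
        else:
            gas.append(0.0)
            h2.append(baseload - net)  # surplus diverted to hydrogen plant
    return gas, h2

demand = [900, 1000, 1100, 950]
wind   = [300,  100,   50, 400]
gas, h2 = dispatch(demand, wind, baseload=800, peaker_max=400)
print(gas)  # [0.0, 100, 250, 0.0]
print(h2)   # [200, 0.0, 0.0, 250]
```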

  4. Land use/land cover and land capability data for evaluating land utilization and official land use planning in Indramayu Regency, West Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Ambarwulan, W.; Widiatmaka; Nahib, I.

    2018-05-01

    Land utilization in Indonesia is regulated by official spatial land use planning (OSLUP), stipulated by government regulations. In practice, however, land utilization often develops inconsistently with these regulations, and OSLUP itself is not always compatible with sustainable land utilization. This study aims to evaluate current land utilization and OSLUP in Indramayu Regency, West Java. The methodology used is an integrated analysis of land use and land cover (LU/LC) data, land capability data, and the spatial pattern in OSLUP. Actual LU/LC is interpreted from SPOT-6 imagery of 2014. The spatial data of land capabilities are derived from land capability classification using field data and laboratory analysis. The confrontation between these spatial data is interpreted in terms of future directions for sustainable land use planning. The results show that Indramayu Regency consists of 8 types of LU/LC. Land capability in the research area ranges from class II to VIII. Only a small portion of the land in Indramayu has been used in accordance with its land capability; most of the land is used beyond its land capability.

  5. Investigation of a compact coaxially fed switched oscillator

    NASA Astrophysics Data System (ADS)

    Wang, Yuwei; Chen, Dongqun; Zhang, Jiande; Cao, Shengguang; Li, Da; Liu, Chebo

    2013-09-01

    To generate relatively high-frequency mesoband microwaves, a compact coaxially fed transmission line switched oscillator with high voltage capability is investigated. The characteristic impedance and voltage capability of the low impedance transmission line (LITL) have been analyzed. It is shown that the working voltage of the oscillator can reach up to 200 kV when it is filled with pressurized nitrogen and charged by a nanosecond driving source. By utilizing a commercial electromagnetic simulation code, the transient performance of the switched oscillator with a lumped resistance load is simulated. It is shown that the center frequency of the output signal reaches ~0.6 GHz when the spark gap closes with a single channel. In addition, the influence of the closing mode and rapidity of the spark gap, the permittivity of the insulator at the output end of the LITL, and the load impedance on the transient performance of the designed oscillator has been analyzed quantitatively. Finally, the good transient performance of the switched oscillator has been preliminarily confirmed by experiment.
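    For context, the characteristic impedance of a coaxial line follows the standard formula Z0 = (59.95/sqrt(eps_r)) ln(b/a) ohms, so a low-impedance line requires an outer-to-inner radius ratio close to one. A quick sketch with illustrative radii, not the actual dimensions of the oscillator above:

```python
# Characteristic impedance of a coaxial low-impedance transmission line
# (LITL): Z0 = (59.95 / sqrt(eps_r)) * ln(b/a) ohms. Radii are illustrative.
import math

def coax_impedance(b, a, eps_r=1.0):
    """Z0 of a coax with outer radius b, inner radius a, relative permittivity eps_r."""
    return 59.952 / math.sqrt(eps_r) * math.log(b / a)

# A low impedance needs b/a close to 1: e.g. b/a = 1.2 in nitrogen (eps_r ~ 1)
z0 = coax_impedance(1.2, 1.0)
print(f"Z0 = {z0:.1f} ohms")  # Z0 = 10.9 ohms
```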

  6. Computer-aided analysis and design of the shape rolling process for producing turbine engine airfoils

    NASA Technical Reports Server (NTRS)

    Lahoti, G. D.; Akgerman, N.; Altan, T.

    1978-01-01

    Mild steel (AISI 1018) was selected as the model cold-rolling material, and Ti-6Al-4V and INCONEL 718 were selected as typical hot-rolling and cold-rolling alloys, respectively. The flow stress and workability of these alloys were characterized, and the friction factor at the roll/workpiece interface was determined at their respective working conditions by conducting ring tests. Computer-aided mathematical models for predicting metal flow and stresses, and for simulating the shape-rolling process, were developed. These models utilize the upper-bound and slab methods of analysis, and are capable of predicting the lateral spread, roll-separating force, roll torque, and local stresses, strains and strain rates. This computer-aided design (CAD) system is also capable of simulating the actual rolling process and thereby designing the roll-pass schedule in rolling of an airfoil or similar shape. The predictions from the CAD system were verified with respect to cold rolling of mild steel plates. The system is being applied to cold and hot isothermal rolling of an airfoil shape, and will be verified with respect to laboratory experiments under controlled conditions.
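    As a rough illustration of the slab method mentioned above, a first-order estimate of the roll-separating force in flat rolling uses the projected contact length L = sqrt(R * dh) and a plane-strain factor of about 1.15; the friction correction is omitted here and all numbers are assumed, so this is a sketch of the textbook estimate, not the paper's model:

```python
# First-order slab-method estimate of roll-separating force in flat rolling:
# contact length L = sqrt(R * dh), force P ~ 1.15 * sigma_bar * w * L
# (plane-strain factor 1.15; friction correction omitted). Numbers assumed.
import math

def roll_force(sigma_bar, w, R, h0, h1):
    """Roll-separating force (N) for mean flow stress sigma_bar (Pa),
    strip width w (m), roll radius R (m), entry/exit thickness h0/h1 (m)."""
    L = math.sqrt(R * (h0 - h1))  # projected roll/workpiece contact length
    return 1.15 * sigma_bar * w * L

# Cold rolling of mild steel: mean flow stress ~ 400 MPa (assumed)
P = roll_force(400e6, w=0.05, R=0.15, h0=0.004, h1=0.003)
print(f"P = {P/1e3:.0f} kN")  # P = 282 kN
```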

  7. System Engineering Strategy for Distributed Multi-Purpose Simulation Architectures

    NASA Technical Reports Server (NTRS)

    Bhula, Dlilpkumar; Kurt, Cindy Marie; Luty, Roger

    2007-01-01

    This paper describes the system engineering approach used to develop distributed multi-purpose simulations. The multi-purpose simulation architecture focuses on user needs, operations, flexibility, cost and maintenance. This approach was used to develop an International Space Station (ISS) simulator, the International Space Station Integrated Simulation (ISIS). The ISIS runs unmodified ISS flight software, system models, and the astronaut command and control interface in an open system design that allows for rapid integration of multiple ISS models. The initial intent of ISIS was to provide a distributed system that allows access to ISS flight software and models for the creation, test, and validation of crew and ground controller procedures. This capability reduces the cost and scheduling issues associated with utilizing standalone simulators in fixed locations, and facilitates discovering unknowns and errors earlier in the development lifecycle. Since its inception, the flexible architecture of the ISIS has allowed its purpose to evolve to include ground operator system and display training, flight software modification testing, and serving as a realistic test bed for Exploration automation technology research and development.

  8. A Cooperative Human-Adaptive Traffic Simulation (CHATS)

    NASA Technical Reports Server (NTRS)

    Phillips, Charles T.; Ballin, Mark G.

    1999-01-01

    NASA is considering the development of a Cooperative Human-Adaptive Traffic Simulation (CHATS) to examine and evaluate performance of the National Airspace System (NAS) as the aviation community moves toward free flight. CHATS will be specifically oriented toward simulating strategic decision-making by airspace users and by the service provider's traffic management personnel, within the context of different airspace and rules assumptions. It will use human teams to represent these interests and make decisions, and will rely on computer modeling and simulation to calculate the impacts of these decisions. The simulation objectives will be to examine: (1) the evolution of airspace users' and the service provider's strategies, through adaptation to new operational environments; (2) air carriers' competitive and cooperative behavior; (3) expected benefits to airspace users and the service provider as compared to the current NAS; and (4) operational limitations of free flight concepts due to congestion and safety concerns. This paper describes an operational concept for CHATS and presents a high-level functional design that would utilize a combination of existing and new models and simulation capabilities.

  9. Validation of Multibody Program to Optimize Simulated Trajectories II Parachute Simulation with Interacting Forces

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Behzad; Queen, Eric M.; Hotchko, Nathaniel J.

    2009-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed, tested and validated. This capability uses the Program to Optimize Simulated Trajectories II (POST 2). The standard version of POST 2 allows trajectory simulation of multiple bodies without force interaction. In the current implementation, the force interaction between the parachute and the suspended bodies has been modeled using flexible lines, allowing accurate trajectory simulation of the individual bodies in flight. The POST 2 multibody capability is intended to be general purpose and applicable to any parachute entry trajectory simulation. This research paper explains the motivation for multibody parachute simulation, discusses implementation methods, and presents validation of this capability.
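    A common way to model the flexible-line coupling described above is a tension-only spring-damper: the line applies force along its axis only when stretched. The sketch below is a generic illustration with assumed gains and geometry, not POST 2 internals:

```python
# Sketch of a tension-only spring-damper "flexible line" force, a common way
# to couple a parachute to suspended bodies in multibody trajectory
# simulation. Stiffness, damping, and geometry are assumed for illustration.
import math

def line_force(p1, p2, v_rel, L0, k, c):
    """Force on body 1 from a line to body 2; zero when the line is slack."""
    dx = [b - a for a, b in zip(p1, p2)]
    dist = math.sqrt(sum(d * d for d in dx))
    stretch = dist - L0
    if stretch <= 0.0:                 # a line carries no compression
        return [0.0, 0.0, 0.0]
    unit = [d / dist for d in dx]
    # relative velocity component along the line (for damping)
    v_along = sum(u * v for u, v in zip(unit, v_rel))
    mag = k * stretch + c * v_along
    return [max(mag, 0.0) * u for u in unit]

# Line of natural length 10 m stretched to 12 m, no relative velocity
f = line_force([0, 0, 0], [0, 0, 12], [0, 0, 0], L0=10.0, k=500.0, c=50.0)
print(f)  # [0.0, 0.0, 1000.0]
```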

  10. Development of a Higher Fidelity Model for the Cascade Distillation Subsystem (CDS)

    NASA Technical Reports Server (NTRS)

    Perry, Bruce; Anderson, Molly

    2014-01-01

    Significant improvements have been made to the ACM model of the CDS, enabling accurate predictions of dynamic operations with fewer assumptions. The model has been utilized to predict how CDS performance would be impacted by changing operating parameters, revealing performance trade-offs and possibilities for improvement. CDS efficiency is driven by the THP coefficient of performance, which in turn is dependent on heat transfer within the system. Based on the remaining limitations of the simulation, priorities for further model development include:
    - relaxing the assumption of total condensation;
    - incorporating dynamic simulation capability for the buildup of dissolved inert gases in condensers;
    - examining CDS operation with more complex feeds;
    - extending heat transfer analysis to all surfaces.
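    The dependence of CDS efficiency on the heat pump's coefficient of performance can be illustrated with the ideal (Carnot) limit, COP = T_hot / (T_hot - T_cold): the smaller the temperature lift across the heat pump, i.e. the better the heat transfer, the higher the achievable COP. The temperatures below are illustrative, not CDS operating points:

```python
# Carnot upper bound on the heating COP of a heat pump, showing why heat
# transfer (which sets the temperature lift) drives efficiency.
# Temperatures are illustrative.

def carnot_cop_heating(t_hot_k, t_cold_k):
    """Ideal heating COP for a heat pump between two absolute temperatures."""
    return t_hot_k / (t_hot_k - t_cold_k)

# Small lift (good heat transfer) vs. large lift (poor heat transfer)
print(round(carnot_cop_heating(330.0, 320.0), 1))  # 33.0
print(round(carnot_cop_heating(330.0, 300.0), 1))  # 11.0
```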

  11. A Bivariate Mixed Distribution with a Heavy-tailed Component and its Application to Single-site Daily Rainfall Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Chao; Singh, Vijay P.; Mishra, Ashok K.

    2013-02-06

    This paper presents an improved bivariate mixed distribution, which is capable of modeling the dependence of daily rainfall from two distinct sources (e.g., rainfall from two stations, two consecutive days, or two instruments such as satellite and rain gauge). The distribution couples an existing framework for building a bivariate mixed distribution, the theory of copulae, and a hybrid marginal distribution. Contributions of the improved distribution are twofold. One is the appropriate selection of the bivariate dependence structure from a wider admissible choice (10 candidate copula families). The other is the introduction of a marginal distribution capable of better representing low to moderate values as well as extremes of daily rainfall. Among several applications of the improved distribution, particularly presented here is its utility for single-site daily rainfall simulation. Rather than simulating rainfall occurrences and amounts separately, the developed generator unifies the two processes by generalizing daily rainfall as a Markov process with autocorrelation described by the improved bivariate mixed distribution. The generator is first tested on a sample station in Texas. Results reveal that the simulated and observed sequences are in good agreement with respect to essential characteristics. Then, extensive simulation experiments are carried out to compare the developed generator with three alternative models: the conventional two-state Markov chain generator, the transition probability matrix model, and the semi-parametric Markov chain model with kernel density estimation for rainfall amounts. Analyses establish that overall the developed generator is capable of reproducing characteristics of historical extreme rainfall events and is adept at extrapolating rare values beyond the upper range of available observed data.
    Moreover, it automatically captures the persistence of rainfall amounts on consecutive wet days in a relatively natural and easy way. Another interesting observation is that the recognized 'overdispersion' problem in daily rainfall simulation is attributable more to the loss of rainfall extremes than to the under-representation of first-order persistence. The developed generator appears to be a sound option for daily rainfall simulation, especially in hydrologic planning situations where rare rainfall events are of great importance.
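    For reference, the conventional two-state Markov chain generator used as a baseline above can be sketched in a few lines: wet/dry occurrence follows a first-order Markov chain, and wet-day amounts are drawn from a simple parametric distribution (exponential here, with assumed parameters):

```python
# Baseline two-state Markov chain rainfall generator: first-order Markov
# occurrence plus exponential wet-day amounts. Transition probabilities and
# the mean amount are illustrative, not fitted values from the paper.
import random

def simulate_rainfall(n_days, p_wd, p_ww, mean_amount, seed=0):
    """p_wd: P(wet | dry yesterday); p_ww: P(wet | wet yesterday)."""
    rng = random.Random(seed)
    series, wet = [], False
    for _ in range(n_days):
        p_wet = p_ww if wet else p_wd
        wet = rng.random() < p_wet
        series.append(rng.expovariate(1.0 / mean_amount) if wet else 0.0)
    return series

rain = simulate_rainfall(365, p_wd=0.2, p_ww=0.6, mean_amount=8.0)
wet_days = sum(1 for r in rain if r > 0)
print(f"{wet_days} wet days, max {max(rain):.1f} mm")
```

    The light exponential tail of this baseline is exactly what the improved heavy-tailed marginal is designed to fix for extreme events.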

  12. Experimental Validation: Subscale Aircraft Ground Facilities and Integrated Test Capability

    NASA Technical Reports Server (NTRS)

    Bailey, Roger M.; Hostetler, Robert W., Jr.; Barnes, Kevin N.; Belcastro, Celeste M.; Belcastro, Christine M.

    2005-01-01

    Experimental testing is an important aspect of validating complex integrated safety-critical aircraft technologies. The Airborne Subscale Transport Aircraft Research (AirSTAR) Testbed is being developed at NASA Langley to validate technologies under conditions that cannot be flight validated with full-scale vehicles. The AirSTAR capability comprises a series of flying sub-scale models, associated ground-support equipment, and a base research station at NASA Langley. The subscale model capability utilizes a generic 5.5% scaled transport class vehicle known as the Generic Transport Model (GTM). The AirSTAR Ground Facilities encompass the hardware and software infrastructure necessary to provide comprehensive support services for the GTM testbed. The ground facilities support remote piloting of the GTM aircraft, and include all subsystems required for data/video telemetry, experimental flight control algorithm implementation and evaluation, GTM simulation, data recording/archiving, and audio communications. The ground facilities include a self-contained, motorized vehicle serving as a mobile research command/operations center, capable of deployment to remote sites when conducting GTM flight experiments. The ground facilities also include a laboratory based at NASA LaRC providing near-identical capabilities to the mobile command/operations center, as well as the capability to receive data/video/audio from, and send data/audio to, the mobile command/operations center during GTM flight experiments.

  13. Development and Validation of an Automated Simulation Capability in Support of Integrated Demand Management

    NASA Technical Reports Server (NTRS)

    Arneson, Heather; Evans, Antony D.; Li, Jinhua; Wei, Mei Yueh

    2017-01-01

    Integrated Demand Management (IDM) is a near- to mid-term NASA concept that proposes to address mismatches in air traffic system demand and capacity by using strategic flow management capabilities to pre-condition demand into the more tactical Time-Based Flow Management System (TBFM). This paper describes an automated simulation capability to support IDM concept development. The capability closely mimics existing human-in-the-loop (HITL) capabilities, while automating both the human components and collaboration between operational systems, and speeding up the real-time aircraft simulations. Such a capability allows for parametric studies to be carried out that can inform the HITL simulations, identifying breaking points and parameter values at which significant changes in system behavior occur. The paper describes the initial validation of the automated simulation capability against results from previous IDM HITL experiments, quantifying the differences. The simulator is then used to explore the performance of the IDM concept under the simple scenario of a capacity constrained airport under a wide range of wind conditions.

  14. Effect of Microscopic Damage Events on Static and Ballistic Impact Strength of Triaxial Braid Composites

    NASA Technical Reports Server (NTRS)

    Littell, Justin D.; Binienda, Wieslaw K.; Arnold, William A.; Roberts, Gary D.; Goldberg, Robert K.

    2010-01-01

    The reliability of impact simulations for aircraft components made with triaxial-braided carbon-fiber composites is currently limited by inadequate material property data and a lack of validated material models for analysis. Methods to characterize the material properties used in the analytical models from a systematically obtained set of test data are also lacking. A macroscopic finite element based analytical model to analyze the impact response of these materials has been developed. The stiffness and strength properties utilized in the material model are obtained from a set of quasi-static in-plane tension, compression and shear coupon-level tests. Full-field optical strain measurement techniques are applied in the testing, and the results are used to help characterize the model. The unit cell of the braided composite is modeled as a series of shell elements, where each element is modeled as a laminated composite. The braided architecture can thus be approximated within the analytical model. The transient dynamic finite element code LS-DYNA is utilized to conduct the finite element simulations, and an internal LS-DYNA constitutive model is utilized in the analysis. Methods to obtain the stiffness and strength properties required by the constitutive model from the available test data are developed. Simulations of quasi-static coupon tests and impact tests of a representative braided composite are conducted. Overall, the developed method shows promise, and improvements needed in test and analysis methods for better predictive capability are identified.

  15. Using spatial principles to optimize distributed computing for enabling the physical science discoveries

    PubMed Central

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-01-01

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779

  16. Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

    PubMed

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-04-05

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.

  17. Numerical Simulations of Subscale Wind Turbine Rotor Inboard Airfoils at Low Reynolds Number

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blaylock, Myra L.; Maniaci, David Charles; Resor, Brian R.

    2015-04-01

    New blade designs are planned to support future research campaigns at the SWiFT facility in Lubbock, Texas. The sub-scale blades will reproduce specific aerodynamic characteristics of utility-scale rotors. Reynolds numbers for megawatt-scale utility rotors are generally above 2-8 million. The thickness of inboard airfoils for these large rotors is typically as high as 35-40%. The thickness and the proximity to three-dimensional flow of these airfoils present design and analysis challenges, even at full scale. However, more than a decade of experience with the airfoils in numerical simulation, in the wind tunnel, and in the field has generated confidence in their performance. Reynolds number regimes for the sub-scale rotor are significantly lower for the inboard blade, ranging from 0.7 to 1 million. Performance of the thick airfoils in this regime is uncertain because of the lack of wind tunnel data and the inherent challenges associated with numerical simulations. This report documents efforts to determine the most capable analysis tools to support these simulations in an effort to improve understanding of the aerodynamic properties of thick airfoils in this Reynolds number regime. Numerical results from various codes for four airfoils are verified against previously published wind tunnel results where data at those Reynolds numbers are available. Results are then computed for other Reynolds numbers of interest.
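
    The Reynolds-number gap described above is easy to quantify with the chord-based definition Re = V * c / nu; the chord lengths and flow speeds below are hypothetical placeholders, not SWiFT blade data.

```python
NU_AIR = 1.46e-5  # kinematic viscosity of air near sea level, m^2/s

def reynolds(velocity_mps, chord_m, nu=NU_AIR):
    """Chord-based Reynolds number Re = V * c / nu."""
    return velocity_mps * chord_m / nu

re_utility = reynolds(70.0, 1.5)    # ~70 m/s relative flow over a 1.5 m chord
re_subscale = reynolds(50.0, 0.25)  # slower flow over a much smaller chord
print(f"{re_utility:.2e} vs {re_subscale:.2e}")
```

    With these illustrative inputs the full-scale section lands in the multi-million range while the sub-scale section falls below one million, matching the regimes discussed in the report.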

  18. PV source based high voltage gain current fed converter

    NASA Astrophysics Data System (ADS)

    Saha, Soumya; Poddar, Sahityika; Chimonyo, Kudzai B.; Arunkumar, G.; Elangovan, D.

    2017-11-01

    This work involves the design and simulation of a PV source based high voltage gain, current fed converter. It deals with an isolated DC-DC converter which utilizes the boost converter topology. The proposed converter is capable of high voltage gain and, above all, achieves very high efficiency, as shown by the simulation results. The project intends to produce an output of 800 V dc from a 48 V dc input. The simulation results obtained from the PSIM application interface were used to analyze the performance of the proposed converter. The transformer used in the circuit steps up the voltage and provides electrical isolation between the low voltage and high voltage sides. Since the converter operates at a high switching frequency of 100 kHz, ultrafast recovery diodes are employed in the circuitry. The major application of this work is in future modeling of solar-powered hybrid electric cars.
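
    As a rough consistency check on the 48 V to 800 V target, an ideal isolated boost-derived stage with gain Vout = n * Vin / (1 - D) (n: transformer turns ratio, D: duty cycle) can be sketched. The actual current-fed topology in the paper may follow a different gain law, so this is illustrative only.

```python
def output_voltage(vin, turns_ratio, duty):
    """Ideal isolated boost gain: Vout = n * Vin / (1 - D)."""
    return turns_ratio * vin / (1.0 - duty)

def required_turns_ratio(vin, vout, duty):
    """Turns ratio needed to reach vout at a given duty cycle."""
    return vout * (1.0 - duty) / vin

n = required_turns_ratio(48.0, 800.0, duty=0.5)
print(round(n, 3))                               # turns ratio at D = 0.5
print(round(output_voltage(48.0, n, 0.5), 3))    # sanity check, recovers ~800 V
```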

  19. Analysis of in-trail following dynamics of CDTI-equipped aircraft. [Cockpit Displays of Traffic Information

    NASA Technical Reports Server (NTRS)

    Sorensen, J. A.; Goka, T.

    1982-01-01

    In connection with the necessity to provide greater terminal area capacity, attention is given to approaches in which the required increase in capacity will be obtained by making use of more automation and by involving the pilot to a larger degree in the air traffic control (ATC) process. It was recommended that NASA should make extensive use of its research aircraft and cockpit simulators to assist the FAA in examining the capabilities and limitations of cockpit displays of traffic information (CDTI). A program was organized which utilizes FAA ATC (ground-based) simulators and NASA aircraft and associated cockpit simulators in a research project which explores applications of the CDTI system. The present investigation is concerned with several questions related to the CDTI-based terminal area traffic tactical control concepts. Attention is given to longitudinal separation criteria, a longitudinal following model, longitudinal capture, combined longitudinal/vertical control, and lateral control.
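
    A minimal longitudinal-following law in the spirit of the models above: the trailing aircraft adjusts its closure rate in proportion to the spacing error. The gain, time step, and 5000 m desired spacing are hypothetical values for illustration, not CDTI study parameters.

```python
def simulate_following(spacing0, desired, gain=0.1, dt=1.0, steps=50):
    """Proportional in-trail control: closure rate = -gain * spacing error."""
    s = spacing0
    for _ in range(steps):
        closure_rate = -gain * (s - desired)  # speed up when too far behind
        s += dt * closure_rate
    return s

print(round(simulate_following(6000.0, 5000.0)))  # spacing converges to 5000 m
```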

  20. Cryogenic, high speed, turbopump bearing cooling requirements

    NASA Technical Reports Server (NTRS)

    Dolan, Fred J.; Gibson, Howard G.; Cannon, James L.; Cody, Joe C.

    1988-01-01

    Although the Space Shuttle Main Engine (SSME) has repeatedly demonstrated the capability to perform during launch, the High Pressure Oxidizer Turbopump (HPOTP) main shaft bearings have not met their 7.5 hour life requirement. A tester is being employed to provide the capability of subjecting full scale bearings and seals to speeds, loads, propellants, temperatures, and pressures which simulate engine operating conditions. The tester design permits much more elaborate instrumentation and diagnostics than could be accommodated in an SSME turbopump. Tests were made to demonstrate the facility's and the tester's capabilities, to verify the instrumentation in its operating environment, and to establish a performance baseline for the flight-type SSME HPOTP turbine bearing design. Bearing performance data from tests are being utilized to generate: (1) a high speed, cryogenic turbopump bearing computer mechanical model, and (2) a much improved, very detailed thermal model to better understand bearing internal operating conditions. Parametric tests were also made to determine the effects of speed, axial loads, coolant flow rate, and surface finish degradation on bearing performance.

  1. An innovative computationally efficient hydromechanical coupling approach for fault reactivation in geological subsurface utilization

    NASA Astrophysics Data System (ADS)

    Adams, M.; Kempka, T.; Chabab, E.; Ziegler, M.

    2018-02-01

    Estimating the efficiency and sustainability of geological subsurface utilization, e.g., Carbon Capture and Storage (CCS), requires an integrated risk assessment approach that considers the coupled processes occurring, among others the potential reactivation of existing faults. In this context, hydraulic and mechanical parameter uncertainties as well as different injection rates have to be considered and quantified to elaborate reliable environmental impact assessments. Consequently, the required sensitivity analyses consume significant computational time due to the high number of realizations that have to be carried out. Due to the high computational costs of two-way coupled simulations in large-scale 3D multiphase fluid flow systems, these are not applicable for the purpose of uncertainty and risk assessments. Hence, an innovative semi-analytical hydromechanical coupling approach for hydraulic fault reactivation is introduced. This approach determines the void ratio evolution in representative fault elements using one preliminary base simulation, considering one model geometry and one set of hydromechanical parameters. The void ratio development is then approximated and related to one reference pressure at the base of the fault. The parametrization of the resulting functions is then implemented directly into a multiphase fluid flow simulator to carry out the semi-analytical coupling for the simulation of hydromechanical processes. In this way, the iterative parameter exchange between the multiphase flow and mechanical simulators is omitted, since the update of porosity and permeability is controlled by one reference pore pressure at the fault base. The suggested procedure can reduce the computational time required by coupled hydromechanical simulations of a multitude of injection rates by a factor of up to 15.
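
    The coupling shortcut described above can be caricatured in a few lines: fit the void-ratio response from the one base run as a function of the reference pressure at the fault base, then update porosity (and, via a porosity-permeability law, permeability) directly inside the flow simulator. The linear fit and the Kozeny-Carman-style update below are assumptions for illustration, not the paper's actual parametrization.

```python
def fit_linear(pressures, void_ratios):
    """Least-squares line e(p) = a * p + b from base-simulation samples."""
    n = len(pressures)
    mp = sum(pressures) / n
    me = sum(void_ratios) / n
    a = (sum((p - mp) * (e - me) for p, e in zip(pressures, void_ratios))
         / sum((p - mp) ** 2 for p in pressures))
    return a, me - a * mp

def porosity(void_ratio):
    return void_ratio / (1.0 + void_ratio)

def permeability(phi, k0, phi0):
    """Kozeny-Carman-type scaling of permeability with porosity."""
    return k0 * (phi / phi0) ** 3 * ((1.0 - phi0) / (1.0 - phi)) ** 2

# Hypothetical base-run samples: fault-base pressure [MPa] vs void ratio.
a, b = fit_linear([10.0, 12.0, 14.0], [0.300, 0.304, 0.308])
e_new = a * 13.0 + b                    # void ratio at a new reference pressure
k_new = permeability(porosity(e_new), k0=1e-15, phi0=porosity(0.300))
print(round(e_new, 4), k_new > 1e-15)   # dilation raises permeability
```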

  2. High-Performance Agent-Based Modeling Applied to Vocal Fold Inflammation and Repair.

    PubMed

    Seekhao, Nuttiiya; Shung, Caroline; JaJa, Joseph; Mongeau, Luc; Li-Jessen, Nicole Y K

    2018-01-01

    Fast and accurate computational biology models offer the prospect of accelerating the development of personalized medicine. A tool capable of estimating treatment success can help prevent unnecessary and costly treatments and potentially harmful side effects. A novel high-performance Agent-Based Model (ABM) was adopted to simulate and visualize multi-scale complex biological processes arising in vocal fold inflammation and repair. The computational scheme was designed to organize the 3D ABM sub-tasks to fully utilize the resources available on current heterogeneous platforms consisting of multi-core CPUs and many-core GPUs. Subtasks are further parallelized and convolution-based diffusion is used to enhance the performance of the ABM simulation. The scheme was implemented using a client-server protocol allowing the results of each iteration to be analyzed and visualized on the server (i.e., in situ) while the simulation is running on the same server. The resulting simulation and visualization software enables users to interact with and steer the course of the simulation in real-time as needed. This high-resolution 3D ABM framework was used for a case study of surgical vocal fold injury and repair. The new framework is capable of completing the simulation, visualization and remote result delivery in under 7 s per iteration, where each iteration of the simulation represents 30 min in the real world. The case study model was simulated at the physiological scale of a human vocal fold. This simulation tracks 17 million biological cells as well as a total of 1.7 billion signaling chemical and structural protein data points. The visualization component processes and renders all simulated biological cells and 154 million signaling chemical data points. The proposed high-performance 3D ABM was verified through comparisons with empirical vocal fold data. Representative trends of biomarker predictions in surgically injured vocal folds were observed.
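
    The convolution-based diffusion mentioned above can be illustrated with a single explicit stencil update of a chemical field; on the GPU the same operation is expressed as a convolution kernel. The grid size, coefficient, and periodic boundaries below are illustrative choices, not the vocal-fold model's actual parameters.

```python
def laplacian(f, i, j):
    """5-point discrete Laplacian with periodic (wrap-around) boundaries."""
    n, m = len(f), len(f[0])
    return (f[(i - 1) % n][j] + f[(i + 1) % n][j] +
            f[i][(j - 1) % m] + f[i][(j + 1) % m] - 4.0 * f[i][j])

def diffuse(f, d=0.1):
    """One explicit diffusion step: f += d * Laplacian(f)."""
    return [[f[i][j] + d * laplacian(f, i, j) for j in range(len(f[0]))]
            for i in range(len(f))]

field = [[0.0] * 5 for _ in range(5)]
field[2][2] = 1.0                      # point source of signaling chemical
field = diffuse(field)
print(field[2][2], field[2][1])        # mass spreads to the four neighbors
```

    Because the stencil is the same everywhere, the whole update is one convolution, which is what makes the step efficient on many-core hardware.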

  3. High-Performance Agent-Based Modeling Applied to Vocal Fold Inflammation and Repair

    PubMed Central

    Seekhao, Nuttiiya; Shung, Caroline; JaJa, Joseph; Mongeau, Luc; Li-Jessen, Nicole Y. K.

    2018-01-01

    Fast and accurate computational biology models offer the prospect of accelerating the development of personalized medicine. A tool capable of estimating treatment success can help prevent unnecessary and costly treatments and potentially harmful side effects. A novel high-performance Agent-Based Model (ABM) was adopted to simulate and visualize multi-scale complex biological processes arising in vocal fold inflammation and repair. The computational scheme was designed to organize the 3D ABM sub-tasks to fully utilize the resources available on current heterogeneous platforms consisting of multi-core CPUs and many-core GPUs. Subtasks are further parallelized and convolution-based diffusion is used to enhance the performance of the ABM simulation. The scheme was implemented using a client-server protocol allowing the results of each iteration to be analyzed and visualized on the server (i.e., in-situ) while the simulation is running on the same server. The resulting simulation and visualization software enables users to interact with and steer the course of the simulation in real-time as needed. This high-resolution 3D ABM framework was used for a case study of surgical vocal fold injury and repair. The new framework is capable of completing the simulation, visualization and remote result delivery in under 7 s per iteration, where each iteration of the simulation represents 30 min in the real world. The case study model was simulated at the physiological scale of a human vocal fold. This simulation tracks 17 million biological cells as well as a total of 1.7 billion signaling chemical and structural protein data points. The visualization component processes and renders all simulated biological cells and 154 million signaling chemical data points. The proposed high-performance 3D ABM was verified through comparisons with empirical vocal fold data. Representative trends of biomarker predictions in surgically injured vocal folds were observed. PMID:29706894

  4. Measuring Pilot Workload in a Moving-base Simulator. Part 2: Building Levels of Workload

    NASA Technical Reports Server (NTRS)

    Kantowitz, B. H.; Hart, S. G.; Bortolussi, M. R.; Shively, R. J.; Kantowitz, S. C.

    1984-01-01

    Studies of pilot behavior in flight simulators often use a secondary task as an index of workload. It is routine to regard flying as the primary task and some less complex task as the secondary task. While this assumption is quite reasonable for most secondary tasks used to study mental workload in aircraft, treating the flying of a simulator through some carefully crafted flight scenario as a unitary task is less justified. The present research acknowledges that total mental workload depends upon the specific nature of the sub-tasks that a pilot must complete. As a first approximation, flight tasks were divided into three levels of complexity. The simplest level (called the Base Level) requires elementary maneuvers that do not utilize all the degrees of freedom of which an aircraft, or a moving-base simulator, is capable. The second level (called the Paired Level) requires the pilot to simultaneously execute two Base Level tasks. The third level (called the Complex Level) imposes three simultaneous constraints upon the pilot.

  5. Evaluation of Boreal Summer Monsoon Intraseasonal Variability in the GASS-YOTC Multi-Model Physical Processes Experiment

    NASA Astrophysics Data System (ADS)

    Mani, N. J.; Waliser, D. E.; Jiang, X.

    2014-12-01

    While the boreal summer monsoon intraseasonal variability (BSISV) exerts profound influence on the south Asian monsoon, the capability of present-day dynamical models in simulating and predicting the BSISV is still limited. The global model evaluation project on the vertical structure and diabatic processes of the Madden-Julian Oscillation (MJO) is a joint venture, coordinated by the Working Group on Numerical Experimentation (WGNE) MJO Task Force and the GEWEX Atmospheric System Study (GASS) program, for assessing model deficiencies in simulating the ISV and for improving our understanding of the underlying processes. In this study the simulation of the northward propagating BSISV is investigated in 26 climate models with special focus on the vertical diabatic heating structure and clouds. Following parallel lines of inquiry as the MJO Task Force has done with the eastward propagating MJO, we utilize previously proposed and newly developed model performance metrics and process diagnostics and apply them to the global climate model simulations of the BSISV.

  6. SES cupola interactive display design environment

    NASA Technical Reports Server (NTRS)

    Vu, Bang Q.; Kirkhoff, Kevin R.

    1989-01-01

    The Systems Engineering Simulator, located at the Lyndon B. Johnson Space Center in Houston, Texas, is tasked with providing a real-time simulator for developing displays and controls targeted for the Space Station Freedom. These displays and controls will exist inside an enclosed workstation located on the space station. The simulation is currently providing the engineering analysis environment for NASA and contractor personnel to design, prototype, and test alternatives for graphical presentation of data to an astronaut while he performs specified tasks. A highly desirable aspect of this environment is to have the capability to rapidly develop and bring on-line a number of different displays for use in determining the best utilization of graphics techniques in achieving maximum efficiency of the test subject fulfilling his task. The Systems Engineering Simulator now has available a tool which assists in the rapid development of displays for these graphic workstations. The Display Builder was developed in-house to provide an environment which allows easy construction and modification of displays within minutes of receiving requirements for specific tests.

  7. Urban nonpoint source pollution buildup and washoff models for simulating storm runoff quality in the Los Angeles County.

    PubMed

    Wang, Long; Wei, Jiahua; Huang, Yuefei; Wang, Guangqian; Maqsood, Imran

    2011-07-01

    Many urban nonpoint source pollution models utilize pollutant buildup and washoff functions to simulate storm runoff quality of urban catchments. In this paper, two urban pollutant washoff load models are derived using pollutant buildup and washoff functions. The first model assumes that there is no residual pollutant after a storm event while the second one assumes that there is always residual pollutant after each storm event. The developed models are calibrated and verified with observed data from an urban catchment in Los Angeles County. The application results show that the developed model with consideration of residual pollutant is more capable of simulating nonpoint source pollution from urban storm runoff than that without consideration of residual pollutant. For the study area, residual pollutant should be considered in pollutant buildup and washoff functions for simulating urban nonpoint source pollution when the total runoff volume is less than 30 mm. Copyright © 2011 Elsevier Ltd. All rights reserved.
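
    Buildup and washoff functions of the kind referred to above are commonly written in exponential form; the rate constants below are illustrative, not the calibrated Los Angeles County parameters, and the residual term simply carries unwashed mass into the next event.

```python
import math

def buildup(b0, b_max, k_b, dry_days):
    """Surface pollutant mass after a dry period, starting from residual b0."""
    return b_max - (b_max - b0) * math.exp(-k_b * dry_days)

def washoff(b, k_w, runoff_mm):
    """Mass washed off by a storm with total runoff depth runoff_mm."""
    return b * (1.0 - math.exp(-k_w * runoff_mm))

b = buildup(0.0, 100.0, 0.2, dry_days=5)   # buildup from a clean surface
w = washoff(b, 0.05, runoff_mm=20)         # the storm removes only part of it
residual = b - w                           # feeds the next buildup period
print(round(b, 2), round(w, 2), round(residual, 2))
```

    The paper's second model corresponds to keeping residual > 0 and passing it back in as b0 for the next dry period; the first model forces residual to zero after every storm.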

  8. Jet transport performance in thunderstorm wind shear conditions

    NASA Technical Reports Server (NTRS)

    Mccarthy, J.; Blick, E. F.; Bensch, R. R.

    1979-01-01

    Several hours of three dimensional wind data were collected in the thunderstorm approach-to-landing environment, using an instrumented Queen Air airplane. These data were used as input to a numerical simulation of aircraft response, concentrating on fixed-stick assumptions, while the aircraft simulated an instrument landing system approach. Output included airspeed, vertical displacement, pitch angle, and a special approach deterioration parameter. Theory and the results of approximately 1000 simulations indicated that about 20 percent of the cases contained serious wind shear conditions capable of causing a critical deterioration of the approach. In particular, the presence of high energy at the airplane's phugoid frequency was found to have a deleterious effect on approach quality. Oscillations of the horizontal wind at the phugoid frequency were found to have a more serious effect than vertical wind. A simulation of Eastern flight 66, which crashed at JFK in 1975, served to illustrate the points of the research. A concept of a real-time wind shear detector utilizing these results was outlined.
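
    The phugoid sensitivity noted above can be made concrete with Lanchester's classic approximation, omega_p ≈ sqrt(2) * g / V: horizontal wind oscillations near this frequency couple most strongly into airspeed and altitude excursions. The 70 m/s approach speed is an illustrative value, not from the study.

```python
import math

def phugoid_frequency(airspeed_mps, g=9.81):
    """Lanchester approximation of the phugoid natural frequency (rad/s)."""
    return math.sqrt(2.0) * g / airspeed_mps

omega = phugoid_frequency(70.0)
period = 2.0 * math.pi / omega
print(round(omega, 3), round(period, 1))   # roughly 0.198 rad/s, ~31.7 s period
```

    A wind-shear detector tuned to this band would flag gust energy near a half-minute period as especially hazardous on approach.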

  9. User Instructions for the Systems Assessment Capability, Rev. 1, Computer Codes Volume 3: Utility Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eslinger, Paul W.; Aaberg, Rosanne L.; Lopresti, Charles A.

    2004-09-14

    This document contains detailed user instructions for a suite of utility codes developed for Rev. 1 of the Systems Assessment Capability. The suite of computer codes for Rev. 1 of the Systems Assessment Capability performs many functions.

  10. FACTS controllers and HVDC enhance power transmission (A manufacturer's perspective)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Juette, G.; Renz, K.

    1995-12-31

    Various types of FACTS as well as HVDC have been available for some time. New ones have been developed recently. Their respective benefits are well proven and have been made known. System studies have to be done to make full use of FACTS and HVDC problem solving capabilities. Siemens is offering digital models for correct representation of several FACTS devices and HVDC in widely used time-based simulation study programs. The manufacturers are doing their homework. It is up to the utility industry to make use of it now!

  11. A mass storage system for supercomputers based on Unix

    NASA Technical Reports Server (NTRS)

    Richards, J.; Kummell, T.; Zarlengo, D. G.

    1988-01-01

    The authors present the design, implementation, and utilization of a large mass storage subsystem (MSS) for the Numerical Aerodynamic Simulation (NAS) program. The MSS supports a large networked, multivendor Unix-based supercomputing facility. The MSS at Ames Research Center provides all processors on the NAS processing network, from workstations to supercomputers, the ability to store large amounts of data in a highly accessible, long-term repository. The MSS uses Unix System V and is capable of storing hundreds of thousands of files ranging from a few bytes to 2 GB in size.

  12. Improving the Utility of the CATs Video Cam and Tri-axial Accelerometer for Examining Foraging in Top Marine Predators

    DTIC Science & Technology

    2015-09-30

    measurements of foraging and swimming performance in marine vertebrates. The CATS units are capable of recording motion with 9 degrees of freedom at high...1. Designing of a novel tag holder for tuna telemetry. The idea of this novel tag design is to use the hydrodynamic forces appearing when tuna swim ...drag. The increment in drag force associated with the attached tag was 16% at a simulated swimming speed of 8 m/s. The data obtained are

  13. Simulator for concurrent processing data flow architectures

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.; Stoughton, John W.; Mielke, Roland R.

    1992-01-01

    A software simulator capable of simulating the execution of an algorithm graph on a given system under the Algorithm to Architecture Mapping Model (ATAMM) rules is presented. ATAMM is capable of modeling the execution of large-grained algorithms on distributed data flow architectures. Investigating the behavior and determining the performance of an ATAMM-based system requires the aid of software tools. The ATAMM Simulator presented here can determine the performance of a system without having to build a hardware prototype. Case studies are performed on four algorithms to demonstrate the capabilities of the ATAMM Simulator. Simulated results are shown to be comparable to the experimental results of the Advanced Development Model System.

  14. Assessing sorbent injection mercury control effectiveness in flue gas streams

    USGS Publications Warehouse

    Carey, T.R.; Richardson, C.F.; Chang, R.; Meserole, F.B.; Rostam-Abadi, M.; Chen, S.

    2000-01-01

    One promising approach for removing mercury from coal-fired utility flue gas involves the direct injection of mercury sorbents. Although this method has been effective at removing mercury in municipal waste incinerators, tests conducted to date on utility coal-fired boilers show that mercury removal is much more difficult in utility flue gas. EPRI is conducting research to investigate mercury removal using sorbents in this application. Bench-scale, pilot-scale, and field tests have been conducted to determine the ability of different sorbents to remove mercury in simulated and actual flue gas streams. This paper focuses on recent bench-scale and field test results evaluating the adsorption characteristics of activated carbon and fly ash and the use of these results to develop a predictive mercury removal model. Field tests with activated carbon show that adsorption characteristics measured in the lab agree reasonably well with characteristics measured in the field. However, more laboratory and field data will be needed to identify other gas phase components which may impact performance. This will allow laboratory tests to better simulate field conditions and provide improved estimates of sorbent performance for specific sites. In addition to activated carbon results, bench-scale and modeling results using fly ash are presented which suggest that certain fly ashes are capable of adsorbing mercury.
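
    A common starting point for the kind of predictive sorbent model described above is an equilibrium isotherm such as the Langmuir form q = q_max * K * C / (1 + K * C); the parameter values below are illustrative, not measured activated-carbon or fly-ash data.

```python
def langmuir(c, q_max, k):
    """Adsorbed mercury per unit sorbent at gas-phase concentration c."""
    return q_max * k * c / (1.0 + k * c)

q_dilute = langmuir(1.0, q_max=500.0, k=0.2)      # nearly linear regime
q_loaded = langmuir(100.0, q_max=500.0, k=0.2)    # approaching saturation
print(round(q_dilute, 1), round(q_loaded, 1))
```

    Fitting such an isotherm to bench-scale data and then checking it against field measurements is one way the lab-to-field comparison above can be quantified.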

  15. Equilibrium cycle pin by pin transport depletion calculations with DeCART

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kochunas, B.; Downar, T.; Taiwo, T.

    As the Advanced Fuel Cycle Initiative (AFCI) program has matured, it has become more important to utilize more advanced simulation methods. The work reported here was performed as part of the AFCI fellowship program to develop and demonstrate the capability of performing high fidelity equilibrium cycle calculations. As part of this work, a new multi-cycle analysis capability was implemented in the DeCART code, which included modifying the depletion modules to perform nuclide decay calculations, implementing an assembly shuffling pattern description, and modifying iteration schemes. During the work, stability issues were uncovered with respect to converging simultaneously the neutron flux, isotopics, and fluid density and temperature distributions in 3-D. Relaxation factors were implemented which considerably improved the stability of the convergence. To demonstrate the capability, two core designs were utilized, a reference UOX core and a CORAIL core. Full core equilibrium cycle calculations were performed on both cores and the discharge isotopics were compared. From this comparison it was noted that the improved modeling capability was not drastically different in its prediction of the discharge isotopics when compared to 2-D single assembly or 2-D core models. For fissile isotopes such as U-235, Pu-239, and Pu-241 the relative differences were 1.91%, 1.88%, and 0.59%, respectively. While this difference may not seem large, it translates to mass differences on the order of tens of grams per assembly, which may be significant for the purposes of accounting of special nuclear material. (authors)
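
    The relaxation factors mentioned above amount to damped fixed-point iteration: instead of accepting each new coupled solution outright, the scheme blends it with the previous iterate as x_new = x_old + omega * (x_solver - x_old), 0 < omega < 1. The toy update below, which oscillates and diverges under plain iteration but converges when damped, stands in for the coupled flux/isotopics/thermal-hydraulics solve.

```python
def relaxed_iteration(update, x0, omega, tol=1e-10, max_iter=1000):
    """Fixed-point iteration damped by an under-relaxation factor omega."""
    x = x0
    for i in range(1, max_iter + 1):
        x_new = x + omega * (update(x) - x)
        if abs(x_new - x) < tol:
            return x_new, i
        x = x_new
    return x, max_iter

# Toy update with slope -1.5 at its fixed point x* = 2.0: plain iteration
# (omega = 1) oscillates divergently, while omega = 0.5 converges quickly.
update = lambda x: -1.5 * x + 5.0
x, iters = relaxed_iteration(update, x0=0.0, omega=0.5)
print(round(x, 6), iters)
```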

  16. Surprise Realistic Mock Disasters—The Most Effective Means of Disaster Training

    PubMed Central

    Campanale, Ralph P.

    1964-01-01

    Realism introduced in several large scale surprise mock-disaster tests proved to be a real challenge to a disaster-conscious hospital staff that had previously undergone fairly extensive disaster training and testing utilizing conventional methods. Serious weaknesses, flaws, omissions and deficiencies in disaster capability were dramatically and conclusively revealed by use of what appeared to be a “live” disaster setting with smoke, fire, explosions; adverse weather and light conditions; realistically-simulated “casualties” especially prepared not only to look but to act the part; and selected harassment incidents from well-documented disasters, such as utility failures, an automobile accident on the main access route, overload of the telephone switchboard, and invasion of the hospital and disaster site by distraught relatives and the morbidly curious. PMID:14232161

  17. Development of on-line monitoring system for Nuclear Power Plant (NPP) using neuro-expert, noise analysis, and modified neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Subekti, M.; Center for Development of Reactor Safety Technology, National Nuclear Energy Agency of Indonesia, Puspiptek Complex BO.80, Serpong-Tangerang, 15340; Ohno, T.

    2006-07-01

    The neuro-expert approach was utilized in previous monitoring-system research for a Pressurized Water Reactor (PWR). The present research improves the monitoring system by combining the neuro-expert, conventional noise analysis, and modified neural networks to extend its capability. The parallel application of these methods required a distributed computer-network architecture to perform real-time tasks. The research aimed to improve the previous monitoring system, which could detect sensor degradation, and to perform a monitoring demonstration on the High Temperature Engineering Test Reactor (HTTR). The monitoring system under development, based on methods that have been tested using data from an online PWR simulator as well as RSG-GAS (a 30 MW research reactor in Indonesia), will be applied in the HTTR for more complex monitoring. (authors)

  18. A Movement-Assisted Deployment of Collaborating Autonomous Sensors for Indoor and Outdoor Environment Monitoring

    PubMed Central

    Niewiadomska-Szynkiewicz, Ewa; Sikora, Andrzej; Marks, Michał

    2016-01-01

Using mobile robots or unmanned vehicles to assist the optimal deployment of wireless sensors in a working space can significantly enhance the capability to investigate unknown environments. This paper addresses the application of numerical optimization and computer simulation techniques to the on-line calculation of a wireless sensor network topology for monitoring and tracking purposes. We focus on the design of a self-organizing and collaborative mobile network that enables continuous data transmission to the data sink (base station) and automatically adapts its behavior to changes in the environment to achieve a common goal. The pre-defined and self-configuring approaches to the mobile-based deployment of sensors are compared and discussed. A family of novel algorithms for the optimal placement of mobile wireless devices for permanent monitoring of indoor and outdoor dynamic environments is described. They employ a network-connectivity-maintaining mobility model utilizing the concept of the virtual potential function for calculating the motion trajectories of platforms carrying sensors. Their quality and utility have been justified through simulation experiments and are discussed in the final part of the paper. PMID:27649186
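The abstract gives no closed form for the virtual potential function; purely as an illustration of the connectivity-maintaining idea, the sketch below uses an assumed piecewise-linear force law (all function names, gains, and thresholds are hypothetical, not taken from the paper):

```python
import math

def virtual_force(p, q, d_comm=10.0, d_safe=4.0, k=1.0):
    """Pairwise virtual-potential force on node p from node q:
    attractive once the pair approaches the communication range limit,
    repulsive when closer than a safety distance, zero in between."""
    dx, dy = q[0] - p[0], q[1] - p[1]
    d = math.hypot(dx, dy)
    if d == 0.0 or d_safe <= d <= d_comm:
        return (0.0, 0.0)
    mag = k * (d - d_comm) if d > d_comm else -k * (d_safe - d)
    return (mag * dx / d, mag * dy / d)

def step(positions, dt=0.1):
    """One motion step: move every node along its net virtual force."""
    forces = []
    for i, p in enumerate(positions):
        fx = fy = 0.0
        for j, q in enumerate(positions):
            if i != j:
                f = virtual_force(p, q)
                fx, fy = fx + f[0], fy + f[1]
        forces.append((fx, fy))
    return [(p[0] + f[0] * dt, p[1] + f[1] * dt)
            for p, f in zip(positions, forces)]
```

Nodes drifting beyond the assumed communication range pull back together, while crowded nodes push apart, so the swarm spreads for coverage without losing connectivity.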

  19. A Movement-Assisted Deployment of Collaborating Autonomous Sensors for Indoor and Outdoor Environment Monitoring.

    PubMed

    Niewiadomska-Szynkiewicz, Ewa; Sikora, Andrzej; Marks, Michał

    2016-09-14

Using mobile robots or unmanned vehicles to assist the optimal deployment of wireless sensors in a working space can significantly enhance the capability to investigate unknown environments. This paper addresses the application of numerical optimization and computer simulation techniques to the on-line calculation of a wireless sensor network topology for monitoring and tracking purposes. We focus on the design of a self-organizing and collaborative mobile network that enables continuous data transmission to the data sink (base station) and automatically adapts its behavior to changes in the environment to achieve a common goal. The pre-defined and self-configuring approaches to the mobile-based deployment of sensors are compared and discussed. A family of novel algorithms for the optimal placement of mobile wireless devices for permanent monitoring of indoor and outdoor dynamic environments is described. They employ a network-connectivity-maintaining mobility model utilizing the concept of the virtual potential function for calculating the motion trajectories of platforms carrying sensors. Their quality and utility have been justified through simulation experiments and are discussed in the final part of the paper.

  20. Visual and motion cueing in helicopter simulation

    NASA Technical Reports Server (NTRS)

    Bray, R. S.

    1985-01-01

Early experience in fixed-cockpit simulators, with limited field of view, demonstrated the basic difficulties of simulating helicopter flight at the level of subjective fidelity required for confident evaluation of vehicle characteristics. More recent programs, utilizing large-amplitude cockpit motion and a multiwindow visual-simulation system, have received a much higher degree of pilot acceptance. However, none of these simulations has presented critical visual-flight tasks that have been accepted by the pilots as the full equivalent of flight. In this paper, the visual cues presented in the simulator are compared with those of flight in an attempt to identify deficiencies that contribute significantly to these assessments. For the low-amplitude maneuvering tasks normally associated with the hover mode, the unique motion capabilities of the Vertical Motion Simulator (VMS) at Ames Research Center permit nearly a full representation of vehicle motion. Especially appreciated in these tasks are the vertical-acceleration responses to collective control. For larger-amplitude maneuvering, motion fidelity must suffer diminution through direct attenuation, through high-pass filtering (washout) of the computed cockpit accelerations, or both. Experiments were conducted in an attempt to determine the effects of these distortions on pilot performance of height-control tasks.
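The attenuation-versus-washout trade-off described above can be sketched with a gain plus a first-order high-pass filter applied to the computed accelerations; the gain and time constant below are illustrative, not the VMS values:

```python
def washout_highpass(accel, dt=0.01, tau=2.0, gain=0.5):
    """Classical washout sketch: attenuate, then first-order high-pass
    filter, a stream of computed cockpit accelerations so the motion
    base passes onset cues but drifts back toward neutral."""
    out = []
    y = x_prev = 0.0
    a = tau / (tau + dt)          # discrete high-pass coefficient
    for x in accel:
        xs = gain * x             # direct attenuation
        y = a * (y + xs - x_prev) # high-pass: transients pass, DC decays
        x_prev = xs
        out.append(y)
    return out
```

A sustained acceleration command washes out toward zero while its onset is reproduced (scaled), which is exactly the class of distortion whose effect on height-control performance the experiments probed.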

  1. Knowledge-based computer systems for radiotherapy planning.

    PubMed

    Kalet, I J; Paluszynski, W

    1990-08-01

Radiation therapy is one of the first areas of clinical medicine to utilize computers in support of routine clinical decision making. The role of the computer has evolved from simple dose calculations to elaborate interactive graphic three-dimensional simulations. These simulations can combine external irradiation from megavoltage photons, electrons, and particle beams with interstitial and intracavitary sources. With the flexibility and power of modern radiotherapy equipment, and computer programs able to simulate anything the machinery can do, we now face the challenge of utilizing this capability to design more effective radiation treatments. How can we manage the increased complexity of sophisticated treatment planning? A promising approach will be to use artificial intelligence techniques to systematize our present knowledge about design of treatment plans, and to provide a framework for developing new treatment strategies. Far from replacing the physician, physicist, or dosimetrist, artificial intelligence-based software tools can assist the treatment planning team in producing more powerful and effective treatment plans. Research in progress using knowledge-based (AI) programming in treatment planning has already indicated the usefulness of such concepts as rule-based reasoning, hierarchical organization of knowledge, and reasoning from prototypes. Problems to be solved include how to handle continuously varying parameters and how to evaluate plans in order to direct improvements.

  2. SRMS History, Evolution and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Jorgensen, Glenn; Bains, Elizabeth

    2011-01-01

Early in the development of the Space Shuttle, it became clear that NASA needed a method of deploying and retrieving payloads from the payload bay. The Shuttle Remote Manipulator System (SRMS) was developed to fill this need. The 50-foot-long robotic arm is an anthropomorphic design consisting of three electromechanical joints, six degrees of freedom, and two boom segments. Its composite boom construction provided the lightweight solution needed for space operations. Additionally, a method of capturing payloads with the arm was required, and a unique End Effector was developed using an electromechanical snare mechanism. The SRMS is operated using a Displays and Controls Panel and hand controllers located within the aft crew compartment of the shuttle. Although the SRMS was originally conceived to deploy and retrieve payloads, its generic capabilities allowed it to perform many functions beyond its original conception. Over the years it has been used for deploying and retrieving constrained and free-flying payloads, maneuvering and supporting EVA astronauts, satellite repair, International Space Station construction, and as a viewing aid for on-orbit International Space Station operations. After the Columbia accident, a robotically compatible Orbiter Boom Sensor System (OBSS) was developed and used in conjunction with the SRMS to scan the Thermal Protection System (TPS) of the shuttle. These scans ensure there is not a breach of the TPS prior to shuttle re-entry. Ground operations and pre-mission simulation, analysis, and planning played a major role in the success of the SRMS program. A Systems Engineering Simulator (SES) was developed to provide a utility complementary to open-loop engineering simulations. This system provided a closed-loop, real-time, pilot-driven simulation giving visual feedback, display and control panel interaction, and integration with other vehicle systems, such as GN&C. It has been useful for many more applications than traditional training. Evolution of the simulations, guided by the Math Model Working Group, showed the utility of input from multiple modeling groups with a structured forum for discussion. There were many unique development challenges in the areas of hardware, software, certification, modeling, and simulation. Over the years, upgrades and enhancements were implemented to increase the capability, performance, and safety of the SRMS. The history and evolution of the SRMS program provided many lessons learned that can be used for future space robotic systems.

  3. Passive Microfluidic device for Sub Millisecond Mixing

    PubMed Central

    McMahon, Jay; Mohamed, Hisham; Barnard, David; Shaikh, Tanvir R.; Mannella, Carmen A.; Wagenknecht, Terence; Lu, Toh-Ming

    2009-01-01

We report the investigation of a novel microfluidic mixing device to achieve submillisecond mixing. The micromixer combines two fluid streams of several microliters per second into a mixing compartment integrated with two T-type premixers and four butterfly-shaped in-channel mixing elements. We have employed three-dimensional fluidic simulations to evaluate the mixing efficiency, and have constructed physical devices utilizing conventional microfabrication techniques. The simulation indicated thorough mixing at a flow rate as low as 6 µL/s. The corresponding mean residence time is 0.44 ms for 90% of the particles simulated, or 0.49 ms for 95% of the particles simulated, respectively. The mixing efficiency of the physical device was also evaluated using fluorescein dye solutions and FluoSphere red nanoparticle suspensions. The constructed micromixers achieved thorough mixing at the same flow rate of 6 µL/s, with mixing indices of 96% ± 1% and 98% ± 1% for the dye and the nanoparticles, respectively. The experimental results are consistent with the simulation data. The device demonstrated promising capabilities for time-resolved studies of the dynamics of biological macromolecules. PMID:20161619

  4. Integrated dynamic analysis simulation of space stations with controllable solar array

    NASA Technical Reports Server (NTRS)

    Heinrichs, J. A.; Fee, J. J.

    1972-01-01

    A methodology is formulated and presented for the integrated structural dynamic analysis of space stations with controllable solar arrays and non-controllable appendages. The structural system flexibility characteristics are considered in the dynamic analysis by a synthesis technique whereby free-free space station modal coordinates and cantilever appendage coordinates are inertially coupled. A digital simulation of this analysis method is described and verified by comparison of interaction load solutions with other methods of solution. Motion equations are simulated for both the zero gravity and artificial gravity (spinning) orbital conditions. Closed loop controlling dynamics for both orientation control of the arrays and attitude control of the space station are provided in the simulation by various generic types of controlling systems. The capability of the simulation as a design tool is demonstrated by utilizing typical space station and solar array structural representations and a specific structural perturbing force. Response and interaction load solutions are presented for this structural configuration and indicate the importance of using an integrated type analysis for the predictions of structural interactions.

  5. Threat radar system simulations

    NASA Astrophysics Data System (ADS)

    Miller, L.

    The capabilities, requirements, and goals of radar emitter simulators are discussed. Simulators are used to evaluate competing receiver designs, to quantify the performance envelope of a radar system, and to model the characteristics of a transmitted signal waveform. A database of candidate threat systems is developed and, in concert with intelligence data on a given weapons system, permits upgrading simulators to new projected threat capabilities. Four currently available simulation techniques are summarized, noting the usefulness of developing modular software for fast controlled-cost upgrades of simulation capabilities.

  6. LCP- LIFETIME COST AND PERFORMANCE MODEL FOR DISTRIBUTED PHOTOVOLTAIC SYSTEMS

    NASA Technical Reports Server (NTRS)

    Borden, C. S.

    1994-01-01

    The Lifetime Cost and Performance (LCP) Model was developed to assist in the assessment of Photovoltaic (PV) system design options. LCP is a simulation of the performance, cost, and revenue streams associated with distributed PV power systems. LCP provides the user with substantial flexibility in specifying the technical and economic environment of the PV application. User-specified input parameters are available to describe PV system characteristics, site climatic conditions, utility purchase and sellback rate structures, discount and escalation rates, construction timing, and lifetime of the system. Such details as PV array orientation and tilt angle, PV module and balance-of-system performance attributes, and the mode of utility interconnection are user-specified. LCP assumes that the distributed PV system is utility grid interactive without dedicated electrical storage. In combination with a suitable economic model, LCP can provide an estimate of the expected net present worth of a PV system to the owner, as compared to electricity purchased from a utility grid. Similarly, LCP might be used to perform sensitivity analyses to identify those PV system parameters having significant impact on net worth. The user describes the PV system configuration to LCP via the basic electrical components. The module is the smallest entity in the PV system which is modeled. A PV module is defined in the simulation by its short circuit current, which varies over the system lifetime due to degradation and failure. Modules are wired in series to form a branch circuit. Bypass diodes are allowed between modules in the branch circuits. Branch circuits are then connected in parallel to form a bus. A collection of buses is connected in parallel to form an increment to capacity of the system. By choosing the appropriate series-parallel wiring design, the user can specify the current, voltage, and reliability characteristics of the system. 
LCP simulation of system performance is site-specific and follows a three-step procedure. First the hourly power produced by the PV system is computed using a selected year's insolation and temperature profile. For this step it is assumed that there are no module failures or degradation. Next, the monthly simulation is performed involving a month to month progression through the lifetime of the system. In this step, the effects of degradation, failure, dirt accumulation and operations/maintenance efforts on PV system performance over time are used to compute the monthly power capability fraction. The resulting monthly power capability fractions are applied to the hourly power matrix from the first step, giving the anticipated hourly energy output over the lifetime of the system. PV system energy output is compared with the PV system owner's electricity demand for each hour. The amount of energy to be purchased from or sold to the utility grid is then determined. Monthly expenditures on the PV system and the purchase of electricity from the utility grid are also calculated. LCP generates output reports pertaining to the performance of the PV system, and system costs and revenues. The LCP model, written in SIMSCRIPT 2.5 for batch execution on an IBM 370 series computer, was developed in 1981.
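The core of the procedure above — applying a degradation-derived monthly power capability fraction to the no-failure hourly output, then splitting each hour into grid purchases and sellback — can be sketched as follows (a simplified stand-in for LCP's SIMSCRIPT implementation; the function and parameter names are hypothetical):

```python
def monthly_energy_balance(hourly_power_kw, capability_fraction,
                           hourly_demand_kw):
    """Scale the no-failure hourly PV output by the month's capability
    fraction, compare it with the owner's hourly demand, and tally the
    energy bought from and sold to the utility grid (kWh)."""
    bought_kwh = sold_kwh = 0.0
    for p, d in zip(hourly_power_kw, hourly_demand_kw):
        out = p * capability_fraction   # degraded PV output this hour
        if out >= d:
            sold_kwh += out - d         # surplus flows to the grid
        else:
            bought_kwh += d - out       # shortfall bought from grid
    return bought_kwh, sold_kwh
```

The monthly bought/sold totals would then be priced with the user-specified purchase and sellback rate structures to build the revenue streams LCP reports.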

  7. An improved artificial bee colony algorithm based on balance-evolution strategy for unmanned combat aerial vehicle path planning.

    PubMed

    Li, Bai; Gong, Li-gang; Yang, Wen-lun

    2014-01-01

    Unmanned combat aerial vehicles (UCAVs) have been of great interest to military organizations throughout the world due to their outstanding capabilities to operate in dangerous or hazardous environments. UCAV path planning aims to obtain an optimal flight route with the threats and constraints in the combat field well considered. In this work, a novel artificial bee colony (ABC) algorithm improved by a balance-evolution strategy (BES) is applied in this optimization scheme. In this new algorithm, convergence information during the iteration is fully utilized to manipulate the exploration/exploitation accuracy and to pursue a balance between local exploitation and global exploration capabilities. Simulation results confirm that BE-ABC algorithm is more competent for the UCAV path planning scheme than the conventional ABC algorithm and two other state-of-the-art modified ABC algorithms.
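For readers unfamiliar with the baseline algorithm, a minimal plain-ABC minimization loop is sketched below; it deliberately omits the paper's balance-evolution strategy and the onlooker-bee probability weighting, and every name and parameter value is illustrative:

```python
import random

def abc_minimize(f, dim, bounds, n_food=10, limit=20, iters=300, seed=1):
    """Baseline artificial bee colony: each food source is perturbed in
    one random dimension toward a random neighbour, kept greedily if it
    improves, and re-seeded by a scout after `limit` failed trials.
    The best solution ever seen is memorised separately."""
    rng = random.Random(seed)
    lo, hi = bounds
    foods = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_food)]
    fits = [f(x) for x in foods]
    trials = [0] * n_food
    i0 = min(range(n_food), key=fits.__getitem__)
    best_x, best_f = foods[i0][:], fits[i0]
    for _ in range(iters):
        for i in range(n_food):
            k = rng.choice([j for j in range(n_food) if j != i])
            j = rng.randrange(dim)
            cand = foods[i][:]
            phi = rng.uniform(-1.0, 1.0)
            cand[j] = min(hi, max(lo, cand[j] + phi * (cand[j] - foods[k][j])))
            fc = f(cand)
            if fc < fits[i]:                  # greedy selection
                foods[i], fits[i], trials[i] = cand, fc, 0
                if fc < best_f:
                    best_x, best_f = cand[:], fc
            else:
                trials[i] += 1
                if trials[i] > limit:         # scout: abandon stagnant source
                    foods[i] = [rng.uniform(lo, hi) for _ in range(dim)]
                    fits[i], trials[i] = f(foods[i]), 0
    return best_x, best_f
```

The BE-ABC variant modifies how the perturbation magnitude and abandonment behave using convergence information gathered during the iterations.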

  8. Design of plate directional heat transmission structure based on layered thermal metamaterials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, L. K.; Yu, Z. F.; Huang, J., E-mail: slk-0-1999@163.com

    2016-02-15

Invisibility cloaks based on transformation optics are often closed structures; however, such a structure limits the kinds of objects that can be placed in the cloak. In this work, we adopt a transformation thermodynamics approach to design an “open cloak”, called a plate directional heat transmission structure, which is capable of guiding heat fluxes to the flank region of the metamaterial device. The most fascinating and unique feature of the device is that the lower surface can remain at a lower temperature compared with the SiO2 aerogel thermal insulation material. Our results are expected to markedly enhance capabilities in thermal protection, thermal-energy utilization, and domains beyond. In addition to the theoretical analysis, the present design is demonstrated in numerical simulations based on finite element calculations.

  9. High Altitude Long Endurance UAV Analysis Model Development and Application Study Comparing Solar Powered Airplane and Airship Station-Keeping Capabilities

    NASA Technical Reports Server (NTRS)

    Ozoroski, Thomas A.; Nickol, Craig L.; Guynn, Mark D.

    2015-01-01

    There have been ongoing efforts in the Aeronautics Systems Analysis Branch at NASA Langley Research Center to develop a suite of integrated physics-based computational utilities suitable for modeling and analyzing extended-duration missions carried out using solar powered aircraft. From these efforts, SolFlyte has emerged as a state-of-the-art vehicle analysis and mission simulation tool capable of modeling both heavier-than-air (HTA) and lighter-than-air (LTA) vehicle concepts. This study compares solar powered airplane and airship station-keeping capability during a variety of high altitude missions, using SolFlyte as the primary analysis component. Three Unmanned Aerial Vehicle (UAV) concepts were designed for this study: an airplane (Operating Empty Weight (OEW) = 3285 kilograms, span = 127 meters, array area = 450 square meters), a small airship (OEW = 3790 kilograms, length = 115 meters, array area = 570 square meters), and a large airship (OEW = 6250 kilograms, length = 135 meters, array area = 1080 square meters). All the vehicles were sized for payload weight and power requirements of 454 kilograms and 5 kilowatts, respectively. Seven mission sites distributed throughout the United States were selected to provide a basis for assessing the vehicle energy budgets and site-persistent operational availability. Seasonal, 30-day duration missions were simulated at each of the sites during March, June, September, and December; one-year duration missions were simulated at three of the sites. Atmospheric conditions during the simulated missions were correlated to National Climatic Data Center (NCDC) historical data measurements at each mission site, at four flight levels. Unique features of the SolFlyte model are described, including methods for calculating recoverable and energy-optimal flight trajectories and the effects of shadows on solar energy collection. 
Results of this study indicate that: 1) the airplane concept attained longer periods of on-site capability than either airship concept, and 2) the airship concepts can attain higher levels of energy collection and storage than the airplane concept; however, attaining these energy benefits requires adverse design trades of reduced performance (small airship) or excessive solar array area (large airship).

  10. Accessing defect dynamics using intense, nanosecond pulsed ion beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Persaud, A.; Barnard, J. J.; Guo, H.

    2015-06-18

Gaining in-situ access to relaxation dynamics of radiation induced defects will lead to a better understanding of materials and is important for the verification of theoretical models and simulations. We show preliminary results from experiments at the new Neutralized Drift Compression Experiment (NDCX-II) at Lawrence Berkeley National Laboratory that will enable in-situ access to defect dynamics through pump-probe experiments. Here, the unique capabilities of the NDCX-II accelerator to generate intense, nanosecond pulsed ion beams are utilized. Preliminary data from channeling experiments using lithium and potassium ions and silicon membranes are shown. We compare these data to simulation results using Crystal TRIM. Furthermore, we discuss upgrades of the accelerator to higher performance levels and the new diagnostic tools that are being incorporated.

  11. Optimal mapping of irregular finite element domains to parallel processors

    NASA Technical Reports Server (NTRS)

    Flower, J.; Otto, S.; Salama, M.

    1987-01-01

    Mapping the solution domain of n-finite elements into N-subdomains that may be processed in parallel by N-processors is an optimal one if the subdomain decomposition results in a well-balanced workload distribution among the processors. The problem is discussed in the context of irregular finite element domains as an important aspect of the efficient utilization of the capabilities of emerging multiprocessor computers. Finding the optimal mapping is an intractable combinatorial optimization problem, for which a satisfactory approximate solution is obtained here by analogy to a method used in statistical mechanics for simulating the annealing process in solids. The simulated annealing analogy and algorithm are described, and numerical results are given for mapping an irregular two-dimensional finite element domain containing a singularity onto the Hypercube computer.
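A minimal sketch of the annealing idea, assuming a load-variance cost and single-element reassignment moves (the actual mapping cost in such work also accounts for inter-processor communication, which is omitted here; all names and parameters are illustrative):

```python
import math
import random

def anneal_mapping(weights, n_proc, iters=5000, t0=1.0, alpha=0.999, seed=0):
    """Assign weighted elements to processors by simulated annealing:
    minimise the squared deviation of per-processor workload from the
    mean, accepting worse moves with the usual Boltzmann probability."""
    rng = random.Random(seed)
    assign = [rng.randrange(n_proc) for _ in weights]
    loads = [0.0] * n_proc
    for w, p in zip(weights, assign):
        loads[p] += w
    mean = sum(weights) / n_proc

    def cost():
        return sum((l - mean) ** 2 for l in loads)

    c = cost()
    for it in range(iters):
        t = t0 * alpha ** it                 # geometric cooling schedule
        i = rng.randrange(len(weights))
        new = rng.randrange(n_proc)
        old = assign[i]
        if new == old:
            continue
        loads[old] -= weights[i]
        loads[new] += weights[i]
        c_new = cost()
        if c_new < c or rng.random() < math.exp((c - c_new) / t):
            assign[i], c = new, c_new        # accept the move
        else:                                # reject: undo the move
            loads[old] += weights[i]
            loads[new] -= weights[i]
    return assign, loads
```

At high temperature the mapping wanders freely; as the schedule cools, only improving moves survive, mimicking the physical annealing process the paper exploits.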

  12. Simulated imaging properties of a series of magnetic electron lenses

    NASA Technical Reports Server (NTRS)

    Kory, Carol L.

    1995-01-01

    The paraxial lens data were determined for a series of symmetrical magnetic lenses of equal lens diameter but variable air gap width for a wide range of lens excitations using the three-dimensional electrodynamic computer code MAFIA. The results are compared with a similar study done by Liebman and Grad wherein the field distributions within the lenses were measured experimentally with a resistance network analogue. Using these fields the lens data were obtained through numerical trajectory tracing. The utility of using MAFIA, instead of experimental methods for lens design is shown by the excellent agreement of the simulated results compared to experiment. Also demonstrated is the capability of using MAFIA to investigate aberration sources such as higher order off-axis magnetic field and space-charge effects.

  13. Communication: Improved ab initio molecular dynamics by minimally biasing with experimental data

    NASA Astrophysics Data System (ADS)

    White, Andrew D.; Knight, Chris; Hocky, Glen M.; Voth, Gregory A.

    2017-01-01

Accounting for electrons and nuclei simultaneously is a powerful capability of ab initio molecular dynamics (AIMD). However, AIMD is often unable to accurately reproduce properties of systems such as water due to inaccuracies in the underlying electronic density functionals. This shortcoming is often addressed by adding empirical corrections and/or increasing the simulation temperature. We present here a maximum-entropy approach to directly incorporate limited experimental data via a minimal bias. Biased AIMD simulations of water and an excess proton in water are shown to give significantly improved properties both for observables which were biased to match experimental data and for unbiased observables. This approach also yields new physical insight into inaccuracies in the underlying density functional theory as utilized in the unbiased AIMD.
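The paper's maximum-entropy machinery is not reproduced here; purely as a toy of the minimal-bias idea, the sketch below adapts the strength of a linear bias on a Gaussian "observable" until its smoothed simulated average matches an experimental target (this experiment-directed-simulation-style update rule and all parameters are assumed, not the paper's implementation):

```python
import random

def bias_to_target(target, steps=20000, rate=0.02, beta=0.99, seed=2):
    """Adapt a linear bias strength `lmbda` on the fly so the sampled
    observable's average is driven toward the experimental `target`."""
    rng = random.Random(seed)
    lmbda = ema = total = 0.0
    for _ in range(steps):
        x = rng.gauss(lmbda, 1.0)            # biased ensemble: mean shifts by lmbda
        total += x
        ema = beta * ema + (1.0 - beta) * x  # smoothed running observable
        lmbda += rate * (target - ema)       # strengthen/weaken the bias
    return total / steps, lmbda
```

Because the bias is a single linear coupling, it perturbs the sampled distribution as little as possible while still matching the target average, which is the spirit of the maximum-entropy argument.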

  14. Communication: Improved ab initio molecular dynamics by minimally biasing with experimental data.

    PubMed

    White, Andrew D; Knight, Chris; Hocky, Glen M; Voth, Gregory A

    2017-01-28

Accounting for electrons and nuclei simultaneously is a powerful capability of ab initio molecular dynamics (AIMD). However, AIMD is often unable to accurately reproduce properties of systems such as water due to inaccuracies in the underlying electronic density functionals. This shortcoming is often addressed by adding empirical corrections and/or increasing the simulation temperature. We present here a maximum-entropy approach to directly incorporate limited experimental data via a minimal bias. Biased AIMD simulations of water and an excess proton in water are shown to give significantly improved properties both for observables which were biased to match experimental data and for unbiased observables. This approach also yields new physical insight into inaccuracies in the underlying density functional theory as utilized in the unbiased AIMD.

  15. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGES

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. They present Synergia's design principles and its performance on HPC platforms.

  16. Status of simulation in health care education: an international survey

    PubMed Central

    Qayumi, Karim; Pachev, George; Zheng, Bin; Ziv, Amitai; Koval, Valentyna; Badiei, Sadia; Cheng, Adam

    2014-01-01

    Simulation is rapidly penetrating the terrain of health care education and has gained growing acceptance as an educational method and patient safety tool. Despite this, the state of simulation in health care education has not yet been evaluated on a global scale. In this project, we studied the global status of simulation in health care education by determining the degree of financial support, infrastructure, manpower, information technology capabilities, engagement of groups of learners, and research and scholarly activities, as well as the barriers, strengths, opportunities for growth, and other aspects of simulation in health care education. We utilized a two-stage process, including an online survey and a site visit that included interviews and debriefings. Forty-two simulation centers worldwide participated in this study, the results of which show that despite enormous interest and enthusiasm in the health care community, use of simulation in health care education is limited to specific areas and is not a budgeted item in many institutions. Absence of a sustainable business model, as well as sufficient financial support in terms of budget, infrastructure, manpower, research, and scholarly activities, slows down the movement of simulation. Specific recommendations are made based on current findings to support simulation in the next developmental stages. PMID:25489254

  17. Toxic release consequence analysis tool (TORCAT) for inherently safer design plant.

    PubMed

    Shariff, Azmi Mohd; Zaini, Dzulkarnain

    2010-10-15

Major toxic release accidents in the past have caused many fatalities, such as the tragedy of the MIC release in Bhopal, India (1984). One approach is the inherently safer design technique, which utilizes the inherent safety principle to eliminate or minimize accidents rather than to control the hazard. This technique is best implemented in the preliminary design stage, where the consequence of a toxic release can be evaluated and the necessary design improvements implemented to eliminate or minimize accidents to as low as reasonably practicable (ALARP) without resorting to costly protective systems. However, no commercial tool with such capability is currently available. This paper reports preliminary findings on the development of a prototype tool for consequence analysis and design improvement via the inherent safety principle, integrating a process design simulator with a toxic release consequence analysis model. Consequence analyses based on worst-case scenarios during the process flowsheeting stage were conducted as case studies. The preliminary findings show that the toxic release consequence analysis tool (TORCAT) has the capability to eliminate or minimize potential toxic release accidents by adopting the inherent safety principle early in the preliminary design stage.

  18. Integrating surrogate models into subsurface simulation framework allows computation of complex reactive transport scenarios

    NASA Astrophysics Data System (ADS)

    De Lucia, Marco; Kempka, Thomas; Jatnieks, Janis; Kühn, Michael

    2017-04-01

Reactive transport simulations - where geochemical reactions are coupled with hydrodynamic transport of reactants - are extremely time consuming and suffer from significant numerical issues. Given the high uncertainties inherently associated with the geochemical models, which also constitute the major computational bottleneck, such computational demands may seem disproportionate and probably constitute the main limitation to the wide application of reactive transport simulations. A promising way to ease and speed up such coupled simulations is to employ statistical surrogates instead of "full-physics" geochemical models [1]. Data-driven surrogates are reduced models trained on a set of pre-calculated "full physics" simulations, capturing their principal features while being extremely fast to compute. Model reduction of course comes at the price of a loss of precision; however, this appears justified in the presence of large uncertainties regarding the parametrization of geochemical processes. This contribution illustrates the integration of surrogates into the flexible simulation framework currently being developed by the authors' research group [2]. The high-level language of choice for obtaining and dealing with surrogate models is R, which profits from state-of-the-art methods for statistical analysis of large simulation ensembles. A stand-alone advective mass transport module was furthermore developed in order to add such capability to any multiphase finite volume hydrodynamic simulator within the simulation framework. We present 2D and 3D case studies benchmarking the performance of surrogates and "full physics" chemistry in scenarios pertaining to the assessment of geological subsurface utilization. [1] Jatnieks, J., De Lucia, M., Dransch, D., Sips, M.: "Data-driven surrogate model approach for improving the performance of reactive transport simulations.", Energy Procedia 97, 2016, p. 447-453.
[2] Kempka, T., Nakaten, B., De Lucia, M., Nakaten, N., Otto, C., Pohl, M., Chabab [Tillner], E., Kühn, M.: "Flexible Simulation Framework to Couple Processes in Complex 3D Models for Subsurface Utilization Assessment.", Energy Procedia, 97, 2016 p. 494-501.
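
    The surrogate idea described above can be sketched in a few lines: an expensive "full-physics" model is evaluated on a pre-calculated ensemble, and a cheap data-driven fit replaces it at prediction time. The abstract's actual surrogates are built in R; the NumPy polynomial fit and the toy geochemical function below are purely illustrative assumptions.

```python
import numpy as np

# Hypothetical stand-in for an expensive "full-physics" geochemical model:
# an Arrhenius-like equilibrium concentration as a function of temperature (deg C).
def full_physics(T):
    return 10.0 * np.exp(-1500.0 / (T + 273.15))

# Pre-calculated ensemble of full-physics runs (the surrogate's training set).
T_train = np.linspace(10.0, 90.0, 30)
c_train = full_physics(T_train)

# Data-driven surrogate: a low-order polynomial fitted to the ensemble.
surrogate = np.poly1d(np.polyfit(T_train, c_train, deg=3))

# Evaluating the surrogate is a cheap polynomial, not a geochemical solver,
# at the price of a (here very small) precision loss.
T_test = np.array([25.0, 55.0, 75.0])
rel_err = np.abs(surrogate(T_test) - full_physics(T_test)) / full_physics(T_test)
print("max relative error:", rel_err.max())
```

In a coupled simulation, the surrogate call would replace the geochemistry solve inside every cell of every transport step, which is where the claimed speed-up comes from.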

  19. Evolutionary Design of a Robotic Material Defect Detection System

    NASA Technical Reports Server (NTRS)

    Ballard, Gary; Howsman, Tom; Craft, Mike; ONeil, Daniel; Steincamp, Jim; Howell, Joe T. (Technical Monitor)

    2002-01-01

    During the post-flight inspection of Space Shuttle Main Engines (SSMEs), several inaccessible regions must be disassembled to inspect for defects such as cracks, scratches, and gouges. An improvement to the inspection process would be the design and development of very small robots capable of penetrating these inaccessible regions and detecting the defects. The goal of this research was to utilize an evolutionary design approach for the robotic detection of these types of defects. A simulation and visualization tool was developed prior to receiving the hardware as a development test bed. A small, commercial off-the-shelf (COTS) robot was selected from several candidates as the proof-of-concept robot. The basic approach to detecting the defects was to utilize cadmium sulfide (CdS) sensors to detect changes in contrast of an illuminated surface. A neural network, optimally designed utilizing a genetic algorithm, was employed to detect the presence of the defects (cracks). By utilizing the COTS robot and CdS sensors, the research successfully demonstrated that an evolutionarily designed neural network can detect the presence of surface defects.
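
    A genetic algorithm evolving the weights of a small classifier network, as described above, can be sketched as follows. Everything here is invented for illustration: the synthetic "CdS readings" (a crack appears as one locally dark reading), the engineered input features, and the GA parameters are all assumptions, since the abstract does not describe the actual network or sensor data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for CdS photocell readings: 4 contrast values per surface
# patch. A crack (label 1) shows up as one locally dark reading.
def make_samples(n):
    X = rng.uniform(0.7, 1.0, size=(n, 4))
    y = rng.integers(0, 2, size=n)
    cols = rng.integers(0, 4, size=n)
    X[np.arange(n), cols] -= 0.5 * y          # darken one reading for cracks only
    feats = np.column_stack([X.min(1), X.mean(1), X.std(1)])
    return feats, y

def predict(w, feats):
    # Single-layer network: sigmoid of a linear combination of the features.
    z = feats @ w[:-1] + w[-1]
    return (1.0 / (1.0 + np.exp(-z)) > 0.5).astype(int)

def fitness(w, feats, y):
    return (predict(w, feats) == y).mean()    # classification accuracy

# Genetic algorithm over the network's weight vector (3 weights + bias).
feats, y = make_samples(400)
pop = rng.normal(0.0, 1.0, size=(40, 4))      # 40 candidate weight vectors
for gen in range(30):
    scores = np.array([fitness(w, feats, y) for w in pop])
    elite = pop[np.argsort(scores)[-10:]]     # elitism: keep the 10 best
    parents = elite[rng.integers(0, 10, size=(30, 2))]
    children = parents.mean(axis=1)           # crossover: average two parents
    children += rng.normal(0.0, 0.2, children.shape)  # mutation
    pop = np.vstack([elite, children])

best = pop[np.argmax([fitness(w, feats, y) for w in pop])]
print("training accuracy:", fitness(best, feats, y))
```

Because the elite survive unchanged each generation, the best fitness never decreases, which is the property that makes even this crude GA reliable on a separable problem.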

  20. Lessons Learned from Deploying an Analytical Task Management Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.; Welch, Clara; Arceneaux, Joshua; Bulgatz, Dennis; Hunt, Mitch; Young, Stephen

    2007-01-01

    Defining requirements, missions, technologies, and concepts for space exploration involves multiple levels of organizations, teams of people with complementary skills, and analytical models and simulations. Analytical activities range from filling a To-Be-Determined (TBD) in a requirement to creating animations and simulations of exploration missions. In a program as large as returning to the Moon, there are hundreds of simultaneous analysis activities. A way to manage and integrate efforts of this magnitude is to deploy a centralized database that provides the capability to define tasks, identify resources, describe products, schedule deliveries, and generate a variety of reports. This paper describes a web-accessible task management system and explains the lessons learned during the development and deployment of the database. Through the database, managers and team leaders can define tasks, establish review schedules, assign teams, link tasks to specific requirements, identify products, and link the task data records to external repositories that contain the products. Data filters and spreadsheet export utilities provide a powerful capability to create custom reports. Import utilities provide a means to populate the database from previously filled form files. Within a four-month period, a small team analyzed requirements, developed a prototype, conducted multiple system demonstrations, and deployed a working system supporting hundreds of users across the aerospace community. Open-source technologies and agile software development techniques, applied by a skilled team, enabled this impressive achievement. Topics in the paper cover the web application technologies, agile software development, an overview of the system's functions and features, dealing with increasing scope, and deploying new versions of the system.

  1. Science and applications-driven OSSE platform for terrestrial hydrology using NASA Land Information System

    NASA Astrophysics Data System (ADS)

    Kumar, S.; Peters-Lidard, C. D.; Harrison, K.; Santanello, J. A.; Bach Kirschbaum, D.

    2014-12-01

    Observing System Simulation Experiments (OSSEs) are often conducted to evaluate the worth of existing data and data yet to be collected from proposed new missions. As missions increasingly require a broader "Earth systems" focus, it is important that the OSSEs capture the potential benefits of the observations on end-use applications. Towards this end, the results from the OSSEs must also be evaluated with a suite of metrics that capture the value, uncertainty, and information content of the observations while factoring in both science and societal impacts. In this presentation, we present the development of an end-to-end and end-use application oriented OSSE platform using the capabilities of the NASA Land Information System (LIS) developed for terrestrial hydrology. Four case studies that demonstrate the capabilities of the system will be presented: (1) A soil moisture OSSE that employs simulated L-band measurements and examines their impacts towards applications such as floods and droughts. The experiment also uses a decision-theory based analysis to assess the economic utility of observations towards improving drought and flood risk estimates, (2) A GPM-relevant study quantifies the impact of improved precipitation retrievals from GPM towards improving landslide forecasts, (3) A case study that examines the utility of passive microwave soil moisture observations towards weather prediction, and (4) OSSEs used for developing science requirements for the GRACE-2 mission. These experiments also demonstrate the value of a comprehensive modeling environment such as LIS for conducting end-to-end OSSEs by linking satellite observations, physical models, data assimilation algorithms and end-use application models in a single integrated framework.
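
    The decision-theory based utility assessment mentioned in case study (1) can be illustrated with the classic cost-loss model: protect against a flood whenever the forecast probability exceeds the cost/loss ratio. The probabilities, cost, and loss values below are invented for illustration; the study's actual decision model is not specified in the abstract.

```python
# Cost-loss decision sketch: expected expense of a protect/no-protect decision.
# cost = expense of protective action; loss = damage if unprotected and flooded.
def expected_expense(p_flood, protect, cost=1.0, loss=10.0):
    return cost if protect else p_flood * loss

# Hypothetical per-event flood probabilities from an improved observing system.
events = [0.02, 0.05, 0.30, 0.01, 0.60]
base_rate = sum(events) / len(events)          # what a climatology-only user knows

threshold = 1.0 / 10.0                         # optimal rule: protect if p > cost/loss
static = sum(expected_expense(p, base_rate > threshold) for p in events)
informed = sum(expected_expense(p, p > threshold) for p in events)
print("climatology-only expense:", static, " informed expense:", informed)
```

The difference between the two totals is one simple measure of the economic utility of the observations: better per-event probabilities let the user skip protection when risk is low.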

  2. Aviation Safety Program Atmospheric Environment Safety Technologies (AEST) Project

    NASA Technical Reports Server (NTRS)

    Colantonio, Ron

    2011-01-01

    Engine Icing Characterization and Simulation Capability: develop knowledge bases, analysis methods, and simulation tools needed to address the problem of engine icing, in particular ice-crystal icing. Airframe Icing Simulation and Engineering Tool Capability: develop and demonstrate a 3-D capability to simulate and model airframe ice accretion and related aerodynamic performance degradation for current and future aircraft configurations in an expanded icing environment that includes freezing drizzle/rain. Atmospheric Hazard Sensing and Mitigation Technology Capability: improve and expand remote sensing and mitigation of hazardous atmospheric environments and phenomena.

  3. Damage Characterization of EBC-SiC/SiC Ceramic Matrix Composites Under Imposed Thermal Gradient Testing

    NASA Technical Reports Server (NTRS)

    Appleby, Matthew P.; Morscher, Gregory N.; Zhu, Dongming

    2014-01-01

    Due to their high temperature capabilities, Ceramic Matrix Composite (CMC) components are being developed for use in hot-section aerospace engine applications. Harsh engine environments have led to the development of Environmental Barrier Coatings (EBCs) for silicon-based CMCs to further increase thermal and environmental capabilities. This study aims at understanding the damage mechanisms associated with these materials under simulated operating conditions. A high heat-flux laser testing rig capable of imposing large through-thickness thermal gradients by means of controlled laser beam heating and back-side air cooling is used. Tests are performed on uncoated composites, as well as CMC substrates that have been coated with state-of-the-art ceramic EBC systems. Results show that the use of the EBCs may help increase temperature capability and creep resistance by reducing the effects of stressed oxidation and environmental degradation. Also, the ability of electrical resistance (ER) and acoustic emission (AE) measurements to monitor material condition and damage state during high temperature testing is demonstrated, suggesting their usefulness as valuable health monitoring techniques. Micromechanics models are used to describe the localized stress state of the composite system, which is utilized along with ER modeling concepts to develop an electromechanical model capable of characterizing material behavior.

  4. Computer simulation of on-orbit manned maneuvering unit operations

    NASA Technical Reports Server (NTRS)

    Stuart, G. M.; Garcia, K. D.

    1986-01-01

    Simulation of spacecraft on-orbit operations is discussed in reference to Martin Marietta's Space Operations Simulation laboratory's use of computer software models to drive a six-degree-of-freedom moving base carriage and two target gimbal systems. In particular, key simulation issues and related computer software models associated with providing real-time, man-in-the-loop simulations of the Manned Maneuvering Unit (MMU) are addressed with special attention given to how effectively these models and motion systems simulate the MMU's actual on-orbit operations. The weightless effects of the space environment require the development of entirely new devices for locomotion. Since access to space is very limited, it is necessary to design, build, and test these new devices within the physical constraints of earth using simulators. The simulation method that is discussed here is the technique of using computer software models to drive a Moving Base Carriage (MBC) that is capable of providing simultaneous six-degree-of-freedom motions. This method, utilized at Martin Marietta's Space Operations Simulation (SOS) laboratory, provides the ability to simulate the operation of manned spacecraft, provides the pilot with proper three-dimensional visual cues, and allows training of on-orbit operations. The purpose here is to discuss significant MMU simulation issues, the related models that were developed in response to these issues, and how effectively these models simulate the MMU's actual on-orbit operations.

  5. NASA In-Situ Resource Utilization Project-and Seals Challenges

    NASA Technical Reports Server (NTRS)

    Sacksteder, Kurt; Linne, Diane

    2006-01-01

    A viewgraph presentation on NASA's In-Situ Resource Utilization Project and Seals Challenges is shown. The topics include: 1) What Are Space Resources?; 2) Space Resource Utilization for Exploration; 3) ISRU Enables Affordable, Sustainable & Flexible Exploration; 4) Propellant from the Moon Could Revolutionize Space Transportation; 5) NASA ISRU Capability Roadmap Study, 2005; 6) Timeline for ISRU Capability Implementation; 7) Lunar ISRU Implementation Approach; 8) ISRU Technical-to-Mission Capability Roadmap; 9) ISRU Resources & Products of Interest; and 10) Challenging Seals Requirements for ISRU.

  6. LightForce photon-pressure collision avoidance: Efficiency analysis in the current debris environment and long-term simulation perspective

    NASA Astrophysics Data System (ADS)

    Yang Yang, Fan; Nelson, Bron; Aziz, Jonathan; Carlino, Roberto; Dono Perez, Andres; Faber, Nicolas; Foster, Cyrus; Frost, Chad; Henze, Chris; Karacalıoğlu, Arif Göktuğ; Levit, Creon; Marshall, William; Mason, James; O'Toole, Conor; Swenson, Jason; Worden, Simon P.; Stupl, Jan

    2016-09-01

    This work provides an efficiency analysis of the LightForce space debris collision avoidance scheme in the current debris environment and describes a simulation approach to assess its impact on the long-term evolution of the space debris environment. LightForce aims to provide just-in-time collision avoidance by utilizing photon pressure from ground-based industrial lasers. These ground stations impart minimal accelerations to increase the miss distance for a predicted conjunction between two objects. In the first part of this paper we will present research that investigates the short-term effect of a few systems consisting of 20 kW class lasers directed by 1.5 m diameter telescopes using adaptive optics. The results show that such a network of ground stations could mitigate more than 85 percent of conjunctions and lower the expected number of collisions in Low Earth Orbit (LEO) by an order of magnitude. While these are impressive numbers that indicate LightForce's utility in the short term, the remaining 15% of possible collisions contain (among others) conjunctions between two massive objects that would add a large amount of debris if they collide. Still, conjunctions between massive objects and smaller objects can be mitigated. Hence, we choose to expand the capabilities of the simulation software to investigate the overall effect of a network of LightForce stations on the long-term debris evolution. In the second part of this paper, we will present the planned simulation approach for that effort. For the efficiency analysis of collision avoidance in the current debris environment, we utilize a simulation approach that uses the entire Two Line Element (TLE) catalog in LEO for a given day as initial input. These objects are propagated for one year and an all-on-all conjunction analysis is performed. For conjunctions that fall below a range threshold, we calculate the probability of collision and record those values. To assess efficiency, we compare a baseline (without collision avoidance) conjunction analysis with an analysis where LightForce is active. Using that approach, we take into account that collision avoidance maneuvers could have effects on third objects. Performing all-on-all conjunction analyses for extended periods of time requires significant computer resources; hence we implemented this simulation utilizing a highly parallel approach on the NASA Pleiades supercomputer.
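
    The all-on-all screening step described in this abstract can be sketched as a pairwise distance test against a range threshold. The positions below are randomly generated stand-ins (a real analysis would propagate the TLE catalog, e.g. with SGP4), and the 25 km threshold and object count are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical snapshot: n objects placed in a LEO-sized shell (units: km).
n = 200
r = rng.uniform(6771.0, 7371.0, n)            # ~400-1000 km altitude
theta = rng.uniform(0.0, 2.0 * np.pi, n)
phi = rng.uniform(-0.1, 0.1, n)               # thin near-equatorial band
pos = np.column_stack([r * np.cos(theta) * np.cos(phi),
                       r * np.sin(theta) * np.cos(phi),
                       r * np.sin(phi)])

# All-on-all pairwise distances; keep each unordered pair once (upper triangle).
diff = pos[:, None, :] - pos[None, :, :]
dist = np.linalg.norm(diff, axis=-1)
i, j = np.triu_indices(n, k=1)

threshold_km = 25.0
conjunctions = [(a, b, dist[a, b]) for a, b in zip(i, j) if dist[a, b] < threshold_km]
print(len(conjunctions), "candidate conjunctions below", threshold_km, "km")
```

For the full catalog this O(n^2) test is repeated at every time step over a year of propagation, which is why the paper reports needing a highly parallel implementation on Pleiades.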

  7. LightForce photon-pressure collision avoidance: Efficiency analysis in the current debris environment and long-term simulation perspective

    PubMed Central

    Yang, Fan Yang; Nelson, Bron; Aziz, Jonathan; Carlino, Roberto; Perez, Andres Dono; Faber, Nicolas; Foster, Cyrus; Frost, Chad; Henze, Chris; Karacalıoğlu, Arif Göktuğ; Levit, Creon; Marshall, William; Mason, James; O’Toole, Conor; Swenson, Jason; Worden, Simon P.; Stupl, Jan

    2017-01-01

    This work provides an efficiency analysis of the LightForce space debris collision avoidance scheme in the current debris environment and describes a simulation approach to assess its impact on the long-term evolution of the space debris environment. LightForce aims to provide just-in-time collision avoidance by utilizing photon pressure from ground-based industrial lasers. These ground stations impart minimal accelerations to increase the miss distance for a predicted conjunction between two objects. In the first part of this paper we will present research that investigates the short-term effect of a few systems consisting of 20 kW class lasers directed by 1.5 m diameter telescopes using adaptive optics. The results show that such a network of ground stations could mitigate more than 85 percent of conjunctions and lower the expected number of collisions in Low Earth Orbit (LEO) by an order of magnitude. While these are impressive numbers that indicate LightForce's utility in the short term, the remaining 15% of possible collisions contain (among others) conjunctions between two massive objects that would add a large amount of debris if they collide. Still, conjunctions between massive objects and smaller objects can be mitigated. Hence, we choose to expand the capabilities of the simulation software to investigate the overall effect of a network of LightForce stations on the long-term debris evolution. In the second part of this paper, we will present the planned simulation approach for that effort. For the efficiency analysis of collision avoidance in the current debris environment, we utilize a simulation approach that uses the entire Two Line Element (TLE) catalog in LEO for a given day as initial input. These objects are propagated for one year and an all-on-all conjunction analysis is performed. For conjunctions that fall below a range threshold, we calculate the probability of collision and record those values. To assess efficiency, we compare a baseline (without collision avoidance) conjunction analysis with an analysis where LightForce is active. Using that approach, we take into account that collision avoidance maneuvers could have effects on third objects. Performing all-on-all conjunction analyses for extended periods of time requires significant computer resources; hence we implemented this simulation utilizing a highly parallel approach on the NASA Pleiades supercomputer. PMID:29302129

  8. LightForce photon-pressure collision avoidance: Efficiency analysis in the current debris environment and long-term simulation perspective.

    PubMed

    Yang, Fan Yang; Nelson, Bron; Aziz, Jonathan; Carlino, Roberto; Perez, Andres Dono; Faber, Nicolas; Foster, Cyrus; Frost, Chad; Henze, Chris; Karacalıoğlu, Arif Göktuğ; Levit, Creon; Marshall, William; Mason, James; O'Toole, Conor; Swenson, Jason; Worden, Simon P; Stupl, Jan

    2016-09-01

    This work provides an efficiency analysis of the LightForce space debris collision avoidance scheme in the current debris environment and describes a simulation approach to assess its impact on the long-term evolution of the space debris environment. LightForce aims to provide just-in-time collision avoidance by utilizing photon pressure from ground-based industrial lasers. These ground stations impart minimal accelerations to increase the miss distance for a predicted conjunction between two objects. In the first part of this paper we will present research that investigates the short-term effect of a few systems consisting of 20 kW class lasers directed by 1.5 m diameter telescopes using adaptive optics. The results show that such a network of ground stations could mitigate more than 85 percent of conjunctions and lower the expected number of collisions in Low Earth Orbit (LEO) by an order of magnitude. While these are impressive numbers that indicate LightForce's utility in the short term, the remaining 15% of possible collisions contain (among others) conjunctions between two massive objects that would add a large amount of debris if they collide. Still, conjunctions between massive objects and smaller objects can be mitigated. Hence, we choose to expand the capabilities of the simulation software to investigate the overall effect of a network of LightForce stations on the long-term debris evolution. In the second part of this paper, we will present the planned simulation approach for that effort. For the efficiency analysis of collision avoidance in the current debris environment, we utilize a simulation approach that uses the entire Two Line Element (TLE) catalog in LEO for a given day as initial input. These objects are propagated for one year and an all-on-all conjunction analysis is performed. For conjunctions that fall below a range threshold, we calculate the probability of collision and record those values. To assess efficiency, we compare a baseline (without collision avoidance) conjunction analysis with an analysis where LightForce is active. Using that approach, we take into account that collision avoidance maneuvers could have effects on third objects. Performing all-on-all conjunction analyses for extended periods of time requires significant computer resources; hence we implemented this simulation utilizing a highly parallel approach on the NASA Pleiades supercomputer.

  9. Upset Simulation and Training Initiatives for U.S. Navy Commercial Derived Aircraft

    NASA Technical Reports Server (NTRS)

    Donaldson, Steven; Priest, James; Cunningham, Kevin; Foster, John V.

    2012-01-01

    Militarized versions of commercial platforms are growing in popularity due to many logistical benefits in the form of commercial off-the-shelf (COTS) parts, established production methods, and commonality for different certifications. Commercial data and best practices are often leveraged to reduce procurement and engineering development costs. While the developmental and cost reduction benefits are clear, these militarized aircraft are routinely operated in flight at significantly different conditions and in significantly different manners than for routine commercial flight. Therefore they are at a higher risk of flight envelope exceedance. This risk may lead to departure from controlled flight and/or aircraft loss [1]. Historically, the risk of departure from controlled flight for military aircraft has been mitigated by piloted simulation training and engineering analysis of typical aircraft response. High-agility military aircraft simulation databases are typically developed to include high angles of attack (AoA) and sideslip due to the dynamic nature of their missions and have been developed for many tactical configurations over the previous decades. These aircraft simulations allow for a more thorough understanding of the vehicle flight dynamics characteristics at high AoA and sideslip. In recent years, government sponsored research on transport airplane aerodynamic characteristics at high angles of attack has produced a growing understanding of stall/post-stall behavior. This research along with recent commercial airline training initiatives has resulted in improved understanding of simulator-based training requirements and simulator model fidelity [2-5]. In addition, inflight training research over the past decade has produced a database of pilot performance and recurrency metrics [6]. Innovative solutions to aerodynamically model large commercial aircraft for upset conditions such as high AoA, high sideslip, and ballistic damage, as well as the capability to accurately account for scaling factors, are necessary to develop realistic engineering and training simulations. Such simulations should significantly reduce the risk of departure from controlled flight, loss of aircraft, and ease the airworthiness certification process. The characteristics of commercial derivative aircraft are exemplified by the P-8A Multi-mission Maritime Aircraft (MMA), and the largest benefits of initial investigation are likely to be yielded from this platform. The database produced would also be utilized by flight dynamics engineers as a means to further develop and investigate vehicle flight characteristics as mission tactics evolve through the years ahead. This paper will describe ongoing efforts by the U.S. Navy to develop a methodology for simulation and training for large commercial-derived transport aircraft at unusual attitudes, typically experienced during an aircraft upset. This methodology will be applied to a representative Navy aircraft (P-8A) and utilized to develop a robust simulation that should accurately represent aircraft response in these extremes. Simulation capabilities would then extend to flight dynamics analysis and simulation, as well as potential training applications. Recent evaluations of integrated academic, ground-based simulation, and in-flight upset training will be described along with important lessons learned, specific to military requirements.

  10. Simulating the Impact of Premixed Charge Compression Ignition on Light-Duty Diesel Fuel Economy and Emissions of Particulates and NOx

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Zhiming; Daw, C Stuart; Wagner, Robert M

    2013-01-01

    We utilize the Powertrain Systems Analysis Toolkit (PSAT) combined with transient engine and aftertreatment component models implemented in Matlab/Simulink to simulate the effect of premixed charge compression ignition (PCCI) on the fuel economy and emissions of light-duty diesel-powered conventional and hybrid electric vehicles (HEVs). Our simulated engine is capable of both conventional diesel combustion (CDC) and PCCI over real transient driving cycles. Our simulated aftertreatment train consists of a diesel oxidation catalyst (DOC), lean NOx trap (LNT), and catalyzed diesel particulate filter (DPF). The results demonstrate that, in the simulated conventional vehicle, PCCI can significantly reduce fuel consumption and emissions by reducing the need for LNT and DPF regeneration. However, the opportunity for PCCI operation in the simulated HEV is limited because the engine typically experiences higher loads and multiple stop-start transients that are outside the allowable PCCI operating range. Thus, developing ways of extending the PCCI operating range, combined with improved control strategies for engine and emissions control management, will be especially important for realizing the potential benefits of PCCI in HEVs.

  11. SAM Theory Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Rui

    The System Analysis Module (SAM) is an advanced and modern system analysis tool being developed at Argonne National Laboratory under the U.S. DOE Office of Nuclear Energy’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. SAM development aims for advances in physical modeling, numerical methods, and software engineering to enhance its user experience and usability for reactor transient analyses. To facilitate the code development, SAM utilizes an object-oriented application framework (MOOSE), and its underlying meshing and finite-element library (libMesh) and linear and non-linear solvers (PETSc), to leverage modern advanced software environments and numerical methods. SAM focuses on modeling advanced reactor concepts such as SFRs (sodium-cooled fast reactors), LFRs (lead-cooled fast reactors), and FHRs (fluoride-salt-cooled high temperature reactors) or MSRs (molten salt reactors). These advanced concepts are distinguished from light-water reactors in their use of single-phase, low-pressure, high-temperature, and low Prandtl number (sodium and lead) coolants. As a new code development, the initial effort has been focused on modeling and simulation capabilities of heat transfer and single-phase fluid dynamics responses in Sodium-cooled Fast Reactor (SFR) systems. The system-level simulation capabilities of fluid flow and heat transfer in general engineering systems and typical SFRs have been verified and validated. This document provides the theoretical and technical basis of the code to help users understand the underlying physical models (such as governing equations, closure models, and component models), system modeling approaches, numerical discretization and solution methods, and the overall capabilities in SAM. As the code is still under ongoing development, this SAM Theory Manual will be updated periodically to keep it consistent with the state of the development.

  12. A benchmark system to optimize our defense against an attack on the US food supply using the Risk Reduction Effectiveness and Capabilities Assessment Program.

    PubMed

    Hodoh, Ofia; Dallas, Cham E; Williams, Paul; Jaine, Andrew M; Harris, Curt

    2015-01-01

    A predictive system was developed and tested in a series of exercises with the objective of evaluating the preparedness and effectiveness of the multiagency response to food terrorism attacks. A computerized simulation model, the Risk Reduction Effectiveness and Capabilities Assessment Program (RRECAP), was developed to identify the key factors that influence the outcomes of an attack and quantify the relative reduction of such outcomes caused by each factor. The model was evaluated in a set of Tabletop and Full-Scale Exercises that simulated biological and chemical attacks on the food system, involving more than 300 participants representing more than 60 federal, state, local, and private sector agencies and organizations. The exercises showed that agencies could use RRECAP to identify and prioritize their advance preparation to mitigate such attacks with minimal expense. RRECAP also demonstrated the relative utility and limitations of the ability of medical resources to treat patients if responders do not recognize and mitigate the attack rapidly, and the exercise results showed that proper advance preparation would reduce these deficiencies. Using computer simulation to predict the medical outcomes of food supply attacks, identify optimal remediation activities, and quantify the benefits of various measures provides a significant tool to agencies in both the public and private sector as they seek to prepare for such an attack.

  13. Health and climate benefits of offshore wind facilities in the Mid-Atlantic United States

    DOE PAGES

    Buonocore, Jonathan J.; Luckow, Patrick; Fisher, Jeremy; ...

    2016-07-14

    Electricity from fossil fuels contributes substantially to both climate change and the health burden of air pollution. Renewable energy sources are capable of displacing electricity from fossil fuels, but the quantity of health and climate benefits depends on site-specific attributes that are not often included in quantitative models. Here, we link an electrical grid simulation model to an air pollution health impact assessment model and US regulatory estimates of the impacts of carbon to estimate the health and climate benefits of offshore wind facilities of different sizes in two different locations. We find that offshore wind in the Mid-Atlantic is capable of producing health and climate benefits of between $54 and $120 per MWh of generation, with the largest simulated facility (3000 MW off the coast of New Jersey) producing approximately $690 million in benefits in 2017. The variability in benefits per unit generation is a function of differences in locations (Maryland versus New Jersey), simulated years (2012 versus 2017), and facility generation capacity, given complexities of the electrical grid and differences in which power plants are offset. In the end, this work demonstrates health and climate benefits of offshore wind, provides further evidence of the utility of geographically-refined modeling frameworks, and yields quantitative insights that would allow for inclusion of both climate and public health in benefits assessments of renewable energy.

  14. Health and climate benefits of offshore wind facilities in the Mid-Atlantic United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buonocore, Jonathan J.; Luckow, Patrick; Fisher, Jeremy

    Electricity from fossil fuels contributes substantially to both climate change and the health burden of air pollution. Renewable energy sources are capable of displacing electricity from fossil fuels, but the quantity of health and climate benefits depends on site-specific attributes that are not often included in quantitative models. Here, we link an electrical grid simulation model to an air pollution health impact assessment model and US regulatory estimates of the impacts of carbon to estimate the health and climate benefits of offshore wind facilities of different sizes in two different locations. We find that offshore wind in the Mid-Atlantic is capable of producing health and climate benefits of between $54 and $120 per MWh of generation, with the largest simulated facility (3000 MW off the coast of New Jersey) producing approximately $690 million in benefits in 2017. The variability in benefits per unit generation is a function of differences in locations (Maryland versus New Jersey), simulated years (2012 versus 2017), and facility generation capacity, given complexities of the electrical grid and differences in which power plants are offset. In the end, this work demonstrates health and climate benefits of offshore wind, provides further evidence of the utility of geographically-refined modeling frameworks, and yields quantitative insights that would allow for inclusion of both climate and public health in benefits assessments of renewable energy.

  15. Health and climate benefits of offshore wind facilities in the Mid-Atlantic United States

    NASA Astrophysics Data System (ADS)

    Buonocore, Jonathan J.; Luckow, Patrick; Fisher, Jeremy; Kempton, Willett; Levy, Jonathan I.

    2016-07-01

    Electricity from fossil fuels contributes substantially to both climate change and the health burden of air pollution. Renewable energy sources are capable of displacing electricity from fossil fuels, but the magnitude of health and climate benefits depends on site-specific attributes that are not often included in quantitative models. Here, we link an electrical grid simulation model to an air pollution health impact assessment model and US regulatory estimates of the impacts of carbon to estimate the health and climate benefits of offshore wind facilities of different sizes in two different locations. We find that offshore wind in the Mid-Atlantic is capable of producing health and climate benefits of between $54 and $120 per MWh of generation, with the largest simulated facility (3000 MW off the coast of New Jersey) producing approximately $690 million in benefits in 2017. The variability in benefits per unit generation is a function of differences in locations (Maryland versus New Jersey), simulated years (2012 versus 2017), and facility generation capacity, given complexities of the electrical grid and differences in which power plants are offset. This work demonstrates the health and climate benefits of offshore wind, provides further evidence of the utility of geographically-refined modeling frameworks, and yields quantitative insights that would allow for inclusion of both climate and public health in benefits assessments of renewable energy.
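
As a rough sanity check of the per-MWh figures quoted above, one can back out benefits per MWh from the headline numbers; the capacity factor below is an assumed illustrative value, not taken from the study:

```python
# Back-of-envelope check of benefits per MWh. The capacity factor is an
# assumed illustrative value (not from the study); the capacity and total
# benefits come from the abstract.
CAPACITY_MW = 3000        # largest simulated facility, off New Jersey
CAPACITY_FACTOR = 0.40    # assumed typical offshore wind capacity factor
HOURS_PER_YEAR = 8760
TOTAL_BENEFITS = 690e6    # ~$690 million in 2017

annual_mwh = CAPACITY_MW * CAPACITY_FACTOR * HOURS_PER_YEAR
benefits_per_mwh = TOTAL_BENEFITS / annual_mwh
print(f"annual generation: {annual_mwh:,.0f} MWh")
print(f"implied benefits:  ${benefits_per_mwh:.0f}/MWh")
```

With these assumptions the implied figure lands inside the $54-$120/MWh range reported for the Mid-Atlantic facilities.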

  16. Interactive Learning Environment: Web-based Virtual Hydrological Simulation System using Augmented and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2014-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments, and interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g. flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users. This presentation provides an overview of the web-based flood simulation system and demonstrates its capabilities across the various visualization and interaction modes.

  17. LightForce Photon-pressure Collision Avoidance: Efficiency Analysis in the Current Debris Environment and Long-Term Simulation Perspective

    NASA Technical Reports Server (NTRS)

    Yang, Fan Y.; Nelson, Bron; Carlino, Roberto; Perez, Andres D.; Faber, Nicolas; Henze, Chris; Karacahoglu, Arif G.; O'Toole, Conor; Swenson, Jason; Stupl, Jan

    2015-01-01

    This work provides an efficiency analysis of the LightForce space debris collision avoidance scheme in the current debris environment and describes a simulation approach to assess its impact on the long-term evolution of the space debris environment. LightForce aims to provide just-in-time collision avoidance by utilizing photon pressure from ground-based industrial lasers. These ground stations impart minimal accelerations to increase the miss distance for a predicted conjunction between two objects. In the first part of this paper we present research that investigates the short-term effect of a few systems consisting of 10 kW class lasers directed by 1.5 m diameter telescopes using adaptive optics. The results show that such a network of ground stations could mitigate more than 85 percent of conjunctions and lower the expected number of collisions in Low Earth Orbit (LEO) by an order of magnitude. While these are impressive numbers that indicate LightForce's utility in the short term, the remaining 15 percent of possible collisions include, among others, conjunctions between two massive objects that would add a large amount of debris if they collide. Still, conjunctions between massive objects and smaller objects can be mitigated. Hence we choose to expand the capabilities of the simulation software to investigate the overall effect of a network of LightForce stations on the long-term debris evolution. In the second part of this paper, we present the planned simulation approach for that effort.

  18. Capturing the Energy Absorbing Mechanisms of Composite Structures under Crash Loading

    NASA Astrophysics Data System (ADS)

    Wade, Bonnie

    As fiber reinforced composite material systems become increasingly utilized in primary aircraft and automotive structures, the need to understand their contribution to the crashworthiness of the structure is of great interest to meet safety certification requirements. The energy absorbing behavior of a composite structure, however, is not easily predicted due to the great complexity of the failure mechanisms that occur within the material. Challenges arise both in the experimental characterization and in the numerical modeling of the material/structure combination. At present, there is no standardized test method to characterize the energy absorbing capability of composite materials to aid crashworthy structural design. In addition, although many commercial finite element analysis codes exist and offer a means to simulate composite failure initiation and propagation, these models are still under development and refinement. As more metallic structures are replaced by composite structures, the need for both experimental guidelines to characterize the energy absorbing capability of a composite structure, as well as guidelines for using numerical tools to simulate composite materials in crash conditions, has become a critical matter. This body of research addresses both the experimental characterization of the energy absorption mechanisms occurring in composite materials during crushing, as well as the numerical simulation of composite materials undergoing crushing. In the experimental investigation, the specific energy absorption (SEA) of a composite material system is measured using a variety of test element geometries, such as corrugated plates and tubes. Results from several crush experiments reveal that SEA is not a constant material property for laminated composites, and varies significantly with the geometry of the test specimen used. 
The variation of SEA measured for a single material system requires that crush test data must be generated for a range of different test geometries in order to define the range of its energy absorption capability. Further investigation from the crush tests has led to the development of a direct link between geometric features of the crush specimen and its resulting SEA. Through micrographic analysis, distinct failure modes are shown to be guided by the geometry of the specimen, and subsequently are shown to directly influence energy absorption. A new relationship between geometry, failure mode, and SEA has been developed. This relationship has allowed for the reduction of the element-level crush testing requirement to characterize the composite material energy absorption capability. In the numerical investigation, the LS-DYNA composite material model MAT54 is selected for its suitability to model composite materials beyond failure determination, as required by crush simulation, and its capability to remain within the scope of ultimately using this model for large-scale crash simulation. As a result of this research, this model has been thoroughly investigated in depth for its capacity to simulate composite materials in crush, and results from several simulations of the element-level crush experiments are presented. A modeling strategy has been developed to use MAT54 for crush simulation which involves using the experimental data collected from the coupon- and element-level crush tests to directly calibrate the crush damage parameter in MAT54 such that it may be used in higher-level simulations. In addition, the source code of the material model is modified to improve upon its capability. The modifications include improving the elastic definition such that the elastic response to multi-axial load cases can be accurately portrayed simultaneously in each element, which is a capability not present in other composite material models. 
Modifications made to the failure determination and post-failure model have newly emphasized the post-failure stress degradation scheme rather than the failure criterion which is traditionally considered the most important composite material model definition for crush simulation. The modification efforts have also validated the use of the MAT54 failure criterion and post-failure model for crash modeling when its capabilities and limitations are well understood, and for this reason guidelines for using MAT54 for composite crush simulation are presented. This research has effectively (a) developed and demonstrated a procedure that defines a set of experimental crush results that characterize the energy absorption capability of a composite material system, (b) used the experimental results in the development and refinement of a composite material model for crush simulation, (c) explored modifying the material model to improve its use in crush modeling, and (d) provided experimental and modeling guidelines for composite structures under crush at the element-level in the scope of the Building Block Approach.

  19. Observability-Based Guidance and Sensor Placement

    NASA Astrophysics Data System (ADS)

    Hinson, Brian T.

    Control system performance is highly dependent on the quality of sensor information available. In a growing number of applications, however, the control task must be accomplished with limited sensing capabilities. This thesis addresses these types of problems from a control-theoretic point-of-view, leveraging system nonlinearities to improve sensing performance. Using measures of observability as an information quality metric, guidance trajectories and sensor distributions are designed to improve the quality of sensor information. An observability-based sensor placement algorithm is developed to compute optimal sensor configurations for a general nonlinear system. The algorithm utilizes a simulation of the nonlinear system as the source of input data, and convex optimization provides a scalable solution method. The sensor placement algorithm is applied to a study of gyroscopic sensing in insect wings. The sensor placement algorithm reveals information-rich areas on flexible insect wings, and a comparison to biological data suggests that insect wings are capable of acting as gyroscopic sensors. An observability-based guidance framework is developed for robotic navigation with limited inertial sensing. Guidance trajectories and algorithms are developed for range-only and bearing-only navigation that improve navigation accuracy. Simulations and experiments with an underwater vehicle demonstrate that the observability measure allows tuning of the navigation uncertainty.
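
The empirical-Gramian idea described above (using simulation of the nonlinear system as the data source for an observability measure) can be sketched as follows; the dynamics, output map, and perturbation size are illustrative stand-ins, not the systems studied in the thesis:

```python
import numpy as np

def empirical_gramian(simulate, x0, eps=1e-4):
    """Empirical observability Gramian via central differences of the
    output trajectory with respect to each initial-state component.
    `simulate(x0)` must return an output trajectory of shape (T, p)."""
    n = len(x0)
    dY = []
    for i in range(n):
        e = np.zeros(n); e[i] = eps
        dY.append((simulate(x0 + e) - simulate(x0 - e)) / (2 * eps))  # (T, p)
    dY = np.stack(dY, axis=-1)                                        # (T, p, n)
    # Sum outer products over time: W_o = sum_t dY_t^T dY_t
    return np.einsum('tpi,tpj->ij', dY, dY)

# Illustrative example: a lightly damped oscillator with position output.
A = np.array([[0.0, 1.0], [-1.0, -0.1]])
C = np.array([[1.0, 0.0]])

def simulate(x0, dt=0.01, steps=500):
    xs, x = [], np.asarray(x0, float)
    for _ in range(steps):
        xs.append(C @ x)
        x = x + dt * (A @ x)    # forward Euler, adequate for a sketch
    return np.array(xs)

W = empirical_gramian(simulate, np.array([0.5, 0.5]))
print("smallest Gramian eigenvalue:", np.linalg.eigvalsh(W)[0])
```

The smallest Gramian eigenvalue (or the Gramian's condition number) is a typical observability metric to optimize over candidate trajectories or sensor configurations.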

  20. Fuego/Scefire MPMD Coupling L2 Milestone Executive Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pierce, Flint; Tencer, John; Pautz, Shawn D.

    2017-09-01

    This milestone campaign was focused on coupling the Sandia physics codes SIERRA low Mach module Fuego and the RAMSES Boltzmann transport code Sceptre (Scefire). Fuego enables simulation of low Mach, turbulent, reacting, particle-laden flows on unstructured meshes using CVFEM for abnormal thermal environments throughout SNL and the larger national security community. Sceptre provides simulation of photon, neutron, and charged particle transport on unstructured meshes using Discontinuous Galerkin for radiation effects calculations at SNL and elsewhere. Coupling these "best of breed" codes enables efficient modeling of thermal/fluid environments with radiation transport, including fires (pool, propellant, composite) as well as those with directed radiant fluxes. We seek to improve the experience of Fuego users who require radiation transport capabilities in two ways. The first is performance. We achieve this by leveraging additional computational resources for Scefire, reducing calculation times while leaving unaffected resources for fluid physics. This approach is new to Fuego, which previously utilized the same resources for both fluid and radiation solutions. The second improvement enables new radiation capabilities, including spectral (banded) radiation, beam boundary sources, and alternate radiation solvers (e.g., Pn). This summary provides an overview of these achievements.

  1. 3D printed mitral valve models: affordable simulation for robotic mitral valve repair.

    PubMed

    Premyodhin, Ned; Mandair, Divneet; Ferng, Alice S; Leach, Timothy S; Palsma, Ryan P; Albanna, Mohammad Z; Khalpey, Zain I

    2018-01-01

    3D printed mitral valve (MV) models that capture the suture response of real tissue may be utilized as surgical training tools. Leveraging clinical imaging modalities, 3D computerized modelling and 3D printing technology to produce affordable models complements currently available virtual simulators and paves the way for patient- and pathology-specific preoperative rehearsal. We used polyvinyl alcohol, a dissolvable thermoplastic, to 3D print moulds that were casted with liquid platinum-cure silicone yielding flexible, low-cost MV models capable of simulating valvular tissue. Silicone-moulded MV models were fabricated for 2 morphologies: the normal MV and the P2 flail. The moulded valves were plication and suture tested in a laparoscopic trainer box with a da Vinci Si robotic surgical system. One cardiothoracic surgery fellow and 1 attending surgeon qualitatively evaluated the ability of the valves to recapitulate tissue feel through surveys utilizing the 5-point Likert-type scale to grade impressions of the valves. Valves produced with the moulding and casting method maintained anatomical dimensions within 3% of directly 3D printed acrylonitrile butadiene styrene controls for both morphologies. Likert-type scale mean scores corresponded with a realistic material response to sutures (5.0/5), tensile strength that is similar to real MV tissue (5.0/5) and anatomical appearance resembling real MVs (5.0/5), indicating that evaluators 'agreed' that these aspects of the model were appropriate for training. Evaluators 'somewhat agreed' that the overall model durability was appropriate for training (4.0/5) due to the mounting design. Qualitative differences in repair quality were notable between fellow and attending surgeon. 3D computer-aided design, 3D printing and fabrication techniques can be applied to fabricate affordable, high-quality educational models for technical training that are capable of differentiating proficiency levels among users. © The Author 2017. 
Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  2. Novel high-fidelity realistic explosion damage simulation for urban environments

    NASA Astrophysics Data System (ADS)

    Liu, Xiaoqing; Yadegar, Jacob; Zhu, Youding; Raju, Chaitanya; Bhagavathula, Jaya

    2010-04-01

    Realistic building damage simulation has a significant impact on modern modeling and simulation systems, especially in the diverse panoply of military and civil applications where these systems are widely used for personnel training, critical mission planning, disaster management, etc. Realistic building damage simulation should incorporate accurate physics-based explosion models, rubble generation, rubble flyout, and interactions between flying rubble and surrounding entities. However, none of the existing building damage simulation systems achieves the degree of realism required for effective military applications. In this paper, we present a novel physics-based, high-fidelity, and runtime-efficient explosion simulation system that realistically simulates destruction to buildings. In the proposed system, a family of novel blast models is applied to accurately and realistically simulate explosions based on static and/or dynamic detonation conditions. The system also accounts for rubble pile formation and applies a generic, scalable multi-component object representation to describe scene entities, together with a highly scalable agent-subsumption architecture and scheduler to schedule clusters of sequential and parallel events. The proposed system utilizes a highly efficient and scalable tetrahedral decomposition approach to realistically simulate rubble formation. Experimental results demonstrate that the proposed system can realistically simulate rubble generation, rubble flyout, and their primary and secondary impacts on surrounding objects, including buildings, constructions, vehicles, and pedestrians, in clusters of sequential and parallel damage events.

  3. Comparisons of Kinematics and Dynamics Simulation Software Tools

    NASA Technical Reports Server (NTRS)

    Shiue, Yeu-Sheng Paul

    2002-01-01

    Kinematic and dynamic analyses of moving bodies are essential to system engineers and designers in the process of design and validation. 3D visualization and motion simulation plus finite element analysis (FEA) give engineers a better way to present ideas and results. Marshall Space Flight Center (MSFC) system engineering researchers are currently using IGRIP from DELMIA Inc. as a kinematic simulation tool for discrete-body motion simulations. Although IGRIP is an excellent tool for kinematic simulation with some dynamic analysis capabilities in robotic control, exploration of other alternatives with more powerful dynamic analysis and FEA capabilities is necessary. Kinematic analysis examines only the displacement, velocity, and acceleration of the mechanism, without considering effects from the masses of components. With dynamic analysis and FEA, effects such as the forces or torques at a joint due to the mass and inertia of components can be identified. Given keen market competition, ALGOR Mechanical Event Simulation (MES), MSC visualNastran 4D, Unigraphics Motion+, and Pro/MECHANICA were chosen for exploration. In this study, comparisons between software tools are presented in terms of the following categories: graphical user interface (GUI), import capability, tutorial availability, ease of use, kinematic simulation capability, dynamic simulation capability, FEA capability, graphical output, technical support, and cost. The Propulsion Test Article (PTA) with Fastrac engine model exported from IGRIP and an office chair mechanism were used as examples for simulations.

  4. Today's Business Simulation Industry

    ERIC Educational Resources Information Center

    Summers, Gary J.

    2004-01-01

    New technologies are transforming the business simulation industry. The technologies come from research in computational fields of science, and they endow simulations with new capabilities and qualities. These capabilities and qualities include computerized behavioral simulations, online feedback and coaching, advanced interfaces, learning on…

  5. Life sciences utilization of Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Chambers, Lawrence P.

    1992-01-01

    Space Station Freedom will provide the United States' first permanently manned laboratory in space. It will allow, for the first time, long term systematic life sciences investigations in microgravity. This presentation provides a top-level overview of the planned utilization of Space Station Freedom by NASA's Life Sciences Division. The historical drivers for conducting life sciences research on a permanently manned laboratory in space as well as the advantages that a space station platform provides for life sciences research are discussed. This background information leads into a description of NASA's strategy for having a fully operational International Life Sciences Research Facility by the year 2000. Achieving this capability requires the development of the five discipline focused 'common core' facilities. Once developed, these facilities will be brought to the space station during the Man-Tended Capability phase, checked out and brought into operation. Their delivery must be integrated with the Space Station Freedom manifest. At the beginning of Permanent Manned Capability, the infrastructure is expected to be completed and the Life Sciences Division's SSF Program will become fully operational. A brief facility description, anticipated launch date and a focused objective is provided for each of the life sciences facilities, including the Biomedical Monitoring and Countermeasures (BMAC) Facility, Gravitational Biology Facility (GBF), Gas Grain Simulation Facility (GGSF), Centrifuge Facility (CF), and Controlled Ecological Life Support System (CELSS) Test Facility. In addition, hardware developed by other NASA organizations and the SSF International Partners for an International Life Sciences Research Facility is also discussed.

  6. Development of Virtual Airspace Simulation Technology - Real-Time (VAST-RT) Capability 2 and Experimental Plans

    NASA Technical Reports Server (NTRS)

    Lehmer, R.; Ingram, C.; Jovic, S.; Alderete, J.; Brown, D.; Carpenter, D.; LaForce, S.; Panda, R.; Walker, J.; Chaplin, P.

    2006-01-01

    The Virtual Airspace Simulation Technology - Real-Time (VAST-RT) Project, an element of NASA's Virtual Airspace Modeling and Simulation (VAMS) Project, has been developing a distributed simulation capability that supports an extensible and expandable real-time, human-in-the-loop airspace simulation environment. The VAST-RT system architecture is based on the DoD High Level Architecture (HLA) and the VAST-RT HLA Toolbox, a common interface implementation that incorporates a number of novel design features. The scope of the initial VAST-RT integration activity (Capability 1) included the high-fidelity human-in-the-loop simulation facilities located at NASA/Ames Research Center and medium-fidelity pseudo-piloted target generators, such as the Airspace Traffic Generator (ATG) being developed as part of VAST-RT, as well as other real-time tools. This capability has been demonstrated in a gate-to-gate simulation. VAST-RT Capability 2A has recently been completed, and this paper discusses the improved integration of the real-time assets into VAST-RT, including the development of tools to integrate data collected across the simulation environment into a single data set for the researcher. Current plans for the completion of the VAST-RT distributed simulation environment (Capability 2B) and its use to evaluate future airspace capacity enhancing concepts being developed by VAMS are also discussed. Additionally, the simulation environment's application to other airspace and airport research projects is addressed.

  7. Visualization Methods for Viability Studies of Inspection Modules for the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Mobasher, Amir A.

    2005-01-01

    An effective simulation of an object, process, or task must be similar to that object, process, or task. A simulation could consist of a physical device, a set of mathematical equations, a computer program, a person, or some combination of these. There are many reasons for the use of simulators. Although some of the reasons are unique to a specific situation, there are many general reasons and purposes for using simulators. These include, but are not limited to: (1) safety, (2) scarce resources, (3) teaching/education, (4) additional capabilities, (5) flexibility, and (6) cost. Robot simulators are in use for all of these reasons. Virtual environments such as simulators eliminate physical contact with humans and hence increase the safety of the work environment. Corporations with limited funding and resources may utilize simulators to accomplish their goals while saving manpower and money. A computer simulation is safer than working with a real robot. Robots are typically a scarce resource. Schools typically don't have a large number of robots, if any. Factories do not want robots pulled from useful work unless absolutely necessary. Robot simulators are useful in teaching robotics. A simulator gives a student hands-on experience, even if only in simulation. The simulator is also more flexible: a user can quickly change the robot configuration, workcell, or even replace the robot with a different one altogether. In order to be useful, a robot simulator must create a model that accurately performs like the real robot. A powerful simulator is usually thought of as a combination of a CAD package with simulation capabilities. Computer Aided Design (CAD) techniques are used extensively by engineers in virtually all areas of engineering. Parts are designed interactively, aided by the graphical display of both wireframe and more realistic shaded renderings. 
Once a part's dimensions have been specified to the CAD package, designers can view the part from any direction to examine how it will look and perform in relation to other parts. If changes are deemed necessary, the designer can easily make the changes and view the results graphically. However, a complex process of moving parts intended for operation in a complex environment can only be fully understood through animated graphical simulation. A CAD package with simulation capabilities allows the designer to develop geometrical models of the process being designed, as well as the environment in which the process will be used, and then test the process in graphical animation much as the actual physical system would be run. By being able to operate the system of moving and stationary parts, the designer is able to see in simulation how the system will perform under a wide variety of conditions. If, for example, undesired collisions occur between parts of the system, design changes can be easily made without the expense or potential danger of testing the physical system.

  8. Cost Benefit and Alternatives Analysis of Distribution Systems with Energy Storage Systems: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, Tom; Nagarajan, Adarsh; Baggu, Murali

    This paper explores monetized and non-monetized benefits from storage interconnected to the distribution system through use cases illustrating potential applications for energy storage in California's electric utility system. This work supports SDG&E in its efforts to quantify, summarize, and compare the cost and benefit streams related to implementation and operation of energy storage on its distribution feeders. This effort develops the cost-benefit and alternatives analysis platform, integrates it with QSTS feeder simulation capability, and analyzes use cases to explore the cost-benefit of implementing and operating energy storage for feeder support and market participation.
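
A cost-benefit comparison of the kind described reduces, at its simplest, to discounting cost and benefit streams to a net present value; the cash flows and discount rate below are hypothetical illustrations, not SDG&E figures:

```python
def npv(rate, cashflows):
    """Net present value of yearly cashflows, with cashflows[0] at year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical storage project: a $2.0M installation at year 0, then ten
# years of net benefits (deferred upgrades + market revenue - O&M).
# All numbers are illustrative only.
costs = [-2.0e6]
benefits = [350e3] * 10
project = costs + benefits
print(f"NPV at a 5% discount rate: ${npv(0.05, project):,.0f}")
```

A positive NPV under the chosen discount rate is the usual go/no-go signal when comparing storage against alternatives such as conventional feeder upgrades.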

  9. Evaluation of a Cyber Security System for Hospital Network.

    PubMed

    Faysel, Mohammad A

    2015-01-01

    Most cyber security systems use simulated data to evaluate their detection capabilities. The proposed cyber security system utilizes real hospital network connections. It uses a probabilistic data mining algorithm to detect anomalous events and takes appropriate responses in real time. In an evaluation using real-world hospital network data consisting of incoming network connections collected over a 24-hour period, the proposed system detected 15 unusual connections that went undetected by a commercial intrusion prevention system for the same network connections. Evaluation of the proposed system shows a potential to secure protected patient health information on a hospital network.
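
The abstract does not detail the probabilistic data mining algorithm used; as a generic illustration of probabilistic anomaly scoring on connection events, rare events can be flagged by their surprise (negative log-probability) under observed frequencies:

```python
from collections import Counter
import math

class ConnectionScorer:
    """Toy probabilistic anomaly scorer: rare (source, port) pairs receive
    high surprise scores. Illustrative only; not the paper's algorithm."""
    def __init__(self):
        self.counts = Counter()
        self.total = 0

    def update(self, event):
        self.counts[event] += 1
        self.total += 1

    def surprise(self, event):
        # Laplace-smoothed negative log-probability, in bits
        p = (self.counts[event] + 1) / (self.total + len(self.counts) + 1)
        return -math.log2(p)

scorer = ConnectionScorer()
for _ in range(100):
    scorer.update(("10.0.0.5", 443))      # routine HTTPS traffic
scorer.update(("10.0.0.99", 23))          # one unusual telnet connection

routine = scorer.surprise(("10.0.0.5", 443))
unusual = scorer.surprise(("198.51.100.7", 4444))  # never-seen pair
print(f"routine: {routine:.2f} bits, unusual: {unusual:.2f} bits")
```

A real-time system would alert whenever the surprise of an incoming connection exceeds a tuned threshold.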

  10. Gpu Implementation of a Viscous Flow Solver on Unstructured Grids

    NASA Astrophysics Data System (ADS)

    Xu, Tianhao; Chen, Long

    2016-06-01

    Graphics processing units have gained popularity in scientific computing over the past several years due to their outstanding parallel computing capability. Computational fluid dynamics applications involve large amounts of calculation, so a recent GPU card, whose peak computing performance and memory bandwidth far exceed those of a contemporary high-end CPU, is preferable. We herein focus on the detailed implementation of our GPU-targeted Reynolds-averaged Navier-Stokes equations solver based on the finite-volume method. The solver employs a vertex-centered scheme on unstructured grids so as to be capable of handling complex topologies. Multiple optimizations are carried out to improve memory-access performance and kernel utilization. Both steady and unsteady flow simulation cases are carried out using an explicit Runge-Kutta scheme. The solver with GPU acceleration is demonstrated to have competitive advantages over the CPU-targeted one.
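
The explicit Runge-Kutta time stepping mentioned above can be sketched for a toy one-dimensional finite-volume problem; the stage coefficients and test problem are illustrative, not the solver's actual scheme:

```python
import numpy as np

def rk_advance(u, residual, dt, alphas=(0.25, 0.5, 1.0)):
    """Multi-stage explicit Runge-Kutta update of the form commonly used
    in finite-volume flow solvers: u^(k) = u^n - alpha_k * dt * R(u^(k-1)).
    The stage coefficients here are illustrative, not the paper's."""
    u0 = u.copy()
    for a in alphas:
        u = u0 - a * dt * residual(u)
    return u

# Example: linear advection du/dt + c du/dx = 0 on a periodic grid,
# using a first-order upwind residual (c > 0).
c, dx, dt = 1.0, 0.05, 0.02            # CFL = c*dt/dx = 0.4
x = np.arange(0.0, 1.0, dx)
u = np.exp(-100 * (x - 0.5) ** 2)      # initial Gaussian pulse
initial_mass = u.sum() * dx

def residual(u):
    return c * (u - np.roll(u, 1)) / dx   # upwind flux difference

for _ in range(10):
    u = rk_advance(u, residual, dt)
print("mass after 10 steps:", u.sum() * dx)
```

Because the periodic upwind residual telescopes to zero when summed over cells, the total "mass" is conserved, a quick correctness check for any finite-volume update.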

  11. Identification of Defects in Piles Through Dynamic Testing

    NASA Astrophysics Data System (ADS)

    Liao, Shutao T.; Roesset, Jose M.

    1997-04-01

    The objective of this work was to evaluate the theoretical capabilities of the non-destructive impact-response method in detecting the existence of a single defect in a pile, its location and its length. The cross-section of the pile is assumed to be circular and the defects are assumed to be axisymmetric in geometry. As mentioned in the companion paper, special codes utilizing one-dimensional (1-D) and three-dimensional (3-D) axisymmetric finite element models were developed to simulate the responses of defective piles to an impact load. Extensive parametric studies were then performed. In each study, the results from the direct use of time histories of displacements or velocities and the mechanical admittance (or mobility) function were compared in order to assess their capabilities. The effects of the length and the width of a defect were also investigated using these methods. Int. J. Numer. Anal. Meth. Geomech., vol. 21, 277-291 (1997)
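
The mechanical admittance (mobility) function compared against time histories in this study is the ratio of the velocity spectrum to the force spectrum; a minimal sketch on synthetic impact-test signals (the pulse and response below are invented stand-ins, not measured pile data):

```python
import numpy as np

def mobility(force, velocity, dt):
    """Mechanical admittance (mobility) spectrum V(f)/F(f). A small
    regularization avoids division by near-zero values of the force
    spectrum. Signals here are synthetic, for illustration only."""
    F = np.fft.rfft(force)
    V = np.fft.rfft(velocity)
    freqs = np.fft.rfftfreq(len(force), dt)
    eps = 1e-9 * np.abs(F).max()
    return freqs, V / (F + eps)

# Synthetic stand-ins for an impact test on a pile head: a half-sine
# force pulse and a decaying-oscillation velocity response.
dt, n = 1e-5, 4096
t = np.arange(n) * dt
force = np.where(t < 1e-3, np.sin(np.pi * t / 1e-3), 0.0)
velocity = np.exp(-200 * t) * np.sin(2 * np.pi * 2000 * t)

freqs, Y = mobility(force, velocity, dt)
print(f"{len(freqs)} spectral lines up to {freqs[-1]:.0f} Hz")
```

In practice, the spacing of resonant peaks in |V/F| encodes the pile length and the depth of a defect, which is what the parametric studies above exploit.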

  12. Theory and design of compact hybrid microphone arrays on two-dimensional planes for three-dimensional soundfield analysis.

    PubMed

    Chen, Hanchi; Abhayapala, Thushara D; Zhang, Wen

    2015-11-01

    Soundfield analysis based on spherical harmonic decomposition has been widely used in various applications; however, a drawback is the three-dimensional geometry of the microphone arrays. In this paper, a method to design two-dimensional planar microphone arrays that are capable of capturing three-dimensional (3D) spatial soundfields is proposed. Through the utilization of both omni-directional and first order microphones, the proposed microphone array is capable of measuring soundfield components that are undetectable to conventional planar omni-directional microphone arrays, thus providing the same functionality as 3D arrays designed for the same purpose. Simulations show that the accuracy of the planar microphone array is comparable to traditional spherical microphone arrays. Due to its compact shape, the proposed microphone array greatly increases the feasibility of 3D soundfield analysis techniques in real-world applications.
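
The reason a planar omni-directional array misses some 3D components is that spherical harmonics with odd degree-plus-order vanish on the horizontal plane (the associated Legendre function satisfies P_n^m(0) = 0 when n + m is odd), which is what the added first-order (gradient) microphones recover. A small dependency-free check of that parity fact:

```python
def assoc_legendre(n, m, x):
    """Associated Legendre P_n^m(x) (Condon-Shortley phase) via the
    standard upward recurrence. Minimal sketch, no external libraries."""
    pmm = 1.0
    if m > 0:
        fact = 1.0
        somx2 = (1.0 - x * x) ** 0.5
        for _ in range(m):
            pmm *= -fact * somx2
            fact += 2.0
    if n == m:
        return pmm
    pmmp1 = x * (2 * m + 1) * pmm
    if n == m + 1:
        return pmmp1
    for ll in range(m + 2, n + 1):
        pll = ((2 * ll - 1) * x * pmmp1 - (ll + m - 1) * pmm) / (ll - m)
        pmm, pmmp1 = pmmp1, pll
    return pmmp1

# On the horizontal plane cos(theta) = 0: harmonics with odd (n + m)
# vanish, so omni-only planar arrays cannot sense them.
for n in range(3):
    for m in range(0, n + 1):
        val = assoc_legendre(n, m, 0.0)
        tag = "invisible to planar omni array" if (n + m) % 2 else "visible"
        print(f"n={n}, m={m}: P(0) = {val:+.3f}  ({tag})")
```
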

  13. Available Transfer Capability Determination Using Hybrid Evolutionary Algorithm

    NASA Astrophysics Data System (ADS)

    Jirapong, Peeraool; Ongsakul, Weerakorn

    2008-10-01

    This paper proposes a new hybrid evolutionary algorithm (HEA) based on evolutionary programming (EP), tabu search (TS), and simulated annealing (SA) to determine the available transfer capability (ATC) of power transactions between different control areas in deregulated power systems. The optimal power flow (OPF)-based ATC determination is used to evaluate the feasible maximum ATC value within real and reactive power generation limits, line thermal limits, voltage limits, and voltage and angle stability limits. The HEA approach simultaneously searches for real power generations except slack bus in a source area, real power loads in a sink area, and generation bus voltages to solve the OPF-based ATC problem. Test results on the modified IEEE 24-bus reliability test system (RTS) indicate that ATC determination by the HEA could enhance ATC far more than those from EP, TS, hybrid TS/SA, and improved EP (IEP) algorithms, leading to an efficient utilization of the existing transmission system.
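    The paper's hybrid EP/TS/SA algorithm is not reproduced here, but its simulated-annealing ingredient can be illustrated on a toy one-variable maximization problem (objective, step size, cooling schedule and all parameters below are illustrative assumptions, not the authors' OPF formulation):

```python
import math
import random

def simulated_annealing(objective, x0, step=0.2, t0=1.0, cooling=0.95,
                        iters=2000, seed=0):
    """Maximize `objective` over one real variable, accepting worse moves
    with Boltzmann probability exp(df / t) under geometric cooling."""
    rng = random.Random(seed)
    x, fx, t = x0, objective(x0), t0
    best_x, best_f = x, fx
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = objective(cand)
        # Always accept improvements; accept worse moves with prob exp(df/t)
        if fc >= fx or rng.random() < math.exp((fc - fx) / t):
            x, fx = cand, fc
            if fx > best_f:
                best_x, best_f = x, fx
        t *= cooling  # geometric cooling schedule
    return best_x, best_f

# Toy surrogate objective with its maximum (value 5.0) at x = 2
best_x, best_f = simulated_annealing(lambda x: 5.0 - (x - 2.0) ** 2, x0=0.0)
```

As the temperature decays, the search degenerates into greedy hill climbing, which is why hybrids add tabu lists or evolutionary populations to keep exploring.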

  14. A high performance scientific cloud computing environment for materials simulations

    NASA Astrophysics Data System (ADS)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a Java-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.

  15. Autonomous flight and remote site landing guidance research for helicopters

    NASA Technical Reports Server (NTRS)

    Denton, R. V.; Pecklesma, N. J.; Smith, F. W.

    1987-01-01

    Automated low-altitude flight and landing in remote areas within a civilian environment are investigated, where initial cost, ongoing maintenance costs, and system productivity are important considerations. An approach has been taken which has: (1) utilized those technologies developed for military applications which are directly transferable to a civilian mission; (2) exploited and developed technology areas where new methods or concepts are required; and (3) undertaken research with the potential to lead to innovative methods or concepts required to achieve a manual and fully automatic remote area low-altitude and landing capability. The project has resulted in the definition of a system operational concept that includes a sensor subsystem, a sensor fusion/feature extraction capability, and a guidance and control law concept. These subsystem concepts have been developed to sufficient depth to enable further exploration within the NASA simulation environment, and to support programs leading to flight test.

  16. Simulation realization of 2-D wavelength/time system utilizing MDW code for OCDMA system

    NASA Astrophysics Data System (ADS)

    Azura, M. S. A.; Rashidi, C. B. M.; Aljunid, S. A.; Endut, R.; Ali, N.

    2017-11-01

    This paper presents a realization of a Wavelength/Time (W/T) Two-Dimensional Modified Double Weight (2-D MDW) code for an Optical Code Division Multiple Access (OCDMA) system based on the Spectral Amplitude Coding (SAC) approach. The MDW code has the capability to suppress Phase-Induced Intensity Noise (PIIN) and to minimize Multiple Access Interference (MAI) noise. At the permissible BER of 10⁻⁹, the 2-D MDW (APD) system achieved a minimum effective received power (Psr) of -71 dBm at the receiver side, whereas the 2-D MDW (PIN) system required -61 dBm. The results show that 2-D MDW (APD) performs better, achieving the same BER over a longer optical fiber length and with less received power (Psr). The BER results also confirm that the MDW code is capable of suppressing PIIN and MAI.
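    In SAC-OCDMA analyses of this kind, BER is typically estimated from the signal-to-noise ratio with a Gaussian approximation; a minimal sketch of that conversion (the formula below is the standard Gaussian approximation, not this paper's full PIIN-inclusive SNR derivation):

```python
import math

def ber_from_snr(snr):
    """Gaussian-approximation bit-error rate commonly used in SAC-OCDMA
    analyses: BER = (1/2) * erfc(sqrt(SNR / 8))."""
    return 0.5 * math.erfc(math.sqrt(snr / 8.0))

# Under this approximation, an SNR of about 144 (~21.6 dB) is needed
# to reach a BER floor near 1e-9
ber = ber_from_snr(144.0)
```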

  17. Reduced Order Modeling of Combustion Instability in a Gas Turbine Model Combustor

    NASA Astrophysics Data System (ADS)

    Arnold-Medabalimi, Nicholas; Huang, Cheng; Duraisamy, Karthik

    2017-11-01

    Hydrocarbon fuel based propulsion systems are expected to remain relevant in aerospace vehicles for the foreseeable future. Design of these devices is complicated by combustion instabilities. The capability to model and predict these effects at reduced computational cost is a requirement for both design and control of these devices. This work focuses on computational studies on a dual swirl model gas turbine combustor in the context of reduced order model development. Full fidelity simulations are performed utilizing URANS and Hybrid RANS-LES with finite rate chemistry. Following this, data decomposition techniques are used to extract a reduced basis representation of the unsteady flow field. These bases are first used to identify sensor locations to guide experimental interrogations and controller feedback. Following this, initial results on developing a control-oriented reduced order model (ROM) will be presented. The capability of the ROM will be further assessed based on different operating conditions and geometric configurations.
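    The data-decomposition step described above is commonly performed with proper orthogonal decomposition (POD) via the singular value decomposition; a minimal sketch on synthetic snapshot data (the synthetic field and mode count are illustrative, not the authors' combustor fields):

```python
import numpy as np

def pod_modes(snapshots, n_modes):
    """Proper Orthogonal Decomposition of a snapshot matrix (points x snapshots):
    returns the leading spatial modes and the variance fraction each captures."""
    fluct = snapshots - snapshots.mean(axis=1, keepdims=True)  # remove mean field
    u, s, _ = np.linalg.svd(fluct, full_matrices=False)
    energy = s**2 / np.sum(s**2)  # fraction of fluctuation energy per mode
    return u[:, :n_modes], energy[:n_modes]

# Synthetic unsteady field: two coherent structures plus weak noise
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 200)
t = np.linspace(0.0, 10.0, 50)
snaps = (np.outer(np.sin(x), np.cos(t))
         + 0.3 * np.outer(np.sin(2.0 * x), np.sin(3.0 * t))
         + 0.01 * rng.standard_normal((200, 50)))
modes, energy = pod_modes(snaps, 2)
```

Sensor placement can then be guided by where the retained modes have large amplitude, which is one way the reduced basis feeds back into experiments.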

  18. EPS analysis of nominal STS-1 flight

    NASA Technical Reports Server (NTRS)

    Wolfgram, D. F.; Pipher, M. D.

    1980-01-01

    The results of electrical power system (EPS) analysis of the planned Shuttle Transportation System Flight 1 mission are presented. The capability of the orbiter EPS to support the planned flight and to provide program tape information and supplementary data specifically requested by the flight operations directorate was assessed. The analysis was accomplished using the orbiter version of the spacecraft electrical power simulator program, operating from a modified version of orbiter electrical equipment utilization baseline revision four. The results indicate that the nominal flight, as analyzed, is within the capabilities of the orbiter power generation system, but that a brief, and minimal, current overload may exist between main distributor 1 and mid power controller 1, and that inverter 9 may be overloaded for extended periods of time. A comparison of results with launch commit criteria also indicated that some of the presently existing launch redlines may be violated during the terminal countdown.

  19. LOX/Methane Main Engine Igniter Tests and Modeling

    NASA Technical Reports Server (NTRS)

    Breisacher, Kevin J.; Ajmani, Kumund

    2008-01-01

    The LOX/methane propellant combination is being considered for the Lunar Surface Access Module ascent main engine propulsion system. The proposed switch from the hypergolic propellants used in the Apollo lunar ascent engine to LOX/methane propellants requires the development of igniters capable of highly reliable performance in a lunar surface environment. An ignition test program was conducted that used an in-house designed LOX/methane spark torch igniter. The testing occurred in Cell 21 of the Research Combustion Laboratory to utilize its altitude capability to simulate a space vacuum environment. Approximately 750 ignition tests were performed to evaluate the effects of methane purity, igniter body temperature, spark energy level and frequency, mixture ratio, flowrate, and igniter geometry on the ability to obtain successful ignitions. Ignitions were obtained down to an igniter body temperature of approximately 260 R with a 10 torr back-pressure. The data obtained is also being used to anchor a CFD based igniter model.

  20. Securing Sensitive Flight and Engine Simulation Data Using Smart Card Technology

    NASA Technical Reports Server (NTRS)

    Blaser, Tammy M.

    2003-01-01

    NASA Glenn Research Center has developed a smart card prototype capable of encrypting and decrypting disk files required to run a distributed aerospace propulsion simulation. Triple Data Encryption Standard (3DES) encryption is used to secure the sensitive intellectual property on disk before, during, and after simulation execution. The prototype operates as a secure system and maintains its authorized state by safely storing and permanently retaining the encryption keys only on the smart card. The prototype is capable of authenticating a single smart card user and includes pre-simulation and post-simulation tools for analysis and training purposes. The prototype's design is highly generic and can be used to protect any sensitive disk files, with growth capability to run multiple simulations. The NASA computer engineer developed the prototype on an interoperable programming environment to enable porting to other Numerical Propulsion System Simulation (NPSS) capable operating system environments.

  1. Evaluating the Special Needs of the Military for Radiation Biodosimetry for Tactical Warfare against Deployed Troops: Comparing Military to Civilian Needs for Biodosimetry Methods

    PubMed Central

    Flood, Ann Barry; Ali, Arif N.; Boyle, Holly K.; Du, Gaixin; Satinsky, Victoria A.; Swarts, Steven G.; Williams, Benjamin B.; Demidenko, Eugene; Schreiber, Wilson; Swartz, Harold M.

    2016-01-01

    Objectives: The aim of this paper is to delineate characteristics of biodosimetry most suitable for assessing individuals who have potentially been exposed to significant radiation from a nuclear device explosion, when the primary population targeted by the explosion and needing rapid assessment for triage is civilians vs. deployed military personnel. Methods: We first carry out a systematic analysis of the requirements for biodosimetry to meet the military's needs to assess deployed troops in a warfare situation, which include accomplishing the military mission. We then systematically compare and contrast the military's special capabilities to respond and carry out biodosimetry for deployed troops in warfare, in contrast to those available to respond and conduct biodosimetry for civilians who have been targeted, e.g., by terrorists. We then compare the effectiveness of different biodosimetry methods to address military vs. civilian needs and capabilities in these scenarios and, using five representative types of biodosimetry with sufficient published data to be useful for the simulations, we estimate the number of individuals who could be assessed by military vs. civilian responders within the timeframe needed for triage decisions. Conclusions: Analyses based on these scenarios indicate that, in comparison to responses for a civilian population, a wartime military response for deployed troops has both more complex requirements for and greater capabilities to utilize different types of biodosimetry to evaluate radiation exposure in a very short timeframe after the exposure occurs. Greater complexity for the deployed military is based on factors such as a greater likelihood of partial or whole body exposure, conditions that include exposure to neutrons, and a greater likelihood of combined injury.
Our simulations showed, for both the military and civilian response, that a very fast rate of initiating the processing (24,000 per day) is needed to have at least some methods capable of completing the assessment of 50,000 people within a 2 or 6 day timeframe following exposure. This in turn suggests a very high capacity (i.e., laboratories, devices, supplies and expertise) would be necessary to achieve these rates. These simulations also demonstrated the practical importance of the military's superior capacity to minimize time to transport samples to offsite facilities and utilize the results to carry out triage quickly. Assuming sufficient resources and the fastest daily rate to initiate processing victims, the military scenario revealed that two biodosimetry methods could achieve the necessary throughput to triage 50,000 victims in 2 days (i.e., the timeframe needed for injured victims) and all five achieved the targeted throughput within 6 days. In contrast, simulations based on the civilian scenario revealed that no method could process 50,000 people in 2 days and only two could succeed within 6 days. PMID:27356061
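    The throughput comparison above reduces to simple arithmetic: given a daily rate of initiating processing, a per-sample turnaround time, and a triage deadline, how many people can be fully assessed? A hedged sketch (only the 24,000/day figure comes from the text; the turnaround values are illustrative, and the real simulations account for transport and resource constraints this ignores):

```python
def people_assessed(daily_start_rate, turnaround_days, deadline_days):
    """People whose assessment both starts and finishes by the deadline:
    only those started at least `turnaround_days` before it can count."""
    usable_days = max(0.0, deadline_days - turnaround_days)
    return int(daily_start_rate * usable_days)

# At 24,000 starts/day, a method with a 1-day turnaround clears
# 24,000 people within 2 days and 120,000 within 6 days
two_day = people_assessed(24_000, 1.0, 2.0)
six_day = people_assessed(24_000, 1.0, 6.0)
```

Note how a 3-day turnaround yields zero completed assessments within a 2-day deadline regardless of capacity, which is why transport delays dominate the civilian scenario.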

  2. Building Airport Surface HITL Simulation Capability

    NASA Technical Reports Server (NTRS)

    Chinn, Fay Cherie

    2016-01-01

    FutureFlight Central (FFC) is a high-fidelity, real-time simulator designed to study surface operations and automation. As an air traffic control tower simulator, FFC allows stakeholders such as the FAA, controllers, pilots, airports, and airlines to develop and test advanced surface and terminal-area concepts and automation, including NextGen and beyond automation concepts and tools. These technologies will improve safety and capacity and address environmental issues facing the National Airspace System. FFC also has extensive video streaming capabilities, which, combined with its 3-D database capability, make the facility ideal for any research needing an immersive virtual and/or video environment. FutureFlight Central allows human-in-the-loop testing, which accommodates human interactions and errors, giving a more complete picture than fast-time simulations. This presentation describes FFC's capabilities and the components necessary to build an airport surface human-in-the-loop simulation capability.

  3. Effect of Thermal and Chemical Treatment on the Microstructural, Mechanical and Machining Performance of W319 Al-Si-Cu Cast Alloy Engine Blocks and Directionally Solidified Machinability Test Blocks

    NASA Astrophysics Data System (ADS)

    Szablewski, Daniel

    The research presented in this work is focused on making a link between casting microstructural, mechanical and machining properties for 319 Al-Si sand cast components. In order to achieve this, a unique Machinability Test Block (MTB) is designed to simulate the Nemak V6 Al-Si engine block solidification behavior. This MTB is then utilized to cast structures with in-situ nano-alumina particle master alloy additions that are Mg based, as well as independent in-situ Mg additions, and Sr additions to the MTB. The Universal Metallurgical Simulator and Analyzer (UMSA) Technology Platform is utilized for characterization of each cast structure at different Secondary Dendrite Arm Spacing (SDAS) levels. The rapid quench method and Jominy testing are used to assess the capability of the nano-alumina master alloy to modify the microstructure at different SDAS levels. Mechanical property assessment of the MTB is done at different SDAS levels on cast structures with master alloy additions described above. Weibull and Quality Index statistical analysis tools are then utilized to assess the mechanical properties. The MTB is also used to study single pass high speed face milling and bi-metallic cutting operations where the Al-Si hypoeutectic structure is combined with hypereutectic Al-Si liners and cast iron cylinder liners. These studies are utilized to aid the implementation of Al-Si liners into the Nemak V6 engine block and bi-metallic cutting of the head decks. Machining behavior is also quantified for the investigated microstructures, and the Silicon Modification Level (SiML) is utilized for microstructural analysis as it relates to the machining behavior.

  4. Molecular characterization of whey protein hydrolysate fractions with ferrous chelating and enhanced iron solubility capabilities.

    PubMed

    O'Loughlin, Ian B; Kelly, Phil M; Murray, Brian A; FitzGerald, Richard J; Brodkorb, Andre

    2015-03-18

    The ferrous (Fe2+) chelating capabilities of WPI hydrolysate fractions produced via cascade membrane filtration were investigated, specifically 1 kDa permeate (P) and 30 kDa retentate (R) fractions. The 1 kDa-P possessed a Fe2+ chelating capability at 1 g L⁻¹ equivalent to 84.4 μM EDTA (for 30 kDa-R the value was 8.7 μM EDTA). Fourier transform infrared (FTIR) spectroscopy was utilized to investigate the structural characteristics of hydrolysates and molecular interactions with Fe2+. Solid-phase extraction was employed to enrich for chelating activity; the most potent chelating fraction was enriched in histidine and lysine. The solubility of ferrous sulfate solutions (10 mM) over a range of pH values was significantly (P<0.05) improved in dispersions of hydrolysate fraction solutions (10 g protein L⁻¹). Total iron solubility was improved by 72% in the presence of the 1 kDa-P fraction following simulated gastrointestinal digestion (SGID) compared to control FeSO4·7H2O solutions.

  5. A High Temperature Silicon Carbide mosfet Power Module With Integrated Silicon-On-Insulator-Based Gate Drive

    DOE PAGES

    Wang, Zhiqiang; Shi, Xiaojie; Tolbert, Leon M.; ...

    2014-04-30

    Here we present a board-level integrated silicon carbide (SiC) MOSFET power module for high temperature and high power density application. Specifically, a silicon-on-insulator (SOI)-based gate driver capable of operating at 200°C ambient temperature is designed and fabricated. The sourcing and sinking current capability of the gate driver are tested under various ambient temperatures. Also, a 1200 V/100 A SiC MOSFET phase-leg power module is developed utilizing high temperature packaging technologies. The static characteristics, switching performance, and short-circuit behavior of the fabricated power module are fully evaluated at different temperatures. Moreover, a buck converter prototype composed of the SOI gate driver and SiC power module is built for high temperature continuous operation. The converter is operated at different switching frequencies up to 100 kHz, with its junction temperature monitored by a thermosensitive electrical parameter and compared with thermal simulation results. The experimental results from the continuous operation demonstrate the high temperature capability of the power module at a junction temperature greater than 225°C.

  6. Efficient Power-Transfer Capability Analysis of the TET System Using the Equivalent Small Parameter Method.

    PubMed

    Yanzhen Wu; Hu, A P; Budgett, D; Malpas, S C; Dissanayake, T

    2011-06-01

    Transcutaneous energy transfer (TET) enables the transfer of power across the skin without direct electrical connection. It is a mechanism for powering implantable devices for the lifetime of a patient. For maximum power transfer, it is essential that TET systems be resonant on both the primary and secondary sides, which requires considerable design effort. Consequently, a strong need exists for an efficient method to aid the design process. This paper presents an analytical technique appropriate to analyze complex TET systems. The system's steady-state solution in closed form with sufficient accuracy is obtained by employing the proposed equivalent small parameter method. It is shown that power-transfer capability can be correctly predicted without tedious iterative simulations or practical measurements. Furthermore, for TET systems utilizing a current-fed push-pull soft switching resonant converter, it is found that the maximum energy transfer does not occur when the primary and secondary resonant tanks are "tuned" to the nominal resonant frequency. An optimal tuning point exists, corresponding to the system's maximum power-transfer capability when optimal tuning capacitors are applied.
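    The tuning problem at the heart of TET design starts from the undamped LC resonance condition f = 1/(2π√(LC)); a small sketch of tuning a coil to a target frequency (the component values are illustrative, and the paper's point is precisely that loaded, coupled systems deviate from this nominal condition):

```python
import math

def resonant_frequency(inductance, capacitance):
    """Undamped LC resonant frequency in Hz: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance * capacitance))

def tuning_capacitance(inductance, f_target):
    """Capacitance that resonates a given inductance at f_target."""
    return 1.0 / (inductance * (2.0 * math.pi * f_target) ** 2)

# Illustrative TET-like values: a 10 uH coil tuned to 200 kHz
C = tuning_capacitance(10e-6, 200e3)      # about 63 nF
f = resonant_frequency(10e-6, C)          # recovers the 200 kHz target
```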

  7. Development of display design and command usage guidelines for Spacelab experiment computer applications

    NASA Technical Reports Server (NTRS)

    Dodson, D. W.; Shields, N. L., Jr.

    1979-01-01

    Individual Spacelab experiments are responsible for developing their CRT display formats and interactive command scenarios for payload crew monitoring and control of experiment operations via the Spacelab Data Display System (DDS). In order to enhance crew training and flight operations, it was important to establish some standardization of the crew/experiment interface among different experiments by providing standard methods and techniques for data presentation and experiment commanding via the DDS. In order to establish optimum usage guidelines for the Spacelab DDS, the capabilities and limitations of the hardware and Experiment Computer Operating System design had to be considered. Since the operating system software and hardware design had already been established, the Display and Command Usage Guidelines were constrained to the capabilities of the existing system design. Empirical evaluations were conducted on a DDS simulator to determine optimum operator/system interface utilization of the system capabilities. Display parameters such as information location, display density, data organization, status presentation and dynamic update effects were evaluated in terms of response times and error rates.

  8. A Satellite Data Analysis and CubeSat Instrument Simulator Tool for Simultaneous Multi-spacecraft Measurements of Solar Energetic Particles

    NASA Astrophysics Data System (ADS)

    Vannitsen, Jordan; Rizzitelli, Federico; Wang, Kaiti; Segret, Boris; Juang, Jyh-Ching; Miau, Jiun-Jih

    2017-12-01

    This paper presents a Multi-satellite Data Analysis and Simulator Tool (MDAST), developed with the original goal to support the science requirements of a Martian 3-Unit CubeSat mission profile named Bleeping Interplanetary Radiation Determination Yo-yo (BIRDY). MDAST was firstly designed and tested by taking into account the positions, attitudes, instruments field of view and energetic particles flux measurements from four spacecrafts (ACE, MSL, STEREO A, and STEREO B). Secondly, the simulated positions, attitudes and instrument field of view from the BIRDY CubeSat have been adapted for input. And finally, this tool can be used for data analysis of the measurements from the four spacecrafts mentioned above so as to simulate the instrument trajectory and observation capabilities of the BIRDY CubeSat. The onset, peak and end time of a solar particle event is specifically defined and identified with this tool. It is not only useful for the BIRDY mission but also for analyzing data from the four satellites aforementioned and can be utilized for other space weather missions with further customization.
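    Event identification of the kind described above is often done with a simple threshold rule on the flux time series; a hedged sketch (the threshold-crossing definition and values here are illustrative, not MDAST's specific onset/peak/end criteria):

```python
def find_event(flux, background, k=3.0):
    """Locate onset, peak and end indices of a flux enhancement: onset is the
    first sample above k * background, end is the first later sample back at
    or below that threshold, and peak is the maximum in between."""
    threshold = k * background
    onset = next((i for i, f in enumerate(flux) if f > threshold), None)
    if onset is None:
        return None  # no enhancement above threshold
    end = next((i for i in range(onset, len(flux)) if flux[i] <= threshold),
               len(flux) - 1)
    peak = max(range(onset, end + 1), key=lambda i: flux[i])
    return onset, peak, end

# Synthetic flux series: quiet background near 1, event peaking at 50
flux = [1, 1, 2, 10, 30, 50, 20, 5, 2, 1, 1]
event = find_event(flux, background=1.0)
```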

  9. Use of advanced modeling techniques to optimize thermal packaging designs.

    PubMed

    Formato, Richard M; Potami, Raffaele; Ahmed, Iftekhar

    2010-01-01

    Through a detailed case study the authors demonstrate, for the first time, the capability of using advanced modeling techniques to correctly simulate the transient temperature response of a convective flow-based thermal shipper design. The objective of this case study was to demonstrate that simulation could be utilized to design a 2-inch-wall polyurethane (PUR) shipper to hold its product box temperature between 2 and 8 °C over the prescribed 96-h summer profile (product box is the portion of the shipper that is occupied by the payload). Results obtained from numerical simulation are in excellent agreement with empirical chamber data (within ±1 °C at all times), and geometrical locations of simulation maximum and minimum temperature match well with the corresponding chamber temperature measurements. Furthermore, a control simulation test case was run (results taken from identical product box locations) to compare the coupled conduction-convection model with a conduction-only model, which to date has been the state-of-the-art method. For the conduction-only simulation, all fluid elements were replaced with "solid" elements of identical size and assigned thermal properties of air. While results from the coupled thermal/fluid model closely correlated with the empirical data (±1 °C), the conduction-only model was unable to correctly capture the payload temperature trends, showing a sizeable error compared to empirical values (ΔT > 6 °C). A modeling technique capable of correctly capturing the thermal behavior of passively refrigerated shippers can be used to quickly evaluate and optimize new packaging designs. Such a capability provides a means to reduce the cost and required design time of shippers while simultaneously improving their performance. 
Another advantage comes from using thermal modeling (assuming a validated model is available) to predict the temperature distribution in a shipper that is exposed to ambient temperatures which were not bracketed during its validation. Thermal packaging is routinely used by the pharmaceutical industry to provide passive and active temperature control of their thermally sensitive products from manufacture through end use (termed the cold chain). In this study, the authors focus on passive temperature control (passive control does not require any external energy source and is entirely based on specific and/or latent heat of shipper components). As temperature-sensitive pharmaceuticals are being transported over longer distances, cold chain reliability is essential. To achieve reliability, a significant amount of time and resources must be invested in design, test, and production of optimized temperature-controlled packaging solutions. To shorten the cumbersome trial and error approach (design/test/design/test …), computer simulation (virtual prototyping and testing of thermal shippers) is a promising method. Although several companies have attempted to develop such a tool, there has been limited success to date. Through a detailed case study the authors demonstrate, for the first time, the capability of using advanced modeling techniques to correctly simulate the transient temperature response of a coupled conductive/convective-based thermal shipper. A modeling technique capable of correctly capturing shipper thermal behavior can be used to develop packaging designs more quickly, reducing up-front costs while also improving shipper performance.
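    The conduction-only baseline the authors compare against can be illustrated with a standard explicit finite-difference model of 1-D transient conduction (the wall thickness, diffusivity, grid and boundary conditions below are illustrative assumptions, not the authors' shipper model):

```python
import numpy as np

def conduct_1d(T0, T_ambient, alpha, dx, dt, steps):
    """Explicit FTCS finite-difference scheme for 1-D transient conduction
    with both faces held at the ambient temperature (Dirichlet boundaries).
    Stable only when r = alpha * dt / dx**2 <= 0.5."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable for this dt/dx"
    T = T0.astype(float).copy()
    T[0] = T[-1] = T_ambient          # faces held at ambient
    for _ in range(steps):
        # Interior update: T_i += r * (T_{i+1} - 2 T_i + T_{i-1})
        T[1:-1] += r * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    return T

# A 5 cm insulating wall initially at 5 C, suddenly exposed to 35 C ambient
n = 51
T0 = np.full(n, 5.0)
T = conduct_1d(T0, 35.0, alpha=1.5e-7, dx=0.05 / (n - 1), dt=1.0, steps=3600)
```

Capturing the convective flow inside the payload cavity, as the paper shows, requires coupling such a conduction model to a fluid solver rather than treating air as a stationary solid.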

  10. Nationwide Buildings Energy Research enabled through an integrated Data Intensive Scientific Workflow and Advanced Analysis Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kleese van Dam, Kerstin; Lansing, Carina S.; Elsethagen, Todd O.

    2014-01-28

    Modern workflow systems enable scientists to run ensemble simulations at unprecedented scales and levels of complexity, allowing them to study system sizes previously impossible to achieve, due to the inherent resource requirements needed for the modeling work. However, as a result of these new capabilities the science teams suddenly also face unprecedented data volumes that they are unable to analyze with their existing tools and methodologies in a timely fashion. In this paper we will describe the ongoing development work to create an integrated data intensive scientific workflow and analysis environment that offers researchers the ability to easily create and execute complex simulation studies and provides them with different scalable methods to analyze the resulting data volumes. The integration of simulation and analysis environments is hereby not only a question of ease of use, but supports fundamental functions in the correlated analysis of simulation input, execution details and derived results for multi-variant, complex studies. To this end the team extended and integrated the existing capabilities of the Velo data management and analysis infrastructure, the MeDICi data intensive workflow system and RHIPE, the R-for-Hadoop version of the well-known statistics package, as well as developing a new visual analytics interface for the result exploitation by multi-domain users. The capabilities of the new environment are demonstrated on a use case that focusses on the Pacific Northwest National Laboratory (PNNL) building energy team, showing how they were able to take their previously local scale simulations to a nationwide level by utilizing data intensive computing techniques not only for their modeling work, but also for the subsequent analysis of their modeling results.
    As part of the PNNL research initiative PRIMA (Platform for Regional Integrated Modeling and Analysis) the team performed an initial 3-year study of building energy demands for the US Eastern Interconnect domain, which they are now planning to extend to predict the demand for the complete century. The initial study raised their data demands from a few GB to 400 GB for the 3-year study, with tens of TB expected for the full century.

  11. Using Coupled Energy, Airflow and IAQ Software (TRNSYS/CONTAM) to Evaluate Building Ventilation Strategies.

    PubMed

    Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J

    2016-03-01

    Building energy analysis tools are available in many forms that provide the ability to address a broad spectrum of energy-related issues in various combinations. Often these tools operate in isolation from one another, making it difficult to evaluate the interactions between related phenomena and interacting systems, forcing oversimplified assumptions to be made about various phenomena that could otherwise be addressed directly with another tool. One example of such interdependence is the interaction between heat transfer, inter-zone airflow and indoor contaminant transport. In order to better address these interdependencies, the National Institute of Standards and Technology (NIST) has developed an updated version of the multi-zone airflow and contaminant transport modelling tool, CONTAM, along with a set of utilities to enable coupling of the full CONTAM model with the TRNSYS simulation tool in a more seamless manner and with additional capabilities that were previously not available. This paper provides an overview of these new capabilities and applies them to simulating a medium-size office building. These simulations address the interaction between whole-building energy, airflow and contaminant transport in evaluating various ventilation strategies including natural and demand-controlled ventilation. CONTAM has been in practical use for many years allowing building designers, as well as IAQ and ventilation system analysts, to simulate the complex interactions between building physical layout and HVAC system configuration in determining building airflow and contaminant transport. It has been widely used to design and analyse smoke management systems and evaluate building performance in response to chemical, biological and radiological events. 
    While CONTAM has been used to address design and performance of buildings implementing energy conserving ventilation systems, e.g., natural and hybrid, this new coupled simulation capability will enable users to apply the tool to couple CONTAM with existing energy analysis software to address the interaction between indoor air quality considerations and energy conservation measures in building design and analysis. This paper presents two practical case studies using the coupled modelling tool to evaluate IAQ performance of a CO2-based demand-controlled ventilation system under different levels of building envelope airtightness and the design and analysis of a natural ventilation system.
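    The demand-controlled ventilation case study can be illustrated with a minimal single-zone CO2 mass balance under a bang-bang control rule (all values and the control law below are illustrative; CONTAM's multi-zone model is far more detailed):

```python
def co2_dcv(steps, dt, volume, gen_m3s, q_min, q_max, setpoint, outdoor=400.0):
    """Single-zone CO2 mass balance in ppm with a bang-bang demand-controlled
    ventilation rule: switch to q_max airflow (m^3/s) while CO2 exceeds the
    setpoint.  dC/dt = 1e6 * G / V + (Q / V) * (C_out - C)."""
    c = outdoor
    trace = []
    for _ in range(steps):
        q = q_max if c > setpoint else q_min  # demand-controlled switching
        c += dt * (1e6 * gen_m3s / volume + (q / volume) * (outdoor - c))
        trace.append(c)
    return trace

# Four occupants (~5e-6 m^3/s CO2 each) in a 150 m^3 room, 1000 ppm setpoint,
# simulated for two hours at 1 s steps
trace = co2_dcv(steps=7200, dt=1.0, volume=150.0, gen_m3s=2e-5,
                q_min=0.01, q_max=0.1, setpoint=1000.0)
```

With these rates the concentration climbs past the setpoint and then hovers near it as the controller toggles, which is the qualitative behavior a DCV system is designed to produce.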

  12. Using Coupled Energy, Airflow and IAQ Software (TRNSYS/CONTAM) to Evaluate Building Ventilation Strategies

    PubMed Central

    Dols, W. Stuart.; Emmerich, Steven J.; Polidoro, Brian J.

    2016-01-01

Building energy analysis tools are available in many forms that provide the ability to address a broad spectrum of energy-related issues in various combinations. Often these tools operate in isolation from one another, making it difficult to evaluate the interactions between related phenomena and interacting systems, forcing oversimplified assumptions to be made about various phenomena that could otherwise be addressed directly with another tool. One example of such interdependence is the interaction between heat transfer, inter-zone airflow and indoor contaminant transport. In order to better address these interdependencies, the National Institute of Standards and Technology (NIST) has developed an updated version of the multi-zone airflow and contaminant transport modelling tool, CONTAM, along with a set of utilities to enable coupling of the full CONTAM model with the TRNSYS simulation tool in a more seamless manner and with additional capabilities that were previously not available. This paper provides an overview of these new capabilities and applies them to simulating a medium-size office building. These simulations address the interaction between whole-building energy, airflow and contaminant transport in evaluating various ventilation strategies including natural and demand-controlled ventilation. Practical Application: CONTAM has been in practical use for many years allowing building designers, as well as IAQ and ventilation system analysts, to simulate the complex interactions between building physical layout and HVAC system configuration in determining building airflow and contaminant transport. It has been widely used to design and analyse smoke management systems and evaluate building performance in response to chemical, biological and radiological events.
While CONTAM has been used to address design and performance of buildings implementing energy conserving ventilation systems, e.g., natural and hybrid, this new coupled simulation capability will enable users to apply the tool to couple CONTAM with existing energy analysis software to address the interaction between indoor air quality considerations and energy conservation measures in building design and analysis. This paper presents two practical case studies using the coupled modelling tool to evaluate IAQ performance of a CO2-based demand-controlled ventilation system under different levels of building envelope airtightness and the design and analysis of a natural ventilation system. PMID:27099405
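The CO2-based demand-controlled ventilation (DCV) evaluated in the paper rests on a single-zone contaminant mass balance, V·dC/dt = G + Q·(C_out − C). The sketch below is an illustrative toy version of that balance with a simple bang-bang controller; all names and parameter values are assumptions for illustration, not CONTAM's or TRNSYS's actual interfaces.

```python
# Single-zone CO2 mass balance with a bang-bang demand-controlled-ventilation
# (DCV) controller: V * dC/dt = G + Q * (C_out - C), forward Euler in time.
# All names and numbers are illustrative assumptions, not CONTAM/TRNSYS APIs.

def simulate_dcv(hours=8, dt=60.0, volume=500.0,   # zone volume, m^3
                 gen_m3s=5e-5,                     # CO2 generation (~10 people), m^3/s
                 c_out=400e-6,                     # outdoor CO2 mole fraction
                 setpoint=1000e-6,                 # DCV switching setpoint
                 q_min=0.05, q_max=0.5):           # ventilation airflow bounds, m^3/s
    c = c_out
    trace = []
    for _ in range(int(hours * 3600 / dt)):
        q = q_max if c > setpoint else q_min       # bang-bang DCV control
        c += (gen_m3s + q * (c_out - c)) / volume * dt
        trace.append(c)
    return trace

trace = simulate_dcv()
print(f"final indoor CO2: {trace[-1] * 1e6:.0f} ppm")   # hovers near the setpoint
```

With these numbers the controller holds the zone near the 1000 ppm setpoint; tightening the envelope or changing occupancy shifts where the bang-bang cycle settles, which is the kind of trade-off the coupled tool is built to quantify.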

  13. User's guide to the Variably Saturated Flow (VSF) process to MODFLOW

    USGS Publications Warehouse

    Thoms, R. Brad; Johnson, Richard L.; Healy, Richard W.

    2006-01-01

    A new process for simulating three-dimensional (3-D) variably saturated flow (VSF) using Richards' equation has been added to the 3-D modular finite-difference ground-water model MODFLOW. Five new packages are presented here as part of the VSF Process--the Richards' Equation Flow (REF1) Package, the Seepage Face (SPF1) Package, the Surface Ponding (PND1) Package, the Surface Evaporation (SEV1) Package, and the Root Zone Evapotranspiration (RZE1) Package. Additionally, a new Adaptive Time-Stepping (ATS1) Package is presented for use by both the Ground-Water Flow (GWF) Process and VSF. The VSF Process allows simulation of flow in unsaturated media above the ground-water zone and facilitates modeling of ground-water/surface-water interactions. Model performance is evaluated by comparison to an analytical solution for one-dimensional (1-D) constant-head infiltration (Dirichlet boundary condition), field experimental data for a 1-D constant-head infiltration, laboratory experimental data for two-dimensional (2-D) constant-flux infiltration (Neumann boundary condition), laboratory experimental data for 2-D transient drainage through a seepage face, and numerical model results (VS2DT) of a 2-D flow-path simulation using realistic surface boundary conditions. A hypothetical 3-D example case also is presented to demonstrate the new capability using periodic boundary conditions (for example, daily precipitation) and varied surface topography over a larger spatial scale (0.133 square kilometer). The new model capabilities retain the modular structure of the MODFLOW code and preserve MODFLOW's existing capabilities as well as compatibility with commercial pre-/post-processors. The overall success of the VSF Process in simulating mixed boundary conditions and variable soil types demonstrates its utility for future hydrologic investigations. 
This report presents a new flow package implementing the governing equations for variably saturated ground-water flow, four new boundary condition packages unique to unsaturated flow, the Adaptive Time-Stepping Package for use with both the GWF Process and the new VSF Process, detailed descriptions of the input and output files for each package, and six simulation examples verifying model performance.

  14. Scenario management and automated scenario generation

    NASA Astrophysics Data System (ADS)

    McKeever, William; Gilmour, Duane; Lehman, Lynn; Stirtzinger, Anthony; Krause, Lee

    2006-05-01

The military planning process utilizes simulation to determine the appropriate course of action (COA) that will achieve a campaign end state. However, due to the difficulty in developing and generating simulation-level COAs, only a few COAs are simulated. This may have been appropriate for traditional conflicts, but the evolution of warfare from attrition-based to effects-based strategies, as well as the complexities of 4th-generation warfare and asymmetric adversaries, have placed additional demands on military planners and simulation. To keep pace with this dynamic, changing environment, planners must be able to perform continuous, multiple, "what-if" COA analysis. Scenario management and generation are critical elements to achieving this goal. An effects-based scenario generation research project demonstrated the feasibility of automated scenario generation techniques that support multiple stove-pipe and emerging broad-scope simulations. This paper will discuss a case study in which the scenario generation capability was employed to support COA simulations to identify plan effectiveness. The study demonstrated the value of using multiple simulation runs to evaluate the effectiveness of alternate COAs in achieving the overall campaign (metrics-based) objectives. The paper will discuss how scenario generation technology can be employed to allow military commanders and mission planning staff to understand the impact of command decisions on the battlespace of tomorrow.

  15. Hitchhiker mission operations: Past, present, and future

    NASA Technical Reports Server (NTRS)

    Anderson, Kathryn

    1995-01-01

    What is mission operations? Mission operations is an iterative process aimed at achieving the greatest possible mission success with the resources available. The process involves understanding of the science objectives, investigation of which system capabilities can best meet these objectives, integration of the objectives and resources into a cohesive mission operations plan, evaluation of the plan through simulations, and implementation of the plan in real-time. In this paper, the authors present a comprehensive description of what the Hitchhiker mission operations approach is and why it is crucial to mission success. The authors describe the significance of operational considerations from the beginning and throughout the experiment ground and flight systems development. The authors also address the necessity of training and simulations. Finally, the authors cite several examples illustrating the benefits of understanding and utilizing the mission operations process.

  16. Analytical Investigation of the Limits for the In-Plane Thermal Conductivity Measurement Using a Suspended Membrane Setup

    NASA Astrophysics Data System (ADS)

    Linseis, V.; Völklein, F.; Reith, H.; Woias, P.; Nielsch, K.

    2018-06-01

An analytical study has been performed on the measurement capabilities of a 100-nm thin suspended membrane setup for in-plane thermal conductivity measurements of thin film samples using the 3ω measurement technique, utilizing a COMSOL Multiphysics simulation. The maximum measurement range under observance of given boundary conditions has been studied. Three different exemplary sample materials, with thicknesses from the nanometer to the micrometer range and thermal conductivities from 0.4 W/mK up to 100 W/mK, have been investigated as showcase studies. The results of the simulations have been compared to a previously published evaluation model, in order to determine the deviation between both and thereby the measurement limit. As thermal transport properties are temperature dependent, all calculations refer to constant room-temperature conditions.

  17. A high accuracy sequential solver for simulation and active control of a longitudinal combustion instability

    NASA Technical Reports Server (NTRS)

    Shyy, W.; Thakur, S.; Udaykumar, H. S.

    1993-01-01

    A high accuracy convection scheme using a sequential solution technique has been developed and applied to simulate the longitudinal combustion instability and its active control. The scheme has been devised in the spirit of the Total Variation Diminishing (TVD) concept with special source term treatment. Due to the substantial heat release effect, a clear delineation of the key elements employed by the scheme, i.e., the adjustable damping factor and the source term treatment has been made. By comparing with the first-order upwind scheme previously utilized, the present results exhibit less damping and are free from spurious oscillations, offering improved quantitative accuracy while confirming the spectral analysis reported earlier. A simple feedback type of active control has been found to be capable of enhancing or attenuating the magnitude of the combustion instability.
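The contrast the abstract draws between a TVD-style limited scheme and first-order upwinding can be illustrated on plain linear advection (not the combustion solver itself). The sketch below is a minmod-limited MUSCL scheme on a periodic domain; the limiter suppresses the spurious oscillations an unlimited second-order scheme would produce, while damping far less than first-order upwind.

```python
import numpy as np

def minmod(a, b):
    # Minmod limiter: picks the smaller-magnitude slope, zero at extrema
    # (this is what keeps the scheme Total Variation Diminishing).
    return np.where(a * b > 0.0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def advect_tvd(u0, cfl, nsteps):
    """Advance du/dt + a*du/dx = 0 (a > 0, periodic) with a MUSCL/minmod scheme."""
    u = u0.copy()
    for _ in range(nsteps):
        dl = u - np.roll(u, 1)               # backward difference
        dr = np.roll(u, -1) - u              # forward difference
        s = minmod(dl, dr)                   # limited slope per cell
        face = u + 0.5 * (1.0 - cfl) * s     # upwinded value at each right face
        u = u - cfl * (face - np.roll(face, 1))
    return u

# advect a square wave once around a periodic domain
u0 = np.where((np.arange(100) > 30) & (np.arange(100) < 60), 1.0, 0.0)
u = advect_tvd(u0, cfl=0.5, nsteps=200)      # 200 * 0.5 = 100 cells = one period
print(u.min(), u.max())                      # stays within [0, 1]: no overshoots
```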

  18. Development and evaluation of the Screening Trajectory Ozone Prediction System (STOPS, version 1.0)

    NASA Astrophysics Data System (ADS)

    Czader, B. H.; Percell, P.; Byun, D.; Choi, Y.

    2014-11-01

A hybrid Lagrangian-Eulerian modeling tool has been developed using the Eulerian framework of the Community Multiscale Air Quality (CMAQ) model. It is a moving nest that utilizes saved original CMAQ simulation results to provide the boundary conditions, initial conditions, emissions, and meteorological parameters necessary for a simulation. Given that these files are available, this tool can run independently from the CMAQ whole-domain simulation, and it is designed to simulate source-receptor relationships upon changes in emissions. In this tool, the original CMAQ horizontal domain is reduced to a small sub-domain that follows a trajectory defined by the mean mixed-layer wind. It has the same vertical structure and physical and chemical interactions as CMAQ, except for the advection calculation. The advantage of this tool compared to other Lagrangian models is its capability of utilizing realistic boundary conditions that change with space and time, as well as its detailed chemistry treatment. The correctness of the algorithms and the overall performance were evaluated against CMAQ simulation results. Its performance depends on the atmospheric conditions occurring during the simulation period, with the comparisons being most similar to CMAQ results under uniform wind conditions. The mean bias varies between -0.03 and -0.78 and the slope is between 0.99 and 1.01 for the different analyzed cases. For complicated meteorological conditions, such as wind circulation, the simulated mixing ratios deviate from CMAQ values as a result of the Lagrangian approach of using the mean wind for the nest's movement, but they are still close, with the mean bias varying between 0.07 and -4.29 and the slope varying between 0.95 and 1.063 for the different analyzed cases.
For historical reasons, this hybrid Lagrangian-Eulerian tool is named the Screening Trajectory Ozone Prediction System (STOPS), but its use is not limited to ozone prediction; like CMAQ, it can simulate concentrations of many species, including particulate matter and some toxic compounds, such as formaldehyde and 1,3-butadiene.
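The core of the moving-nest idea — the sub-domain center advected by the mean mixed-layer wind — can be sketched in a few lines. The function and values below are purely illustrative, not STOPS code; the real tool also interpolates saved CMAQ fields onto the nest at each step.

```python
def nest_trajectory(x0, y0, winds, dt):
    """Advect a moving-nest center with the mean mixed-layer wind.

    winds: sequence of (u, v) mean wind components (m/s), one pair per step.
    Returns the list of nest-center positions (m). Illustrative sketch only;
    the actual STOPS trajectory and field-interpolation logic are more involved.
    """
    x, y = x0, y0
    path = [(x, y)]
    for u, v in winds:
        x += u * dt
        y += v * dt
        path.append((x, y))
    return path

path = nest_trajectory(0.0, 0.0, winds=[(5.0, 0.0), (5.0, 2.0)], dt=3600.0)
print(path[-1])   # (36000.0, 7200.0): 10 h of drift at the given winds
```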

  19. Marshall Space Flight Center's Impact Testing Facility Capabilities

    NASA Technical Reports Server (NTRS)

    Evans, Steve; Finchum, Andy; Hubbs, Whitney; Gray, Perry

    2008-01-01

Marshall Space Flight Center's (MSFC) Impact Testing Facility (ITF) serves as an important installation for space and missile related materials science research. The ITF was established and began its research in spacecraft debris shielding in the early 1960s, then played a major role in the International Space Station debris shield development. As NASA became more interested in launch debris and in-flight impact concerns, the ITF grew to include research in a variety of impact genres. Collaborative partnerships with the DoD led to a wider range of impact capabilities being relocated to MSFC as a result of the closure of the Particle Impact Facility in Santa Barbara, California. The Particle Impact Facility had a 30-year history of providing evaluations of aerospace materials and components during flights through rain, ice, and solid particle environments at subsonic through hypersonic velocities. The facility's unique capabilities were deemed a 'National Asset' by the DoD. The ITF now has capabilities including environmental, ballistic, and hypervelocity impact testing utilizing an array of air, powder, and two-stage light gas guns to accommodate a variety of projectile and target types and sizes. The relocated test equipment was dated and in need of upgrade. Numerous upgrades including new instrumentation, triggering circuitry, high speed photography, and optimized sabot designs have been implemented. Other recent research has included rain drop demise characterization tests to obtain data for inclusion in on-going model development. Future ITF improvements will be focused on continued instrumentation and performance enhancements. These enhancements will allow further, more in-depth characterization of rain drop demise and evaluation of ice crystal impact. Performance enhancements also include increasing the upper velocity limit of the current environmental guns to allow direct environmental simulation for missile components.
The current and proposed ITF capabilities range from rain to micrometeoroids allowing the widest test parameter range possible for materials investigations in support of space, atmospheric, and ground environments. These test capabilities including hydrometeor, single/multi-particle, ballistic gas guns, exploding wire gun, and light gas guns combined with Smooth Particle Hydrodynamics Code (SPHC) simulations represent the widest range of impact test capabilities in the country.

  1. DET/MPS - The GSFC Energy Balance Programs

    NASA Technical Reports Server (NTRS)

    Jagielski, J. M.

    1994-01-01

The Direct Energy Transfer (DET) and MultiMission Spacecraft Modular Power System (MPS) computer programs perform mathematical modeling and simulation to aid in the design and analysis of DET and MPS spacecraft power system performance, in order to determine the energy balance of the subsystem. A DET spacecraft power system feeds the output of the solar photovoltaic array and nickel-cadmium batteries directly to the spacecraft bus. In the MPS system, a Standard Power Regulator Unit (SPRU) is utilized to operate the array at the array's peak power point. DET and MPS perform a minute-by-minute simulation of the performance of the power system. The results of the simulation focus mainly on the output of the solar array and the characteristics of the batteries. Although both packages are limited in terms of orbital mechanics, they have sufficient capability to calculate data on eclipses and the performance of arrays for circular or near-circular orbits. DET and MPS are written in FORTRAN-77 with some VAX FORTRAN-type extensions. Both are available in three versions: GSC-13374, for DEC VAX-series computers running VMS; GSC-13443, for UNIX-based computers; and GSC-13444, for Apple Macintosh computers.
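The minute-by-minute energy-balance simulation described above can be caricatured with a simple state-of-charge loop. Every number and name below is an illustrative assumption, not DET/MPS itself: full array power in sunlight, none in eclipse, surplus charging the battery with an efficiency factor.

```python
def energy_balance(minutes, orbit_min=90, eclipse_min=35,
                   array_w=1200.0, load_w=800.0,
                   capacity_wh=1000.0, soc_wh=1000.0, charge_eff=0.9):
    """Minute-by-minute battery energy balance in the spirit of DET/MPS.

    Crude sketch: full array power in sunlight, none in eclipse; a bus
    surplus charges the battery (with efficiency), a deficit discharges it.
    All numbers are illustrative assumptions, not DET/MPS defaults.
    """
    trace = []
    for m in range(minutes):
        p_array = 0.0 if (m % orbit_min) < eclipse_min else array_w
        net_w = p_array - load_w                     # bus surplus (+) or deficit (-)
        if net_w > 0.0:
            soc_wh = min(capacity_wh, soc_wh + charge_eff * net_w / 60.0)
        else:
            soc_wh = max(0.0, soc_wh + net_w / 60.0)
        trace.append(soc_wh)
    return trace

trace = energy_balance(minutes=3 * 90)               # three 90-minute orbits
print(f"minimum state of charge: {min(trace):.0f} Wh")
```

With these illustrative numbers each orbit discharges more energy than it recharges, i.e. a negative orbit-average energy balance — exactly the condition such a tool is meant to flag during design.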

  2. Simulation evaluation of a low-altitude helicopter flight guidance system adapted for a helmet-mounted display

    NASA Technical Reports Server (NTRS)

    Swenson, Harry N.; Zelenka, Richard E.; Hardy, Gordon H.; Dearing, Munro G.

    1992-01-01

    A computer aiding concept for low-altitude helicopter flight was developed and evaluated in a real-time piloted simulation. The concept included an optimal control trajectory-generation algorithm based upon dynamic programming and a helmet-mounted display (HMD) presentation of a pathway-in-the-sky, a phantom aircraft, and flight-path vector/predictor guidance symbology. The trajectory-generation algorithm uses knowledge of the global mission requirements, a digital terrain map, aircraft performance capabilities, and advanced navigation information to determine a trajectory between mission way points that seeks valleys to minimize threat exposure. The pilot evaluation was conducted at NASA ARC moving base Vertical Motion Simulator (VMS) by pilots representing NASA, the U.S. Army, the Air Force, and the helicopter industry. The pilots manually tracked the trajectory generated by the algorithm utilizing the HMD symbology. The pilots were able to satisfactorily perform the tracking tasks while maintaining a high degree of awareness of the outside world.

  3. Extravehicular mobility unit thermal simulator

    NASA Technical Reports Server (NTRS)

    Hixon, C. W.; Phillips, M. A.

    1973-01-01

    The analytical methods, thermal model, and user's instructions for the SIM bay extravehicular mobility unit (EMU) routine are presented. This digital computer program was developed for detailed thermal performance predictions of the crewman performing a command module extravehicular activity during transearth coast. It accounts for conductive, convective, and radiative heat transfer as well as fluid flow and associated flow control components. The program is a derivative of the Apollo lunar surface EMU digital simulator. It has the operational flexibility to accept card or magnetic tape for both the input data and program logic. Output can be tabular and/or plotted and the mission simulation can be stopped and restarted at the discretion of the user. The program was developed for the NASA-JSC Univac 1108 computer system and several of the capabilities represent utilization of unique features of that system. Analytical methods used in the computer routine are based on finite difference approximations to differential heat and mass balance equations which account for temperature or time dependent thermo-physical properties.

  4. Study on photochemical analysis system (VLES) for EUV lithography

    NASA Astrophysics Data System (ADS)

    Sekiguchi, A.; Kono, Y.; Kadoi, M.; Minami, Y.; Kozawa, T.; Tagawa, S.; Gustafson, D.; Blackborow, P.

    2007-03-01

A system for photo-chemical analysis of EUV lithography processes has been developed. This system consists of three units: (1) an exposure unit that uses the Z-Pinch (Energetiq Tech.) EUV light source (DPP) to carry out a flood exposure, (2) a measurement system, RDA (Litho Tech Japan), for the development rate of photo-resists, and (3) a simulation unit that utilizes PROLITH (KLA-Tencor) to calculate the resist profiles and process latitude using the measured development rate data. With this system, preliminary evaluation of the performance of EUV lithography can be performed without any lithography tool (stepper and scanner system) capable of imaging and alignment. Profiles for a 32 nm line-and-space pattern are simulated using VLES for the EUV resist (Posi-2 resist by TOK), which has sensitivity at the 13.5 nm wavelength. The simulation successfully predicts the resist behavior. Thus it is confirmed that the system enables efficient evaluation of the performance of EUV lithography processes.

  5. TAS: A Transonic Aircraft/Store flow field prediction code

    NASA Technical Reports Server (NTRS)

    Thompson, D. S.

    1983-01-01

A numerical procedure has been developed that has the capability to predict the transonic flow field around an aircraft with an arbitrarily located, separated store. The TAS code, the product of a joint General Dynamics/NASA ARC/AFWAL research and development program, will serve as the basis for a comprehensive predictive method for aircraft with arbitrary store loadings. This report describes the numerical procedures employed to simulate the flow field around a configuration of this type. The validity of TAS code predictions is established by comparison with existing experimental data. In addition, future areas of development of the code are outlined. A brief description of code utilization is also given in the Appendix. The aircraft/store configuration is simulated using a mesh embedding approach. The computational domain is discretized by three meshes: (1) a planform-oriented wing/body fine mesh, (2) a cylindrical store mesh, and (3) a global Cartesian crude mesh. This embedded mesh scheme enables simulation of stores with fins of arbitrary angular orientation.

  6. Simulation of Watts Bar Unit 1 Initial Startup Tests with Continuous Energy Monte Carlo Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Godfrey, Andrew T; Gehin, Jess C; Bekar, Kursat B

    2014-01-01

The Consortium for Advanced Simulation of Light Water Reactors* is developing a collection of methods and software products known as VERA, the Virtual Environment for Reactor Applications. One component of the testing and validation plan for VERA is comparison of neutronics results to a set of continuous energy Monte Carlo solutions for a range of pressurized water reactor geometries using the SCALE component KENO-VI developed by Oak Ridge National Laboratory. Recent improvements in data, methods, and parallelism have enabled KENO, previously utilized predominately as a criticality safety code, to demonstrate excellent capability and performance for reactor physics applications. The highly detailed and rigorous KENO solutions provide a reliable numeric reference for VERA neutronics and also demonstrate the most accurate predictions achievable by modeling and simulation tools for comparison to operating plant data. This paper demonstrates the performance of KENO-VI for the Watts Bar Unit 1 Cycle 1 zero power physics tests, including reactor criticality, control rod worths, and isothermal temperature coefficients.

  7. Employment of single-diode model to elucidate the variations in photovoltaic parameters under different electrical and thermal conditions

    PubMed Central

    Hameed, Shilan S.; Aziz, Fakhra; Sulaiman, Khaulah; Ahmad, Zubair

    2017-01-01

In this research work, numerical simulations are performed to correlate the photovoltaic parameters with various internal and external factors influencing the performance of solar cells. A single-diode modeling approach is utilized for this purpose, and the theoretical investigations are compared with reported experimental evidence for organic and inorganic solar cells at various electrical and thermal conditions. Electrical parameters include the parasitic resistances (Rs and Rp) and ideality factor (n), while the thermal parameters can be defined by the cell's temperature (T). A comprehensive analysis concerning broad spectral variations in the short circuit current (Isc), open circuit voltage (Voc), fill factor (FF) and efficiency (η) is presented and discussed. It was generally concluded that there exists a good agreement between the simulated results and experimental findings. Nevertheless, the controversial consequence of the temperature impact on the performance of organic solar cells necessitates the development of a complementary model capable of accurately simulating the temperature impact on the performance of these devices. PMID:28793325
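The single-diode model referred to above is the implicit relation I = Iph − I0·(exp((V + I·Rs)/(n·Vt)) − 1) − (V + I·Rs)/Rp, which must be solved numerically because I appears on both sides. A minimal sketch that solves it by bisection (all parameter values are illustrative, not taken from the paper):

```python
import math

def diode_current(v, i_ph=5.0, i_0=1e-9, n=1.5, rs=0.1, rp=100.0, t=298.15):
    """Solve the implicit single-diode equation for cell current at voltage v:
        I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rp
    via bisection. Parameter values are illustrative, not from the paper.
    """
    vt = 1.380649e-23 * t / 1.602176634e-19        # thermal voltage kT/q
    def f(i):
        return (i_ph - i_0 * (math.exp((v + i * rs) / (n * vt)) - 1.0)
                - (v + i * rs) / rp - i)
    lo, hi = -i_ph, 2.0 * i_ph                     # bracket: f(lo) > 0 > f(hi)
    for _ in range(100):                           # bisect to machine precision
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

i_sc = diode_current(0.0)   # short-circuit current, slightly below Iph
print(f"Isc = {i_sc:.3f} A")
```

Sweeping v from 0 to Voc with this function traces the full I-V curve, from which FF and η follow; varying Rs, Rp, n, and T then reproduces the kinds of parameter studies the abstract describes.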

  8. Computer aiding for low-altitude helicopter flight

    NASA Technical Reports Server (NTRS)

    Swenson, Harry N.

    1991-01-01

    A computer-aiding concept for low-altitude helicopter flight was developed and evaluated in a real-time piloted simulation. The concept included an optimal control trajectory-generated algorithm based on dynamic programming, and a head-up display (HUD) presentation of a pathway-in-the-sky, a phantom aircraft, and flight-path vector/predictor symbol. The trajectory-generation algorithm uses knowledge of the global mission requirements, a digital terrain map, aircraft performance capabilities, and advanced navigation information to determine a trajectory between mission waypoints that minimizes threat exposure by seeking valleys. The pilot evaluation was conducted at NASA Ames Research Center's Sim Lab facility in both the fixed-base Interchangeable Cab (ICAB) simulator and the moving-base Vertical Motion Simulator (VMS) by pilots representing NASA, the U.S. Army, and the U.S. Air Force. The pilots manually tracked the trajectory generated by the algorithm utilizing the HUD symbology. They were able to satisfactorily perform the tracking tasks while maintaining a high degree of awareness of the outside world.
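The valley-seeking, dynamic-programming flavor of the trajectory generator can be illustrated with a toy grid search that minimizes cumulative altitude as a stand-in for threat exposure. This is a hypothetical sketch, not the actual NASA/Army algorithm, which works over a digital terrain map with aircraft performance constraints.

```python
def valley_path(terrain):
    """Dynamic-programming trajectory over a terrain grid, column by column.

    Cost is cumulative cell altitude (a stand-in for threat exposure); from
    each cell the path may move to the same, upper, or lower row in the next
    column. A toy version of valley-seeking trajectory generation.
    """
    rows, cols = len(terrain), len(terrain[0])
    cost = [row[:] for row in terrain]            # cost[r][0] = altitude there
    back = [[0] * cols for _ in range(rows)]
    for c in range(1, cols):
        for r in range(rows):
            best = min((cost[pr][c - 1], pr) for pr in (r - 1, r, r + 1)
                       if 0 <= pr < rows)
            cost[r][c] = terrain[r][c] + best[0]
            back[r][c] = best[1]
    # backtrack from the cheapest cell in the last column
    r = min(range(rows), key=lambda i: cost[i][cols - 1])
    path = [r]
    for c in range(cols - 1, 0, -1):
        r = back[r][c]
        path.append(r)
    return path[::-1]                             # row index per column

terrain = [[9, 9, 9, 9],
           [9, 1, 9, 1],
           [1, 9, 1, 9]]
print(valley_path(terrain))   # [2, 1, 2, 1]: hugs the low-altitude cells
```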

  9. A fast parallel clustering algorithm for molecular simulation trajectories.

    PubMed

    Zhao, Yutong; Sheong, Fu Kit; Sun, Jian; Sander, Pedro; Huang, Xuhui

    2013-01-15

We implemented a GPU-powered parallel k-centers algorithm to perform clustering on the conformations of molecular dynamics (MD) simulations. The algorithm is up to two orders of magnitude faster than the CPU implementation. We tested our algorithm on four protein MD simulation datasets ranging from the small Alanine Dipeptide to a 370-residue Maltose Binding Protein (MBP). It is capable of grouping 250,000 conformations of the MBP into 4000 clusters within 40 seconds. To achieve this, we effectively parallelized the code on the GPU and utilized the triangle inequality of metric spaces. Furthermore, the algorithm's running time is linear with respect to the number of cluster centers. In addition, we found the triangle inequality to be less effective in higher dimensions and provide a mathematical rationale. Finally, using Alanine Dipeptide as an example, we show a strong correlation between cluster populations resulting from the k-centers algorithm and the underlying density. Copyright © 2012 Wiley Periodicals, Inc.
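The k-centers algorithm mentioned above is, in its classic farthest-point (Gonzalez) form, compact enough to sketch. This plain CPU version shows why the running time is linear in the number of centers (each new center costs one pass over the points); it omits the paper's GPU parallelization and triangle-inequality pruning.

```python
import math, random

def k_centers(points, k, seed=0):
    """Farthest-point (Gonzalez) k-centers clustering sketch.

    Repeatedly picks the point farthest from its nearest center as the next
    center, so each added center costs one pass over the data (linear in k).
    The paper's GPU version additionally prunes distance updates with the
    triangle inequality; this plain sketch omits that.
    """
    random.seed(seed)
    centers = [random.choice(points)]
    # nearest-center distance for every point, updated incrementally
    d = [math.dist(p, centers[0]) for p in points]
    while len(centers) < k:
        far = max(range(len(points)), key=d.__getitem__)
        centers.append(points[far])
        d = [min(di, math.dist(p, centers[-1])) for di, p in zip(d, points)]
    # assign each point to its nearest center
    labels = [min(range(k), key=lambda c: math.dist(p, centers[c]))
              for p in points]
    return centers, labels

pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1),      # two well-separated blobs
       (10.0, 10.0), (10.1, 10.0), (10.0, 10.1)]
centers, labels = k_centers(pts, k=2)
print(labels)   # each blob gets its own label
```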

  10. SHARP Multiphysics Tutorials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Y. Q.; Shemon, E. R.; Mahadevan, Vijay S.

SHARP, developed under the NEAMS Reactor Product Line, is an advanced modeling and simulation toolkit for the analysis of advanced nuclear reactors. SHARP is comprised of three physics modules currently including neutronics, thermal hydraulics, and structural mechanics. SHARP empowers designers to produce accurate results for modeling physical phenomena that have been identified as important for nuclear reactor analysis. SHARP can use existing physics codes and take advantage of existing infrastructure capabilities in the MOAB framework and the coupling driver/solver library, the Coupled Physics Environment (CouPE), which utilizes the widely used, scalable PETSc library. This report aims at identifying the coupled-physics simulation capability of SHARP by introducing the demonstration example called sahex in advance of the SHARP release expected by Mar 2016. sahex consists of 6 fuel pins with cladding, 1 control rod, sodium coolant and an outer duct wall that encloses all the other components. This example is carefully chosen to demonstrate the proof of concept for solving more complex demonstration examples such as EBR II assembly and ABTR full core. The workflow of preparing the input files, running the case and analyzing the results is demonstrated in this report. Moreover, an extension of the sahex model called sahex_core, which adds six homogenized neighboring assemblies to the full heterogeneous sahex model, is presented to test homogenization capabilities in both Nek5000 and PROTEUS. Some primary information on the configuration and build aspects for the SHARP toolkit, which includes capability to auto-download dependencies and configure/install with optimal flags in an architecture-aware fashion, is also covered by this report. Step-by-step instructions are provided to help users create their own cases. Details on these processes will be provided in the SHARP user manual that will accompany the first release.

  11. Recovery Discontinuous Galerkin Jacobian-free Newton-Krylov Method for all-speed flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HyeongKae Park; Robert Nourgaliev; Vincent Mousseau

    2008-07-01

There is increasing interest in developing the next generation of simulation tools for advanced nuclear energy systems. These tools will utilize state-of-the-art numerical algorithms and computer science technology in order to maximize predictive capability, support advanced reactor designs, reduce uncertainty and increase safety margins. In analyzing nuclear energy systems, we are interested in compressible low-Mach number, high heat flux flows with a wide range of Re, Ra, and Pr numbers. Under these conditions, the focus is placed on turbulent heat transfer, in contrast to other industries whose main interest is in capturing turbulent mixing. Our objective is to develop single-point turbulence closure models for large-scale engineering CFD codes, using Direct Numerical Simulation (DNS) or Large Eddy Simulation (LES) tools, which requires very accurate and efficient numerical algorithms. The focus of this work is placed on fully-implicit, high-order spatiotemporal discretization based on the discontinuous Galerkin method solving the conservative form of the compressible Navier-Stokes equations. The method utilizes a local reconstruction procedure derived from a weak formulation of the problem, which is inspired by the recovery diffusion flux algorithm of van Leer and Nomura [?] and by the piecewise parabolic reconstruction [?] in the finite volume method. The developed methodology is integrated into the Jacobian-free Newton-Krylov framework [?] to allow a fully-implicit solution of the problem.
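The Jacobian-free Newton-Krylov idea named above hinges on approximating the Jacobian-vector product with residual evaluations only, J(u)v ≈ (F(u + εv) − F(u))/ε, so the Jacobian matrix is never formed. A minimal sketch on a hypothetical 2×2 nonlinear system follows; for brevity it assembles the Newton matrix from matvecs on unit vectors, where a real JFNK code would instead hand the matvec to a Krylov solver such as GMRES (as in the PETSc framework the report mentions).

```python
import numpy as np

def jfnk_demo():
    """Jacobian-free Newton sketch: the Jacobian-vector product is formed
    from residual evaluations alone, J(u)v ~ (F(u + eps*v) - F(u)) / eps.
    Toy problem and parameters are illustrative assumptions.
    """
    def residual(u):
        # small nonlinear test system with a root at u = (1, 2)
        return np.array([u[0]**2 + u[1] - 3.0,
                         u[0] + u[1]**2 - 5.0])

    def jac_vec(u, v, eps=1e-7):
        # finite-difference Jacobian-vector product: no Jacobian ever formed
        return (residual(u + eps * v) - residual(u)) / eps

    u = np.array([2.0, 1.0])
    for _ in range(20):                          # Newton iterations
        f = residual(u)
        if np.linalg.norm(f) < 1e-10:
            break
        # assemble J column-by-column from matvecs (a Krylov solver would
        # consume jac_vec directly instead of forming J)
        J = np.column_stack([jac_vec(u, e) for e in np.eye(u.size)])
        u = u + np.linalg.solve(J, -f)
    return u

print(jfnk_demo())   # converges to approximately (1, 2)
```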

  12. Development and Assessment of a Novel Training Package for Basic Maneuvering Tasks on a Flight Simulator Using Self Instruction Methods and Above Real Time Training (ARTT)

    NASA Technical Reports Server (NTRS)

    Ali, Syed Firasat; Khan, M. Javed; Rossi, Marcia J.; Heath, Bruce E.; Crane, Peter; Ward, Marcus; Crier, Tomyka; Knighten, Tremaine; Culpepper, Christi

    2007-01-01

    One result of the relatively recent advances in computing technology has been the decreasing cost of computers and increasing computational power. This has allowed high-fidelity airplane simulations to be run on personal computers (PCs). Thus, simulators are now routinely used by pilots to substitute simulated flight hours for real flight hours when training for an aircraft type rating, thereby reducing the cost of flight training. However, FAA regulations require that such substitution training be supervised by Certified Flight Instructors (CFIs). If the CFI presence could be reduced or eliminated for certain tasks, this would mean a further cost savings to the pilot. This would require that the flight simulator have a certain level of 'intelligence' in order to provide feedback on pilot performance similar to that of a CFI. The 'intelligent' flight simulator would have at least the capability to use data gathered from the flight to create a measure of the performance of the student pilot. Also, to fully utilize the advances in computational power, the simulator would be capable of interacting with the student pilot using the best possible training interventions. This thesis reports on two studies conducted at Tuskegee University investigating the effects of interventions on the learning of two flight maneuvers on a flight simulator, and the robustness and accuracy of calculated performance indices as compared to CFI evaluations of performance. The intent of these studies is to take a step toward creating an 'intelligent' flight simulator. The first study compares the performance of novice pilots trained at different levels of above real time to execute a level S-turn. The second study examined the effect of out-of-the-window (OTW) visual cues, in the form of hoops, on the performance of novice pilots learning to fly a landing approach on the flight simulator. The reliability and robustness of the computed performance metrics were assessed by comparing them with evaluations of the landing approach maneuver by a number of CFIs.

  13. Analysis and simulation tools for solar array power systems

    NASA Astrophysics Data System (ADS)

    Pongratananukul, Nattorn

    This dissertation presents simulation tools developed specifically for the design of solar array power systems. Contributions are made in several aspects of the system design phases, including solar source modeling, system simulation, and controller verification. A tool to automate the study of solar array configurations using general-purpose circuit simulators has been developed based on the modeling of individual solar cells. A hierarchical structure of solar cell elements, including semiconductor properties, allows simulation of electrical properties as well as evaluation of the impact of environmental conditions. A second tool provides a co-simulation platform with the capability to verify the performance of an actual digital controller implemented in programmable hardware such as a DSP processor, while the entire solar array, including the DC-DC power converter, is modeled in software algorithms running on a computer. This "virtual plant" allows code for the digital controller to be developed and debugged, and the control algorithm to be improved. One important task in solar arrays is to track the maximum power point on the array in order to maximize the power that can be delivered. Digital controllers implemented with programmable processors are particularly attractive for this task because sophisticated tracking algorithms can be implemented and revised when needed to optimize their performance. The proposed co-simulation tools are thus very valuable in developing and optimizing the control algorithm before the system is built. Examples that demonstrate the effectiveness of the proposed methodologies are presented. The proposed simulation tools are also valuable in the design of multi-channel arrays. In the specific system that we have designed and tested, the control algorithm is implemented on a single digital signal processor, and in each of the channels the maximum power point is tracked individually. In the prototype we built, off-the-shelf commercial DC-DC converters were utilized. Finally, the overall performance of the entire system was evaluated using solar array simulators capable of simulating various I-V characteristics, and also by using an electronic load. Experimental results are presented.
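Maximum power point tracking of the kind described above is commonly realized with a perturb-and-observe loop: step the operating voltage, keep stepping in the same direction while power rises, and reverse when it falls. The sketch below is illustrative only; the P-V curve and step sizes are invented, not taken from the dissertation's controller.

```python
def array_power(v):
    """Toy solar-array P-V curve with a single maximum at v = 17.0 V (illustrative)."""
    return max(0.0, 100.0 - (v - 17.0) ** 2)

def perturb_and_observe(v0=12.0, step=0.1, iters=200):
    """Perturb-and-observe MPPT: perturb the operating voltage, keep the
    direction if power increased, reverse it if power dropped. The
    operating point ends up oscillating around the maximum power point."""
    v, p = v0, array_power(v0)
    direction = 1.0
    for _ in range(iters):
        v_new = v + direction * step
        p_new = array_power(v_new)
        if p_new < p:
            direction = -direction  # overshot the peak; reverse
        v, p = v_new, p_new
    return v, p

v_mpp, p_mpp = perturb_and_observe()
```

In a real multi-channel system this loop would run per channel on the DSP, with the measured converter input voltage and current replacing the toy `array_power` model.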

  14. Using Computational Modeling to Assess the Impact of Clinical Decision Support on Cancer Screening within Community Health Centers

    PubMed Central

    Carney, Timothy Jay; Morgan, Geoffrey P.; Jones, Josette; McDaniel, Anna M.; Weaver, Michael; Weiner, Bryan; Haggstrom, David A.

    2014-01-01

    Our conceptual model demonstrates our goal to investigate the impact of clinical decision support (CDS) utilization on cancer screening improvement strategies in the community health center (CHC) setting. We employed a dual modeling technique, using both statistical and computational modeling to evaluate impact. Our statistical model used Spearman's rho test to evaluate the strength of the relationship between our proximal outcome measures (CDS utilization) and our distal outcome measure (provider self-reported cancer screening improvement). Our computational model relied on network evolution theory and made use of a tool called Construct-TM to model the use of CDS as measured by the rate of organizational learning. We employed previously collected survey data from community health centers in the Cancer Health Disparities Collaborative (HDCC). Our intent is to demonstrate the added value gained by using a computational modeling tool in conjunction with a statistical analysis when evaluating the impact of a health information technology, in the form of CDS, on health care quality process outcomes such as facility-level screening improvement. Significant simulated disparities in organizational learning over time were observed between community health centers beginning the simulation with high and low clinical decision support capability. PMID:24953241
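Spearman's rho, the statistical arm of the dual-modeling approach above, is simply the Pearson correlation of the ranks of the two variables. A self-contained sketch (the data arrays are hypothetical, not the study's survey data):

```python
def rankdata(xs):
    """Assign 1-based ranks; tied values receive the average of their ranks."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1                       # extend over a run of ties
        avg = (i + j) / 2.0 + 1.0        # average rank of the tied run
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation computed on the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical facility-level data: CDS utilization vs. reported improvement
cds_use = [1, 2, 3, 4, 5, 6]
improvement = [2, 1, 4, 3, 6, 5]
rho = spearman_rho(cds_use, improvement)
```

In practice `scipy.stats.spearmanr` provides the same statistic together with a p-value.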

  15. Human Exploration Spacecraft Testbed for Integration and Advancement (HESTIA)

    NASA Technical Reports Server (NTRS)

    Banker, Brian F.; Robinson, Travis

    2016-01-01

    The proposed paper will cover an ongoing effort named HESTIA (Human Exploration Spacecraft Testbed for Integration and Advancement), led at the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC), to promote a cross-subsystem approach to developing Mars-enabling technologies with the ultimate goal of integrated system optimization. HESTIA also aims to develop the infrastructure required to rapidly test these highly integrated systems at low cost. The initial focus is on the common fluids architecture required to enable human exploration of Mars, specifically between the life support and in-situ resource utilization (ISRU) subsystems. An overview of the advancements in integrated technologies, infrastructure, simulation, and modeling capabilities will be presented, as well as the results and findings of integrated testing. Due to the enormous mass gear ratio required for human exploration beyond low-Earth orbit (for every 1 kg of payload landed on Mars, 226 kg will be required on Earth), minimization of surface hardware and commodities is paramount. Hardware requirements can be minimized by reducing equipment that performs similar functions for different subsystems. If hardware could be developed that meets the requirements of both life support and ISRU, it could result in the reduction of primary hardware and/or spares. Minimization of commodities delivered to the surface of Mars can be achieved through the creation of higher-efficiency systems producing little to no undesired waste, such as a closed-loop life support subsystem. Where complete efficiency is impossible or impractical, makeup commodities could be manufactured via ISRU. Although utilization of ISRU products (oxygen and water) for crew consumption holds great promise for reducing demands on life support hardware, there exist concerns as to the purity and transportation of commodities.
    To date, ISRU has been focused on production rates and purities for propulsion needs. The meshing of requirements between all potential users, producers, and cleaners of oxygen and water is crucial to guiding the development of the technologies which will be used to perform these functions. Various new capabilities are being developed as part of HESTIA, which will enable the integrated testing of these technologies. This includes the upgrading of a 20-ft diameter habitat chamber to eventually support long-duration (90+ day) human-in-the-loop testing of advanced life support systems. Additionally, a 20-ft diameter vacuum chamber is being modified to create Mars atmospheric pressures and compositions. This chamber, designated the Mars Environment Chamber (MEC), will eventually be upgraded to include a dusty environment and thermal shroud to simulate conditions on the surface of Mars. Because individual technologies will be in geographically diverse locations across NASA facilities and elsewhere in the world, schedule and funding constraints will likely limit the frequency of physical integration. When this is the case, absent subsystems can be either digitally or physically simulated. Using the Integrated Power Avionics and Software (iPAS) environment, HESTIA is able to bring together data from various subsystems in simulated surroundings; insert faults, errors, time delays, etc.; and feed data into computer models or physical systems capable of reproducing the output of the absent subsystems for the consumption of local subsystems. Although imperfect, this capability provides opportunities to test subsystem integration and interactions at a fraction of the cost. When a subsystem technology is too immature for integrated testing, models can be produced using the General-Use Nodal Network Solver (GUNNS) capability to simulate the overall system performance.
    In doing so, even technologies not yet on the drawing board can be integrated and overall system performance estimated. Through the integrated development of technologies, as well as of the infrastructure to rapidly and inexpensively model, simulate, and test subsystem technologies early in their development, HESTIA is pioneering a new way of developing the future of human space exploration.

  16. Partial Validation of Multibody Program to Optimize Simulated Trajectories II (POST II) Parachute Simulation With Interacting Forces

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Ben; Queen, Eric M.

    2002-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed. This capability uses the Program to Optimize Simulated Trajectories II (POST II). Previously, POST II had the ability to simulate multiple bodies without interacting forces. The current implementation is used for the simulation of parachute trajectories, in which the parachute and suspended bodies can be treated as rigid bodies. An arbitrary set of connecting lines can be included in the model and are treated as massless spring-dampers. This paper discusses details of the connection-line modeling and results of several test cases used to validate the capability.
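A massless spring-damper connection line of the kind described above exerts a force along the line between its two attachment points, and a physical line carries load only in tension. A minimal sketch of such a force law (the stiffness and damping coefficients are hypothetical, not values from the POST II validation cases):

```python
import numpy as np

def line_force(p1, v1, p2, v2, rest_length, k, c):
    """Force on body 1 from a massless spring-damper line to body 2.
    The line carries load only in tension: a slack line exerts no force."""
    d = p2 - p1
    length = np.linalg.norm(d)
    if length <= rest_length:            # slack: no compressive load
        return np.zeros(3)
    unit = d / length
    stretch = length - rest_length
    rel_speed = np.dot(v2 - v1, unit)    # rate of stretching along the line
    magnitude = k * stretch + c * rel_speed
    return magnitude * unit              # pulls body 1 toward body 2

# Line stretched 2 m beyond its 10 m rest length, both bodies at rest
f = line_force(np.zeros(3), np.zeros(3),
               np.array([0.0, 0.0, 12.0]), np.zeros(3),
               rest_length=10.0, k=500.0, c=25.0)
```

The force on body 2 is the negative of `f`, so the pair conserves momentum when summed into each body's equations of motion.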

  17. Simulation-Based Approach for Site-Specific Optimization of Hydrokinetic Turbine Arrays

    NASA Astrophysics Data System (ADS)

    Sotiropoulos, F.; Chawdhary, S.; Yang, X.; Khosronejad, A.; Angelidis, D.

    2014-12-01

    A simulation-based approach has been developed to enable site-specific optimization of tidal and current turbine arrays in real-life waterways. The computational code is based on the St. Anthony Falls Laboratory Virtual StreamLab (VSL3D), which is able to carry out high-fidelity simulations of turbulent flow and sediment transport processes in rivers and streams, taking into account the arbitrary geometrical complexity characterizing natural waterways. The computational framework can be used either in turbine-resolving mode, to take into account all geometrical details of the turbine, or with the turbines parameterized as actuator disks or actuator lines. Locally refined grids are employed to dramatically increase the resolution of the simulation and enable efficient simulations of multi-turbine arrays. Turbine/sediment interactions are simulated using the coupled hydro-morphodynamic module of VSL3D. The predictive capabilities of the resulting computational framework will be demonstrated by applying it to simulate turbulent flow past a tri-frame configuration of hydrokinetic turbines in a rigid-bed turbulent open channel flow, as well as turbines mounted in mobile-bed open channels, to investigate turbine/sediment interactions. The utility of the simulation-based approach for guiding the optimal development of turbine arrays in real-life waterways will also be discussed and demonstrated. This work was supported by NSF grant IIP-1318201. Simulations were carried out at the Minnesota Supercomputing Institute.
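In actuator-disk mode a turbine is not meshed; it is represented as a momentum sink distributed over the rotor-swept area, with total thrust given by the standard disk relation T = ½ρCtAU². An illustrative calculation (the turbine dimensions and thrust coefficient below are hypothetical):

```python
import math

def actuator_disk_thrust(rho, u_inf, diameter, ct):
    """Total thrust of an actuator-disk turbine model, T = 0.5*rho*Ct*A*U^2,
    which a flow solver applies to the fluid as a distributed momentum sink
    over the rotor-swept area A."""
    area = math.pi * (diameter / 2.0) ** 2
    return 0.5 * rho * ct * area * u_inf ** 2

# Hypothetical hydrokinetic turbine: 1 m rotor in a 2 m/s current (water)
thrust = actuator_disk_thrust(rho=1000.0, u_inf=2.0, diameter=1.0, ct=0.8)
```

Actuator-line models refine this by distributing the force along rotating blade lines instead of uniformly over the disk, recovering tip vortices at moderate extra cost.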

  18. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    NASA Technical Reports Server (NTRS)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

    Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M&S environments and infrastructure.

  19. Effects of ATC automation on precision approaches to closely spaced parallel runways

    NASA Technical Reports Server (NTRS)

    Slattery, R.; Lee, K.; Sanford, B.

    1995-01-01

    Improved navigational technology (such as the Microwave Landing System and the Global Positioning System) installed in modern aircraft will enable air traffic controllers to better utilize available airspace. Consequently, arrival traffic can fly approaches to parallel runways separated by smaller distances than are currently allowed. Previous simulation studies of advanced navigation approaches have found that controller workload is increased when there is a combination of aircraft that are capable of following advanced navigation routes and aircraft that are not. Research into Air Traffic Control automation at Ames Research Center has led to the development of the Center-TRACON Automation System (CTAS). The Final Approach Spacing Tool (FAST) is the component of the CTAS used in the TRACON area. The work in this paper examines, via simulation, the effects of FAST used for aircraft landing on closely spaced parallel runways. The simulation contained various combinations of aircraft, equipped and unequipped with advanced navigation systems. A set of simulations was run both manually and with an augmented set of FAST advisories to sequence aircraft, assign runways, and avoid conflicts. The results of the simulations are analyzed, measuring the airport throughput, aircraft delay, loss of separation, and controller workload.

  20. Use of High-resolution WRF Simulations to Forecast Lightning Threat

    NASA Technical Reports Server (NTRS)

    McCaul, William E.; LaCasse, K.; Goodman, S. J.

    2006-01-01

    Recent observational studies have confirmed the existence of a robust statistical relationship between lightning flash rates and the amount of large precipitating ice hydrometeors in storms. This relationship is exploited, in conjunction with the capabilities of recent forecast models such as WRF, to forecast the threat of lightning from convective storms using the output fields from the model forecasts. The simulated vertical flux of graupel at -15 C is used in this study as a proxy for charge separation processes and their associated lightning risk. Six-hour simulations are conducted for a number of case studies for which three-dimensional lightning validation data from the North Alabama Lightning Mapping Array are available. Experiments indicate that initialization of the WRF model on a 2 km grid using Eta boundary conditions, Doppler radar radial velocity and reflectivity fields, and METAR and ACARS data yields the most realistic simulations. An array of subjective and objective statistical metrics is employed to document the utility of the WRF forecasts. The simulation results are also compared to other, more traditional means of forecasting convective storms, such as those based on inspection of the convective available potential energy field.
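The graupel-flux proxy above reduces to scaling the simulated vertical graupel mass flux at the -15 °C level into a flash-rate threat. A minimal sketch of that calculation; the calibration constant and input values are hypothetical, since in practice the scaling is fit against lightning mapping array observations:

```python
def lightning_threat(w, q_graupel, rho_air=1.0, k=0.042):
    """Flash-rate proxy proportional to the vertical graupel mass flux
    rho * w * q_g (kg m^-2 s^-1) at the -15 C level. The calibration
    constant k is hypothetical; it would be fit against lightning
    mapping array data. Downward flux implies no threat."""
    flux = rho_air * w * q_graupel
    return max(0.0, k * flux)

# Hypothetical updraft: w = 10 m/s, graupel mixing ratio 3 g/kg
threat = lightning_threat(w=10.0, q_graupel=0.003)
```

Gridpoint-by-gridpoint application of such a function to WRF output yields a 2-D threat field that can be verified subjectively and with objective skill scores.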

  1. Modification of a Macromechanical Finite-Element Based Model for Impact Analysis of Triaxially-Braided Composites

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Blinzler, Brina J.; Binienda, Wieslaw K.

    2010-01-01

    A macro-level, finite element-based model has been developed to simulate the mechanical and impact response of triaxially-braided polymer matrix composites. In the analytical model, the triaxial braid architecture is simulated by using four parallel shell elements, each of which is modeled as a laminated composite. For the current analytical approach, each shell element is considered to be a smeared homogeneous material. The commercial transient dynamic finite element code LS-DYNA is used to conduct the simulations, and a continuum damage mechanics model internal to LS-DYNA is used as the material constitutive model. The constitutive model requires stiffness and strength properties of an equivalent unidirectional composite. Simplified micromechanics methods are used to determine the equivalent stiffness properties, and results from coupon-level tests on the braided composite are utilized to back out the required strength properties. Simulations of quasi-static coupon tests of several representative braided composites are conducted to demonstrate the correlation of the model. Impact simulations of a representative braided composite are conducted to demonstrate the capability of the model to predict the penetration velocity and damage patterns obtained experimentally.
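Simplified micromechanics of the kind mentioned above often starts from the rule of mixtures: the longitudinal stiffness follows the Voigt (parallel) bound and the transverse stiffness the Reuss (series) bound. A sketch with hypothetical carbon/epoxy constituent properties (not the paper's values):

```python
def rule_of_mixtures(e_fiber, e_matrix, vf):
    """Equivalent unidirectional lamina stiffness from constituent properties.
    E1 (fiber direction) uses the Voigt bound: vf*Ef + (1-vf)*Em.
    E2 (transverse) uses the Reuss bound: 1 / (vf/Ef + (1-vf)/Em)."""
    e1 = vf * e_fiber + (1.0 - vf) * e_matrix
    e2 = 1.0 / (vf / e_fiber + (1.0 - vf) / e_matrix)
    return e1, e2

# Hypothetical carbon/epoxy constituents (GPa), 60% fiber volume fraction
e1, e2 = rule_of_mixtures(e_fiber=230.0, e_matrix=3.5, vf=0.6)
```

More refined estimates (e.g. Halpin-Tsai) correct the pessimistic transverse bound, but the two expressions above bracket the answer for any transversely loaded composite.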

  2. Scope of Work for Integration Management and Installation Services of the National Ignition Facility Beampath Infrastructure System

    NASA Astrophysics Data System (ADS)

    Coyle, P. D.

    2000-03-01

    The goal of the National Ignition Facility (NIF) project is to provide an above-ground experimental capability for maintaining nuclear competence and weapons effects simulation, and to provide a facility capable of achieving fusion ignition using solid-state lasers as the energy driver. The facility will incorporate 192 laser beams, which will be focused onto a small target located at the center of a spherical target chamber; the energy from the laser beams will be deposited in a few billionths of a second. The target will then implode, forcing atomic nuclei to the sufficiently high temperatures and densities necessary to achieve a miniature fusion reaction. The NIF is under construction at Livermore, California, located approximately 50 miles southeast of San Francisco, California. The University of California, Lawrence Livermore National Laboratory (LLNL), operating under Prime Contract W-7405-ENG-48 with the U.S. Department of Energy (DOE), shall subcontract for Integration Management and Installation (IMI) Services for the Beampath Infrastructure System (BIS). The BIS includes Beampath Hardware and Beampath Utilities: beampath vessels, enclosures, and beam tubes; auxiliary and utility systems; and support structures. Conventional Facilities work for the NIF Laser and Target Area Building (LTAB) and Optics Assembly Building (OAB) is over 86 percent constructed. This Scope of Work is for IMI Services corresponding to Management Services, Design Integration Services, Construction Services, and Commissioning Services for the NIF BIS. A substantial amount of GFE will be provided by the University for installation as part of the infrastructure packages.

  3. Patient Care Utility Module for DEPMEDS Hospitals

    DTIC Science & Technology

    1991-06-05

    A need was identified in the patient care utility capability in Deployable Medical Systems (DEPMEDS) hospitals, especially in the Intensive Care Unit (ICU). The study responds to a request from the Defense Medical Standardization Board to examine the space around the patient bedside in DEPMEDS hospitals.

  4. Recombinant organisms capable of fermenting cellobiose

    DOEpatents

    Ingram, Lonnie O.; Lai, Xiaokuang; Moniruzzaman, Mohammed; York, Sean W.

    2000-01-01

    This invention relates to a recombinant microorganism which expresses pyruvate decarboxylase, alcohol dehydrogenase, Klebsiella phospho-β-glucosidase and Klebsiella (phosphoenolpyruvate-dependent phosphotransferase system) cellobiose-utilizing Enzyme II, wherein said phospho-β-glucosidase and said (phosphoenolpyruvate-dependent phosphotransferase) cellobiose-utilizing Enzyme II are heterologous to said microorganism and wherein said microorganism is capable of utilizing both hemicellulose and cellulose, including cellobiose, in the production of ethanol.

  5. Accommodation of an N-(deoxyguanosin-8-yl)-2-acetylaminofluorene adduct in the active site of human DNA polymerase ι: Hoogsteen or Watson-Crick base pairing?

    PubMed Central

    Donny-Clark, Kerry; Shapiro, Robert; Broyde, Suse

    2009-01-01

    Bypass across DNA lesions by specialized polymerases is essential for maintenance of genomic stability. Human DNA polymerase ι (polι) is a bypass polymerase of the Y family. Crystal structures of polι suggest that Hoogsteen base pairing is employed to bypass minor groove DNA lesions, placing them on the spacious major groove side of the enzyme. Primer extension studies have shown that polι is also capable of error-free nucleotide incorporation opposite the bulky major groove adduct N-(deoxyguanosin-8-yl)-2-acetyl-aminofluorene (dG-AAF). We present molecular dynamics simulations and free energy calculations suggesting that Watson-Crick base pairing could be employed in polι for bypass of dG-AAF. In polι with Hoogsteen paired dG-AAF the bulky AAF moiety would reside on the cramped minor groove side of the template. The Hoogsteen-capable conformation distorts the active site, disrupting interactions necessary for error-free incorporation of dC opposite the lesion. Watson-Crick pairing places the AAF rings on the spacious major groove side, similar to the position of minor groove adducts observed with Hoogsteen pairing. Watson-Crick paired structures show a well-ordered active site, with a near reaction-ready ternary complex. Thus our results suggest that polι would utilize the same spacious region for lesion bypass of both major and minor groove adducts. Therefore, purine adducts with bulk on the minor groove side would use Hoogsteen pairing, while adducts with the bulky lesion on the major groove side would utilize Watson-Crick base pairing as indicated by our MD simulations for dG-AAF. This suggests the possibility of an expanded role for polι in lesion bypass. PMID:19072536

  6. Accommodation of an N-(deoxyguanosin-8-yl)-2-acetylaminofluorene adduct in the active site of human DNA polymerase iota: Hoogsteen or Watson-Crick base pairing?

    PubMed

    Donny-Clark, Kerry; Shapiro, Robert; Broyde, Suse

    2009-01-13

    Bypass across DNA lesions by specialized polymerases is essential for maintenance of genomic stability. Human DNA polymerase iota (poliota) is a bypass polymerase of the Y family. Crystal structures of poliota suggest that Hoogsteen base pairing is employed to bypass minor groove DNA lesions, placing them on the spacious major groove side of the enzyme. Primer extension studies have shown that poliota is also capable of error-free nucleotide incorporation opposite the bulky major groove adduct N-(deoxyguanosin-8-yl)-2-acetylaminofluorene (dG-AAF). We present molecular dynamics simulations and free energy calculations suggesting that Watson-Crick base pairing could be employed in poliota for bypass of dG-AAF. In poliota with Hoogsteen-paired dG-AAF the bulky AAF moiety would reside on the cramped minor groove side of the template. The Hoogsteen-capable conformation distorts the active site, disrupting interactions necessary for error-free incorporation of dC opposite the lesion. Watson-Crick pairing places the AAF rings on the spacious major groove side, similar to the position of minor groove adducts observed with Hoogsteen pairing. Watson-Crick-paired structures show a well-ordered active site, with a near reaction-ready ternary complex. Thus our results suggest that poliota would utilize the same spacious region for lesion bypass of both major and minor groove adducts. Therefore, purine adducts with bulk on the minor groove side would use Hoogsteen pairing, while adducts with the bulky lesion on the major groove side would utilize Watson-Crick base pairing as indicated by our MD simulations for dG-AAF. This suggests the possibility of an expanded role for poliota in lesion bypass.

  7. Implementation of a Tabulated Failure Model Into a Generalized Composite Material Model Suitable for Use in Impact Problems

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Carney, Kelly S.; Dubois, Paul; Hoffarth, Canio; Khaled, Bilal; Shyamsunder, Loukham; Rajan, Subramaniam; Blankenhorn, Gunther

    2017-01-01

    The need for accurate material models to simulate the deformation, damage and failure of polymer matrix composites under impact conditions is becoming critical as these materials are gaining increased use in the aerospace and automotive communities. The aerospace community has identified several key capabilities which are currently lacking in the available material models in commercial transient dynamic finite element codes. To attempt to improve the predictive capability of composite impact simulations, a next generation material model is being developed for incorporation within the commercial transient dynamic finite element code LS-DYNA. The material model, which incorporates plasticity, damage and failure, utilizes experimentally based tabulated input to define the evolution of plasticity and damage and the initiation of failure as opposed to specifying discrete input parameters such as modulus and strength. The plasticity portion of the orthotropic, three-dimensional, macroscopic composite constitutive model is based on an extension of the Tsai-Wu composite failure model into a generalized yield function with a non-associative flow rule. For the damage model, a strain equivalent formulation is used to allow for the uncoupling of the deformation and damage analyses. For the failure model, a tabulated approach is utilized in which a stress or strain based invariant is defined as a function of the location of the current stress state in stress space to define the initiation of failure. Failure surfaces can be defined with any arbitrary shape, unlike traditional failure models where the mathematical functions used to define the failure surface impose a specific shape on the failure surface. In the current paper, the complete development of the failure model is described and the generation of a tabulated failure surface for a representative composite material is discussed.

  8. High-temperature MIRAGE XL (LFRA) IRSP system development

    NASA Astrophysics Data System (ADS)

    McHugh, Steve; Franks, Greg; LaVeigne, Joe

    2017-05-01

    The development of very-large-format infrared detector arrays has challenged the IR scene projector community to develop larger-format infrared emitter arrays. Many scene projector applications also require much higher simulated temperatures than can be generated with current technology. This paper will present an overview of resistive emitter-based (broadband) IR scene projector system development, as well as describe recent progress in emitter materials and pixel designs applicable to legacy MIRAGE XL systems to achieve apparent temperatures >1000 K in the MWIR. These new high-temperature MIRAGE XL (LFRA) Digital Emitter Engines (DEE) will be "plug and play" equivalent with legacy MIRAGE XL DEEs; the rest of the system is reusable. Under the High Temperature Dynamic Resistive Array (HDRA) development program, Santa Barbara Infrared Inc. (SBIR) is developing a new infrared scene projector architecture capable of producing both very large format (>2k x 2k) resistive emitter arrays and improved emitter pixel technology capable of simulating very high apparent temperatures. During earlier phases of the program, SBIR demonstrated materials with MWIR apparent temperatures in excess of 1500 K. These new emitter materials can be utilized with legacy RIICs to produce pixels that can achieve 7X the radiance of the legacy systems with low cost and low risk. A 'scalable' Read-In Integrated Circuit (RIIC) is also being developed under the same HDRA program to drive the high-temperature pixels. This RIIC will utilize through-silicon via (TSV) and Quilt Packaging (QP) technologies to allow seamless tiling of multiple chips to fabricate very large arrays, and thus overcome the yield limitations inherent in large-scale integrated circuits. These quilted arrays can be fabricated in any N x M size in 512 steps.
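Apparent temperature and in-band radiance are linked through Planck's law, which is why a 7X radiance gain corresponds to a much smaller factor in apparent temperature. A minimal sketch of the relationship over an assumed 3-5 µm MWIR band (the band limits and temperatures are illustrative, not SBIR's measured values):

```python
import math

def planck_spectral_radiance(wavelength_m, temp_k):
    """Planck spectral radiance (W sr^-1 m^-3) at one wavelength."""
    h, c, kb = 6.62607015e-34, 2.99792458e8, 1.380649e-23
    num = 2.0 * h * c ** 2 / wavelength_m ** 5
    return num / (math.exp(h * c / (wavelength_m * kb * temp_k)) - 1.0)

def band_radiance(temp_k, lo_um=3.0, hi_um=5.0, n=500):
    """In-band MWIR radiance via simple trapezoidal integration over 3-5 um."""
    lo, hi = lo_um * 1e-6, hi_um * 1e-6
    dw = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        w = lo + i * dw
        weight = 0.5 if i in (0, n) else 1.0  # trapezoid endpoint weights
        total += weight * planck_spectral_radiance(w, temp_k) * dw
    return total

# In-band radiance grows very steeply with temperature in the MWIR,
# so doubling the apparent temperature raises radiance far more than 7X.
r_500 = band_radiance(500.0)
r_1000 = band_radiance(1000.0)
```

Inverting `band_radiance` numerically for a given radiance is one plausible way to convert a measured pixel radiance gain into an apparent-temperature claim.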

  9. COUPLED FREE AND DISSOLVED PHASE TRANSPORT: NEW SIMULATION CAPABILITIES AND PARAMETER INVERSION

    EPA Science Inventory

    The vadose zone free-phase simulation capabilities of the US EPA Hydrocarbon Spill Screening Model (HSSM) have been linked with the 3-D multi-species dissolved-phase contaminant transport simulator MT3DMS.

  10. Test vs. simulation

    NASA Technical Reports Server (NTRS)

    Wood, Charles C.

    1991-01-01

    The following topics are presented in tabular form: (1) simulation capability assessments (no propulsion system test); (2) advanced vehicle simulation capability assessment; (3) systems tests identified events; (4) main propulsion test article (MPTA) testing evaluation; (5) Saturn 5, 1B, and 1 testing evaluation. Special vehicle simulation issues that are propulsion related are briefly addressed.

  11. A joint tracking method for NSCC based on WLS algorithm

    NASA Astrophysics Data System (ADS)

    Luo, Ruidan; Xu, Ying; Yuan, Hong

    2017-12-01

    Navigation signal based on compound carrier (NSCC) has a flexible multi-carrier scheme and various configurable scheme parameters, which give it significant navigation-augmentation advantages over legacy navigation signals in terms of spectral efficiency, tracking accuracy, multipath mitigation, and anti-jamming capability. Meanwhile, its characteristic scheme structure can provide auxiliary information for the design of signal synchronization algorithms. Based on the characteristics of NSCC, this paper proposes a joint tracking method utilizing the Weighted Least Squares (WLS) algorithm. In this method, the LS algorithm is employed to jointly estimate each sub-carrier frequency shift, exploiting the known sub-carrier frequencies and the linear relationship between sub-carrier frequency and Doppler shift. In addition, the weighting matrix is set adaptively according to the sub-carrier power to ensure estimation accuracy. Both theoretical analysis and simulation results illustrate that the tracking accuracy and sensitivity of this method outperform the single-carrier algorithm at low SNR.
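The WLS step above is the standard weighted least squares solution x̂ = (AᵀWA)⁻¹AᵀWy, with each sub-carrier's row weighted by its power. An illustrative sketch; the sub-carrier frequencies, weights, and the simple "shift = frequency × Doppler factor" model are invented for demonstration, not taken from the paper:

```python
import numpy as np

def wls_estimate(A, y, w):
    """Weighted least squares: minimize sum_i w_i * (y_i - A_i x)^2,
    i.e. x_hat = (A^T W A)^{-1} A^T W y with W = diag(w)."""
    W = np.diag(w)
    return np.linalg.solve(A.T @ W @ A, A.T @ W @ y)

# Illustrative model: each sub-carrier's frequency shift is linear in a
# common Doppler factor, with slope equal to the known sub-carrier frequency.
subcarrier_freqs = np.array([1.0, 2.0, 3.0, 4.0])   # arbitrary units
true_doppler = 0.5
shifts = subcarrier_freqs * true_doppler            # noiseless measurements
weights = np.array([1.0, 4.0, 9.0, 16.0])           # e.g. proportional to power
A = subcarrier_freqs.reshape(-1, 1)
doppler_hat = wls_estimate(A, shifts, weights)[0]
```

With noisy shifts the power-proportional weights down-weight the weak sub-carriers, which is precisely what separates WLS tracking from plain LS.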

  12. Construction and Utilization of a Beowulf Computing Cluster: A User's Perspective

    NASA Technical Reports Server (NTRS)

    Woods, Judy L.; West, Jeff S.; Sulyma, Peter R.

    2000-01-01

    Lockheed Martin Space Operations - Stennis Programs (LMSO) at the John C. Stennis Space Center (NASA/SSC) has designed and built a Beowulf computer cluster which is owned by NASA/SSC and operated by LMSO. The design and construction of the cluster are detailed in this paper. The cluster is currently used for Computational Fluid Dynamics (CFD) simulations. The CFD codes in use and their applications are discussed. Examples of some of the work are also presented. Performance benchmark studies have been conducted for the CFD codes being run on the cluster. The results of two of the studies are presented and discussed. The cluster is not currently being utilized to its full potential; therefore, plans are underway to add more capabilities. These include the addition of structural, thermal, fluid, and acoustic Finite Element Analysis codes as well as real-time data acquisition and processing during test operations at NASA/SSC. These plans are discussed as well.

  13. A guided wave dispersion compensation method based on compressed sensing

    NASA Astrophysics Data System (ADS)

    Xu, Cai-bin; Yang, Zhi-bo; Chen, Xue-feng; Tian, Shao-hua; Xie, Yong

    2018-03-01

    Ultrasonic guided waves have emerged as a promising tool for structural health monitoring (SHM) and nondestructive testing (NDT) due to their ability to propagate over long distances with minimal loss and their sensitivity to both surface and subsurface defects. Dispersion, however, degrades the temporal and spatial resolution of guided waves. This work proposes a novel guided-wave processing method, based on compressed sensing, for dispersion compensation of both single-mode and multi-mode guided waves: a dispersive signal dictionary, built from the dispersion curves of the guided-wave modes, is used to sparsely decompose the recorded dispersive waveforms, and dispersion-compensated waveforms are then reconstructed from a non-dispersive signal dictionary and the sparse-decomposition results. Numerical simulations and experiments verify the effectiveness of the method for both single-mode and multi-mode guided waves.
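The decompose-then-resynthesize idea can be illustrated in one dimension with orthogonal matching pursuit standing in for the sparse solver. The Gaussian-windowed chirp atoms below are a toy stand-in for true mode dispersion curves; all shapes and parameter values are assumptions:

```python
import numpy as np

def omp(D, y, k):
    """Greedy orthogonal matching pursuit: find a k-sparse code of y in D."""
    residual, idx = y.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, idx], y, rcond=None)
        residual = y - D[:, idx] @ coef
    return idx, coef

# Toy 1-D setup: "dispersive" atoms are Gaussian-windowed chirps, and the
# matching "non-dispersive" atoms are constant-frequency bursts at the same
# arrival times.
n = 256
t = np.arange(n)
shifts = np.arange(0, 200, 4)

def atom(shift, dispersive):
    tau = t - shift
    env = np.exp(-0.5 * (tau / 12.0) ** 2)
    freq = 0.2 + (0.0005 * tau if dispersive else 0.0)  # linear chirp = dispersion
    a = env * np.cos(2.0 * np.pi * freq * tau)
    return a / np.linalg.norm(a)

D_disp = np.column_stack([atom(s, True) for s in shifts])
D_comp = np.column_stack([atom(s, False) for s in shifts])

y = 1.0 * D_disp[:, 10] + 0.6 * D_disp[:, 30]   # recorded dispersive signal
idx, coef = omp(D_disp, y, k=2)
compensated = D_comp[:, idx] @ coef             # resynthesized without dispersion
```

The sparse code found against the dispersive dictionary is replayed through the non-dispersive dictionary, which is the compensation step the abstract describes.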

  14. Experimental and Analytical Evaluation of a Composite Honeycomb Deployable Energy Absorber

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.; Kellas, Sotiris; Horta, Lucas G.; Annett, Martin S.; Polanco, Michael A.; Littell, Justin D.; Fasanella, Edwin L.

    2011-01-01

    In 2006, the NASA Subsonic Rotary Wing Aeronautics Program sponsored the experimental and analytical evaluation of an externally deployable composite honeycomb structure that is designed to attenuate impact energy during helicopter crashes. The concept, which is designated the Deployable Energy Absorber (DEA), utilizes an expandable Kevlar honeycomb structure to dissipate kinetic energy through crushing. The DEA incorporates a unique flexible hinge design that allows the honeycomb to be packaged and stowed flat until needed for deployment. A variety of deployment options such as linear, radial, and/or hybrid methods can be used. Experimental evaluation of the DEA utilized a building-block approach that included material characterization testing of its constituent, Kevlar-129 fabric/epoxy, and flexural testing of single hexagonal cells. In addition, the energy attenuation capabilities of the DEA were demonstrated through multi-cell component dynamic crush tests, and vertical drop tests of a composite fuselage section, retrofitted with DEA blocks, onto concrete, water, and soft soil. During each stage of the DEA evaluation process, finite element models of the test articles were developed and simulations were performed using the explicit, nonlinear transient dynamic finite element code, LS-DYNA. This report documents the results of the experimental evaluation that was conducted to assess the energy absorption capabilities of the DEA.

  15. Physically based modeling of rainfall-triggered landslides: a case study in the Luquillo Forest, Puerto Rico

    NASA Astrophysics Data System (ADS)

    Lepore, C.; Arnone, E.; Noto, L. V.; Sivandran, G.; Bras, R. L.

    2013-01-01

    This paper presents the development of a rainfall-triggered landslide module within a physically based, spatially distributed ecohydrologic model. The model, Triangulated Irregular Networks Real-time Integrated Basin Simulator and VEGetation Generator for Interactive Evolution (tRIBS-VEGGIE), is capable of a sophisticated description of many hydrological processes; in particular, the soil moisture dynamics are resolved at the temporal and spatial resolution required to examine the triggering mechanisms of rainfall-induced landslides. The validity of the tRIBS-VEGGIE model in a tropical environment is demonstrated by evaluating its performance against direct observations made within the Luquillo Forest (the study area). The newly developed landslide module builds upon the previous version of the tRIBS landslide component. The new module utilizes a numerical solution to the Richards equation to better represent the time evolution of soil moisture transport through the soil column. Moreover, it utilizes an extended formulation of the Factor of Safety (FS) to correctly quantify the role of matric suction in slope stability and to account for unsaturated conditions in the evaluation of FS. The new modeling framework couples the capabilities of the detailed hydrologic model to describe soil moisture dynamics with the Infinite Slope model, creating a powerful tool for the assessment of landslide risk.
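The role of matric suction in the FS calculation can be seen in a textbook infinite-slope formulation. This is a generic sketch with assumed soil parameters, not the exact extended FS equations used in tRIBS-VEGGIE:

```python
import math

def factor_of_safety(c_eff, phi_deg, gamma, z, beta_deg, u_w):
    """Infinite-slope factor of safety with pore-water pressure u_w (Pa).
    Negative u_w (matric suction) raises FS; positive pressure lowers it.
    Standard textbook form, not the tRIBS-VEGGIE formulation.

    c_eff: effective cohesion (Pa); phi_deg: friction angle (deg);
    gamma: unit weight (N/m^3); z: slip depth (m); beta_deg: slope angle (deg).
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    normal_stress = gamma * z * math.cos(beta) ** 2             # on slip plane
    shear_stress = gamma * z * math.sin(beta) * math.cos(beta)  # driving stress
    return (c_eff + (normal_stress - u_w) * math.tan(phi)) / shear_stress

# Same slope under suction (unsaturated) vs. positive pore pressure (wet)
fs_dry = factor_of_safety(2e3, 30.0, 18e3, 2.0, 35.0, u_w=-5e3)
fs_wet = factor_of_safety(2e3, 30.0, 18e3, 2.0, 35.0, u_w=+5e3)
```

With these assumed numbers the slope is stable under suction (FS > 1) and fails when rainfall drives the pore pressure positive (FS < 1), which is the triggering mechanism the module is built to capture.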

  16. Dc microgrid stabilization through fuzzy control of interleaved, heterogeneous storage elements

    NASA Astrophysics Data System (ADS)

    Smith, Robert David

    As microgrid power systems gain prevalence and renewable energy comprises greater and greater portions of distributed generation, energy storage becomes important to offset the higher variance of renewable energy sources and maximize their usefulness. One of the emerging techniques is to utilize a combination of lead-acid batteries and ultracapacitors to provide both short and long-term stabilization to microgrid systems. The different energy and power characteristics of batteries and ultracapacitors imply that they ought to be utilized in different ways. Traditional linear controls can use these energy storage systems to stabilize a power grid, but cannot effect more complex interactions. This research explores a fuzzy logic approach to microgrid stabilization. The ability of a fuzzy logic controller to regulate a dc bus in the presence of source and load fluctuations, in a manner comparable to traditional linear control systems, is explored and demonstrated. Furthermore, the expanded capabilities (such as storage balancing, self-protection, and battery optimization) of a fuzzy logic system over a traditional linear control system are shown. System simulation results are presented and validated through hardware-based experiments. These experiments confirm the capabilities of the fuzzy logic control system to regulate bus voltage, balance storage elements, optimize battery usage, and effect self-protection.
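A single-input Mamdani-style controller illustrates the fuzzy approach: triangular memberships on the bus-voltage error, three rules, and centroid defuzzification. The membership shapes, ranges, and rules are invented for illustration, not taken from this work:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

def fuzzy_current_command(v_err):
    """Toy fuzzy controller for a dc bus: maps voltage error (V) to a
    storage-current command (A) via Mamdani inference and centroid
    defuzzification. Illustrative only."""
    u = np.linspace(-10.0, 10.0, 201)      # candidate current commands
    # Rule strengths from input memberships
    neg = tri(v_err, -5.0, -2.5, 0.0)
    zero = tri(v_err, -1.0, 0.0, 1.0)
    pos = tri(v_err, 0.0, 2.5, 5.0)
    # Clip each output set by its rule strength, aggregate with max
    agg = np.maximum.reduce([
        np.minimum(neg, tri(u, -10.0, -5.0, 0.0)),   # error negative -> absorb
        np.minimum(zero, tri(u, -2.0, 0.0, 2.0)),    # error small    -> idle
        np.minimum(pos, tri(u, 0.0, 5.0, 10.0)),     # error positive -> inject
    ])
    return float(np.sum(u * agg) / (np.sum(agg) + 1e-12))  # centroid
```

The same machinery extends to multiple inputs (e.g., battery state of charge) by taking the minimum of the input memberships as each rule's strength, which is where behaviors like storage balancing and battery optimization can be encoded.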

  17. A Physical Heart Failure Simulation System Utilizing the Total Artificial Heart and Modified Donovan Mock Circulation.

    PubMed

    Crosby, Jessica R; DeCook, Katrina J; Tran, Phat L; Betterton, Edward; Smith, Richard G; Larson, Douglas F; Khalpey, Zain I; Burkhoff, Daniel; Slepian, Marvin J

    2017-07-01

    With the growth and diversity of mechanical circulatory support (MCS) systems entering clinical use, a need exists for a robust mock circulation system capable of reliably emulating and reproducing physiologic as well as pathophysiologic states for use in MCS training and inter-device comparison. We report on the development of such a platform utilizing the SynCardia Total Artificial Heart and a modified Donovan Mock Circulation System, capable of being driven at normal and reduced output. With this platform, clinically relevant heart failure hemodynamics could be reliably reproduced as evidenced by elevated left atrial pressure (+112%), reduced aortic flow (-12.6%), blunted Starling-like behavior, and increased afterload sensitivity when compared with normal function. Similarly, pressure-volume relationships demonstrated enhanced sensitivity to afterload and decreased Starling-like behavior in the heart failure model. Lastly, the platform was configured to allow the easy addition of a left ventricular assist device (HeartMate II at 9600 RPM), which upon insertion resulted in improvement of hemodynamics. The present configuration has the potential to serve as a viable system for training and research, aimed at fostering safe and effective MCS device use. © 2016 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.

  18. Overview of the Human Exploration Research Analog (HERA)

    NASA Technical Reports Server (NTRS)

    Neigut, J.

    2015-01-01

    In 2013, the Human Research Program at NASA began developing a new confinement analog specifically for conducting research to investigate the effects of confinement on the human system. The HERA (Human Exploration Research Analog) habitat has been used for both 7- and 14-day missions to date to examine and mitigate exploration risks to enable safe, reliable, and productive human space exploration. This presentation will describe how the Flight Analogs Project developed the HERA facility and the infrastructure to suit investigator requirements for confinement research and, in the process, developed a new approach to analog utilization and a new state-of-the-art analog facility. Details regarding HERA operations will be discussed, including specifics on the mission simulation utilized for the current 14-day campaign, the specifics of the facility (total volume, overall size, hardware), and the capabilities available to researchers. The overall operational philosophy, mission fidelity (including timeline, schedule pressures, and cadence), and development and implementation of mission stressors will be presented. Research conducted to date in the HERA has addressed risks associated with behavioral health and performance, human physiology, as well as human factors. This presentation will conclude with a discussion of future research plans for the HERA, including infrastructure improvements and additional research capabilities planned for the upcoming 30-day missions in 2016.

  19. Optimal generator bidding strategies for power and ancillary services

    NASA Astrophysics Data System (ADS)

    Morinec, Allen G.

    As the electric power industry transitions to a deregulated market, power transactions are made upon price rather than cost. Generator companies are interested in maximizing their profits rather than overall system efficiency. A method to equitably compensate generation providers for real power, and for ancillary services such as reactive power and spinning reserve, will ensure a competitive market with an adequate number of suppliers. Optimizing the generation product mix during bidding is necessary to maximize a generator company's profits. The objective of this research work is to determine and formulate appropriate optimal bidding strategies for a generation company in both the energy and ancillary services markets. These strategies should incorporate the capability curves of their generators as constraints to define the optimal product mix and price offered in the day-ahead and real-time spot markets. In order to achieve such a goal, a two-player model was composed to simulate market auctions for power generation. A dynamic game methodology was developed to identify Nash equilibria and mixed-strategy Nash equilibria as optimal generation bidding strategies for two-player, non-cooperative, variable-sum matrix games with incomplete information. These games integrated the generation product mix of real power, reactive power, and spinning reserve with the generators' capability curves as constraints. The research includes simulations of market auctions in which strategies were tested for generators with different unit constraints, costs, types of competitors, strategies, and demand levels. Studies on the capability of large hydrogen-cooled synchronous generators were utilized to derive useful equations that define the exact shape of the capability curve from the intersections of the arcs defined by the centers and radial vectors of the rotor, stator, and steady-state stability limits. The available reactive reserve and spinning reserve were calculated given a generator operating point in the P-Q plane. Four computer programs were developed to automatically perform the market auction simulations using the equal incremental cost rule. The software calculates the payoffs for the two competitors, dispatches six generators, and allocates ancillary services for 64 combinations of bidding strategies, three levels of system demand, and three different types of competitors. Matrix game theory was utilized to calculate Nash equilibrium solutions and mixed-strategy Nash solutions as the optimal generator bidding strategies. A method was devised to incorporate ancillary services into the generation bidding strategy, to assure an adequate supply of ancillary services, and to allocate these necessary resources to the on-line units. The optimal generator bid strategy in a power auction was shown to be the Nash equilibrium solution found in two-player, variable-sum matrix games.
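For a small bimatrix game, pure-strategy Nash equilibria can be found by direct best-response enumeration, as sketched below. The 2x2 payoffs are invented for illustration; the study's games also require mixed-strategy solutions, which this sketch omits:

```python
import numpy as np

def pure_nash(A, B):
    """Enumerate pure-strategy Nash equilibria of a bimatrix game.
    A[i, j] is the row player's payoff, B[i, j] the column player's:
    (i, j) is an equilibrium when neither player can gain by deviating alone."""
    eq = []
    for i in range(A.shape[0]):
        for j in range(A.shape[1]):
            if A[i, j] >= A[:, j].max() and B[i, j] >= B[i, :].max():
                eq.append((i, j))
    return eq

# Toy two-generator bidding game (payoffs assumed for illustration):
# strategy 0 = bid high, strategy 1 = bid low
A = np.array([[4.0, 1.0],
              [5.0, 2.0]])
B = np.array([[4.0, 5.0],
              [1.0, 2.0]])
eq = pure_nash(A, B)
```

Here both players bidding low, (1, 1), is the unique equilibrium: each strategy is a best response to the other, even though both would earn more at (0, 0) — the classic prisoner's-dilemma structure of competitive bidding.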

  20. From Lunar Regolith to Fabricated Parts: Technology Developments and the Utilization of Moon Dirt

    NASA Technical Reports Server (NTRS)

    McLemore, C. A.; Fikes, J. C.; McCarley, K. S.; Good, J. E.; Gilley, S. D.; Kennedy, J. P.

    2008-01-01

    The U.S. Space Exploration Policy has as a cornerstone the establishment of an outpost on the Moon. This lunar outpost will eventually provide the necessary planning, technology development, testbed, and training for future manned missions beyond the Moon. As part of the overall activity, the National Aeronautics and Space Administration (NASA) is investigating how in situ resources can be utilized to improve mission success by reducing up-mass, improving safety, reducing risk, and bringing down costs for the overall mission. Marshall Space Flight Center (MSFC), along with other NASA centers, is supporting this endeavor by exploring how lunar regolith can be mined for uses such as construction, life support, propulsion, power, and fabrication. An infrastructure capable of fabrication and nondestructive evaluation will be needed to support habitat structure development and maintenance, tools and mechanical parts fabrication, as well as repair and replacement of space-mission hardware such as life-support items, vehicle components, and crew systems. This infrastructure will utilize the technologies being developed under the In Situ Fabrication and Repair (ISFR) element, which is working in conjunction with the technologies being developed under the In Situ Resources Utilization (ISRU) element, to live off the land. The ISFR element supports the Space Exploration Initiative by reducing downtime due to failed components; decreasing risk to crew by recovering quickly from degraded operation of equipment; improving system functionality with advanced geometry capabilities; and enhancing mission safety by reducing assembly part counts of original designs where possible. This paper addresses the need and plan for understanding the properties of the lunar regolith to determine the applicability of using this material in a fabrication process.
This effort includes the development of high fidelity simulants that will be used in fabrication processes on the ground to drive down risk and increase the Technology Readiness Level (TRL) prior to implementing this capability on the moon. Also discussed in this paper is the on-going research using Electron Beam Melting (EBM) technology as a possible solution to manufacturing parts and spares on the Moon's surface.

  1. Observational and Modeling Studies of Mixed-Phase Arctic Stratus: Results From M-PACE and Future Investigations as a Part of SEARCH

    NASA Astrophysics Data System (ADS)

    de Boer, G.; Eloranta, E. W.; Tripoli, G. J.; Hashino, T.

    2005-12-01

    A combination of unique observational and modeling tools is being utilized at the University of Wisconsin-Madison to investigate mixed-phase Arctic stratus formation and evolution, and aerosol influence on these processes. The combination of detailed measurements and advanced simulation techniques provides increased insight into processes governing the existence of these cloud structures. Simulations are completed using the Univ. of Wisconsin Non-Hydrostatic Modeling System (UW-NMS). The NMS is fully scalable, and currently being updated to include the Spectral Habitat Ice Prediction System (SHIPS). This new form of microphysics is built on interacting predictive systems for ice and liquid hydrometeors, and aerosols. The hydrometeor size spectra evolve through a modified spectral approach. No a priori assumptions are made about ice characteristics such as habit, size, and density. Instead, they evolve freely. The Univ. of Wisconsin Arctic High-Spectral Resolution Lidar (UW-AHSRL) was designed for long-term unattended Arctic operation and features unique measurement capabilities. Utilizing a molecular reference channel, the AHSRL provides absolutely calibrated measurements of aerosol backscatter cross-section, polarization, and optical depth, in addition to traditional lidar backscatter profiles. Algorithms utilizing AHSRL data in conjunction with millimeter radar data determine microphysical properties such as particle equivalent radius, and potentially liquid and ice water content. The AHSRL was deployed to Barrow, AK as part of M-PACE and is currently located in Eureka, Canada for the SEARCH campaign. Both of these locations host a NOAA Millimeter Wave Cloud Radar, aiding in the implementation of the above-mentioned algorithms. The AHSRL, combined with additional cloud and aerosol measurement instrumentation at these Arctic locations, provides an expansive source of mixed-phase cloud data to be used individually and as validation for UW-NMS simulations. 
We will outline current work being completed at the Univ. of Wisconsin, as well as present results from M-PACE simulations and data analysis and preliminary SEARCH measurements.

  2. NEVESIM: event-driven neural simulation framework with a Python interface.

    PubMed

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies.
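The core idea of event-driven simulation — updating a neuron's state lazily, only when a spike event arrives, and propagating new events through a priority queue — can be sketched in a few lines. This is a generic illustration, not the NEVESIM API:

```python
import heapq
import itertools
import math

class Neuron:
    """Leaky integrate-and-fire unit whose membrane state is updated only
    when an event arrives (a minimal event-driven sketch)."""
    def __init__(self, tau=20.0, threshold=1.0):
        self.tau, self.threshold = tau, threshold
        self.v, self.last_t = 0.0, 0.0
        self.targets = []                       # (target_neuron, weight, delay)

    def receive(self, t, w):
        self.v *= math.exp(-(t - self.last_t) / self.tau)  # decay since last event
        self.last_t, self.v = t, self.v + w
        if self.v >= self.threshold:
            self.v = 0.0
            return True                         # neuron fires
        return False

def simulate(initial_events, t_end):
    """initial_events: iterable of (time, neuron, weight) external spikes.
    Returns the list of (time, neuron) output spikes in time order."""
    tick = itertools.count()                    # heap tie-breaker
    q = [(t, next(tick), n, w) for t, n, w in initial_events]
    heapq.heapify(q)
    spikes = []
    while q and q[0][0] <= t_end:
        t, _, n, w = heapq.heappop(q)
        if n.receive(t, w):
            spikes.append((t, n))
            for tgt, wt, d in n.targets:        # propagate with axonal delay
                heapq.heappush(q, (t + d, next(tick), tgt, wt))
    return spikes

# Two-neuron chain: a strong input drives n1, whose spike drives n2 after 1 ms
n1, n2 = Neuron(), Neuron()
n1.targets.append((n2, 1.5, 1.0))
out = simulate([(0.0, n1, 1.2)], t_end=10.0)
```

Between events no computation happens at all, which is what makes the event-driven strategy exact and efficient for sparse spiking activity.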

  3. NEVESIM: event-driven neural simulation framework with a Python interface

    PubMed Central

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies. PMID:25177291

  4. Large-Eddy / Reynolds-Averaged Navier-Stokes Simulations of a Dual-Mode Scramjet Combustor

    NASA Technical Reports Server (NTRS)

    Fulton, Jesse A.; Edwards, Jack R.; Hassan, Hassan A.; Rockwell, Robert; Goyne, Christopher; McDaniel, James; Smith, Chad; Cutler, Andrew; Johansen, Craig; Danehy, Paul M.

    2012-01-01

    Numerical simulations of reacting and non-reacting flows within a scramjet combustor configuration experimentally mapped at the University of Virginia's Scramjet Combustion Facility (operating with Configuration A) are described in this paper. Reynolds-Averaged Navier-Stokes (RANS) and hybrid Large Eddy Simulation / Reynolds-Averaged Navier-Stokes (LES / RANS) methods are utilized, with the intent of comparing essentially blind predictions with results from non-intrusive flow-field measurement methods including coherent anti-Stokes Raman spectroscopy (CARS), hydroxyl radical planar laser-induced fluorescence (OH-PLIF), stereoscopic particle image velocimetry (SPIV), wavelength modulation spectroscopy (WMS), and focusing Schlieren. NC State's REACTMB solver was used both for RANS and LES / RANS, along with a 9-species, 19-reaction H2-air kinetics mechanism by Jachimowski. Inviscid fluxes were evaluated using Edwards' LDFSS flux-splitting scheme, and the Menter BSL turbulence model was utilized in both full-domain RANS simulations and as the unsteady RANS portion of the LES / RANS closure. Simulations were executed and compared with experiment at two equivalence ratios, PHI = 0.17 and PHI = 0.34. Results show that the PHI = 0.17 flame is hotter near the injector, while the PHI = 0.34 flame is displaced further downstream in the combustor, though it is still anchored to the injector. Reactant mixing was predicted to be much better at the lower equivalence ratio. The LES / RANS model appears to predict lower overall heat release compared to RANS (at least for PHI = 0.17), and its capability to capture the direct effects of larger turbulent eddies leads to much better predictions of reactant mixing and combustion in the flame stabilization region downstream of the fuel injector. Numerical results from the LES / RANS model also show very good agreement with OH-PLIF and SPIV measurements. 
An un-damped long-wave oscillation of the pre-combustion shock train, which caused convergence problems in some RANS simulations, was also captured in LES / RANS simulations, which were able to accommodate its effects accurately.

  5. Operational modeling system with dynamic-wave routing

    USGS Publications Warehouse

    Ishii, A.L.; Charlton, T.J.; Ortel, T.W.; Vonnahme, C.C.; ,

    1998-01-01

    A near real-time streamflow-simulation system utilizing continuous-simulation rainfall-runoff generation with dynamic-wave routing is being developed by the U.S. Geological Survey in cooperation with the Du Page County Department of Environmental Concerns for a 24-kilometer reach of Salt Creek in Du Page County, Illinois. This system is needed in order to more effectively manage the Elmhurst Quarry Flood Control Facility, an off-line stormwater diversion reservoir located along Salt Creek. Near real-time simulation capabilities will enable the testing and evaluation of potential rainfall, diversion, and return-flow scenarios on water-surface elevations along Salt Creek before implementing diversions or return flows. The climatological inputs for the continuous-simulation rainfall-runoff model, Hydrologic Simulation Program - FORTRAN (HSPF), are obtained by Internet access and from a network of radio-telemetered precipitation gages reporting to a base-station computer. The unit-area runoff time series generated from HSPF are the input for the dynamic-wave routing model, Full Equations (FEQ). The Generation and Analysis of Model Simulation Scenarios (GENSCN) interface is used as a pre- and post-processor for managing input data and displaying and managing simulation results. The GENSCN interface includes a variety of graphical and analytical tools for evaluation and quick visualization of the results of operational scenario simulations and thereby makes it possible to obtain the full benefit of the fully distributed dynamic routing results.
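FEQ solves the full dynamic-wave (Saint-Venant) equations, which is well beyond a short example; as a stand-in occupying the same pipeline position — routing an HSPF-style runoff hydrograph to an outflow hydrograph — here is the much simpler classic Muskingum hydrologic method, with parameter values assumed for illustration:

```python
def muskingum(inflow, K, X, dt):
    """Route an inflow hydrograph with the Muskingum method.
    K: storage time constant (same units as dt); X: weighting factor (0-0.5).
    A hydrologic simplification, not FEQ's dynamic-wave solution."""
    d = K * (1.0 - X) + dt / 2.0
    c0 = (dt / 2.0 - K * X) / d
    c1 = (dt / 2.0 + K * X) / d
    c2 = (K * (1.0 - X) - dt / 2.0) / d      # c0 + c1 + c2 == 1 (mass-consistent)
    out = [inflow[0]]
    for i in range(1, len(inflow)):
        out.append(c0 * inflow[i] + c1 * inflow[i - 1] + c2 * out[-1])
    return out

# Triangular storm hydrograph (arbitrary units), routed through one reach
inflow = [0.0, 10.0, 20.0, 30.0, 20.0, 10.0, 0.0, 0.0, 0.0, 0.0]
outflow = muskingum(inflow, K=2.0, X=0.2, dt=1.0)
```

The routed hydrograph shows the qualitative behavior any routing scheme should reproduce: the peak is attenuated and arrives later than the inflow peak.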

  6. Living Together in Space: The International Space Station Internal Active Thermal Control System Issues and Solutions-Sustaining Engineering Activities at the Marshall Space Flight Center From 1998 to 2005

    NASA Technical Reports Server (NTRS)

    Wieland, P. O.; Roman, M. C.; Miller, L.

    2007-01-01

    On board the International Space Station, heat generated by the crew and equipment is removed by the internal active thermal control system to maintain a comfortable working environment and prevent equipment overheating. Test facilities simulating the internal active thermal control system (IATCS) were constructed at the Marshall Space Flight Center as part of the sustaining engineering activities to address concerns related to operational issues, equipment capability, and reliability. A full-scale functional simulator of the Destiny lab module IATCS was constructed and activated prior to launch of Destiny in 2001. This facility simulates the flow and thermal characteristics of the flight system and has a similar control interface. A subscale simulator was built, and activated in 2000, with special attention to materials and proportions of wetted surfaces to address issues related to changes in fluid chemistry, material corrosion, and microbial activity. The flight issues that have arisen and the tests performed using the simulator facilities are discussed in detail. In addition, other test facilities at the MSFC have been used to perform specific tests related to IATCS issues. Future testing is discussed as well as potential modifications to the simulators to enhance their utility.

  7. Development of a High-Fidelity Simulation Environment for Shadow-Mode Assessments of Air Traffic Concepts

    NASA Technical Reports Server (NTRS)

    Robinson, John E., III; Lee, Alan; Lai, Chok Fung

    2017-01-01

    This paper describes the Shadow-Mode Assessment Using Realistic Technologies for the National Airspace System (SMART-NAS) Test Bed. The SMART-NAS Test Bed is an air traffic simulation platform being developed by the National Aeronautics and Space Administration (NASA). The SMART-NAS Test Bed's core purpose is to conduct high-fidelity, real-time, human-in-the-loop and automation-in-the-loop simulations of current and proposed future air traffic concepts for the United States' Next Generation Air Transportation System, called NextGen. The setup, configuration, coordination, and execution of real-time, human-in-the-loop air traffic management simulations are complex, tedious, time-intensive, and expensive. The SMART-NAS Test Bed framework is an alternative to the current approach and will provide services throughout the simulation workflow pipeline to help alleviate these shortcomings. The principal concepts to be simulated include advanced gate-to-gate, trajectory-based operations, widespread integration of novel aircraft such as unmanned vehicles, and real-time safety assurance technologies to enable autonomous operations. To make this possible, the SMART-NAS Test Bed will utilize Web-based technologies, cloud resources, and real-time, scalable communication middleware. This paper describes the SMART-NAS Test Bed's vision, purpose, and concept of use, as well as its potential benefits, key capabilities, high-level requirements, architecture, software design, and usage.

  8. Radio-frequency energy quantification in magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Alon, Leeor

    Mapping of radio frequency (RF) energy deposition has been challenging for 50+ years, especially when scanning patients in the magnetic resonance imaging (MRI) environment. As a result, electromagnetic simulation software is often used for estimating the specific absorption rate (SAR), the rate of RF energy deposition in tissue. The thesis work presents challenges associated with aligning information provided by electromagnetic simulation and MRI experiments. Because of the limitations of simulations, experimental methods for the quantification of SAR were established. A system for quantification of the total RF energy deposition was developed for parallel transmit MRI (a system that uses multiple antennas to excite and image the body). The system is capable of monitoring and predicting channel-by-channel RF energy deposition and whole-body SAR, and of tracking potential hardware failures that occur in the transmit chain and may cause the deposition of excessive energy into patients. Similarly, we demonstrated that local RF power deposition can be mapped and predicted for parallel transmit systems based on a series of MRI temperature mapping acquisitions. Resulting from this work, we developed tools for optimal reconstruction of temperature maps from MRI acquisitions. The tools developed for temperature mapping paved the way for utilizing MRI as a diagnostic tool for evaluation of RF/microwave-emitting device safety. Quantification of the RF energy was demonstrated for both MRI-compatible and non-MRI-compatible devices (such as cell phones), while having the advantage of being noninvasive, of providing millimeter resolution, and of high accuracy.
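The link between MRI temperature mapping and SAR rests on the initial-slope relation SAR = c_p * dT/dt, valid in the short-time regime before heat conduction matters. A minimal sketch with assumed tissue values (not data from the thesis):

```python
import numpy as np

# MR thermometry yields a temperature-vs-time series per voxel; in the
# initial linear regime, SAR = c_p * dT/dt  [W/kg].
c_p = 3600.0                          # J/(kg K), assumed soft-tissue heat capacity
t = np.linspace(0.0, 10.0, 21)        # s, acquisition times
temp = 37.0 + 0.002 * t               # deg C, synthetic 2 mK/s heating ramp

slope = np.polyfit(t, temp, 1)[0]     # least-squares dT/dt over the series
sar = c_p * slope                     # W/kg at this voxel
```

Fitting a slope across the whole series, rather than differencing two frames, is what makes the estimate robust to per-image thermometry noise.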

  9. Emerging technologies in education and training: applications for the laboratory animal science community.

    PubMed

    Ketelhut, Diane Jass; Niemi, Steven M

    2007-01-01

    This article examines several new and exciting communication technologies. Many of the technologies were developed by the entertainment industry; however, other industries are adopting and modifying them for their own needs. These new technologies allow people to collaborate across distance and time and to learn in simulated work contexts. The article explores the potential utility of these technologies for advancing laboratory animal care and use through better education and training. Descriptions include emerging technologies such as augmented reality and multi-user virtual environments, which offer new approaches with different capabilities. Augmented reality interfaces, characterized by the use of handheld computers to infuse the virtual world into the real one, result in deeply immersive simulations. In these simulations, users can access virtual resources and communicate with real and virtual participants. Multi-user virtual environments enable multiple participants to simultaneously access computer-based three-dimensional virtual spaces, called "worlds," and to interact with digital tools. They allow for authentic experiences that promote collaboration, mentoring, and communication. Because individuals may learn or train differently, it is advantageous to combine the capabilities of these technologies and applications with more traditional methods to increase the number of students who are served by using current methods alone. The use of these technologies in animal care and use programs can create detailed training and education environments that allow students to learn the procedures more effectively, teachers to assess their progress more objectively, and researchers to gain insights into animal care.

  10. Upgrades to the Probabilistic NAS Platform Air Traffic Simulation Software

    NASA Technical Reports Server (NTRS)

    Hunter, George; Boisvert, Benjamin

    2013-01-01

    This document is the final report for the project entitled "Upgrades to the Probabilistic NAS Platform Air Traffic Simulation Software." This report consists of 17 sections which document the results of the several subtasks of this effort. The Probabilistic NAS Platform (PNP) is an air operations simulation platform developed and maintained by the Saab Sensis Corporation. The improvements made to the PNP simulation include the following: an airborne distributed separation assurance capability, a required time of arrival assignment and conformance capability, and a tactical and strategic weather avoidance capability.

  11. Incompressible viscous flow simulations of the NFAC wind tunnel

    NASA Technical Reports Server (NTRS)

    Champney, Joelle Milene

    1986-01-01

    The capabilities of an existing 3-D incompressible Navier-Stokes flow solver, INS3D, are extended and improved to solve turbulent flows through the incorporation of zero- and two-equation turbulence models. The two-equation model equations are solved in their high Reynolds number form and utilize wall functions in the treatment of solid wall boundary conditions. The implicit approximate factorization scheme is modified to improve the stability of the two-equation solver. Applications to the 3-D viscous flow inside the 80 by 120 feet open return wind tunnel of the National Full Scale Aerodynamics Complex (NFAC) are discussed and described.

  12. ASTEP user's guide and software documentation

    NASA Technical Reports Server (NTRS)

    Gliniewicz, A. S.; Lachowski, H. M.; Pace, W. H., Jr.; Salvato, P., Jr.

    1974-01-01

    The Algorithm Simulation Test and Evaluation Program (ASTEP) is a modular computer program developed for the purpose of testing and evaluating methods of processing remotely sensed multispectral scanner earth resources data. ASTEP is written in FORTRAN V on the UNIVAC 1110 under the EXEC 8 operating system and may be operated in either a batch or interactive mode. The program currently contains over one hundred subroutines consisting of data classification and display algorithms, statistical analysis algorithms, utility support routines, and feature selection capabilities. The current program can accept data in LARSC1, LARSC2, ERTS, and Universal formats, and can output processed image or data tapes in Universal format.

  13. Interactive Schematic Integration Within the Propellant System Modeling Environment

    NASA Technical Reports Server (NTRS)

    Coote, David; Ryan, Harry; Burton, Kenneth; McKinney, Lee; Woodman, Don

    2012-01-01

    Task requirements for rocket propulsion test preparations of the test stand facilities drive the need to model the test facility propellant systems prior to constructing physical modifications. The Propellant System Modeling Environment (PSME) is an initiative designed to enable increased efficiency and expanded capabilities to a broader base of NASA engineers in the use of modeling and simulation (M&S) technologies for rocket propulsion test and launch mission requirements. PSME will enable a wider scope of users to utilize M&S of propulsion test and launch facilities for predictive and post-analysis functionality by offering a clean, easy-to-use, high-performance application environment.

  14. National Transonic Facility: A review of the operational plan

    NASA Technical Reports Server (NTRS)

    Liepmann, H. W.; Black, R. E.; Dietz, R. O.; Kirchner, M. E.; Sears, W. R.

    1980-01-01

    The proposed National Transonic Facility (NTF) operational plan is reviewed. The NTF will provide an aerodynamic test capability significantly exceeding that of other transonic wind tunnels now available. A limited number of academic research programs that might use the NTF are suggested. It is concluded that the NTF operational plan makes sound use of the management, technical, instrumentation, and model-building techniques available in the specialized field of aerodynamic analysis and simulation. It is also suggested that NASA hold an annual conference to discuss wind tunnel research results and to report on developments that will further improve the utilization and cost effectiveness of the NTF and other wind tunnels.

  15. Visualizing Astronomical Data with Blender

    NASA Astrophysics Data System (ADS)

    Kent, Brian R.

    2014-01-01

    We present methods for using the 3D graphics program Blender in the visualization of astronomical data. The software's forte for animating 3D data lends itself well to use in astronomy. The Blender graphical user interface and Python scripting capabilities can be utilized in the generation of models for data cubes, catalogs, simulations, and surface maps. We review methods for data import, 2D and 3D voxel texture applications, animations, camera movement, and composite renders. Rendering times can be improved by using graphic processing units (GPUs). A number of examples are shown using the software features most applicable to various kinds of data paradigms in astronomy.
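    The voxel-texture workflow the abstract mentions typically requires scaling data-cube intensities into the [0, 1] range Blender's volume textures expect. A minimal stdlib-only sketch of that normalization step (the function name and the toy nested-list cube are illustrative; a real pipeline would read FITS data and hand the values to Blender's Python API, bpy):

```python
# Hypothetical sketch: flattening a 3D astronomical data cube and scaling
# its intensities to [0, 1] for use as a Blender voxel/volume texture.
# The 2x2x2 nested-list "cube" stands in for real FITS cube data.

def normalize_cube(cube):
    """Flatten a 3D data cube and linearly rescale intensities to [0, 1]."""
    flat = [v for plane in cube for row in plane for v in row]
    lo, hi = min(flat), max(flat)
    span = (hi - lo) or 1.0  # avoid division by zero for constant cubes
    return [(v - lo) / span for v in flat]

voxels = normalize_cube([[[0.0, 1.0], [2.0, 3.0]],
                         [[4.0, 5.0], [6.0, 7.0]]])
```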

  16. Research flight-control system development for the F-18 high alpha research vehicle

    NASA Technical Reports Server (NTRS)

    Pahle, Joseph W.; Powers, Bruce; Regenie, Victoria; Chacon, Vince; Degroote, Steve; Murnyak, Steven

    1991-01-01

    The F-18 high alpha research vehicle was recently modified by adding a thrust vectoring control system. A key element in the modification was the development of a research flight control system integrated with the basic F-18 flight control system. Discussed here are design requirements, system development, and research utility of the resulting configuration as an embedded system for flight research in the high angle of attack regime. Particular emphasis is given to control system modifications and control law features required for high angle of attack flight. Simulation results are used to illustrate some of the thrust vectoring control system capabilities and predicted maneuvering improvements.

  17. Application of a stepwise method for analyzing fouling in shell-and-tube exchangers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prieto, M.M.; Miranda, J.; Sigales, B.

    1999-12-01

    This article presents the results of the application of a quite simple method for analyzing shell-side fouling in shell-and-tube exchangers, capable of taking into account the formation of irregular fouling deposits with variable thermal conductivity. This method, based on the utilization of elementary heat exchangers, has been implemented for E-shell TEMA-type heat exchangers with two tube passes. Several fouling deposit distributions have been simulated so as to ascertain their effects on the heat transfer rate. These distributions consider that fouling is concentrated in zones where the temperature of the fluids is maximum or minimum.

  18. Space-shuttle interfaces/utilization. Earth Observatory Satellite system definition study (EOS)

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The economic aspects of applying the Space Shuttle to a representative Earth Observatory Satellite (EOS) operational mission in the candidate Shuttle modes of launch, retrieval, and resupply are discussed. Maintenance of the same mission capability using a conventional launch vehicle is also considered. The studies are based on a sophisticated Monte Carlo mission simulation program originally developed for studies of in-space servicing of a military satellite system. The program has been modified to permit evaluation of Space Shuttle application to low-altitude EOS missions in all three modes. The conclusions generated by the EOS system study are developed.

  19. Modelling and performance assessment of an antenna-control system

    NASA Astrophysics Data System (ADS)

    Burrows, C. R.

    1982-03-01

    An assessment is made of a surveillance-radar control system designed to provide a sector-search capability and continuous control of antenna speed without unwanted torque-reaction on the supporting mast. These objectives are attained by utilizing regenerative braking, and control is exercised through Perbury CVTs. A detailed analysis of the system is given. The models derived for the Perbury CVTs supplement the qualitative data contained in earlier papers. Some results from a computer simulation are presented. Although the paper is concerned with a particular problem, the analysis of the CVTs, and the concept of using energy transfer to control large inertial loads, are of more general interest.

  20. Utilization of the Differential Die-Away Self-Interrogation Technique for Characterization and Verification of Spent Nuclear Fuel

    NASA Astrophysics Data System (ADS)

    Trahan, Alexis Chanel

    New nondestructive assay techniques are sought to better characterize spent nuclear fuel. One of the NDA instruments selected for possible deployment is differential die-away self-interrogation (DDSI). The proposed DDSI approach for spent fuel assembly assay utilizes primarily the spontaneous fission and (alpha, n) neutrons in the assemblies as an internal interrogating radiation source. The neutrons released in spontaneous fission or (alpha,n) reactions are thermalized in the surrounding water and induce fission in fissile isotopes, thereby creating a measurable signal from isotopes of interest that would be otherwise difficult to measure. The DDSI instrument employs neutron coincidence counting with 3He tubes and list-mode-based data acquisition to allow for production of Rossi-alpha distributions (RADs) in post-processing. The list-mode approach to data collection and subsequent construction of RADs has expanded the analytical possibilities, as will be demonstrated throughout this thesis. One of the primary advantages is that the measured signal in the form of a RAD can be analyzed in its entirety including determination of die-away times in different time domains. This capability led to the development of the early die-away method, a novel leakage multiplication determination method which is tested throughout the thesis on different sources in simulation space and fresh fuel experiments. The early die-away method is a robust, accurate, improved method of determining multiplication without the need for knowledge of the (alpha,n) source term. The DDSI technique and instrument are presented along with the many novel capabilities enabled by and discovered through RAD analysis. Among the new capabilities presented are the early die-away method, total plutonium content determination, and highly sensitive missing pin detection. 
Simulations of hundreds of different spent and fresh fuel assemblies were used to develop the analysis algorithms, and the techniques were tested on a variety of spontaneous fission-driven fresh fuel assemblies at Los Alamos National Laboratory and the BeRP ball at the Nevada National Security Site. The development of the new, improved analysis and characterization methods with the DDSI instrument makes it a viable technique for implementation in a facility to meet material control and safeguards needs.
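    The Rossi-alpha distribution at the heart of the DDSI analysis is built in post-processing from list-mode timestamps: for each trigger event, the arrival times of subsequent events within a fixed window are histogrammed. A minimal sketch of that construction (the window, bin width, and toy timestamps are illustrative assumptions, not instrument parameters):

```python
# Hypothetical sketch: building a Rossi-alpha distribution (RAD) from
# time-sorted list-mode neutron detection timestamps. For each event,
# histogram the time differences to all later events inside the window.

def rossi_alpha(timestamps, window=100.0, bin_width=10.0):
    """Histogram of time differences t_j - t_i with 0 < dt < window."""
    nbins = int(window / bin_width)
    hist = [0] * nbins
    for i, t0 in enumerate(timestamps):
        for t in timestamps[i + 1:]:
            dt = t - t0
            if dt >= window:
                break  # timestamps are sorted, so later events only grow dt
            hist[int(dt // bin_width)] += 1
    return hist

# Toy list-mode data (arbitrary time units), sorted in time:
rad = rossi_alpha([0.0, 5.0, 12.0, 250.0, 255.0])
```

    Fitting the decaying tail of such a histogram yields the die-away times that the early die-away method exploits.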

  1. Mitigating Interconnection Challenges of the High Penetration Utility-Interconnected Photovoltaic (PV) in the Electrical Distribution Systems: Cooperative Research and Development Final Report, CRADA Number CRD-14-563

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakraborty, Sudipta

    Various interconnection challenges exist when connecting distributed PV to the electrical distribution grid in terms of safety, reliability, and stability of the electric power systems. Some of the urgent research areas identified by inverter manufacturers, installers, and utilities are the potential for transient overvoltage from PV inverters, multi-inverter anti-islanding, the impact of smart inverters on volt-VAR support, the impact of bidirectional power flow, and the potential for distributed generation curtailment solutions to mitigate grid stability challenges. Under this project, NREL worked with SolarCity to address these challenges through research, testing, and analysis at the Energy Systems Integration Facility (ESIF). Inverters from different manufacturers were tested at ESIF, and NREL's unique power hardware-in-the-loop (PHIL) capability was utilized to evaluate various system-level impacts. Through modeling, simulation, and testing, this project eliminated critical barriers to high PV penetration and directly supported the Department of Energy's SunShot goal of increasing solar PV on the electrical grid.

  2. Design of high-speed burst mode clock and data recovery IC for passive optical network

    NASA Astrophysics Data System (ADS)

    Yan, Minhui; Hong, Xiaobin; Huang, Wei-Ping; Hong, Jin

    2005-09-01

    The design of a high-bit-rate burst-mode clock and data recovery (BMCDR) circuit for gigabit passive optical networks (GPON) is described. A top-down design flow is established, and key issues in behavioural-level modeling are addressed in view of the complexity of the BMCDR integrated circuit (IC). A precise Simulink behavioural model accounting for the saturation of the frequency control voltage is developed for the BMCDR, allowing the parameters of the circuit blocks to be readily adjusted and optimized. The newly designed BMCDR uses a 0.18 µm standard CMOS technology and is shown in simulation to operate at a bit rate of 2.5 Gbps with a recovery time of one bit period. The behavioural model is verified by comparison with detailed circuit simulation.

  3. Unsteady Full Annulus Simulations of a Transonic Axial Compressor Stage

    NASA Technical Reports Server (NTRS)

    Herrick, Gregory P.; Hathaway, Michael D.; Chen, Jen-Ping

    2009-01-01

    Two recent research endeavors in turbomachinery at NASA Glenn Research Center have focused on compression system stall inception and compression system aerothermodynamic performance. Physical experiment and computational research are ongoing in support of these research objectives. TURBO, an unsteady, three-dimensional, Navier-Stokes computational fluid dynamics code commissioned and developed by NASA, has been utilized, enhanced, and validated in support of these endeavors. In the research which follows, TURBO is shown to accurately capture compression system flow range-from choke to stall inception-and also to accurately calculate fundamental aerothermodynamic performance parameters. Rigorous full-annulus calculations are performed to validate TURBO's ability to simulate the unstable, unsteady, chaotic stall inception process; as part of these efforts, full-annulus calculations are also performed at a condition approaching choke to further document TURBO's capabilities to compute aerothermodynamic performance data and support a NASA code assessment effort.

  4. Computational fluid dynamic on the temperature simulation of air preheat effect combustion in propane turbulent flame

    NASA Astrophysics Data System (ADS)

    Elwina; Yunardi; Bindar, Yazid

    2018-04-01

    This paper presents results obtained from the application of the computational fluid dynamics (CFD) code Fluent 6.3 to the modelling of temperature in propane flames with and without air preheat. The study investigates the effect of air preheat temperature on the flame temperature. A standard k-ε model and the Eddy Dissipation model are utilized to represent the flow field and the combustion of the flame, respectively. The calculations are compared with experimental data for propane flames taken from the literature. The results show that the combination of the standard k-ε turbulence model and the eddy dissipation model produces reasonable predictions of temperature, particularly along the axial profiles of all three flames. Both the experiments and the numerical simulations show that increasing the combustion air temperature significantly increases the flame temperature.

  5. Analysis of thematic mapper simulator data collected over eastern North Dakota

    NASA Technical Reports Server (NTRS)

    Anderson, J. E. (Principal Investigator)

    1982-01-01

    The results of the analysis of aircraft-acquired thematic mapper simulator (TMS) data, collected to investigate the utility of thematic mapper data in crop area and land cover estimates, are discussed. Results of the analysis indicate that the seven-channel TMS data are capable of delineating the 13 crop types included in the study to an overall pixel classification accuracy of 80.97% correct, with relative efficiencies for four crop types examined between 1.62 and 26.61. Both supervised and unsupervised spectral signature development techniques were evaluated. The unsupervised methods proved to be inferior (based on analysis of variance) for the majority of crop types considered. Given the ground truth data set used for spectral signature development as well as evaluation of performance, it is possible to demonstrate which signature development technique would produce the highest percent correct classification for each crop type.

  6. Numerical simulation of aerothermal loads in hypersonic engine inlets due to shock impingement

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, R.

    1992-01-01

    The effect of shock impingement on an axial corner simulating the inlet of a hypersonic vehicle engine is modeled using a finite-difference procedure. A three-dimensional dynamic grid adaptation procedure is utilized to move the grids to regions with strong flow gradients. The adaptation procedure uses a grid relocation stencil that is valid at both the interior and boundary points of the finite-difference grid. A linear combination of spatial derivatives of specific flow variables, calculated with finite-element interpolation functions, are used as adaptation measures. This computational procedure is used to study laminar and turbulent Mach 6 flows in the axial corner. The description of flow physics and qualitative measures of heat transfer distributions on cowl and strut surfaces obtained from the analysis are compared with experimental observations. Conclusions are drawn regarding the capability of the numerical scheme for enhanced modeling of high-speed compressible flows.

  7. Attitude determination for high-accuracy submicroradian jitter pointing on space-based platforms

    NASA Astrophysics Data System (ADS)

    Gupta, Avanindra A.; van Houten, Charles N.; Germann, Lawrence M.

    1990-10-01

    A description of the requirement definition process is given for a new wideband attitude determination subsystem (ADS) for image motion compensation (IMC) systems. The subsystem uses either lateral accelerometers functioning in differential pairs or gas-bearing gyros as high-frequency sensors, with CCD-based star trackers as low-frequency sensors. To minimize error, the sensor signals are combined such that the mixing filter introduces no phase distortion. The two ADS models are introduced into an IMC simulation to predict measurement error, correction capability, and residual image jitter for a variety of system parameters. The IMC three-axis testbed is utilized to simulate an incoming beam in inertial space. Results demonstrate that both mechanical and electronic IMC meet the requirements of image stabilization for space-based observation at submicroradian jitter levels. Currently available technology may be employed to implement such IMC systems.

  8. Parallel Proximity Detection for Computer Simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)

    1997-01-01

    The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.
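    The fuzzy-grid idea described above can be illustrated with a small sketch: an object registers with a coarse grid cell, and a fuzzy margin around cell edges lets it also check into neighboring cells without computing exact crossings. The cell size, margin, and function below are illustrative assumptions, not the patented design:

```python
# Hypothetical sketch of fuzzy grid check-in: a 1-D mover registers with
# its grid cell, plus any neighbor cell whose edge lies within the fuzzy
# margin, avoiding exact grid-crossing computation.

CELL = 10.0   # grid cell size (illustrative)
FUZZ = 1.0    # fuzzy-resolution margin around cell edges (illustrative)

def cells_for(x):
    """Set of cells a 1-D position checks into, including fuzzy neighbors."""
    base = int(x // CELL)
    cells = {base}
    if x - base * CELL < FUZZ:          # near the lower cell edge
        cells.add(base - 1)
    if (base + 1) * CELL - x < FUZZ:    # near the upper cell edge
        cells.add(base + 1)
    return cells

# A mover at x = 9.5 sits near the 0/1 boundary, so it checks into both
# cells; a sensor coverage registered in either cell will see it.
occupied = cells_for(9.5)
```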

  9. Parallel Proximity Detection for Computer Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)

    1998-01-01

    The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.

  10. Life sciences Spacelab Mission Development test 3 (SMD 3) data management report

    NASA Technical Reports Server (NTRS)

    Moseley, E. C.

    1977-01-01

    Development of a permanent data system for SMD tests was studied that would simulate all elements of the shuttle onboard, telemetry, and ground data systems that are involved with spacelab operations. The onboard data system (ODS) and the ground data system (GDS) were utilized. The air-to-ground link was simulated by a hardwired computer-to-computer interface. A patch board system was used on board to select experiment inputs, and the downlink configuration from the ODS was changed by a crew keyboard entry to support each experiment. The ODS provided a CRT display of experiment parameters to enable the crew to monitor experiment performance. An onboard analog system, with recording capability, was installed to handle high rate data and to provide a backup to the digital system. The GDS accomplished engineering unit conversion and limit sensing, and provided realtime parameter display on CRT's in the science monitoring area and the test control area.

  11. A simulation model for wind energy storage systems. Volume 2: Operation manual

    NASA Technical Reports Server (NTRS)

    Warren, A. W.; Edsinger, R. W.; Burroughs, J. D.

    1977-01-01

    A comprehensive computer program (SIMWEST) developed for the modeling of wind energy/storage systems utilizing any combination of five types of storage (pumped hydro, battery, thermal, flywheel, and pneumatic) is described. Features of the program include: a precompiler which generates computer models (in FORTRAN) of complex wind source/storage/application systems from user specifications using the respective library components; a program which provides the techno-economic system analysis with the respective I/O, the integration of system dynamics, and the iteration for conveyance of variables; and the capability to evaluate economic feasibility as well as general performance of wind energy systems. The SIMWEST operation manual is presented, and the usage of the SIMWEST program and the design of the library components are described. A number of example simulations intended to familiarize the user with the program's operation are given, along with a listing of each SIMWEST library subroutine.

  12. Hydrodynamic Assists Magnetophoreses Rare Cancer cells Separation in Microchannel Simulation and Experimental Verifications

    NASA Astrophysics Data System (ADS)

    Saeed, O.; Duru, L.; Yulin, D.

    2018-05-01

    A proposed microfluidic design has been fabricated and simulated using COMSOL Multiphysics software, based on the two physical models included in the design. The device's ability to create a narrow stream of the core sample by controlling the sheath flow rates Qs1 and Qs2 in the two peripheral channels was investigated. The main goal of this paper is to study the possibility of combining hydrodynamic and magnetic techniques in order to achieve a high rate of cancer cell separation from a cell mixture and/or buffer sample. The study was conducted in two stages: first, the effects of the sheath flow rates (Qs1 and Qs2) on sample stream focusing were studied to determine the optimal operating conditions of the proposed device and its cell-focusing capability; then the magnetic mechanism was utilized to finalize the separation of the pre-labelled cells.

  13. Transport and discrete particle noise in gyrokinetic simulations

    NASA Astrophysics Data System (ADS)

    Jenkins, Thomas; Lee, W. W.

    2006-10-01

    We present results from our recent investigations regarding the effects of discrete particle noise on the long-time behavior and transport properties of gyrokinetic particle-in-cell simulations. It is found that the amplitude of nonlinearly saturated drift waves is unaffected by discreteness-induced noise in plasmas whose behavior is dominated by a single mode in the saturated state. We further show that the scaling of this noise amplitude with particle count is correctly predicted by the fluctuation-dissipation theorem, even though the drift waves have driven the plasma from thermal equilibrium. As well, we find that the long-term behavior of the saturated system is unaffected by discreteness-induced noise even when multiple modes are included. Additional work utilizing a code with both total-f and δf capabilities is also presented, as part of our efforts to better understand the long-time balance between entropy production, collisional dissipation, and particle/heat flux in gyrokinetic plasmas.

  14. Magnetic-field enhancement beyond the skin-depth limit

    NASA Astrophysics Data System (ADS)

    Shin, Jonghwa; Park, Namkyoo; Fan, Shanhui; Lee, Yong-Hee

    2010-02-01

    Electric field enhancement has been actively studied recently, and many metallic structures capable of locally enhancing the electric field have been reported. Babinet's principle can be utilized, especially in the form of Booker's extension, to transform known electric-field-enhancing structures into magnetic-field-enhancing structures. The authors explain this transformation process and discuss the regime in which the principle breaks down: unless the metals used can be well approximated by a PEC model, the principle's predictions fail to hold. The authors confirm this using numerical simulations based on realistic material parameters for actual metals. The discrepancy is large especially when the structural dimensions are comparable to or smaller than the skin depth at the wavelength of interest. An alternative way to achieve magnetic field enhancement is presented, and the design of a connected bow-tie structure is proposed as an example. FDTD simulation results confirm the operation of the proposed structure.

  15. Time series modeling for syndromic surveillance.

    PubMed

    Reis, Ben Y; Mandl, Kenneth D

    2003-01-23

    Emergency department (ED) based syndromic surveillance systems identify abnormally high visit rates that may be an early signal of a bioterrorist attack. For example, an anthrax outbreak might first be detectable as an unusual increase in the number of patients reporting to the ED with respiratory symptoms. Reliably identifying these abnormal visit patterns requires a good understanding of the normal patterns of healthcare usage. Unfortunately, systematic methods for determining the expected number of ED visits on a particular day have not yet been well established. We present here a generalized methodology for developing models of expected ED visit rates. Using time-series methods, we developed robust models of ED utilization for the purpose of defining expected visit rates. The models were based on nearly a decade of historical data at a major metropolitan academic, tertiary care pediatric emergency department. The historical data were fit using trimmed-mean seasonal models, and additional models were fit with autoregressive integrated moving average (ARIMA) residuals to account for recent trends in the data. The detection capabilities of the model were tested with simulated outbreaks. Models were built both for overall visits and for respiratory-related visits, classified according to the chief complaint recorded at the beginning of each visit. The mean absolute percentage error of the ARIMA models was 9.37% for overall visits and 27.54% for respiratory visits. A simple detection system based on the ARIMA model of overall visits was able to detect 7-day-long simulated outbreaks of 30 visits per day with 100% sensitivity and 97% specificity. Sensitivity decreased with outbreak size, dropping to 94% for outbreaks of 20 visits per day, and 57% for 10 visits per day, all while maintaining a 97% benchmark specificity. Time series methods applied to historical ED utilization data are an important tool for syndromic surveillance. 
Accurate forecasting of emergency department total utilization as well as the rates of particular syndromes is possible. The multiple models in the system account for both long-term and recent trends, and an integrated alarms strategy combining these two perspectives may provide a more complete picture to public health authorities. The systematic methodology described here can be generalized to other healthcare settings to develop automated surveillance systems capable of detecting anomalies in disease patterns and healthcare utilization.
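    The trimmed-mean seasonal baseline described above can be sketched in a few lines: group historical counts by day of week, drop the extremes, and alarm when an observed count exceeds the baseline by a margin. The trimming fraction, threshold, and synthetic data below are illustrative assumptions, not the paper's fitted models (which add ARIMA residual terms):

```python
# Hypothetical sketch: a day-of-week trimmed-mean baseline for expected ED
# visits, with a simple threshold alarm. Real systems would model ARIMA
# residuals on top of this seasonal expectation.

def trimmed_mean(values, trim=0.1):
    """Mean after discarding the top and bottom `trim` fraction of values."""
    k = int(len(values) * trim)
    vals = sorted(values)[k:len(values) - k if k else None]
    return sum(vals) / len(vals)

def expected_by_weekday(history):
    """history: list of (weekday, count) pairs -> weekday -> baseline."""
    by_day = {}
    for day, count in history:
        by_day.setdefault(day, []).append(count)
    return {day: trimmed_mean(counts) for day, counts in by_day.items()}

# Synthetic ten-week history: each weekday has a characteristic visit level.
history = [(d % 7, 100 + (d % 7) * 5) for d in range(70)]
baseline = expected_by_weekday(history)
alarm = 160 > baseline[0] + 30  # 160 observed Monday visits vs ~100 expected
```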

  16. Air Force Strategy Study 2020-2030

    DTIC Science & Technology

    2011-01-01

    shocks (also known as “black swans”), having the potential to radically alter the utility of the capability, as a way of highlighting... utilized by its expeditionary air units.”5 The Air Force must present strategic and operational options along with forces capable of operating and... Emergency Management Agency (FEMA) regional staffs, in part representing the service and USNORTHCOM. The imagery analysts’ utility is largely due

  17. Status of NASA/Army rotorcraft research and development piloted flight simulation

    NASA Technical Reports Server (NTRS)

    Condon, Gregory W.; Gossett, Terrence D.

    1988-01-01

    The status of the major NASA/Army capabilities in piloted rotorcraft flight simulation is reviewed. The requirements for research and development piloted simulation are addressed as well as the capabilities and technologies that are currently available or are being developed by NASA and the Army at Ames. The application of revolutionary advances (in visual scene, electronic cockpits, motion, and modelling of interactive mission environments and/or vehicle systems) to the NASA/Army facilities are also addressed. Particular attention is devoted to the major advances made in integrating these individual capabilities into fully integrated simulation environment that were or are being applied to new rotorcraft mission requirements. The specific simulators discussed are the Vertical Motion Simulator and the Crew Station Research and Development Facility.

  18. GMI Capabilities

    NASA Technical Reports Server (NTRS)

    Strode, Sarah; Rodriguez, Jose; Steenrod, Steve; Liu, Junhua; Strahan, Susan; Nielsen, Eric

    2015-01-01

    We describe the capabilities of the Global Modeling Initiative (GMI) chemical transport model (CTM) with a special focus on capabilities related to the Atmospheric Tomography Mission (ATom). Several science results based on GMI hindcast simulations and preliminary results from the ATom simulations are highlighted. We also discuss the relationship between GMI and GEOS-5.

  19. A STUDY OF SIMULATOR CAPABILITIES IN AN OPERATIONAL TRAINING PROGRAM.

    ERIC Educational Resources Information Center

    MEYER, DONALD E.; AND OTHERS

    THE EXPERIMENT WAS CONDUCTED TO DETERMINE THE EFFECTS OF SIMULATOR TRAINING TO CRITERION PROFICIENCY UPON TIME REQUIRED IN THE AIRCRAFT. DATA WERE ALSO COLLECTED ON PROFICIENCY LEVELS ATTAINED, SELF-CONFIDENCE LEVELS, INDIVIDUAL ESTIMATES OF CAPABILITY, AND SOURCES FROM WHICH THAT CAPABILITY WAS DERIVED. SUBJECTS FOR THE EXPERIMENT--48 AIRLINE…

  20. Microfluidic device capable of medium recirculation for non-adherent cell culture

    PubMed Central

    Dixon, Angela R.; Rajan, Shrinidhi; Kuo, Chuan-Hsien; Bersano, Tom; Wold, Rachel; Futai, Nobuyuki; Takayama, Shuichi; Mehta, Geeta

    2014-01-01

    We present a microfluidic device designed for maintenance and culture of non-adherent mammalian cells, which enables both recirculation and refreshing of medium, as well as easy harvesting of cells from the device. We demonstrate fabrication of a novel microfluidic device utilizing Braille perfusion for peristaltic fluid flow to enable switching between recirculation and refresh flow modes. Utilizing fluid flow simulations and the non-adherent human promyelocytic leukemia cell line HL-60, we demonstrate the utility of this RECIR-REFRESH device. With computer simulations, we profiled fluid flow and concentration gradients of autocrine factors and found that the geometry of the cell culture well plays a key role in cell entrapping and retaining autocrine and soluble factors. We subjected HL-60 cells, in the device, to a treatment regimen of 1.25% dimethylsulfoxide, every other day, to provoke differentiation and measured subsequent expression of CD11b on day 2 and day 4 and tumor necrosis factor-alpha (TNF-α) on day 4. Our findings show perfusion-sensitive CD11b expression, but not TNF-α build-up, by day 4 of culture, with a 1:1 ratio of recirculation to refresh flow yielding the greatest increase in CD11b levels. RECIR-REFRESH facilitates programmable levels of cell differentiation in a HL-60 non-adherent cell population and can be expanded to other types of non-adherent cells such as hematopoietic stem cells. PMID:24753733

  1. Large Eddy Simulation of Engineering Flows: A Bill Reynolds Legacy.

    NASA Astrophysics Data System (ADS)

    Moin, Parviz

    2004-11-01

    The term large eddy simulation (LES) was coined by Bill Reynolds thirty years ago, when he and his colleagues pioneered the introduction of LES in the engineering community. Bill's legacy in LES features his insistence on having a proper mathematical definition of the large scale field independent of the numerical method used, and his vision for using numerical simulation output as data for research in turbulence physics and modeling, just as one would think of using experimental data. However, as an engineer, Bill was predominantly interested in the predictive capability of computational fluid dynamics and in particular LES. In this talk I will present the state of the art in large eddy simulation of complex engineering flows. Most of this technology has been developed in the Department of Energy's ASCI Program at Stanford, which was led by Bill in the last years of his distinguished career. At the core of this technology is a fully implicit non-dissipative LES code which uses unstructured grids with arbitrary elements. A hybrid Eulerian/Lagrangian approach is used for multi-phase flows, and chemical reactions are introduced through dynamic equations for mixture fraction and reaction progress variable in conjunction with flamelet tables. The predictive capability of LES is demonstrated in several validation studies in flows with complex physics and complex geometry, including flow in the combustor of a modern aircraft engine. LES in such a complex application is only possible through efficient utilization of modern parallel super-computers, which was recognized and emphasized by Bill from the beginning. The presentation will include a brief mention of computer science efforts for efficient implementation of LES.

  2. Simulation and fitting of complex reaction network TPR: The key is the objective function

    DOE PAGES

    Savara, Aditya Ashi

    2016-07-07

    In this research, a method has been developed for finding improved fits during simulation and fitting of data from complex reaction network temperature programmed reactions (CRN-TPR). It was found that simulation and fitting of CRN-TPR presents additional challenges relative to simulation and fitting of simpler TPR systems. The method used here can enable checking the plausibility of proposed chemical mechanisms and kinetic models. The most important finding was that when choosing an objective function, use of an objective function that is based on integrated production provides more utility in finding improved fits when compared to an objective function based on the rate of production. The response surface produced by using the integrated production is monotonic, suppresses effects from experimental noise, requires fewer points to capture the response behavior, and can be simulated numerically with smaller errors. For CRN-TPR, there is increased importance (relative to simple reaction network TPR) in resolving of peaks prior to fitting, as well as from weighting of experimental data points. Using an implicit ordinary differential equation solver was found to be inadequate for simulating CRN-TPR. Lastly, the method employed here was capable of attaining improved fits in simulation and fitting of CRN-TPR when starting with a postulated mechanism and physically realistic initial guesses for the kinetic parameters.
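    The abstract's central point about objective-function choice can be illustrated with a small synthetic sketch (the Gaussian peak shape, noise level, and temperature grid below are invented for illustration; this is not the paper's CRN-TPR model):

```python
import numpy as np

rng = np.random.default_rng(0)
T = np.linspace(300.0, 600.0, 301)   # 1 K steps over the temperature ramp

def rate(T, Tp=450.0, w=25.0):
    """Idealized single-peak desorption rate (a Gaussian; illustrative only)."""
    return np.exp(-0.5 * ((T - Tp) / w) ** 2)

observed = rate(T) + rng.normal(0.0, 0.05, T.size)   # noisy synthetic "data"

def integrated(y, T):
    """Cumulative (integrated) production via the trapezoid rule."""
    return np.concatenate(([0.0], np.cumsum(0.5 * (y[1:] + y[:-1]) * np.diff(T))))

def sse(a, b):
    return float(np.sum((a - b) ** 2))

def rate_objective(Tp):
    """Objective based on the rate of production."""
    return sse(rate(T, Tp), observed)

def integrated_objective(Tp):
    """Objective based on integrated production."""
    return sse(integrated(rate(T, Tp), T), integrated(observed, T))
```

    Scanning `Tp` over a grid shows the integrated objective rising smoothly away from the true peak position, while the rate-based objective inherits the raw noise floor.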

  3. Training Community Modeling and Simulation Business Plan, 2007 Edition. Volume 1: Review of Training Capabilities

    DTIC Science & Technology

    2009-02-01

    Training Community Modeling and Simulation Business Plan, 2007 Edition, Volume I: Review of Training Capabilities. J.D. Fletcher, IDA; Frederick E. Hartman, IDA; Robert Halayko, Addx Corp... Steering Committee for the training community led by the Office of the Under Secretary of Defense (Personnel and Readiness), OUSD(P&R). The task was

  4. Image Simulation and Assessment of the Colour and Spatial Capabilities of the Colour and Stereo Surface Imaging System (CaSSIS) on the ExoMars Trace Gas Orbiter

    NASA Astrophysics Data System (ADS)

    Tornabene, Livio L.; Seelos, Frank P.; Pommerol, Antoine; Thomas, Nicholas; Caudill, C. M.; Becerra, Patricio; Bridges, John C.; Byrne, Shane; Cardinale, Marco; Chojnacki, Matthew; Conway, Susan J.; Cremonese, Gabriele; Dundas, Colin M.; El-Maarry, M. R.; Fernando, Jennifer; Hansen, Candice J.; Hansen, Kayle; Harrison, Tanya N.; Henson, Rachel; Marinangeli, Lucia; McEwen, Alfred S.; Pajola, Maurizio; Sutton, Sarah S.; Wray, James J.

    2018-02-01

    This study aims to assess the spatial and visible/near-infrared (VNIR) colour/spectral capabilities of the 4-band Colour and Stereo Surface Imaging System (CaSSIS) aboard the ExoMars 2016 Trace Gas Orbiter (TGO). The instrument response functions for the CaSSIS imager were used to resample spectral libraries and modelled spectra, and to construct spectrally (i.e., in I/F space) and spatially consistent simulated CaSSIS image cubes of various key sites of interest and for ongoing scientific investigations on Mars. Coordinated datasets from Mars Reconnaissance Orbiter (MRO) are ideal, and specifically used for simulating CaSSIS. The Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) provides colour information, while the Context Imager (CTX), and in a few cases the High-Resolution Imaging Science Experiment (HiRISE), provides the complementary spatial information at the resampled CaSSIS unbinned/unsummed pixel resolution (4.6 m/pixel from a 400-km altitude). The methodology used herein employs a Gram-Schmidt spectral sharpening algorithm to combine the ˜18-36 m/pixel CRISM-derived CaSSIS colours with I/F images primarily derived from oversampled CTX images. One hundred and eighty-one simulated CaSSIS 4-colour image cubes (at 18-36 m/pixel) were generated (including one of Phobos) based on CRISM data. From these, thirty-three "fully"-simulated image cubes of thirty unique locations on Mars (i.e., with 4 colour bands at 4.6 m/pixel) were made. All simulated image cubes were used to test the colour capabilities of CaSSIS by producing standard colour RGB images, colour band ratio composites (CBRCs) and spectral parameters. Simulated CaSSIS CBRCs demonstrated that CaSSIS will be able to readily isolate signatures related to ferrous (Fe2+) iron- and ferric (Fe3+) iron-bearing deposits on the surface of Mars, ices and atmospheric phenomena. 
Despite the lower spatial resolution of CaSSIS when compared to HiRISE, the results of this work demonstrate that CaSSIS will not only complement HiRISE-scale studies of various geological and seasonal phenomena, but will also enhance them by providing additional colour and geologic context through its wider and longer full-colour coverage (˜9.4 × 50 km), and its increased sensitivity to iron-bearing materials from its two IR bands (RED and NIR). In a few examples, subtle surface changes that were not easily detected by HiRISE were identified in the simulated CaSSIS images. This study also demonstrates the utility of the Gram-Schmidt spectral pan-sharpening technique to extend VNIR colour/spectral capabilities from a lower spatial resolution colour/spectral dataset to a single-band or panchromatic greyscale image with higher resolution. These higher resolution colour products (simulated CaSSIS or otherwise) are useful as a means to extend both geologic context and mapping of datasets with coarser spatial resolutions. The results of this study indicate that the TGO mission objectives, as well as the instrument-specific mission objectives, will be achievable with CaSSIS.
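    The detail-injection idea behind component-substitution sharpening of this kind can be sketched in a few lines (a deliberately simplified scheme in the spirit of Gram-Schmidt sharpening, not the CRISM/CTX pipeline itself; the synthetic-pan-as-band-mean choice and the toy arrays are assumptions):

```python
import numpy as np

def sharpen(ms_lowres, pan_highres, scale):
    """ms_lowres: (bands, h, w) colour cube; pan_highres: (h*scale, w*scale) band."""
    # Naive nearest-neighbour upsampling of each colour band
    ms_up = ms_lowres.repeat(scale, axis=1).repeat(scale, axis=2)
    # Synthetic low-resolution panchromatic band = mean of the colour bands
    synth = ms_up.mean(axis=0)
    # Inject spatial detail: scale every band by (true pan / synthetic pan)
    gain = pan_highres / np.maximum(synth, 1e-6)
    return ms_up * gain[None, :, :]

rng = np.random.default_rng(2)
ms = rng.uniform(0.1, 1.0, (4, 8, 8))    # four CaSSIS-like bands, low resolution
pan = rng.uniform(0.1, 1.0, (16, 16))    # single high-resolution band
out = sharpen(ms, pan, 2)
```

    By construction the band mean of the sharpened cube reproduces the high-resolution band, while band ratios of the input are preserved at each pixel.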

  5. 77 FR 77185 - Proposed Collection; Comment Request; Office of Small and Disadvantaged Business Utilization

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-31

    ... Business Utilization AGENCY: Departmental Offices, Department of Treasury. ACTION: Notice and request for... Capability Statement will be used by firms that wish to do business with the Department of the Treasury. The... businesses to perform on Treasury contracts. Current Actions: The Electronic Capability Statement was...

  6. Production of lactic acid from hemicellulose extracts by Bacillus coagulans MXL-9

    USDA-ARS?s Scientific Manuscript database

    Bacillus coagulans MXL-9 was found capable of growing on pre-pulping hemicellulose extracts, utilizing all of the principal monosugars found in woody biomass. This organism is a moderate thermophile isolated from compost for its pentose utilizing capabilities. It was found to have high tolerance f...

  7. A chemical EOR benchmark study of different reservoir simulators

    NASA Astrophysics Data System (ADS)

    Goudarzi, Ali; Delshad, Mojdeh; Sepehrnoori, Kamy

    2016-09-01

    Interest in chemical EOR processes has intensified in recent years due to the advancements in chemical formulations and injection techniques. Injecting Polymer (P), surfactant/polymer (SP), and alkaline/surfactant/polymer (ASP) are techniques for improving sweep and displacement efficiencies with the aim of improving oil production in both secondary and tertiary floods. There has been great interest in chemical flooding recently for different challenging situations. These include high temperature reservoirs, formations with extreme salinity and hardness, naturally fractured carbonates, and sandstone reservoirs with heavy and viscous crude oils. More oil reservoirs are reaching maturity where secondary polymer floods and tertiary surfactant methods have become increasingly important. This significance has added to the industry's interest in using reservoir simulators as tools for reservoir evaluation and management to minimize costs and increase the process efficiency. Reservoir simulators with special features are needed to represent coupled chemical and physical processes present in chemical EOR processes. The simulators need to be first validated against well controlled lab and pilot scale experiments to reliably predict the full field implementations. The available data from laboratory scale include 1) phase behavior and rheological data; and 2) results of secondary and tertiary coreflood experiments for P, SP, and ASP floods under reservoir conditions, i.e. chemical retentions, pressure drop, and oil recovery. Data collected from corefloods are used as benchmark tests comparing numerical reservoir simulators with chemical EOR modeling capabilities such as STARS of CMG, ECLIPSE-100 of Schlumberger, REVEAL of Petroleum Experts. The research UTCHEM simulator from The University of Texas at Austin is also included since it has been the benchmark for chemical flooding simulation for over 25 years. 
The results of this benchmark comparison will be utilized to improve chemical design for field-scale studies using commercial simulators. The benchmark tests illustrate the potential of commercial simulators for chemical flooding projects and provide a comprehensive table of strengths and limitations of each simulator for a given chemical EOR process. Mechanistic simulations of chemical EOR processes will provide predictive capability and can aid in optimization of the field injection projects. The objective of this paper is not to compare the computational efficiency and solution algorithms; it only focuses on the process modeling comparison.

  8. Health IT and inappropriate utilization of outpatient imaging: A cross-sectional study of U.S. hospitals.

    PubMed

    Appari, Ajit; Johnson, M Eric; Anthony, Denise L

    2018-01-01

    To determine whether the use of information technology (IT), measured by Meaningful Use capability, is associated with lower rates of inappropriate utilization of imaging services in hospital outpatient settings. A retrospective cross-sectional analysis of 3332 nonfederal U.S. hospitals using data from: Hospital Compare (2011 outpatient imaging efficiency measures), HIMSS Analytics (2009 health IT), and Health Indicator Warehouse (market characteristics). Hospitals were categorized for their health IT infrastructure including EHR Stage-1 capability, and three advanced imaging functionalities/systems including integrated picture archiving and communication system, Web-based image distribution, and clinical decision support (CDS) with physician pathways. Three imaging efficiency measures suggesting inappropriate utilization during 2011 included: percentage of "combined" (with and without contrast) computed tomography (CT) studies out of all CT studies for abdomen and chest respectively, and percentage of magnetic resonance imaging (MRI) studies of lumbar spine without antecedent conservative therapy within 60 days. For each measure, three separate regression models (GLM with gamma-log link function, and denominator of imaging measure as exposure) were estimated adjusting for hospital characteristics, market characteristics, and state fixed effects. Additionally, Heckman's Inverse Mills Ratio and propensity for Stage-1 EHR capability were used to account for selection bias. We find support for association of each of the four health IT capabilities with inappropriate utilization rates of one or more imaging modality. Stage-1 EHR capability is associated with lower inappropriate utilization rates for chest CT (incidence rate ratio IRR=0.72, p-value <0.01) and lumbar MRI (IRR=0.87, p-value <0.05). Integrated PACS is associated with lower inappropriate utilization rate of abdomen CT (IRR=0.84, p-value <0.05). 
Imaging distribution over Web capability is associated with lower inappropriate utilization rates for chest CT (IRR=0.66, p-value <0.05) and lumbar MRI (IRR=0.86, p-value <0.05). CDS with physician pathways is associated with lower inappropriate utilization rates for abdomen CT (IRR=0.87, p-value <0.01) and lumbar MRI (IRR=0.90, p-value <0.05). All other cases showed no association. The study offers mixed results. Taken together, the results suggest that the use of Stage-1 Meaningful Use capable EHR systems along with advanced imaging related functionalities could have a beneficial impact on reducing some of the inappropriate utilization of outpatient imaging.
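    The incidence rate ratio (IRR) reported throughout the abstract can be illustrated with a toy rate model (the hospital counts, exposures, and effect size below are invented; only the 0.72 chest-CT IRR figure comes from the abstract):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
has_ehr = rng.integers(0, 2, n)                    # hypothetical Stage-1 EHR flag
exposure = rng.integers(50, 500, n).astype(float)  # imaging studies per hospital
true_irr = 0.72                                    # e.g. the chest-CT IRR reported
rate = 0.08 * true_irr ** has_ehr                  # inappropriate-use rate per study
counts = rng.poisson(exposure * rate)              # inappropriate studies observed

# With a single binary covariate, the MLE of the IRR in a log-link rate model
# with the study denominator as exposure reduces to a ratio of pooled rates.
rate_ehr = counts[has_ehr == 1].sum() / exposure[has_ehr == 1].sum()
rate_no = counts[has_ehr == 0].sum() / exposure[has_ehr == 0].sum()
irr_hat = rate_ehr / rate_no
```

    The paper's models additionally adjust for hospital and market covariates, which is what the GLM with gamma-log link and exposure denominator provides over this two-group ratio.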

  9. Water Electrolysis for In-Situ Resource Utilization (ISRU)

    NASA Technical Reports Server (NTRS)

    Lee, Kristopher A.

    2016-01-01

    Sending humans to Mars for any significant amount of time will require capabilities and technologies that enable Earth independence. To move towards this independence, the resources found on Mars must be utilized to produce the items needed to sustain humans away from Earth. To accomplish this task, NASA is studying In Situ Resource Utilization (ISRU) systems and techniques to make use of the atmospheric carbon dioxide and the water found on Mars. Among other things, these substances can be harvested and processed to make oxygen and methane. Oxygen is essential, not only for sustaining the lives of the crew on Mars, but also as the oxidizer for an oxygen-methane propulsion system that could be utilized on a Mars ascent vehicle. Given the presence of water on Mars, the electrolysis of water is a common technique to produce the desired oxygen. Towards this goal, NASA designed and developed a Proton Exchange Membrane (PEM) water electrolysis system, which was originally slated to produce oxygen for propulsion and fuel cell use in the Mars Atmosphere and Regolith COllector/PrOcessor for Lander Operations (MARCO POLO) project. As part of the Human Exploration Spacecraft Testbed for Integration and Advancement (HESTIA) project, this same electrolysis system, originally targeted at enabling in situ propulsion and power, operated in a life-support scenario. During HESTIA testing at Johnson Space Center, the electrolysis system supplied oxygen to a chamber simulating a habitat housing four crewmembers. Inside the chamber, oxygen was removed from the atmosphere to simulate consumption by the crew, and the electrolysis system's oxygen was added to replenish it. The electrolysis system operated nominally throughout the duration of the HESTIA test campaign, and the oxygen levels in the life support chamber were maintained at the desired levels.
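    The oxygen production of a PEM electrolysis stack like the one described follows directly from Faraday's law; the stack current and cell count below are invented placeholders, not the MARCO POLO/HESTIA hardware values:

```python
# Faraday's-law estimate of O2 production from a PEM water-electrolysis stack.
F = 96485.0           # Faraday constant, C/mol
M_O2 = 31.998e-3      # molar mass of O2, kg/mol
current = 50.0        # stack current in A (assumed)
cells = 20            # number of cells in series (assumed)

# Anode reaction 2 H2O -> O2 + 4 H+ + 4 e-: four electrons per O2 molecule
mol_O2_per_s = current * cells / (4.0 * F)
kg_O2_per_day = mol_O2_per_s * M_O2 * 86400.0
```

    With these assumed numbers the stack yields roughly 7 kg of oxygen per day, the kind of figure that sizes a four-crew life-support demonstration.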

  10. A High-Resolution Capability for Large-Eddy Simulation of Jet Flows

    NASA Technical Reports Server (NTRS)

    DeBonis, James R.

    2011-01-01

    A large-eddy simulation (LES) code that utilizes high-resolution numerical schemes is described and applied to a compressible jet flow. The code is written in a general manner such that the accuracy/resolution of the simulation can be selected by the user. Time discretization is performed using a family of low-dispersion Runge-Kutta schemes, selectable from first- to fourth-order. Spatial discretization is performed using central differencing schemes. Both standard schemes, second- to twelfth-order (3 to 13 point stencils), and Dispersion Relation Preserving schemes from 7 to 13 point stencils are available. The code is written in Fortran 90 and uses hybrid MPI/OpenMP parallelization. The code is applied to the simulation of a Mach 0.9 jet flow. Four-stage third-order Runge-Kutta time stepping and the 13 point DRP spatial discretization scheme of Bogey and Bailly are used. The high-resolution numerics used allow for the use of relatively sparse grids. Three levels of grid resolution are examined: 3.5, 6.5, and 9.2 million points. Mean flow, first-order turbulent statistics and turbulent spectra are reported. Good agreement with experimental data for mean flow and first-order turbulent statistics is shown.
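    The selectable-order central differencing the abstract describes can be sketched for the two lowest orders (a generic illustration of standard 3- and 5-point stencils; the actual code also offers the Bogey-Bailly DRP stencils, which are not reproduced here):

```python
import numpy as np

def ddx_central(u, dx, order=4):
    """First derivative on a periodic grid, 2nd- or 4th-order central difference."""
    if order == 2:
        # 3-point stencil: (u[i+1] - u[i-1]) / (2 dx)
        return (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)
    if order == 4:
        # 5-point stencil: (8(u[i+1] - u[i-1]) - (u[i+2] - u[i-2])) / (12 dx)
        return (8.0 * (np.roll(u, -1) - np.roll(u, 1))
                - (np.roll(u, -2) - np.roll(u, 2))) / (12.0 * dx)
    raise ValueError("this sketch implements order 2 and 4 only")

# Verify the formal order of accuracy on a sine wave
n = 64
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
dx = x[1] - x[0]
u = np.sin(x)
err2 = float(np.max(np.abs(ddx_central(u, dx, 2) - np.cos(x))))
err4 = float(np.max(np.abs(ddx_central(u, dx, 4) - np.cos(x))))
```

    The higher-order stencil's much smaller error on the same grid is exactly why high-resolution numerics permit the relatively sparse grids noted in the abstract.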

  11. Vista/F-16 Multi-Axis Thrust Vectoring (MATV) control law design and evaluation

    NASA Technical Reports Server (NTRS)

    Zwerneman, W. D.; Eller, B. G.

    1994-01-01

    For the Multi-Axis Thrust Vectoring (MATV) program, a new control law was developed using multi-axis thrust vectoring to augment the aircraft's aerodynamic control power to provide maneuverability above the normal F-16 angle of attack limit. The control law architecture was developed using Lockheed Fort Worth's offline and piloted simulation capabilities. The final flight control laws were used in flight test to demonstrate tactical benefits gained by using thrust vectoring in air-to-air combat. Differences between the simulator aerodynamics data base and the actual aircraft aerodynamics led to significantly different lateral-directional flying qualities during the flight test program than those identified during piloted simulation. A 'dial-a-gain' flight test control law update was performed in the middle of the flight test program. This approach allowed for inflight optimization of the aircraft's flying qualities. While this approach is not preferred over updating the simulator aerodynamic data base and then updating the control laws, the final selected gain set did provide adequate lateral-directional flying qualities over the MATV flight envelope. The resulting handling qualities and the departure resistance of the aircraft allowed the 422nd Squadron pilots to focus entirely on evaluating the aircraft's tactical utility.

  12. Dynamic Simulation of Human Gait Model With Predictive Capability.

    PubMed

    Sun, Jinming; Wu, Shaoli; Voglewede, Philip A

    2018-03-01

    In this paper, it is proposed that the central nervous system (CNS) controls human gait using a predictive control approach in conjunction with classical feedback control, rather than exclusively classical feedback control, which acts on past error. To validate this proposition, a dynamic model of human gait is developed using a novel predictive approach to investigate the principles of the CNS. The model developed includes two parts: a plant model that represents the dynamics of human gait and a controller that represents the CNS. The plant model is a seven-segment, six-joint model that has nine degrees-of-freedom (DOF). The plant model is validated using data collected from able-bodied human subjects. The proposed controller utilizes model predictive control (MPC). MPC uses an internal model to predict the output in advance, compare the predicted output to the reference, and optimize the control input so that the predicted error is minimal. To decrease the complexity of the model, two joints are controlled using a proportional-derivative (PD) controller. The developed predictive human gait model is validated by simulating able-bodied human gait. The simulation results show that the developed model is able to simulate the kinematic output close to experimental data.
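    The receding-horizon loop the abstract outlines (predict ahead with an internal model, minimize predicted error, apply only the first input) can be sketched on a deliberately reduced one-state plant; the paper's plant is a nine-DOF gait model, so the integrator, horizon, and penalty below are illustrative assumptions only:

```python
import numpy as np

dt, N = 0.1, 10                      # time step and prediction horizon

# Internal model x_{k+1} = x_k + dt * u_k gives predicted states
# x_k = x + sum_{j<=k} dt * u_j, i.e. a lower-triangular prediction matrix G.
G = np.tril(np.full((N, N), dt))

def mpc_step(x, ref, lam=1e-2):
    """Minimize sum_k (x_k - ref)^2 + lam * sum_k u_k^2; return first input."""
    e = np.full(N, ref - x)          # predicted error if no input were applied
    u = np.linalg.solve(G.T @ G + lam * np.eye(N), G.T @ e)
    return u[0]                      # receding horizon: apply only u[0]

x = 0.0
for _ in range(100):
    x += dt * mpc_step(x, 1.0)       # track a constant reference of 1
```

    Because the optimizer re-plans over the horizon at every step, the closed loop converges smoothly to the reference; the paper's controller does the same with a far richer internal model and cost.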

  13. Development of the NTF-117S Semi-Span Balance

    NASA Technical Reports Server (NTRS)

    Lynn, Keith C.

    2010-01-01

    A new high-capacity semi-span force and moment balance has recently been developed for use at the National Transonic Facility at the NASA Langley Research Center. This new semi-span balance provides the NTF a new measurement capability that will support testing of semi-span test models at transonic high-lift testing regimes. Future testing utilizing this new balance capability will include active circulation control and propulsion simulation testing of semi-span transonic wing models. The NTF has recently implemented a new high-pressure air delivery station that will provide both high and low mass flow pressure lines that are routed out to the semi-span models via a set of high/low-pressure bellows that are indirectly linked to the metric end of the NTF-117S balance. A new check-load stand is currently being developed to provide the NTF with an in-house capability for performing check-loads on the NTF-117S balance in order to determine the pressure tare effects on the overall performance of the balance. An experimental design is being developed that will allow for experimentally assessing the static pressure tare effects on balance performance.

  14. Design, analysis, and control of a large transport aircraft utilizing selective engine thrust as a backup system for the primary flight control. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Gerren, Donna S.

    1995-01-01

    A study has been conducted to determine the capability to control a very large transport airplane with engine thrust. This study consisted of the design of an 800-passenger airplane with a range of 5000 nautical miles, design and evaluation of a flight control system, and design and piloted simulation evaluation of a thrust-only backup flight control system. Location of the four wing-mounted engines was varied to optimize the propulsive control capability, and the time constant of the engine response was studied. The goal was to provide level 1 flying qualities. The engine location and engine time constant did not have a large effect on the control capability. The airplane design did meet level 1 flying qualities based on frequencies, damping ratios, and time constants in the longitudinal and lateral-directional modes. Project pilots consistently rated the flying qualities as either level 1 or level 2 based on Cooper-Harper ratings. However, because of the limited control forces and moments, the airplane design fell short of meeting the time required to achieve a 30 deg bank and the time required to respond to a control input.

  15. X-ray fast tomography and its applications in dynamical phenomena studies in geosciences at Advanced Photon Source

    NASA Astrophysics Data System (ADS)

    Xiao, Xianghui; Fusseis, Florian; De Carlo, Francesco

    2012-10-01

    State-of-the-art synchrotron radiation based micro-computed tomography provides high spatial and temporal resolution. This matches the needs of many research problems in geosciences. In this letter we report the current capabilities in microtomography at sector 2BM at the Advanced Photon Source (APS) of Argonne National Laboratory. The beamline is well suited to routinely acquire three-dimensional data of excellent quality with sub-micron resolution. Fast cameras in combination with a polychromatic beam allow time-lapse experiments with temporal resolutions of down to 200 ms. Data processing utilizes quantitative phase retrieval to optimize contrast in phase contrast tomographic data. The combination of these capabilities with purpose-designed experimental cells allows for a wide range of dynamic studies on geoscientific topics, two of which are summarized here. In the near future, new experimental cells capable of simulating conditions in most geological reservoirs will be available for general use. Ultimately, these advances will be matched by a new wide-field imaging beamline, which will be constructed as part of the APS upgrade. It is expected that even faster tomography with a larger field of view can be conducted at this beamline, creating new opportunities for geoscientific studies.

  16. Time-dependent Data System (TDDS); an interactive program to assemble, manage, and appraise input data and numerical output of flow/transport simulation models

    USGS Publications Warehouse

    Regan, R.S.; Schaffranek, R.W.; Baltzer, R.A.

    1996-01-01

    A system of functional utilities and computer routines, collectively identified as the Time-Dependent Data System (TDDS), has been developed and documented by the U.S. Geological Survey. The TDDS is designed for processing time sequences of discrete, fixed-interval, time-varying geophysical data--in particular, hydrologic data. Such data include various dependent variables and related parameters typically needed as input for execution of one-, two-, and three-dimensional hydrodynamic/transport and associated water-quality simulation models. Such data can also include time sequences of results generated by numerical simulation models. Specifically, TDDS provides the functional capabilities to process, store, retrieve, and compile data in a Time-Dependent Data Base (TDDB) in response to interactive user commands or pre-programmed directives. Thus, the TDDS, in conjunction with a companion TDDB, provides a ready means for processing, preparation, and assembly of time sequences of data for input to models; collection, categorization, and storage of simulation results from models; and intercomparison of field data and simulation results. The TDDS can be used to edit and verify prototype, time-dependent data to affirm that selected sequences of data are accurate, contiguous, and appropriate for numerical simulation modeling. It can be used to prepare time-varying data in a variety of formats, such as tabular lists, sequential files, arrays, graphical displays, as well as line-printer plots of single or multiparameter data sets. The TDDB is organized and maintained as a direct-access data base by the TDDS, thus providing simple, yet efficient, data management and access. A single, easily used, program interface that provides all access to and from a particular TDDB is available for use directly within models, other user-provided programs, and other data systems. 
This interface, together with each major functional utility of the TDDS, is described and documented in this report.
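    The direct-access property that fixed-interval storage buys can be sketched as follows (the class and method names are invented for illustration; the actual TDDS/TDDB formats are not described in this abstract): because samples are equally spaced, a timestamp maps to an array index arithmetically, with no searching.

```python
from datetime import datetime, timedelta

class FixedIntervalSeries:
    """Direct-access container for a discrete, fixed-interval time series."""

    def __init__(self, start, interval_s, values):
        self.start = start
        self.dt = timedelta(seconds=interval_s)
        self.values = list(values)

    def at(self, when):
        """Random access by timestamp: index = (when - start) / interval."""
        idx, rem = divmod(when - self.start, self.dt)
        if rem != timedelta(0):
            raise KeyError("timestamp is not on the fixed interval")
        return self.values[idx]

    def window(self, t0, t1):
        """Half-open time window [t0, t1) as a list of values."""
        return self.values[(t0 - self.start) // self.dt:(t1 - self.start) // self.dt]

# One day of 15-minute data, values 0..95 standing in for stage or discharge
series = FixedIntervalSeries(datetime(2000, 1, 1), 900, range(96))
```

    This arithmetic lookup is what makes a direct-access data base "simple, yet efficient" for assembling model input sequences and intercomparing them with simulation output.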

  17. COUPLED FREE AND DISSOLVED PHASE TRANSPORT: NEW SIMULATION CAPABILITIES AND PARAMETER INVERSION

    EPA Science Inventory

    The vadose zone free-phase simulation capabilities of the US EPA Hydrocarbon Spill Screening Model (HSSM) (Weaver et al., 1994) have been linked with the 3-D multi-species dissolved-phase contaminant transport simulator MT3DMS (Zheng and Wang, 1999; Zheng, 2005). The linkage pro...

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shahidehpour, Mohammad

    Integrating 20% or more wind energy into the system and transmitting large amounts of wind energy over long distances will require a decision making capability that can handle very large scale power systems with tens of thousands of buses and lines. There is a need to explore innovative analytical and implementation solutions for continuing reliable operations with the most economical integration of additional wind energy in power systems. A number of wind integration solution paths involve the adoption of new operating policies, dynamic scheduling of wind power across interties, pooling integration services, and adopting new transmission scheduling practices. Such practices can be examined by the decision tool developed by this project. This project developed a very efficient decision tool called Wind INtegration Simulator (WINS) and applied WINS to facilitate wind energy integration studies. WINS focused on augmenting the existing power utility capabilities to support collaborative planning, analysis, and wind integration project implementations. WINS also had the capability of simulating energy storage facilities so that feasibility studies of integrated wind energy system applications can be performed for systems with high wind energy penetrations. The development of WINS represents a major expansion of a very efficient decision tool called POwer Market Simulator (POMS), which was developed by IIT and has been used extensively for power system studies for decades. 
Specifically, WINS provides the following advantages: (1) An integrated framework is included in WINS for the comprehensive modeling of DC transmission configurations, including mono-pole, bi-pole, tri-pole, back-to-back, and multi-terminal connections, as well as AC/DC converter models including current source converters (CSC) and voltage source converters (VSC); (2) An existing shortcoming of traditional decision tools for wind integration is the limited availability of user interfaces, i.e., decision results are often text-based demonstrations. WINS includes a powerful visualization tool and user interface capability for transmission analyses, planning, and assessment, which will be of great interest to power market participants, power system planners and operators, and state and federal regulatory entities; and (3) WINS can handle extended transmission models for wind integration studies. WINS models include limitations on transmission flow as well as bus voltage for analyzing power system states. The existing decision tools often consider transmission flow constraints (dc power flow) alone, which could result in the over-utilization of existing resources when analyzing wind integration. WINS can be used to assist power market participants including transmission companies, independent system operators, power system operators in vertically integrated utilities, wind energy developers, and regulatory agencies to analyze the economics, security, and reliability of various options for wind integration, including transmission upgrades and the planning of new transmission facilities. WINS can also be used by industry for the offline training of reliability and operation personnel when analyzing wind integration uncertainties, identifying critical spots in power system operation, analyzing power system vulnerabilities, and providing credible decisions for examining operation and planning options for wind integration.
Research in this project on wind integration included (1) Development of WINS; (2) Transmission Congestion Analysis in the Eastern Interconnection; (3) Analysis of 2030 Large-Scale Wind Energy Integration in the Eastern Interconnection; and (4) Large-Scale Analysis of 2018 Wind Energy Integration in the Eastern U.S. Interconnection. The research resulted in 33 papers, 9 presentations, 9 PhD degrees, 4 MS degrees, and 7 awards. The education activities in this project on wind energy included (1) Wind Energy Training Facility Development and (2) Wind Energy Course Development.

  19. Virtual Collaborative Simulation Environment for Integrated Product and Process Development

    NASA Technical Reports Server (NTRS)

    Gulli, Michael A.

    1997-01-01

    Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and total factories and enterprises in one seamless simulation environment.

  20. Knowledge-based simulation for aerospace systems

    NASA Technical Reports Server (NTRS)

    Will, Ralph W.; Sliwa, Nancy E.; Harrison, F. Wallace, Jr.

    1988-01-01

    Knowledge-based techniques, which offer many features that are desirable in the simulation and development of aerospace vehicle operations, exhibit many similarities to traditional simulation packages. The eventual solution of these systems' current symbolic processing/numeric processing interface problem will lead to continuous and discrete-event simulation capabilities in a single language, such as TS-PROLOG. Qualitative, totally-symbolic simulation methods are noted to possess several intrinsic characteristics that are especially revelatory of the system being simulated, and capable of ensuring that all possible behaviors are considered.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dahlburg, Jill; Corones, James; Batchelor, Donald

    Fusion is potentially an inexhaustible energy source whose exploitation requires a basic understanding of high-temperature plasmas. The development of a science-based predictive capability for fusion-relevant plasmas is a challenge central to fusion energy science, in which numerical modeling has played a vital role for more than four decades. A combination of the very wide range in temporal and spatial scales, extreme anisotropy, the importance of geometric detail, and the requirement of causality, which makes it impossible to parallelize over time, makes this problem one of the most challenging in computational physics. Sophisticated computational models are under development for many individual features of magnetically confined plasmas, and increases in the scope and reliability of feasible simulations have been enabled by increased scientific understanding and improvements in computer technology. However, full predictive modeling of fusion plasmas will require qualitative improvements and innovations to enable cross coupling of a wider variety of physical processes and to allow solution over a larger range of space and time scales. The exponential growth of computer speed, coupled with the high cost of large-scale experimental facilities, makes an integrated fusion simulation initiative a timely and cost-effective opportunity. Worldwide progress in laboratory fusion experiments provides the basis for a recent FESAC recommendation to proceed with a burning plasma experiment (see FESAC Review of Burning Plasma Physics Report, September 2001). Such an experiment, at the frontier of the physics of complex systems, would be a huge step in establishing the potential of magnetic fusion energy to contribute to the world's energy security. An integrated simulation capability would dramatically enhance the utilization of such a facility and lead to optimization of toroidal fusion plasmas in general.
This science-based predictive capability, which was cited in the FESAC integrated planning document (IPPA, 2000), represents a significant opportunity for the DOE Office of Science to further the understanding of fusion plasmas to a level unparalleled worldwide.

  2. Parallel Fokker–Planck-DSMC algorithm for rarefied gas flow simulation in complex domains at all Knudsen numbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Küchlin, Stephan, E-mail: kuechlin@ifd.mavt.ethz.ch; Jenny, Patrick

    2017-01-01

    A major challenge for the conventional Direct Simulation Monte Carlo (DSMC) technique lies in the fact that its computational cost becomes prohibitive in the near continuum regime, where the Knudsen number (Kn)—characterizing the degree of rarefaction—becomes small. In contrast, the Fokker–Planck (FP) based particle Monte Carlo scheme allows for computationally efficient simulations of rarefied gas flows in the low and intermediate Kn regime. The Fokker–Planck collision operator—instead of performing the binary collisions employed by the DSMC method—integrates continuous stochastic processes for the phase space evolution in time. This allows for time step and grid cell sizes larger than the respective collisional scales required by DSMC. Dynamically switching between the FP and the DSMC collision operators in each computational cell is the basis of the combined FP-DSMC method, which has been proven successful in simulating flows covering the whole Kn range. Until recently, this algorithm had only been applied to two-dimensional test cases. In this contribution, we present the first general purpose implementation of the combined FP-DSMC method. Utilizing both shared- and distributed-memory parallelization, this implementation provides the capability for simulations involving many particles and complex geometries by exploiting state-of-the-art computer cluster technologies.
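The per-cell operator selection at the heart of the combined FP-DSMC method can be sketched as follows. This is an illustrative Python fragment, not the authors' implementation; the function names and the switching threshold `kn_switch` are assumptions.

```python
# Hedged sketch: per-cell switching between the Fokker-Planck (FP) and DSMC
# collision treatments based on a cell-local Knudsen-number criterion.
# The threshold value is illustrative, not taken from the paper.

def local_knudsen(mean_free_path, cell_size):
    """Cell-local Knudsen number: ratio of mean free path to cell size."""
    return mean_free_path / cell_size

def choose_collision_operator(mean_free_path, cell_size, kn_switch=1.0):
    """Return which collision operator to apply in this cell.

    Above the threshold the gas is rarefied enough that binary collisions
    are affordable and accurate (DSMC); below it, the continuous stochastic
    FP operator avoids resolving individual collisions.
    """
    kn = local_knudsen(mean_free_path, cell_size)
    return "DSMC" if kn >= kn_switch else "FP"
```

In a full solver this decision would be re-evaluated every time step for every cell, which is what lets the combined method cover the whole Kn range in one simulation.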

  3. Hybrid Kalman Filter: A New Approach for Aircraft Engine In-Flight Diagnostics

    NASA Technical Reports Server (NTRS)

    Kobayashi, Takahisa; Simon, Donald L.

    2006-01-01

    In this paper, a uniquely structured Kalman filter is developed for its application to in-flight diagnostics of aircraft gas turbine engines. The Kalman filter is a hybrid of a nonlinear on-board engine model (OBEM) and piecewise linear models. The utilization of the nonlinear OBEM allows the reference health baseline of the in-flight diagnostic system to be updated to the degraded health condition of the engines through a relatively simple process. Through this health baseline update, the effectiveness of the in-flight diagnostic algorithm can be maintained as the health of the engine degrades over time. Another significant aspect of the hybrid Kalman filter methodology is its capability to take advantage of conventional linear and nonlinear Kalman filter approaches. Based on the hybrid Kalman filter, an in-flight fault detection system is developed, and its diagnostic capability is evaluated in a simulation environment. Through the evaluation, the suitability of the hybrid Kalman filter technique for aircraft engine in-flight diagnostics is demonstrated.
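The hybrid filter builds on the standard Kalman predict/update cycle. A minimal scalar sketch of that cycle, with illustrative noise values (not the OBEM-based engine models of the paper), is:

```python
# Minimal scalar Kalman filter predict/update step -- a generic sketch of
# the machinery the hybrid approach builds on, not NASA's implementation.
# All model and noise values are illustrative assumptions.

def kalman_step(x, P, z, A=1.0, H=1.0, Q=1e-4, R=1e-2):
    """One predict+update step for a scalar state.

    x, P : prior state estimate and its variance
    z    : new measurement
    A, H : state-transition and measurement models
    Q, R : process and measurement noise variances
    """
    # Predict: propagate the state and inflate uncertainty by process noise
    x_pred = A * x
    P_pred = A * P * A + Q
    # Update: blend prediction and measurement via the Kalman gain
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new
```

In the hybrid scheme described above, the prediction would come from the nonlinear on-board engine model while the gain is derived from piecewise linear models; this sketch only shows the common filter skeleton.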

  4. A novel HTS SMES application in combination with a permanent magnet synchronous generator type wind power generation system

    NASA Astrophysics Data System (ADS)

    Kim, G. H.; Kim, A. R.; Kim, S.; Park, M.; Yu, I. K.; Seong, K. C.; Won, Y. J.

    2011-11-01

    A superconducting magnetic energy storage (SMES) system is a DC-current-driven device that can be utilized to improve power quality, particularly in connection with renewable energy sources, due to its higher efficiency and faster response than other devices. This paper suggests a novel connection topology for SMES which can smooth the output power flow of a wind power generation system (WPGS). The structure of the proposed system is cost-effective because it requires one fewer power converter than a conventional SMES application. A further advantage of the SMES in the proposed system is that it improves the low voltage ride through (LVRT) capability of the permanent magnet synchronous generator (PMSG) type WPGS. The proposed system including the SMES has been modeled and analyzed in PSCAD/EMTDC. The simulation results show the effectiveness of the novel SMES application strategy in not only smoothing the output power of the PMSG but also improving the LVRT capability of the PMSG-type WPGS.
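Conceptually, the SMES absorbs or supplies the difference between the fluctuating wind power and a smoothed grid target. A minimal Python sketch of this smoothing role, with assumed parameter values and no claim to match the PSCAD/EMTDC model, is:

```python
# Hedged sketch of the smoothing role the storage plays: the SMES covers the
# gap between raw wind power and a low-pass-filtered target, subject to an
# energy limit. Conceptual illustration only; alpha and e_max are assumed.

def smooth_with_storage(p_wind, alpha=0.2, e_max=50.0):
    """Exponentially smooth a wind-power series; storage covers the gap.

    Returns (grid power, storage power, stored energy) per time step.
    Positive storage power means the SMES is discharging into the grid.
    """
    target, energy = p_wind[0], e_max / 2.0
    out = []
    for p in p_wind:
        target = alpha * p + (1.0 - alpha) * target    # smoothed grid output
        p_smes = target - p                            # gap covered by SMES
        energy = min(max(energy - p_smes, 0.0), e_max) # clamp stored energy
        out.append((target, p_smes, energy))
    return out
```

The grid-side series produced this way has visibly lower variance than the raw wind input, which is the qualitative effect the paper's topology is designed to achieve.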

  5. Optical performance analysis of plenoptic camera systems

    NASA Astrophysics Data System (ADS)

    Langguth, Christin; Oberdörster, Alexander; Brückner, Andreas; Wippermann, Frank; Bräuer, Andreas

    2014-09-01

    Adding an array of microlenses in front of the sensor transforms the capabilities of a conventional camera to capture both spatial and angular information within a single shot. This plenoptic camera is capable of obtaining depth information and providing it for a multitude of applications, e.g. artificial re-focusing of photographs. Without the need for active illumination it represents a compact and fast optical 3D acquisition technique with reduced effort in system alignment. Since the extent of the aperture limits the range of detected angles, the observed parallax is reduced compared to common stereo imaging systems, which results in a decreased depth resolution. In addition, the gain in angular information implies a degraded spatial resolution. This trade-off requires a careful choice of the optical system parameters. We present a comprehensive assessment of possible degrees of freedom in the design of plenoptic systems. Utilizing a custom-built simulation tool, the optical performance is quantified with respect to particular starting conditions. Furthermore, a plenoptic camera prototype is demonstrated in order to verify the predicted optical characteristics.

  6. A Software Upgrade of the NASA Aeroheating Code "MINIVER"

    NASA Technical Reports Server (NTRS)

    Louderback, Pierce Mathew

    2013-01-01

    Computational Fluid Dynamics (CFD) is a powerful and versatile tool simulating fluid and thermal environments of launch and re-entry vehicles alike. Where it excels in power and accuracy, however, it lacks in speed. An alternative tool for this purpose is known as MINIVER, an aeroheating code widely used by NASA and within the aerospace industry. Capable of providing swift, reasonably accurate approximations of the fluid and thermal environment of launch vehicles, MINIVER is used where time is of the essence and accuracy need not be exact. However, MINIVER is an old, aging tool: running on a user-unfriendly, legacy command-line interface, it is difficult for it to keep pace with more modern software tools. Florida Institute of Technology was tasked with the construction of a new Graphical User Interface (GUI) that implemented the legacy version's capabilities and enhanced them with new tools and utilities. This thesis provides background to the legacy version of the program, the progression and final version of a modern user interface, and benchmarks to demonstrate its usefulness.

  7. Fast Segmentation From Blurred Data in 3D Fluorescence Microscopy.

    PubMed

    Storath, Martin; Rickert, Dennis; Unser, Michael; Weinmann, Andreas

    2017-10-01

    We develop a fast algorithm for segmenting 3D images from linear measurements based on the Potts model (or piecewise constant Mumford-Shah model). To that end, we first derive suitable space discretizations of the 3D Potts model, which are capable of dealing with 3D images defined on non-cubic grids. Our discretization allows us to utilize a specific splitting approach, which results in decoupled subproblems of moderate size. The crucial point in the 3D setup is that the number of independent subproblems is so large that we can reasonably exploit the parallel processing capabilities of the graphics processing units (GPUs). Our GPU implementation is up to 18 times faster than the sequential CPU version. This allows to process even large volumes in acceptable runtimes. As a further contribution, we extend the algorithm in order to deal with non-negativity constraints. We demonstrate the efficiency of our method for combined image deconvolution and segmentation on simulated data and on real 3D wide field fluorescence microscopy data.
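The decoupled subproblems produced by the splitting are, in essence, low-dimensional Potts problems that can be solved exactly. As a hedged illustration (a 1D analogue, not the authors' GPU code), the univariate Potts problem admits an exact dynamic-programming solution:

```python
# Hedged sketch: the 1D Potts (piecewise-constant Mumford-Shah) model,
# minimizing sum_i (x_i - y_i)^2 + gamma * (number of jumps in x), solved
# exactly by dynamic programming. Illustrative only -- the paper works with
# 3D data and a GPU-parallel splitting, not this code.

def potts_1d(y, gamma):
    """Exact minimizer of sum_i (x_i - y_i)^2 + gamma * (#jumps in x)."""
    n = len(y)
    # Prefix sums give O(1) cost of fitting a constant to y[l:r].
    s = [0.0] * (n + 1)
    q = [0.0] * (n + 1)
    for i, v in enumerate(y):
        s[i + 1] = s[i] + v
        q[i + 1] = q[i] + v * v

    def seg_cost(l, r):  # squared error of the best constant on y[l:r]
        return q[r] - q[l] - (s[r] - s[l]) ** 2 / (r - l)

    best = [0.0] * (n + 1)   # best[r]: optimal energy for the prefix y[:r]
    cut = [0] * (n + 1)      # start index of the last segment (backtracking)
    for r in range(1, n + 1):
        best[r], cut[r] = min(
            (best[l] + (gamma if l > 0 else 0.0) + seg_cost(l, r), l)
            for l in range(r))
    # Backtrack to recover the piecewise-constant solution.
    x, r = [0.0] * n, n
    while r > 0:
        l = cut[r]
        mean = (s[r] - s[l]) / (r - l)
        for i in range(l, r):
            x[i] = mean
        r = l
    return x
```

Each of the "decoupled subproblems of moderate size" mentioned above is of this flavor, which is why solving very many of them in parallel maps so well onto a GPU.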

  8. Electro-Thermal-Mechanical Simulation Capability Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, D

    This is the Final Report for LDRD 04-ERD-086, 'Electro-Thermal-Mechanical Simulation Capability'. The accomplishments are well documented in five peer-reviewed publications and six conference presentations and hence will not be detailed here. The purpose of this LDRD was to research and develop numerical algorithms for three-dimensional (3D) Electro-Thermal-Mechanical simulations. LLNL has long been a world leader in the area of computational mechanics, and recently several mechanics codes have become 'multiphysics' codes with the addition of fluid dynamics, heat transfer, and chemistry. However, these multiphysics codes do not incorporate the electromagnetics that is required for a coupled Electro-Thermal-Mechanical (ETM) simulation. There are numerous applications for an ETM simulation capability, such as explosively-driven magnetic flux compressors, electromagnetic launchers, inductive heating and mixing of metals, and MEMS. A robust ETM simulation capability will enable LLNL physicists and engineers to better support current DOE programs, and will prepare LLNL for some very exciting long-term DoD opportunities. We define a coupled Electro-Thermal-Mechanical (ETM) simulation as a simulation that solves, in a self-consistent manner, the equations of electromagnetics (primarily statics and diffusion), heat transfer (primarily conduction), and non-linear mechanics (elastic-plastic deformation, and contact with friction). There is no existing parallel 3D code for simulating ETM systems at LLNL or elsewhere. While there are numerous magnetohydrodynamic codes, these codes are designed for astrophysics, magnetic fusion energy, laser-plasma interaction, etc., and do not attempt to accurately model electromagnetically driven solid mechanics.
This project responds to the Engineering R&D Focus Areas of Simulation and Energy Manipulation, and addresses the specific problem of Electro-Thermal-Mechanical simulation for design and analysis of energy manipulation systems such as magnetic flux compression generators and railguns. This project complements ongoing DNT projects that have an experimental emphasis. Our research efforts have been encapsulated in the Diablo and ALE3D simulation codes. This new ETM capability already has both internal and external users, and has spawned additional research in plasma railgun technology. By developing this capability Engineering has become a world leader in ETM design, analysis, and simulation. This research has positioned LLNL to be able to compete for new business opportunities with the DoD in the area of railgun design. We currently have a three-year $1.5M project with the Office of Naval Research to apply our ETM simulation capability to railgun bore life issues, and we expect to be a key player in the railgun community.

  9. Simulated real-time lunar volatiles prospecting with a rover-borne neutron spectrometer

    NASA Astrophysics Data System (ADS)

    Elphic, Richard C.; Heldmann, Jennifer L.; Marinova, Margarita M.; Colaprete, Anthony; Fritzler, Erin L.; McMurray, Robert E.; Morse, Stephanie; Roush, Ted L.; Stoker, Carol R.; Deans, Matthew C.; Smith, Trey F.

    2015-05-01

    In situ resource utilization (ISRU) may one day enable long duration lunar missions. But the efficacy of such an approach greatly depends on (1) physical and chemical makeup of the resource, and (2) the logistical cost of exploiting the resource. Establishing these key strategic factors requires prospecting: the capability of locating and characterizing potential resources. There is already considerable evidence from orbital and impact missions that the lunar poles harbor plausibly rich reservoirs of volatiles. The next step is to land on the Moon and assess the nature, “ore-grade”, and extractability of water ice and other materials. In support of this next step, a mission simulation was carried out on the island of Hawai'i in July of 2012. A robotic rover, provided by the Canadian Space Agency, carried several NASA ISRU-supporting instruments in a field test to address how such a mission might be carried out. This exercise was meant to test the ability to (a) locate and characterize volatiles, (b) acquire subsurface samples in a volatile-rich location, and (c) analyze the form and composition of the volatiles to determine their utility. This paper describes the successful demonstration of neutron spectroscopy as a prospecting and decision support system to locate and evaluate potential ISRU targets in the field exercise.

  10. High definition TV projection via single crystal faceplate technology

    NASA Astrophysics Data System (ADS)

    Kindl, H. J.; St. John, Thomas

    1993-03-01

    Single crystal phosphor faceplates are epitaxial phosphors grown on crystalline substrates with the advantages of high light output, resolution, and extended operational life. Single crystal phosphor faceplate industrial technology in the United States is capable of providing a faceplate appropriate to the projection industry of up to four (4) inches in diameter. Projection systems incorporating cathode ray tubes utilizing single crystal phosphor faceplates will produce 1500 lumens of white light with 1000 lines of resolution, non-interlaced. This 1500 lumen projection system will meet all of the currently specified luminance and resolution requirements of visual display systems for flight simulators. Significant logistic advantages accrue from the introduction of single crystal phosphor faceplate CRTs. Specifically, the full performance life of a CRT is expected to increase by a factor of five (5), i.e., from 2000 to 10,000 hours of operation. There will be attendant reductions in maintenance time, spare CRT requirements, system down time, etc. The increased brightness of the projection system will allow use of lower gain, lower cost simulator screen material. Further, picture performance characteristics will be more balanced across the full simulator.

  11. Simulating Freak Waves in the Ocean with CFD Modeling

    NASA Astrophysics Data System (ADS)

    Manolidis, M.; Orzech, M.; Simeonov, J.

    2017-12-01

    Rogue, or freak, waves constitute an active topic of research within the world scientific community, as various maritime authorities around the globe seek to better understand and more accurately assess the risks that the occurrence of such phenomena entail. Several experimental studies have shed some light on the mechanics of rogue wave formation. In our work we numerically simulate the formation of such waves in oceanic conditions by means of Computational Fluid Dynamics (CFD) software. For this purpose we implement the NHWAVE and OpenFOAM software packages. Both are non-hydrostatic, turbulent flow solvers, but NHWAVE implements a shock-capturing scheme at the free surface-interface, while OpenFOAM utilizes the Volume Of Fluid (VOF) method. NHWAVE has been shown to accurately reproduce highly nonlinear surface wave phenomena, such as soliton propagation and wave shoaling. We conducted a range of tests simulating rogue wave formation and horizontally varying currents to evaluate and compare the capabilities of the two software packages. Then we used each model to investigate the effect of ocean currents and current gradients on the formation of rogue waves. We present preliminary results.

  12. Classification of electronically generated phantom targets by an Atlantic bottlenose dolphin (Tursiops truncatus).

    PubMed

    Aubauer, R; Au, W W; Nachtigall, P E; Pawloski, D A; DeLong, C M

    2000-05-01

    Animal behavior experiments require not only stimulus control of the animal's behavior, but also precise control of the stimulus itself. In discrimination experiments with real target presentation, the complex interdependence between the physical dimensions and the backscattering process of an object make it difficult to extract and control relevant echo parameters separately. In other phantom-echo experiments, the echoes were relatively simple and could only simulate certain properties of targets. The echo-simulation method utilized in this paper can be used to transform any animal echolocation sound into phantom echoes of high fidelity and complexity. The developed phantom-echo system is implemented on a digital signal-processing board and gives an experimenter fully programmable control over the echo-generating process and the echo structure itself. In this experiment, the capability of a dolphin to discriminate between acoustically simulated phantom replicas of targets and their real equivalents was tested. Phantom replicas were presented in a probe technique during a materials discrimination experiment. The animal accepted the phantom echoes and classified them in the same manner as it classified real targets.
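The core of such an echo-simulation system is to convolve the animal's outgoing click with a synthetic target impulse response. A minimal Python sketch, assuming a sparse set of target "highlights" (delay/reflectivity pairs) rather than the study's DSP-board implementation:

```python
# Hedged sketch of a phantom-echo generator: the outgoing echolocation click
# is convolved with a sparse target impulse response made of reflecting
# highlights at different delays and strengths. Purely illustrative; not the
# digital-signal-processing-board system described in the study.

def phantom_echo(click, highlights, length):
    """Convolve a click with a sparse target impulse response.

    click      : samples of the animal's outgoing signal
    highlights : list of (delay_samples, reflectivity) pairs
    length     : number of output samples
    """
    echo = [0.0] * length
    for delay, gain in highlights:
        for i, s in enumerate(click):
            if delay + i < length:
                echo[delay + i] += gain * s  # delayed, scaled copy of click
    return echo
```

Because every parameter of the impulse response is under software control, an experimenter can vary one echo property at a time, which is exactly the kind of stimulus control the abstract argues is hard to achieve with physical targets.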

  13. Evaluation of NCMRWF unified model vertical cloud structure with CloudSat over the Indian summer monsoon region

    NASA Astrophysics Data System (ADS)

    Jayakumar, A.; Mamgain, Ashu; Jisesh, A. S.; Mohandas, Saji; Rakhi, R.; Rajagopal, E. N.

    2016-05-01

    Representation of the rainfall distribution and monsoon circulation in the high resolution versions of the NCMRWF Unified Model (NCUM-REG) for short-range forecasting of extreme rainfall events depends strongly on key factors such as the vertical cloud distribution, convection, and the convection/cloud relationship in the model. Hence it is highly relevant to evaluate the vertical structure of cloud and precipitation in the model over the monsoon environment. In this regard, we utilized the capabilities of CloudSat data over a long observational period by conditioning it on the synoptic situation of the model simulation period. Simulations were run at 4-km grid length with the convective parameterization effectively switched off and on. Since the sample of CloudSat overpasses through the monsoon domain is small, the aforementioned methodology can qualitatively evaluate the vertical cloud structure for the model simulation period. It is envisaged that the present study will open up the possibility of further improvement in the high resolution version of NCUM in the tropics for Indian summer monsoon rainfall events.

  14. Marshall Space Flight Center's Impact Testing Facility Capabilities

    NASA Technical Reports Server (NTRS)

    Finchum, Andy; Hubbs, Whitney; Evans, Steve

    2008-01-01

    Marshall Space Flight Center's (MSFC) Impact Testing Facility (ITF) serves as an important installation for space- and missile-related materials science research. The ITF was established and began its research in spacecraft debris shielding in the early 1960s, then played a major role in the International Space Station debris shield development. As NASA became more interested in launch debris and in-flight impact concerns, the ITF grew to include research in a variety of impact genres. Collaborative partnerships with the DoD led to a wider range of impact capabilities being relocated to MSFC as a result of the closure of Particle Impact Facilities in Santa Barbara, California. The Particle Impact Facility had a 30-year history in providing evaluations of aerospace materials and components during flights through rain, ice, and solid particle environments at subsonic through hypersonic velocities. The facility's unique capabilities were deemed a "National Asset" by the DoD. The ITF now has capabilities including environmental, ballistic, and hypervelocity impact testing utilizing an array of air, powder, and two-stage light gas guns to accommodate a variety of projectile and target types and sizes. Numerous upgrades, including new instrumentation, triggering circuitry, high speed photography, and optimized sabot designs, have been implemented. Other recent research has included rain drop demise characterization tests to obtain data for inclusion in on-going model development. The current and proposed ITF capabilities range from rain to micrometeoroids, allowing the widest test parameter range possible for materials investigations in support of space, atmospheric, and ground environments. These test capabilities, including hydrometeor, single/multi-particle, ballistic gas guns, exploding wire gun, and light gas guns, combined with Smooth Particle Hydrodynamics Code (SPHC) simulations, represent the widest range of impact test capabilities in the country.

  15. Cpp Utility - Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Oppel, Fred, III; Rigdon, J. Brian

    2014-09-08

    A collection of general Umbra modules that are reused by other Umbra libraries. These capabilities include line segments, file utilities, color utilities, string utilities (for std::string), list utilities (for std::vector), bounding box intersections, range limiters, simple filters, cubic root solvers, and a few other utilities.
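One of the listed capabilities, the bounding-box intersection test, reduces to a per-axis overlap check. The library itself is C++; the following Python sketch just illustrates the logic:

```python
# Hedged sketch of an axis-aligned bounding-box (AABB) intersection test.
# The Cpp Utility modules are C++; this Python fragment only illustrates
# the underlying per-axis check, not the library's actual API.

def boxes_intersect(min_a, max_a, min_b, max_b):
    """True if two axis-aligned boxes overlap (touching faces count).

    Each argument is a corner tuple such as (x, y, z); two boxes overlap
    if and only if their extents overlap on every axis independently.
    """
    return all(lo_a <= hi_b and lo_b <= hi_a
               for lo_a, hi_a, lo_b, hi_b in zip(min_a, max_a, min_b, max_b))
```

Writing the test per axis like this works unchanged in 2D or 3D and short-circuits on the first separating axis.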

  16. UAS-Systems Integration, Validation, and Diagnostics Simulation Capability

    NASA Technical Reports Server (NTRS)

    Buttrill, Catherine W.; Verstynen, Harry A.

    2014-01-01

    As part of the Phase 1 efforts of NASA's UAS-in-the-NAS Project a task was initiated to explore the merits of developing a system simulation capability for UAS to address airworthiness certification requirements. The core of the capability would be a software representation of an unmanned vehicle, including all of the relevant avionics and flight control system components. The specific system elements could be replaced with hardware representations to provide Hardware-in-the-Loop (HWITL) test and evaluation capability. The UAS Systems Integration and Validation Laboratory (UAS-SIVL) was created to provide a UAS-systems integration, validation, and diagnostics hardware-in-the-loop simulation capability. This paper discusses how SIVL provides a robust and flexible simulation framework that permits the study of failure modes, effects, propagation paths, criticality, and mitigation strategies to help develop safety, reliability, and design data that can assist with the development of certification standards, means of compliance, and design best practices for civil UAS.

  17. Investigation of Various Novel Air-Breathing Propulsion Systems

    NASA Astrophysics Data System (ADS)

    Wilhite, Jarred M.

    The current research investigates the operation and performance of various air-breathing propulsion systems that are capable of utilizing different types of fuel. This study first focuses on a modular RDE configuration, which was mainly studied to determine which conditions yield stable, continuous rotating detonation for an ethylene-air mixture. The performance of this RDE was analyzed by studying various parameters such as mass flow rate, equivalence ratio, wave speed, and cell size. For relatively low mass flow rates near stoichiometric conditions, a rotating detonation wave is observed for the ethylene RDE, but at speeds less than that of an ideal detonation wave. The current research also involves investigating the newly designed Twin Oxidizer Injection Capable (TOXIC) RDE. Mixtures of hydrogen and air were utilized for this configuration, resulting in sustained rotating detonation for various mass flow rates and equivalence ratios. A thrust stand was also developed to observe and further measure the performance of the TOXIC RDE. Further analysis was conducted to accurately model and simulate the response of the thrust stand during operation of the RDE. Also included in this research are findings and analysis of a propulsion system capable of operating on the Inverse Brayton Cycle. The feasibility of this novel concept was validated in a previous study for small-scale propulsion systems, namely UAV applications. This type of propulsion system consists of a reorganization of traditional gas turbine engine components, which incorporates expansion before compression. This cycle also requires a heat exchanger to reduce the temperature of the flow entering the compressor downstream. While adding a heat exchanger improves the efficiency of the cycle, it also increases the engine weight, resulting in less endurance for the aircraft. 
Therefore, this study focuses on the selection and development of a new heat exchanger design that is lightweight, and is capable of transferring significant amounts of heat and improving the efficiency and performance of the propulsion system.

  18. Mobile Smog Simulator: New Capabilities to Study Urban Mixtures

    EPA Pesticide Factsheets

    A smog simulator developed by EPA scientists and engineers has unique capabilities that will provide information for assessing the health impacts of relevant multipollutant atmospheres and identify contributions of specific sources.

  19. Development and evaluation of the Screening Trajectory Ozone Prediction System (STOPS, version 1.0)

    NASA Astrophysics Data System (ADS)

    Czader, B. H.; Percell, P.; Byun, D.; Kim, S.; Choi, Y.

    2015-05-01

    A hybrid Lagrangian-Eulerian modeling tool has been developed using the Eulerian framework of the Community Multiscale Air Quality (CMAQ) model. It is a moving nest that utilizes saved original CMAQ simulation results to provide the boundary conditions, initial conditions, emissions, and meteorological parameters necessary for a simulation. Given that these files are available, this tool can run independently of the CMAQ whole-domain simulation, and it is designed to simulate source-receptor relationships upon changes in emissions. In this tool, the original CMAQ horizontal domain is reduced to a small sub-domain that follows a trajectory defined by the mean mixed-layer wind. It has the same vertical structure and physical and chemical interactions as CMAQ, except for the advection calculation. The advantage of this tool compared to other Lagrangian models is its capability of utilizing realistic boundary conditions that change with space and time, as well as its detailed chemistry treatment. The correctness of the algorithms and the overall performance were evaluated against CMAQ simulation results. Its performance depends on the atmospheric conditions occurring during the simulation period, with the comparisons being most similar to CMAQ results under uniform wind conditions. The mean bias for surface ozone mixing ratios varies between -0.03 and -0.78 ppbV and the slope is between 0.99 and 1.01 for the different analyzed cases. For complicated meteorological conditions, such as wind circulation, the simulated mixing ratios deviate from CMAQ values as a result of the Lagrangian approach of using the mean wind for its movement, but are still close, with the mean bias for ozone varying between 0.07 and -4.29 ppbV and the slope varying between 0.95 and 1.06 for the different analyzed cases. 
For historical reasons, this hybrid Lagrangian-Eulerian tool is named the Screening Trajectory Ozone Prediction System (STOPS), but its use is not limited to ozone prediction: like CMAQ, it can simulate concentrations of many species, including particulate matter and some toxic compounds, such as formaldehyde and 1,3-butadiene.
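The moving-nest idea described above can be sketched in a few lines. This is an illustrative toy (function and variable names are hypothetical, not from the STOPS source): the sub-domain center is advected with the mean mixed-layer wind by simple Euler integration.

```python
# Toy sketch of a moving nest: the sub-domain center follows a
# trajectory defined by the mean mixed-layer wind (Euler steps).

def advect_nest(center, mean_wind, dt, n_steps):
    """center: (x, y) in km; mean_wind(k) -> (u, v) in km/h at step k;
    dt: step length in hours. Returns the nest-center track."""
    x, y = center
    track = [(x, y)]
    for k in range(n_steps):
        u, v = mean_wind(k)          # sampled wind at this step
        x, y = x + u * dt, y + v * dt
        track.append((x, y))
    return track

# A constant 10 km/h westerly wind carries the nest 10 km east in 1 h.
track = advect_nest((0.0, 0.0), lambda k: (10.0, 0.0), dt=0.5, n_steps=2)
```

In the real tool the nest would also interpolate saved CMAQ fields at each new center position; here only the trajectory bookkeeping is shown.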

  20. An Evaluation of Training Interventions and Computed Scoring Techniques for Grading a Level Turn Task and a Straight In Landing Approach on a PC-Based Flight Simulator

    NASA Technical Reports Server (NTRS)

    Heath, Bruce E.

    2007-01-01

One result of the relatively recent advances in computing technology has been the decreasing cost of computers and increasing computational power. This has allowed high-fidelity airplane simulations to be run on personal computers (PC). Simulators are now used routinely by pilots to substitute simulated flight hours for real flight hours when training for an aircraft type rating, thereby reducing the cost of flight training. However, FAA regulations require that such substitution training be supervised by Certified Flight Instructors (CFI). If the CFI presence could be reduced or eliminated for certain tasks, this would mean a further cost savings to the pilot. This would require that the flight simulator have a certain level of 'intelligence' in order to provide feedback on pilot performance similar to that of a CFI. The 'intelligent' flight simulator would have at least the capability to use data gathered from the flight to create a measure of the student pilot's performance. Also, to fully utilize the advances in computational power, the simulator would be capable of interacting with the student pilot using the best possible training interventions. This thesis reports on two studies conducted at Tuskegee University investigating the effects of interventions on the learning of two flight maneuvers on a flight simulator, and the robustness and accuracy of calculated performance indices as compared to CFI evaluations of performance. The intent of these studies is to take a step in the direction of creating an 'intelligent' flight simulator. The first study compares the performance of novice pilots trained at different above-real-time simulation speeds to execute a level S-turn. The second study examined the effect of out-of-the-window (OTW) visual cues, in the form of hoops, on the performance of novice pilots learning to fly a landing approach on the flight simulator. 
The reliability/robustness of the computed performance metrics was assessed by comparing them with the evaluations of the landing approach maneuver by a number of CFIs.

  1. Sensor-scheduling simulation of disparate sensors for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Hobson, T.; Clarkson, I.

    2011-09-01

The art and science of space situational awareness (SSA) has been practised and developed since the time of Sputnik. However, recent developments, such as the accelerating pace of satellite launches, the proliferation of launch-capable agencies, both commercial and sovereign, and recent well-publicised collisions involving man-made space objects, have further magnified the importance of timely and accurate SSA. The United States Strategic Command (USSTRATCOM) operates the Space Surveillance Network (SSN), a global network of sensors tasked with maintaining SSA. The rapidly increasing number of resident space objects will require commensurate improvements in the SSN. Sensors are scarce resources that must be scheduled judiciously to obtain measurements of maximum utility. Improvements in sensor scheduling and fusion can serve to reduce the number of additional sensors that may be required. Recently, Hill et al. [1] have proposed and developed a simulation environment named TASMAN (Tasking Autonomous Sensors in a Multiple Application Network) to enable testing of alternative scheduling strategies within a simulated multi-sensor, multi-target environment. TASMAN simulates a high-fidelity, hardware-in-the-loop system by running multiple machines with different roles in parallel. At present, TASMAN is limited to simulations involving electro-optic sensors. Its high fidelity is at once a feature and a limitation, since supercomputing is required to run simulations of appreciable scale. In this paper, we describe an alternative, modular and scalable SSA simulation system that can extend the work of Hill et al. with reduced complexity, albeit also with reduced fidelity. The tool has been developed in MATLAB and can therefore be run on a very wide range of computing platforms. It can also make use of MATLAB's parallel processing capabilities to obtain considerable speed-up. 
The speed and flexibility so obtained can be used to quickly test scheduling algorithms even with a relatively large number of space objects. We further describe an application of the tool by exploring how the relative mixture of electro-optical and radar sensors can impact the scheduling, fusion and achievable accuracy of an SSA system. By varying the mixture of sensor types, we are able to characterise the main advantages and disadvantages of each configuration.
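The core scheduling problem sketched above, assigning scarce sensors to the observations of maximum utility, can be illustrated with a minimal greedy tasking step. This is a hedged sketch with made-up names, not TASMAN's or the paper's actual algorithm; utility is scored here simply by how stale each object's last observation is.

```python
# Toy greedy sensor-tasking step: each sensor takes the visible,
# still-unclaimed object whose measurement is most valuable (here,
# the one observed longest ago).

def greedy_schedule(sensors, visible, age):
    """visible[s]: set of object ids sensor s can currently see;
    age[o]: time since object o was last observed.
    Returns a {sensor: object} tasking, greedily maximizing staleness."""
    tasking, taken = {}, set()
    for s in sensors:
        candidates = [o for o in visible[s] if o not in taken]
        if candidates:
            best = max(candidates, key=lambda o: age[o])
            tasking[s] = best
            taken.add(best)
    return tasking

visible = {"radar": {1, 2}, "optic": {2, 3}}
age = {1: 5.0, 2: 9.0, 3: 1.0}
tasking = greedy_schedule(["radar", "optic"], visible, age)
# The radar claims the stalest object (2); the optic falls back to 3.
```

A real scheduler would fold in visibility windows, slew times, and covariance-based information gain, but the claim-and-assign structure is the same.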

  2. Ensuring US National Aeronautics Test Capabilities

    NASA Technical Reports Server (NTRS)

    Marshall, Timothy J.

    2010-01-01

U.S. leadership in aeronautics depends on ready access to technologically advanced, efficient, and affordable aeronautics test capabilities. These systems include major wind tunnels, propulsion test facilities, and flight test capabilities. The federal government owns the majority of the major aeronautics test capabilities in the United States, primarily through the National Aeronautics and Space Administration (NASA) and the Department of Defense (DoD). However, changes in the aerospace landscape, primarily the decrease in demand for testing over the last 20 years, required an overarching strategy for management of these national assets. Therefore, NASA established the Aeronautics Test Program (ATP) as a two-pronged strategic initiative to: (1) retain and invest in NASA aeronautics test capabilities considered strategically important to the agency and the nation, and (2) establish a strong, high-level partnership with the DoD. Test facility utilization is a critical factor for ATP because it relies on user occupancy fees to recover a substantial part of the operations costs for its facilities. Decreasing utilization is an indicator of excess capacity and, in some cases, low-risk redundancy (i.e., several facilities with basically the same capability and overall low utilization). However, low utilization does not necessarily translate to lack of strategic importance. Some facilities with relatively low utilization are nonetheless vitally important because of the unique nature of the capability and the foreseeable aeronautics testing needs. Unfortunately, since its inception, the customer base for ATP has continued to shrink. Utilization of ATP wind tunnels has declined by more than 50% from FY 2006 levels. 
This significant decrease in customer usage is attributable to several factors, including the overall decline in new programs and projects in the aerospace sector; the impact of computational fluid dynamics (CFD) on the design, development, and research process; and the reductions in wind tunnel testing requirements within the largest consumer of ATP wind tunnel test time, the Aeronautics Research Mission Directorate (ARMD). Retirement of the Space Shuttle Program and recent perturbations of NASA's Constellation Program will exacerbate this downward trend. Therefore it is crucial that ATP periodically revisit and determine which of its test capabilities are strategically important, which qualify as low-risk redundancies that could be put in an inactive status or closed, and address the challenges associated with both sustainment and improvements to the test capabilities that must remain active. This presentation will provide an overview of the ATP vision, mission, and goals as well as the challenges and opportunities the program is facing both today and in the future. We will discuss the strategy ATP is taking over the next five years to address the National aeronautics test capability challenges and what the program will do to capitalize on its opportunities to ensure a ready, robust and relevant portfolio of National aeronautics test capabilities.

  3. Development of a large-scale, outdoor, ground-based test capability for evaluating the effect of rain on airfoil lift

    NASA Technical Reports Server (NTRS)

    Bezos, Gaudy M.; Campbell, Bryan A.

    1993-01-01

    A large-scale, outdoor, ground-based test capability for acquiring aerodynamic data in a simulated rain environment was developed at the Langley Aircraft Landing Dynamics Facility (ALDF) to assess the effect of heavy rain on airfoil performance. The ALDF test carriage was modified to transport a 10-ft-chord NACA 64210 wing section along a 3000-ft track at full-scale aircraft approach speeds. An overhead rain simulation system was constructed along a 525-ft section of the track with the capability of producing simulated rain fields of 2, 10, 30, and 40 in/hr. The facility modifications, the aerodynamic testing and rain simulation capability, the design and calibration of the rain simulation system, and the operational procedures developed to minimize the effect of wind on the simulated rain field and aerodynamic data are described in detail. The data acquisition and reduction processes are also presented along with sample force data illustrating the environmental effects on data accuracy and repeatability for the 'rain-off' test condition.

  4. RFI and SCRIMP Model Development and Verification

    NASA Technical Reports Server (NTRS)

    Loos, Alfred C.; Sayre, Jay

    2000-01-01

Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies for manufacturing primary composite structures in the aircraft industry as well as in infrastructure. A great deal of work still needs to be done to reduce the costly trial-and-error methods of VARTM processing that are currently in practice. A computer simulation model of the VARTM process would provide a cost-effective tool for manufacturing composites by this technique. Therefore, the objective of this research was to modify an existing three-dimensional Resin Film Infusion (RFI)/Resin Transfer Molding (RTM) model to include VARTM simulation capabilities and to verify the model through the fabrication of aircraft structural composites. An additional objective was to use the VARTM model as a process analysis tool that would enable the user to configure the best process for manufacturing quality composites. Experimental verification of the model was performed by processing several flat composite panels. The parameters verified included flow front patterns and infiltration times. The flow front patterns were qualitatively accurate, while the simulated infiltration times overpredicted the experimental times by 8 to 10%. Capillary and gravitational forces were incorporated into the existing RFI/RTM model in order to simulate VARTM processing physics more accurately. Including the theoretical capillary pressure reduced the simulated infiltration times by as much as 6%; gravity, on the other hand, was found to be negligible for all cases. Finally, the VARTM model was used as a process analysis tool. This enabled the user to determine such important process constraints as the location and type of injection ports and the permeability and location of the high-permeability media. A process for a three-stiffener composite panel was proposed. 
This configuration evolved from the variation of the process constraints in the modeling of several different composite panels. The configuration was proposed by considering such factors as: infiltration time, the number of vacuum ports, and possible areas of void entrapment.
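The roles of applied and capillary pressure in infiltration time can be illustrated with the classic one-dimensional Darcy fill-time estimate, t = φμL²/(2kΔP). This is a textbook sketch under simplifying assumptions (constant properties, 1-D flow), not the RFI/RTM model itself; adding a capillary pressure term to the driving pressure shortens the predicted fill time, in the same few-percent range the abstract reports.

```python
# 1-D Darcy fill-time estimate for resin infusion (textbook sketch):
#   t_fill = phi * mu * L^2 / (2 * k * (dP + P_cap))
# Capillary pressure P_cap adds to the applied pressure drop dP.

def fill_time(L, k, phi, mu, dP, P_cap=0.0):
    """L: flow length (m), k: preform permeability (m^2),
    phi: porosity, mu: resin viscosity (Pa s),
    dP: applied pressure drop (Pa), P_cap: capillary pressure (Pa)."""
    return phi * mu * L ** 2 / (2.0 * k * (dP + P_cap))

# Illustrative numbers only (not from the thesis):
t0 = fill_time(L=0.3, k=1e-10, phi=0.5, mu=0.2, dP=1.0e5)           # 450 s
t1 = fill_time(L=0.3, k=1e-10, phi=0.5, mu=0.2, dP=1.0e5, P_cap=6e3)
# With ~6 kPa of capillary pressure the fill time drops by ~6%.
```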

  5. SPARTAN: A High-Fidelity Simulation for Automated Rendezvous and Docking Applications

    NASA Technical Reports Server (NTRS)

    Turbe, Michael A.; McDuffie, James H.; DeKock, Brandon K.; Betts, Kevin M.; Carrington, Connie K.

    2007-01-01

bd Systems (a subsidiary of SAIC) has developed the Simulation Package for Autonomous Rendezvous Test and ANalysis (SPARTAN), a high-fidelity on-orbit simulation featuring multiple six-degree-of-freedom (6DOF) vehicles. SPARTAN has been developed in a modular fashion in Matlab/Simulink to test next-generation automated rendezvous and docking (AR&D) guidance, navigation, and control algorithms for NASA's new Vision for Space Exploration. SPARTAN includes autonomous state-based mission manager algorithms responsible for sequencing the vehicle through various flight phases based on on-board sensor inputs, and closed-loop guidance algorithms including Lambert transfers, Clohessy-Wiltshire maneuvers, and glideslope approaches. The guidance commands are implemented using an integrated translation and attitude control system to provide 6DOF control of each vehicle in the simulation. SPARTAN also includes high-fidelity representations of a variety of absolute and relative navigation sensors that may be used for NASA missions, including radio frequency, lidar, and video-based rendezvous sensors. Proprietary navigation sensor fusion algorithms have been developed that allow the integration of these sensor measurements through an extended Kalman filter framework to create a single optimal estimate of the relative state of the vehicles. SPARTAN provides capability for Monte Carlo dispersion analysis, allowing rigorous evaluation of the performance of the complete proposed AR&D system, including software, sensors, and mechanisms. SPARTAN also supports hardware-in-the-loop testing through conversion of the algorithms to C code using Real-Time Workshop, so they can be hosted in a mission computer engineering development unit running an embedded real-time operating system. Finally, SPARTAN offers both a runtime TCP/IP socket interface and post-processing compatibility with bdStudio, a visualization tool developed by bd Systems, allowing intuitive evaluation of simulation results. 
A description of the SPARTAN architecture and capabilities is provided, along with details on the models and algorithms utilized and results from representative missions.
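The Clohessy-Wiltshire relative-motion model named among SPARTAN's guidance algorithms has a standard closed-form solution, shown below as an illustration (this is the textbook formulation, not SPARTAN's source): x is the radial offset, y the along-track offset, and n the chief's mean motion.

```python
import math

# Closed-form Clohessy-Wiltshire propagation of a deputy's state
# relative to a chief in circular orbit (Hill frame, in-plane terms).

def cw_propagate(x0, y0, vx0, vy0, n, t):
    """x0, y0: radial/along-track offsets (m); vx0, vy0: their rates
    (m/s); n: chief mean motion (rad/s); t: elapsed time (s)."""
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4 - 3 * c) * x0 + s / n * vx0 + 2 * (1 - c) / n * vy0
    y = (6 * (s - n * t)) * x0 + y0 + 2 * (c - 1) / n * vx0 \
        + (4 * s - 3 * n * t) / n * vy0
    vx = 3 * n * s * x0 + c * vx0 + 2 * s * vy0
    vy = 6 * n * (c - 1) * x0 - 2 * s * vx0 + (4 * c - 3) * vy0
    return x, y, vx, vy

# A pure along-track offset is an equilibrium: the deputy holds
# station 100 m behind the chief indefinitely.
state = cw_propagate(0.0, 100.0, 0.0, 0.0, n=0.0011, t=3600.0)
```

Guidance schemes like the ones the abstract lists invert this map to find the burn that reaches a target relative state in a given time.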

  6. Physically based modeling of rainfall-triggered landslides: a case study in the Luquillo forest, Puerto Rico

    NASA Astrophysics Data System (ADS)

    Lepore, C.; Arnone, E.; Noto, L. V.; Sivandran, G.; Bras, R. L.

    2013-09-01

This paper presents the development of a rainfall-triggered landslide module within an existing physically based, spatially distributed ecohydrologic model. The model, tRIBS-VEGGIE (Triangulated Irregular Network-based Real-time Integrated Basin Simulator and Vegetation Generator for Interactive Evolution), is capable of a sophisticated description of many hydrological processes; in particular, the soil moisture dynamics are resolved at the temporal and spatial resolution required to examine the triggering mechanisms of rainfall-induced landslides. The validity of the tRIBS-VEGGIE model in a tropical environment is shown by evaluating its performance against direct observations made within the study area of the Luquillo Forest. The newly developed landslide module builds upon the previous version of the tRIBS landslide component. The new module utilizes a numerical solution to Richards' equation (present in tRIBS-VEGGIE but not in tRIBS), which better represents the time evolution of soil moisture transport through the soil column. Moreover, the new landslide module utilizes an extended formulation of the factor of safety (FS) to correctly quantify the role of matric suction in slope stability and to account for unsaturated conditions in the evaluation of FS. The new modeling framework couples the capability of the detailed hydrologic model to describe soil moisture dynamics with the infinite slope model, creating a powerful tool for the assessment of rainfall-triggered landslide risk.
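The infinite-slope factor of safety at the heart of such modules can be sketched with the standard Mohr-Coulomb formulation including a pore-pressure term. This is the common textbook expression, not necessarily the exact extended formulation used in tRIBS-VEGGIE, and the parameter values are illustrative: negative pore pressure (matric suction) raises FS, which is why wetting during rainfall can trigger failure.

```python
import math

# Infinite-slope factor of safety (textbook form):
#   FS = [c' + (sigma_n - u) tan(phi')] / tau
# with sigma_n = gamma*z*cos^2(beta), tau = gamma*z*sin(beta)*cos(beta).

def factor_of_safety(c, phi_deg, gamma, z, beta_deg, u=0.0):
    """c: effective cohesion (kPa); phi_deg: friction angle (deg);
    gamma: soil unit weight (kN/m^3); z: slip-surface depth (m);
    beta_deg: slope angle (deg); u: pore pressure (kPa, < 0 = suction)."""
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    sigma_n = gamma * z * math.cos(beta) ** 2            # normal stress
    tau = gamma * z * math.sin(beta) * math.cos(beta)    # shear stress
    return (c + (sigma_n - u) * math.tan(phi)) / tau

common = dict(c=5.0, phi_deg=30.0, gamma=18.0, z=2.0, beta_deg=35.0)
fs_wet = factor_of_safety(u=5.0, **common)    # positive pore pressure
fs_dry = factor_of_safety(u=-10.0, **common)  # matric suction stabilizes
# fs_wet < 1 (failure predicted) while fs_dry > 1 (stable).
```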

  7. Prior-knowledge Fitting of Accelerated Five-dimensional Echo Planar J-resolved Spectroscopic Imaging: Effect of Nonlinear Reconstruction on Quantitation.

    PubMed

    Iqbal, Zohaib; Wilson, Neil E; Thomas, M Albert

    2017-07-24

1H Magnetic Resonance Spectroscopic Imaging (SI) is a powerful tool capable of investigating metabolism in vivo from multiple regions. However, SI techniques are time consuming and are therefore difficult to implement clinically. By applying non-uniform sampling (NUS) and compressed sensing (CS) reconstruction, it is possible to accelerate these scans while retaining key spectral information. One recently developed method that utilizes this type of acceleration is the five-dimensional echo planar J-resolved spectroscopic imaging (5D EP-JRESI) sequence, which is capable of obtaining two-dimensional (2D) spectra from three spatial dimensions. The prior-knowledge fitting (ProFit) algorithm is typically used to quantify 2D spectra in vivo; however, the effects of NUS and CS reconstruction on the quantitation results are unknown. This study utilized a simulated brain phantom to investigate the errors introduced through the acceleration methods. Errors (normalized root mean square error >15%) were found between metabolite concentrations after twelve-fold acceleration for several low-concentration (<2 mM) metabolites. The Cramér-Rao lower bound % (CRLB%) values, which are typically used for quality control, were not reflective of the increased quantitation error arising from acceleration. Finally, occipital white matter (OWM) and gray matter (OGM) in the human brain were quantified in vivo using the 5D EP-JRESI sequence with eight-fold acceleration.

  8. Assimilative model for ionospheric dynamics employing delay, Doppler, and direction of arrival measurements from multiple HF channels

    NASA Astrophysics Data System (ADS)

    Fridman, Sergey V.; Nickisch, L. J.; Hausman, Mark; Zunich, George

    2016-03-01

    We describe the development of new HF data assimilation capabilities for our ionospheric inversion algorithm called GPSII (GPS Ionospheric Inversion). Previously existing capabilities of this algorithm included assimilation of GPS total electron content data as well as assimilation of backscatter ionograms. In the present effort we concentrated on developing assimilation tools for data related to HF propagation channels. Measurements of propagation delay, angle of arrival, and the ionosphere-induced Doppler from any number of known propagation links can now be utilized by GPSII. The resulting ionospheric model is consistent with all assimilated measurements. This means that ray tracing simulations of the assimilated propagation links are guaranteed to be in agreement with measured data within the errors of measurement. The key theoretical element for assimilating HF data is the raypath response operator (RPRO) which describes response of raypath parameters to infinitesimal variations of electron density in the ionosphere. We construct the RPRO out of the fundamental solution of linearized ray tracing equations for a dynamic magnetoactive plasma. We demonstrate performance and internal consistency of the algorithm using propagation delay data from multiple oblique ionograms (courtesy of Defence Science and Technology Organisation, Australia) as well as with time series of near-vertical incidence sky wave data (courtesy of the Intelligence Advanced Research Projects Activity HFGeo Program Government team). In all cases GPSII produces electron density distributions which are smooth in space and in time. We simulate the assimilated propagation links by performing ray tracing through GPSII-produced ionosphere and observe that simulated data are indeed in agreement with assimilated measurements.

  9. Analyzing the effectiveness of flare dispensing programs against pulse width modulation seekers using self-organizing maps

    NASA Astrophysics Data System (ADS)

    Şahingil, Mehmet C.; Aslan, Murat Š.

    2013-10-01

Infrared guided missile seekers that utilize pulse width modulation (PWM) in target tracking are among the threats against air platforms. To achieve "soft-kill" protection of one's own platform against these types of threats, one needs to examine carefully the seeker operating principle, together with its special electronic counter-countermeasure (ECCM) capability. One of the cost-effective means of soft-kill protection is to use flare decoys in accordance with an optimized dispensing program. Such an optimization requires a good understanding of the threat seeker, the capabilities of the air platform, and the engagement scenario between them. Modeling and simulation is a very powerful tool for gaining insight into the underlying phenomenology. A careful interpretation of simulation results is crucial to drawing valuable conclusions from the data, and many factors (features) affect the results. Therefore, powerful statistical tools and pattern recognition algorithms are of special interest in the analysis. In this paper, we show how self-organizing maps (SOMs), one such tool, can be used to analyze the effectiveness of various flare dispensing programs against a PWM seeker. We perform several Monte Carlo runs for a typical engagement scenario in a MATLAB-based simulation environment. In each run, we randomly change the flare dispensing program and record the corresponding class, "successful" or "unsuccessful", depending on whether the flare dispensing program deceives the seeker. Then, in the analysis phase, we use SOMs to interpret and visualize the results.
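A SOM of the kind used in this analysis can be sketched minimally: nodes on a lattice adapt their codebook vectors toward the training samples, with a neighborhood that shrinks over time. This is an illustrative toy (not the paper's MATLAB analysis code), with two tight clusters standing in for "successful" and "unsuccessful" flare-program feature vectors.

```python
import numpy as np

# Minimal 1-D self-organizing map: each node's weight vector is pulled
# toward the presented sample, scaled by a Gaussian lattice neighborhood
# around the best-matching unit (BMU); learning rate and radius decay.

def train_som(data, n_nodes=10, epochs=200, lr0=0.5, sigma0=3.0, seed=0):
    rng = np.random.default_rng(seed)
    # Initialize the codebook from randomly chosen data samples.
    weights = data[rng.integers(0, len(data), n_nodes)].astype(float)
    grid = np.arange(n_nodes)
    for e in range(epochs):
        frac = 1.0 - e / epochs           # linear decay schedule
        lr, sigma = lr0 * frac, sigma0 * frac + 0.1
        for x in data:
            bmu = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
            h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
            weights += lr * h[:, None] * (x - weights)
    return weights

# Two clusters of 2-D feature vectors (toy stand-ins for program runs).
data = np.array([[0.1, 0.1], [0.9, 0.9], [0.12, 0.08], [0.88, 0.92]])
w = train_som(data)
# After training, each sample's BMU lies close to it, so coloring the
# lattice by class visualizes which program regions succeed.
```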

  10. Adaptation of an unstructured-mesh, finite-element ocean model to the simulation of ocean circulation beneath ice shelves

    NASA Astrophysics Data System (ADS)

    Kimura, Satoshi; Candy, Adam S.; Holland, Paul R.; Piggott, Matthew D.; Jenkins, Adrian

    2013-07-01

    Several different classes of ocean model are capable of representing floating glacial ice shelves. We describe the incorporation of ice shelves into Fluidity-ICOM, a nonhydrostatic finite-element ocean model with the capacity to utilize meshes that are unstructured and adaptive in three dimensions. This geometric flexibility offers several advantages over previous approaches. The model represents melting and freezing on all ice-shelf surfaces including vertical faces, treats the ice shelf topography as continuous rather than stepped, and does not require any smoothing of the ice topography or any of the additional parameterisations of the ocean mixed layer used in isopycnal or z-coordinate models. The model can also represent a water column that decreases to zero thickness at the 'grounding line', where the floating ice shelf is joined to its tributary ice streams. The model is applied to idealised ice-shelf geometries in order to demonstrate these capabilities. In these simple experiments, arbitrarily coarsening the mesh outside the ice-shelf cavity has little effect on the ice-shelf melt rate, while the mesh resolution within the cavity is found to be highly influential. Smoothing the vertical ice front results in faster flow along the smoothed ice front, allowing greater exchange with the ocean than in simulations with a realistic ice front. A vanishing water-column thickness at the grounding line has little effect in the simulations studied. We also investigate the response of ice shelf basal melting to variations in deep water temperature in the presence of salt stratification.

  11. Langley Aerothermodynamic Facilities Complex: Enhancements and Testing Capabilities

    NASA Technical Reports Server (NTRS)

    Micol, J. R.

    1998-01-01

Description, capabilities, recent upgrades, and utilization of the NASA Langley Research Center (LaRC) Aerothermodynamic Facilities Complex (AFC) are presented. The AFC consists of five hypersonic, blow-down-to-vacuum wind tunnels that collectively provide a range of Mach numbers from 6 to 20, unit Reynolds numbers from 0.04 to 22 million per foot and, most importantly for blunt configurations, normal-shock density ratios from 4 to 12. These wide ranges of hypersonic simulation parameters are due, in part, to the use of three different test gases (air, helium, and tetrafluoromethane), thereby making several of the facilities unique. The Complex represents nearly three-fourths of the conventional (as opposed to impulse)-type hypersonic wind tunnels operational in this country. AFC facilities are used to assess and optimize the hypersonic aerodynamic performance and aeroheating characteristics of aerospace vehicle concepts and to provide benchmark aerodynamic/aeroheating data for generating the flight aerodynamic databook and the final design of the thermal protection system (TPS) (e.g., establishment of flight limitations not to exceed TPS design limits). Modifications and enhancements of AFC hardware components and instrumentation have been pursued to increase capability, reliability, and productivity in support of programmatic goals. Examples are presented illustrating facility utilization in recent years to generate essentially all of the experimental hypersonic aerodynamic and aeroheating information for high-priority, fast-paced Agency programs. These programs include Phase I of the Reusable Launch Vehicle (RLV) Advanced Technology Demonstrator X-33 program, Phase II of the X-33 program, the X-34 program, the Hyper-X program (a Mach 5, 7, and 10 airbreathing propulsion flight experiment), and the X-38 program (Experimental Crew Return Vehicle, X-CRV). Current upgrades/enhancements and future plans for the AFC are discussed.

  12. Improvements to information management systems simulator

    NASA Technical Reports Server (NTRS)

    Bilek, R. W.

    1972-01-01

    The performance of personnel in the augmentation and improvement of the interactive IMSIM information management simulation model is summarized. With this augmented model, NASA now has even greater capabilities for the simulation of computer system configurations, data processing loads imposed on these configurations, and executive software to control system operations. Through these simulations, NASA has an extremely cost effective capability for the design and analysis of computer-based data management systems.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jordan, Amy B.; Stauffer, Philip H.; Reed, Donald T.

The primary objective of the experimental effort described here is to aid in understanding the complex nature of liquid, vapor, and solid transport occurring around heated nuclear waste in bedded salt. In order to gain confidence in the predictive capability of numerical models, experimental validation must be performed to ensure that (a) hydrological and physiochemical parameters and (b) processes are correctly simulated. The experiments proposed here are designed to study aspects of the system that have not been satisfactorily quantified in prior work. In addition to exploring the complex coupled physical processes in support of numerical model validation, lessons learned from these experiments will facilitate preparations for larger-scale experiments that may utilize similar instrumentation techniques.

  14. High-voltage spark carbon-fiber sticky-tape data analyzer

    NASA Technical Reports Server (NTRS)

    Yang, L. C.; Hull, G. G.

    1980-01-01

An efficient method for detecting carbon fibers collected on a sticky-tape monitor was developed. The fibers were released from a simulated crash-fire situation involving carbon fiber composite material. The method utilized the ability of a fiber to initiate a spark across a set of alternately biased high-voltage electrodes, allowing the number of fiber fragments collected on the tape to be counted electronically. It was found that the spark, which carries high energy and is of very short duration, is capable of partially damaging or consuming the fiber fragments. It also creates a mechanical disturbance which ejects the fiber from the grid. Both characteristics were helpful in establishing a single discharge pulse for each fiber segment.

  15. SISSY: An example of a multi-threaded, networked, object-oriented databased application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scipioni, B.; Liu, D.; Song, T.

    1993-05-01

The Systems Integration Support SYstem (SISSY) is presented and its capabilities and techniques are discussed. It is a fully automated data collection and analysis system supporting the SSCL's systems analysis activities as they relate to the Physics Detector and Simulation Facility (PDSF). SISSY itself is a paradigm of effective computing on the PDSF. It uses home-grown code (C++), network programming (RPC, SNMP), relational (SYBASE) and object-oriented (ObjectStore) DBMSs, UNIX operating system services (IRIX threads, cron, system utilities, shell scripts, etc.), and third-party software applications (NetCentral Station, Wingz, DataLink), all of which act together as a single application to monitor and analyze the PDSF.

  16. GEOS. User Tutorials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Pengchen; Settgast, Randolph R.; Johnson, Scott M.

    2014-12-17

GEOS is a massively parallel, multi-physics simulation application utilizing high performance computing (HPC) to address subsurface reservoir stimulation activities with the goal of optimizing current operations and evaluating innovative stimulation methods. GEOS enables coupling of different solvers associated with the various physical processes occurring during reservoir stimulation in unique and sophisticated ways, adapted to various geologic settings, materials and stimulation methods. Developed at the Lawrence Livermore National Laboratory (LLNL) as a part of a Laboratory-Directed Research and Development (LDRD) Strategic Initiative (SI) project, GEOS represents the culmination of a multi-year ongoing code development and improvement effort that has leveraged existing code capabilities and staff expertise to design new computational geosciences software.

  17. Visualizing the Big (and Large) Data from an HPC Resource

    NASA Astrophysics Data System (ADS)

    Sisneros, R.

    2015-10-01

    Supercomputers are built to endure painfully large simulations and contend with resulting outputs. These are characteristics that scientists are all too willing to test the limits of in their quest for science at scale. The data generated during a scientist's workflow through an HPC center (large data) is the primary target for analysis and visualization. However, the hardware itself is also capable of generating volumes of diagnostic data (big data); this presents compelling opportunities to deploy analogous analytic techniques. In this paper we will provide a survey of some of the many ways in which visualization and analysis may be crammed into the scientific workflow as well as utilized on machine-specific data.

  18. Neural basis of quasi-rational decision making.

    PubMed

    Lee, Daeyeol

    2006-04-01

    Standard economic theories conceive homo economicus as a rational decision maker capable of maximizing utility. In reality, however, people tend to approximate optimal decision-making strategies through a collection of heuristic routines. Some of these routines are driven by emotional processes, and others are adjusted iteratively through experience. In addition, routines specialized for social decision making, such as inference about the mental states of other decision makers, might share their origins and neural mechanisms with the ability to simulate or imagine outcomes expected from alternative actions that an individual can take. A recent surge of collaborations across economics, psychology and neuroscience has provided new insights into how such multiple elements of decision making interact in the brain.

  19. Near-Field Noise Source Localization in the Presence of Interference

    NASA Astrophysics Data System (ADS)

    Liang, Guolong; Han, Bo

    In order to suppress the influence of interference sources on noise source localization in the near field, near-field broadband source localization in the presence of interference is studied. An oblique projection is constructed from the array measurements and the steering manifold of the interference sources, and is used to filter the interference signals out. The 2D-MUSIC algorithm is applied to the data at each frequency, and the results across frequencies are then averaged to locate the broadband noise sources. Simulations show that this method suppresses the interference sources effectively and is capable of locating a source that lies in the same direction as an interference source.
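    The narrowband core of the MUSIC step described above can be sketched as follows. This is a generic MUSIC pseudospectrum computation, not the authors' implementation; the array geometry, search grid, and source count in any usage are illustrative assumptions, and the small floor guarding the division is our own addition.

```python
import numpy as np

def music_spectrum(R, steering, n_sources):
    """Generic MUSIC pseudospectrum from a sample covariance matrix.

    R         : (M, M) covariance of the array snapshots at one frequency
    steering  : (M, G) candidate steering vectors over a search grid
    n_sources : assumed number of sources
    """
    # Eigendecompose the covariance; the noise subspace is spanned by the
    # eigenvectors of the M - n_sources smallest eigenvalues (eigh returns
    # eigenvalues in ascending order).
    _, V = np.linalg.eigh(R)
    En = V[:, : R.shape[0] - n_sources]           # noise subspace
    # The pseudospectrum peaks where a steering vector is (nearly)
    # orthogonal to the noise subspace.
    proj = np.einsum("mg,mk->gk", steering.conj(), En)
    power = np.sum(np.abs(proj) ** 2, axis=1)
    return 1.0 / np.maximum(power, 1e-18)         # guard exact orthogonality
```

    For the broadband case in the abstract, this spectrum would be computed per frequency bin after the oblique-projection interference filter and then averaged across bins.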

  20. The Avoidance of Saturation Limits in Magnetic Bearing Systems During Transient Excitation

    NASA Technical Reports Server (NTRS)

    Rutland, Neil K.; Keogh, Patrick S.; Burrows, Clifford R.

    1996-01-01

    When a transient event, such as mass loss, occurs in a rotor/magnetic bearing system, optimal vibration control forces may exceed bearing capabilities. This will be inevitable when the mass loss is sufficiently large, and a conditionally unstable dynamic system could result if the bearing characteristics become non-linear. This paper provides a controller design procedure to suppress, where possible, bearing force demands below saturation levels while maintaining vibration control. It utilizes H(sub infinity) optimization with appropriate input and output weightings. Simulation of transient behavior following mass loss from a flexible rotor is used to demonstrate the avoidance of conditional instability. A compromise between transient control force and vibration levels was achieved.

  1. NPSS Overview to TAFW Multidisciplinary Simulation Capabilities

    NASA Technical Reports Server (NTRS)

    Owen, Karl

    2002-01-01

    The Numerical Propulsion System Simulation (NPSS) is a concerted effort by NASA Glenn Research Center, the aerospace industry, and academia to develop an advanced engineering environment, an integrated collection of software programs, for the analysis and design of aircraft engines and, eventually, space transportation components. NPSS is now being applied by GE to ground power generation, with a view to expanding the capability to nontraditional power plant applications (for example, fuel cells). NPSS also has an interest in in-space power and will be developing those simulation capabilities.

  2. Capabilities and applications of the Program to Optimize Simulated Trajectories (POST). Program summary document

    NASA Technical Reports Server (NTRS)

    Brauer, G. L.; Cornick, D. E.; Stevenson, R.

    1977-01-01

    The capabilities and applications of the three-degree-of-freedom (3DOF) and six-degree-of-freedom (6DOF) versions of the Program to Optimize Simulated Trajectories (POST) are summarized. The document supplements the detailed program manuals by providing additional information that motivates and clarifies the basic capabilities, input procedures, applications, and computer requirements of these programs. The information will enable prospective users, including managerial personnel, to evaluate the programs and determine whether they are applicable to their problems. The report describes the POST structure, formulation, input and output procedures, sample cases, and computer requirements. It also provides answers to basic questions concerning planet and vehicle modeling, simulation accuracy, optimization capabilities, and general input rules. Several sample cases are presented.

  3. Comparative analysis of the functionality of simulators of the da Vinci surgical robot.

    PubMed

    Smith, Roger; Truong, Mireille; Perez, Manuela

    2015-04-01

    The implementation of robotic technology in minimally invasive surgery has led to the need to develop more efficient and effective training methods, as well as assessment and skill-maintenance tools for surgical education. Multiple simulators and procedures are available for educational and training purposes. A need for comparative evaluations of these simulators exists to aid users in selecting an appropriate device for their purposes. We conducted an objective review and comparison of the design and capabilities of all dedicated simulators of the da Vinci robot: the da Vinci Skill Simulator (DVSS) (Intuitive Surgical Inc., Sunnyvale, CA, USA), the dV-Trainer (dVT) (Mimic Technologies Inc., Seattle, WA, USA), and the Robotic Surgery Simulator (RoSS) (Simulated Surgical Skills, LLC, Williamsville, NY, USA). This review provides base specifications of the hardware and software, with an emphasis on the training capabilities of each system. Each simulator contains a large number of training exercises for skills development (DVSS = 40, dVT = 65, RoSS = 52). All three offer 3D visual images but use different display technologies. The DVSS leverages the real robotic surgeon's console to provide visualization, hand controls, and foot pedals; the dVT and RoSS use simulated versions of all of these control systems. They include systems-management services which allow instructors to collect, export, and analyze the scores of students using the simulators. This study is the first to provide comparative information on the three simulators' functional capabilities, with an emphasis on their educational features. They offer unique advantages and capabilities in training robotic surgeons. Each device has been the subject of multiple validation experiments which have been published in the literature, but those experiments do not provide the specific details of the simulators' capabilities necessary to select the one best suited to an organization's needs.

  4. Requirements Development for Interoperability Simulation Capability for Law Enforcement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holter, Gregory M.

    2004-05-19

    The National Counterdrug Center (NCC) was initially authorized by Congress in FY 1999 appropriations to create a simulation-based counterdrug interoperability training capability. As the lead organization for research and analysis to support the NCC, the Pacific Northwest National Laboratory (PNNL) was responsible for developing the requirements for this interoperability simulation capability. These requirements were structured to address the hardware and software components of the system, as well as the deployment and use of the system. The original set of requirements was developed through a process of conducting a user-based survey of requirements for the simulation capability, coupled with an analysis of similar development efforts. The user-based approach ensured that existing concerns with respect to interoperability within the law enforcement community would be addressed. Law enforcement agencies within the designated pilot area of Cochise County, Arizona, were surveyed using interviews and ride-alongs during actual operations. The results of this survey were then accumulated, organized, and validated with the agencies to ensure the accuracy of the results. These requirements were then supplemented by adapting operational requirements from existing systems to ensure system reliability and operability. The NCC adopted a development approach providing incremental capability through the fielding of a phased series of progressively more capable versions of the system. This allowed feedback from system users to be incorporated into subsequent revisions of the system requirements, and also allowed the addition of new elements as needed to adapt the system to broader geographic and geopolitical areas, including areas along the southwest and northwest U.S. borders.
    This paper addresses the processes used to develop and refine requirements for the NCC interoperability simulation capability, as well as the response of the law enforcement community to the use of the NCC system. The paper also addresses the applicability of such an interoperability simulation capability to a broader set of law enforcement, border protection, site/facility security, and first-responder needs.

  5. Assessment of Capabilities for First-Principles Simulation of Spacecraft Electric Propulsion Systems and Plasma Spacecraft Environment

    DTIC Science & Technology

    2016-04-29

    Report documentation fragment only; the record's title is "Assessment of Capabilities for First-Principles Simulation of Spacecraft Electric Propulsion Systems and Plasma Spacecraft Environment." No abstract is available.

  6. Drill System Development for the Lunar Subsurface Exploration

    NASA Astrophysics Data System (ADS)

    Zacny, Kris; Davis, Kiel; Paulsen, Gale; Roberts, Dustyn; Wilson, Jack; Hernandez, Wilson

    Reaching the cold traps at the lunar poles and directly sensing the subsurface regolith is a primary goal of lunar exploration, especially as a means of prospecting for future In Situ Resource Utilization efforts. As part of the development of a lunar drill capable of reaching a depth of two meters or more, Honeybee Robotics has built a laboratory drill system with a total linear stroke of 1 meter, a capability to produce as much as 45 N-m of torque at a rotational speed of 200 rpm, and a capability of delivering a maximum downforce of 1000 N. Since this is a test bed, the motors were purposely chosen to be relatively large to provide ample power to the drill system (the Apollo drill was a 500 Watt drill, i.e. not small by current standards). In addition, the drill is capable of using three different drilling modes: rotary, rotary-percussive, and percussive. The frequency of percussive impact can be varied if needed while rotational speed is held constant. An integral part of this test bed is a vacuum chamber that is currently being constructed. The drill test bed is used for analyzing various drilling modes and testing different drill bit and auger systems under low pressure conditions and in lunar regolith simulant. The results of the tests are used to develop the final lunar drill design as well as efficient drilling protocols. The drill was also designed to accommodate a downhole neutron spectrometer for measuring the amount of hydrated material in the area surrounding the borehole, as well as downhole temperature sensors, accelerometers, and an electrical properties tester. The presentation will include the history of lunar drilling, the challenges of drilling on the Moon, a description of the drill and chamber, and preliminary results of drilling tests conducted in ice-bound lunar regolith simulant with a variety of drill bits and auger systems.
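    As a quick sanity check on the quoted test-bed figures, the mechanical power at the bit follows from P = τω. The function below is a back-of-the-envelope illustration of that conversion, not part of the Honeybee system; the function name is ours.

```python
import math

def spindle_power_watts(torque_nm, rpm):
    """Mechanical spindle power P = torque * angular velocity."""
    omega = rpm * 2.0 * math.pi / 60.0   # rev/min -> rad/s
    return torque_nm * omega
```

    At the quoted limits of 45 N-m and 200 rpm this is roughly 940 W, which makes the comparison with the 500 W Apollo drill concrete.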

  7. PSL Icing Facility Upgrade Overview

    NASA Technical Reports Server (NTRS)

    Griffin, Thomas A.; Dicki, Dennis J.; Lizanich, Paul J.

    2014-01-01

    The NASA Glenn Research Center Propulsion Systems Lab (PSL) was recently upgraded to perform engine inlet ice crystal testing in an altitude environment. The system installed 10 spray bars in the inlet plenum for ice crystal generation using 222 spray nozzles. As an altitude test chamber, the PSL is capable of simulating icing events at altitude in a ground-test facility. The system was designed to operate at altitudes from 4,000 to 40,000 ft, at Mach numbers up to 0.8, and at inlet total temperatures from -60 to +15 °F. This paper and presentation will be part of a series of presentations on PSL icing and will cover the development of the icing capability through design, developmental testing, installation, initial calibration, and validation engine testing. Information will be presented on the design criteria and process, spray bar developmental testing at Cox and Co., system capabilities, and the initial calibration and engine validation test. The PSL icing system was designed to provide NASA and the icing community with a facility that can be used for research studies of engine icing by duplicating in-flight events in a controlled ground-test facility. With the system and the altitude chamber, we can produce flight conditions and cloud environments that simulate those encountered in flight. The icing system can be controlled to set various cloud uniformities, droplet median volumetric diameters (MVD), and ice water contents (IWC) through a wide variety of conditions. The PSL chamber can set altitudes, Mach numbers, and temperatures of interest to the icing community and also has the instrumentation capability to measure engine performance during icing testing. Last year, PSL completed the calibration and initial engine validation of the facility utilizing a Honeywell ALF502-R5 engine and has duplicated in-flight rollback conditions experienced during flight testing. This paper will summarize the modifications and buildup of the facility to accomplish these tests.

  8. The Evolution of Medical Training Simulation in the U.S. Military.

    PubMed

    Linde, Amber S; Kunkler, Kevin

    2016-01-01

    The United States has been at war since 2003. During that time, training using medical simulation technology has been developed and integrated into military medical training for combat medics, nurses, and surgeons. Efforts stemming from the Joint Programmatic Committee-1 (JPC-1) Medical Simulation and Training Portfolio have allowed for improvement and advancement in military medical training by focusing research on simulation training technology. Based upon lessons learned, capability gaps have been identified concerning the need to validate and enhance combat medical training simulators. These capability gaps include 1) open source/open architecture; 2) modularity and interoperability; and 3) material and virtual reality (VR) models. Using these capability gaps, JPC-1 has identified important research endeavors that need to be explored.

  9. Isolation and identification of methanethiol-utilizing bacterium CZ05 and its application in bio-trickling filter of biogas.

    PubMed

    Zhang, Chao-zheng; Zhang, Wei-jiang; Xu, Jiao

    2013-12-01

    A bacterium capable of methanethiol (MT) degradation was enriched and isolated by employing activated sewage sludge as the inoculum in a mineral medium containing MT. The isolate was identified as Paenibacillus polymyxa CZ05 through a Biolog test and 16S rDNA sequencing. This strain can utilize both organic and inorganic media and thrives at pH 4 to 9. Batch culture showed that the strain degrades MT better in the No. 4 medium than in the No. 1 medium. A series-operated biotrickling filter with lava stone as the carrier was employed to test the application of P. polymyxa CZ05 in the removal of MT from simulated biogas. Long-term experiments showed that a high concentration of MT (60 ppm) was efficiently removed (99.5%) by the biotrickling filters at an empty-bed residence time (EBRT) of 30 s. The addition of hydrogen sulfide decreased the MT removal rate because it competed with MT for dissolved oxygen.

  10. Implementation of Basic and Universal Gates In a single Circuit Based On Quantum-dot Cellular Automata Using Multi-Layer Crossbar Wire

    NASA Astrophysics Data System (ADS)

    Bhowmik, Dhrubajyoti; Saha, Apu Kr; Dutta, Paramartha; Nandi, Supratim

    2017-08-01

    Quantum-dot Cellular Automata (QCA) is one of the most promising emerging nanotechnologies for electronic circuits, owing to its lower power consumption, higher speed, and smaller size in comparison with CMOS technology. The essential device, the quantum-dot cell, can be used to build logic gates and wires, and is the key building block of nanotechnology circuits. By applying simple gates, the hardware requirements for a QCA circuit can be decreased and circuits can be made less complex in terms of level, delay, and cell count. This article presents a modest methodology for implementing novel, improved basic and universal gates, which can be applied to the design of many variants of complex QCA circuits. The proposed gates are simple in structure and capable of implementing any digital circuit. The main aim is to build all basic and universal gates in a single circuit, with and without crossbar wire. Simulation results and physical relations confirm their usefulness in implementing digital circuits.
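    The logical primitive underlying QCA gate design is the three-input majority gate: fixing one input to 0 or 1 yields AND and OR, and adding the QCA inverter cell gives universal gates. The sketch below models only the Boolean behavior, not cell-level dynamics, and the helper names are our own rather than the article's.

```python
def majority(a, b, c):
    """Three-input majority vote: the basic QCA logic primitive."""
    return (a & b) | (b & c) | (a & c)

def qca_and(a, b):
    return majority(a, b, 0)    # one input fixed to logic 0

def qca_or(a, b):
    return majority(a, b, 1)    # one input fixed to logic 1

def qca_nand(a, b):
    return 1 - qca_and(a, b)    # majority gate followed by an inverter cell
```

    Because NAND is universal, these two primitives suffice, in principle, to realize any digital circuit the abstract refers to.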

  11. Quantitative phase microscopy via optimized inversion of the phase optical transfer function.

    PubMed

    Jenkins, Micah H; Gaylord, Thomas K

    2015-10-01

    Although the field of quantitative phase imaging (QPI) has wide-ranging biomedical applicability, many QPI methods are not well-suited for such applications due to their reliance on coherent illumination and specialized hardware. By contrast, methods utilizing partially coherent illumination have the potential to promote the widespread adoption of QPI due to their compatibility with microscopy, which is ubiquitous in the biomedical community. Described herein is a new defocus-based reconstruction method that utilizes a small number of efficiently sampled micrographs to optimally invert the partially coherent phase optical transfer function under assumptions of weak absorption and slowly varying phase. Simulation results are provided that compare the performance of this method with similar algorithms and demonstrate compatibility with large phase objects. The accuracy of the method is validated experimentally using a microlens array as a test phase object. Lastly, time-lapse images of live adherent cells are obtained with an off-the-shelf microscope, thus demonstrating the new method's potential for extending QPI capability widely in the biomedical community.
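    A common way to invert an optical transfer function in the presence of noise is Tikhonov-regularized least squares in the frequency domain. The snippet below is a generic illustration of that inversion step, not the authors' optimized algorithm; the function name and the regularization weight `alpha` are assumptions.

```python
import numpy as np

def invert_potf(intensity_spectrum, potf, alpha):
    """Regularized inversion: phase_hat = conj(H) * I / (|H|^2 + alpha).

    intensity_spectrum : Fourier transform of the (weak-object) intensity
    potf               : phase optical transfer function H sampled at the
                         same defocus/frequency points
    alpha              : Tikhonov regularization weight (> 0)
    """
    return np.conj(potf) * intensity_spectrum / (np.abs(potf) ** 2 + alpha)
```

    Larger `alpha` suppresses noise amplification near zeros of H at the cost of bias; choosing how to weight the defocus samples is where a method like the one above can be optimized.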

  12. Shared virtual environments for aerospace training

    NASA Technical Reports Server (NTRS)

    Loftin, R. Bowen; Voss, Mark

    1994-01-01

    Virtual environments have the potential to significantly enhance the training of NASA astronauts and ground-based personnel for a variety of activities. A critical requirement is the need to share virtual environments, in real or near real time, between remote sites. It has been hypothesized that the training of international astronaut crews could be done more cheaply and effectively by utilizing such shared virtual environments in the early stages of mission preparation. The Software Technology Branch at NASA's Johnson Space Center has developed the capability for multiple users to simultaneously share the same virtual environment. Each user generates the graphics needed to create the virtual environment. All changes of object position and state are communicated to all users so that each virtual environment maintains its 'currency.' Examples of these shared environments will be discussed and plans for the utilization of the Department of Defense's Distributed Interactive Simulation (DIS) protocols for shared virtual environments will be presented. Finally, the impact of this technology on training and education in general will be explored.

  13. A Many-Objective Approach to Developing Adaptive Water Supply Portfolios in the 'Research Triangle' Region of North Carolina

    NASA Astrophysics Data System (ADS)

    Zeff, H. B.; Kasprzyk, J. R.; Reed, P. M.; Characklis, G. W.

    2012-12-01

    This study uses many-objective evolutionary optimization to quantify the tradeoffs water utilities face when developing flexible water shortage response plans. Alternatives to infrastructure development, such as temporary demand management programs and inter-utility water transfer agreements, allow local water providers to develop portfolios of water supply options capable of adapting to changing hydrologic conditions and growing water demand. The extent to which these options are implemented will be determined by a number of conflicting operational and financial considerations. An integrated reservoir simulation model including four large water utilities in the 'Research Triangle' region of North Carolina is used to evaluate the potential tradeoffs resulting from regional demands on shared infrastructure, customer concerns, and the financial uncertainty caused by the intermittent and irregular nature of drought. Instead of providing one optimal solution, multi-objective evolutionary algorithms (MOEAs) use the concept of non-domination to discover a set of portfolio options in which no solution is inferior to any other solution in all objectives. Interactive visual analytics enable water providers to explore these tradeoffs and develop water shortage response plans tailored to their individual circumstances. The simulation model is evaluated under a number of different formulations to help identify and visualize the impacts of water efficiency, revenue/cost variability, consumer effects, and inter-utility cooperation. The different problems are formulated by adding portfolio options and objectives in such a way that the lower-dimensional problem formulations are subsets of the full formulation. The full formulation considers reservoir reliability, water use restriction frequency, total water transfer allotment, total costs, revenue/cost variability, and additional consumer losses during restrictions.
    The simulation results highlight the inadequacy of lower-order, cost-benefit-type analyses for evaluating water management techniques as they move beyond the construction of large storage infrastructure. This work can help water providers develop the analytical tools to evaluate complex, adaptive techniques that are becoming more attractive in an era of growing municipal demand, rising infrastructure costs, and uncertain hydrology.
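    The non-domination concept the study relies on can be made concrete with a small filter over objective vectors (all objectives taken as minimized). This is a naive O(n²) illustration of the idea, not the MOEA used in the study.

```python
def pareto_front(solutions):
    """Return the non-dominated subset of a list of objective tuples.

    A solution is dominated if some other solution is no worse in every
    objective and strictly better in at least one.
    """
    front = []
    for i, s in enumerate(solutions):
        dominated = any(
            all(o <= v for o, v in zip(other, s))
            and any(o < v for o, v in zip(other, s))
            for j, other in enumerate(solutions)
            if j != i
        )
        if not dominated:
            front.append(s)
    return front
```

    With hypothetical (total cost, restriction frequency) pairs, for example, a portfolio that is both cheaper and restricts less often than another removes the latter from the front, while portfolios trading one objective for the other all survive.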

  14. Improving the Effectiveness and Acquisition Management of Selected Weapon Systems: A Summary of Major Issues and Recommended Actions.

    DTIC Science & Technology

    1982-05-14

    ... need for effective training -- a situation which will be impaired until the AH-64 combat mission simulator, now under development, becomes available in ... antisubmarine warfare system includes the capability to detect, classify, localize, and destroy the enemy. This capability includes multimillion dollar ... to simulate combat situations will simulate only air-to-air activity. Air-to-ground and electronic counter-countermeasures simulations were deleted ...

  15. High fidelity studies of exploding foil initiator bridges, Part 3: ALEGRA MHD simulations

    NASA Astrophysics Data System (ADS)

    Neal, William; Garasi, Christopher

    2017-01-01

    Simulations of high voltage detonators, such as Exploding Bridgewire (EBW) and Exploding Foil Initiator (EFI) detonators, have historically been simple, often empirical, one-dimensional models capable of predicting parameters such as current, voltage, and, in the case of EFIs, flyer velocity. Experimental methods have correspondingly been limited to the same parameters. With the advent of complex, first-principles magnetohydrodynamic codes such as ALEGRA and ALE-MHD, it is now possible to simulate these components in three dimensions and predict a much greater range of parameters than before. A significant improvement in experimental capability was therefore required to ensure these simulations could be adequately verified. In this third paper of a three-part study, the experimental results presented in Part 2 are compared against three-dimensional MHD simulations. This improved experimental capability, along with advanced simulations, offers an opportunity to gain a greater understanding of the processes behind the functioning of EBW and EFI detonators.

  16. New approaches to provide ride-through for critical loads in electric power distribution systems

    NASA Astrophysics Data System (ADS)

    Montero-Hernandez, Oscar C.

    2001-07-01

    The extensive use of electronic circuits has enabled modernization, automation, miniaturization, high quality, low cost, and other achievements regarding electric loads in recent decades. However, modern electronic circuits and systems are extremely sensitive to disturbances from the electric power supply. In fact, the rate at which these disturbances happen is considerable, as has been documented in recent years. In response to these power quality concerns, this dissertation proposes new approaches to provide ride-through for critical loads during voltage disturbances, with emphasis on voltage sags. First, a new approach based on an AC-DC-AC system is proposed to provide ride-through for critical loads connected in buildings and/or an industrial system. In this approach, a three-phase IGBT inverter with a built-in DC-link voltage regulator is suitably controlled along with static bypass switches to provide continuous power to critical loads. During a disturbance, the input utility source is disconnected and the power from the inverter is connected to the load. The remaining voltage in the AC supply is converted to DC and compensated before being applied to the inverter and the load. After detecting normal utility conditions, power from the utility is restored to the critical load. In order to achieve an extended ride-through capability, a second approach is introduced, in which the DC-link voltage regulation is performed by a DC-DC buck-boost converter. This approach has the capability to mitigate voltage variations both below and above the nominal value. In the third approach presented in this dissertation, a three-phase AC-to-AC boost converter is investigated. This converter provides a boosting action for the utility input voltages right before they are applied to the load.
    The proposed pulse width modulation (PWM) control strategy ensures independent control of each phase and compensates for both single-phase and poly-phase voltage sags. Algorithms capable of detecting voltage disturbances such as voltage sags, voltage swells, flicker, frequency changes, and harmonics in a fast and reliable way are investigated and developed in this dissertation as an essential part of the approaches described above. Simulation and experimental work has been done to validate the feasibility of all approaches under the most common voltage disturbances, such as single-phase and three-phase voltage sags.
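    One simple member of the sag-detection algorithm family the dissertation investigates is a sliding one-cycle RMS monitor. The sketch below is an illustrative assumption, not the author's algorithm: the per-unit scaling and the 0.9 pu threshold are our own choices.

```python
import math

def sag_flags(samples, f_nominal, fs, threshold_pu=0.9):
    """Flag samples whose trailing one-cycle RMS falls below a per-unit
    threshold. Assumes a nominal sinusoid of peak amplitude 1.0 pu."""
    n = int(round(fs / f_nominal))          # samples per nominal cycle
    nominal_rms = 1.0 / math.sqrt(2.0)      # RMS of a 1.0 pu sinusoid
    flags = []
    for i in range(len(samples)):
        window = samples[max(0, i - n + 1): i + 1]
        rms = math.sqrt(sum(s * s for s in window) / len(window))
        flags.append(rms < threshold_pu * nominal_rms)
    return flags
```

    A one-cycle window trades detection speed for immunity to harmonics; faster detectors (e.g. per-phase dq or peak tracking) respond within a fraction of a cycle at the cost of noise sensitivity, which is the trade space such algorithms navigate.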

  17. Physiologically based pharmacokinetic modeling using microsoft excel and visual basic for applications.

    PubMed

    Marino, Dale J

    2005-01-01

    Physiologically based pharmacokinetic (PBPK) models are mathematical descriptions depicting the relationship between external exposure and internal dose. These models have found great utility for interspecies extrapolation. However, specialized computer software packages, which are not widely distributed, have typically been used for model development and utilization. A few physiological models have been reported using more widely available software packages (e.g., Microsoft Excel), but these tend to include less complex processes and dose metrics. To ascertain the capability of Microsoft Excel and Visual Basic for Applications (VBA) for PBPK modeling, models for styrene, vinyl chloride, and methylene chloride were coded in Advanced Continuous Simulation Language (ACSL), Excel, and VBA, and simulation results were compared. For styrene, differences between ACSL and Excel or VBA compartment concentrations and rates of change were less than +/-7.5E-10 using the same numerical integration technique and time step. Differences using VBA fixed-step or ACSL Gear's methods were generally <1.00E-03, although larger differences involving very small values were noted after exposure transitions. For vinyl chloride and methylene chloride, Excel and VBA PBPK model dose metrics differed by no more than -0.013% or -0.23%, respectively, from ACSL results. These differences are likely attributable to different step sizes rather than different numerical integration techniques. These results indicate that Microsoft Excel and VBA can be useful tools for utilizing PBPK models, and given the availability of these software programs, it is hoped that this effort will help facilitate the use and investigation of PBPK modeling.
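    The step-size sensitivity discussed above is easy to reproduce with a toy one-compartment model integrated by fixed-step Euler, the simplest of the fixed-step schemes a spreadsheet implementation might use. The rate constant and dose below are arbitrary illustrative values, not taken from the paper's styrene or vinyl chloride models.

```python
import math

def euler_concentration(c0, ke, dt, t_end):
    """Fixed-step Euler integration of the elimination ODE dC/dt = -ke * C.

    Returns the concentration at t_end; the analytic answer is
    c0 * exp(-ke * t_end), so the gap between the two shows the
    step-size-dependent error discussed in the abstract.
    """
    c = c0
    steps = int(round(t_end / dt))
    for _ in range(steps):
        c += dt * (-ke * c)   # Euler update
    return c
```

    Halving dt roughly halves the error of this first-order scheme, which is consistent with the abstract's attribution of the small ACSL/Excel discrepancies to differing step sizes rather than to the integration method itself.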

  18. Toward reliable retrieval of functional information of papillary dermis using spatially resolved diffuse reflectance spectroscopy

    PubMed Central

    Chen, Yu-Wen; Guo, Jun-Yen; Tzeng, Shih-Yu; Chou, Ting-Chun; Lin, Ming-Jen; Huang, Lynn Ling-Huei; Yang, Chao-Chun; Hsu, Chao-Kai; Tseng, Sheng-Hao

    2016-01-01

    Spatially resolved diffuse reflectance spectroscopy (SRDRS) has been employed to quantify tissue optical properties, and its interrogation volume is largely controlled by the source-to-detector separations (SDSs). To noninvasively quantify properties of dermis, a SRDRS setup that includes SDSs shorter than 1 mm is required. It is demonstrated in this study that Monte Carlo simulations employing the Henyey-Greenstein phase function cannot always precisely predict experimentally measured diffuse reflectance at such short SDSs, and we speculate this could be caused by non-negligible backward light scattering at short SDSs that cannot be properly modeled by the Henyey-Greenstein phase function. To accurately recover the optical properties and functional information of dermis using SRDRS, we propose the use of the modified two-layer (MTL) geometry. Monte Carlo simulations and phantom experiment results revealed that the MTL probing geometry was capable of faithfully recovering the optical properties of the upper dermis. The capability of the MTL geometry in probing the upper dermis was further verified through a swine study, and it was found that the measurement results were reasonably linked to histological findings. Finally, the MTL probe was utilized to study psoriatic lesions. Our results showed that the MTL probe was sensitive to the physiological condition of tissue volumes within the papillary dermis and could be used in studying the physiology of psoriasis. PMID:26977361
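    For reference, the Henyey-Greenstein phase function the authors question at short SDSs has the closed form below; the snippet simply evaluates it, and the anisotropy value used in the check is an arbitrary example.

```python
import numpy as np

def henyey_greenstein(cos_theta, g):
    """Henyey-Greenstein scattering phase function p(cos theta; g),
    normalized so its integral over the full sphere equals 1."""
    return (1.0 - g * g) / (
        4.0 * np.pi * (1.0 + g * g - 2.0 * g * cos_theta) ** 1.5
    )
```

    Its single anisotropy parameter g fixes the entire angular shape, which is exactly why it can misrepresent the backward-scattering tail that dominates reflectance at sub-millimeter SDSs.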

  19. Dissemination and support of ARGUS for accelerator applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The ARGUS code is a three-dimensional code system for simulating interactions between charged particles, electric and magnetic fields, and complex structures. It is a system of modules that share common utilities for grid and structure input, data handling, memory management, diagnostics, and other specialized functions. The code includes the fields due to the space charge and current density of the particles to achieve a self-consistent treatment of the particle dynamics. The physics modules in ARGUS include three-dimensional field solvers for electrostatics and electromagnetics, a three-dimensional electromagnetic frequency-domain module, a full particle-in-cell (PIC) simulation module, and a steady-state PIC model. These are described in the Appendix to this report. This project has a primary mission of developing the capabilities of ARGUS in accelerator modeling for release to the accelerator design community. Five major activities are being pursued in parallel during the first year of the project: to improve the code and/or add new modules that provide capabilities needed for accelerator design; to produce a User's Guide that documents the use of the code for all users; to release the code and the User's Guide to accelerator laboratories for their own use, and to obtain feedback from them; to build an interactive user interface for setting up ARGUS calculations; and to explore the use of ARGUS on high-power workstation platforms.

  20. Achieving ultra-high temperatures with a resistive emitter array

    NASA Astrophysics Data System (ADS)

    Danielson, Tom; Franks, Greg; Holmes, Nicholas; LaVeigne, Joe; Matis, Greg; McHugh, Steve; Norton, Dennis; Vengel, Tony; Lannon, John; Goodwin, Scott

    2016-05-01

    The rapid development of very-large format infrared detector arrays has challenged the IR scene projector community to also develop larger-format infrared emitter arrays to support the testing of systems incorporating these detectors. In addition to larger formats, many scene projector users require much higher simulated temperatures than can be generated with current technology in order to fully evaluate the performance of their systems and associated processing algorithms. Under the Ultra High Temperature (UHT) development program, Santa Barbara Infrared Inc. (SBIR) is developing a new infrared scene projector architecture capable of producing both very large format (>1024 x 1024) resistive emitter arrays and improved emitter pixel technology capable of simulating very high apparent temperatures. During earlier phases of the program, SBIR demonstrated materials with MWIR apparent temperatures in excess of 1400 K. New emitter materials have subsequently been selected to produce pixels that achieve even higher apparent temperatures. Test results from pixels fabricated using the new material set will be presented and discussed. A 'scalable' Read In Integrated Circuit (RIIC) is also being developed under the same UHT program to drive the high temperature pixels. This RIIC will utilize through-silicon via (TSV) and Quilt Packaging (QP) technologies to allow seamless tiling of multiple chips to fabricate very large arrays, and thus overcome the yield limitations inherent in large-scale integrated circuits. Results of design verification testing of the completed RIIC will be presented and discussed.
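    As a hedged aside (not part of the record above), the mapping between band-integrated radiance and "apparent temperature" that such projector specifications rely on can be sketched by integrating Planck's law over a nominal 3-5 µm MWIR band and inverting it numerically; the band edges, grid density, and bisection bounds below are illustrative assumptions, not SBIR's calibration procedure.

```python
import numpy as np

H = 6.62607015e-34   # Planck constant (J s)
C = 2.99792458e8     # speed of light (m/s)
KB = 1.380649e-23    # Boltzmann constant (J/K)

def planck_spectral_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance, W / (m^2 sr m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return a / np.expm1(H * C / (wavelength_m * KB * temp_k))

def inband_radiance(temp_k, lo=3.0e-6, hi=5.0e-6, n=2001):
    """Blackbody radiance integrated over a nominal MWIR band [lo, hi]
    with the trapezoid rule."""
    wl = np.linspace(lo, hi, n)
    rad = planck_spectral_radiance(wl, temp_k)
    return float(np.sum(0.5 * (rad[1:] + rad[:-1]) * np.diff(wl)))

def apparent_temperature(target_radiance, t_lo=200.0, t_hi=3000.0, tol=1e-3):
    """Invert in-band radiance to an apparent temperature by bisection;
    valid because in-band radiance grows monotonically with temperature."""
    while t_hi - t_lo > tol:
        mid = 0.5 * (t_lo + t_hi)
        if inband_radiance(mid) < target_radiance:
            t_lo = mid
        else:
            t_hi = mid
    return 0.5 * (t_lo + t_hi)
```

An emitter pixel's "MWIR apparent temperature" is then the blackbody temperature whose in-band radiance matches the pixel's measured in-band output.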

  1. A model-based Bayesian solution for characterization of complex damage scenarios in aerospace composite structures.

    PubMed

    Reed, H; Leckey, Cara A C; Dick, A; Harvey, G; Dobson, J

    2018-01-01

    Ultrasonic damage detection and characterization is commonly used in nondestructive evaluation (NDE) of aerospace composite components. In recent years there has been increased development of guided-wave-based methods. In real materials and structures, these dispersive waves exhibit complicated behavior in the presence of complex damage scenarios. Model-based characterization methods utilize accurate three-dimensional finite element models (FEMs) of guided wave interaction with realistic damage scenarios to aid in defect identification and classification. This work describes an inverse solution for realistic composite damage characterization that compares the wavenumber-frequency spectra of experimental and simulated ultrasonic inspections. The composite laminate material properties are first verified through a Bayesian solution (Markov chain Monte Carlo), enabling uncertainty quantification surrounding the characterization. The FEM is then parameterized with a damage model capable of describing the typical complex damage created by impact events in composites. A study is undertaken to assess the efficacy of the proposed damage model and the comparative metrics between the experimental and simulated output. The damage is characterized through a transdimensional Markov chain Monte Carlo solution, enabling a flexible damage model capable of adapting to the complex damage geometry investigated here. The posterior probability distributions of the individual delamination petals as well as the overall envelope of the damage site are determined. Copyright © 2017 Elsevier B.V. All rights reserved.
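    The Markov chain Monte Carlo machinery referenced above can be illustrated with a minimal random-walk Metropolis sampler. This is a toy sketch, not the authors' transdimensional solution: the scalar "radius" damage parameter, the square-root stand-in for the FEM forward model, and the noise level are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_posterior(radius, observed, noise_sigma=0.05):
    """Gaussian log-likelihood of a toy forward model plus a flat prior
    on (0, 10); -inf outside the prior support."""
    if not 0.0 < radius < 10.0:
        return -np.inf
    predicted = np.sqrt(radius)          # stand-in for an FEM forward solve
    return -0.5 * ((observed - predicted) / noise_sigma) ** 2

def metropolis(observed, n_steps=5000, step=0.3):
    """Random-walk Metropolis sampler for the scalar damage parameter."""
    x = 1.0
    lp = log_posterior(x, observed)
    samples = []
    for _ in range(n_steps):
        prop = x + step * rng.normal()
        lp_prop = log_posterior(prop, observed)
        # Accept with probability min(1, posterior ratio)
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x)
    return np.array(samples)
```

With a simulated observation of 2.0 (i.e., a true radius of 4.0), the post-burn-in samples concentrate near 4.0, giving the kind of posterior distribution the abstract reports for delamination petals.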

  2. NON-LTE INVERSIONS OF THE Mg ii h and k AND UV TRIPLET LINES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De la Cruz Rodríguez, Jaime; Leenaarts, Jorrit; Ramos, Andrés Asensio

    The Mg ii h and k lines are powerful diagnostics for studying the solar chromosphere. They have become particularly popular with the launch of the Interface Region Imaging Spectrograph (IRIS) satellite, and a number of studies that include these lines have led to great progress in understanding chromospheric heating, in many cases thanks to the support of 3D MHD simulations. In this study, we utilize another approach to analyze observations: non-LTE inversions of the Mg ii h and k and UV triplet lines including the effects of partial redistribution. Our inversion code attempts to construct a model atmosphere that is compatible with the observed spectra. We have assessed the capabilities and limitations of the inversions using the FALC atmosphere and a snapshot from a 3D radiation-MHD simulation. We find that Mg ii h and k allow reconstructing a model atmosphere from the middle photosphere to the transition region. We have also explored the capabilities of a multi-line/multi-atom setup, including the Mg ii h and k, the Ca ii 854.2 nm, and the Fe i 630.25 nm lines, to recover the full stratification of physical parameters, including the magnetic field vector, from the photosphere to the chromosphere. Finally, we present the first inversions of observed IRIS spectra from quiet-Sun, plage, and sunspot regions, with very promising results.

  3. CATS Near Real Time Data Products: Applications for Assimilation Into the NASA GEOS-5 AGCM

    NASA Technical Reports Server (NTRS)

    Hlavka, D. L.; Nowottnick, E. P.; Yorks, J. E.; Da Silva, A.; McGill, M. J.; Palm, S. P.; Selmer, P. A.; Pauly, R. M.; Ozog, S.

    2017-01-01

    From February 2015 through October 2017, the NASA Cloud-Aerosol Transport System (CATS) backscatter lidar operated on the International Space Station (ISS) as a technology demonstration for future Earth Science missions, providing vertical measurements of cloud and aerosol properties. Owing to its location on the ISS, a cornerstone technology demonstration of CATS was the capability to acquire, process, and disseminate near-real-time (NRT) data within 6 hours of observation time. CATS NRT data have several applications, including providing notification of hazardous events for air traffic control and air quality advisories, supporting field campaign flight planning, and constraining cloud and aerosol distributions via data assimilation in aerosol transport models. Recent developments in aerosol data assimilation techniques have permitted the assimilation of aerosol optical thickness (AOT), a 2-dimensional column-integrated quantity that reflects the simulated aerosol loading in aerosol transport models. While this capability has greatly improved simulated AOT forecasts, the vertical position, a key control on aerosol transport, is often not impacted when 2-D AOT is assimilated. Here, we present preliminary efforts to assimilate CATS aerosol observations into the NASA Goddard Earth Observing System version 5 (GEOS-5) atmospheric general circulation model and assimilation system using a 1-D variational (1-D Var) ensemble approach, demonstrating the utility of CATS for future Earth Science missions.
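    A minimal sketch of the linear 1-D Var update mentioned above, assuming Gaussian background and observation errors so the cost-function minimiser has the closed form xa = xb + B Hᵀ (H B Hᵀ + R)⁻¹ (y − H xb). The three-layer "extinction profile" in the test is invented for illustration and is unrelated to the actual GEOS-5/CATS configuration.

```python
import numpy as np

def one_d_var_analysis(xb, B, y, H, R):
    """Analytic minimiser of the linear 1D-Var cost function
    J(x) = 1/2 (x-xb)^T B^-1 (x-xb) + 1/2 (y-Hx)^T R^-1 (y-Hx),
    i.e. the analysis xa = xb + K (y - H xb) with Kalman-style gain K."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
    return xb + K @ (y - H @ xb)
```

With a column-AOT observation operator H = [1, 1, 1] and an isotropic background covariance, the analysis spreads the AOT increment evenly over the layers; a lidar-profile H would instead place the increment at the observed heights, which is exactly the vertical constraint the abstract is after.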

  4. Toward reliable retrieval of functional information of papillary dermis using spatially resolved diffuse reflectance spectroscopy.

    PubMed

    Chen, Yu-Wen; Guo, Jun-Yen; Tzeng, Shih-Yu; Chou, Ting-Chun; Lin, Ming-Jen; Huang, Lynn Ling-Huei; Yang, Chao-Chun; Hsu, Chao-Kai; Tseng, Sheng-Hao

    2016-02-01

    Spatially resolved diffuse reflectance spectroscopy (SRDRS) has been employed to quantify tissue optical properties; its interrogation volume is largely controlled by the source-to-detector separations (SDSs). To noninvasively quantify the properties of dermis, an SRDRS setup that includes SDSs shorter than 1 mm is required. This study demonstrates that Monte Carlo simulations employing the Henyey-Greenstein phase function cannot always precisely predict experimentally measured diffuse reflectance at such short SDSs, and we speculate that this could be caused by non-negligible backward light scattering at short SDSs that cannot be properly modeled by the Henyey-Greenstein phase function. To accurately recover the optical properties and functional information of dermis using SRDRS, we propose the use of the modified two-layer (MTL) geometry. Monte Carlo simulations and phantom experiment results revealed that the MTL probing geometry was capable of faithfully recovering the optical properties of the upper dermis. The capability of the MTL geometry in probing the upper dermis was further verified through a swine study, in which the measurement results were reasonably linked to histological findings. Finally, the MTL probe was utilized to study psoriatic lesions. Our results showed that the MTL probe was sensitive to the physiological condition of tissue volumes within the papillary dermis and could be used in studying the physiology of psoriasis.
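    The Henyey-Greenstein phase function discussed above has a closed-form inverse CDF, which is how Monte Carlo photon-transport codes typically draw scattering angles. A minimal sketch (illustrative, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_hg_cos_theta(g, n):
    """Draw n scattering-angle cosines from the Henyey-Greenstein phase
    function with anisotropy factor g by inverting its CDF
    (falls back to isotropic sampling when g == 0)."""
    xi = rng.random(n)
    if abs(g) < 1e-8:
        return 2.0 * xi - 1.0
    s = (1.0 - g * g) / (1.0 - g + 2.0 * g * xi)
    return (1.0 + g * g - s * s) / (2.0 * g)
```

A quick sanity check on the sampler is that the mean cosine of the drawn angles equals g; the paper's point is that even a correctly sampled HG function under-represents the backward scattering that matters at sub-millimetre SDSs.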

  5. Analyzing and Visualizing Cosmological Simulations with ParaView

    NASA Astrophysics Data System (ADS)

    Woodring, Jonathan; Heitmann, Katrin; Ahrens, James; Fasel, Patricia; Hsu, Chung-Hsing; Habib, Salman; Pope, Adrian

    2011-07-01

    The advent of large cosmological sky surveys—ushering in the era of precision cosmology—has been accompanied by ever larger cosmological simulations. The analysis of these simulations, which currently encompass tens of billions of particles and up to a trillion particles in the near future, is often as daunting as carrying out the simulations in the first place. Therefore, the development of very efficient analysis tools combining qualitative and quantitative capabilities is a matter of some urgency. In this paper, we introduce new analysis features implemented within ParaView, a fully parallel, open-source visualization toolkit, to analyze large N-body simulations. A major aspect of ParaView is that it can live and operate on the same machines and utilize the same parallel power as the simulation codes themselves. In addition, data movement is already a serious bottleneck and will become even more of an issue in the future; an interactive visualization and analysis tool that can handle data in situ is fast becoming essential. The new features in ParaView include particle readers and a very efficient halo finder that identifies friends-of-friends halos and determines common halo properties, including spherical overdensity properties. In combination with many other functionalities already existing within ParaView, such as histogram routines or interfaces to programming languages like Python, this enhanced version enables fast, interactive, and convenient analyses of large cosmological simulations. In addition, development paths are available for future extensions.
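    The friends-of-friends grouping mentioned above can be sketched with a naive O(n²) union-find pass; production halo finders such as ParaView's use spatial data structures and run in parallel, so this is only a conceptual illustration.

```python
import numpy as np

def friends_of_friends(positions, linking_length):
    """Group particles into halos: any pair closer than the linking
    length belongs to the same halo, transitively (union-find)."""
    n = len(positions)
    parent = list(range(n))

    def find(i):
        # Path-halving root lookup.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(positions[i] - positions[j]) < linking_length:
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())
```

Note the transitivity: two particles can land in the same halo without being direct "friends," as long as a chain of sub-linking-length hops connects them.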

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, R.D.

    The results of a research effort to develop a multiphase, naturally fractured, lenticular reservoir simulator are presented. The simulator possesses the capability of investigating the effects of non-Darcy flow, the Klinkenberg effect, and transient multiphase wellbore storage for wells with finite- and infinite-conductivity fractures. The simulator has been utilized to simulate actual pressure transient data for gas wells associated with the United States Department of Energy, Western Gas Sands Project, MWX experiments. The results of these simulations are contained in the report, as well as simulation results for hypothetical wells producing under multiphase flow conditions. In addition to the reservoir simulator development and the theoretical and field case studies, the results of an experimental program to investigate multiphase non-Darcy flow coefficients (inertial resistance coefficients, or beta factors, as they are sometimes called) are also presented. The experimental data were obtained for non-Darcy flow in porous and fractured media. The results clearly indicate the dependence of the non-Darcy flow coefficient upon liquid saturation. Where appropriate, comparisons are made against data available in the open literature. In addition, the theoretical development of a correlation to predict non-Darcy flow coefficients as a function of effective gas permeability, liquid saturation, and porosity is presented. The results presented in this report will provide scientists and engineers tools to investigate well performance data and production trends for wells completed in lenticular, naturally fractured formations producing under non-Darcy, multiphase conditions. 65 refs., 57 figs., 15 tabs.
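    The beta factor discussed above is conventionally defined through the Forchheimer equation, dp/dx = (μ/k)v + βρv², so dividing the measured pressure gradient by velocity linearizes the data and a straight-line fit recovers both k and β. The fluid properties and "true" values in the test below are invented for illustration.

```python
import numpy as np

def fit_forchheimer(velocity, pressure_gradient, mu, rho):
    """Fit dp/dx = (mu/k) v + beta*rho*v^2 by regressing (dp/dx)/v on v:
    the intercept is mu/k and the slope is beta*rho.
    Returns (permeability k, non-Darcy coefficient beta)."""
    y = pressure_gradient / velocity             # = mu/k + beta*rho*v
    slope, intercept = np.polyfit(velocity, y, 1)
    return mu / intercept, slope / rho
```

In practice each liquid-saturation level would yield its own (k, β) pair, which is how the saturation dependence reported above shows up in the data.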

  7. Multi-ray medical ultrasound simulation without explicit speckle modelling.

    PubMed

    Tuzer, Mert; Yazıcı, Abdulkadir; Türkay, Rüştü; Boyman, Michael; Acar, Burak

    2018-05-04

    To develop a medical ultrasound (US) simulation method using T1-weighted magnetic resonance images (MRI) as the input that offers a compromise between low-cost ray-based and high-cost realistic wave-based simulations. The proposed method uses a novel multi-ray image formation approach with a virtual phased-array transducer probe. A domain model is built from the input MR images. Multiple virtual acoustic rays emanate from each element of the linear transducer array. Reflected and transmitted acoustic energy at discrete points along each ray is computed independently. Simulated US images are computed by fusion of the reflected energy along multiple rays from multiple transducers, while phase delays due to differences in distances to transducers are taken into account. A preliminary implementation using GPUs is presented. Preliminary results show that the multi-ray approach is capable of generating viewpoint-dependent, realistic US images with an inherent Rician-distributed speckle pattern produced automatically. The proposed simulator can reproduce shadowing artefacts and demonstrates frequency dependence apt for practical training purposes. We have also presented preliminary results towards the utilization of the method for real-time simulations. The proposed method offers a low-cost, near-real-time, wave-like simulation of realistic US images from input MR data. It can further be improved to cover pathological findings using an improved domain model, without any algorithmic updates. Such a domain model would require lesion segmentation or manual embedding of virtual pathologies for training purposes.
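    The per-interface reflection step described above can be sketched with plane-wave amplitude reflection coefficients derived from acoustic impedance mismatches. The impedance values in the test are nominal literature-style numbers (in MRayl), and the simple (1 − |r|) transmission bookkeeping is an illustrative simplification, not the paper's actual energy model.

```python
def reflection_coefficient(z1, z2):
    """Amplitude reflection coefficient at a planar boundary between
    media with acoustic impedances z1 (incident side) and z2."""
    return (z2 - z1) / (z2 + z1)

def trace_ray(impedances, incident_amplitude=1.0):
    """Echo amplitude returned from each interface along one ray,
    attenuating the onward wave as it crosses each boundary."""
    echoes = []
    amp = incident_amplitude
    for z1, z2 in zip(impedances, impedances[1:]):
        r = reflection_coefficient(z1, z2)
        echoes.append(amp * r)
        amp *= (1.0 - abs(r))     # amplitude handed to the transmitted wave
    return echoes
```

A large impedance jump (e.g., soft tissue to bone) produces a strong echo and starves everything behind it of energy, which is exactly the shadowing artefact the simulator reproduces.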

  8. Off-Gas Adsorption Model Capabilities and Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyon, Kevin L.; Welty, Amy K.; Law, Jack

    2016-03-01

    Off-gas treatment is required to reduce emissions from aqueous fuel reprocessing. Evaluating the products of innovative gas adsorption research requires increased computational simulation capability to more effectively transition from fundamental research to operational design. Early modeling efforts produced the Off-Gas SeParation and REcoverY (OSPREY) model that, while efficient in terms of computation time, was of limited value for complex systems. However, the computational and programming lessons learned in development of the initial model were used to develop Discontinuous Galerkin OSPREY (DGOSPREY), a more effective model. Initial comparisons between OSPREY and DGOSPREY show that, while OSPREY captures the initial breakthrough time reasonably well, it displays far too much numerical dispersion to accurately capture the real shape of the breakthrough curves. DGOSPREY is a much better tool, as it utilizes a more stable set of numerical methods. In addition, DGOSPREY has shown the capability to capture complex, multispecies adsorption behavior, while OSPREY currently only works for a single adsorbing species. This capability makes DGOSPREY ultimately a more practical tool for real-world simulations involving many different gas species. While DGOSPREY has initially performed very well, there is still need for improvement. The current state of DGOSPREY does not include any micro-scale adsorption kinetics and therefore assumes instantaneous adsorption. This is a major source of error in predicting water vapor breakthrough, because the kinetics of that adsorption mechanism is particularly slow. However, this deficiency can be remedied by building kinetic kernels into DGOSPREY. Another source of error in DGOSPREY stems from gaps in single-species isotherm data for gases such as Kr and Xe. Since isotherm data for each gas are currently available at only a single temperature, the model is unable to predict adsorption at temperatures outside of the data currently available. Thus, in order to improve the predictive capabilities of the model, there is a need for single-species adsorption isotherms at additional temperatures, as well as an extension of the model to include adsorption kinetics. This report provides background information about the modeling process and a path forward for further model improvement in terms of accuracy and user interface.
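    The missing micro-scale kinetics and the single-temperature isotherm gap called out above can both be illustrated with textbook models: a linear-driving-force (LDF) uptake law, and a Langmuir isotherm with a van 't Hoff temperature dependence. These are generic sketches, not the kernels planned for DGOSPREY, and all parameter values are hypothetical.

```python
import numpy as np

R_GAS = 8.314  # J / (mol K)

def ldf_uptake(q_star, k, t):
    """Linear-driving-force uptake toward equilibrium loading q*:
    dq/dt = k (q* - q), q(0) = 0  ->  q(t) = q* (1 - exp(-k t)).
    Instantaneous adsorption is the k -> infinity limit."""
    return q_star * (1.0 - np.exp(-k * t))

def langmuir_loading(pressure, temp_k, q_max, b0, dH):
    """Langmuir equilibrium loading with a van 't Hoff temperature
    dependence of the affinity coefficient (dH < 0 for adsorption),
    allowing interpolation between measured isotherm temperatures."""
    b = b0 * np.exp(-dH / (R_GAS * temp_k))
    return q_max * b * pressure / (1.0 + b * pressure)
```

The second function shows why one isotherm per gas is limiting: without a fitted dH, there is no principled way to evaluate the loading at a new temperature.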

  9. Advanced Cardiac Life Support (ACLS) utilizing Man-Tended Capability (MTC) hardware onboard Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Smith, M.; Barratt, M.; Lloyd, C.

    1992-01-01

    Because of the time and distance involved in returning a patient from space to a definitive medical care facility, the capability for Advanced Cardiac Life Support (ACLS) exists onboard Space Station Freedom. Methods: In order to evaluate the effectiveness of terrestrial ACLS protocols in microgravity, a medical team conducted simulations during parabolic flights onboard the KC-135 aircraft. The hardware planned for use during the MTC phase of the space station was utilized to increase the fidelity of the scenario and to evaluate the prototype equipment. Based on initial KC-135 testing of CPR and ACLS, changes were made to the ventricular fibrillation algorithm in order to accommodate the space environment. Other constraints on delivery of ACLS onboard the space station include crew size, minimal training, crew deconditioning, and limited supplies and equipment. Results: The delivery of ACLS in microgravity is hindered by the environment, but should be adequate. Factors specific to microgravity were identified for inclusion in the protocol, including immediate restraint of the patient and early intubation to secure the airway. External cardiac compressions of adequate force and frequency were administered using various methods. The more significant limiting factors appear to be crew training, crew size, and limited supplies. Conclusions: Although ACLS is possible in the microgravity environment, future evaluations are necessary to further refine the protocols. Proper patient and medical officer restraint is crucial prior to advanced procedures. Also, emphasis should be placed on early intubation for airway management and drug administration. Preliminary results and further testing will be utilized in the design of medical hardware, determination of crew training, and medical operations for space station and beyond.

  10. The Life Cycle Application of Intelligent Software Modeling for the First Materials Science Research Rack

    NASA Technical Reports Server (NTRS)

    Rice, Amanda; Parris, Frank; Nerren, Philip

    2000-01-01

    Marshall Space Flight Center (MSFC) has been funding development of intelligent software models to benefit payload ground operations for nearly a decade. Experience gained from simulator development and real-time monitoring and control is being applied to engineering design, testing, and operation of the First Materials Science Research Rack (MSRR-1). MSRR-1 is the first rack in a suite of three racks comprising the Materials Science Research Facility (MSRF), which will operate on the International Space Station (ISS). The MSRF will accommodate advanced microgravity investigations in areas such as solidification of metals and alloys, thermo-physical properties of polymers, crystal growth studies of semiconductor materials, and research in ceramics and glasses. The MSRR-1 is a joint venture between NASA and the European Space Agency (ESA) to study the behavior of different materials during high-temperature processing in a low-gravity environment. The planned MSRR-1 mission duration is five (5) years on-orbit and the total design life is ten (10) years. The MSRR-1 launch is scheduled on the third Utilization Flight (UF-3) to the ISS, currently planned for February 2003. The objective of MSRR-1 is to provide an early capability on the ISS to conduct materials science, materials technology, and space product research investigations in microgravity. It will provide a modular, multi-user facility for microgravity research in materials crystal growth and solidification. An intelligent software model of MSRR-1 is under development and will serve multiple purposes to support the engineering analysis, testing, training, and operational phases of the MSRR-1 life cycle. The G2 real-time expert system software environment developed by Gensym Corporation was selected as the intelligent system shell for this development work, based on past experience and the effectiveness of the programming environment.
Our approach of making multiple uses of the simulation model and its intuitive graphics capabilities provides a concurrent engineering environment for rapid prototyping and development. Operational schematics of the MSRR-1 electrical, thermal control, vacuum access, and gas supply systems, and furnace inserts, are represented graphically in the environment. Logic representing first-order engineering calculations is coded into the knowledge base to simulate the operational behavior of the MSRR-1 systems. Examples of the engineering data provided include electrical currents, voltages, operational power, temperatures, thermal fluid flow rates, pressures, and component status indications. These types of data are calculated and displayed at appropriate instrumentation points, and the schematics are animated to reflect the simulated operational status of the MSRR-1. The software control functions are also simulated to represent appropriate operational behavior based on automated control and response to commands received from the crew or ground controllers. The first benefit of this simulation environment is being realized in the high-fidelity engineering analysis results from the electrical power system G2 model. Second, the MSRR-1 simulation model will be embedded with a hardware mock-up of the MSRR-1 to provide crew training on MSRR-1 integrated payload operations. G2 gateway code will output the simulated instrumentation values, termed telemetry, in a flight-like data stream so that the crew has realistic and accurate simulated MSRR-1 data on the flight displays designed for crew use. The simulation will also respond appropriately to crew- or ground-initiated commands, which will be part of normal facility operations. A third use of the G2 model is being planned: the MSRR-1 simulation will be integrated with additional software code as part of the test configuration of the primary onboard computer, or Master Controller, for MSRR-1.
We will take advantage of the G2 capability to simulate the flight-like data stream to test flight software responses and behavior. A fourth use of the G2 model will be to train the ground support personnel who will monitor the MSRR-1 systems and payloads while they are operating aboard the ISS. The intuitive, schematic-based environment will provide an excellent foundation for personnel to understand the integrated configuration and operation of the MSRR-1, and the anticipated telemetry feedback based on the operational modes of the equipment. Expert monitoring features will be enhanced to provide a smart monitoring environment for the operators. These features include: (1) animated, intuitive schematic-based displays which reflect telemetry values, (2) real-time plotting of simulated or incoming sensor values, (3) high/low exception monitoring for analog data, (4) expected-state monitoring for discrete data, (5) data trending, (6) automated malfunction procedure execution to diagnose problems, and (7) look-ahead capability to planned MSRR-1 activities in the onboard timeline. Finally, the logic to calculate telemetry values will be deactivated, and the same environment will interface to the incoming data from the real-time telemetry stream to schematically represent the onboard hardware configuration. G2 will be the foundation for the real-time monitoring and control environment. In summary, our MSRR-1 simulation model spans many elements of the life cycle of this project: engineering analysis, test and checkout, training of crew and ground personnel, and real-time monitoring and control. By utilizing the unique features afforded by an expert system development environment, we have been able to build a powerful tool capable of addressing our project needs at every phase of development.
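    Feature (3) in the list above, high/low exception monitoring for analog data, reduces to a simple band check against configured limits. A hedged sketch in Python (G2 itself uses its own rule language; the channel names and limit values here are invented for illustration):

```python
def check_telemetry(readings, limits):
    """High/low exception monitoring: flag any analog reading that falls
    outside its configured [low, high] band.
    readings: {channel_name: value}; limits: {channel_name: (low, high)}."""
    alarms = []
    for name, value in readings.items():
        low, high = limits[name]
        if value < low:
            alarms.append((name, value, "LOW"))
        elif value > high:
            alarms.append((name, value, "HIGH"))
    return alarms
```

The same check runs unchanged whether the values come from the simulation's calculated telemetry or, later, from the live downlink stream, which is the reuse the paragraph above describes.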

  11. Image simulation and assessment of the colour and spatial capabilities of the Colour and Stereo Surface Imaging System (CaSSIS) on the ExoMars Trace Gas Orbiter

    USGS Publications Warehouse

    Tornabene, Livio L.; Seelos, Frank P.; Pommerol, Antoine; Thomas, Nicolas; Caudill, Christy M.; Becerra, Patricio; Bridges, John C.; Byrne, Shane; Cardinale, Marco; Chojnacki, Matthew; Conway, Susan J.; Cremonese, Gabriele; Dundas, Colin M.; El-Maarry, M. R.; Fernando, Jennifer; Hansen, Candice J.; Hansen, Kayle; Harrison, Tanya N.; Henson, Rachel; Marinangeli, Lucia; McEwen, Alfred S.; Pajola, Maurizio; Sutton, Sarah S.; Wray, James J.

    2018-01-01

    This study aims to assess the spatial and visible/near-infrared (VNIR) colour/spectral capabilities of the 4-band Colour and Stereo Surface Imaging System (CaSSIS) aboard the ExoMars 2016 Trace Gas Orbiter (TGO). The instrument response functions of the CaSSIS imager were used to resample spectral libraries and modelled spectra, and to construct spectrally (i.e., in I/F space) and spatially consistent simulated CaSSIS image cubes of various key sites of interest for ongoing scientific investigations on Mars. Coordinated datasets from the Mars Reconnaissance Orbiter (MRO) are ideal, and were specifically used for simulating CaSSIS. The Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) provides colour information, while the Context Camera (CTX), and in a few cases the High-Resolution Imaging Science Experiment (HiRISE), provides the complementary spatial information at the resampled CaSSIS unbinned/unsummed pixel resolution (4.6 m/pixel from a 400-km altitude). The methodology used herein employs a Gram-Schmidt spectral sharpening algorithm to combine the ∼18–36 m/pixel CRISM-derived CaSSIS colours with I/F images primarily derived from oversampled CTX images. One hundred and eighty-one simulated CaSSIS 4-colour image cubes (at 18–36 m/pixel) were generated (including one of Phobos) based on CRISM data. From these, thirty-three "fully" simulated image cubes of thirty unique locations on Mars (i.e., with 4 colour bands at 4.6 m/pixel) were made. All simulated image cubes were used to test the colour capabilities of CaSSIS by producing standard colour RGB images, colour band ratio composites (CBRCs) and spectral parameters. Simulated CaSSIS CBRCs demonstrated that CaSSIS will be able to readily isolate signatures related to ferrous (Fe2+) iron- and ferric (Fe3+) iron-bearing deposits on the surface of Mars, ices, and atmospheric phenomena.
Despite the lower spatial resolution of CaSSIS compared to HiRISE, the results of this work demonstrate that CaSSIS will not only complement HiRISE-scale studies of various geological and seasonal phenomena, it will also enhance them by providing additional colour and geologic context through its wider and longer full-colour coverage (∼9.4 × 50 km) and its increased sensitivity to iron-bearing materials from its two IR bands (RED and NIR). In a few examples, subtle surface changes that were not easily detected by HiRISE were identified in the simulated CaSSIS images. This study also demonstrates the utility of the Gram-Schmidt spectral pan-sharpening technique for extending VNIR colour/spectral capabilities from a lower-spatial-resolution colour/spectral dataset to a higher-resolution single-band or panchromatic greyscale image. These higher-resolution colour products (simulated CaSSIS or otherwise) are useful as a means to extend both geologic context and mapping of datasets with coarser spatial resolutions. The results of this study indicate that the TGO mission objectives, as well as the instrument-specific mission objectives, will be achievable with CaSSIS.
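    Gram-Schmidt-style sharpening of the kind referenced above injects the detail difference between a real high-resolution panchromatic image and a synthetic pan built from the colour bands, scaled by a covariance-based gain per band. The sketch below is a simplified illustration of that idea, not the study's actual processing chain (which operates on calibrated I/F data).

```python
import numpy as np

def gs_pansharpen(color_bands, pan):
    """Simplified Gram-Schmidt-style sharpening.
    color_bands: array of shape (n_bands, H, W) upsampled to the pan grid;
    pan: array of shape (H, W). A synthetic low-resolution pan is built
    from the bands, and the real-minus-synthetic detail is injected into
    each band with a covariance-based gain."""
    synth = color_bands.mean(axis=0)              # synthetic pan image
    detail = pan - synth
    var = synth.var() + 1e-12                     # guard against flat scenes
    out = []
    for band in color_bands:
        gain = np.cov(band.ravel(), synth.ravel())[0, 1] / var
        out.append(band + gain * detail)
    return np.stack(out)
```

A useful property of this injection scheme is that when the pan image carries no extra detail (pan equals the synthetic pan), the bands pass through unchanged.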

  12. Modeling and simulation challenges pursued by the Consortium for Advanced Simulation of Light Water Reactors (CASL)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turinsky, Paul J., E-mail: turinsky@ncsu.edu; Kothe, Douglas B., E-mail: kothe@ornl.gov

    The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, all while assuring nuclear safety. Accomplishing this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance, and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long-standing issues of the nuclear power industry that M&S can assist in addressing. To date CASL has developed a multi-physics "core simulator" based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso, and macro. CFD R&D has focused on improvement of closure models for subcooled boiling and bubbly flow, and on the formulation of robust numerical solution algorithms. For multiphysics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling.
Industry adoption of CASL's evolving M&S capabilities, which is in progress, will assist in addressing long-standing and future operational and safety challenges of the nuclear industry. - Highlights: • Complexity of physics-based modeling of light water reactor cores is being addressed. • Capability developed to help address problems that have challenged the nuclear power industry. • Simulation capabilities developed that take advantage of high performance computing.

  13. Modern gyrokinetic particle-in-cell simulation of fusion plasmas on top supercomputers

    DOE PAGES

    Wang, Bei; Ethier, Stephane; Tang, William; ...

    2017-06-29

    The Gyrokinetic Toroidal Code at Princeton (GTC-P) is a highly scalable and portable particle-in-cell (PIC) code. It solves the 5D Vlasov-Poisson equation, featuring efficient utilization of modern parallel computer architectures at the petascale and beyond. Motivated by the goal of developing a modern code capable of dealing with the physics challenge of increasing problem size with sufficient resolution, new thread-level optimizations have been introduced, as well as a key additional domain decomposition. GTC-P's multiple levels of parallelism, including inter-node 2D domain decomposition and particle decomposition, as well as intra-node shared memory partition and vectorization, have enabled pushing the scalability of the PIC method to extreme computational scales. In this paper, we describe the methods developed to build a highly parallelized PIC code across a broad range of supercomputer designs. This particularly includes implementations on heterogeneous systems using NVIDIA GPU accelerators and Intel Xeon Phi (MIC) co-processors, and performance comparisons with state-of-the-art homogeneous HPC systems such as Blue Gene/Q. New discovery science capabilities in the magnetic fusion energy application domain are enabled, including investigations of Ion-Temperature-Gradient (ITG) driven turbulence simulations with unprecedented spatial resolution and long temporal duration. Performance studies with realistic fusion experimental parameters are carried out on multiple supercomputing systems spanning a wide range of cache capacities, cache-sharing configurations, memory bandwidths, interconnects, and network topologies. These performance comparisons using a realistic discovery-science-capable domain application code provide valuable insights on optimization techniques across one of the broadest sets of current high-end computing platforms worldwide.
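    The inter-node 2D domain decomposition described above amounts to binning particles by their (radial, toroidal) position into a grid of subdomains, one per MPI rank. A serial sketch of just the binning step (illustrative only; GTC-P's real decomposition also handles ghost zones and particle migration between ranks):

```python
import numpy as np

def decompose_particles(radial, toroidal, n_radial_domains, n_toroidal_domains):
    """Assign each particle (given normalised radial and toroidal
    coordinates in [0, 1)) to a 2D spatial subdomain, returning a flat
    MPI-rank-style index per particle."""
    i = np.minimum((radial * n_radial_domains).astype(int), n_radial_domains - 1)
    j = np.minimum((toroidal * n_toroidal_domains).astype(int), n_toroidal_domains - 1)
    return i * n_toroidal_domains + j
```

Particle decomposition then further splits the particles *within* each spatial subdomain across additional ranks, which is the second level of parallelism the abstract refers to.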

  14. Modern gyrokinetic particle-in-cell simulation of fusion plasmas on top supercomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Bei; Ethier, Stephane; Tang, William

    The Gyrokinetic Toroidal Code at Princeton (GTC-P) is a highly scalable and portable particle-in-cell (PIC) code. It solves the 5D Vlasov-Poisson equation featuring efficient utilization of modern parallel computer architectures at the petascale and beyond. Motivated by the goal of developing a modern code capable of dealing with the physics challenge of increasing problem size with sufficient resolution, new thread-level optimizations have been introduced as well as a key additional domain decomposition. GTC-P's multiple levels of parallelism, including inter-node 2D domain decomposition and particle decomposition, as well as intra-node shared memory partition and vectorization, have enabled pushing the scalability of the PIC method to extreme computational scales. In this paper, we describe the methods developed to build a highly parallelized PIC code across a broad range of supercomputer designs. This particularly includes implementations on heterogeneous systems using NVIDIA GPU accelerators and Intel Xeon Phi (MIC) co-processors and performance comparisons with state-of-the-art homogeneous HPC systems such as Blue Gene/Q. New discovery science capabilities in the magnetic fusion energy application domain are enabled, including investigations of Ion-Temperature-Gradient (ITG) driven turbulence simulations with unprecedented spatial resolution and long temporal duration. Performance studies with realistic fusion experimental parameters are carried out on multiple supercomputing systems spanning a wide range of cache capacities, cache-sharing configurations, memory bandwidth, interconnects and network topologies. These performance comparisons using a realistic discovery-science-capable domain application code provide valuable insights on optimization techniques across one of the broadest sets of current high-end computing platforms worldwide.

  15. A tradeoff study to determine the optimum approach to a wash/rinse capability to support future space flight

    NASA Technical Reports Server (NTRS)

    Wilson, D. A.

    1976-01-01

    Specific requirements for a wash/rinse capability to support Spacelab biological experimentation were determined, and various concepts for achieving this capability were identified. This included examination of current state-of-the-art and emerging technology designs that would meet the wash/rinse requirements. Once several concepts were identified, including disposable utensils, tools, and gloves, as well as other possible alternatives, a tradeoff analysis involving system cost, weight, volume utilization, functional performance, maintainability, reliability, power utilization, safety, complexity, etc., was performed to determine an optimum approach for achieving a wash/rinse capability to support future space flights. Missions of varying crew sizes and durations were considered.
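
    A tradeoff analysis over criteria like those listed can be sketched as a weighted-sum scoring matrix. The criteria weights, concept names, and scores below are invented for illustration and are not taken from the report.

    ```python
    # Relative importance of each criterion (weights sum to 1.0).
    criteria = {"cost": 0.3, "weight": 0.2, "volume": 0.1,
                "performance": 0.25, "reliability": 0.15}

    # Scores on a 1-10 scale (higher is better) for two hypothetical concepts.
    concepts = {
        "disposable_utensils": {"cost": 8, "weight": 9, "volume": 6,
                                "performance": 5, "reliability": 7},
        "washer_unit":         {"cost": 4, "weight": 5, "volume": 7,
                                "performance": 9, "reliability": 8},
    }

    def weighted_score(scores):
        """Weighted sum of a concept's criterion scores."""
        return sum(criteria[c] * scores[c] for c in criteria)

    # The concept with the highest weighted score is the nominal optimum.
    best = max(concepts, key=lambda k: weighted_score(concepts[k]))
    ```

    With these made-up numbers the disposable-utensils concept scores 7.1 against 6.35 for the washer unit; changing the weights (e.g., emphasizing performance for long-duration missions) can flip the ranking, which is exactly what such a study varies.
    
    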

  16. The Space Systems Environmental Test Facility Database (SSETFD), Website Development Status

    NASA Technical Reports Server (NTRS)

    Snyder, James M.

    2008-01-01

    The Aerospace Corporation has been developing a database of U.S. environmental test laboratory capabilities utilized by the space systems hardware development community. To date, The Aerospace Corporation has visited 19 sites and reached verbal agreements to include their capability descriptions in the database. A website is being developed to make the database accessible to all interested government, civil, university, and industry personnel who want to learn more about the extensive collective capability that the US-based space industry has to offer. The Environments, Test & Assessment Department within The Aerospace Corporation will be responsible for overall coordination and maintenance of the database. Several US government agencies are interested in using the database to assist in the source selection process for future spacecraft programs. This paper introduces the website by providing an overview of its development, location, and search capabilities. It shows how the aerospace community can apply this new tool to increase the utilization of existing lab facilities and as a starting point for capital expenditure/upgrade trade studies. The long-term result is expected to be increased utilization of existing laboratory capability and reduced overall development cost of space systems hardware. Finally, the paper presents the process for adding new participants and explains how the database will be maintained.
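
    The kind of capability search such a website might expose can be sketched as a filter over facility records. The field names and example records below are hypothetical, not the actual SSETFD schema.

    ```python
    # Hypothetical facility records; real entries would carry many more fields.
    facilities = [
        {"site": "Site A", "chamber_volume_m3": 90,  "thermal_vac": True},
        {"site": "Site B", "chamber_volume_m3": 30,  "thermal_vac": True},
        {"site": "Site C", "chamber_volume_m3": 150, "thermal_vac": False},
    ]

    def find_facilities(records, min_volume=0.0, thermal_vac=None):
        """Return site names matching simple capability filters."""
        hits = []
        for r in records:
            if r["chamber_volume_m3"] < min_volume:
                continue  # chamber too small for the test article
            if thermal_vac is not None and r["thermal_vac"] != thermal_vac:
                continue  # thermal-vacuum capability mismatch
            hits.append(r["site"])
        return hits

    # A program needing a >= 50 m^3 thermal-vacuum chamber matches only Site A.
    matches = find_facilities(facilities, min_volume=50, thermal_vac=True)
    ```
    
    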

  17. Development and application of the microbial fate and transport module for the Agricultural Policy/Environmental eXtender (APEX) model

    NASA Astrophysics Data System (ADS)

    Hong, E.; Park, Y.; Muirhead, R.; Jeong, J.; Pachepsky, Y. A.

    2017-12-01

    Pathogenic microorganisms in recreational and irrigation waters remain a subject of concern. Water quality models are used to estimate the microbial quality of water sources, to evaluate microbial contamination-related risks, to guide microbial water quality monitoring, and to evaluate the effect of agricultural management on microbial water quality. The Agricultural Policy/Environmental eXtender (APEX) is a watershed-scale water quality model that includes a highly detailed representation of agricultural management. APEX currently does not have microbial fate and transport simulation capabilities. The objective of this work was to develop the first APEX microbial fate and transport module that could use the APEX conceptual model of manure removal together with recently introduced conceptualizations of in-stream microbial fate and transport. The module utilizes manure erosion rates found in APEX. Bacteria survival in the soil-manure mixing layer was simulated with the two-stage survival model. Individual survival patterns were simulated for each manure application date. Simulated in-stream microbial fate and transport processes included the reach-scale passive release of bacteria with resuspended bottom sediment during high-flow events, the transport of bacteria from bottom sediment due to hyporheic exchange during low-flow periods, deposition with settling sediment, and two-stage survival. Default parameter values were available from recently published databases. The APEX model with the newly developed microbial fate and transport module was applied to simulate seven years of monitoring data for the Toenepi watershed in New Zealand. Based on calibration and testing results, APEX with the microbe module reproduced well the monitored pattern of E. coli concentrations at the watershed outlet. The APEX model with the microbial fate and transport module will be utilized for predicting the microbial quality of water under various agricultural practices, evaluating monitoring protocols, and supporting the selection of management practices based on regulations that rely on fecal indicator bacteria concentrations.
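
    A common form of two-stage survival is biphasic exponential die-off: a fast-decaying subpopulation plus a persistent one. A minimal sketch follows, with illustrative fractions and rate constants rather than the APEX module's actual defaults.

    ```python
    import math

    def two_stage_survival(c0, t, f=0.9, k1=0.5, k2=0.05):
        """Biphasic die-off of a bacterial population.

        A fraction f decays fast (rate k1 per day) and the remainder
        persists (rate k2 per day). Parameter values are illustrative,
        not the APEX module's defaults.
        """
        return c0 * (f * math.exp(-k1 * t) + (1 - f) * math.exp(-k2 * t))

    # Concentrations decline steeply at first, then flatten as the
    # persistent subpopulation comes to dominate.
    series = [two_stage_survival(1e6, t) for t in (0, 5, 30)]
    ```

    Simulating a separate curve per manure application date, as the module does, amounts to superposing one such decay series per application.
    
    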

  18. A Virtual Ocean Observatory for Climate and Ocean Science: Synergistic Applications for SWOT and XOVWM

    NASA Astrophysics Data System (ADS)

    Arabshahi, P.; Howe, B. M.; Chao, Y.; Businger, S.; Chien, S.

    2010-12-01

    We present a virtual ocean observatory (VOO) that supports climate and ocean science as addressed in the NRC decadal survey. The VOO is composed of an autonomous software system, in-situ and space-based sensing assets, data sets, and interfaces to ocean and atmosphere models. The purposes of this observatory and its output data products are: 1) to support SWOT mission planning; 2) to serve as a vanguard for fusing SWOT, XOVWM, and in-situ data sets through fusion of OSTM (SWOT proxy) and QuikSCAT (XOVWM proxy) data with in-situ data; and 3) to serve as a feed-forward platform for high-resolution measurements of ocean surface topography (OST) in island and coastal environments utilizing space-based and in-situ adaptive sampling. The VOO will enable models capable of simulating and estimating realistic oceanic processes and atmospheric forcing of the ocean in these environments. Such measurements are critical to understanding the oceans' effects on global climate. The information systems innovations of the VOO are: 1. Development of an autonomous software platform for automated mission planning and for combining science data products of QuikSCAT and OSTM with complementary in-situ data sets to deliver new data products. This software will present first-step demonstrations of technology that, once matured, will offer increased operational capability to SWOT by providing automated planning and new science data sets using automated workflows. The future data sets to be integrated include those from SWOT and XOVWM. 2. A capstone demonstration that utilizes the elements developed in (1) above to achieve adaptive in-situ sampling through feedback from space-based assets via the SWOT simulator. This effort will directly contribute to orbit design during the experimental phase (first 6-9 months) of the SWOT mission through high-resolution regional atmospheric and ocean modeling and sampling. It will also contribute to SWOT science via integration of in-situ data, QuikSCAT and OSTM data sets, and models, thus serving as a technology pathfinder for SWOT and XOVWM data fusion; and it will contribute to SWOT operations via data fusion and mission planning technology. The goals of our project are as follows: (a) develop and test the VOO, including hardware, in-situ science platforms (Seagliders) and instruments, and two autonomous software modules: 1) automated data fusion/assimilation and 2) automated planning technology; (b) generate new data sets (OST data in the Hawaiian Islands region) from fusion of in-situ data with QuikSCAT and OSTM data; (c) integrate data sets derived from the VOO into the SWOT simulator for improved SWOT mission planning; (d) demonstrate, via Hawaiian Islands region field experiments and simulation, the operational capability of the VOO to generate improved hydrologic cycle/ocean science, in particular mesoscale and submesoscale ocean circulation, including velocity, vorticity, and stress measurements that are important to the modeling of ocean currents, eddies, and mixing.
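
    At its core, fusing two independent estimates of the same quantity (e.g., a satellite proxy and an in-situ reading) can be done by inverse-variance weighting. The assimilation actually used with OSTM and QuikSCAT data is far more sophisticated, but this conveys the building block; the numbers below are made up.

    ```python
    def fuse(est_a, var_a, est_b, var_b):
        """Inverse-variance weighted fusion of two independent estimates.

        Returns the fused estimate and its (reduced) variance.
        """
        w_a, w_b = 1.0 / var_a, 1.0 / var_b
        fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
        fused_var = 1.0 / (w_a + w_b)
        return fused, fused_var

    # Example: a satellite estimate (8.0 m/s, variance 1.0) fused with an
    # in-situ reading (7.0 m/s, variance 0.25) is pulled toward the more
    # certain in-situ value, and the fused variance drops below both inputs.
    est, var = fuse(8.0, 1.0, 7.0, 0.25)
    ```
    
    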

  19. Modular, high power, variable R dynamic electrical load simulator

    NASA Technical Reports Server (NTRS)

    Joncas, K. P.

    1974-01-01

    The design of a previously developed basic variable R load simulator was extended to increase its power dissipation and transient handling capabilities. The delivered units satisfy all design requirements and provide a high-power, modular simulation capability uniquely suited to the simulation of complex load responses. In addition to presenting conclusions, recommendations, and pertinent background information, the report covers program accomplishments; describes the simulator's basic circuits, transfer characteristic, protective features, assembly, and specifications; indicates the results of simulator evaluation, including burn-in and acceptance testing; provides acceptance test data; and summarizes the monthly progress reports.
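
    A variable R load simulator emulates a time-varying load by programming its effective resistance; the current drawn and the power the load element must dissipate then follow from Ohm's law. A minimal sketch with an assumed 28 V bus (the bus voltage and resistance steps are illustrative, not from the report):

    ```python
    def load_current(v_bus, r_load):
        """Current drawn by a resistive load at setting r_load (I = V / R)."""
        return v_bus / r_load

    def dissipated_power(v_bus, r_load):
        """Power the load element must dissipate (P = V^2 / R)."""
        return v_bus ** 2 / r_load

    # Sweeping the programmed resistance emulates a time-varying load:
    # at 28 V, stepping R from 28 to 7 ohms raises the drawn current
    # from 1 A to 4 A and the dissipated power from 28 W to 112 W.
    profile = [(r, load_current(28.0, r), dissipated_power(28.0, r))
               for r in (28.0, 14.0, 7.0)]
    ```

    The dissipation figure is what drives the power-handling extension the report describes: halving the programmed resistance doubles the heat the unit must shed.
    
    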

  20. Coupled Fluid-Structure Interaction Analysis of Solid Rocket Motor with Flexible Inhibitors

    NASA Technical Reports Server (NTRS)

    Yang, H. Q.; West, Jeff; Harris, Robert E.

    2014-01-01

    Flexible inhibitors are generally used in solid rocket motors (SRMs) as a means to control the burning of propellant. Vortices generated by the flow of propellant around the flexible inhibitors have been identified as a driving source of instabilities that can lead to thrust oscillations in launch vehicles. Potential coupling between SRM thrust oscillations and structural vibration modes is an important risk factor in launch vehicle design. As a means to predict and better understand these phenomena, a multidisciplinary simulation capability that couples the NASA production CFD code, Loci/CHEM, with CFDRC's structural finite element code, CoBi, has been developed. This capability is crucial to the development of NASA's new Space Launch System (SLS). This paper summarizes the efforts in applying the coupled software to demonstrate and investigate fluid-structure interaction (FSI) phenomena between pressure waves and flexible inhibitors inside reusable solid rocket motors (RSRMs). The features of the fluid and structural solvers are described in detail, and the coupling methodology and interfacial continuity requirements are then presented in a general Eulerian-Lagrangian framework. The simulations presented herein utilize production-level CFD with hybrid RANS/LES turbulence modeling and grid resolution in excess of 80 million cells. The fluid domain in the SRM is discretized using a general mixed polyhedral unstructured mesh, while full 3D shell elements are utilized in the structural domain for the flexible inhibitors. Verifications against analytical solutions for a structural model under a steady uniform pressure condition and under dynamic modal analysis show excellent agreement in terms of displacement distribution and eigenmode frequencies. The preliminary coupled results indicate that, due to acoustic coupling, the dynamics of one of the more flexible inhibitors shift from its first modal frequency to the first acoustic frequency of the solid rocket motor. This insight could have profound implications for SRM and flexible inhibitor designs for current and future launch vehicles, including SLS.
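
    The acoustic frequency toward which the inhibitor dynamics shift can be estimated, to first order, from the longitudinal modes of a closed-closed chamber, f_n = n·c/(2L). The sound speed and chamber length below are placeholders for illustration, not actual RSRM values.

    ```python
    def acoustic_mode_hz(n, sound_speed, length):
        """n-th longitudinal acoustic mode of a closed-closed chamber.

        f_n = n * c / (2 L); c and L must use consistent units (m/s, m).
        """
        return n * sound_speed / (2.0 * length)

    # With an assumed c = 1100 m/s and L = 35 m, the first acoustic mode
    # sits near 15.7 Hz. Lock-in of the kind the coupled results suggest
    # occurs when an inhibitor's structural mode is pulled toward this
    # acoustic frequency.
    f1 = acoustic_mode_hz(1, 1100.0, 35.0)
    ```
    
    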
