Sample records for simulation tool capable

  1. Comparisons of Kinematics and Dynamics Simulation Software Tools

    NASA Technical Reports Server (NTRS)

    Shiue, Yeu-Sheng Paul

    2002-01-01

    Kinematic and dynamic analyses of moving bodies are essential to system engineers and designers in the process of design and validation. 3D visualization and motion simulation, plus finite element analysis (FEA), give engineers a better way to present ideas and results. Marshall Space Flight Center (MSFC) system engineering researchers are currently using IGRIP from DELMIA Inc. as a kinematic simulation tool for discrete-body motion simulations. Although IGRIP is an excellent tool for kinematic simulation with some dynamic analysis capabilities in robotic control, exploration of other alternatives with more powerful dynamic analysis and FEA capabilities is necessary. Kinematic analysis examines only the displacement, velocity, and acceleration of a mechanism, without considering the effects of component masses. With dynamic analysis and FEA, effects such as the forces or torques at a joint due to the mass and inertia of components can be identified. Given keen market competition, ALGOR Mechanical Event Simulation (MES), MSC visualNastran 4D, Unigraphics Motion+, and Pro/MECHANICA were chosen for exploration. In this study, the software tools were compared in terms of the following categories: graphical user interface (GUI), import capability, tutorial availability, ease of use, kinematic simulation capability, dynamic simulation capability, FEA capability, graphical output, technical support, and cost. A Propulsion Test Article (PTA) with Fastrac engine model exported from IGRIP and an office chair mechanism were used as examples for simulations.
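    The distinction this abstract draws can be made concrete: pure kinematics yields displacement, velocity, and acceleration from the prescribed motion alone, while dynamics additionally needs mass properties to recover the joint torque. A minimal sketch follows; the pendulum mechanism, masses, and numbers are illustrative assumptions, not from the study:

```python
import math

def joint_angle(t):
    """Prescribed joint angle (rad): a 0.5 rad swing at 1 Hz (illustrative)."""
    return 0.5 * math.sin(2 * math.pi * t)

def derivatives(f, t, h=1e-4):
    """Central finite differences give velocity and acceleration of the motion."""
    vel = (f(t + h) - f(t - h)) / (2 * h)
    acc = (f(t + h) - 2 * f(t) + f(t - h)) / (h * h)
    return vel, acc

# Kinematic analysis: motion only, no masses involved.
t = 0.1
theta = joint_angle(t)
omega, alpha = derivatives(joint_angle, t)

# Dynamic analysis: a point mass m on a massless link of length L at a pivot.
m, L, g = 2.0, 0.3, 9.81                 # kg, m, m/s^2 (made-up values)
inertia = m * L * L                      # moment of inertia about the joint
tau = inertia * alpha + m * g * L * math.sin(theta)   # required joint torque

print(f"theta={theta:.4f} rad  omega={omega:.4f} rad/s  alpha={alpha:.4f} rad/s^2")
print(f"joint torque tau={tau:.4f} N*m")
```

    Only the last two lines use the mass properties; everything above them is available to a purely kinematic tool such as IGRIP.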

  2. Aviation Safety Program Atmospheric Environment Safety Technologies (AEST) Project

    NASA Technical Reports Server (NTRS)

    Colantonio, Ron

    2011-01-01

    Engine Icing: Characterization and Simulation Capability: develop knowledge bases, analysis methods, and simulation tools needed to address the problem of engine icing, in particular ice-crystal icing. Airframe Icing Simulation and Engineering Tool Capability: develop and demonstrate a 3-D capability to simulate and model airframe ice accretion and related aerodynamic performance degradation for current and future aircraft configurations in an expanded icing environment that includes freezing drizzle/rain. Atmospheric Hazard Sensing and Mitigation Technology Capability: improve and expand remote sensing and mitigation of hazardous atmospheric environments and phenomena.

  3. Modeling and Simulation Tools for Heavy Lift Airships

    NASA Technical Reports Server (NTRS)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed-wing and rotary-wing aircraft, a variety of modeling and simulation tools have been developed to give designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats differ in important respects from heavier-than-air (HTA) vehicles. To account for these differences, modifications to the standard design tools are required to fully characterize LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. This limited LTA tool set could be expanded by leveraging existing modeling and simulation capabilities available in the national laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools would mitigate the reliance on the proprietary LTA design tools in use today. A set of well-researched, open-source, high-fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper presents the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available are surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for developing new modeling and analysis capabilities to supplement contemporary tools are also presented.

  4. Combining Simulation Tools for End-to-End Trajectory Optimization

    NASA Technical Reports Server (NTRS)

    Whitley, Ryan; Gutkowski, Jeffrey; Craig, Scott; Dawn, Tim; Williams, Jacobs; Stein, William B.; Litton, Daniel; Lugo, Rafael; Qu, Min

    2015-01-01

    Trajectory simulations with advanced optimization algorithms are invaluable tools in the process of designing spacecraft. Because complex models are needed, simulations are often highly tailored to the needs of a particular program or mission. NASA's Orion and SLS programs are no exception. While independent analyses are valuable for assessing individual spacecraft capabilities, a complete end-to-end trajectory from launch to splashdown maximizes potential performance and ensures a continuous solution. To obtain this end-to-end capability, Orion's in-space tool (Copernicus) was made to interface directly with SLS's ascent tool (POST2), and a new tool was developed that optimizes the full problem by operating both simulations simultaneously.

  5. Simulator for concurrent processing data flow architectures

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.; Stoughton, John W.; Mielke, Roland R.

    1992-01-01

    A software simulator capable of simulating the execution of an algorithm graph on a given system under the Algorithm to Architecture Mapping Model (ATAMM) rules is presented. ATAMM is capable of modeling the execution of large-grained algorithms on distributed data flow architectures. Investigating the behavior and determining the performance of an ATAMM-based system requires the aid of software tools. The ATAMM Simulator presented here can determine the performance of a system without a hardware prototype having to be built. Case studies are performed on four algorithms to demonstrate the capabilities of the ATAMM Simulator. Simulated results are shown to be comparable to the experimental results of the Advanced Development Model System.
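    The kind of execution such a simulator models can be sketched as a small discrete-event loop: a graph node fires once all of its inputs have arrived and a processor is free. This is an illustrative sketch only, not the actual ATAMM rules; the graph, node durations, and two-processor pool are made up:

```python
import heapq

graph = {            # node -> (duration, list of successor nodes); illustrative
    "A": (3, ["B", "C"]),
    "B": (2, ["D"]),
    "C": (4, ["D"]),
    "D": (1, []),
}
indegree = {n: 0 for n in graph}
for _, (_, succs) in graph.items():
    for s in succs:
        indegree[s] += 1

PROCESSORS = 2
ready = [n for n, d in indegree.items() if d == 0]   # nodes with all inputs
events = []                                          # (finish_time, node) heap
clock, busy, finish_times = 0, 0, {}

while ready or events:
    # Start as many ready nodes as there are idle processors.
    while ready and busy < PROCESSORS:
        node = ready.pop(0)
        busy += 1
        heapq.heappush(events, (clock + graph[node][0], node))
    # Advance the clock to the next completion event.
    clock, node = heapq.heappop(events)
    busy -= 1
    finish_times[node] = clock
    for s in graph[node][1]:
        indegree[s] -= 1
        if indegree[s] == 0:
            ready.append(s)

print(finish_times)          # completion time of each node
print("makespan:", clock)    # total time on the simulated architecture
```

    Varying the processor count in such a loop is how performance can be estimated without building a hardware prototype.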

  6. Development of Virtual Airspace Simulation Technology - Real-Time (VAST-RT) Capability 2 and Experimental Plans

    NASA Technical Reports Server (NTRS)

    Lehmer, R.; Ingram, C.; Jovic, S.; Alderete, J.; Brown, D.; Carpenter, D.; LaForce, S.; Panda, R.; Walker, J.; Chaplin, P.

    2006-01-01

    The Virtual Airspace Simulation Technology - Real-Time (VAST-RT) Project, an element of NASA's Virtual Airspace Modeling and Simulation (VAMS) Project, has been developing a distributed simulation capability that supports an extensible and expandable real-time, human-in-the-loop airspace simulation environment. The VAST-RT system architecture is based on the DoD High Level Architecture (HLA) and the VAST-RT HLA Toolbox, a common interface implementation that incorporates a number of novel design features. The scope of the initial VAST-RT integration activity (Capability 1) included the high-fidelity human-in-the-loop simulation facilities located at NASA Ames Research Center and medium-fidelity pseudo-piloted target generators, such as the Airspace Traffic Generator (ATG) being developed as part of VAST-RT, as well as other real-time tools. This capability has been demonstrated in a gate-to-gate simulation. VAST-RT Capability 2A has recently been completed, and this paper discusses the improved integration of the real-time assets into VAST-RT, including the development of tools to integrate data collected across the simulation environment into a single data set for the researcher. Current plans for the completion of the VAST-RT distributed simulation environment (Capability 2B) and its use to evaluate future airspace-capacity-enhancing concepts being developed by VAMS are discussed. Additionally, the simulation environment's application to other airspace and airport research projects is addressed.

  7. Fast scattering simulation tool for multi-energy x-ray imaging

    NASA Astrophysics Data System (ADS)

    Sossin, A.; Tabary, J.; Rebuffel, V.; Létang, J. M.; Freud, N.; Verger, L.

    2015-12-01

    A combination of Monte Carlo (MC) and deterministic approaches was employed to create a simulation tool capable of providing energy-resolved x-ray primary and scatter images within a reasonable time interval. Libraries from Sindbad, a previously developed x-ray simulation software package, were used in the development. The scatter simulation capabilities of the tool were validated through simulation with the aid of GATE and through experimentation using a spectrometric CdTe detector. A simple cylindrical phantom with cavities and an aluminum insert was used. Cross-validation with GATE showed good agreement, with a global spatial error of 1.5% and a maximum scatter spectrum error of around 6%. Experimental validation also supported the accuracy of the simulations obtained from the developed software, with a global spatial error of 1.8% and a maximum error of around 8.5% in the scatter spectra.

  8. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2013-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time-consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code; the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphics processing unit and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
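    The sorting task such a tool performs can be sketched simply: given thousands of dispersed Monte Carlo runs, flag which runs violate which requirement so the analyst can jump straight to the failing cases. The metric names, limits, and distributions below are illustrative assumptions, not Orion's:

```python
import random

random.seed(7)
N_RUNS = 10_000
requirements = {                      # metric -> (lower, upper) allowed band
    "landing_speed_mps": (0.0, 9.0),
    "touchdown_angle_deg": (-8.0, 8.0),
}

# Dispersed outcomes: each run draws its metrics from wide distributions.
runs = [
    {
        "landing_speed_mps": random.gauss(7.5, 1.0),
        "touchdown_angle_deg": random.gauss(0.0, 4.0),
    }
    for _ in range(N_RUNS)
]

# Map each requirement to the list of run indices that violate it.
failures = {name: [] for name in requirements}
for i, run in enumerate(runs):
    for name, (lo, hi) in requirements.items():
        if not lo <= run[name] <= hi:
            failures[name].append(i)

for name, idx in failures.items():
    print(f"{name}: {len(idx)} failing runs ({100 * len(idx) / N_RUNS:.1f}%)")
```

    The parallel/GPU version of this pattern simply partitions the runs across threads, since each run is checked independently.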

  9. Coupling the Multizone Airflow and Contaminant Transport Software CONTAM with EnergyPlus Using Co-Simulation.

    PubMed

    Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J

    2016-08-01

    Building modelers need simulation tools capable of simultaneously considering building energy use, airflow and indoor air quality (IAQ) to design and evaluate the ability of buildings and their systems to meet today's demanding energy efficiency and IAQ performance requirements. CONTAM is a widely-used multizone building airflow and contaminant transport simulation tool that requires indoor temperatures as input values. EnergyPlus is a prominent whole-building energy simulation program capable of performing heat transfer calculations that require interzone and infiltration airflows as input values. On their own, each tool is limited in its ability to account for thermal processes upon which building airflow may be significantly dependent and vice versa. This paper describes the initial phase of coupling of CONTAM with EnergyPlus to capture the interdependencies between airflow and heat transfer using co-simulation that allows for sharing of data between independently executing simulation tools. The coupling is accomplished based on the Functional Mock-up Interface (FMI) for Co-simulation specification that provides for integration between independently developed tools. A three-zone combined heat transfer/airflow analytical BESTEST case was simulated to verify the co-simulation is functioning as expected, and an investigation of a two-zone, natural ventilation case designed to challenge the coupled thermal/airflow solution methods was performed.
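    The coupling described here can be sketched as a loose co-simulation loop in which each tool advances a time step using the other's most recent output: the energy model consumes airflows, the airflow model consumes temperatures. The toy models and coefficients below are illustrative stand-ins, not CONTAM, EnergyPlus, or the FMI API:

```python
def energy_step(t_zone, t_out, airflow, dt=60.0):
    """Toy zone energy balance: infiltration pulls the zone toward outdoors."""
    ua, cap = 50.0, 5.0e5                            # W/K envelope, J/K capacitance
    q = (ua + 1200.0 * airflow) * (t_out - t_zone)   # ~1200 J/(m^3 K) for air
    return t_zone + dt * q / cap

def airflow_step(t_zone, t_out):
    """Toy stack-driven infiltration: flow grows with indoor-outdoor delta-T."""
    return 0.02 * abs(t_zone - t_out) ** 0.5         # m^3/s

t_zone, t_out, airflow = 20.0, 0.0, 0.0
for step in range(60):                               # one simulated hour, 60 s steps
    # Loose coupling: each model advances using the other's last output.
    t_zone = energy_step(t_zone, t_out, airflow)
    airflow = airflow_step(t_zone, t_out)

print(f"zone temperature after 1 h: {t_zone:.2f} C, airflow {airflow:.4f} m3/s")
```

    In the actual FMI-based coupling, each side is an independently executing simulator and the data exchange happens through the co-simulation master rather than direct function calls.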

  10. Advanced Simulation and Computing Fiscal Year 14 Implementation Plan, Rev. 0.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meisner, Robert; McCoy, Michel; Archer, Bill

    2013-09-11

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced mix of resources, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Moreover, ASC's business model is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools.

  11. JIMM: the next step for mission-level models

    NASA Astrophysics Data System (ADS)

    Gump, Jamieson; Kurker, Robert G.; Nalepka, Joseph P.

    2001-09-01

    The Simulation Based Acquisition (SBA) process is one in which the planning, design, and test of a weapon system or other product is done through the more effective use of modeling and simulation, information technology, and process improvement. This process results in a product that is produced faster, cheaper, and more reliably than its predecessors. Because the SBA process requires realistic and detailed simulation conditions, it was necessary to develop a simulation tool that would provide a simulation environment acceptable for SBA analysis. The Joint Integrated Mission Model (JIMM) was created to help define and meet the analysis, test and evaluation, and training requirements of Department of Defense programs utilizing SBA. Through its generic representation of simulation entities, its data analysis capability, and its robust configuration management process, JIMM can support a wide range of simulation applications as both a constructive and a virtual simulation tool. JIMM is a Mission Level Model (MLM). An MLM is capable of evaluating the effectiveness and survivability of a composite force of air and space systems executing operational objectives in a specific scenario against an integrated air and space defense system. Because MLMs are useful for assessing a system's performance in a realistic, integrated threat environment, they are key to implementing the SBA process. JIMM is a merger of the capabilities of one legacy model, the Suppressor MLM, into another, the Simulated Warfare Environment Generator (SWEG) MLM. By creating a more capable MLM, JIMM will not only be a tool to support the SBA initiative but could also provide the framework for the next generation of MLMs.

  12. Benefits of a Unified LaSRS++ Simulation for NAS-Wide and High-Fidelity Modeling

    NASA Technical Reports Server (NTRS)

    Glaab, Patricia; Madden, Michael

    2014-01-01

    The LaSRS++ high-fidelity vehicle simulation was extended in 2012 to support a NAS-wide simulation mode. Since the initial proof of concept, the LaSRS++ NAS-wide simulation has been maturing into a research-ready tool. A primary benefit of this new capability is the consolidation of the two modeling paradigms under a single framework to save cost, facilitate iterative concept testing between the two tools, and promote communication and model sharing between user communities at Langley. Specific benefits of each type of modeling are discussed, along with the expected benefits of the unified framework. Current capability details of the LaSRS++ NAS-wide simulations are provided, including the visualization tool, live data interface, trajectory generators, terminal routing for arrivals and departures, maneuvering, re-routing, navigation, winds, and turbulence. The plan for future development is also described.

  13. Numerical Propulsion System Simulation: A Common Tool for Aerospace Propulsion Being Developed

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.; Naiman, Cynthia G.

    2001-01-01

    The NASA Glenn Research Center is developing an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). This simulation is initially being used to support aeropropulsion in the analysis and design of aircraft engines. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the Aviation Safety Program and Advanced Space Transportation. NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes using the Common Object Request Broker Architecture (CORBA) in the NPSS Developer's Kit to facilitate collaborative engineering. The NPSS Developer's Kit will provide the tools to develop custom components and to use the CORBA capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities will extend NPSS from a zero-dimensional simulation tool to a multifidelity, multidiscipline system-level simulation tool for the full life cycle of an engine.

  14. Data Presentation and Visualization (DPV) Interface Control Document

    NASA Technical Reports Server (NTRS)

    Mazzone, Rebecca A.; Conroy, Michael P.

    2015-01-01

    Data Presentation and Visualization (DPV) is a subset of the modeling and simulation (M&S) capabilities at Kennedy Space Center (KSC) that endeavors to address the challenges of how to present and share simulation output for analysts, stakeholders, decision makers, and other interested parties. DPV activities focus on the development and provision of visualization tools to meet the objectives identified above, as well as providing supporting tools and capabilities required to make its visualization products available and accessible across NASA.

  15. Computer Simulation in Mass Emergency and Disaster Response: An Evaluation of Its Effectiveness as a Tool for Demonstrating Strategic Competency in Emergency Department Medical Responders

    ERIC Educational Resources Information Center

    O'Reilly, Daniel J.

    2011-01-01

    This study examined the capability of computer simulation as a tool for assessing the strategic competency of emergency department nurses as they responded to authentically computer simulated biohazard-exposed patient case studies. Thirty registered nurses from a large, urban hospital completed a series of computer-simulated case studies of…

  16. Approaches to incorporating climate change effects in state and transition simulation models of vegetation

    Treesearch

    Becky K. Kerns; Miles A. Hemstrom; David Conklin; Gabriel I. Yospin; Bart Johnson; Dominique Bachelet; Scott Bridgham

    2012-01-01

    Understanding landscape vegetation dynamics often involves the use of scientifically-based modeling tools that are capable of testing alternative management scenarios given complex ecological, management, and social conditions. State-and-transition simulation model (STSM) frameworks and software such as PATH and VDDT are commonly used tools that simulate how landscapes...

  17. Virtual Collaborative Simulation Environment for Integrated Product and Process Development

    NASA Technical Reports Server (NTRS)

    Gulli, Michael A.

    1997-01-01

    Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and total factories and enterprises in one seamless simulation environment.

  18. Distributed Engine Control Empirical/Analytical Verification Tools

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of surviving the harsh engine operating environment while providing decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to easily assemble a distributed control system in software and immediately assess its overall impacts on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration of all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating distributed engine control system (DCS) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and the hardware simulator provide the capability to simulate virtual subcomponents, as well as to swap in actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the C-MAPSS engine simulator and controlled through Simulink.
    The distributed engine control simulation, evaluation, and analysis technology provides unique capabilities for studying the effects of a given change to the control system in the context of the distributed paradigm. The simulation tool can support treatment of all components within the control system, both virtual and real; these include the communication data network, smart sensor and actuator nodes, the centralized control system (the full-authority digital engine control, FADEC), and the aircraft engine itself. The DECsim tool allows simulation-based prototyping of control laws, control architectures, and decentralization strategies before hardware is integrated into the system. With the configuration specified, the simulator allows a variety of key factors to be systematically assessed, including control system performance, reliability, weight, and bandwidth utilization.

  19. Transitioning Human, Social, Cultural Behavior (HSCB) Models and Simulations to the Operational User

    DTIC Science & Technology

    2009-10-01

    ...current M&S covering support to operations, human behavior representation, asymmetric warfare, and defense against terrorism and... methods, tools, data, intellectual capital, and processes to address these capability requirements. Fourth, there is a need to compare capability requirements to current capabilities to identify gaps that may be addressed with DoD HSCB methods, tools, data, intellectual capital, and processes

  20. Simulation of cyber attacks with applications in homeland defense training

    NASA Astrophysics Data System (ADS)

    Brown, Bill; Cutts, Andrew; McGrath, Dennis; Nicol, David M.; Smith, Timothy P.; Tofel, Brett

    2003-09-01

    We describe a tool to help exercise and train IT managers who make decisions about IP networks in the midst of cyber calamity. Our tool is interactive and centered around a network simulation. It includes the various modes of communication one would use to make informed decisions. Our tool is capable of simulating networks with hundreds of components and dozens of players. Tests indicate that it could support an exercise an order of magnitude larger and more complex.

  1. Software Comparison for Renewable Energy Deployment in a Distribution Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, David Wenzhong; Muljadi, Eduard; Tian, Tian

    The main objective of this report is to evaluate different software options for performing robust distributed generation (DG) power system modeling. The features and capabilities of four simulation tools, OpenDSS, GridLAB-D, CYMDIST, and PowerWorld Simulator, are compared to analyze their effectiveness in analyzing distribution networks with DG. OpenDSS and GridLAB-D, two open-source software packages, have the capability to simulate networks with fluctuating data values. These packages allow a simulation to be run at each time instant by iterating only the main script file. CYMDIST, a commercial software package, allows for time-series simulation to study variations in network controls. PowerWorld Simulator, another commercial tool, has a batch-mode simulation function through the 'Time Step Simulation' tool, which obtains solutions for a list of specified time points. PowerWorld Simulator is intended for analysis of transmission-level systems, while the other three are designed for distribution systems. CYMDIST and PowerWorld Simulator feature easy-to-use graphical user interfaces (GUIs). OpenDSS and GridLAB-D, on the other hand, are based on command-line programs, which increases the time necessary to become familiar with the software packages.
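    The time-series mode these tools share reduces to the same pattern: update the loads for each time point, re-solve the network, and record the result. A minimal sketch of that loop follows; the one-feeder "solver", the load/PV profiles, and the per-unit numbers are illustrative assumptions, not from the report:

```python
V_SOURCE = 1.0        # per-unit source voltage
R_FEEDER = 0.05       # per-unit feeder resistance (illustrative)

def solve_feeder(net_load_pu):
    """Toy power flow: end-of-feeder voltage drops in proportion to net load."""
    return V_SOURCE - R_FEEDER * net_load_pu

load_profile = [0.6, 0.8, 1.0, 0.9, 0.7]   # per-unit demand per hour (made up)
pv_profile   = [0.0, 0.2, 0.5, 0.4, 0.1]   # per-unit PV output per hour (made up)

# Time-series simulation: one solve per time point, results collected.
voltages = []
for hour, (load, pv) in enumerate(zip(load_profile, pv_profile)):
    v = solve_feeder(load - pv)            # DG offsets the local demand
    voltages.append(v)
    print(f"hour {hour}: net load {load - pv:.2f} pu, voltage {v:.4f} pu")

# Comparing against a no-DG run shows how PV support lifts the minimum voltage.
v_min_with_pv = min(voltages)
v_min_no_pv = min(solve_feeder(l) for l in load_profile)
```

    In OpenDSS or GridLAB-D this loop is driven by the script/clock mechanism rather than user code, but the structure per time point is the same.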

  2. EBUS-STAT Subscore Analysis to Predict the Efficacy and Assess the Validity of Virtual Reality Simulation for EBUS-TBNA Training Among Experienced Bronchoscopists.

    PubMed

    Scarlata, Simone; Palermo, Patrizio; Candoli, Piero; Tofani, Ariela; Petitti, Tommasangelo; Corbetta, Lorenzo

    2017-04-01

    Linear endobronchial ultrasound transbronchial needle aspiration (EBUS-TBNA) represents a pivotal innovation in interventional pulmonology; determining the best approach to guarantee systematic and efficient training is expected to become a main issue in the forthcoming years. Virtual reality simulators have been proposed as potential EBUS-TBNA training instruments, to avoid unskilled beginners practicing directly in real-life settings. A validated and perfected simulation program could be used before allowing beginners to practice on patients. Our goal was to test the reliability of the EBUS-Skills and Task Assessment Tool (STAT) and its subscores for measuring the competence of experienced bronchoscopists approaching EBUS-guided TBNA, using only the virtual reality simulator as both a training and an assessment tool. Fifteen experienced bronchoscopists, with poor or no experience in EBUS-TBNA, participated in this study. They were all administered the Italian version of the EBUS-STAT evaluation tool during a high-fidelity virtual reality simulation. This was followed by a single 7-hour theoretical and practical (on simulators) session on EBUS-TBNA, at the end of which their skills were reassessed by EBUS-STAT. An overall, significant improvement in EBUS-TBNA skills was observed, thereby confirming that (a) virtual reality simulation can facilitate practical learning among practitioners, and (b) EBUS-STAT is capable of detecting these improvements. The test's overall ability to detect differences was reduced by the minimal variation in the scores for items 1 and 2, which were not influenced by the training; it improved significantly when those two items were excluded. Apart from these two items, all the remaining subscores were equally capable of revealing improvements in the learner. Lastly, we found that trainees with presimulation EBUS-STAT scores above 79 did not show any significant improvement after virtual reality training, suggesting that this score represents a cutoff value capable of predicting the likelihood that simulation will be beneficial. Virtual reality simulation provides a practical learning tool for practitioners with previous experience in flexible bronchoscopy, and the EBUS-STAT questionnaire is capable of detecting these changes. A pretraining EBUS-STAT score below 79 is a good indicator of those candidates who will benefit from the simulation training. Further studies are needed to verify whether a modified version of the questionnaire would be capable of improving its performance among experienced bronchoscopists.

  3. Development of a fourth generation predictive capability maturity model.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hills, Richard Guy; Witkowski, Walter R.; Urbina, Angel

    2013-09-01

    The Predictive Capability Maturity Model (PCMM) is an expert elicitation tool designed to characterize and communicate the completeness of the approaches used for computational model definition, verification, validation, and uncertainty quantification for an intended application. The primary application of this tool at Sandia National Laboratories (SNL) has been for physics-based computational simulations in support of nuclear weapons applications. The two main goals of a PCMM evaluation are 1) the accurate and transparent communication of computational simulation capability, and 2) the development of input for effective planning. As a result of the increasing importance of computational simulation to SNL's mission, the PCMM has evolved through multiple generations with the goal of providing more clarity, rigor, and completeness in its application. This report describes the approach used to develop the fourth generation of the PCMM.

  4. Assessment of a human computer interface prototyping environment

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.

    1993-01-01

    A Human Computer Interface (HCI) prototyping environment with embedded evaluation capability has been successfully assessed; it will be valuable in developing and refining HCI standards and evaluating program/project interface development, especially Space Station Freedom on-board displays for payload operations. The HCI prototyping environment is designed to include four components: (1) an HCI format development tool, (2) a test and evaluation simulator development tool, (3) a dynamic, interactive interface between the HCI prototype and the simulator, and (4) an embedded evaluation capability to evaluate the adequacy of an HCI based on a user's performance.

  5. MASTODON: A geosciences simulation tool built using the open-source framework MOOSE

    NASA Astrophysics Data System (ADS)

    Slaughter, A.

    2017-12-01

    The Department of Energy (DOE) is currently investing millions of dollars annually into various modeling and simulation tools for all aspects of nuclear energy. An important part of this effort includes developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture (extended finite-element method), and porous media, among others. The tensor mechanics and contact modules, in particular, are well suited for nonlinear geosciences problems. Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON; https://seismic-research.inl.gov/SitePages/Mastodon.aspx)--a MOOSE-based application--is capable of analyzing the response of 3D soil-structure systems to external hazards, with current development focused on earthquakes. It can simulate seismic events and perform extensive "source-to-site" simulations including earthquake fault rupture, nonlinear wave propagation, and nonlinear soil-structure interaction analysis. MASTODON also includes a dynamic probabilistic risk assessment capability that enables analysts not only to perform deterministic analyses, but also to easily perform probabilistic or stochastic simulations for the purpose of risk assessment. Although MASTODON has been developed for the nuclear industry, it can be used to assess the risk for any structure subjected to earthquakes. The geosciences community can learn from the nuclear industry and harness the enormous effort underway to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other applications. The geosciences community could benefit from these existing tools by enabling collaboration between researchers and practitioners throughout the world, advancing the state of the art in line with other scientific research efforts.
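The deterministic-versus-probabilistic distinction described above can be illustrated with a minimal Monte Carlo risk sketch. All distributions and parameter values below are illustrative assumptions, not MASTODON's actual models: seismic demand on a structure and the structure's capacity are each sampled from lognormal distributions, and the failure probability is the fraction of samples in which demand exceeds capacity.

```python
import math
import random

def failure_probability(n_samples=50_000, seed=42,
                        pga_median=0.3, pga_beta=0.6,
                        cap_median=0.9, cap_beta=0.4):
    """Monte Carlo estimate of P(seismic demand > structural capacity).

    Demand (peak ground acceleration, in g) and capacity are both drawn
    from lognormal distributions -- a common textbook simplification,
    with parameter values invented for illustration.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        demand = pga_median * math.exp(pga_beta * rng.gauss(0.0, 1.0))
        capacity = cap_median * math.exp(cap_beta * rng.gauss(0.0, 1.0))
        if demand > capacity:
            failures += 1
    return failures / n_samples
```

With these invented parameters the analytic failure probability is roughly 6%, which the sampling estimate reproduces; a purely deterministic analysis would instead compare single point values of demand and capacity.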

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCoy, Michel; Archer, Bill; Hendrickson, Bruce

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, support studies of advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details) and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  7. Northwest Trajectory Analysis Capability: A Platform for Enhancing Computational Biophysics Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Elena S.; Stephan, Eric G.; Corrigan, Abigail L.

    2008-07-30

    As computational resources continue to increase, the ability of computational simulations to effectively complement, and in some cases replace, experimentation in scientific exploration also increases. Today, large-scale simulations are recognized as an effective tool for scientific exploration in many disciplines, including chemistry and biology. A natural side effect of this trend has been the need for an increasingly complex analytical environment. In this paper, we describe the Northwest Trajectory Analysis Capability (NTRAC), an analytical software suite developed to enhance the efficiency of computational biophysics analyses. Our strategy is to layer higher-level services and introduce improved tools within the user's familiar environment without preventing researchers from using traditional tools and methods. Our desire is to share these experiences to serve as an example for effectively analyzing data-intensive, large-scale simulation data.

  8. Challenges of NDE simulation tool validation, optimization, and utilization for composites

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter

    2016-02-01

    Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and prediction of inspectability for advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large scale, complex geometry composite components/vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described along with examples of simulation validation and optimization challenges that are more broadly applicable to all NDE simulation tools. The paper will also discuss examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and associated challenges in experimentally validating those methods.

  9. Securing Sensitive Flight and Engine Simulation Data Using Smart Card Technology

    NASA Technical Reports Server (NTRS)

    Blaser, Tammy M.

    2003-01-01

    NASA Glenn Research Center has developed a smart card prototype capable of encrypting and decrypting disk files required to run a distributed aerospace propulsion simulation. Triple Data Encryption Standard (3DES) encryption is used to secure the sensitive intellectual property on disk before, during, and after simulation execution. The prototype operates as a secure system and maintains its authorized state by safely storing and permanently retaining the encryption keys only on the smart card. The prototype is capable of authenticating a single smart card user and includes pre-simulation and post-simulation tools for analysis and training purposes. The prototype's design is highly generic and can be used to protect any sensitive disk files, with growth capability to run multiple simulations. The NASA computer engineer developed the prototype in an interoperable programming environment to enable porting to other Numerical Propulsion System Simulation (NPSS) capable operating system environments.

  10. Building Airport Surface HITL Simulation Capability

    NASA Technical Reports Server (NTRS)

    Chinn, Fay Cherie

    2016-01-01

    FutureFlight Central (FFC) is a high-fidelity, real-time simulator designed to study surface operations and automation. As an air traffic control tower simulator, FFC allows stakeholders such as the FAA, controllers, pilots, airports, and airlines to develop and test advanced surface and terminal area concepts and automation, including NextGen and beyond automation concepts and tools. These technologies will improve the safety, capacity, and environmental issues facing the National Airspace System. FFC also has extensive video streaming capabilities, which, combined with the 3-D database capability, make the facility ideal for any research needing an immersive virtual and/or video environment. FutureFlight Central allows human-in-the-loop testing, which accommodates human interactions and errors, giving a more complete picture than fast-time simulations. This presentation describes FFC's capabilities and the components necessary to build an airport surface human-in-the-loop simulation capability.

  11. The Development of Design Tools for Fault Tolerant Quantum Dot Cellular Automata Based Logic

    NASA Technical Reports Server (NTRS)

    Armstrong, Curtis D.; Humphreys, William M.

    2003-01-01

    We are developing software to explore the fault tolerance of quantum dot cellular automata gate architectures in the presence of manufacturing variations and device defects. The Topology Optimization Methodology using Applied Statistics (TOMAS) framework extends the capabilities of AQUINAS (A Quantum Interconnected Network Array Simulator) by adding front-end and back-end software and creating an environment that integrates all of these components. The front-end tools establish all simulation parameters, configure the simulation system, automate the Monte Carlo generation of simulation files, and execute the simulation of these files. The back-end tools perform automated data parsing, statistical analysis, and report generation.
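The front-end/back-end split described above can be sketched in a few lines. The `gate_output` response function and every parameter value below are hypothetical stand-ins for an actual AQUINAS run, used only to show the pattern: perturb a parameter per run (front-end Monte Carlo generation), evaluate, then summarize statistically (back-end).

```python
import random
import statistics

def monte_carlo_runs(n_runs=1000, dot_spacing_nm=20.0, spacing_sigma=0.5,
                     pass_threshold=0.9, seed=1):
    """Front-end: perturb a device parameter for each run and call a
    stand-in simulator; back-end: parse the outputs into summary stats.
    `gate_output` is a hypothetical placeholder, not an AQUINAS call."""
    rng = random.Random(seed)

    def gate_output(spacing_nm):
        # Stand-in response: output degrades linearly as the fabricated
        # dot spacing deviates from its nominal design value.
        return max(0.0, 1.0 - abs(spacing_nm - dot_spacing_nm) / 5.0)

    outputs = [gate_output(rng.gauss(dot_spacing_nm, spacing_sigma))
               for _ in range(n_runs)]
    return {
        "mean": statistics.mean(outputs),
        "stdev": statistics.stdev(outputs),
        "yield": sum(o >= pass_threshold for o in outputs) / n_runs,
    }
```

The returned "yield" (fraction of runs above a pass threshold) is the kind of fault-tolerance summary such a framework would report over manufacturing variations.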

  12. Demonstration of CBR Modeling and Simulation Tool (CBRSim) Capabilities. Installation Technology Transfer Program

    DTIC Science & Technology

    2009-04-01

    Capabilities. Construction Engineering Research Laboratory. Kathy L. Simunich, Timothy K. Perkins, David M. Bailey, David Brown, and ... inversion height in convective conditions is estimated with a one-dimensional model of the atmospheric boundary layer based on the Driedonks slab model ... tool and its capabilities. Installation geospatial data, in CAD format, were obtained for select buildings, roads, and topographic features in

  13. Computer software tool REALM for sustainable water allocation and management.

    PubMed

    Perera, B J C; James, B; Kularathna, M D U

    2005-12-01

    REALM (REsource ALlocation Model) is a generalised computer simulation package that models harvesting and bulk distribution of water resources within a water supply system. It is a modelling tool that can be applied to develop specific water allocation models. Like other water resource simulation software tools, REALM uses mass-balance accounting at nodes, while the movement of water within carriers is subject to capacity constraints. It uses a fast network linear programming algorithm to optimise the water allocation within the network during each simulation time step, in accordance with user-defined operating rules. This paper describes the main features of REALM and provides potential users with an appreciation of its capabilities. In particular, it describes two case studies covering major urban and rural water supply systems. These case studies illustrate REALM's capabilities in the use of stochastically generated data in water supply planning and management, modelling of environmental flows, and assessing security of supply issues.
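The node mass-balance and carrier-capacity ideas above can be sketched compactly. REALM itself solves a network linear program each time step; the priority-ordered greedy pass below is a deliberate simplification, and the demand names and volumes are invented for illustration.

```python
def allocate(storage, demands, capacities):
    """One simulation time step of water allocation.

    `demands` is a list of (name, priority, volume) tuples (lower number
    = higher priority); `capacities` caps the flow in each delivery
    carrier.  This greedy pass is a simplification of REALM's network
    linear program, but it still respects mass balance (total release
    never exceeds storage) and carrier capacity constraints."""
    releases = {}
    remaining = storage
    for name, priority, volume in sorted(demands, key=lambda d: d[1]):
        flow = min(volume, capacities.get(name, float("inf")), remaining)
        releases[name] = flow
        remaining -= flow
    return releases, remaining
```

For example, with 100 units in storage, a town demand of 60 through a 50-unit carrier and a lower-priority farm demand of 80 through a 70-unit carrier, the town is capped by its carrier and the farm by the water left in storage.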

  14. Challenges of NDE Simulation Tool

    NASA Technical Reports Server (NTRS)

    Leckey, Cara A. C.; Juarez, Peter D.; Seebo, Jeffrey P.; Frank, Ashley L.

    2015-01-01

    Realistic nondestructive evaluation (NDE) simulation tools enable inspection optimization and predictions of inspectability for new aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of advanced aerospace components, potentially shortening the time from material development to implementation by industry and government. Furthermore, modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation cannot rapidly simulate damage detection techniques for large scale, complex geometry composite components/vehicles with realistic damage types. This paper discusses some of the challenges of model development and validation for composites, such as the level of realism and scale of simulation needed for NASA's applications. Ongoing model development work is described along with examples of model validation studies. The paper will also discuss examples of the use of simulation tools at NASA to develop new damage characterization methods, and the associated challenges of validating those methods.

  15. Simulation for Prediction of Entry Article Demise (SPEAD): An Analysis Tool for Spacecraft Safety Analysis and Ascent/Reentry Risk Assessment

    NASA Technical Reports Server (NTRS)

    Ling, Lisa

    2014-01-01

    For the purpose of performing safety analysis and risk assessment for a potential off-nominal atmospheric reentry resulting in vehicle breakup, a synthesis of trajectory propagation coupled with thermal analysis and the evaluation of node failure is required to predict the sequence of events, the timeline, and the progressive demise of spacecraft components. To provide this capability, the Simulation for Prediction of Entry Article Demise (SPEAD) analysis tool was developed. The software and methodology have been validated against actual flights, telemetry data, and validated software, and safety/risk analyses were performed for various programs using SPEAD. This report discusses the capabilities, modeling, validation, and application of the SPEAD analysis tool.
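The coupling described above -- trajectory propagation feeding a thermal model that is checked against a node-failure criterion -- can be sketched with a heavily simplified point-mass entry. The constant flight-path angle, exponential atmosphere, Sutton-Graves-style heating constant, and lumped aluminum node with no reradiation or ablation are all illustrative assumptions, not SPEAD's actual models.

```python
import math

def entry_demise(v0=7800.0, h0=120e3, gamma_deg=-10.0, beta=100.0,
                 nose_radius=0.5, node_mass=10.0, node_area=0.2,
                 cp=900.0, t_melt=933.0, t_init=300.0, dt=0.1):
    """Point-mass reentry with the lumped thermal response of one
    component ("node"), flagging demise when the node reaches its
    melting temperature.  Gravity, lift, and reradiation are ignored."""
    sin_g = math.sin(math.radians(gamma_deg))        # negative: descending
    v, h, temp, t = v0, h0, t_init, 0.0
    peak_q, demise_time = 0.0, None
    while h > 0.0 and t < 2000.0:
        rho = 1.225 * math.exp(-h / 7200.0)          # exponential atmosphere
        q = 1.74e-4 * math.sqrt(rho / nose_radius) * v ** 3   # W/m^2
        peak_q = max(peak_q, q)
        temp += q * node_area * dt / (node_mass * cp)  # lumped heating
        if demise_time is None and temp >= t_melt:
            demise_time = t                          # node failure event
        v -= rho * v * v / (2.0 * beta) * dt         # drag deceleration
        h += v * sin_g * dt
        t += dt
        if v < 50.0:                                 # negligible heating left
            break
    return {"peak_q": peak_q, "demise_time": demise_time, "final_v": v}
```

Even this crude sketch reproduces the qualitative sequence a demise tool must capture: heating peaks in the 50-60 km band and the aluminum node fails shortly after, roughly a minute into the entry for these assumed parameters.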

  16. Accurate Modeling of Dark-Field Scattering Spectra of Plasmonic Nanostructures.

    PubMed

    Jiang, Liyong; Yin, Tingting; Dong, Zhaogang; Liao, Mingyi; Tan, Shawn J; Goh, Xiao Ming; Allioux, David; Hu, Hailong; Li, Xiangyin; Yang, Joel K W; Shen, Zexiang

    2015-10-27

    Dark-field microscopy is a widely used tool for measuring the optical resonance of plasmonic nanostructures. However, current numerical methods simulate dark-field scattering spectra with plane wave illumination, either at normal incidence or at an oblique angle from a single direction. In actual experiments, light is focused onto the sample through an annular ring within a range of glancing angles. In this paper, we present a theoretical model capable of accurately simulating a dark-field light source with an annular ring. Simulations correctly reproduce a counterintuitive blue shift in the scattering spectra from gold nanodisks with diameters beyond 140 nm. We believe that our proposed simulation method can be applied as a general tool for simulating the dark-field scattering spectra of plasmonic nanostructures, as well as other dielectric nanostructures with sizes beyond the quasi-static limit.
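In its simplest form, modeling the annular source reduces to averaging single-angle scattering spectra over the ring of glancing angles with geometric weights. The sketch below assumes a generic per-angle spectrum callable (in the test, a toy Gaussian whose peak shifts with angle); a real model would supply full-wave scattering solutions at each incidence angle.

```python
import math

def annular_average(spectrum_at_angle, angles_deg, wavelengths):
    """Average per-angle scattering spectra over the annular cone of a
    dark-field condenser.

    `spectrum_at_angle` is any callable (theta_deg, wavelength) ->
    intensity; the sin(theta) weights approximate the ring geometry and
    are an illustrative assumption, not the paper's exact formulation."""
    weights = [math.sin(math.radians(t)) for t in angles_deg]
    wsum = sum(weights)
    return [sum(w * spectrum_at_angle(t, lam)
                for w, t in zip(weights, angles_deg)) / wsum
            for lam in wavelengths]
```

Because each glancing angle contributes a slightly different single-angle spectrum, the averaged spectrum's peak lands between the individual peaks rather than at any one of them, which is the qualitative effect the annular model captures.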

  17. The capability of lithography simulation based on MVM-SEM® system

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Shingo; Fujii, Nobuaki; Kanno, Koichi; Imai, Hidemichi; Hayano, Katsuya; Miyashita, Hiroyuki; Shida, Soichi; Murakawa, Tsutomu; Kuribara, Masayuki; Matsumoto, Jun; Nakamura, Takayuki; Matsushita, Shohei; Hara, Daisuke; Pang, Linyong

    2015-10-01

    Lithography at the 1X-nm technology node uses SMO-ILT, NTD, or other complex patterns. In mask defect inspection, defect verification therefore becomes more difficult because many nuisance defects are detected in aggressive mask features. A key technology in mask manufacture is defect verification using an aerial image simulator or other printability simulation. AIMS™ technology correlates excellently with the wafer and is the standard tool for defect verification; however, it is difficult to use when verifying a hundred or more defects. We previously reported a defect verification capability based on lithography simulation with a SEM system whose architecture and software showed excellent correlation for simple line-and-space patterns [1]. In this paper, we use a next-generation SEM system combined with a lithography simulation tool for SMO-ILT, NTD, and other complex-pattern lithography. Furthermore, we will use three-dimensional (3D) lithography simulation based on the Multi Vision Metrology SEM system. Finally, we confirm the performance of the 2D and 3D lithography simulation based on the SEM system for photomask verification.

  18. Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia

    2006-01-01

    The NASA Glenn Research Center, in partnership with the aerospace industry, other government agencies, and academia, is leading the effort to develop an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). NPSS is a framework for performing analysis of complex systems. The initial development of NPSS focused on the analysis and design of airbreathing aircraft engines, but the resulting NPSS framework may be applied to any system, for example: aerospace, rockets, hypersonics, power and propulsion, fuel cells, ground based power, and even human system modeling. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the NASA Aeronautics Research Mission Directorate Fundamental Aeronautics Program and the Advanced Virtual Engine Test Cell (AVETeC). NPSS focuses on the integration of multiple disciplines, such as aerodynamics, structures, and heat transfer, with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes capabilities to facilitate collaborative engineering. NPSS will provide improved tools for developing custom components, zooming to higher-fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities extend NPSS from a zero-dimensional simulation tool to a multi-fidelity, multidiscipline, system-level simulation tool for the full development life cycle.
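Zooming, as described above, hinges on components sharing a common interface so that a low-fidelity model can be swapped for a higher-fidelity one without touching the surrounding cycle code. The sketch below illustrates only the pattern; the compressor map values, the linear speed line, and the thrust correlation are invented numbers, not NPSS data or its actual API.

```python
class Compressor0D:
    """Zero-dimensional model: a simple linear speed-to-pressure map."""
    def pressure_ratio(self, speed_frac):
        return 1.0 + 9.0 * speed_frac

class CompressorMap:
    """Higher-fidelity stand-in: interpolate a (hypothetical) test map
    of (speed fraction, pressure ratio) points."""
    MAP = [(0.5, 4.2), (0.8, 7.9), (1.0, 10.3)]

    def pressure_ratio(self, speed_frac):
        pts = self.MAP
        for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
            if x0 <= speed_frac <= x1:
                f = (speed_frac - x0) / (x1 - x0)
                return y0 + f * (y1 - y0)
        raise ValueError("speed outside map range")

def cycle_thrust(compressor, speed_frac=0.8):
    """The cycle code only sees the shared pressure_ratio() interface,
    so a component can be 'zoomed' to higher fidelity without any other
    change.  The thrust correlation is a stand-in for illustration."""
    return 1000.0 * compressor.pressure_ratio(speed_frac)
```

Swapping `Compressor0D()` for `CompressorMap()` in the `cycle_thrust` call changes the fidelity of one component while the rest of the system-level calculation is untouched, which is the essence of numerical zooming.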

  19. ECO-DRIVING MODELING ENVIRONMENT

    DOT National Transportation Integrated Search

    2015-11-01

    This research project aims to examine the eco-driving modeling capabilities of different traffic modeling tools available and to develop a driver-simulator-based eco-driving modeling tool to evaluate driver behavior and to reliably estimate or measur...

  20. Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCoy, Michel; Archer, Bill; Matzen, M. Keith

    2014-09-16

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, support studies of advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  1. The Role of Crop Systems Simulation in Agriculture and Environment

    USDA-ARS?s Scientific Manuscript database

    Over the past 30 to 40 years, simulation of crop systems has advanced from a neophyte science with inadequate computing power into a robust and increasingly accepted science supported by improved software, languages, development tools, and computer capabilities. Crop system simulators contain mathe...

  2. Automated simulation as part of a design workstation

    NASA Technical Reports Server (NTRS)

    Cantwell, E.; Shenk, T.; Robinson, P.; Upadhye, R.

    1990-01-01

    A development project for a design workstation for advanced life-support systems incorporating qualitative simulation required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulations, such that simulation capabilities are maximized without duplication. The reason is that, to produce design solutions to a system goal, the behavior of the system in both steady and perturbed states must be represented. The paper reports on the Qualitative Simulation Tool (QST), on an expert-system-like model building and simulation interface tool called ScratchPad (SP), and on the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components.

  3. Global Simulation of Aviation Operations

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Sheth, Kapil; Ng, Hok Kwan; Morando, Alex; Li, Jinhua

    2016-01-01

    The simulation and analysis of global air traffic is limited by a lack of simulation tools and the difficulty of accessing data sources. This paper provides a global simulation of aviation operations combining flight plans and real air traffic data with historical commercial city-pair aircraft type and schedule data and global atmospheric data. The resulting capability extends the simulation and optimization functions of NASA's Future Air Traffic Management Concept Evaluation Tool (FACET) to global scale. This new capability is used to present results on the evolution of global air traffic patterns from a concentration of traffic inside the US, Europe, and across the Atlantic Ocean to a more diverse traffic pattern across the globe, with accelerated growth in Asia, Australia, Africa, and South America. The simulation analyzes seasonal variation in the long-haul wind-optimal traffic patterns in six major regions of the world and provides the potential time savings of wind-optimal routes compared with either great circle routes or current flight plans, where available.
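The wind-optimal-versus-great-circle comparison above rests on two elementary computations: the great-circle distance between a city pair, and block time under an average along-track wind component. A minimal sketch follows; the airport coordinates, airspeed, and wind values in the example are illustrative, and a real tool like FACET optimizes the full route through a wind field rather than applying one average wind.

```python
import math

def great_circle_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Haversine great-circle distance between two points (degrees)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2.0) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2.0) ** 2)
    return 2.0 * radius_km * math.asin(math.sqrt(a))

def flight_hours(distance_km, airspeed_kmh=900.0, avg_wind_kmh=0.0):
    """Block time with an average along-track wind component; a
    positive value (jet-stream tailwind) shortens the flight."""
    return distance_km / (airspeed_kmh + avg_wind_kmh)
```

For a roughly New York to London city pair, the same great-circle distance flown eastbound with a jet-stream tailwind takes noticeably less time than the westbound return against it, which is why wind-optimal eastbound and westbound tracks differ seasonally.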

  4. Predictive Capability Maturity Model (PCMM).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, Laura Painton; Knupp, Patrick Michael; Urbina, Angel

    2010-10-01

    The Predictive Capability Maturity Model (PCMM) is a communication tool that must include a discussion of the supporting evidence. PCMM is a tool for managing risk in the use of modeling and simulation, and it serves to organize evidence to help tell the modeling and simulation (M&S) story. The PCMM table describes what activities within each element are undertaken at each level of maturity. Target levels of maturity can be established based on the intended application. The assessment informs what level has been achieved compared to the desired level, to help prioritize the VU activities and to allocate resources.
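The achieved-versus-target comparison that drives prioritization can be sketched as a simple table lookup. The element names and the integer level scale below follow the general PCMM idea of maturity levels per element; the specific values are invented for illustration.

```python
def maturity_gaps(assessed, target):
    """Compare assessed maturity levels to the target levels set for
    the intended application; positive gaps flag the elements where
    effort and resources should be allocated first."""
    return {element: target[element] - assessed[element]
            for element in assessed if target[element] > assessed[element]}
```

Elements already at or above their target level drop out of the result, so the returned dictionary is a direct to-do list for planning.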

  5. Advanced data management system architectures testbed

    NASA Technical Reports Server (NTRS)

    Grant, Terry

    1990-01-01

    The objective of the Architecture and Tools Testbed is to provide a working, experimental focus to the evolving automation applications for the Space Station Freedom data management system. Emphasis is on defining and refining real-world applications including the following: the validation of user needs; understanding system requirements and capabilities; and extending capabilities. The approach is to provide an open, distributed system of high performance workstations representing both the standard data processors and networks and advanced RISC-based processors and multiprocessor systems. The system provides a base from which to develop and evaluate new performance and risk management concepts and for sharing the results. Participants are given a common view of requirements and capability via: remote login to the testbed; standard, natural user interfaces to simulations and emulations; special attention to user manuals for all software tools; and E-mail communication. The testbed elements which instantiate the approach are briefly described including the workstations, the software simulation and monitoring tools, and performance and fault tolerance experiments.

  6. Medical Simulations for Exploration Medicine

    NASA Technical Reports Server (NTRS)

    Reyes, David; Suresh, Rahul; Pavela, James; Urbina, Michelle; Mindock, Jennifer; Antonsen, Erik

    2018-01-01

    Medical simulation is a useful tool for training personnel, developing medical processes, and assisting cross-disciplinary communication. Medical simulations have been used in the past at NASA for these purposes; however, they are usually created ad hoc, and a stepwise approach to scenario development has not previously been used. The NASA Exploration Medical Capability (ExMC) created a medical scenario development tool to test medical procedures, technologies, and concepts of operation, and for use in systems engineering (SE) processes.

  7. Benchmark Problems of the Geothermal Technologies Office Code Comparison Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Mark D.; Podgorney, Robert; Kelkar, Sharad M.

    A diverse suite of numerical simulators is currently being applied to predict or understand the performance of enhanced geothermal systems (EGS). To build confidence and identify critical development needs for these analytical tools, the United States Department of Energy, Geothermal Technologies Office has sponsored a Code Comparison Study (GTO-CCS), with participants from universities, industry, and national laboratories. A principal objective of the study was to create a community forum for the improvement and verification of numerical simulators for EGS modeling. Teams participating in the study represented U.S. national laboratories, universities, and industry, and each team brought unique numerical simulation capabilities to bear on the problems. Two classes of problems were developed during the study: benchmark problems and challenge problems. The benchmark problems were structured to test the ability of the collection of numerical simulators to solve various combinations of coupled thermal, hydrologic, geomechanical, and geochemical processes. This class of problems was strictly defined in terms of properties, driving forces, initial conditions, and boundary conditions. Study participants submitted solutions to problems for which their simulation tools were deemed capable or nearly capable. Some participating codes were originally developed for EGS applications, whereas others were designed for different applications but can simulate processes similar to those in EGS; solution submissions from both were encouraged. In some cases, participants made small incremental changes to their numerical simulation codes to address specific elements of the problem, and in other cases participants submitted solutions with existing simulation tools, acknowledging the limitations of the code. The challenge problems were based on the enhanced geothermal systems research conducted at Fenton Hill, near Los Alamos, New Mexico, between 1974 and 1995. The problems involved two phases of research covering stimulation, development, and circulation in two separate reservoirs. The challenge problems posed specific questions to be answered via numerical simulation in three topical areas: 1) reservoir creation/stimulation, 2) reactive and passive transport, and 3) thermal recovery. Whereas the benchmark class of problems was designed to test capabilities for modeling coupled processes under strictly specified conditions, the stated objective for the challenge class of problems was to demonstrate what new understanding of the Fenton Hill experiments could be realized via the application of modern numerical simulation tools by recognized expert practitioners.
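A strictly specified benchmark of the kind described -- fixed properties, initial conditions, and boundary conditions, with an exact solution available for verification -- can be illustrated with 1-D transient heat conduction into a half-space after a step change in surface temperature. The explicit finite-difference sketch below uses generic rock-like properties (illustrative values, not a GTO-CCS problem definition) and compares the numerical result against the analytical complementary-error-function solution.

```python
import math

def conduction_benchmark(alpha=1.0e-6, dx=0.01, nx=101, dt=20.0,
                         t_end=10_000.0, t0=20.0, ts=100.0):
    """Explicit finite-difference solution of 1-D transient conduction
    (thermal diffusivity alpha, m^2/s) after a step in surface
    temperature, returning (numerical, analytical) temperature at one
    probe point.  The far boundary is held at the initial temperature,
    which is valid while the thermal front stays far from it."""
    assert dt <= dx * dx / (2.0 * alpha), "explicit stability limit"
    T = [t0] * nx
    T[0] = ts                                  # fixed surface temperature
    r = alpha * dt / (dx * dx)
    for _ in range(int(t_end / dt)):
        Tn = T[:]
        for i in range(1, nx - 1):
            Tn[i] = T[i] + r * (T[i + 1] - 2.0 * T[i] + T[i - 1])
        T = Tn
    x_probe, i_probe = 0.05, 5                 # probe 5 cm below surface
    exact = t0 + (ts - t0) * math.erfc(x_probe / (2.0 * math.sqrt(alpha * t_end)))
    return T[i_probe], exact
```

This is the verification pattern the benchmark class embodies: because the problem is fully specified, every participating code can be checked against the same exact solution before moving on to the loosely constrained challenge problems.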

  8. Telecom Modeling with ChatterBell.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jrad, Ahmad M.; Kelic, Andjelka

    This document provides a description and user manual for the ChatterBell voice telecom modeling and simulation capability. The intended audience consists of network planners and practitioners who wish to use the tool to model a particular voice network and analyze its behavior under varying assumptions and possible failure conditions. ChatterBell is built on top of the N-SMART voice simulation and visualization suite that was developed through collaboration between Sandia National Laboratories and Bell Laboratories of Lucent Technologies. The new and improved modeling and simulation tool has been modified and modernized to incorporate the latest developments in the telecom world, including the widespread use of VoIP technology. In addition, ChatterBell provides new commands and modeling capabilities that were not available in the N-SMART application.

  9. Simulation Tool for Dielectric Barrier Discharge Plasma Actuators at Atmospheric and Sub-Atmospheric Pressures: SBIR Phase I Final Report

    NASA Technical Reports Server (NTRS)

    Likhanskii, Alexandre

    2012-01-01

    This report is the final report of an SBIR Phase I project. It is identical to the final report submitted, after some proprietary information of an administrative nature has been removed. The development of a numerical simulation tool for dielectric barrier discharge (DBD) plasma actuators is reported. The objectives of the project were to analyze and predict DBD operation over a wide range of ambient gas pressures, overcoming the limitations of traditional DBD codes, which are restricted to low-speed applications and have weak predictive capabilities. The software tool allows DBD actuator analysis and prediction from the subsonic to the hypersonic flow regime. The simulation tool is based on the VORPAL code developed by Tech-X Corporation. VORPAL's capability of modeling DBD plasma actuators at low pressures (0.1 to 10 torr) using a kinetic plasma modeling approach, and at moderate to atmospheric pressures (1 to 10 atm) using a hydrodynamic plasma modeling approach, was demonstrated. In addition, results of experiments with a pulsed+bias DBD configuration, performed for validation purposes, are reported.

  10. Simulation-Based Analysis of Reentry Dynamics for the Sharp Atmospheric Entry Vehicle

    NASA Technical Reports Server (NTRS)

    Tillier, Clemens Emmanuel

    1998-01-01

    This thesis describes the analysis of the reentry dynamics of a high-performance lifting atmospheric entry vehicle through numerical simulation tools. The vehicle, named SHARP, is currently being developed by the Thermal Protection Materials and Systems branch of NASA Ames Research Center, Moffett Field, California. The goal of this project is to provide insight into trajectory tradeoffs and vehicle dynamics using simulation tools that are powerful, flexible, user-friendly and inexpensive. Implemented using MATLAB and SIMULINK, these tools are developed with an eye towards further use in the conceptual design of the SHARP vehicle's trajectory and flight control systems. A trajectory simulator is used to quantify the entry capabilities of the vehicle subject to various operational constraints. Using an aerodynamic database computed by NASA and a model of the Earth, the simulator generates the vehicle trajectory in three-dimensional space based on aerodynamic angle inputs. Requirements for entry along the SHARP aerothermal performance constraint are evaluated for different control strategies. The effect of vehicle mass on entry parameters is investigated, and the cross-range capability of the vehicle is evaluated. Trajectory results are presented and interpreted. A six-degree-of-freedom simulator builds on the trajectory simulator and provides attitude simulation for future entry controls development. A Newtonian aerodynamic model including control surfaces and a mass model are developed. A visualization tool for interpreting simulation results is described. Control surfaces are roughly sized. A simple controller is developed to fly the vehicle along its aerothermal performance constraint using aerodynamic flaps for control. This end-to-end demonstration proves the suitability of the 6-DOF simulator for future flight control system development. Finally, issues surrounding real-time simulation with hardware in the loop are discussed.

  11. Using Coupled Energy, Airflow and IAQ Software (TRNSYS/CONTAM) to Evaluate Building Ventilation Strategies.

    PubMed

    Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J

    2016-03-01

    Building energy analysis tools are available in many forms that provide the ability to address a broad spectrum of energy-related issues in various combinations. Often these tools operate in isolation from one another, making it difficult to evaluate the interactions between related phenomena and interacting systems, forcing oversimplified assumptions to be made about various phenomena that could otherwise be addressed directly with another tool. One example of such interdependence is the interaction between heat transfer, inter-zone airflow and indoor contaminant transport. In order to better address these interdependencies, the National Institute of Standards and Technology (NIST) has developed an updated version of the multi-zone airflow and contaminant transport modelling tool, CONTAM, along with a set of utilities to enable coupling of the full CONTAM model with the TRNSYS simulation tool in a more seamless manner and with additional capabilities that were previously not available. This paper provides an overview of these new capabilities and applies them to simulating a medium-size office building. These simulations address the interaction between whole-building energy, airflow and contaminant transport in evaluating various ventilation strategies including natural and demand-controlled ventilation. CONTAM has been in practical use for many years allowing building designers, as well as IAQ and ventilation system analysts, to simulate the complex interactions between building physical layout and HVAC system configuration in determining building airflow and contaminant transport. It has been widely used to design and analyse smoke management systems and evaluate building performance in response to chemical, biological and radiological events. 
While CONTAM has been used to address design and performance of buildings implementing energy conserving ventilation systems, e.g., natural and hybrid, this new coupled simulation capability will enable users to apply the tool to couple CONTAM with existing energy analysis software to address the interaction between indoor air quality considerations and energy conservation measures in building design and analysis. This paper presents two practical case studies using the coupled modelling tool to evaluate IAQ performance of a CO2-based demand-controlled ventilation system under different levels of building envelope airtightness and the design and analysis of a natural ventilation system.
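    The CO2-based demand-controlled ventilation strategy evaluated in the case study can be sketched as a simple control rule: the outdoor-air damper opens further as zone CO2 rises. The setpoints and the linear control law below are illustrative assumptions for a common DCV scheme, not the controller modeled with CONTAM/TRNSYS in the paper.

```python
# Illustrative demand-controlled ventilation (DCV) logic: the outdoor-air
# damper fraction increases proportionally as zone CO2 rises between a
# deadband setpoint and an upper limit. All setpoint values are hypothetical.

def dcv_damper_fraction(co2_ppm, setpoint=800.0, limit=1100.0,
                        min_frac=0.2, max_frac=1.0):
    """Return the outdoor-air damper fraction for a given zone CO2 reading."""
    if co2_ppm <= setpoint:
        return min_frac          # deadband: minimum ventilation only
    if co2_ppm >= limit:
        return max_frac          # fully open above the upper limit
    # linear interpolation between the setpoint and the limit
    span = (co2_ppm - setpoint) / (limit - setpoint)
    return min_frac + span * (max_frac - min_frac)
```

    A coupled simulation would evaluate this rule each timestep against the zone CO2 concentration predicted by the airflow/contaminant model, feeding the resulting outdoor-air rate back into the energy model.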

  12. Using Coupled Energy, Airflow and IAQ Software (TRNSYS/CONTAM) to Evaluate Building Ventilation Strategies

    PubMed Central

    Dols, W. Stuart.; Emmerich, Steven J.; Polidoro, Brian J.

    2016-01-01

    Building energy analysis tools are available in many forms that provide the ability to address a broad spectrum of energy-related issues in various combinations. Often these tools operate in isolation from one another, making it difficult to evaluate the interactions between related phenomena and interacting systems, forcing oversimplified assumptions to be made about various phenomena that could otherwise be addressed directly with another tool. One example of such interdependence is the interaction between heat transfer, inter-zone airflow and indoor contaminant transport. In order to better address these interdependencies, the National Institute of Standards and Technology (NIST) has developed an updated version of the multi-zone airflow and contaminant transport modelling tool, CONTAM, along with a set of utilities to enable coupling of the full CONTAM model with the TRNSYS simulation tool in a more seamless manner and with additional capabilities that were previously not available. This paper provides an overview of these new capabilities and applies them to simulating a medium-size office building. These simulations address the interaction between whole-building energy, airflow and contaminant transport in evaluating various ventilation strategies including natural and demand-controlled ventilation. Practical Application: CONTAM has been in practical use for many years allowing building designers, as well as IAQ and ventilation system analysts, to simulate the complex interactions between building physical layout and HVAC system configuration in determining building airflow and contaminant transport. It has been widely used to design and analyse smoke management systems and evaluate building performance in response to chemical, biological and radiological events. 
While CONTAM has been used to address design and performance of buildings implementing energy conserving ventilation systems, e.g., natural and hybrid, this new coupled simulation capability will enable users to apply the tool to couple CONTAM with existing energy analysis software to address the interaction between indoor air quality considerations and energy conservation measures in building design and analysis. This paper presents two practical case studies using the coupled modelling tool to evaluate IAQ performance of a CO2-based demand-controlled ventilation system under different levels of building envelope airtightness and the design and analysis of a natural ventilation system. PMID:27099405

  13. Integration of Linear Dynamic Emission and Climate Models with Air Traffic Simulations

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Ng, Hok K.; Chen, Neil Y.

    2012-01-01

    Future air traffic management systems must balance the conflicting objectives of maximizing the safety and efficiency of traffic flows while minimizing the climate impact of aviation emissions and contrails. Integrating emission and climate models with air traffic simulations improves understanding of the complex interaction between the physical climate system, carbon and other greenhouse gas emissions, and aviation activity. This paper integrates a national-level air traffic simulation and optimization capability with simple climate models, carbon cycle models, and climate metrics to assess the impact of aviation on climate. The capability can be used to make trade-offs between extra fuel cost and the reduction in global surface temperature change. The parameters in the simulation can be used to evaluate the effect of various uncertainties in emission and contrail models and the impact of different decision horizons. Alternatively, the optimization results from the simulation can be used as inputs to other tools that monetize global climate impacts, such as the FAA's Aviation Environmental Portfolio Management Tool for Impacts.
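    Simple climate models of the kind described above are typically linear: the surface temperature change is the convolution of an emissions time series with an impulse-response function. The sketch below illustrates that structure with an exponential response; the sensitivity and time constant are placeholder values, not coefficients from the paper.

```python
import math

# Minimal linear-response sketch of a "simple climate model": delta-T is the
# sum of past annual emission pulses weighted by an exponentially decaying
# impulse response. The coefficients are illustrative placeholders only.

def temperature_change(emissions, sensitivity=0.002, tau=40.0):
    """Temperature change after the last year, given annual emission pulses."""
    n = len(emissions)
    dT = 0.0
    for i, e in enumerate(emissions):
        age = n - 1 - i                      # years since this pulse
        dT += sensitivity * e * math.exp(-age / tau)
    return dT
```

    A trade-off study would compare the extra fuel burned (and emitted) by contrail-avoiding trajectories against the temperature change such a response function attributes to the avoided contrails.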

  14. Automated simulation as part of a design workstation

    NASA Technical Reports Server (NTRS)

    Cantwell, Elizabeth; Shenk, T.; Robinson, P.; Upadhye, R.

    1990-01-01

    A development project for a design workstation for advanced life-support systems (the DAWN Project, for Design Assistant Workstation), incorporating qualitative simulation, required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulation such that simulation capabilities are maximized without duplication. To produce design solutions for a system goal, the behavior of the system must be represented in both steady and perturbed states. Discussion focuses on the Qualitative Simulation Tool (QST), on an expert-system-like model-building and simulation interface tool called ScratchPad (SP), and on the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components.

  15. A Deep Space Orbit Determination Software: Overview and Event Prediction Capability

    NASA Astrophysics Data System (ADS)

    Kim, Youngkwang; Park, Sang-Young; Lee, Eunji; Kim, Minsik

    2017-06-01

    This paper presents an overview of the deep space orbit determination software (DSODS), as well as validation and verification results for its event prediction capabilities. DSODS was developed in the MATLAB object-oriented programming environment to support the Korea Pathfinder Lunar Orbiter (KPLO) mission. DSODS has three major capabilities: celestial event prediction for spacecraft, orbit determination with deep space network (DSN) tracking data, and DSN tracking data simulation. To achieve its functionality requirements, DSODS consists of four modules: orbit propagation (OP), event prediction (EP), data simulation (DS), and orbit determination (OD) modules. This paper explains the highest-level data flows between modules in the event prediction, orbit determination, and tracking data simulation processes. Furthermore, to address the event prediction capability of DSODS, this paper introduces the OP and EP modules. The role of the OP module is to handle time and coordinate system conversions, to propagate spacecraft trajectories, and to handle the ephemerides of spacecraft and celestial bodies. Currently, the OP module utilizes the General Mission Analysis Tool (GMAT) as a third-party software component for high-fidelity deep space propagation, as well as time and coordinate system conversions. The role of the EP module is to predict celestial events, including eclipses and ground station visibilities, and this paper presents the functionality requirements of the EP module. The validation and verification results show that, for most cases, event prediction errors were less than 10 ms when compared with flight-proven mission analysis tools such as GMAT and Systems Tool Kit (STK). Thus, we conclude that DSODS is capable of predicting events for the KPLO in real mission applications.
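    A ground-station visibility check of the kind an event-prediction module performs can be illustrated with a spherical-Earth elevation test: the spacecraft is visible when its elevation above the station's local horizon exceeds a mask angle. The geometry below is a generic sketch under that assumption, not the DSODS EP module's implementation.

```python
import math

# Simplified visibility test: compute the spacecraft's elevation angle as
# seen from a ground station. Spherical Earth assumed; both position vectors
# are in a common Earth-fixed frame (km).

def elevation_deg(station, spacecraft):
    """Elevation angle (degrees) of the spacecraft as seen from the station."""
    rho = [s - g for s, g in zip(spacecraft, station)]     # line-of-sight vector
    r = math.sqrt(sum(c * c for c in station))             # station radius
    zenith = [c / r for c in station]                      # local "up" direction
    rho_mag = math.sqrt(sum(c * c for c in rho))
    sin_el = sum(a * b for a, b in zip(rho, zenith)) / rho_mag
    return math.degrees(math.asin(sin_el))

def is_visible(station, spacecraft, mask_deg=10.0):
    """True when the spacecraft is above the station's elevation mask."""
    return elevation_deg(station, spacecraft) >= mask_deg
```

    An event predictor would evaluate such a test along the propagated trajectory and record the rise and set times where the visibility flag changes.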

  16. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation, one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points to a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.

  17. Real-time scene and signature generation for ladar and imaging sensors

    NASA Astrophysics Data System (ADS)

    Swierkowski, Leszek; Christie, Chad L.; Antanovskii, Leonid; Gouthas, Efthimios

    2014-05-01

    This paper describes the development of two key functionalities within the VIRSuite scene simulation program, broadening its scene generation capabilities and increasing the accuracy of its thermal signatures. Firstly, a new LADAR scene generation module has been designed. It is capable of simulating range imagery for Geiger-mode LADAR, in addition to the already existing functionality for linear-mode systems. Furthermore, a new 3D heat diffusion solver has been developed within the VIRSuite signature prediction module. It is capable of calculating the temperature distribution in complex three-dimensional objects for enhanced dynamic prediction of thermal signatures. With these enhancements, VIRSuite is now a robust tool for conducting dynamic simulations of missiles with multi-mode seekers.

  18. Pika: A snow science simulation tool built using the open-source framework MOOSE

    NASA Astrophysics Data System (ADS)

    Slaughter, A.; Johnson, M.

    2017-12-01

    The Department of Energy (DOE) currently invests millions of dollars annually in modeling and simulation tools for all aspects of nuclear energy. An important part of this effort is developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase-field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture and crack propagation (via the extended finite-element method), flow in porous media, and others. The heat conduction, tensor mechanics, and phase-field modules, in particular, are well suited to snow science problems. Pika, an open-source MOOSE-based application, is capable of simulating both 3D, coupled nonlinear continuum heat transfer and large-deformation mechanics applications (such as settlement) and phase-field-based microstructure applications. Additionally, these types of problems may be coupled tightly in a single solve, or across length and time scales using a loosely coupled Picard iteration approach. In addition to this wide range of physics capabilities, MOOSE-based applications also inherit an extensible testing framework, a graphical user interface, and a documentation system, tools that allow MOOSE and other applications to adhere to nuclear software quality standards. The snow science community can learn from the nuclear industry and harness this existing effort to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other applications. The snow science community should build on existing tools to enable collaboration between researchers and practitioners throughout the world, and advance the state of the art in line with other scientific research efforts.
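    The loosely coupled Picard iteration mentioned above can be sketched as a fixed-point loop in which two single-physics solvers repeatedly exchange fields until the coupled solution stops changing. The scalar "solvers" below are stand-in functions chosen so the fixed point is known; this is an illustration of the iteration pattern, not MOOSE code.

```python
# Sketch of a loosely coupled Picard (fixed-point) iteration: physics 1
# solves using physics 2's latest field, physics 2 solves using the updated
# field, and the loop repeats until both fields stop changing.

def picard(solve_heat, solve_mech, t0, d0, tol=1e-10, max_iter=100):
    t, d = t0, d0
    for i in range(max_iter):
        t_new = solve_heat(d)        # physics 1 uses physics 2's last field
        d_new = solve_mech(t_new)    # physics 2 uses the updated field
        if abs(t_new - t) < tol and abs(d_new - d) < tol:
            return t_new, d_new, i + 1
        t, d = t_new, d_new
    raise RuntimeError("Picard iteration did not converge")

# Example coupled pair with known fixed point T = 2, d = 1:
heat = lambda d: 1.0 + d            # T = 1 + d
mech = lambda t: 0.5 * t            # d = T / 2
```

    The same pattern scales up when the scalars become full field solves; convergence depends on how strongly the two physics feed back on each other.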

  19. Telescience - Concepts and contributions to the Extreme Ultraviolet Explorer mission

    NASA Technical Reports Server (NTRS)

    Marchant, Will; Dobson, Carl; Chakrabarti, Supriya; Malina, Roger F.

    1987-01-01

    It is shown how the contradictory goals of low cost and fast data turnaround characterizing the Extreme Ultraviolet Explorer (EUVE) mission can be achieved via the early use of telescience-style transparent tools and simulations. The use of transparent tools reduces the parallel development of capability while ensuring that valuable prelaunch experience is not lost in the operations phase. Efforts made to upgrade the 'EUVE electronics' simulator are described.

  20. Dynamic analysis of flexible mechanical systems using LATDYN

    NASA Technical Reports Server (NTRS)

    Wu, Shih-Chin; Chang, Che-Wei; Housner, Jerrold M.

    1989-01-01

    A 3-D, finite-element-based simulation tool for flexible multibody systems is presented. Hinge degrees of freedom are built into the equations of motion to reduce geometric constraints. The approach avoids the difficulty of selecting deformation modes for flexible components that is associated with the assumed-mode method. The tool is applied to simulate a practical space structure deployment problem. The results of the examples demonstrate the capability of the code and approach.

  1. Integrated corridor management analysis, modeling, and simulation results for the test corridor.

    DOT National Transportation Integrated Search

    2008-06-01

    This report documents the Integrated Corridor Management (ICM) Analysis Modeling and Simulation (AMS) tools and strategies used on a Test Corridor, presents results and lessons-learned, and documents the relative capability of AMS to support benefit-...

  2. Coherent tools for physics-based simulation and characterization of noise in semiconductor devices oriented to nonlinear microwave circuit CAD

    NASA Astrophysics Data System (ADS)

    Riah, Zoheir; Sommet, Raphael; Nallatamby, Jean C.; Prigent, Michel; Obregon, Juan

    2004-05-01

    We present in this paper a set of coherent tools for noise characterization and physics-based analysis of noise in semiconductor devices. This noise toolbox relies on a low-frequency noise measurement setup with special high-current capabilities, thanks to an accurate and original calibration. It also relies on a simulation tool based on the drift-diffusion equations and linear perturbation theory, associated with the Green's function technique. This physics-based noise simulator has been implemented successfully in the Scilab environment and is specifically dedicated to HBTs. Some results are given and compared to those existing in the literature.

  3. Status of the AIAA Modeling and Simulation Format Standard

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Hildreth, Bruce L.

    2008-01-01

    The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.

  4. A breakthrough for experiencing and understanding simulated physics

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1988-01-01

    The use of computer simulation in physics research is discussed, focusing on improvements to graphic workstations. Simulation capabilities and applications of enhanced visualization tools are outlined. The elements of an ideal computer simulation are presented and the potential for improving various simulation elements is examined. The interface between the human and the computer and simulation models are considered. Recommendations are made for changes in computer simulation practices and applications of simulation technology in education.

  5. Usability and User Satisfaction of Multimedia Instructional Message (MIM) for Packet Tracer Simulation

    ERIC Educational Resources Information Center

    Elias, Mohd Syahrizad; Mohamad Ali, Ahmad Zamzuri

    2016-01-01

    Simulation-aided learning can improve students' learning performance. However, its positive effect is still being debated, as at times it has not played the purported role expected of it. To address these problems, the Multimedia Instructional Message (MIM) appears to be an essential supporting tool in ensuring…

  6. Computer-Aided Engineering Tools | Water Power | NREL

    Science.gov Websites

    High-performance-computer simulation of water power technologies, including energy converters, will provide a full range of simulation capabilities for single devices and arrays, and enables the study of complex systems and experimentation. Such simulation is critical to accelerating progress in energy programs within the U.S. Department of Energy.

  7. L3.PHI.CTF.P10.02-rev2 Coupling of Subchannel T/H (CTF) and CRUD Chemistry (MAMBA1D)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salko, Robert K.; Palmtag, Scott; Collins, Benjamin S.

    2015-05-15

    The purpose of this milestone is to create a preliminary capability for modeling light water reactor (LWR) thermal-hydraulics (T/H) and CRUD growth using the CTF subchannel code and the subgrid version of the MAMBA CRUD chemistry code, MAMBA1D. In part, this is a follow-on to Milestone L3.PHI.VCS.P9.01, which is documented in Report CASL-U-2014-0188-000, titled "Development of CTF Capability for Modeling Reactor Operating Cycles with Crud Growth". As the title suggests, the previous milestone set up a framework for modeling reactor operating cycles with CTF. The framework also facilitated coupling to a CRUD chemistry capability for modeling CRUD growth throughout the reactor operating cycle. To demonstrate the capability, a simple CRUD "surrogate" tool was developed and coupled to CTF; however, it was noted that CRUD growth predictions by the surrogate were not considered realistic. This milestone builds on L3.PHI.VCS.P9.01 by replacing this simple surrogate tool with the more advanced MAMBA1D CRUD chemistry code. Completing this task involved addressing unresolved tasks from Milestone L3.PHI.VCS.P9.01, setting up an interface to MAMBA1D, and extracting new T/H information from CTF that was not previously required by the simple surrogate tool. Specific challenges encountered during this milestone include (1) treatment of the CRUD erosion model, which requires local turbulent kinetic energy (TKE), a value that CTF does not calculate, and (2) treatment of the MAMBA1D CRUD chimney boiling model in the CTF rod heat transfer solution. To demonstrate this new T/H and CRUD modeling capability, two sets of simulations were performed: (1) an 18-month cycle simulation of a quarter-symmetry model of Watts Bar and (2) a simulation of Assemblies G69 and G70 from Seabrook Cycle 5. The Watts Bar simulation is merely a demonstration of the capability. 
    The simulation of the Seabrook cycle, which had experienced CRUD-related fuel rod failures, had actual CRUD-scrape data to compare with results. As the results show, the initial CTF/MAMBA1D-predicted CRUD thicknesses were about half of their expected values, so further investigation will be required for this simulation.

  8. A TT&C Performance Simulator for Space Exploration and Scientific Satellites - Architecture and Applications

    NASA Astrophysics Data System (ADS)

    Donà, G.; Faletra, M.

    2015-09-01

    This paper presents the TT&C performance simulator toolkit developed internally at Thales Alenia Space Italia (TAS-I) to support the design of TT&C subsystems for space exploration and scientific satellites. The simulator has a modular architecture and has been designed using a model-based approach with standard engineering tools such as MATLAB/SIMULINK and mission analysis tools (e.g. STK). The simulator is easily reconfigurable to fit different types of satellites, different mission requirements and different scenario parameters. This paper provides a brief description of the simulator architecture together with two examples of applications used to demonstrate some of the simulator’s capabilities.

  9. Innovative and Advanced Coupled Neutron Transport and Thermal Hydraulic Method (Tool) for the Design, Analysis and Optimization of VHTR/NGNP Prismatic Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rahnema, Farzad; Garimeela, Srinivas; Ougouag, Abderrafi

    2013-11-29

    This project will develop a 3D, advanced coarse mesh transport method (COMET-Hex) for steady-state and transient analyses in advanced very high-temperature reactors (VHTRs). The project will lead to a coupled neutronics and thermal hydraulic (T/H) core simulation tool with fuel depletion capability. The computational tool will be developed in hexagonal geometry, based solely on transport theory without (spatial) homogenization in complicated 3D geometries. In addition to the hexagonal geometry extension, collaborators will concurrently develop three additional capabilities to increase the code’s versatility as an advanced and robust core simulator for VHTRs. First, the project team will develop and implement a depletion method within the core simulator. Second, the team will develop an elementary (proof-of-concept) 1D time-dependent transport method for efficient transient analyses. The third capability will be a thermal hydraulic method coupled to the neutronics transport module for VHTRs. Current advancements in reactor core design are pushing VHTRs toward greater core and fuel heterogeneity to pursue higher burn-ups, efficiently transmute used fuel, maximize energy production, and improve plant economics and safety. As a result, an accurate and efficient neutron transport method, with capabilities to treat heterogeneous burnable poison effects, is highly desirable for predicting VHTR neutronics performance. This research project’s primary objective is to advance the state of the art for reactor analysis.

  10. The use of real-time, hardware-in-the-loop simulation in the design and development of the new Hughes HS601 spacecraft attitude control system

    NASA Technical Reports Server (NTRS)

    Slafer, Loren I.

    1989-01-01

    Real-time simulation and hardware-in-the-loop testing are being used extensively in all phases of the design, development, and testing of the attitude control system (ACS) for the new Hughes HS601 satellite bus. Real-time, hardware-in-the-loop simulation, integrated with traditional analysis and pure simulation activities, is shown to provide a highly efficient and productive overall development program. Implementation of high-fidelity simulations of the satellite dynamics and control system algorithms, capable of real-time execution (using Applied Dynamics International's System 100), provides a tool that can be integrated with the critical flight microprocessor to create a mixed simulation test (MST). The MST creates a highly accurate, detailed simulated on-orbit test environment, capable of open- and closed-loop ACS testing, in which the ACS design can be validated. The MST is shown to provide a valuable extension of traditional test methods. A description of the MST configuration is presented, including the spacecraft dynamics simulation model, sensor and actuator emulators, and the test support system. Overall system performance parameters are presented. MST applications are discussed: supporting ACS design, developing on-orbit system performance predictions, flight software development and qualification testing (augmenting the traditional software-based testing), mission planning, and a cost-effective subsystem-level acceptance test. The MST is shown to provide an ideal tool with which the ACS designer can fly the spacecraft on the ground.

  11. A study on directional resistivity logging-while-drilling based on self-adaptive hp-FEM

    NASA Astrophysics Data System (ADS)

    Liu, Dejun; Li, Hui; Zhang, Yingying; Zhu, Gengxue; Ai, Qinghui

    2014-12-01

    Numerical simulation of resistivity logging-while-drilling (LWD) tool response provides guidance for designing novel logging instruments and interpreting real-time logging data. In this paper, based on a self-adaptive hp-finite element method (hp-FEM) algorithm, we analyze LWD tool response against model parameters and briefly illustrate the geosteering capabilities of directional resistivity LWD. Numerical simulation results indicate that source spacing strongly influences the investigation depth and detection precision of the resistivity LWD tool, and that changing the frequency can improve the resolution of low-resistivity and high-resistivity formations. The results also indicate that the self-adaptive hp-FEM algorithm has good convergence speed and calculation accuracy for guiding geosteering drilling, and that it is suitable for simulating the response of resistivity LWD tools.

  12. A graph-based computational framework for simulation and optimisation of coupled infrastructure networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jalving, Jordan; Abhyankar, Shrirang; Kim, Kibaek

    Here, we present a computational framework that facilitates the construction, instantiation, and analysis of large-scale optimization and simulation applications of coupled energy networks. The framework integrates the optimization modeling package PLASMO and the simulation package DMNetwork (built around PETSc). These tools use a common graph-based abstraction that enables us to achieve compatibility between data structures and to build applications that use network models of different physical fidelity. We also describe how to embed these tools within complex computational workflows using SWIFT, a tool that facilitates parallel execution of multiple simulation runs and management of input and output data. We discuss how to use these capabilities to target coupled natural gas and electricity systems.
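    The shared graph-based abstraction can be illustrated with a minimal structure in which nodes hold component models and edges hold links, so that either a simulation or an optimization backend can traverse the same topology. The class and the gas-network values below are a generic sketch, not the PLASMO or DMNetwork API.

```python
# Minimal illustration of a common graph abstraction for coupled networks:
# nodes carry model data, edges carry link data, and different backends can
# attach models of different fidelity to the same topology.

class NetworkGraph:
    def __init__(self):
        self.nodes = {}                      # name -> model data
        self.edges = []                      # (src, dst, link data)

    def add_node(self, name, **model):
        self.nodes[name] = model

    def add_edge(self, src, dst, **link):
        assert src in self.nodes and dst in self.nodes
        self.edges.append((src, dst, link))

    def neighbors(self, name):
        return [dst for src, dst, _ in self.edges if src == name]

# A tiny gas network with illustrative values:
gas = NetworkGraph()
gas.add_node("well", pressure=70.0)
gas.add_node("plant", demand=12.5)
gas.add_edge("well", "plant", length_km=120.0)
```

    Coupling two infrastructures then amounts to adding edges between nodes of the two graphs, for instance linking a gas "plant" node to an electricity generator node it fuels.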

  13. Distributed Observer Network (DON), Version 3.0, User's Guide

    NASA Technical Reports Server (NTRS)

    Mazzone, Rebecca A.; Conroy, Michael P.

    2015-01-01

    The Distributed Observer Network (DON) is a data presentation tool developed by the National Aeronautics and Space Administration (NASA) to distribute and publish simulation results. Leveraging the display capabilities inherent in modern gaming technology, DON places users in a fully navigable 3-D environment containing graphical models and allows the users to observe how those models evolve and interact over time in a given scenario. Each scenario is driven with data that has been generated by authoritative NASA simulation tools and exported in accordance with a published data interface specification. This decoupling of the data from the source tool enables DON to faithfully display a simulator's results and ensure that every simulation stakeholder will view the exact same information every time.

  14. A graph-based computational framework for simulation and optimisation of coupled infrastructure networks

    DOE PAGES

    Jalving, Jordan; Abhyankar, Shrirang; Kim, Kibaek; ...

    2017-04-24

    Here, we present a computational framework that facilitates the construction, instantiation, and analysis of large-scale optimization and simulation applications of coupled energy networks. The framework integrates the optimization modeling package PLASMO and the simulation package DMNetwork (built around PETSc). These tools use a common graph-based abstraction that enables us to achieve compatibility between data structures and to build applications that use network models of different physical fidelity. We also describe how to embed these tools within complex computational workflows using SWIFT, a tool that facilitates parallel execution of multiple simulation runs and management of input and output data. We discuss how to use these capabilities to target coupled natural gas and electricity systems.

  15. Evaluation and demonstration of commercialization potential of CCSI tools within gPROMS advanced simulation platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawal, Adekola; Schmal, Pieter; Ramos, Alfredo

    PSE, in the first phase of the CCSI commercialization project, set out to identify market opportunities for the CCSI tools combined with existing gPROMS platform capabilities and develop a clear technical plan for the proposed commercialization activities.

  16. NetMOD v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merchant, Bion J

    2015-12-22

    NetMOD is a tool to model the performance of global ground-based explosion monitoring systems. Version 2.0 of the software supports the simulation of seismic, hydroacoustic, and infrasonic detection capability. The tool provides a user interface to execute simulations based upon a hypothetical definition of the monitoring system configuration, geophysical properties of the Earth, and detection analysis criteria. NetMOD will be distributed with a project file defining the basic performance characteristics of the International Monitoring System (IMS), a network of sensors operated by the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). Network modeling is needed to assess and explain the potential effect of changes to the IMS, to prioritize station deployment and repair, and to assess the overall CTBTO monitoring capability both now and in the future. Currently the CTBTO uses version 1.0 of NetMOD, provided to them in early 2014. NetMOD will provide a modern tool that covers all the simulations currently available and allows for the development of additional simulation capabilities of the IMS in the future. NetMOD simulates the performance of monitoring networks by estimating the relative amplitudes of the signal and noise measured at each of the stations within the network based upon known geophysical principles. From these signal and noise estimates, a probability of detection may be determined for each of the stations. The detection probabilities at each of the stations may then be combined to produce an estimate of the detection probability for the entire monitoring network.
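
    The final step above, combining per-station detection probabilities into a network-wide estimate, can be sketched under two stated assumptions: stations detect independently, and the network "detects" when some minimum number of stations do (three is a common convention in seismic monitoring; NetMOD's actual detection criteria may differ):

    ```python
    from itertools import combinations
    from math import prod

    def network_detection_probability(p_stations, min_detections=3):
        """Probability that at least `min_detections` of the independent
        stations detect an event, given per-station probabilities."""
        n = len(p_stations)
        total = 0.0
        # Sum over every subset of stations that could be the detecting set.
        for k in range(min_detections, n + 1):
            for detecting in combinations(range(n), k):
                total += prod(p_stations[i] if i in detecting
                              else 1.0 - p_stations[i]
                              for i in range(n))
        return total

    # Four hypothetical stations with differing signal-to-noise margins.
    p_net = network_detection_probability([0.9, 0.8, 0.7, 0.6])
    print(p_net)  # ≈ 0.743
    ```

    The exhaustive subset sum is exponential in the number of stations, so a production tool would use a recursive or Poisson-binomial formulation; the brute-force version here is only meant to make the combination rule explicit.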

  17. Web-based applications for building, managing and analysing kinetic models of biological systems.

    PubMed

    Lee, Dong-Yup; Saha, Rajib; Yusufi, Faraaz Noor Khan; Park, Wonjun; Karimi, Iftekhar A

    2009-01-01

    Mathematical modelling and computational analysis play an essential role in improving our capability to elucidate the functions and characteristics of complex biological systems such as metabolic, regulatory and cell signalling pathways. The modelling and concomitant simulation render it possible to predict the cellular behaviour of systems under various genetically and/or environmentally perturbed conditions. This motivates systems biologists/bioengineers/bioinformaticians to develop new tools and applications, allowing non-experts to easily conduct such modelling and analysis. However, among a multitude of systems biology tools developed to date, only a handful of projects have adopted a web-based approach to kinetic modelling. In this report, we evaluate the capabilities and characteristics of current web-based tools in systems biology and identify desirable features, limitations and bottlenecks for further improvements in terms of usability and functionality. A short discussion on software architecture issues involved in web-based applications and the approaches taken by existing tools is included for those interested in developing their own simulation applications.

  18. Device Performance | Photovoltaic Research | NREL

    Science.gov Websites

    Our capabilities for measuring key performance parameters of solar cells and modules include the use of various solar simulators and tools to measure current-voltage characteristics.

  19. Analysis and simulation tools for solar array power systems

    NASA Astrophysics Data System (ADS)

    Pongratananukul, Nattorn

    This dissertation presents simulation tools developed specifically for the design of solar array power systems. Contributions are made in several aspects of the system design phases, including solar source modeling, system simulation, and controller verification. A tool to automate the study of solar array configurations using general-purpose circuit simulators has been developed based on the modeling of individual solar cells. The hierarchical structure of solar cell elements, including semiconductor properties, allows simulation of electrical properties as well as evaluation of the impact of environmental conditions. A second tool provides a co-simulation platform with the capability to verify the performance of an actual digital controller implemented in programmable hardware such as a DSP processor, while the entire solar array, including the DC-DC power converter, is modeled in software running on a computer. This "virtual plant" supports developing and debugging code for the digital controller and improving the control algorithm. One important task in solar arrays is to track the maximum power point of the array in order to maximize the power that can be delivered. Digital controllers implemented with programmable processors are particularly attractive for this task because sophisticated tracking algorithms can be implemented and revised when needed to optimize their performance. The proposed co-simulation tools are thus very valuable in developing and optimizing the control algorithm before the system is built. Examples that demonstrate the effectiveness of the proposed methodologies are presented. The proposed simulation tools are also valuable in the design of multi-channel arrays. In the specific system that we designed and tested, the control algorithm is implemented on a single digital signal processor, and the maximum power point is tracked individually in each channel.
    In the prototype we built, off-the-shelf commercial DC-DC converters were utilized. Finally, the overall performance of the entire system was evaluated using solar array simulators capable of simulating various I-V characteristics, and also by using an electronic load. Experimental results are presented.
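
    The maximum-power-point tracking task described above is commonly handled with a perturb-and-observe algorithm: perturb the operating voltage, and keep moving in whichever direction power increased. A minimal sketch follows, using a made-up parabolic power curve in place of a real array model (the dissertation's actual tracking algorithm is not specified):

    ```python
    def perturb_and_observe(power_at, v0=10.0, dv=0.1, steps=200):
        """Classic P&O maximum power point tracking loop."""
        v, last_p = v0, power_at(v0)
        direction = 1.0
        for _ in range(steps):
            v += direction * dv
            p = power_at(v)
            if p < last_p:          # power dropped: reverse the perturbation
                direction = -direction
            last_p = p
        return v, last_p

    # Toy power curve with its maximum power point at 17 V, 100 W.
    power = lambda v: 100.0 - 0.5 * (v - 17.0) ** 2

    v_mpp, p_mpp = perturb_and_observe(power, v0=10.0, dv=0.1, steps=200)
    # Converges to within one perturbation step of 17 V, then oscillates there.
    ```

    The steady-state oscillation around the maximum is the well-known drawback of P&O, and is one reason a co-simulation platform is useful for tuning the step size `dv` before committing the algorithm to DSP hardware.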

  20. Status of Computational Aerodynamic Modeling Tools for Aircraft Loss-of-Control

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Murphy, Patrick C.; Atkins, Harold L.; Viken, Sally A.; Petrilli, Justin L.; Gopalarathnam, Ashok; Paul, Ryan C.

    2016-01-01

    A concerted effort has been underway over the past several years to evolve computational capabilities for modeling aircraft loss-of-control under the NASA Aviation Safety Program. A principal goal has been to develop reliable computational tools for predicting and analyzing the non-linear stability & control characteristics of aircraft near stall boundaries affecting safe flight, and for utilizing those predictions for creating augmented flight simulation models that improve pilot training. Pursuing such an ambitious task with limited resources required the forging of close collaborative relationships with a diverse body of computational aerodynamicists and flight simulation experts to leverage their respective research efforts into the creation of NASA tools to meet this goal. Considerable progress has been made and work remains to be done. This paper summarizes the status of the NASA effort to establish computational capabilities for modeling aircraft loss-of-control and offers recommendations for future work.

  1. Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.

    2015-01-01

    The rise of innovative unmanned aeronautical systems and the emergence of commercial space activities have resulted in a number of relatively new aerospace organizations that are designing innovative systems and solutions. These organizations use a variety of commercial off-the-shelf and in-house-developed simulation and analysis tools including 6-degree-of-freedom (6-DOF) flight simulation tools. The increased affordability of computing capability has made high-fidelity flight simulation practical for all participants. Verification of the tools' equations-of-motion and environment models (e.g., atmosphere, gravitation, and geodesy) is desirable to assure accuracy of results. However, aside from simple textbook examples, minimal verification data exists in open literature for 6-DOF flight simulation problems. This assessment compared multiple solution trajectories to a set of verification check-cases that covered atmospheric and exo-atmospheric (i.e., orbital) flight. Each scenario consisted of predefined flight vehicles, initial conditions, and maneuvers. These scenarios were implemented and executed in a variety of analytical and real-time simulation tools. This tool-set included simulation tools in a variety of programming languages based on modified flat-Earth, round-Earth, and rotating oblate spheroidal Earth geodesy and gravitation models, and independently derived equations-of-motion and propagation techniques. The resulting simulated parameter trajectories were compared by over-plotting and difference-plotting to yield a family of solutions. In total, seven simulation tools were exercised.
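
    The difference-plotting comparison described above amounts to interpolating each tool's output onto a common time grid and subtracting a reference solution. A small sketch, using made-up altitude histories rather than the actual check-case data:

    ```python
    import numpy as np

    def trajectory_differences(reference, others, times):
        """Interpolate each tool's trajectory onto a common time grid and
        return its difference from the reference solution, as used when
        difference-plotting a family of check-case results."""
        t_ref, y_ref = reference
        base = np.interp(times, t_ref, y_ref)
        return {name: np.interp(times, t, y) - base
                for name, (t, y) in others.items()}

    # Hypothetical altitude histories from two simulation tools.
    times = np.linspace(0.0, 10.0, 11)
    ref = (times, 1000.0 - 4.9 * times**2)
    tools = {"toolB": (times, 1000.0 - 4.9 * times**2 + 0.5)}

    diffs = trajectory_differences(ref, tools, times)
    # diffs["toolB"] is a constant 0.5 m offset over the whole trajectory.
    ```

    In practice each tool outputs on its own time step, which is why the interpolation onto a shared grid is needed before the differences are meaningful.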

  2. Amanzi: An Open-Source Multi-process Simulator for Environmental Applications

    NASA Astrophysics Data System (ADS)

    Moulton, J. D.; Molins, S.; Johnson, J. N.; Coon, E.; Lipnikov, K.; Day, M.; Barker, E.

    2014-12-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments begin with simplified models, and add geometric and geologic complexity as understanding is gained. The Platform toolset (Akuna) generates these conceptual models, and Amanzi provides the computational engine to perform the simulations, returning the results for analysis and visualization. In this presentation we highlight key elements of the design, algorithms, and implementations used in Amanzi. In particular, the hierarchical and modular design is aligned with the coupled processes being simulated, and naturally supports a wide range of model complexity. This design leverages a dynamic data manager and the synergy of two graphs (one from the high-level perspective of the models, the other from the dependencies of the variables in the model) to enable flexible model configuration at run time. Moreover, to model sites with complex hydrostratigraphy, as well as engineered systems, we are developing a dual unstructured/structured capability. Recently, these capabilities have been collected in a framework named Arcos, and efforts have begun to improve interoperability between the unstructured and structured AMR approaches in Amanzi. To leverage a range of biogeochemistry capability from the community (e.g., CrunchFlow, PFLOTRAN, etc.), a biogeochemistry interface library called Alquimia was developed. To ensure that Amanzi is truly an open-source community code, we require a completely open-source tool chain for our development. We will comment on elements of this tool chain, including testing and documentation tools such as docutils and Sphinx. Finally, we will show simulation results from our phased demonstrations, including the geochemically complex Savannah River F-Area seepage basins.

  3. Simulator for heterogeneous dataflow architectures

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.

    1993-01-01

    A new simulator is developed to simulate the execution of an algorithm graph in accordance with the Algorithm to Architecture Mapping Model (ATAMM) rules. ATAMM is a Petri net model which describes the periodic execution of large-grained, data-independent dataflow graphs and which provides predictable, steady-state, time-optimized performance. This simulator extends the ATAMM simulation capability from a homogeneous set of resources, or functional units, to a more general heterogeneous architecture. Simulation test cases show that the simulator accurately executes the ATAMM rules for both a heterogeneous architecture and a homogeneous architecture, which is the special case of only one processor type. The simulator forms one tool in an ATAMM Integrated Environment, which contains other tools for graph entry, graph modification for performance optimization, and playback of simulations for analysis.

  4. Numerical Model of Flame Spread Over Solids in Microgravity: A Supplementary Tool for Designing a Space Experiment

    NASA Technical Reports Server (NTRS)

    Shih, Hsin-Yi; Tien, James S.; Ferkul, Paul (Technical Monitor)

    2001-01-01

    The recently developed numerical model of concurrent-flow flame spread over thin solids has been used as a simulation tool to support the design of a space experiment. The two-dimensional and three-dimensional steady forms of the compressible Navier-Stokes equations with chemical reactions are solved. Coupled with a multi-dimensional radiative heat transfer solver, the model is capable of answering a number of questions regarding the experiment concept and the hardware designs. In this paper, the capabilities of the numerical model are demonstrated by providing guidance on several experimental design issues. The test matrix and operating conditions of the experiment are estimated through the modeling results. Three-dimensional calculations are made to simulate the flame-spreading experiment with the realistic hardware configuration. The computed detailed flame structures provide insight for data collection. In addition, the heating load and the product exhaust cleanup requirements for the flow tunnel are estimated with the model. We anticipate that using this simulation tool will enable a more efficient and successful space experiment to be conducted.

  5. Nuclear Energy Advanced Modeling and Simulation (NEAMS) Waste Integrated Performance and Safety Codes (IPSC) : FY10 development and integration.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.

    2011-02-01

    This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.

  6. Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB.

    PubMed

    Lee, Leng-Feng; Umberger, Brian R

    2016-01-01

    Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1-2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. 
This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility.
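
    The direct collocation technique described above can be illustrated on a toy problem. This is not the OpenSim/MATLAB framework itself, and it uses SciPy's SLSQP solver rather than IPOPT; it transcribes a minimum-effort double-integrator problem (reach position 1 with zero velocity at t = 1) using trapezoidal collocation, the same transcription family the paper describes:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    N = 21                      # collocation nodes
    h = 1.0 / (N - 1)           # uniform time step on [0, 1]

    def unpack(z):
        return z[:N], z[N:2*N], z[2*N:]   # position x, velocity v, control u

    def objective(z):
        _, _, u = unpack(z)
        # Trapezoid-rule approximation of the integral of u^2.
        return h * np.sum((u[:-1]**2 + u[1:]**2) / 2.0)

    def defects(z):
        x, v, u = unpack(z)
        # Trapezoidal dynamics defects for x' = v, v' = u.
        dx = x[1:] - x[:-1] - h * (v[:-1] + v[1:]) / 2.0
        dv = v[1:] - v[:-1] - h * (u[:-1] + u[1:]) / 2.0
        bc = [x[0], v[0], x[-1] - 1.0, v[-1]]   # boundary conditions
        return np.concatenate([dx, dv, bc])

    sol = minimize(objective, np.zeros(3 * N), method="SLSQP",
                   constraints={"type": "eq", "fun": defects},
                   options={"maxiter": 500})
    # The continuous-time optimum is u(t) = 6 - 12t with cost 12;
    # the discretized cost should land close to that value.
    ```

    The defect constraints are sparse (each row touches only neighboring nodes), which is exactly the structure IPOPT exploits and the reason the paper reports its large performance advantage over dense solvers like fmincon.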

  7. Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB

    PubMed Central

    Lee, Leng-Feng

    2016-01-01

    Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1–2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. 
This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility. PMID:26835184

  8. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation, volume 2, part 1. Appendix A: Software documentation

    NASA Technical Reports Server (NTRS)

    Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.

    1982-01-01

    Documentation of the preliminary software developed as a framework for a generalized integrated robotic system simulation is presented. The program structure is composed of three major functions controlled by a program executive. The three major functions are: system definition, analysis tools, and post processing. The system definition function handles user input of system parameters and definition of the manipulator configuration. The analysis tools function handles the computational requirements of the program. The post processing function allows for more detailed study of the results of analysis tool function executions. Also documented is the manipulator joint model software to be used as the basis of the manipulator simulation which will be part of the analysis tools capability.

  9. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2011-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time-consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new parallel version of TRAM implemented on a graphics processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
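
    TRAM's internal method is not described in this abstract, but the general idea of pointing an analyst at failure-driving variables can be sketched with a simple, hypothetical separation score: variables whose distributions differ most between failed and successful cases are ranked first. All names and the synthetic data below are illustrative:

    ```python
    import numpy as np

    def rank_failure_drivers(inputs, failed):
        """Rank Monte Carlo input variables by how strongly their mean
        differs between failed and successful cases (a simple stand-in
        for a driving-variable identification step)."""
        scores = {}
        for name, values in inputs.items():
            bad, good = values[failed], values[~failed]
            spread = values.std() + 1e-12   # guard against zero variance
            scores[name] = abs(bad.mean() - good.mean()) / spread
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    rng = np.random.default_rng(0)
    n = 5000
    inputs = {"wind_speed": rng.normal(0.0, 1.0, n),
              "mass_error": rng.normal(0.0, 1.0, n)}
    # Failures driven almost entirely by wind_speed in this synthetic set.
    failed = inputs["wind_speed"] > 1.0

    ranking = rank_failure_drivers(inputs, failed)
    # ranking[0] is ("wind_speed", <large score>); mass_error scores near 0.
    ```

    A production tool would of course use richer statistics and handle correlated failures, but even this crude score separates the driving variable from the inert one by an order of magnitude on the synthetic data.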

  10. NASA Simulation Capabilities

    NASA Technical Reports Server (NTRS)

    Grabbe, Shon R.

    2017-01-01

    This presentation provides a high-level overview of NASA's Future ATM Concepts Evaluation Tool (FACET) with a high-level description of the system's inputs and outputs. This presentation is designed to support the joint simulations that NASA and the Chinese Aeronautical Establishment (CAE) will conduct under an existing Memorandum of Understanding.

  11. THE STORM WATER MANAGEMENT MODEL (SWMM) AND RELATED WATERSHED TOOLS DEVELOPMENT

    EPA Science Inventory

    The Storm Water Management Model (SWMM) is a dynamic rainfall-runoff simulation model used for single event or long-term (continuous) simulation of runoff quantity and quality from primarily urban areas. It is the only publicly available model capable of performing a comprehensiv...

  12. MAGIC: Model and Graphic Information Converter

    NASA Technical Reports Server (NTRS)

    Herbert, W. C.

    2009-01-01

    MAGIC is a software tool capable of converting highly detailed 3D models from an open, standard format, VRML 2.0/97, into the proprietary DTS file format used by the Torque Game Engine from GarageGames. MAGIC is used to convert 3D simulations from authoritative sources into the data needed to run the simulations in NASA's Distributed Observer Network. The Distributed Observer Network (DON) is a simulation presentation tool built by NASA to facilitate the simulation sharing requirements of the Data Presentation and Visualization effort within the Constellation Program. DON is built on top of the Torque Game Engine (TGE) and has chosen TGE's Dynamix Three Space (DTS) file format to represent 3D objects within simulations.

  13. Recent Developments in Aircraft Flyover Noise Simulation at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Sullivan, Brenda M.; Aumann, Aric R.

    2008-01-01

    The NASA Langley Research Center is involved in the development of a new generation of synthesis and simulation tools for creation of virtual environments used in the study of aircraft community noise. The original emphasis was on simulation of flyover noise associated with subsonic fixed wing aircraft. Recently, the focus has shifted to rotary wing aircraft. Many aspects of the simulation are applicable to both vehicle classes. Other aspects, particularly those associated with synthesis, are more vehicle specific. This paper discusses the capabilities of the current suite of tools, their application to fixed and rotary wing aircraft, and some directions for the future.

  14. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis. Volume 2; Appendices

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.

    2010-01-01

    The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II (POST2) simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL Systems Analysis (EDL-SA) team, which is conducting studies of the technologies and architectures that are required to enable higher mass robotic and human missions to Mars. The appendices to the original report are contained in this document.

  15. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis, Phase 2 Results

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.

    2011-01-01

    The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL-Systems Analysis (SA) team that is conducting studies of the technologies and architectures that are required to enable human and higher mass robotic missions to Mars. The findings, observations, and recommendations from the NESC are provided in this report.

  16. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis. Volume 1

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.

    2010-01-01

    The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II (POST2) simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL Systems Analysis (EDL-SA) team, which is conducting studies of the technologies and architectures that are required to enable higher mass robotic and human missions to Mars. The findings of the assessment are contained in this report.

  17. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    PubMed

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables an easy design of control systems and strategies applied to wastewater treatment plants. Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system on the PC of the full-scale WWTP (wastewater treatment plant). The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC (OLE for Process Control) server, which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, control system performance can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.

  18. The Loci Multidisciplinary Simulation System Overview and Status

    NASA Technical Reports Server (NTRS)

    Luke, Edward A.; Tong, Xiao-Ling; Tang, Lin

    2002-01-01

    This paper will discuss the Loci system, an innovative tool for developing tightly coupled multidisciplinary three dimensional simulations. This presentation will overview some of the unique capabilities of the Loci system to automate the assembly of numerical simulations from libraries of fundamental computational components. We will discuss the demonstration of the Loci system on coupled fluid-structure problems related to RBCC propulsion systems.

  19. Design Patterns for Learning and Assessment: Facilitating the Introduction of a Complex Simulation-Based Learning Environment into a Community of Instructors

    NASA Astrophysics Data System (ADS)

    Frezzo, Dennis C.; Behrens, John T.; Mislevy, Robert J.

    2010-04-01

    Simulation environments make it possible for science and engineering students to learn to interact with complex systems. Putting these capabilities to effective use for learning, and assessing learning, requires more than a simulation environment alone. It requires a conceptual framework for the knowledge, skills, and ways of thinking that are meant to be developed, in order to design activities that target these capabilities. The challenges of using simulation environments effectively are especially daunting in dispersed social systems. This article describes how these challenges were addressed in the context of the Cisco Networking Academies with a simulation tool for computer networks called Packet Tracer. The focus is on a conceptual support framework for instructors in over 9,000 institutions around the world for using Packet Tracer in instruction and assessment, by learning to create problem-solving scenarios that are at once tuned to the local needs of their students and consistent with the epistemic frame of "thinking like a network engineer." We describe a layered framework of tools and interfaces above the network simulator that supports the use of Packet Tracer in the distributed community of instructors and students.

  20. Integrated simulations for fusion research in the 2030's time frame (white paper outline)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman, Alex; LoDestro, Lynda L.; Parker, Jeffrey B.

    This white paper presents the rationale for developing a community-wide capability for whole-device modeling, and advocates for an effort with the expectation of persistence: a long-term programmatic commitment, and support for community efforts. Statement of 2030 goal (two suggestions): (a) Robust integrated simulation tools to aid real-time experimental discharges and reactor designs by employing a hierarchy in fidelity of physics models. (b) To produce by the early 2030s a capability for validated, predictive simulation via integration of a suite of physics models from moderate through high fidelity, to understand and plan full plasma discharges, aid in data interpretation, carry out discovery science, and optimize future machine designs. We can achieve this goal via a focused effort to extend current scientific capabilities and rigorously integrate simulations of disparate physics into a comprehensive set of workflows.

  1. Numerical modeling tools for chemical vapor deposition

    NASA Technical Reports Server (NTRS)

    Jasinski, Thomas J.; Childs, Edward P.

    1992-01-01

    Development of general numerical simulation tools for chemical vapor deposition (CVD) was the objective of this study. Physical models of important CVD phenomena were developed and implemented into the commercial computational fluid dynamics software FLUENT. The resulting software can address general geometries as well as the most important phenomena occurring within CVD reactors: fluid flow patterns, temperature and chemical species distribution, and gas-phase and surface deposition. The available physical models are documented, and examples of CVD simulation capabilities are provided.

  2. Interoperability and complementarity of simulation tools for beamline design in the OASYS environment

    NASA Astrophysics Data System (ADS)

    Rebuffi, Luca; Sanchez del Rio, Manuel

    2017-08-01

    In the coming years most of the major synchrotron radiation facilities around the world will upgrade to 4th-generation Diffraction Limited Storage Rings using multi-bend-achromat technology. Moreover, several Free Electron Lasers are ready to go or in the phase of completion. These events represent a huge challenge for the optics physicists responsible for designing and calculating optical systems capable of exploiting the revolutionary characteristics of the new photon beams. Reliable and robust beamline design is nowadays based on sophisticated computer simulations only possible by lumping together different simulation tools. The OASYS (OrAnge SYnchrotron Suite) suite drives several simulation tools, providing new mechanisms of interoperability and communication within the same software environment. OASYS has been successfully used during the conceptual design of many beamlines and optical systems for the ESRF and Elettra-Sincrotrone Trieste upgrades. Some examples are presented showing comparisons and benchmarking of simulations against calculated and experimental data.

  3. Reducing the Schizophrenia Stigma: A New Approach Based on Augmented Reality

    PubMed Central

    Silva, Rafael D. de C.; Albuquerque, Saulo G. C.; Muniz, Artur de V.; Filho, Pedro P. Rebouças; Ribeiro, Sidarta

    2017-01-01

    Schizophrenia is a chronic mental disease that usually manifests psychotic symptoms and affects an individual's functionality. The stigma related to this disease is a serious obstacle for an adequate approach to its treatment. Stigma can, for example, delay the start of treatment, and it creates difficulties in interpersonal and professional relationships. This work proposes a new tool based on augmented reality to reduce the stigma related to schizophrenia. The tool simulates the psychotic symptoms typical of schizophrenia and changes in sense perception, creating an immersive experience that reproduces the pathological experiences of a patient with schizophrenia. The integration into the proposed environment occurs through immersion glasses and an embedded camera. Audio and visual effects can also be applied in real time. To validate the proposed environment, medical students experienced the virtual environment and then answered three questionnaires to assess (i) stigmas related to schizophrenia, (ii) the efficiency and effectiveness of the tool, and, finally, (iii) stigma after the simulation. The analysis of the questionnaires showed that the proposed model is a robust and quite realistic tool and, thus, very promising in reducing the stigma associated with schizophrenia by instilling in the observer a greater comprehension of any person during a schizophrenic outbreak, whether a patient or a family member. PMID:29317860

  4. Optimal design and uncertainty quantification in blood flow simulations for congenital heart disease

    NASA Astrophysics Data System (ADS)

    Marsden, Alison

    2009-11-01

    Recent work has demonstrated substantial progress in capabilities for patient-specific cardiovascular flow simulations. Recent advances include increasingly complex geometries, physiological flow conditions, and fluid structure interaction. However, inputs to these simulations, including medical image data, catheter-derived pressures and material properties, can have significant uncertainties associated with them. For simulations to predict clinically useful and reliable output information, it is necessary to quantify the effects of input uncertainties on outputs of interest. In addition, blood flow simulation tools can now be efficiently coupled to shape optimization algorithms for surgery design applications, and these tools should incorporate uncertainty information. We present a unified framework to systematically and efficiently account for uncertainties in simulations using adaptive stochastic collocation. In addition, we present a framework for derivative-free optimization of cardiovascular geometries, and layer these tools to perform optimization under uncertainty. These methods are demonstrated using simulations and surgery optimization to improve hemodynamics in pediatric cardiology applications.
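    The core idea of stochastic collocation mentioned above can be shown in a few lines: evaluate the model at a handful of carefully chosen quadrature nodes instead of thousands of Monte Carlo samples. This is a generic sketch of the technique, not the authors' code; the quadratic "model" stands in for an expensive flow simulation.

```python
import numpy as np

# Minimal stochastic-collocation sketch: propagate a normally
# distributed input through a nonlinear model via Gauss-Hermite
# quadrature rather than brute-force sampling.

def collocate_mean(model, mu, sigma, order=5):
    """E[model(X)] for X ~ N(mu, sigma^2), probabilists' Gauss-Hermite."""
    nodes, weights = np.polynomial.hermite_e.hermegauss(order)
    weights = weights / np.sqrt(2.0 * np.pi)   # normalize to a probability measure
    return float(np.sum(weights * model(mu + sigma * nodes)))

# Toy "simulation output": an outcome quadratic in an uncertain input.
model = lambda r: r ** 2
mean = collocate_mean(model, mu=1.0, sigma=0.2)
# Exact answer is mu^2 + sigma^2 = 1.04; five nodes integrate
# polynomials up to degree 9 exactly, so the quadrature is exact here.
```

    Adaptive variants, as used in the work above, refine the node set only where the model output varies strongly.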

  5. WENESSA, Wide Eye-Narrow Eye Space Simulation for Situational Awareness

    NASA Astrophysics Data System (ADS)

    Albarait, O.; Payne, D. M.; LeVan, P. D.; Luu, K. K.; Spillar, E.; Freiwald, W.; Hamada, K.; Houchard, J.

    In an effort to achieve timelier indications of anomalous object behaviors in geosynchronous earth orbit, a Planning Capability Concept (PCC) for a “Wide Eye-Narrow Eye” (WE-NE) telescope network has been established. The PCC addresses the problem of providing continuous and operationally robust, layered and cost-effective, Space Situational Awareness (SSA) that is focused on monitoring deep space for anomalous behaviors. It does this by first detecting the anomalies with wide field of regard systems, and then providing reliable handovers for detailed observational follow-up by another optical asset. WENESSA will explore the added value of such a system to the existing Space Surveillance Network (SSN). The study will assess and quantify the degree to which the PCC remedies these deep-space knowledge deficiencies, in whole or in part, relative to current operational systems. In order to improve organic simulation capabilities, we will explore options for the federation of diverse community simulation approaches, while evaluating the efficiencies offered by a network of small and larger aperture, ground-based telescopes. Existing Space Modeling and Simulation (M&S) tools designed for evaluating WENESSA-like problems will be taken into consideration as we proceed in defining and developing the tools needed to perform this study, leading to the creation of a unified Space M&S environment for the rapid assessment of new capabilities. The primary goal of this effort is to perform a utility assessment of the WE-NE concept. The assessment will explore the mission utility of various WE-NE concepts in discovering deep space anomalies in concert with the SSN. The secondary goal is to generate an enduring modeling and simulation environment to explore the utility of future proposed concepts and supporting technologies.
Ultimately, our validated simulation framework would support the inclusion of other ground- and space-based SSA assets through integrated analysis. Options will be explored using at least two competing simulation capabilities, but emphasis will be placed on reasoned analyses as supported by the simulations.

  6. Electronic Systems for Spacecraft Vehicles: Required EDA Tools

    NASA Technical Reports Server (NTRS)

    Bachnak, Rafic

    1999-01-01

    The continuous increase in complexity of electronic systems is making the design and manufacturing of such systems more challenging than ever before. As a result, designers are finding it impossible to design efficient systems without the use of sophisticated Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and lead to a correct-by-design methodology. This report identifies the EDA tools that would be needed to design, analyze, simulate, and evaluate electronic systems for spacecraft vehicles. In addition, the report presents recommendations to enhance the current JSC electronic design capabilities. This includes cost information and a discussion as to the impact, both positive and negative, of implementing the recommendations.

  7. Evolution of a Simulation Testbed into an Operational Tool

    NASA Technical Reports Server (NTRS)

    Sheth, Kapil; Bilimoria, Karl D.; Sridhar, Banavar; Sterenchuk, Mike; Niznik, Tim; O'Neill, Tom; Clymer, Alexis; Gutierrez Nolasco, Sebastian; Edholm, Kaj; Shih, Fu-Tai

    2017-01-01

    This paper describes the evolution over a 20-year period of the Future ATM (Air Traffic Management) Concepts Evaluation Tool (FACET) from a National Airspace System (NAS) based simulation testbed into an operational tool. FACET was developed as a testbed for assessing futuristic ATM concepts, e.g., automated conflict detection and resolution. NAS Constraint Evaluation and Notification Tool (NASCENT) is an application, within FACET, for alerting airspace users of inefficiencies in flight operations and advising time- and fuel-saving reroutes. It is currently in use at the American Airlines Integrated Operations Center in Fort Worth, TX. The concepts assessed, research conducted, and the operational capability developed, along with the NASA support and achievements, are presented in this paper.

  8. Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design

    NASA Technical Reports Server (NTRS)

    Schutte, Paul C.; Trujillo, Anna; Pritchett, Amy R.

    2000-01-01

    While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plug-in' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).
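    The plug-in architecture described above, a core that registers components, initializes them, and brokers all communication through a standard interface, can be sketched compactly. The class names and the shared-variable bus below are invented for illustration; they are not the actual RFS interfaces.

```python
# Illustrative sketch of the RFS-style pattern: components implement a
# shared interface, register with a core, and exchange data only
# through the core. Names are invented for the example.

class SimComponent:
    """Interface standard that every plug-in module implements."""
    def init(self, core): ...
    def step(self, dt): ...

class Core:
    def __init__(self):
        self.components = []
        self.bus = {}                            # shared-variable message bus
    def register(self, component):
        self.components.append(component)
        component.init(self)                     # core handles initialization
    def step(self, dt):
        for c in self.components:
            c.step(dt)                           # core handles communication order

class Dynamics(SimComponent):
    def init(self, core):
        self.core = core
        core.bus["altitude"] = 0.0
    def step(self, dt):
        self.core.bus["altitude"] += 10.0 * dt   # constant climb rate

class Display(SimComponent):
    def init(self, core):
        self.core = core
        self.last = None
    def step(self, dt):
        self.last = self.core.bus["altitude"]    # reads only via the bus

core = Core()
dyn, disp = Dynamics(), Display()
core.register(dyn)
core.register(disp)
for _ in range(5):
    core.step(dt=1.0)
```

    Because components touch only the core's interface, each "plug-in" can be developed, compiled, and shared independently, which is the property the report emphasizes.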

  9. Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design

    NASA Technical Reports Server (NTRS)

    Pritchett, Amy R.

    2002-01-01

    While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plugin' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).

  10. Identification of fuel cycle simulator functionalities for analysis of transition to a new fuel cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.

    Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the must-test functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.

  11. PT-SAFE: a software tool for development and annunciation of medical audible alarms.

    PubMed

    Bennett, Christopher L; McNeer, Richard R

    2012-03-01

    Recent reports by The Joint Commission as well as the Anesthesia Patient Safety Foundation have indicated that medical audible alarm effectiveness needs to be improved. Several recent studies have explored various approaches to improving the audible alarms, motivating the authors to develop real-time software capable of comparing such alarms. We sought to devise software that would allow for the development of a variety of audible alarm designs that could also integrate into existing operating room equipment configurations. The software is meant to be used as a tool for alarm researchers to quickly evaluate novel alarm designs. A software tool was developed for the purpose of creating and annunciating audible alarms. The alarms consisted of annunciators that were mapped to vital sign data received from a patient monitor. An object-oriented approach to software design was used to create a tool that is flexible and modular at run-time, can annunciate wave-files from disk, and can be programmed with MATLAB by the user to create custom alarm algorithms. The software was tested in a simulated operating room to measure technical performance and to validate the time-to-annunciation against existing equipment alarms. The software tool showed efficacy in a simulated operating room environment by providing alarm annunciation in response to physiologic and ventilator signals generated by a human patient simulator, on average 6.2 seconds faster than existing equipment alarms. Performance analysis showed that the software was capable of supporting up to 15 audible alarms on a mid-grade laptop computer before audio dropouts occurred. These results suggest that this software tool provides a foundation for rapidly staging multiple audible alarm sets from the laboratory to a simulation environment for the purpose of evaluating novel alarm designs, thus producing valuable findings for medical audible alarm standardization.
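    The central mechanism described above, annunciators mapped to incoming vital-sign data, reduces to a rule table applied to each monitor sample. The rules and thresholds below are invented for illustration and are not the clinical limits used by PT-SAFE (which is also programmable via MATLAB rather than Python).

```python
# Illustrative sketch of an annunciator mapping: alarm rules are data,
# applied to each vital-sign sample from the patient monitor.
# Signal names and thresholds are examples only.

ALARM_RULES = {
    "SpO2":  lambda v: "low_spo2" if v < 90 else None,
    "HR":    lambda v: "tachycardia" if v > 120 else
                       ("bradycardia" if v < 40 else None),
    "EtCO2": lambda v: "apnea" if v == 0 else None,
}

def annunciate(sample):
    """Return the alarm names triggered by one monitor sample."""
    alarms = []
    for signal, value in sample.items():
        rule = ALARM_RULES.get(signal)
        if rule:
            alarm = rule(value)
            if alarm:
                alarms.append(alarm)
    return alarms

triggered = annunciate({"SpO2": 85, "HR": 130, "EtCO2": 35})
```

    In the tool described above, each triggered alarm name would then be mapped to a wave-file annunciator, which is the step the authors benchmarked against existing equipment alarms.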

  12. ASTEC: Controls analysis for personal computers

    NASA Technical Reports Server (NTRS)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  13. Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckers, Koenraad J; McCabe, Kevin

    This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal Energy for Production of Heat and electricity (IR) Economically Simulated). GEOPHIRES combines reservoir, wellbore, surface plant and economic models to estimate the capital, and operation and maintenance costs, lifetime energy production, and overall levelized cost of energy of a geothermal plant. The available end-use options are electricity, direct-use heat and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to couple to an external reservoir simulator, updated cost correlations, and more flexibility in selecting the time step and number of injection and production wells. An overview of all the updates and two case studies to illustrate the tool's new capabilities are provided in this paper.
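    The levelized cost of energy (LCOE) that such a tool reports is, at its core, discounted lifetime cost divided by discounted lifetime energy. The sketch below shows that calculation under one common discounting convention; the numbers are illustrative and this is not the GEOPHIRES cost model itself.

```python
# Hedged sketch of an LCOE calculation of the kind a techno-economic
# tool performs; discounting convention and inputs are illustrative.

def lcoe(capex, opex_per_year, energy_per_year_mwh, rate, years):
    """LCOE in $/MWh: discounted costs divided by discounted energy."""
    disc = [(1 + rate) ** -t for t in range(1, years + 1)]
    costs = capex + opex_per_year * sum(disc)      # capital + discounted O&M
    energy = energy_per_year_mwh * sum(disc)       # discounted production
    return costs / energy

price = lcoe(capex=30e6, opex_per_year=1e6,
             energy_per_year_mwh=150_000, rate=0.07, years=30)
```

    With these example inputs the result is roughly $23/MWh; in GEOPHIRES the capital and O&M terms would come from the reservoir, wellbore, and surface-plant sub-models rather than fixed numbers.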

  14. Organic Scintillator Detector Response Simulations with DRiFT

    DOE PAGES

    Andrews, Madison Theresa; Bates, Cameron Russell; Mckigney, Edward Allen; ...

    2016-06-11

    Here, this work presents the organic scintillation simulation capabilities of DRiFT, a post-processing Detector Response Function Toolkit for MCNP® output. DRiFT is used to create realistic scintillation detector response functions to incident neutron and gamma mixed-field radiation. As a post-processing tool, DRiFT leverages the extensively validated radiation transport capabilities of MCNP®6, which also provides the ability to simulate complex sources and geometries. DRiFT is designed to be flexible: it allows the user to specify the scintillator material, PMT type, applied PMT voltage, and quenching data used in simulations. The toolkit's capabilities, which include the generation of pulse shape discrimination plots and full-energy detector spectra, are demonstrated in a comparison of measured and simulated neutron contributions from 252Cf and PuBe, and photon spectra from 22Na and 228Th sources. DRiFT reproduced energy resolution effects observed in EJ-301 measurements through the inclusion of scintillation yield variances, photon transport noise, and PMT photocathode and multiplication noise.

  15. Organic scintillator detector response simulations with DRiFT

    NASA Astrophysics Data System (ADS)

    Andrews, M. T.; Bates, C. R.; McKigney, E. A.; Solomon, C. J.; Sood, A.

    2016-09-01

    This work presents the organic scintillation simulation capabilities of DRiFT, a post-processing Detector Response Function Toolkit for MCNP® output. DRiFT is used to create realistic scintillation detector response functions to incident neutron and gamma mixed-field radiation. As a post-processing tool, DRiFT leverages the extensively validated radiation transport capabilities of MCNP®6, which also provides the ability to simulate complex sources and geometries. DRiFT is designed to be flexible: it allows the user to specify the scintillator material, PMT type, applied PMT voltage, and quenching data used in simulations. The toolkit's capabilities, which include the generation of pulse shape discrimination plots and full-energy detector spectra, are demonstrated in a comparison of measured and simulated neutron contributions from 252Cf and PuBe, and photon spectra from 22Na and 228Th sources. DRiFT reproduced energy resolution effects observed in EJ-301 measurements through the inclusion of scintillation yield variances, photon transport noise, and PMT photocathode and multiplication noise.

  16. Telemetry-Enhancing Scripts

    NASA Technical Reports Server (NTRS)

    Maimone, Mark W.

    2009-01-01

    Scripts Providing a Cool Kit of Telemetry Enhancing Tools (SPACKLE) is a set of software tools that fill gaps in capabilities of other software used in processing downlinked data in the Mars Exploration Rovers (MER) flight and test-bed operations. SPACKLE tools have helped to accelerate the automatic processing and interpretation of MER mission data, enabling non-experts to understand and/or use MER query and data product command simulation software tools more effectively. SPACKLE has greatly accelerated some operations and provides new capabilities. The tools of SPACKLE are written, variously, in Perl or the C or C++ language. They perform a variety of search and shortcut functions that include the following: Generating text-only, Event Report-annotated, and Web-enhanced views of command sequences; Labeling integer enumerations with their symbolic meanings in text messages and engineering channels; Systematically detecting corruption within data products; Generating text-only displays of data-product catalogs including downlink status; Validating and labeling commands related to data products; Performing convenient searches of detailed engineering data spanning multiple Martian solar days; Generating tables of initial conditions pertaining to engineering, health, and accountability data; Simplifying construction and simulation of command sequences; and Performing fast time-format conversions and sorting.

  17. Calibration and validation of a spar-type floating offshore wind turbine model using the FAST dynamic simulation tool

    DOE PAGES

    Browning, J. R.; Jonkman, J.; Robertson, A.; ...

    2014-12-16

    In this study, high-quality computer simulations are required when designing floating wind turbines because of the complex dynamic responses that are inherent with a high number of degrees of freedom and variable metocean conditions. In 2007, the FAST wind turbine simulation tool, developed and maintained by the U.S. Department of Energy's (DOE's) National Renewable Energy Laboratory (NREL), was expanded to include capabilities that are suitable for modeling floating offshore wind turbines. In an effort to validate FAST and other offshore wind energy modeling tools, DOE funded the DeepCwind project that tested three prototype floating wind turbines at 1/50th scale in a wave basin, including a semisubmersible, a tension-leg platform, and a spar buoy. This paper describes the use of the results of the spar wave basin tests to calibrate and validate the FAST offshore floating simulation tool, and presents some initial results of simulated dynamic responses of the spar to several combinations of wind and sea states. Wave basin tests with the spar attached to a scale model of the NREL 5-megawatt reference wind turbine were performed at the Maritime Research Institute Netherlands under the DeepCwind project. This project included free-decay tests, tests with steady or turbulent wind and still water, wave-only tests (both periodic and irregular waves with no wind), and combined wind/wave tests. The resulting data from the 1/50th-scale model was scaled to full size using Froude scaling and used to calibrate and validate a full-size simulated model in FAST. Results of the model calibration and validation include successes, subtleties, and limitations of both wave basin testing and FAST modeling capabilities.
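    Froude scaling, mentioned above as the rule carrying 1/50th-scale basin data to full scale, fixes a multiplier for each physical quantity as a power of the length ratio. The factors below are the standard similitude relations (same fluid assumed); the example wave period is illustrative.

```python
# Froude similitude sketch: multipliers that take model-scale
# wave-basin measurements to full scale, same fluid assumed.

def froude_scale(length_ratio):
    """Standard Froude scaling factors for a geometric scale 1:length_ratio."""
    lam = length_ratio                      # e.g. 50 for a 1/50th model
    return {
        "length":   lam,
        "time":     lam ** 0.5,             # wave periods, decay times
        "velocity": lam ** 0.5,
        "force":    lam ** 3,               # same fluid density
        "moment":   lam ** 4,
    }

f = froude_scale(50)
full_scale_period = 1.5 * f["time"]         # a 1.5 s model-scale wave period
```

    A 1.5 s model-scale wave thus corresponds to roughly a 10.6 s full-scale wave, and measured mooring forces grow by a factor of 125,000, which is why small measurement errors in the basin matter so much for validation.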

  18. Development and Testing of an Automatic Transmission Shift Schedule Algorithm for Vehicle Simulation (SAE Paper 2015-01-1142)

    EPA Science Inventory

    The Advanced Light-Duty Powertrain and Hybrid Analysis (ALPHA) modeling tool was created by EPA to estimate greenhouse gas (GHG) emissions of light-duty vehicles. ALPHA is a physics-based, forward-looking, full vehicle computer simulation capable of analyzing various vehicle type...

  19. Using Petri Net Tools to Study Properties and Dynamics of Biological Systems

    PubMed Central

    Peleg, Mor; Rubin, Daniel; Altman, Russ B.

    2005-01-01

    Petri Nets (PNs) and their extensions are promising methods for modeling and simulating biological systems. We surveyed PN formalisms and tools and compared them based on their mathematical capabilities and their appropriateness for representing typical biological processes. We measured the ability of these tools to model specific features of biological systems and answer a set of biological questions that we defined. We found that different tools are required to provide all capabilities that we assessed. We created software to translate a generic PN model into most of the formalisms and tools discussed. We have also made available three models and suggest that a library of such models would catalyze progress in qualitative modeling via PNs. Development and wide adoption of common formats would enable researchers to share models and use different tools to analyze them without the need to convert to proprietary formats. PMID:15561791
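    The Petri-net mechanics underlying the survey above fit in a few lines: places hold tokens (e.g. molecule counts) and a transition fires when every input place is sufficiently marked. The enzyme-substrate reaction chosen here is an invented illustration, not one of the authors' three models.

```python
# Minimal Petri-net sketch: places hold tokens, a transition fires
# while enabled. Models the reaction enzyme + substrate -> enzyme + product.

places = {"enzyme": 1, "substrate": 2, "product": 0}
transition = {"consume": {"enzyme": 1, "substrate": 1},
              "produce": {"enzyme": 1, "product": 1}}

def enabled(marking, t):
    """A transition is enabled when all input places hold enough tokens."""
    return all(marking[p] >= n for p, n in t["consume"].items())

def fire(marking, t):
    """Firing moves tokens from input places to output places."""
    for p, n in t["consume"].items():
        marking[p] -= n
    for p, n in t["produce"].items():
        marking[p] += n

while enabled(places, transition):
    fire(places, transition)
```

    The extensions the survey compares (stochastic, colored, hybrid PNs) all build on exactly this token-game semantics, which is what makes a common exchange format plausible.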

  20. Chemical vapor deposition fluid flow simulation modelling tool

    NASA Technical Reports Server (NTRS)

    Bullister, Edward T.

    1992-01-01

    Accurate numerical simulation of chemical vapor deposition (CVD) processes requires a general purpose computational fluid dynamics package combined with specialized capabilities for high temperature chemistry. In this report, we describe the implementation of these specialized capabilities in the spectral element code NEKTON. The thermal expansion of the gases involved is shown to be accurately approximated by the low Mach number perturbation expansion of the incompressible Navier-Stokes equations. The radiative heat transfer between multiple interacting radiating surfaces is shown to be tractable using the method of Gebhart. The disparate rates of reaction and diffusion in CVD processes are calculated via a point-implicit time integration scheme. We demonstrate the use of the above capabilities on prototypical CVD applications.
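    The point-implicit scheme mentioned above exists because reaction rates in CVD are far stiffer than the flow time scale. A one-line comparison shows the motivation, using a generic stiff linear decay rather than any specific CVD chemistry: an implicit update stays stable at time steps where an explicit one diverges.

```python
# Why a point-implicit step helps with stiff chemistry: for dc/dt = -k*c
# with k*dt = 10, explicit Euler diverges while implicit Euler decays.

k, dt, steps = 1.0e4, 1.0e-3, 10          # k*dt = 10, far beyond the explicit limit of 2
c_exp = c_imp = 1.0
for _ in range(steps):
    c_exp = c_exp + dt * (-k * c_exp)      # explicit Euler: |c| grows as 9^n
    c_imp = c_imp / (1.0 + k * dt)         # implicit Euler: decays as (1/11)^n
```

    "Point-implicit" means this implicit treatment is applied locally to the stiff source terms at each grid point, while the (non-stiff) transport terms can remain explicit.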

  1. The Processing of Airspace Concept Evaluations Using FASTE-CNS as a Pre- or Post-Simulation CNS Analysis Tool

    NASA Technical Reports Server (NTRS)

    Mainger, Steve

    2004-01-01

    As NASA speculates on and explores the future of aviation, the technological and physical aspects of our environment increasingly become hurdles that must be overcome for success. Research into methods for overcoming some of these selected hurdles has been proposed by several NASA research partners as concepts. The task of establishing a common evaluation environment was placed on NASA's Virtual Airspace Simulation Technologies (VAST) project (a sub-project of VAMS), which responded with the development of the Airspace Concept Evaluation System (ACES). When one examines the ACES environment from a communication, navigation, or surveillance (CNS) perspective, the simulation parameters are built with assumed perfection in the transactions associated with CNS. To truly evaluate these concepts in a realistic sense, the contributions and effects of CNS must be part of ACES. NASA Glenn Research Center (GRC) has supported the Virtual Airspace Modeling and Simulation (VAMS) project through the continued development of CNS models and analysis capabilities that support the ACES environment. As part of this support, NASA GRC initiated the development of a communications traffic loading analysis tool, called the Future Aeronautical Sub-network Traffic Emulator for Communications, Navigation and Surveillance (FASTE-CNS). This tool allows for forecasting of communications load with the understanding that there is no single, common source for the loading models used to evaluate the existing and planned communications channels, and that consensus and accuracy in the traffic load models are very important inputs to the decisions being made on the acceptability of communication techniques used to fulfill aeronautical requirements.
Leveraging the existing capabilities of the FASTE-CNS tool, GRC has called for FASTE-CNS to have the functionality to pre- and post-process the simulation runs of ACES to report on instances when traffic density, frequency congestion, or aircraft spacing/distance violations have occurred. The integration of these functions requires that the CNS models used to characterize these avionics systems be of higher fidelity and better consistency than is present in the FASTE-CNS system. This presentation will explore the capabilities of FASTE-CNS with renewed emphasis on the enhancements being added to perform these processing functions; the fidelity and reliability of the CNS models necessary to make the enhancements work; and the benchmarking of FASTE-CNS results to improve confidence in the results of the new processing capabilities.

  2. Identification of fuel cycle simulator functionalities for analysis of transition to a new fuel cycle

    DOE PAGES

    Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.; ...

    2016-06-09

    Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the must-test functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.
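A unit test library for a simulator's essential functions would typically start from physical invariants such as mass balance. The sketch below uses a toy stand-in for a depletion step; the function and its interface are hypothetical, shown only to suggest the shape such tests could take:

```python
import unittest

def deplete(feed_kg, burnup_fraction):
    """Toy stand-in for a fuel cycle simulator's depletion step: split a feed
    mass into burned and discharged streams. Hypothetical, for illustrating
    a mass-balance unit test only."""
    burned = feed_kg * burnup_fraction
    discharged = feed_kg - burned
    return burned, discharged

class TestEssentialFunctions(unittest.TestCase):
    def test_mass_balance(self):
        # Essential-function check: mass entering a process step must equal
        # mass leaving it, to within numerical tolerance.
        feed = 1000.0
        burned, discharged = deplete(feed, 0.05)
        self.assertAlmostEqual(burned + discharged, feed, places=9)

    def test_no_negative_streams(self):
        burned, discharged = deplete(1000.0, 0.05)
        self.assertGreaterEqual(burned, 0.0)
        self.assertGreaterEqual(discharged, 0.0)

# Run the suite programmatically rather than via unittest.main()
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestEssentialFunctions)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The same pattern extends to other essential functions (decay, separations, facility throughput), each with its own invariant to assert.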

  3. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation. Appendix A: ROBSIM user's guide

    NASA Technical Reports Server (NTRS)

    Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelley, J. H.; Depkovich, T. M.; Wolfe, W. J.; Nguyen, T.

    1986-01-01

    The purpose of the Robotics Simulation Program is to provide a broad range of computer capabilities to assist in the design, verification, simulation, and study of robotics systems. ROBSIM is a program written in FORTRAN 77 for use on a VAX 11/750 computer under the VMS operating system. This user's guide describes the capabilities of the ROBSIM programs, including the system definition function, the analysis tools function, and the postprocessor function. The options a user may encounter with each of these executables are explained in detail, and the different program prompts presented to the user are included. Some useful suggestions concerning the appropriate answers to be given by the user are provided. An example interactive run is enclosed for each of the main program services, and some of the capabilities are illustrated.

  4. Terascale High-Fidelity Simulations of Turbulent Combustion with Detailed Chemistry: Spray Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutland, Christopher J.

    2009-04-26

    The Terascale High-Fidelity Simulations of Turbulent Combustion (TSTC) project is a multi-university collaborative effort to develop a high-fidelity turbulent reacting flow simulation capability utilizing terascale, massively parallel computer technology. The main paradigm of the approach is direct numerical simulation (DNS) featuring the highest temporal and spatial accuracy, allowing quantitative observations of the fine-scale physics found in turbulent reacting flows as well as providing a useful tool for development of sub-models needed in device-level simulations. Under this component of the TSTC program the simulation code named S3D, developed and shared with coworkers at Sandia National Laboratories, has been enhanced with new numerical algorithms and physical models to provide predictive capabilities for turbulent liquid fuel spray dynamics. Major accomplishments include improved fundamental understanding of mixing and auto-ignition in multi-phase turbulent reactant mixtures and turbulent fuel injection spray jets.

  5. DKIST Adaptive Optics System: Simulation Results

    NASA Astrophysics Data System (ADS)

    Marino, Jose; Schmidt, Dirk

    2016-05-01

    The 4 m class Daniel K. Inouye Solar Telescope (DKIST), currently under construction, will be equipped with an ultra high order solar adaptive optics (AO) system. The requirements and capabilities of such a solar AO system are beyond those of any other solar AO system currently in operation. We must rely on solar AO simulations to estimate and quantify its performance. We present performance estimation results of the DKIST AO system obtained with a new solar AO simulation tool. This simulation tool is a flexible and fast end-to-end solar AO simulator which produces accurate solar AO simulations while taking advantage of current multi-core computer technology. It relies on full imaging simulations of the extended field Shack-Hartmann wavefront sensor (WFS), which directly includes important secondary effects such as field dependent distortions and varying contrast of the WFS sub-aperture images.

  6. Avionics System Architecture Tool

    NASA Technical Reports Server (NTRS)

    Chau, Savio; Hall, Ronald; Traylor, Marcus; Whitfield, Adrian

    2005-01-01

    Avionics System Architecture Tool (ASAT) is a computer program intended for use during the avionics-system-architecture- design phase of the process of designing a spacecraft for a specific mission. ASAT enables simulation of the dynamics of the command-and-data-handling functions of the spacecraft avionics in the scenarios in which the spacecraft is expected to operate. ASAT is built upon I-Logix Statemate MAGNUM, providing a complement of dynamic system modeling tools, including a graphical user interface (GUI), model checking capabilities, and a simulation engine. ASAT augments this with a library of predefined avionics components and additional software to support building and analyzing avionics hardware architectures using these components.

  7. Modeling strength data for CREW CHIEF

    NASA Technical Reports Server (NTRS)

    Mcdaniel, Joe W.

    1990-01-01

    The Air Force has developed CREW CHIEF, a computer-aided design (CAD) tool for simulating and evaluating aircraft maintenance to determine if the required activities are feasible. CREW CHIEF gives the designer the ability to simulate maintenance activities with respect to reach, accessibility, strength, hand tool operation, and materials handling. While developing CREW CHIEF, extensive research was performed to describe workers' strength capabilities for using hand tools and manually handling objects. More than 100,000 strength measures were collected and modeled for CREW CHIEF. These measures involved both male and female subjects in the 12 maintenance postures included in CREW CHIEF. The data collection and modeling effort are described.

  8. A Lunar Surface Operations Simulator

    NASA Technical Reports Server (NTRS)

    Nayar, H.; Balaram, J.; Cameron, J.; Jain, A.; Lim, C.; Mukherjee, R.; Peters, S.; Pomerantz, M.; Reder, L.; Shakkottai, P.

    2008-01-01

    The Lunar Surface Operations Simulator (LSOS) is being developed to support planning and design of space missions to return astronauts to the moon. Vehicles, habitats, dynamic and physical processes and related environment systems are modeled and simulated in LSOS to assist in the visualization and design optimization of systems for lunar surface operations. A parametric analysis tool and a data browser were also implemented to provide an intuitive interface to run multiple simulations and review their results. The simulator and parametric analysis capability are described in this paper.

  9. An Integrated Software Package to Enable Predictive Simulation Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Fitzhenry, Erin B.; Jin, Shuangshuang

    The power grid is increasing in complexity due to the deployment of smart grid technologies. Such technologies vastly increase the size and complexity of power grid systems for simulation and modeling. This increasing complexity necessitates not only the use of high-performance computing (HPC) techniques, but also a smooth, well-integrated interplay between HPC applications. This paper presents a new integrated software package that combines HPC applications and a web-based visualization tool based on a middleware framework. This framework can support the data communication between different applications. Case studies with a large power system demonstrate the predictive capability brought by the integrated software package, as well as the better situational awareness provided by the web-based visualization tool in a live mode. Test results validate the effectiveness and usability of the integrated software package.

  10. Real-Time Simulation of Aeroheating of the Hyper-X Airplane

    NASA Technical Reports Server (NTRS)

    Gong, Les

    2005-01-01

    A capability for real-time computational simulation of aeroheating has been developed in support of the Hyper-X program, which is directed toward demonstrating the feasibility of operating an air-breathing ramjet/scramjet engine at Mach 5, Mach 7, and Mach 10. The simulation software will serve as a valuable design tool for initial trajectory studies in which aerodynamic heating is expected to exert a major influence on the design of the Hyper-X airplane; this tool will aid in the selection of materials, sizing of structural skin thicknesses, and selection of components of a thermal-protection system (TPS) for structures that must be insulated against aeroheating.

  11. Simulation Based Optimization of Complex Monolithic Composite Structures Using Cellular Core Technology

    NASA Astrophysics Data System (ADS)

    Hickmott, Curtis W.

    Cellular core tooling is a new technology capable of manufacturing complex, integrated monolithic composite structures. This novel tooling method utilizes thermoplastic cellular cores as inner tooling. The semi-rigid nature of the cellular cores makes them convenient for lay-up, and under autoclave temperature and pressure they soften and expand, providing uniform compaction on all surfaces, including internal features such as ribs and spar tubes. This process has the capability of producing fully optimized aerospace structures by reducing or eliminating assembly with fasteners or bonded joints. The technology is studied in the context of evaluating its capabilities, advantages, and limitations in developing high quality structures. The complex nature of these parts has led to the development of a model using the Finite Element Analysis (FEA) software Abaqus and the plug-in COMPRO Common Component Architecture (CCA) provided by Convergent Manufacturing Technologies. This model utilizes a "virtual autoclave" technique to simulate temperature profiles, resin flow paths, and ultimately deformation from residual stress. A model has been developed simulating the temperature profile during curing of composite parts made with the cellular core technology. While modeling of composites has been performed in the past, this project applies that existing knowledge to this new manufacturing method, which is capable of building more complex parts, and develops a model designed specifically for building large, complex components with a high degree of accuracy. The model development has been carried out in conjunction with experimental validation. A double box beam structure was chosen for analysis to determine the effects of the technology on internal ribs and joints. Double box beams were manufactured and sectioned into T-joints for characterization. 
Mechanical behavior of the T-joints was characterized using the T-joint pull-off test and compared to traditional tooling methods. Components made with the cellular core tooling method showed improved strength at the joints. It is expected that this knowledge will help optimize the processing of complex, integrated structures and benefit aerospace applications where lighter, structurally efficient components would be advantageous.

  12. Evaluation of a Tactical Surface Metering Tool for Charlotte Douglas International Airport via Human-in-the-Loop Simulation

    NASA Technical Reports Server (NTRS)

    Verma, Savita; Lee, Hanbong; Dulchinos, Victoria L.; Martin, Lynne; Stevens, Lindsay; Jung, Yoon; Chevalley, Eric; Jobe, Kim; Parke, Bonny

    2017-01-01

    NASA has been working with the FAA and aviation industry partners to develop and demonstrate new concepts and technologies that integrate arrival, departure, and surface traffic management capabilities. In March 2017, NASA conducted a human-in-the-loop (HITL) simulation for integrated surface and airspace operations, modeling Charlotte Douglas International Airport, to evaluate the operational procedures and information requirements for the tactical surface metering tool, and data exchange elements between the airline controlled ramp and ATC Tower. In this paper, we focus on the calibration of the tactical surface metering tool using various metrics measured from the HITL simulation results. Key performance metrics include gate hold times from pushback advisories, taxi-in-out times, runway throughput, and departure queue size. Subjective metrics presented in this paper include workload, situational awareness, and acceptability of the metering tool and its calibration.

  13. Evaluation of a Tactical Surface Metering Tool for Charlotte Douglas International Airport Via Human-in-the-Loop Simulation

    NASA Technical Reports Server (NTRS)

    Verma, Savita; Lee, Hanbong; Martin, Lynne; Stevens, Lindsay; Jung, Yoon; Dulchinos, Victoria; Chevalley, Eric; Jobe, Kim; Parke, Bonny

    2017-01-01

    NASA has been working with the FAA and aviation industry partners to develop and demonstrate new concepts and technologies that integrate arrival, departure, and surface traffic management capabilities. In March 2017, NASA conducted a human-in-the-loop (HITL) simulation for integrated surface and airspace operations, modeling Charlotte Douglas International Airport, to evaluate the operational procedures and information requirements for the tactical surface metering tool, and data exchange elements between the airline controlled ramp and ATC Tower. In this paper, we focus on the calibration of the tactical surface metering tool using various metrics measured from the HITL simulation results. Key performance metrics include gate hold times from pushback advisories, taxi-in/out times, runway throughput, and departure queue size. Subjective metrics presented in this paper include workload, situational awareness, and acceptability of the metering tool and its calibration.

  14. Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckers, Koenraad J; McCabe, Kevin

    This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal energy for Production of Heat and electricity ('IR') Economically Simulated). GEOPHIRES combines engineering models of the reservoir, wellbores, and surface plant facilities of a geothermal plant with an economic model to estimate the capital and operation and maintenance costs, lifetime energy production, and overall levelized cost of energy. The available end-use options are electricity, direct-use heat, and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to import temperature data (e.g., measured or from a stand-alone reservoir simulator), updated cost correlations, and more flexibility in selecting the time step and number of injection and production wells. In this paper, we provide an overview of all the updates and two case studies to illustrate the tool's new capabilities.
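The levelized cost of energy that such tools report is, at its core, discounted lifetime cost divided by discounted lifetime energy. A minimal sketch of that textbook calculation follows; the numbers are illustrative, and GEOPHIRES layers detailed reservoir, wellbore, and plant cost correlations on top of this basic formula:

```python
def lcoe(capital_cost, annual_om_cost, annual_energy_kwh,
         discount_rate, lifetime_years):
    """Textbook levelized cost of energy in $/kWh: the ratio of
    discounted lifetime costs to discounted lifetime energy output."""
    disc_costs = capital_cost   # capital spent at year 0, undiscounted
    disc_energy = 0.0
    for year in range(1, lifetime_years + 1):
        factor = (1.0 + discount_rate) ** year
        disc_costs += annual_om_cost / factor
        disc_energy += annual_energy_kwh / factor
    return disc_costs / disc_energy

# Illustrative inputs: $30M capital, $1M/yr O&M, 40 GWh/yr,
# 7% discount rate, 30-year lifetime -> roughly $0.085/kWh
print(round(lcoe(30e6, 1e6, 40e6, 0.07, 30), 4))
```

Discounting energy as well as cost is what makes the metric a break-even price: selling every kWh at the LCOE exactly recovers all discounted costs.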

  15. Application of Simulation to Individualized Self-Paced Training. Final Report. TAEG Report No. 11-2.

    ERIC Educational Resources Information Center

    Lindahl, William H.; Gardner, James H.

    Computer simulation is recognized as a valuable systems analysis research tool which enables the detailed examination, evaluation, and manipulation, under stated conditions, of a system without direct action on the system. This technique provides management with quantitative data on system performance and capabilities which can be used to compare…

  16. Annual Report: Carbon Capture Simulation Initiative (CCSI) (30 September 2012)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David C.; Syamlal, Madhava; Cottrell, Roger

    2012-09-30

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that is developing and deploying state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models, with uncertainty quantification (UQ), optimization, risk analysis and decision making capabilities. The CCSI Toolset incorporates commercial and open-source software currently in use by industry and is also developing new software tools as necessary to fill technology gaps identified during execution of the project. Ultimately, the CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. CCSI is organized into 8 technical elements that fall under two focus areas. The first focus area (Physicochemical Models and Data) addresses the steps necessary to model and simulate the various technologies and processes needed to bring a new Carbon Capture and Storage (CCS) technology into production. The second focus area (Analysis & Software) is developing the software infrastructure to integrate the various components and implement the tools that are needed to make quantifiable decisions regarding the viability of new CCS technologies. CCSI also has an Industry Advisory Board (IAB). 
By working closely with industry from the inception of the project to identify industrial challenge problems, CCSI ensures that the simulation tools are developed for the carbon capture technologies of most relevance to industry. CCSI is led by the National Energy Technology Laboratory (NETL) and leverages the Department of Energy (DOE) national laboratories' core strengths in modeling and simulation, bringing together the best capabilities at NETL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), Lawrence Livermore National Laboratory (LLNL), and Pacific Northwest National Laboratory (PNNL). The CCSI's industrial partners provide representation from the power generation industry, equipment manufacturers, technology providers and engineering and construction firms. The CCSI's academic participants (Carnegie Mellon University, Princeton University, West Virginia University, and Boston University) bring unparalleled expertise in multiphase flow reactors, combustion, process synthesis and optimization, planning and scheduling, and process control techniques for energy processes. During Fiscal Year (FY) 12, CCSI released its first set of computational tools and models. This pre-release, a year ahead of the originally planned first release, is the result of intense industry interest in getting early access to the tools and the phenomenal progress of the CCSI technical team. These initial components of the CCSI Toolset provide new models and computational capabilities that will accelerate the commercial development of carbon capture technologies as well as related technologies, such as those found in the power, refining, chemicals, and gas production industries. 
The release consists of new tools for process synthesis and optimization to help identify promising concepts more quickly, new physics-based models of potential capture equipment and processes that will reduce the time to design and troubleshoot new systems, a framework to quantify the uncertainty of model predictions, and various enabling tools that provide new capabilities such as creating reduced order models (ROMs) from reacting multiphase flow simulations and running thousands of process simulations concurrently for optimization and UQ.

  17. Simulator technology as a tool for education in cardiac care.

    PubMed

    Hravnak, Marilyn; Beach, Michael; Tuite, Patricia

    2007-01-01

    Assisting nurses in gaining the cognitive and psychomotor skills necessary to safely and effectively care for patients with cardiovascular disease can be challenging for educators. Ideally, nurses would have the opportunity to synthesize and practice these skills in a protected training environment before application in the dynamic clinical setting. Recently, a technology known as high fidelity human simulation was introduced, which permits learners to interact with a simulated patient. The dynamic physiologic parameters and physical assessment capabilities of the simulated patient provide for a realistic learning environment. This article describes the High Fidelity Human Simulation Laboratory at the University of Pittsburgh School of Nursing and presents strategies for using this technology as a tool in teaching complex cardiac nursing care at the basic and advanced practice nursing levels. The advantages and disadvantages of high fidelity human simulation in learning are discussed.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arion is a library and tool set that enables researchers to holistically define test system models. Defining a complex system for testing an algorithm or control requires expertise across multiple domains. Simulating a complex system requires the integration of multiple simulators and test hardware, each with its own specification languages and concepts, which demands an extensive set of knowledge and capabilities. Arion was developed to alleviate this challenge. Arion is a collection of Java libraries that abstracts the concepts from supported simulators into a cohesive model language, allowing users to build models to their needed level of fidelity and expertise. Arion is also a software tool that translates the user's model back into the specification languages of the simulators and test hardware needed for execution.

  19. skelesim: an extensible, general framework for population genetic simulation in R.

    PubMed

    Parobek, Christian M; Archer, Frederick I; DePrenger-Levin, Michelle E; Hoban, Sean M; Liggins, Libby; Strand, Allan E

    2017-01-01

    Simulations are a key tool in molecular ecology for inference and forecasting, as well as for evaluating new methods. Due to growing computational power and a diversity of software with different capabilities, simulations are becoming increasingly powerful and useful. However, the widespread use of simulations by geneticists and ecologists is hindered by difficulties in understanding these softwares' complex capabilities, composing code and input files, a daunting bioinformatics barrier and a steep conceptual learning curve. skelesim (an R package) guides users in choosing appropriate simulations, setting parameters, calculating genetic summary statistics and organizing data output, in a reproducible pipeline within the R environment. skelesim is designed to be an extensible framework that can 'wrap' around any simulation software (inside or outside the R environment) and be extended to calculate and graph any genetic summary statistics. Currently, skelesim implements coalescent and forward-time models available in the fastsimcoal2 and rmetasim simulation engines to produce null distributions for multiple population genetic statistics and marker types, under a variety of demographic conditions. skelesim is intended to make simulations easier while still allowing full model complexity to ensure that simulations play a fundamental role in molecular ecology investigations. skelesim can also serve as a teaching tool: demonstrating the outcomes of stochastic population genetic processes; teaching general concepts of simulations; and providing an introduction to the R environment with a user-friendly graphical user interface (using shiny). © 2016 John Wiley & Sons Ltd.

  20. skeleSim: an extensible, general framework for population genetic simulation in R

    PubMed Central

    Parobek, Christian M.; Archer, Frederick I.; DePrenger-Levin, Michelle E.; Hoban, Sean M.; Liggins, Libby; Strand, Allan E.

    2016-01-01

    Simulations are a key tool in molecular ecology for inference and forecasting, as well as for evaluating new methods. Due to growing computational power and a diversity of software with different capabilities, simulations are becoming increasingly powerful and useful. However, the widespread use of simulations by geneticists and ecologists is hindered by difficulties in understanding these softwares’ complex capabilities, composing code and input files, a daunting bioinformatics barrier, and a steep conceptual learning curve. skeleSim (an R package) guides users in choosing appropriate simulations, setting parameters, calculating genetic summary statistics, and organizing data output, in a reproducible pipeline within the R environment. skeleSim is designed to be an extensible framework that can ‘wrap’ around any simulation software (inside or outside the R environment) and be extended to calculate and graph any genetic summary statistics. Currently, skeleSim implements coalescent and forward-time models available in the fastsimcoal2 and rmetasim simulation engines to produce null distributions for multiple population genetic statistics and marker types, under a variety of demographic conditions. skeleSim is intended to make simulations easier while still allowing full model complexity to ensure that simulations play a fundamental role in molecular ecology investigations. skeleSim can also serve as a teaching tool: demonstrating the outcomes of stochastic population genetic processes; teaching general concepts of simulations; and providing an introduction to the R environment with a user-friendly graphical user interface (using shiny). PMID:27736016

  1. WE-H-BRA-04: Biological Geometries for the Monte Carlo Simulation Toolkit TOPASNBio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNamara, A; Held, K; Paganetti, H

    2016-06-15

    Purpose: New advances in radiation therapy are most likely to come from the complex interface of physics, chemistry and biology. Computational simulations offer a powerful tool for quantitatively investigating radiation interactions with biological tissue and can thus help bridge the gap between physics and biology. The aim of TOPAS-nBio is to provide a comprehensive tool to generate advanced radiobiology simulations. Methods: TOPAS wraps and extends the Geant4 Monte Carlo (MC) simulation toolkit. TOPAS-nBio is an extension to TOPAS which utilizes the physics processes in Geant4-DNA to model biological damage from very low energy secondary electrons. Specialized cell, organelle and molecular geometries were designed for the toolkit. Results: TOPAS-nBio gives the user the capability of simulating biological geometries, ranging from the micron scale (e.g., cells and organelles) to complex nano-scale geometries (e.g., DNA and proteins). The user interacts with TOPAS-nBio through easy-to-use input parameter files. For example, in a simple cell simulation the user can specify the cell type and size as well as the type, number and size of included organelles. For more detailed nuclear simulations, the user can specify chromosome territories containing chromatin fiber loops, the latter comprising nucleosomes on a double helix. The chromatin fibers can be arranged in simple rigid geometries or within fractal globules, mimicking realistic chromosome territories. TOPAS-nBio also provides users with the capability of reading Protein Data Bank 3D structural files to simulate radiation damage to proteins or nucleic acids, e.g., histones or RNA. TOPAS-nBio has been validated by comparing results to other track structure simulation software and published experimental measurements. 
Conclusion: TOPAS-nBio provides users with a comprehensive MC simulation tool for radiobiological simulations, giving users without advanced programming skills the ability to design and run complex simulations.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, Madison Theresa; Bates, Cameron Russell; Mckigney, Edward Allen

    Here, this work presents the organic scintillation simulation capabilities of DRiFT, a post-processing Detector Response Function Toolkit for MCNP output. DRiFT is used to create realistic scintillation detector response functions to incident neutron and gamma mixed-field radiation. As a post-processing tool, DRiFT leverages the extensively validated radiation transport capabilities of MCNP6, which also provides the ability to simulate complex sources and geometries. DRiFT is designed to be flexible: it allows the user to specify the scintillator material, PMT type, applied PMT voltage, and quenching data used in simulations. The toolkit's capabilities, which include the generation of pulse shape discrimination plots and full-energy detector spectra, are demonstrated in a comparison of measured and simulated neutron contributions from 252Cf and PuBe, and photon spectra from 22Na and 228Th sources. DRiFT reproduced energy resolution effects observed in EJ-301 measurements through the inclusion of scintillation yield variances, photon transport noise, and PMT photocathode and multiplication noise.

  3. ASC FY17 Implementation Plan, Rev. 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamilton, P. G.

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions.

  4. Aerospace Toolbox--a flight vehicle design, analysis, simulation, and software development environment II: an in-depth overview

    NASA Astrophysics Data System (ADS)

    Christian, Paul M.

    2002-07-01

    This paper presents a demonstrated approach to significantly reduce the cost and schedule of non real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provided a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program, from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed included its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics covered in Part I included flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this series will take a more in-depth look at the analysis and simulation capability and provide an update on the toolbox enhancements. It will also address how the Toolbox can be used as a design hub for Internet-based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).

  5. Modeling and Simulation of Phased Array Antennas to Support Next-Generation Satellite Design

    NASA Technical Reports Server (NTRS)

    Tchorowski, Nicole; Murawski, Robert; Manning, Robert; Fuentes, Michael

    2016-01-01

Developing enhanced simulation capabilities has become a significant priority for the Space Communications and Navigation (SCaN) project at NASA as new space communications technologies are proposed to replace aging NASA communications assets, such as the Tracking and Data Relay Satellite System (TDRSS). When developing the architecture for these new space communications assets, it is important to develop updated modeling and simulation methodologies, such that competing architectures can be weighed against one another and the optimal path forward can be determined. Many simulation tools have been developed at NASA for the simulation of single RF link budgets, or for the modeling and simulation of an entire network of spacecraft and their supporting SCaN network elements. However, the modeling capabilities are never fully complete, and as new technologies are proposed, gaps are identified. One such gap is the ability to rapidly develop high fidelity simulation models of electronically steerable phased array systems. As future relay satellite architectures are proposed that include optical communications links, electronically steerable antennas will become more desirable due to the reduction in platform vibration introduced by mechanically steerable devices. In this research, we investigate how modeling of these antennas can be introduced into our overall simulation and modeling structure. The ultimate goal of this research is twofold: first, to enable NASA engineers to model various proposed simulation architectures and determine which proposed architecture meets the given architectural requirements; second, given a set of communications link requirements for a proposed satellite architecture, to determine the optimal configuration for a phased array antenna. A variety of tools are available for modeling phased array antennas.
To meet our stated goals, the first objective of this research is to compare the subset of tools available to us, trading off the modeling fidelity of each tool against its simulation performance. When comparing several proposed architectures, higher-fidelity modeling may be desirable; however, when iterating a proposed set of communication link requirements across ranges of phased array configuration parameters, performance becomes a significant practical requirement. In either case, a minimum simulation fidelity must be met regardless of performance considerations, which will be discussed in this research. Given a suitable set of phased array modeling tools, this research then focuses on integration with current SCaN modeling and simulation tools. While properly modeling the antenna elements of a system is vital, this is only a small part of the end-to-end communication path between a satellite and the supporting ground station and/or relay satellite assets. To properly model a proposed simulation architecture, this toolset must be integrated with other commercial and government development tools so that the overall architecture can be examined in terms of communications, reliability, and cost. In this research, integration with previously developed communication tools is investigated.
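As a rough, hedged illustration of the kind of calculation a phased-array modeling tool performs (this is not code from the SCaN toolset; the element count, spacing, and angles below are illustrative assumptions), the sketch below evaluates the normalized array factor of a uniform linear array steered electronically to a commanded angle:

```python
import cmath
import math

def array_factor_db(n_elements, spacing_wl, steer_deg, theta_deg):
    """Normalized array factor (dB) of a uniform linear array.

    n_elements: number of elements
    spacing_wl: element spacing in wavelengths
    steer_deg:  electronic steering angle (degrees from broadside)
    theta_deg:  observation angle (degrees from broadside)
    """
    k_d = 2 * math.pi * spacing_wl  # wavenumber times element spacing
    psi = k_d * (math.sin(math.radians(theta_deg)) -
                 math.sin(math.radians(steer_deg)))
    # sum of element phasors with progressive phase shift psi
    af = sum(cmath.exp(1j * n * psi) for n in range(n_elements))
    return 20 * math.log10(abs(af) / n_elements + 1e-12)

peak = array_factor_db(16, 0.5, 20.0, 20.0)  # at the steering angle
off = array_factor_db(16, 0.5, 20.0, 50.0)   # well off the main beam
```

The main beam lands at the steering angle (about 0 dB) while off-axis angles are suppressed; the fidelity/performance trade described above would come from replacing this idealized pattern with per-element models.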

  6. A High-Fidelity Batch Simulation Environment for Integrated Batch and Piloted Air Combat Simulation Analysis

    NASA Technical Reports Server (NTRS)

    Goodrich, Kenneth H.; McManus, John W.; Chappell, Alan R.

    1992-01-01

    A batch air combat simulation environment known as the Tactical Maneuvering Simulator (TMS) is presented. The TMS serves as a tool for developing and evaluating tactical maneuvering logics. The environment can also be used to evaluate the tactical implications of perturbations to aircraft performance or supporting systems. The TMS is capable of simulating air combat between any number of engagement participants, with practical limits imposed by computer memory and processing power. Aircraft are modeled using equations of motion, control laws, aerodynamics and propulsive characteristics equivalent to those used in high-fidelity piloted simulation. Databases representative of a modern high-performance aircraft with and without thrust-vectoring capability are included. To simplify the task of developing and implementing maneuvering logics in the TMS, an outer-loop control system known as the Tactical Autopilot (TA) is implemented in the aircraft simulation model. The TA converts guidance commands issued by computerized maneuvering logics in the form of desired angle-of-attack and wind axis-bank angle into inputs to the inner-loop control augmentation system of the aircraft. This report describes the capabilities and operation of the TMS.

  7. Final report for the endowment of simulator agents with human-like episodic memory LDRD.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Speed, Ann Elizabeth; Lippitt, Carl Edward; Thomas, Edward Victor

This report documents work undertaken to endow the cognitive framework currently under development at Sandia National Laboratories with a human-like memory for specific life episodes. Capabilities have been demonstrated within the context of three separate problem areas. The first year of the project developed a capability whereby simulated robots were able to utilize a record of shared experience to perform surveillance of a building to detect a source of smoke. The second year focused on simulations of social interactions providing a queriable record of interactions such that a time series of events could be constructed and reconstructed. The third year addressed tools to promote desktop productivity, creating a capability to query episodic logs in real time, allowing the model of a user to build on itself based on observations of the user's behavior.

  8. Capabilities overview of the MORET 5 Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Cochet, B.; Jinaphanh, A.; Heulers, L.; Jacquet, O.

    2014-06-01

The MORET code is a simulation tool that solves the transport equation for neutrons using the Monte Carlo method. It allows users to model complex three-dimensional geometrical configurations, describe the materials, and define their own tallies in order to analyse the results. The MORET code was initially designed to perform calculations for criticality safety assessments. New features have been introduced in the MORET 5 code to expand its use for reactor applications. This paper presents an overview of the MORET 5 code capabilities, going through the description of materials, the geometry modelling, the transport simulation, and the definition of the outputs.
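The essence of the Monte Carlo method such codes apply is repeated random sampling of neutron histories. The toy sketch below (not MORET code; the cross section, slab thickness, and sample count are illustrative assumptions) estimates the uncollided transmission through a slab, which can be checked against the analytic exponential attenuation law:

```python
import math
import random

def uncollided_transmission(sigma_t, thickness, n_histories, seed=1):
    """Toy Monte Carlo estimate of the probability that a neutron
    crosses a slab of the given thickness without interacting.
    Free paths are sampled from the exponential distribution with
    total macroscopic cross section sigma_t (per cm)."""
    rng = random.Random(seed)
    crossed = 0
    for _ in range(n_histories):
        # distance to first collision: -ln(1 - xi) / sigma_t
        path = -math.log(1.0 - rng.random()) / sigma_t
        if path > thickness:
            crossed += 1
    return crossed / n_histories

estimate = uncollided_transmission(sigma_t=0.5, thickness=2.0,
                                   n_histories=200_000)
analytic = math.exp(-0.5 * 2.0)  # exp(-sigma_t * thickness)
```

With 200,000 histories the estimate agrees with the analytic value to well under one percent; a production code adds geometry tracking, scattering physics, and tallies on top of this sampling core.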

  9. Fractal Landscape Algorithms for Environmental Simulations

    NASA Astrophysics Data System (ADS)

    Mao, H.; Moran, S.

    2014-12-01

Natural science and geographical research are now able to take advantage of environmental simulations that more accurately test experimental hypotheses, resulting in deeper understanding. Experiments affected by the natural environment can benefit from 3D landscape simulations capable of reproducing a variety of terrains and environmental phenomena. Such simulations can employ random terrain generation algorithms that dynamically simulate environments to test specific models against a variety of factors. Through the use of noise functions such as Perlin noise and Simplex noise, and of the diamond-square algorithm, computers can generate simulations that model a variety of landscapes and ecosystems. This study shows how these algorithms work together to create realistic landscapes. By seeding values into the diamond-square algorithm, one can control the shape of the landscape. Perlin noise and Simplex noise are also used to simulate moisture and temperature. The smooth gradient created by coherent noise allows more realistic landscapes to be simulated. Terrain generation algorithms can be used in environmental studies and physics simulations. Potential studies that would benefit from such simulations include the geophysical impact of flash floods or drought on a particular region and the regional impact of global warming and rising sea levels on low-lying areas. Furthermore, terrain generation algorithms also serve as aesthetic tools to display landscapes (Google Earth) and to simulate planetary landscapes. Hence, they can be used as tools to assist science education. Algorithms used to generate these natural phenomena provide scientists a different approach to analyzing our world. The random algorithms used in terrain generation not only generate the terrains themselves but are also capable of simulating weather patterns.
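A minimal sketch of the diamond-square algorithm mentioned above (an illustrative implementation, not the authors' code; the grid size, roughness, and seed are arbitrary choices):

```python
import random

def diamond_square(n, roughness=1.0, seed=0):
    """Generate a (2**n + 1) x (2**n + 1) heightmap with the
    diamond-square algorithm. The four seeded corners control the
    large-scale shape; `roughness` scales the random perturbation,
    which is halved at each subdivision level."""
    size = 2 ** n + 1
    rng = random.Random(seed)
    h = [[0.0] * size for _ in range(size)]
    # seed the four corners
    for r in (0, size - 1):
        for c in (0, size - 1):
            h[r][c] = rng.uniform(-1, 1)
    step, scale = size - 1, roughness
    while step > 1:
        half = step // 2
        # diamond step: set the center of each square
        for r in range(half, size, step):
            for c in range(half, size, step):
                avg = (h[r - half][c - half] + h[r - half][c + half] +
                       h[r + half][c - half] + h[r + half][c + half]) / 4
                h[r][c] = avg + rng.uniform(-scale, scale)
        # square step: set the midpoint of each edge
        for r in range(0, size, half):
            for c in range((r + half) % step, size, step):
                total, count = 0.0, 0
                for dr, dc in ((-half, 0), (half, 0), (0, -half), (0, half)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < size and 0 <= cc < size:
                        total += h[rr][cc]
                        count += 1
                h[r][c] = total / count + rng.uniform(-scale, scale)
        step, scale = half, scale / 2
    return h

terrain = diamond_square(5)  # 33 x 33 heightmap
```

Seeding the corner values controls the overall shape, as the abstract notes, while halving the perturbation scale at each level gives the terrain its fractal character.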

  10. Interactive 3D Models and Simulations for Nuclear Security Education, Training, and Analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warner, David K.; Dickens, Brian Scott; Heimer, Donovan J.

By providing examples of products that have been produced in the past, the authors hope that the audience will gain a more thorough understanding of 3D modeling tools and of the potential applications and capabilities they can provide. The applications and capabilities of these types of tools are truly limited only by one's imagination. The future of three-dimensional models lies in the expansion into the world of virtual reality, where one will experience a fully immersive first-person environment. The use of headsets and hand tools will allow students and instructors to have a more thorough spatial understanding of facilities and scenarios that they will encounter in the real world.

  11. Open-source framework for power system transmission and distribution dynamics co-simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Fan, Rui; Daily, Jeff

The promise of the smart grid entails more interactions between the transmission and distribution networks, and there is an immediate need for tools that provide the comprehensive modelling and simulation required to integrate operations at both transmission and distribution levels. Existing electromagnetic transient simulators can perform simulations with integration of transmission and distribution systems, but the computational burden is high for large-scale system analysis. For transient stability analysis, there are currently only separate tools for simulating transient dynamics of the transmission and distribution systems. In this paper, we introduce an open-source co-simulation framework, the Framework for Network Co-Simulation (FNCS), together with a decoupled simulation approach that links existing transmission and distribution dynamic simulators through FNCS. FNCS is a middleware interface and framework that manages the interaction and synchronization of the transmission and distribution simulators. Preliminary testing results show the validity and capability of the proposed open-source co-simulation framework and the decoupled co-simulation methodology.
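A hedged, toy sketch of the decoupled co-simulation idea (the `Broker` class and the first-order dynamics below are invented stand-ins, not the FNCS API): two simulators advance in lockstep, exchanging boundary values through a middleware object at each time step.

```python
class Broker:
    """Minimal stand-in for co-simulation middleware: it holds the
    most recently published values so each simulator can read its
    counterpart's boundary variables once per step."""
    def __init__(self):
        self.values = {}

    def publish(self, key, value):
        self.values[key] = value

    def get(self, key, default):
        return self.values.get(key, default)

def run_cosim(steps=50, dt=0.1):
    """Lockstep co-simulation of two toy first-order models:
    a 'transmission' voltage that droops with load, and a
    'distribution' load that responds to voltage."""
    broker = Broker()
    voltage, load = 1.0, 0.5
    for _ in range(steps):
        # transmission side: voltage relaxes toward nominal minus droop
        load = broker.get("dist.load", load)
        voltage += dt * (1.0 - 0.2 * load - voltage)
        broker.publish("trans.voltage", voltage)
        # distribution side: load responds to the published voltage
        v_seen = broker.get("trans.voltage", voltage)
        load += dt * (0.5 * v_seen - load)
        broker.publish("dist.load", load)
    return voltage, load

v, p = run_cosim()
```

Despite the decoupling, the exchange converges to the coupled equilibrium (voltage 1/1.1, load 0.5/1.1 for these made-up dynamics); real middleware additionally manages time synchronization across processes and machines.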

  12. RT 24 - Architecture, Modeling & Simulation, and Software Design

    DTIC Science & Technology

    2010-11-01

focus on tool extensions (UPDM, SysML, SoaML, BPMN); leverage "best of breed" architecture methodologies; provide tooling to support the methodology; DoDAF...Capability; Example: BPMN; DoDAF 2.0 MetaModel; BPMN MetaModel; Mapping SysML to DoDAF 2.0; DoDAF V2.0 Models; OV-2; SysML Diagrams; Requirement

  13. Analysis of the Assignment Scheduling Capability for Unmanned Aerial Vehicles (ASC-U) Simulation Tool

    DTIC Science & Technology

    2006-06-01

dynamic programming approach known as a "rolling horizon" approach. This method accounts for state transitions within the simulation rather than modeling ... model is based on the framework developed for Dynamic Allocation of Fires and Sensors used to evaluate factors associated with networking assets in the...of UAVs required by all types of maneuver and support brigades. (Witsken, 2004) The Modeling, Virtual Environments, and Simulations Institute

  14. Simulation and Analyses of Multi-Body Separation in Launch Vehicle Staging Environment

    NASA Technical Reports Server (NTRS)

    Pamadi, Bandu N.; Hotchko, Nathaniel J.; Samareh, Jamshid; Covell, Peter F.; Tartabini, Paul V.

    2006-01-01

The development of methodologies, techniques, and tools for analysis and simulation of multi-body separation is critically needed for successful design and operation of next-generation launch vehicles. As a part of this activity, the ConSep simulation tool is being developed. ConSep is a generic MATLAB-based front- and back-end to the commercially available ADAMS solver, an industry-standard package for solving multi-body dynamics problems. This paper discusses the 3-body separation capability in ConSep and its application to the separation of the Shuttle Solid Rocket Boosters (SRBs) from the External Tank (ET) and the Orbiter. The results are compared with STS-1 flight data.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mattsson, Ann E.

Density Functional Theory (DFT) based Equation of State (EOS) construction is a prominent part of Sandia's capabilities to support engineering sciences. This capability is based on augmenting experimental data with information gained from computational investigations, especially in those parts of the phase space where experimental data is hard, dangerous, or expensive to obtain. A key part of the success of the Sandia approach is the fundamental science work supporting the computational capability. Not only does this work enhance the capability to perform highly accurate calculations but it also provides crucial insight into the limitations of the computational tools, providing high confidence in the results even where results cannot be, or have not yet been, validated by experimental data. This report concerns the key ingredient of projector augmented-wave (PAW) potentials for use in pseudo-potential computational codes. Using the tools discussed in SAND2012-7389 we assess the standard Vienna Ab-initio Simulation Package (VASP) PAWs for Molybdenum.

  16. Maestro Workflow Conductor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Natale, Francesco

    2017-06-01

MaestroWF is a Python tool and software package for loading YAML study specifications that represent a simulation campaign. The package is capable of parameterizing a study, pulling dependencies automatically, formatting output directories, and managing the flow and execution of the campaign. MaestroWF also provides a set of abstracted objects that can be used to develop user-specific scripts for launching simulation campaigns.
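As a loose illustration of what "parameterizing a study" can mean (a generic sketch, not MaestroWF's actual API; `expand_study` is a hypothetical helper), a dict of parameter lists can be expanded into one labeled run per combination:

```python
import itertools

def expand_study(params):
    """Expand a dict mapping parameter names to candidate values into
    one labeled run per combination, the way a study specification is
    typically turned into a simulation campaign."""
    keys = sorted(params)
    runs = []
    for combo in itertools.product(*(params[k] for k in keys)):
        # a label like "pressure=1.temp=300" doubles as a directory name
        label = ".".join(f"{k}={v}" for k, v in zip(keys, combo))
        runs.append({"label": label, "params": dict(zip(keys, combo))})
    return runs

runs = expand_study({"temp": [300, 400], "pressure": [1, 2, 5]})
```

Each run's label can then name an output directory, and the campaign manager schedules the runs and their dependencies.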

  17. Assessment of Near-Field Sonic Boom Simulation Tools

    NASA Technical Reports Server (NTRS)

    Casper, J. H.; Cliff, S. E.; Thomas, S. D.; Park, M. A.; McMullen, M. S.; Melton, J. E.; Durston, D. A.

    2008-01-01

A recent study for the Supersonics Project within the National Aeronautics and Space Administration was conducted to assess current in-house capabilities for the prediction of near-field sonic boom. Such capabilities are required to simulate the highly nonlinear flow near an aircraft, wherein a sonic-boom signature is generated. There are many available computational fluid dynamics codes that could be used to provide the near-field flow for a sonic boom calculation. However, such codes have typically been developed for aerodynamic configuration applications, for which an efficiently generated computational mesh is usually not optimal for sonic boom prediction. Preliminary guidelines are suggested to characterize a state-of-the-art sonic boom prediction methodology. The available simulation tools that are best suited for incorporation into that methodology are identified; preliminary test cases are presented in support of the selection. During this phase of process definition and tool selection, parallel research was conducted in an attempt to establish criteria that link the properties of a computational mesh to the accuracy of a sonic boom prediction. Such properties include sufficient grid density near shocks and within the zone of influence, which are achieved by adaptation and mesh refinement strategies. Prediction accuracy is validated by comparison with wind tunnel data.

  18. Distributed dynamic simulations of networked control and building performance applications.

    PubMed

    Yahiaoui, Azzedine

    2018-02-01

The use of computer-based automation and control systems for smart sustainable buildings, often called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the minimum possible energy consumption; such systems are generally referred to as the Building Automation and Control Systems (BACS) architecture. Instead of costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and to improve the functions of the BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment capable of representing the BACS architecture in simulation by run-time coupling of two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design in this paper.

  19. Distributed dynamic simulations of networked control and building performance applications

    PubMed Central

    Yahiaoui, Azzedine

    2017-01-01

The use of computer-based automation and control systems for smart sustainable buildings, often called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the minimum possible energy consumption; such systems are generally referred to as the Building Automation and Control Systems (BACS) architecture. Instead of costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and to improve the functions of the BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment capable of representing the BACS architecture in simulation by run-time coupling of two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design in this paper. PMID:29568135

  20. Gaining insight into the physics of dynamic atomic force microscopy in complex environments using the VEDA simulator

    NASA Astrophysics Data System (ADS)

    Kiracofe, Daniel; Melcher, John; Raman, Arvind

    2012-01-01

Dynamic atomic force microscopy (dAFM) continues to grow in popularity among scientists in many different fields, and research on new methods and operating modes continues to expand the resolution, capabilities, and types of samples that can be studied. But many promising increases in capability are accompanied by increases in complexity. Indeed, interpreting modern dAFM data can be challenging, especially on complicated material systems, or in liquid environments where the behavior is often contrary to what is known in air or vacuum environments. Mathematical simulations have proven to be an effective tool in providing physical insight into these non-intuitive systems. In this article we describe recent developments in the VEDA (virtual environment for dynamic AFM) simulator, which is a suite of freely available, open-source simulation tools that are delivered through the cloud computing cyber-infrastructure of nanoHUB (www.nanohub.org). Here we describe three major developments. First, simulations in liquid environments are improved by enhancements in the modeling of cantilever dynamics, excitation methods, and solvation shell forces. Second, VEDA is now able to simulate many new advanced modes of operation (bimodal, phase-modulation, frequency-modulation, etc.). Finally, nineteen different tip-sample models are available to simulate the surface physics of a wide variety of material systems, including capillary, specific adhesion, van der Waals, electrostatic, viscoelastic, and hydration forces. These features are demonstrated through example simulations and validated against experimental data in order to provide insight into practical problems in dynamic AFM.
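As one concrete example of the tip-sample model families listed above, the sketch below implements the widely used DMT (Derjaguin-Muller-Toporov) description of van der Waals attraction plus contact repulsion; it is an illustrative formulation with made-up parameter values, not VEDA code:

```python
import math

def dmt_force(d, H=1e-19, R=10e-9, a0=0.3e-9, E_star=1e10):
    """Tip-sample force (N) versus gap d (m) in the DMT model:
    van der Waals attraction while out of contact (d > a0), and
    constant adhesion plus Hertzian repulsion in contact (d <= a0).
    H: Hamaker constant, R: tip radius, a0: intermolecular distance,
    E_star: effective elastic modulus (all values illustrative)."""
    if d > a0:
        return -H * R / (6.0 * d ** 2)      # attractive branch
    return (-H * R / (6.0 * a0 ** 2)        # adhesion at contact
            + (4.0 / 3.0) * E_star * math.sqrt(R) * (a0 - d) ** 1.5)
```

The force is continuous at d = a0 and turns repulsive as the tip indents the surface; a dAFM simulator integrates the cantilever dynamics against force laws of exactly this shape.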

  1. Gaining insight into the physics of dynamic atomic force microscopy in complex environments using the VEDA simulator.

    PubMed

    Kiracofe, Daniel; Melcher, John; Raman, Arvind

    2012-01-01

Dynamic atomic force microscopy (dAFM) continues to grow in popularity among scientists in many different fields, and research on new methods and operating modes continues to expand the resolution, capabilities, and types of samples that can be studied. But many promising increases in capability are accompanied by increases in complexity. Indeed, interpreting modern dAFM data can be challenging, especially on complicated material systems, or in liquid environments where the behavior is often contrary to what is known in air or vacuum environments. Mathematical simulations have proven to be an effective tool in providing physical insight into these non-intuitive systems. In this article we describe recent developments in the VEDA (virtual environment for dynamic AFM) simulator, which is a suite of freely available, open-source simulation tools that are delivered through the cloud computing cyber-infrastructure of nanoHUB (www.nanohub.org). Here we describe three major developments. First, simulations in liquid environments are improved by enhancements in the modeling of cantilever dynamics, excitation methods, and solvation shell forces. Second, VEDA is now able to simulate many new advanced modes of operation (bimodal, phase-modulation, frequency-modulation, etc.). Finally, nineteen different tip-sample models are available to simulate the surface physics of a wide variety of material systems, including capillary, specific adhesion, van der Waals, electrostatic, viscoelastic, and hydration forces. These features are demonstrated through example simulations and validated against experimental data in order to provide insight into practical problems in dynamic AFM.

  2. A Satellite Data Analysis and CubeSat Instrument Simulator Tool for Simultaneous Multi-spacecraft Measurements of Solar Energetic Particles

    NASA Astrophysics Data System (ADS)

    Vannitsen, Jordan; Rizzitelli, Federico; Wang, Kaiti; Segret, Boris; Juang, Jyh-Ching; Miau, Jiun-Jih

    2017-12-01

This paper presents a Multi-satellite Data Analysis and Simulator Tool (MDAST), developed with the original goal to support the science requirements of a Martian 3-Unit CubeSat mission profile named Bleeping Interplanetary Radiation Determination Yo-yo (BIRDY). MDAST was first designed and tested by taking into account the positions, attitudes, instrument fields of view, and energetic particle flux measurements from four spacecraft (ACE, MSL, STEREO A, and STEREO B). Second, the simulated positions, attitudes, and instrument field of view from the BIRDY CubeSat were adapted for input. Finally, this tool can be used for data analysis of the measurements from the four spacecraft mentioned above so as to simulate the instrument trajectory and observation capabilities of the BIRDY CubeSat. The onset, peak, and end time of a solar particle event are specifically defined and identified with this tool. It is not only useful for the BIRDY mission but also for analyzing data from the four satellites aforementioned, and it can be utilized for other space weather missions with further customization.

  3. Multidisciplinary analysis and design of printed wiring boards

    NASA Astrophysics Data System (ADS)

    Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin

    1991-04-01

Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment, denoted Thermal Structural Electromagnetic Testability (TSET), being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development rests on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design built with a systematic, structured methodology.

  4. An automated tool joint inspection device for the drill string

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moyer, M.C.; Dale, B.A.; Kusenberger, F.N.

    1983-02-01

    This paper discusses the development of an automated tool joint inspection device (i.e., the Fatigue Crack Detector), which is capable of detecting defects in the threaded region of drill pipe and drill collars. On the basis of inspection tests conducted at a research test facility and at drilling rig sites, this device is capable of detecting both simulated defects (saw slots and drilled holes) and service-induced defects, such as fatigue cracks, pin stretch (plastic deformation), mashed threads, and corrosion pitting. The system employs an electromagnetic flux-leakage principle and has several advantages over the conventional method of magnetic particle inspection.

  5. Myokit: A simple interface to cardiac cellular electrophysiology.

    PubMed

    Clerx, Michael; Collins, Pieter; de Lange, Enno; Volders, Paul G A

    2016-01-01

Myokit is a new, powerful, and versatile software tool for modeling and simulation of cardiac cellular electrophysiology. Myokit consists of an easy-to-read modeling language, a graphical user interface, single- and multi-cell simulation engines, and a library of advanced analysis tools accessible through a Python interface. Models can be loaded from Myokit's native file format or imported from CellML. Model export is provided to C, MATLAB, CellML, CUDA, and OpenCL. Patch-clamp data can be imported and used to estimate model parameters. In this paper, we review existing tools for simulating the cardiac cellular action potential and find that current tools do not cater specifically to model development and that there is a gap between easy-to-use but limited software and powerful tools that require strong programming skills from their users. We then describe Myokit's capabilities, focusing in detail on its model description language, simulation engines, and import/export facilities. Using three examples, we show how Myokit can be used for clinically relevant investigations, multi-model testing, and parameter estimation in Markov models, all with minimal programming effort from the user. In this way, Myokit bridges a gap between performance, versatility, and user-friendliness. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Comparative analysis of the functionality of simulators of the da Vinci surgical robot.

    PubMed

    Smith, Roger; Truong, Mireille; Perez, Manuela

    2015-04-01

The implementation of robotic technology in minimally invasive surgery has led to the need to develop more efficient and effective training methods, as well as assessment and skill-maintenance tools, for surgical education. Multiple simulators and procedures are available for educational and training purposes. A need for comparative evaluations of these simulators exists to aid users in selecting an appropriate device for their purposes. We conducted an objective review and comparison of the design and capabilities of all dedicated simulators of the da Vinci robot: the da Vinci Skill Simulator (DVSS) (Intuitive Surgical Inc., Sunnyvale, CA, USA), the dV-Trainer (dVT) (Mimic Technologies Inc., Seattle, WA, USA), and the Robotic Surgery Simulator (RoSS) (Simulated Surgical Skills, LLC, Williamsville, NY, USA). This review provides base specifications of the hardware and software, with an emphasis on the training capabilities of each system. Each simulator contains a large number of training exercises for skills development: DVSS = 40, dVT = 65, and RoSS = 52. All three offer 3D visual images but use different display technologies. The DVSS leverages the real robotic surgeon's console to provide visualization, hand controls, and foot pedals; the dVT and RoSS created simulated versions of all of these control systems. They include systems management services that allow instructors to collect, export, and analyze the scores of students using the simulators. This study is the first to provide comparative information on the three simulators' functional capabilities, with an emphasis on their educational features. The simulators offer unique advantages and capabilities in training robotic surgeons. Each device has been the subject of multiple validation experiments that have been published in the literature, but those publications do not provide the specific details of simulator capabilities necessary to select the device best suited to an organization's needs.

  7. Development and Demonstration of a Computational Tool for the Analysis of Particle Vitiation Effects in Hypersonic Propulsion Test Facilities

    NASA Technical Reports Server (NTRS)

    Perkins, Hugh Douglas

    2010-01-01

In order to improve the understanding of particle vitiation effects in hypersonic propulsion test facilities, a quasi-one-dimensional numerical tool was developed to efficiently model reacting particle-gas flows over a wide range of conditions. Features of this code include gas-phase finite-rate kinetics, a global porous-particle combustion model, mass, momentum, and energy interactions between phases, and subsonic and supersonic particle drag and heat transfer models. The basic capabilities of this tool were validated against available data or other validated codes. To demonstrate the capabilities of the code, a series of computations was performed for a model hypersonic propulsion test facility and scramjet. Parameters studied were simulated flight Mach number, particle size, particle mass fraction, and particle material.

  8. Distributed collaborative environments for virtual capability-based planning

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2003-09-01

Distributed collaboration is an emerging technology that will significantly change how decisions are made in the 21st century. Collaboration involves two or more geographically dispersed individuals working together to share and exchange data, information, knowledge, and actions. The marriage of information, collaboration, and simulation technologies provides the decision maker with a collaborative virtual environment for planning and decision support. This paper reviews research focused on applying an open-standards, agent-based framework with integrated modeling and simulation to a new Air Force initiative in capability-based planning, and on the ability to implement it in a distributed virtual environment. The Virtual Capability Planning effort will provide decision-quality knowledge for Air Force resource allocation and investment planning, including examination of proposed capabilities and the cost of alternative approaches, the impact of technologies, identification of primary risk drivers, and creation of executable acquisition strategies. The transformed Air Force business processes are enabled by iterative use of constructive and virtual modeling, simulation, and analysis together with information technology. These tools are applied collaboratively via a technical framework by all the affected stakeholders: warfighter, laboratory, product center, logistics center, test center, and primary contractor.

  9. FY17 Status Report on NEAMS Neutronics Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, C. H.; Jung, Y. S.; Smith, M. A.

    2017-09-30

    Under the U.S. DOE NEAMS program, a high-fidelity neutronics code system has been developed to support the multiphysics modeling and simulation capability named SHARP. The neutronics code system includes the high-fidelity neutronics code PROTEUS, the cross section library and preprocessing tools, the multigroup cross section generation code MC2-3, an in-house mesh generation tool, the perturbation and sensitivity analysis code PERSENT, and post-processing tools. The main objectives of the NEAMS neutronics activities in FY17 are to continue development of an advanced nodal solver in PROTEUS for use in nuclear reactor design and analysis projects, implement a simplified sub-channel-based thermal-hydraulic (T/H) capability into PROTEUS to efficiently compute the thermal feedback, improve the performance of PROTEUS-MOCEX using numerical acceleration and code optimization, improve the cross section generation tools including MC2-3, and continue to perform verification and validation tests for PROTEUS.

  10. Application Program Interface for the Orion Aerodynamics Database

    NASA Technical Reports Server (NTRS)

    Robinson, Philip E.; Thompson, James

    2013-01-01

    The Application Programming Interface (API) for the Crew Exploration Vehicle (CEV) Aerodynamic Database has been developed to provide the developers of software an easily implemented, fully self-contained method of accessing the CEV Aerodynamic Database for use in their analysis and simulation tools. The API is programmed in C and provides a series of functions to interact with the database, such as initialization, selecting various options, and calculating the aerodynamic data. No special functions (file read/write, table lookup) are required on the host system other than those included with a standard ANSI C installation. It reads one or more files of aero data tables. Previous releases of aerodynamic databases for space vehicles have only included data tables and a document of the algorithm and equations to combine them for the total aerodynamic forces and moments. This process required each software tool to have a unique implementation of the database code. Errors or omissions in the documentation, or errors in the implementation, led to a lengthy and burdensome process of having to debug each instance of the code. Additionally, input file formats differ for each space vehicle simulation tool, requiring the aero database tables to be reformatted to meet the tool's input file structure requirements. Finally, the capabilities for built-in table lookup routines vary for each simulation tool. Implementation of a new database may require an update to and verification of the table lookup routines. This may be required if the number of dimensions of a data table exceeds the capability of the simulation tool's built-in lookup routines. A single software solution was created to provide an aerodynamics software model that could be integrated into other simulation and analysis tools. The highly complex Orion aerodynamics model can then be quickly included in a wide variety of tools. The API code is written in ANSI C for ease of portability to a wide variety of systems.
The input data files are in standard formatted ASCII, also for improved portability. The API contains its own implementation of multidimensional table reading and lookup routines. The same aerodynamics input file can be used without modification on all implementations. The turnaround time from aerodynamics model release to a working implementation is significantly reduced.
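
The table-lookup idea at the heart of such an API can be illustrated with a one-dimensional Python sketch (the API itself is ANSI C, and this helper and its data are hypothetical, not the CEV database; a real aero database layers this kind of clamped linear interpolation into N dimensions):

```python
from bisect import bisect_right

def table_lookup(breakpoints, values, x):
    """Linear interpolation with endpoint clamping over a sorted
    breakpoint table (illustrative stand-in for a database lookup)."""
    if x <= breakpoints[0]:
        return values[0]
    if x >= breakpoints[-1]:
        return values[-1]
    i = bisect_right(breakpoints, x) - 1
    frac = (x - breakpoints[i]) / (breakpoints[i + 1] - breakpoints[i])
    return values[i] + frac * (values[i + 1] - values[i])

# e.g. a drag coefficient tabulated against Mach number
mach = [0.5, 0.9, 1.1, 2.0]
cd = [0.30, 0.45, 0.80, 0.55]
cd_at_mach_1 = table_lookup(mach, cd, 1.0)   # midway between 0.45 and 0.80
```

Bundling the lookup with the data, as the API does, is what lets every simulation tool share one verified implementation.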

  11. Assessment of the Draft AIAA S-119 Flight Dynamic Model Exchange Standard

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Murri, Daniel G.; Hill, Melissa A.; Jessick, Matthew V.; Penn, John M.; Hasan, David A.; Crues, Edwin Z.; Falck, Robert D.; McCarthy, Thomas G.; Vuong, Nghia

    2011-01-01

    An assessment of a draft AIAA standard for flight dynamics model exchange, ANSI/AIAA S-119-2011, was conducted on behalf of NASA by a team from the NASA Engineering and Safety Center. The assessment included adding the capability of importing standard models into real-time simulation facilities at several NASA Centers as well as into analysis simulation tools. All participants were successful at importing two example models into their respective simulation frameworks by using existing software libraries or by writing new import tools. Deficiencies in the libraries and format documentation were identified and fixed; suggestions for improvements to the standard were provided to the AIAA. An innovative tool to generate C code directly from such a model was developed. Performance of the software libraries compared favorably with compiled code. As a result of this assessment, several NASA Centers can now import standard models directly into their simulations. NASA is considering adopting the now-published S-119 standard as an internal recommended practice.

  12. A manipulator arm for zero-g simulations

    NASA Technical Reports Server (NTRS)

    Brodie, S. B.; Grant, C.; Lazar, J. J.

    1975-01-01

    A 12-ft counterbalanced Slave Manipulator Arm (SMA) was designed and fabricated to be used for resolving the questions of operational applications, capabilities, and limitations for such remote manned systems as the Payload Deployment and Retrieval Mechanism (PDRM) for the shuttle, the Free-Flying Teleoperator System, the Advanced Space Tug, and Planetary Rovers. As a developmental tool for the shuttle manipulator system (or PDRM), the SMA represents an approximate one-quarter scale working model for simulating and demonstrating payload handling, docking assistance, and satellite servicing. For the Free-Flying Teleoperator System and the Advanced Tug, the SMA provides a near full-scale developmental tool for satellite servicing, docking, and deployment/retrieval procedures, techniques, and support equipment requirements. For the Planetary Rovers, it provides an oversize developmental tool for sample handling and soil mechanics investigations. The design of the SMA was based on concepts developed for a 40-ft NASA technology arm to be used for zero-g shuttle manipulator simulations.

  13. Configuring the Orion Guidance, Navigation, and Control Flight Software for Automated Sequencing

    NASA Technical Reports Server (NTRS)

    Odegard, Ryan G.; Siliwinski, Tomasz K.; King, Ellis T.; Hart, Jeremy J.

    2010-01-01

    The Orion Crew Exploration Vehicle is being designed with greater automation capabilities than any other crewed spacecraft in NASA's history. The Guidance, Navigation, and Control (GN&C) flight software architecture is designed to provide a flexible and evolvable framework that accommodates increasing levels of automation over time. Within the GN&C flight software, a data-driven approach is used to configure software. This approach allows data reconfiguration and updates to automated sequences without requiring recompilation of the software. Because of the great dependency of the automation and the flight software on the configuration data, the data management is a vital component of the processes for software certification, mission design, and flight operations. To enable the automated sequencing and data configuration of the GN&C subsystem on Orion, a desktop database configuration tool has been developed. The database tool allows the specification of the GN&C activity sequences, the automated transitions in the software, and the corresponding parameter reconfigurations. These aspects of the GN&C automation on Orion are all coordinated via data management, and the database tool provides the ability to test the automation capabilities during the development of the GN&C software. In addition to providing the infrastructure to manage the GN&C automation, the database tool has been designed with capabilities to import and export artifacts for simulation analysis and documentation purposes. Furthermore, the database configuration tool, currently used to manage simulation data, is envisioned to evolve into a mission planning tool for generating and testing GN&C software sequences and configurations.
A key enabler of the GN&C automation design, the database tool allows both the creation and maintenance of the data artifacts, as well as serving the critical role of helping to manage, visualize, and understand the data-driven parameters both during software development and throughout the life of the Orion project.

  14. Applying a multi-replication framework to support dynamic situation assessment and predictive capabilities

    NASA Astrophysics Data System (ADS)

    Lammers, Craig; McGraw, Robert M.; Steinman, Jeffrey S.

    2005-05-01

    Technological advances and emerging threats reduce the time between target detection and action to an order of a few minutes. To effectively assist with the decision-making process, C4I decision support tools must quickly and dynamically predict and assess alternative Courses Of Action (COAs) to assist Commanders in anticipating potential outcomes. These capabilities can be provided through faster-than-real-time predictive simulation of plans that is continuously recalibrated with the real-time picture. This capability allows decision-makers to assess the effects of re-tasking opportunities, providing the decision-maker with tremendous freedom to make time-critical, mid-course decisions. This paper presents an overview and demonstrates the use of a software infrastructure that supports dynamic situation assessment and prediction (DSAP) capabilities. These DSAP capabilities are demonstrated through the use of a Multi-Replication Framework that supports (1) predictive simulations using JSAF (Joint Semi-Automated Forces); (2) real-time simulation, also using JSAF, as a state estimation mechanism; and (3) real-time C4I data updates through TBMCS (Theater Battle Management Core Systems). This infrastructure allows multiple replications of a simulation to be executed simultaneously over a grid faster than real time, calibrated with live data feeds. A cost evaluator mechanism analyzes potential outcomes and prunes simulations that diverge from the real-time picture. In particular, this paper walks a user through the process of using the Multi-Replication Framework, providing an enhanced decision aid.
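
    The cost-evaluator pruning described above can be sketched as follows (the distance metric, threshold, and record layout are illustrative assumptions, not the framework's actual interfaces):

    ```python
    # Sketch of multi-replication pruning: replications whose predicted
    # state drifts too far from the live real-time picture are dropped.

    def prune(replications, observed, threshold):
        """Keep replications whose cost (Euclidean distance of the
        predicted state from the observed state) is under threshold."""
        def cost(predicted):
            return sum((p - o) ** 2 for p, o in zip(predicted, observed)) ** 0.5
        return [r for r in replications if cost(r["state"]) < threshold]

    replications = [
        {"coa": "A", "state": (10.0, 5.0)},
        {"coa": "B", "state": (11.0, 5.5)},
        {"coa": "C", "state": (40.0, -3.0)},   # diverged from the live feed
    ]
    survivors = prune(replications, observed=(10.5, 5.2), threshold=5.0)
    ```

    Only the replications consistent with the live data survive to continue running faster than real time.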

  15. Nuclear Engine System Simulation (NESS). Volume 1: Program user's guide

    NASA Astrophysics Data System (ADS)

    Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.

    1993-03-01

    A Nuclear Thermal Propulsion (NTP) engine system design analysis tool is required to support current and future Space Exploration Initiative (SEI) propulsion and vehicle design studies. Currently available NTP engine design models are those developed during the NERVA program in the 1960's and early 1970's and are highly unique to that design or are modifications of current liquid propulsion system design models. To date, NTP engine-based liquid design models lack integrated design of key NTP engine design features in the areas of reactor, shielding, multi-propellant capability, and multi-redundant pump feed fuel systems. Additionally, since the SEI effort is in the initial development stage, a robust, verified NTP analysis design tool could be of great use to the community. This effort developed an NTP engine system design analysis program (tool), known as the Nuclear Engine System Simulation (NESS) program, to support ongoing and future engine system and stage design study efforts. In this effort, Science Applications International Corporation's (SAIC) NTP version of the Expanded Liquid Engine Simulation (ELES) program was modified extensively to include Westinghouse Electric Corporation's near-term solid-core reactor design model. The ELES program has extensive capability to conduct preliminary system design analysis of liquid rocket systems and vehicles. The program is modular in nature and is versatile in terms of modeling state-of-the-art component and system options as discussed. The Westinghouse reactor design model, which was integrated in the NESS program, is based on the near-term solid-core ENABLER NTP reactor design concept. This program is now capable of accurately modeling (characterizing) a complete near-term solid-core NTP engine system in great detail, for a number of design options, in an efficient manner. 
The following discussion summarizes the overall analysis methodology, key assumptions, and capabilities associated with NESS; presents an example problem; and compares the results to related NTP engine system designs. Initial installation instructions and program disks are in Volume 2 of the NESS Program User's Guide.

  16. Unsteady Turbopump Flow Simulations

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin C.; Kwak, Dochan

    2001-01-01

    The objective of the current effort is two-fold: 1) to provide a computational framework for design and analysis of the entire fuel supply system of a liquid rocket engine; and 2) to provide high-fidelity unsteady turbopump flow analysis capability to support the design of pump sub-systems for advanced space transportation vehicles. Since space launch systems in the near future are likely to involve liquid propulsion systems, increasing the efficiency and reliability of the turbopump components is an important task. To date, computational tools for design/analysis of turbopump flow have been based on relatively lower-fidelity methods. An unsteady, three-dimensional viscous flow analysis tool involving stationary and rotational components for the entire turbopump assembly has not been available, at least for real-world engineering applications. The present effort is an attempt to provide this capability so that developers of the vehicle will be able to extract such information as transient flow phenomena at start-up, the impact of non-uniform inflow, system vibration, and impact on the structure. These quantities are not readily available from simplified design tools. In this presentation, the progress being made toward a complete turbopump simulation capability for a liquid rocket engine is reported. The Space Shuttle Main Engine (SSME) turbopump is used as a test case for the performance evaluation of the hybrid MPI/OpenMP and MLP versions of the INS3D code. Relative motion of the grid system for rotor-stator interaction was obtained by employing overset grid techniques. Time-accuracy of the scheme has been evaluated by using simple test cases. Unsteady computations for the SSME turbopump, which contains 106 zones with 34.5 million grid points, are currently underway on Origin 2000 systems at NASA Ames Research Center. Results from these time-accurate simulations with moving boundary capability and the performance of the parallel versions of the code will be presented.

  17. Nuclear Engine System Simulation (NESS). Volume 1: Program user's guide

    NASA Technical Reports Server (NTRS)

    Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.

    1993-01-01

    A Nuclear Thermal Propulsion (NTP) engine system design analysis tool is required to support current and future Space Exploration Initiative (SEI) propulsion and vehicle design studies. Currently available NTP engine design models are those developed during the NERVA program in the 1960's and early 1970's and are highly unique to that design or are modifications of current liquid propulsion system design models. To date, NTP engine-based liquid design models lack integrated design of key NTP engine design features in the areas of reactor, shielding, multi-propellant capability, and multi-redundant pump feed fuel systems. Additionally, since the SEI effort is in the initial development stage, a robust, verified NTP analysis design tool could be of great use to the community. This effort developed an NTP engine system design analysis program (tool), known as the Nuclear Engine System Simulation (NESS) program, to support ongoing and future engine system and stage design study efforts. In this effort, Science Applications International Corporation's (SAIC) NTP version of the Expanded Liquid Engine Simulation (ELES) program was modified extensively to include Westinghouse Electric Corporation's near-term solid-core reactor design model. The ELES program has extensive capability to conduct preliminary system design analysis of liquid rocket systems and vehicles. The program is modular in nature and is versatile in terms of modeling state-of-the-art component and system options as discussed. The Westinghouse reactor design model, which was integrated in the NESS program, is based on the near-term solid-core ENABLER NTP reactor design concept. This program is now capable of accurately modeling (characterizing) a complete near-term solid-core NTP engine system in great detail, for a number of design options, in an efficient manner. 
The following discussion summarizes the overall analysis methodology, key assumptions, and capabilities associated with NESS; presents an example problem; and compares the results to related NTP engine system designs. Initial installation instructions and program disks are in Volume 2 of the NESS Program User's Guide.

  18. Distributed environmental control

    NASA Technical Reports Server (NTRS)

    Cleveland, Gary A.

    1992-01-01

    We present an architecture of distributed, independent control agents designed to work with the Computer Aided System Engineering and Analysis (CASE/A) simulation tool. CASE/A simulates behavior of Environmental Control and Life Support Systems (ECLSS). We describe a lattice of agents capable of distributed sensing and overcoming certain sensor and effector failures. We address how the architecture can achieve the coordinating functions of a hierarchical command structure while maintaining the robustness and flexibility of independent agents. These agents work between the time steps of the CASE/A simulation tool to arrive at command decisions based on the state variables maintained by CASE/A. Control is evaluated according to both effectiveness (e.g., how well temperature was maintained) and resource utilization (the amount of power and materials used).
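
    The agent behavior and evaluation described above can be sketched minimally (the setpoint, deadband, and scoring weights are illustrative assumptions, not CASE/A values):

    ```python
    # Sketch of an independent control agent acting between simulation
    # time steps: it reads a state variable, issues a command, and is
    # scored on both effectiveness and resource utilization.

    def agent_step(temperature, setpoint=295.0, deadband=1.0):
        """Bang-bang command decision on the latest simulated state."""
        if temperature < setpoint - deadband:
            return "HEATER_ON"
        if temperature > setpoint + deadband:
            return "HEATER_OFF"
        return "HOLD"

    def score(temps, power_used, setpoint=295.0, w_temp=1.0, w_power=0.01):
        """Lower is better: mean temperature error plus weighted power use,
        mirroring the effectiveness-versus-resources evaluation."""
        error = sum(abs(t - setpoint) for t in temps) / len(temps)
        return w_temp * error + w_power * power_used
    ```

    A lattice of such agents, each scored this way, can trade control effectiveness against power and material consumption between CASE/A time steps.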

  19. Six degree of freedom simulation system for evaluating automated rendezvous and docking spacecraft

    NASA Technical Reports Server (NTRS)

    Rourke, Kenneth H.; Tsugawa, Roy K.

    1991-01-01

    Future logistics supply and servicing vehicles such as cargo transfer vehicles (CTV) must have full 6 degree of freedom (6DOF) capability in order to perform requisite rendezvous, proximity operations, and capture operations. The design and performance issues encountered when developing a 6DOF maneuvering spacecraft are very complex with subtle interactions which are not immediately obvious or easily anticipated. In order to deal with these complexities and develop robust maneuvering spacecraft designs, a simulation system and associated family of tools are used at TRW for generating and validating spacecraft performance requirements and guidance algorithms. An overview of the simulator and tools is provided. These are used by TRW for autonomous rendezvous and docking research projects including CTV studies.

  20. Performance modeling & simulation of complex systems (A systems engineering design & analysis approach)

    NASA Technical Reports Server (NTRS)

    Hall, Laverne

    1995-01-01

    Modeling of the Multi-mission Image Processing System (MIPS) will be described as an example of the use of a modeling tool to design a distributed system that supports multiple application scenarios. This paper examines: (a) modeling tool selection, capabilities, and operation (namely NETWORK 2.5 by CACI), (b) pointers for building or constructing a model and how the MIPS model was developed, (c) the importance of benchmarking or testing the performance of equipment/subsystems being considered for incorporation into the design/architecture, (d) the essential step of model validation and/or calibration using the benchmark results, (e) sample simulation results from the MIPS model, and (f) how modeling and simulation analysis affected the MIPS design process by having a supportive and informative impact.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Richard Edward; Cetiner, Sacit M.; Fugate, David L.

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the third year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light-water cooled) concepts, including the use of multiple coupled reactors at a single site. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced SMR models, ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface (ICHMI) technical area, and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the Modular Dynamic SIMulation (MoDSIM) tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the program, (2) developing a library of baseline component modules that can be assembled into full plant models using existing geometry and thermal-hydraulic data, (3) defining modeling conventions for interconnecting component models, and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.
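
    The component-library and interconnection ideas in elements (2) and (3) can be sketched as follows (all class names and the connection convention are illustrative, not MoDSIM's actual interfaces):

    ```python
    # Sketch of assembling baseline component modules into a plant model
    # via a simple outlet-to-inlet interconnection convention.

    class Component:
        def __init__(self, name):
            self.name = name
            self.downstream = None

        def connect(self, other):
            """Convention: the outlet of self feeds the inlet of other.
            Returns other so connections can be chained."""
            self.downstream = other
            return other

        def path(self):
            """List component names along the flow path."""
            node, names = self, []
            while node is not None:
                names.append(node.name)
                node = node.downstream
            return names

    core = Component("reactor-core")
    core.connect(Component("heat-exchanger")).connect(Component("turbine"))
    ```

    A fixed convention like this is what lets independently developed component models be assembled into full plant models without per-model glue code.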

  2. An AI approach for scheduling space-station payloads at Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Castillo, D.; Ihrie, D.; Mcdaniel, M.; Tilley, R.

    1987-01-01

    The Payload Processing for Space-Station Operations (PHITS) system is a prototype modeling tool capable of addressing many Space Station related concerns. The system's object-oriented design approach, coupled with a powerful user interface, provides the user with capabilities to easily define and model many applications. PHITS differs from many artificial intelligence based systems in that it couples scheduling and goal-directed simulation to ensure that on-orbit requirement dates are satisfied.

  3. A Three-Dimensional Parallel Time-Accurate Turbopump Simulation Procedure Using Overset Grid System

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin; Chan, William; Kwak, Dochan

    2002-01-01

    The objective of the current effort is to provide a computational framework for design and analysis of the entire fuel supply system of a liquid rocket engine, including high-fidelity unsteady turbopump flow analysis. This capability is needed to support the design of pump sub-systems for advanced space transportation vehicles that are likely to involve liquid propulsion systems. To date, computational tools for design/analysis of turbopump flows have been based on relatively lower-fidelity methods. An unsteady, three-dimensional viscous flow analysis tool involving stationary and rotational components for the entire turbopump assembly has not been available for real-world engineering applications. The present effort provides developers with information such as transient flow phenomena at start-up and the impact of non-uniform inflow, and will eventually address system vibration and impact on the structure. In this paper, the progress toward the capability of complete simulation of the turbopump for a liquid rocket engine is reported. The Space Shuttle Main Engine (SSME) turbopump is used as a test case for evaluation of the hybrid MPI/OpenMP and MLP versions of the INS3D code. CAD-to-solution auto-scripting capability is being developed for turbopump applications. The relative motion of the grid systems for the rotor-stator interaction was obtained using overset grid techniques. Unsteady computations for the SSME turbopump, which contains 114 zones with 34.5 million grid points, are carried out on Origin 3000 systems at NASA Ames Research Center. Results from these time-accurate simulations with moving boundary capability are presented along with the performance of parallel versions of the code.

  4. A Software Upgrade of the NASA Aeroheating Code "MINIVER"

    NASA Technical Reports Server (NTRS)

    Louderback, Pierce Mathew

    2013-01-01

    Computational Fluid Dynamics (CFD) is a powerful and versatile tool for simulating the fluid and thermal environments of launch and re-entry vehicles alike. Where it excels in power and accuracy, however, it lacks in speed. An alternative tool for this purpose is MINIVER, an aeroheating code widely used by NASA and within the aerospace industry. Capable of providing swift, reasonably accurate approximations of the fluid and thermal environment of launch vehicles, MINIVER is used where time is of the essence and accuracy need not be exact. However, MINIVER is an aging tool: running on an unfriendly legacy command-line interface, it is difficult for it to keep pace with more modern software tools. Florida Institute of Technology was tasked with the construction of a new Graphical User Interface (GUI) that implemented the legacy version's capabilities and enhanced them with new tools and utilities. This thesis provides background on the legacy version of the program, the progression and final version of a modern user interface, and benchmarks to demonstrate its usefulness.

  5. Integrated Modeling, Mapping, and Simulation (IMMS) framework for planning exercises.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman-Hill, Ernest J.; Plantenga, Todd D.

    2010-06-01

    The Integrated Modeling, Mapping, and Simulation (IMMS) program is designing and prototyping a simulation and collaboration environment for linking together existing and future modeling and simulation tools to enable analysts, emergency planners, and incident managers to more effectively, economically, and rapidly prepare, analyze, train, and respond to real or potential incidents. When complete, the IMMS program will demonstrate an integrated modeling and simulation capability that supports emergency managers and responders with (1) conducting 'what-if' analyses and exercises to address preparedness, analysis, training, operations, and lessons learned, and (2) effectively, economically, and rapidly verifying response tactics, plans and procedures.

  6. SpaceNet: Modeling and Simulating Space Logistics

    NASA Technical Reports Server (NTRS)

    Lee, Gene; Jordan, Elizabeth; Shishko, Robert; de Weck, Olivier; Armar, Nii; Siddiqi, Afreen

    2008-01-01

    This paper summarizes the current state of the art in interplanetary supply chain modeling and discusses SpaceNet as one particular method and tool to address space logistics modeling and simulation challenges. Fundamental upgrades to the interplanetary supply chain framework such as process groups, nested elements, and cargo sharing, enabled SpaceNet to model an integrated set of missions as a campaign. The capabilities and uses of SpaceNet are demonstrated by a step-by-step modeling and simulation of a lunar campaign.

  7. An extensive coronagraphic simulation applied to LBT

    NASA Astrophysics Data System (ADS)

    Vassallo, D.; Carolo, E.; Farinato, J.; Bergomi, M.; Bonavita, M.; Carlotti, A.; D'Orazi, V.; Greggio, D.; Magrin, D.; Mesa, D.; Pinna, E.; Puglisi, A.; Stangalini, M.; Verinaud, C.; Viotto, V.

    2016-08-01

    In this article we report the results of a comprehensive simulation program aimed at investigating the coronagraphic capabilities of SHARK-NIR, a camera selected to proceed to the final design phase at the Large Binocular Telescope. For this purpose, we developed a dedicated simulation tool based on physical optics propagation. The code propagates wavefronts through the SHARK optical train in an end-to-end fashion and can implement any kind of coronagraph. Detection limits can finally be computed, exploring a wide range of Strehl values and observing conditions.
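
    The end-to-end propagation approach can be illustrated with a toy FFT-based sketch (the pupil geometry and the simple opaque focal-plane mask are illustrative, not SHARK-NIR's optical design):

    ```python
    import numpy as np

    # Toy end-to-end propagation: a circular pupil is propagated to the
    # focal plane with an FFT, and an opaque focal-plane mask suppresses
    # the stellar core, in the spirit of a coronagraph simulation.

    n = 256
    y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
    pupil = (np.hypot(x, y) < n // 8).astype(float)      # circular aperture

    focal = np.fft.fftshift(np.fft.fft2(np.fft.fftshift(pupil)))
    psf = np.abs(focal) ** 2                             # on-axis star image

    mask = np.hypot(x, y) > 4.0                          # occult the core
    psf_masked = psf * mask
    ```

    A full simulator would propagate aberrated wavefronts through every optical surface and repeat this over many atmospheric realizations to build detection limits; the FFT step above is the elementary building block.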

  8. Development of a Searchable Metabolite Database and Simulator of Xenobiotic Metabolism

    EPA Science Inventory

    A computational tool (MetaPath) has been developed for storage and analysis of metabolic pathways and associated metadata. The system is capable of sophisticated text and chemical structure/substructure searching as well as rapid comparison of metabolites formed across chemicals,...

  9. The Precision Formation Flying Integrated Analysis Tool (PFFIAT)

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

    2004-01-01

    Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

  10. The Precision Formation Flying Integrated Analysis Tool (PFFIAT)

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

    2004-01-01

    Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

  11. Development and operation of a real-time simulation at the NASA Ames Vertical Motion Simulator

    NASA Technical Reports Server (NTRS)

    Sweeney, Christopher; Sheppard, Shirin; Chetelat, Monique

    1993-01-01

    The Vertical Motion Simulator (VMS) facility at the NASA Ames Research Center combines the largest vertical motion capability in the world with a flexible real-time operating system allowing research to be conducted quickly and effectively. Due to the diverse nature of the aircraft simulated and the large number of simulations conducted annually, the challenge for the simulation engineer is to develop an accurate real-time simulation in a timely, efficient manner. The SimLab facility and the software tools necessary for an operating simulation will be discussed. Subsequent sections will describe the development process through operation of the simulation; this includes acceptance of the model, validation, integration and production phases.

  12. eLoom and Flatland: specification, simulation and visualization engines for the study of arbitrary hierarchical neural architectures.

    PubMed

    Caudell, Thomas P; Xiao, Yunhai; Healy, Michael J

    2003-01-01

    eLoom is an open source graph simulation software tool, developed at the University of New Mexico (UNM), that enables users to specify and simulate neural network models. Its specification language and libraries enable users to construct and simulate arbitrary, potentially hierarchical network structures on serial and parallel processing systems. In addition, eLoom is integrated with UNM's Flatland, an open source virtual environment development tool, to provide real-time visualizations of the network structure and activity. Visualization is a useful method for understanding both learning and computation in artificial neural networks. Through 3D animated pictorial representations of the state and flow of information in the network, a better understanding of network functionality is achieved. ART-1, LAPART-II, MLP, and SOM neural networks are presented to illustrate eLoom's and Flatland's capabilities.
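
The core idea of specifying a network as a graph and then simulating its activity can be sketched as follows. This is not eLoom's actual API; the node names and weights are invented, and a single MLP-style logistic unit stands in for the richer architectures the abstract mentions.

```python
import math

# Minimal sketch of graph-specified network simulation: nodes and weighted
# edges are declared as plain data, then activity is propagated through them.
def propagate(weights, inputs, bias=0.0):
    """One synchronous update: each target node sums weighted source activity,
    then squashes the sum with a logistic unit (as in an MLP-style node)."""
    net = {}
    for (src, dst), w in weights.items():
        net[dst] = net.get(dst, 0.0) + w * inputs.get(src, 0.0)
    return {n: 1.0 / (1.0 + math.exp(-(v + bias))) for n, v in net.items()}

# A two-input, one-hidden-node graph (names and weights are illustrative):
weights = {("x1", "h"): 2.0, ("x2", "h"): -1.0}
act = propagate(weights, {"x1": 1.0, "x2": 1.0})
# Node "h" receives 2.0 - 1.0 = 1.0 before the logistic squash.
```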

  13. Cost analysis of objective resident cataract surgery assessments.

    PubMed

    Nandigam, Kiran; Soh, Jonathan; Gensheimer, William G; Ghazi, Ahmed; Khalifa, Yousuf M

    2015-05-01

    To compare 8 ophthalmology resident surgical training tools to determine which is most cost effective. University of Rochester Medical Center, Rochester, New York, USA. Retrospective evaluation of technology. A cost-analysis model was created to compile all relevant costs of running each tool in a medium-sized ophthalmology program. Quantitative cost estimates were obtained based on the cost of the tools, the cost of time spent in evaluations, and supply and maintenance costs. For wet laboratory simulation, Eyesi was the least expensive cataract surgery simulation method; however, it is only capable of evaluating simulated cataract surgery rehearsal and requires supplementation with other evaluative methods for operating room performance and for noncataract wet lab training and evaluation. The most expensive training tool was the Eye Surgical Skills Assessment Test (ESSAT). The 2 most affordable methods for resident evaluation in operating room performance were the Objective Assessment of Skills in Intraocular Surgery (OASIS) and the Global Rating Assessment of Skills in Intraocular Surgery (GRASIS). Cost-based analysis of ophthalmology resident surgical training tools is needed so that residency programs can implement tools that are valid, reliable, objective, and cost effective. There is no perfect training system at this time. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  14. VIPER: Virtual Intelligent Planetary Exploration Rover

    NASA Technical Reports Server (NTRS)

    Edwards, Laurence; Flueckiger, Lorenzo; Nguyen, Laurent; Washington, Richard

    2001-01-01

    Simulation and visualization of rover behavior are critical capabilities for scientists and rover operators to construct, test, and validate plans for commanding a remote rover. The VIPER system links these capabilities, using a high-fidelity virtual-reality (VR) environment, a kinematically accurate simulator, and a flexible plan executive to allow users to simulate and visualize possible execution outcomes of a plan under development. This work is part of a larger vision of a science-centered rover control environment, where a scientist may inspect and explore the environment via VR tools, specify science goals, and visualize the expected and actual behavior of the remote rover. The VIPER system is constructed from three generic systems, linked together via a minimal amount of customization into the integrated system. The complete system points out the power of combining plan execution, simulation, and visualization for envisioning rover behavior; it also demonstrates the utility of developing generic technologies, which can be combined in novel and useful ways.

  15. Analysis of Error Propagation Within Hierarchical Air Combat Models

    DTIC Science & Technology

    2016-06-01

    Model Simulation MANA Map Aware Non-Uniform Automata MCET Mine Warfare Capabilities and Effectiveness Tool MOE measure of effectiveness MOP measure of...model for a two-versus-two air engagement between jet fighters in the stochastic, agent-based Map Aware Non-Uniform Automata (MANA) simulation...Master's thesis, Naval Postgraduate School, Monterey, CA. McIntosh, G. C. (2009). MANA-V (Map aware non-uniform automata – Vector) supplementary manual

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutqvist, Jonny; Blanco Martin, Laura; Mukhopadhyay, Sumit

    In this report, we present FY2014 progress by Lawrence Berkeley National Laboratory (LBNL) related to modeling of coupled thermal-hydrological-mechanical-chemical (THMC) processes in salt and their effect on brine migration at high temperatures. LBNL's work on the modeling of coupled THMC processes in salt was initiated in FY2012, focusing on exploring and demonstrating the capabilities of an existing LBNL modeling tool (TOUGH-FLAC) for simulating temperature-driven coupled flow and geomechanical processes in salt. This work includes development related to, and implementation of, essential capabilities, as well as testing the model against relevant information and published experimental data related to the fate and transport of water. We provide more details on the FY2014 work, first presenting updated tools and improvements made to the TOUGH-FLAC simulator, and the use of this updated tool in a new model simulation of long-term THM behavior within a generic repository in a salt formation. This is followed by a description of current benchmarking and validation efforts, including the TSDE experiment. We then present the current status in the development of constitutive relationships and the dual-continuum model for brine migration. We conclude with an outlook for FY2015, which will focus largely on model validation against field experiments and on the use of the model for design studies related to a proposed heater experiment.

  17. A generic multi-flex-body dynamics, controls simulation tool for space station

    NASA Technical Reports Server (NTRS)

    London, Ken W.; Lee, John F.; Singh, Ramen P.; Schubele, Buddy

    1991-01-01

    An order-n multi-flex-body Space Station simulation tool is introduced. The flex multibody modeling is generic enough to model all phases of Space Station, from build-up through the Assembly Complete configuration and beyond. Multibody subsystems such as the Mobile Servicing System (MSS) undergoing a prescribed translation and rotation are also allowed. The software includes aerodynamic, gravity gradient, and magnetic field models. User-defined controllers can be discrete or continuous. Extensive preprocessing of 'body by body' NASTRAN flex data is built in. A significant aspect, too, is the integrated controls design capability, which includes model reduction and analytic linearization.

  18. Progress on the Multiphysics Capabilities of the Parallel Electromagnetic ACE3P Simulation Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kononenko, Oleksiy

    2015-03-26

    ACE3P is a 3D parallel simulation suite that is being developed at SLAC National Accelerator Laboratory. Effectively utilizing supercomputer resources, ACE3P has become a key tool for the coupled electromagnetic, thermal and mechanical research and design of particle accelerators. Based on the existing finite-element infrastructure, a massively parallel eigensolver is developed for modal analysis of mechanical structures. It complements a set of the multiphysics tools in ACE3P and, in particular, can be used for the comprehensive study of microphonics in accelerating cavities ensuring the operational reliability of a particle accelerator.
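
The modal analysis described above amounts to solving the generalized eigenproblem K x = ω² M x at finite-element scale. A hedged back-of-envelope version of the same computation, for a 2-DOF mass-spring chain small enough to solve by hand (all matrix values invented for illustration):

```python
import math

# Minimal sketch of the modal analysis an eigensolver like ACE3P's performs
# at scale: natural frequencies of a 2-DOF system from det(K - w^2 M) = 0.
def modal_frequencies_2dof(K, M):
    """Solve the generalized eigenproblem K x = w^2 M x for 2x2 matrices
    by reducing to A = M^-1 K and solving its characteristic quadratic."""
    det_M = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    A = [[(M[1][1] * K[0][0] - M[0][1] * K[1][0]) / det_M,
          (M[1][1] * K[0][1] - M[0][1] * K[1][1]) / det_M],
         [(M[0][0] * K[1][0] - M[1][0] * K[0][0]) / det_M,
          (M[0][0] * K[1][1] - M[1][0] * K[0][1]) / det_M]]
    tr = A[0][0] + A[1][1]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    disc = math.sqrt(tr * tr - 4.0 * det)
    lam = sorted([(tr - disc) / 2.0, (tr + disc) / 2.0])
    return [math.sqrt(l) for l in lam]   # angular frequencies, rad/s

# Two unit masses, unit springs: wall--m1--m2 (free end)
K = [[2.0, -1.0], [-1.0, 1.0]]
M = [[1.0, 0.0], [0.0, 1.0]]
w1, w2 = modal_frequencies_2dof(K, M)
```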

  19. Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley; Lung, Shun-fat

    2008-01-01

    An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities based on an optimization problem have been successfully integrated with the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulation for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.
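
Optimization-based model tuning, in its simplest form, adjusts a model parameter until an analytical prediction matches a measurement. A toy version of that idea (the real MDAO tool tunes finite-element models; the 1-DOF system and all numbers here are invented): choose the spring stiffness that makes a single-degree-of-freedom model reproduce a measured natural frequency.

```python
import math

# Toy optimization-based model tuning: for a 1-DOF oscillator the objective
# (f_model(k) - f_measured)^2 is minimized exactly by k = m * (2*pi*f)^2.
def tuned_stiffness(mass_kg, measured_freq_hz):
    omega = 2.0 * math.pi * measured_freq_hz
    return mass_kg * omega ** 2

def natural_freq_hz(mass_kg, k):
    return math.sqrt(k / mass_kg) / (2.0 * math.pi)

k = tuned_stiffness(mass_kg=100.0, measured_freq_hz=5.0)
f = natural_freq_hz(100.0, k)   # the tuned model reproduces the measurement
```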

  20. Akuna: An Open Source User Environment for Managing Subsurface Simulation Workflows

    NASA Astrophysics Data System (ADS)

    Freedman, V. L.; Agarwal, D.; Bensema, K.; Finsterle, S.; Gable, C. W.; Keating, E. H.; Krishnan, H.; Lansing, C.; Moeglein, W.; Pau, G. S. H.; Porter, E.; Scheibe, T. D.

    2014-12-01

    The U.S. Department of Energy (DOE) is investing in the development of a numerical modeling toolset called ASCEM (Advanced Simulation Capability for Environmental Management) to support modeling analyses at legacy waste sites. ASCEM is an open source and modular computing framework that incorporates new advances and tools for predicting contaminant fate and transport in natural and engineered systems. The ASCEM toolset includes both a Platform with Integrated Toolsets (called Akuna) and a High-Performance Computing multi-process simulator (called Amanzi). The focus of this presentation is on Akuna, an open-source user environment that manages subsurface simulation workflows and associated data and metadata. In this presentation, key elements of Akuna are demonstrated, including toolsets for model setup, database management, sensitivity analysis, parameter estimation, uncertainty quantification, and visualization of both model setup and simulation results. A key component of the workflow is the automated job launching and monitoring capability, which allows a user to submit and monitor simulation runs on high-performance, parallel computers. Visualization of large outputs can also be performed without moving data back to local resources. These capabilities make high-performance computing accessible to users who might not be familiar with batch queue systems and usage protocols on different supercomputers and clusters.

  1. The Integrated Medical Model

    NASA Technical Reports Server (NTRS)

    Butler, Douglas J.; Kerstman, Eric

    2010-01-01

    This slide presentation reviews the goals and approach for the Integrated Medical Model (IMM). The IMM is a software decision support tool that forecasts medical events during spaceflight and optimizes medical systems during simulations. It includes information on the software capabilities, program stakeholders, use history, and the software logic.

  2. RFI and SCRIMP Model Development and Verification

    NASA Technical Reports Server (NTRS)

    Loos, Alfred C.; Sayre, Jay

    2000-01-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies for the manufacture of primary composite structures in the aircraft industry as well as in infrastructure. A great deal of work still needs to be done to reduce the costly trial-and-error methods currently used in VARTM processing. A computer simulation model of the VARTM process would provide a cost-effective tool for the manufacture of composites by this technique. Therefore, the objective of this research was to modify an existing three-dimensional Resin Film Infusion (RFI)/Resin Transfer Molding (RTM) model to include VARTM simulation capabilities and to verify this model against the fabrication of aircraft structural composites. An additional objective was to use the VARTM model as a process analysis tool that would enable the user to configure the best process for manufacturing quality composites. Experimental verification of the model was performed by processing several flat composite panels. The parameters verified included flow front patterns and infiltration times. The flow front patterns were determined to be qualitatively accurate, while the simulated infiltration times overpredicted experimental times by 8 to 10%. Capillary and gravitational forces were incorporated into the existing RFI/RTM model in order to simulate VARTM processing physics more accurately. The theoretical capillary pressure showed the capability to reduce the simulated infiltration times by as much as 6%. Gravity, on the other hand, was found to be negligible for all cases. Finally, the VARTM model was used as a process analysis tool. This enabled the user to determine such important process constraints as the location and type of injection ports and the permeability and location of the high-permeability media. A process for a three-stiffener composite panel was proposed.
This configuration evolved from the variation of the process constraints in the modeling of several different composite panels. The configuration was proposed by considering such factors as: infiltration time, the number of vacuum ports, and possible areas of void entrapment.
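
The effect of capillary pressure on infiltration time can be illustrated with a hedged back-of-envelope calculation (not the RFI/RTM code itself). For 1-D Darcy-law filling of a preform of length L, the fill time is t = μφL²/(2kΔP); adding a capillary pressure to the vacuum driving pressure shortens the predicted time. All material values below are invented for illustration.

```python
# 1-D Darcy fill-time sketch: t = (mu * phi * L^2) / (2 * k * dP_total),
# where capillary pressure augments the vacuum driving pressure.
def fill_time(viscosity, porosity, length, permeability, delta_p, p_capillary=0.0):
    return (viscosity * porosity * length ** 2) / (
        2.0 * permeability * (delta_p + p_capillary))

mu = 0.2      # resin viscosity, Pa*s (assumed)
phi = 0.5     # preform porosity (assumed)
L = 0.3       # flow length, m (assumed)
k = 1e-10     # preform permeability, m^2 (assumed)
dp = 1.0e5    # ~1 atm vacuum drive, Pa

t_no_cap = fill_time(mu, phi, L, k, dp)
t_cap = fill_time(mu, phi, L, k, dp, p_capillary=5.0e3)
# A modest capillary pressure shortens the predicted fill time (here ~5%),
# on the order of the reduction the study reports.
```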

  3. Application of the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) for Dynamic Systems Analysis

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey; Zinnecker, Alicia

    2014-01-01

    Systems analysis involves steady-state simulations of combined components to evaluate the steady-state performance, weight, and cost of a system; dynamic considerations are not included until later in the design process. The Dynamic Systems Analysis task, under NASA's Fixed Wing project, is developing the capability for assessing dynamic issues at earlier stages during systems analysis. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) has been developed to design a single-flight-condition controller (defined as altitude and Mach number) and, ultimately, provide an estimate of the closed-loop performance of the engine model. This tool has been integrated with the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS 40k) engine model to demonstrate the additional information TTECTrA makes available for dynamic systems analysis. This dynamic data can be used to evaluate the trade-off between performance and safety, which could not be done with steady-state systems analysis data. TTECTrA has been designed to integrate with any turbine engine model that is compatible with the MATLAB/Simulink (The MathWorks, Inc.) environment.
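
The kind of closed-loop transient estimate described above can be sketched in miniature. This is an illustrative stand-in only (TTECTrA's actual controller design is more involved): a proportional-integral controller closes the loop around a first-order lag representing engine response at one flight condition, and the simulation reports where the output settles after a setpoint step. All gains and time constants are invented.

```python
# Toy closed-loop transient: plant y' = (u - y)/tau with PI control
# u = kp*e + ki*integral(e), stepped from 0 to a unit setpoint.
def closed_loop_response(setpoint, tau=1.0, kp=2.0, ki=1.0, dt=0.01, t_end=10.0):
    y, integ, t = 0.0, 0.0, 0.0
    while t < t_end:
        e = setpoint - y
        integ += e * dt          # integral action removes steady-state error
        u = kp * e + ki * integ
        y += dt * (u - y) / tau  # explicit Euler step of the plant
        t += dt
    return y

final = closed_loop_response(setpoint=1.0)
# The integral term drives the steady-state error toward zero.
```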

  4. Application of the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) for Dynamic Systems Analysis

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey Thomas; Zinnecker, Alicia Mae

    2014-01-01

    Systems analysis involves steady-state simulations of combined components to evaluate the steady-state performance, weight, and cost of a system; dynamic considerations are not included until later in the design process. The Dynamic Systems Analysis task, under NASA's Fixed Wing project, is developing the capability for assessing dynamic issues at earlier stages during systems analysis. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) has been developed to design a single-flight-condition controller (defined as altitude and Mach number) and, ultimately, provide an estimate of the closed-loop performance of the engine model. This tool has been integrated with the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS 40k) engine model to demonstrate the additional information TTECTrA makes available for dynamic systems analysis. This dynamic data can be used to evaluate the trade-off between performance and safety, which could not be done with steady-state systems analysis data. TTECTrA has been designed to integrate with any turbine engine model that is compatible with the MATLAB/Simulink (The MathWorks, Inc.) environment.

  5. An effective approach for road asset management through the FDTD simulation of the GPR signal

    NASA Astrophysics Data System (ADS)

    Benedetto, Andrea; Pajewski, Lara; Adabi, Saba; Kusayanagi, Wolfgang; Tosti, Fabio

    2015-04-01

    Ground-penetrating radar is a non-destructive tool widely used in many fields of application, including pavement engineering surveys. Over the last decade, the need for further breakthroughs capable of assisting end-users and practitioners, as decision-support systems, in more effective road asset management has been increasing. Despite the high potential and the consolidated results obtained over the years by this non-destructive tool, pavement distress manuals are still based on visual inspections, so that only the effects, and not the causes, of faults are generally taken into account. In this framework, the use of simulation can represent an effective solution for supporting engineers and decision-makers in understanding the deep responses of both revealed and unrevealed damage. In this study, the potential of finite-difference time-domain simulation of the ground-penetrating radar signal is analyzed by simulating several types of flexible pavement at different center frequencies of investigation typically used for road surveys. For these purposes, the numerical simulator GprMax2D, implementing the finite-difference time-domain method, was used, proving to be a highly effective tool for detecting road faults. In more detail, comparisons with simplified undisturbed modelled pavement sections were carried out, showing promising agreement with theoretical expectations, and good capability for detecting the shape of damage is demonstrated. Therefore, electromagnetic modelling has proved to represent a valuable support system in diagnosing the causes of damage, even for early or unrevealed faults. Further perspectives of this research will be focused on the modelling of more complex scenarios capable of representing more accurately the real boundary conditions of road cross-sections.
Acknowledgements - This work has benefited from networking activities carried out within the EU funded COST Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar".
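
The basic quantity a simulated GPR trace is interrogated for can be sketched without a full FDTD run. This is a hedged back-of-envelope, not GprMax2D: the two-way travel time of a reflection from a layer interface is t = 2d√εᵣ/c, and inverting it recovers layer thickness; the layer thickness and permittivity below are assumed values typical of asphalt.

```python
import math

# Two-way travel time of a GPR reflection from a pavement layer interface,
# and its inversion back to layer thickness.
C0 = 299_792_458.0  # speed of light in vacuum, m/s

def two_way_travel_time(thickness_m, eps_r):
    return 2.0 * thickness_m * math.sqrt(eps_r) / C0

def layer_thickness(travel_time_s, eps_r):
    return travel_time_s * C0 / (2.0 * math.sqrt(eps_r))

# Assumed 0.25 m asphalt layer with relative permittivity ~6:
t = two_way_travel_time(0.25, 6.0)   # a few nanoseconds
d = layer_thickness(t, 6.0)          # inverts back to 0.25 m
```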

  6. Risk Reduction and Training using Simulation Based Tools - 12180

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, Irin P.

    2012-07-01

    Process Modeling and Simulation (M and S) has been used for many years in manufacturing and similar domains as part of an industrial engineer's tool box. Traditionally, however, this technique has been employed in small, isolated projects where models were created from scratch, often making it time and cost prohibitive. Newport News Shipbuilding (NNS) has recognized the value of this predictive technique and what it offers in terms of risk reduction, cost avoidance, and on-schedule performance of highly complex work. To facilitate implementation, NNS has been maturing a process and the software to rapidly deploy and reuse M and S based decision support tools in a variety of environments. Some examples of successful applications by NNS of this technique in the nuclear domain are a reactor refueling simulation based tool, a fuel handling facility simulation based tool, and a tool for dynamic radiation exposure tracking. The next generation of M and S applications includes expanding simulation based tools into immersive and interactive training. The applications discussed here take a tool box approach to creating simulation based decision support tools for maximum utility and return on investment. This approach involves creating a collection of simulation tools that can be used individually or integrated together for a larger application. The refueling simulation integrates with the fuel handling facility simulation to understand every aspect and dependency of the fuel handling evolutions. This approach translates nicely to other complex domains where real system experimentation is not feasible, such as nuclear fuel lifecycle and waste management. Similar concepts can also be applied to different types of simulation techniques.
For example, a process simulation of liquid waste operations may be useful to streamline and plan operations, while a chemical model of the liquid waste composition is an important tool for making decisions with respect to waste disposition. Integrating these tools into a larger virtual system provides a tool for making larger strategic decisions. The key to integrating and creating these virtual environments is the software and the process used to build them. Although they are important steps toward the use of simulation based tools in the nuclear domain, the applications described here represent only a small cross section of possible benefits. The next generation of applications will likely focus on situational awareness and adaptive planning. Situational awareness refers to the ability to visualize the state of operations in real time. Some useful tools in this area are Geographic Information Systems (GIS), which help monitor and analyze geographically referenced information. Combined with such situational awareness capability, simulation tools can serve as the platform for adaptive planning tools. These are the tools that allow the decision maker to react to the changing environment in real time by synthesizing massive amounts of data into easily understood information. For the nuclear domain, this may mean the creation of Virtual Nuclear Systems, from Virtual Waste Processing Plants to Virtual Nuclear Reactors. (authors)

  7. A Computational Framework for Efficient Low Temperature Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Verma, Abhishek Kumar; Venkattraman, Ayyaswamy

    2016-10-01

    Over the past several years, scientific computing has emerged as an essential tool for the investigation and prediction of low temperature plasma (LTP) applications, which include electronics, nanomaterial synthesis, metamaterials, etc. To further explore LTP behavior with greater fidelity, we present a computational toolbox developed to perform LTP simulations. This framework will allow us to enhance our understanding of multiscale plasma phenomena using high performance computing tools, based mainly on the OpenFOAM FVM distribution. Although aimed at microplasma simulations, the modular framework is able to perform multiscale, multiphysics simulations of physical systems comprising LTPs. Salient introductory features include the capability to perform parallel, 3D simulations of LTP applications on unstructured meshes. Performance of the solver is tested with numerical benchmarks assessing accuracy and efficiency for problems in microdischarge devices. Numerical simulation of a microplasma reactor at atmospheric pressure with hemispherical dielectric-coated electrodes will be discussed, providing an overview of the applicability and future scope of this framework.

  8. Computer simulation: A modern day crystal ball?

    NASA Technical Reports Server (NTRS)

    Sham, Michael; Siprelle, Andrew

    1994-01-01

    It has long been the desire of managers to be able to look into the future and predict the outcome of decisions. With the advent of computer simulation and the tremendous capability provided by personal computers, that desire can now be realized. This paper presents an overview of computer simulation and modeling and discusses the capabilities of Extend. Extend is an icon-driven, Macintosh-based software tool that brings the power of simulation to the average computer user. An example of an Extend-based model is presented in the form of the Space Transportation System (STS) Processing Model. The STS Processing Model produces eight shuttle launches per year, yet it takes only about ten minutes to run. In addition, statistical data such as facility utilization, wait times, and processing bottlenecks are produced. The addition or deletion of resources, such as orbiters or facilities, can be easily modeled and their impact analyzed. Through the use of computer simulation, it is possible to look into the future to see the impact of today's decisions.
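
The flavor of such a processing model can be sketched as a toy discrete-event simulation. This is not the STS Processing Model itself: the single facility, the 45-day arrival spacing, and the 30-day flow duration are all invented for illustration, but the run reports the same kinds of statistics the abstract mentions (throughput and facility utilization).

```python
# Toy single-server FIFO facility: orbiters arrive on a schedule, wait if
# the facility is occupied, and complete after a fixed processing flow.
def simulate_facility(arrival_times, service_time, horizon):
    """Returns (completions within horizon, facility utilization)."""
    clock_free = 0.0
    busy_time = 0.0
    completions = 0
    for t in arrival_times:
        start = max(t, clock_free)       # wait if the facility is occupied
        finish = start + service_time
        if finish <= horizon:
            completions += 1
            busy_time += service_time
            clock_free = finish
    return completions, busy_time / horizon

# One arrival every 45 days, a 30-day flow, over one year (invented numbers):
arrivals = [45.0 * k for k in range(8)]
launches, util = simulate_facility(arrivals, service_time=30.0, horizon=365.0)
# Eight completions per year, with the facility busy about 66% of the time.
```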

  9. PICASSO: an end-to-end image simulation tool for space and airborne imaging systems II. Extension to the thermal infrared: equations and methods

    NASA Astrophysics Data System (ADS)

    Cota, Stephen A.; Lomheim, Terrence S.; Florio, Christopher J.; Harbold, Jeffrey M.; Muto, B. Michael; Schoolar, Richard B.; Wintz, Daniel T.; Keller, Robert A.

    2011-10-01

    In a previous paper in this series, we described how The Aerospace Corporation's Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) tool may be used to model space and airborne imaging systems operating in the visible to near-infrared (VISNIR). PICASSO is a systems-level tool, representative of a class of such tools used throughout the remote sensing community. It is capable of modeling systems over a wide range of fidelity, anywhere from conceptual design level (where it can serve as an integral part of the systems engineering process) to as-built hardware (where it can serve as part of the verification process). In the present paper, we extend the discussion of PICASSO to the modeling of Thermal Infrared (TIR) remote sensing systems, presenting the equations and methods necessary to modeling in that regime.
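
A small piece of the radiometry underlying any TIR image-chain model is the blackbody spectral radiance given by Planck's law. The sketch below is generic physics, not PICASSO's implementation; it shows why ~300 K terrestrial scenes are imaged in the longwave infrared.

```python
import math

# Planck spectral radiance of a blackbody, W / (m^2 * sr * m).
H = 6.62607015e-34   # Planck constant, J*s
C = 299792458.0      # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    b = math.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0
    return a / b

# A 300 K scene radiates far more strongly near 10 um than near 4 um,
# which is why LWIR bands dominate terrestrial thermal imaging.
L_10um = planck_radiance(10e-6, 300.0)
L_4um = planck_radiance(4e-6, 300.0)
```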

  10. Simulating the Composite Propellant Manufacturing Process

    NASA Technical Reports Server (NTRS)

    Williamson, Suzanne; Love, Gregory

    2000-01-01

    There is a strategic interest in understanding how the propellant manufacturing process contributes to military capabilities outside the United States. The paper will discuss how system dynamics (SD) has been applied to rapidly assess the capabilities and vulnerabilities of a specific composite propellant production complex. These facilities produce a commonly used solid propellant with military applications. The authors will explain how an SD model can be configured to match a specific production facility followed by a series of scenarios designed to analyze operational vulnerabilities. By using the simulation model to rapidly analyze operational risks, the analyst gains a better understanding of production complexities. There are several benefits of developing SD models to simulate chemical production. SD is an effective tool for characterizing complex problems, especially the production process where the cascading effect of outages quickly taxes common understanding. By programming expert knowledge into an SD application, these tools are transformed into a knowledge management resource that facilitates rapid learning without requiring years of experience in production operations. It also permits the analyst to rapidly respond to crisis situations and other time-sensitive missions. Most importantly, the quantitative understanding gained from applying the SD model lends itself to strategic analysis and planning.
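
The stock-and-flow structure at the heart of a system dynamics production model can be sketched in a few lines. This is a hedged toy, not the authors' model: production accumulates into an inventory stock, shipments drain it, and a temporary outage shows the cascading draw-down effect the abstract describes. All rates and the outage window are invented.

```python
# Minimal stock-and-flow sketch: inventory integrates (inflow - outflow)
# day by day, with production halted during an outage window.
def simulate_inventory(days, production_rate, shipment_rate, outage=(10, 15)):
    stock, history = 0.0, []
    for day in range(days):
        producing = not (outage[0] <= day < outage[1])
        inflow = production_rate if producing else 0.0
        outflow = min(shipment_rate, stock + inflow)  # can't ship what isn't there
        stock += inflow - outflow
        history.append(stock)
    return history

h = simulate_inventory(days=30, production_rate=5.0, shipment_rate=4.0)
# The five-day outage empties the stock before production rebuilds it.
```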

  11. a Simulation Tool Assisting the Design of a Close Range Photogrammetry System for the Sardinia Radio Telescope

    NASA Astrophysics Data System (ADS)

    Buffa, F.; Pinna, A.; Sanna, G.

    2016-06-01

    The Sardinia Radio Telescope (SRT) is a 64 m diameter antenna whose primary mirror is equipped with an active surface capable of correcting its deformations by means of a dense network of actuators. Close range photogrammetry (CRP) was used to measure the self-load deformations of the SRT primary reflector from its optimal shape, which must be minimized for the radio telescope to operate at full efficiency. In the attempt to achieve such performance, we conceived a near real-time CRP system, which requires the cameras to be installed in fixed positions while avoiding any interference with antenna operations. The design of such a system is not a trivial task, so to assist our decisions we developed a simulation pipeline to realistically reproduce and evaluate photogrammetric surveys of large structures. The described simulation environment consists of (i) a detailed description of the SRT model, including the measurement points and the camera parameters, (ii) a tool capable of generating realistic images according to the above model, and (iii) a self-calibrating bundle adjustment to evaluate the performance of candidate camera configurations in terms of RMSE.
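
The evaluation step of such a pipeline, scoring a camera configuration by reprojection error, can be sketched with an ideal pinhole model. All point coordinates, the focal length, and the noise level below are invented; a real bundle adjustment also estimates the camera parameters rather than assuming them.

```python
import math

# Project target points through an ideal pinhole camera and score a
# configuration by reprojection RMSE in pixels.
def project(point, focal_px, cx, cy):
    x, y, z = point
    return (focal_px * x / z + cx, focal_px * y / z + cy)

def rmse(measured, reprojected):
    se = sum((u1 - u2) ** 2 + (v1 - v2) ** 2
             for (u1, v1), (u2, v2) in zip(measured, reprojected))
    return math.sqrt(se / len(measured))

points = [(1.0, 0.5, 10.0), (-2.0, 1.0, 12.0), (0.0, -1.5, 8.0)]
ideal = [project(p, focal_px=3000.0, cx=2000.0, cy=1500.0) for p in points]
noisy = [(u + 0.2, v - 0.1) for u, v in ideal]   # simulated measurement noise
err_px = rmse(noisy, ideal)                      # ~0.22 px for this noise
```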

  12. CAPS Simulation Environment Development

    NASA Technical Reports Server (NTRS)

    Murphy, Douglas G.; Hoffman, James A.

    2005-01-01

    The final design for an effective Comet/Asteroid Protection System (CAPS) will likely come after a number of competing designs have been simulated and evaluated. Because of the large number of design parameters involved in a system capable of detecting an object, accurately determining its orbit, and diverting the impact threat, a comprehensive simulation environment will be an extremely valuable tool for the CAPS designers. A successful simulation/design tool will aid the user in identifying the critical parameters in the system and eventually allow for automatic optimization of the design once the relationships of the key parameters are understood. A CAPS configuration will consist of space-based detectors whose purpose is to scan the celestial sphere in search of objects likely to make a close approach to Earth and to determine with the greatest possible accuracy the orbits of those objects. Other components of a CAPS configuration may include systems for modifying the orbits of approaching objects, either for the purpose of preventing a collision or for positioning the object into an orbit where it can be studied or used as a mineral resource. The Synergistic Engineering Environment (SEE) is a space-systems design, evaluation, and visualization software tool being leveraged to simulate these aspects of the CAPS study. The long-term goal of the SEE is to provide capabilities to allow the user to build and compare various CAPS designs by running end-to-end simulations that encompass the scanning phase, the orbit determination phase, and the orbit modification phase of a given scenario. Herein, a brief description of the expected simulation phases is provided, the current status and available features of the SEE software system are reported, and examples are shown of how the system is used to build and evaluate a CAPS detection design. Conclusions and the roadmap for future development of the SEE are also presented.

  13. Analyzing JAVAD TR-G2 GPS Receiver's Sensitivities to SLS Trajectory

    NASA Technical Reports Server (NTRS)

    Schuler, Tristan

    2017-01-01

Automated guidance and navigation systems are an integral part of successful space missions. Previous researchers created Python tools to receive and parse data from a JAVAD TR-G2 space-capable GPS receiver. I improved the tool by customizing the output for plotting and comparing several simulations. I analyzed position errors, data loss, and signal loss by comparing simulated receiver data from an IFEN GPS simulator to ‘truth data’ from a proposed trajectory. By adjusting the trajectory simulation’s gain, attitude, and start time, NASA can assess the best time to launch the SLS, where to position the antennas on the Block 1-B, and which filter to use. Some additional testing has begun with the Novatel SpaceQuest GPS receiver as well as a GNSS SDR receiver.

  14. NY TBO Research: Integrated Demand Management (IDM): IDM Concept, Tools, and Training Package

    NASA Technical Reports Server (NTRS)

    Smith, Nancy

    2016-01-01

A series of human-in-the-loop simulation sessions was conducted in the Airspace Operations Laboratory (AOL) to evaluate a new traffic management concept called Integrated Demand Management (IDM). The simulation explored how to address chronic equity, throughput, and delay issues associated with New York's high-volume airports by operationally integrating three current and NextGen capabilities, the Collaborative Trajectory Options Program (CTOP), Time-Based Flow Management (TBFM), and Required Time of Arrival (RTA), in order to better manage traffic demand within the National Air Traffic System. A package of presentation slides was developed to describe the concept, tools, and training materials used in the simulation sessions. The package will be used to outbrief our stakeholders, both through oral presentations and by disseminating the materials via email.

  15. 10 CFR 434.521 - The simulation tool.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... of the building including night setback during various times of the year; and 521.1.5Energy consumption information at a level necessary to determine the Energy Cost Budget and Design Energy Cost... buildings. In addition, models shall be capable of translating the Design Energy Consumption into energy...

  16. 10 CFR 434.521 - The simulation tool.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... of the building including night setback during various times of the year; and 521.1.5Energy consumption information at a level necessary to determine the Energy Cost Budget and Design Energy Cost... buildings. In addition, models shall be capable of translating the Design Energy Consumption into energy...

  17. 10 CFR 434.521 - The simulation tool.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... of the building including night setback during various times of the year; and 521.1.5Energy consumption information at a level necessary to determine the Energy Cost Budget and Design Energy Cost... buildings. In addition, models shall be capable of translating the Design Energy Consumption into energy...

  18. 10 CFR 434.521 - The simulation tool.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... of the building including night setback during various times of the year; and 521.1.5 Energy consumption information at a level necessary to determine the Energy Cost Budget and Design Energy Cost..., and ground-coupled buildings. In addition, models shall be capable of translating the Design Energy...

  19. 10 CFR 434.521 - The simulation tool.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... of the building including night setback during various times of the year; and 521.1.5Energy consumption information at a level necessary to determine the Energy Cost Budget and Design Energy Cost... buildings. In addition, models shall be capable of translating the Design Energy Consumption into energy...

  20. Modeling and Controls Development of 48V Mild Hybrid Electric Vehicles

    EPA Science Inventory

    The Advanced Light-Duty Powertrain and Hybrid Analysis tool (ALPHA) was created by EPA to evaluate the Greenhouse Gas (GHG) emissions of Light-Duty (LD) vehicles. It is a physics-based, forward-looking, full vehicle computer simulator capable of analyzing various vehicle types c...

  1. ASCEM Data Browser (ASCEMDB) v0.8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ROMOSAN, ALEXANDRU

    Data management tool designed for the Advanced Simulation Capability for Environmental Management (ASCEM) framework. Distinguishing features of this gateway include: (1) handling of complex geometry data, (2) an advanced selection mechanism, (3) state-of-the-art rendering of spatiotemporal data records, and (4) seamless integration with a distributed workflow engine.

  2. Computer Simulation Performed for Columbia Project Cooling System

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,024 Intel Itanium processors) system. The simulation assessed the performance of the cooling system, identified deficiencies, and recommended modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses of any modern computer room.

  3. Damage Simulation in Composite Materials: Why It Matters and What Is Happening Currently at NASA in This Area

    NASA Technical Reports Server (NTRS)

    McElroy, Mack; de Carvalho, Nelson; Estes, Ashley; Lin, Shih-yung

    2017-01-01

    Use of lightweight composite materials in space and aircraft structure designs is often challenging due to high costs associated with structural certification. Of primary concern in the use of composite structures are durability and damage tolerance. This concern is due to the inherent susceptibility of composite materials to both fabrication- and service-induced flaws. Due to a lack of generally accepted industry analysis tools applicable to composite damage simulation, a certification procedure relies almost entirely on testing. It is this reliance on testing, especially compared to structures composed of legacy metallic materials where damage simulation tools are available, that drives the cost of using composite materials in aerospace structures. The observation that use of composites can be expensive due to testing requirements is not new and, as such, research on analysis tools for simulating damage in composite structures has been occurring for several decades. A convenient approach many researchers/model-developers in this area have taken is to select a specific problem relevant to aerospace structural certification and develop a model that is accurate within that scope. Some examples are open-hole tension tests, compression-after-impact tests, low-velocity impact, damage tolerance of an embedded flaw, and fatigue crack growth, to name a few. Based on the premise that running analyses is cheaper than running tests, one motivation many researchers in this area share is that if generally applicable and reliable damage simulation tools were available, the dependence on certification testing could be lessened, thereby reducing overall design cost. It is generally accepted that simulation tools applied in this manner would still need to be thoroughly validated and that composite testing will never be completely replaced by analysis.
Research and development is currently occurring at NASA to create numerical damage simulation tools applicable to damage in composites. The Advanced Composites Project (ACP) at NASA Langley has supported the development of composites damage simulation tools in a consortium of aerospace companies with a goal of reducing the certification time of a commercial aircraft by 30%. And while the scope of ACP does not include spacecraft, much of the methodology and simulation capabilities can apply to spacecraft certification in the Space Launch System and Orion programs as well. Some specific applications of composite damage simulation models in a certification program are (1) evaluation of damage during service when maintenance may be difficult or impossible, (2) a tool for early design iterations, (3) gaining insight into a particular damage process and applying this insight towards a test coupon or structural design, and (4) analysis of damage scenarios that are difficult or impossible to recreate in a test. As analysis capabilities improve, these applications and more will become realized resulting in a reduction in cost for use of composites in aerospace vehicles. NASA is engaged in this process from both research and application perspectives. In addition to the background information discussed previously, this presentation covers a look at recent research at NASA in this area and some current/potential applications in the Orion program.

  4. Monte Carlo capabilities of the SCALE code system

    DOE PAGES

    Rearden, Bradley T.; Petrie, Jr., Lester M.; Peplow, Douglas E.; ...

    2014-09-12

    SCALE is a broadly used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a “plug-and-play” framework that includes three deterministic and three Monte Carlo radiation transport solvers that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE’s graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2 will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. Finally, an overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2.
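The central idea behind any Monte Carlo transport solver like those in SCALE is sampling particle free-flight distances from an exponential distribution. The following sketch estimates uncollided transmission through a slab and checks it against the analytic answer; the geometry and cross section are invented for illustration and bear no relation to SCALE's actual solvers or data.

```python
import math
import random

def transmission_mc(sigma_t, thickness, n, seed=1):
    """Monte Carlo estimate of uncollided transmission through a slab.

    Each particle samples a free-flight distance from an exponential
    distribution with macroscopic cross section sigma_t (1/cm); it is
    transmitted if that distance exceeds the slab thickness.
    """
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if -math.log(1.0 - rng.random()) / sigma_t > thickness)
    return hits / n

sigma_t, thickness = 0.5, 2.0                 # 1/cm, cm (illustrative)
estimate = transmission_mc(sigma_t, thickness, 200_000)
analytic = math.exp(-sigma_t * thickness)     # Beer-Lambert attenuation
```

With 200,000 histories the statistical uncertainty is roughly 0.1%, so the estimate should agree with exp(-Σt·x) to well within 0.01; production codes layer variance reduction, geometry tracking, and tallies on top of this same sampling loop.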

  5. Teaching tactical combat casualty care using the TC3 sim game-based simulation: a study to measure training effectiveness.

    PubMed

    Sotomayor, Teresita M

    2010-01-01

    The effectiveness of games as instructional tools has been debated over the past several decades, largely because of the lack of empirical data to support such claims. The U.S. Army developed a game-based simulation to support Tactical Combat Casualty Care (TCCC) training. The TC3 game-based simulation is a first-person game that allows a Soldier to play the role of a combat medic during an infantry squad mission in an urban environment. This research documents results from a training effectiveness evaluation conducted at the Department of Combat Medic Training (Ft. Sam Houston) in an effort to explore the capability of the game-based simulation as a potential tool to support the TCCC program of instruction. Reaction to training, as well as acquisition of knowledge and transfer of skills, were explored using Kirkpatrick's Model of Training Effectiveness Evaluation. Results from the evaluation are discussed.

  6. XIMPOL: a new x-ray polarimetry observation-simulation and analysis framework

    NASA Astrophysics Data System (ADS)

    Omodei, Nicola; Baldini, Luca; Pesce-Rollins, Melissa; di Lalla, Niccolò

    2017-08-01

    We present a new simulation framework, XIMPOL, based on the Python programming language and the SciPy stack, specifically developed for X-ray polarimetric applications. XIMPOL is not tied to any specific mission or instrument design and is meant to produce fast yet realistic observation-simulations, given two basic inputs: (i) an arbitrary source model including morphological, temporal, spectral, and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function, and the modulation factor. The format of the response files is OGIP compliant, and the framework can produce output files that can be fed directly into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating physical systems but also for developing and testing end-to-end analysis chains.

  7. Improvements to Nuclear Data and Its Uncertainties by Theoretical Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danon, Yaron; Nazarewicz, Witold; Talou, Patrick

    2013-02-18

    This project addresses three important gaps in existing evaluated nuclear data libraries that represent a significant hindrance against highly advanced modeling and simulation capabilities for the Advanced Fuel Cycle Initiative (AFCI). This project will: Develop advanced theoretical tools to compute prompt fission neutrons and gamma-ray characteristics well beyond average spectra and multiplicity, and produce new evaluated files of U and Pu isotopes, along with some minor actinides; Perform state-of-the-art fission cross-section modeling and calculations using global and microscopic model input parameters, leading to truly predictive fission cross-sections capabilities. Consistent calculations for a suite of Pu isotopes will be performed; Implement innovative data assimilation tools, which will reflect the nuclear data evaluation process much more accurately, and lead to a new generation of uncertainty quantification files. New covariance matrices will be obtained for Pu isotopes and compared to existing ones. The deployment of a fleet of safe and efficient advanced reactors that minimize radiotoxic waste and are proliferation-resistant is a clear and ambitious goal of AFCI. While in the past the design, construction and operation of a reactor were supported through empirical trials, this new phase in nuclear energy production is expected to rely heavily on advanced modeling and simulation capabilities. To be truly successful, a program for advanced simulations of innovative reactors will have to develop advanced multi-physics capabilities, to be run on massively parallel super-computers, and to incorporate adequate and precise underlying physics. And all these areas have to be developed simultaneously to achieve those ambitious goals. Of particular interest are reliable fission cross-section uncertainty estimates (including important correlations) and evaluations of prompt fission neutrons and gamma-ray spectra and uncertainties.

  8. A Three Dimensional Parallel Time Accurate Turbopump Simulation Procedure Using Overset Grid Systems

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin; Chan, William; Kwak, Dochan

    2001-01-01

    The objective of the current effort is to provide a computational framework for design and analysis of the entire fuel supply system of a liquid rocket engine, including high-fidelity unsteady turbopump flow analysis. This capability is needed to support the design of pump sub-systems for advanced space transportation vehicles that are likely to involve liquid propulsion systems. To date, computational tools for design/analysis of turbopump flows are based on relatively lower fidelity methods. An unsteady, three-dimensional viscous flow analysis tool involving stationary and rotational components for the entire turbopump assembly has not been available for real-world engineering applications. The present effort provides developers with information on transient flow phenomena, such as start-up behavior and non-uniform inflows, that will eventually affect system vibration and structures. In this paper, the progress toward the capability of complete simulation of the turbopump for a liquid rocket engine is reported. The Space Shuttle Main Engine (SSME) turbopump is used as a test case for evaluation of the hybrid MPI/Open-MP and MLP versions of the INS3D code. CAD-to-solution auto-scripting capability is being developed for turbopump applications. The relative motion of the grid systems for the rotor-stator interaction was obtained using overset grid techniques. Unsteady computations for the SSME turbopump, which contains 114 zones with 34.5 million grid points, are carried out on Origin 3000 systems at NASA Ames Research Center. Results from these time-accurate simulations with moving boundary capability will be presented along with the performance of parallel versions of the code.

  9. Benchmarking a Visual-Basic based multi-component one-dimensional reactive transport modeling tool

    NASA Astrophysics Data System (ADS)

    Torlapati, Jagadish; Prabhakar Clement, T.

    2013-01-01

    We present the details of a comprehensive numerical modeling tool, RT1D, which can be used for simulating biochemical and geochemical reactive transport problems. The code can be run within the standard Microsoft EXCEL Visual Basic platform, and it does not require any additional software tools. The code can be easily adapted by others for simulating different types of laboratory-scale reactive transport experiments. We illustrate the capabilities of the tool by solving five benchmark problems with varying levels of reaction complexity. These literature-derived benchmarks are used to highlight the versatility of the code for solving a variety of practical reactive transport problems. The benchmarks are described in detail to provide a comprehensive database, which can be used by model developers to test other numerical codes. The VBA code presented in the study is a practical tool that can be used by laboratory researchers for analyzing both batch and column datasets within an EXCEL platform.
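The governing balance a one-dimensional reactive transport code like RT1D solves can be illustrated with a few lines of code. The sketch below is not the RT1D code (which runs in Excel VBA); it is a hedged Python illustration of an explicit upwind finite-difference scheme for advection, dispersion, and first-order decay, with invented column parameters.

```python
def transport_1d(c0, v, D, k, dx, dt, steps):
    """Explicit upwind finite-difference solver for
    dc/dt = -v dc/dx + D d2c/dx2 - k c   (first-order decay).

    Stable when v*dt/dx and D*dt/dx^2 are small (here 0.1 and 0.01).
    """
    c = list(c0)
    n = len(c)
    for _ in range(steps):
        new = c[:]
        for i in range(1, n - 1):
            adv = -v * (c[i] - c[i - 1]) / dx           # upwind advection
            disp = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx ** 2
            new[i] = c[i] + dt * (adv + disp - k * c[i])
        new[0] = c[0]        # fixed-concentration inlet
        new[-1] = new[-2]    # zero-gradient outlet
        c = new
    return c

# Column with a constant-concentration inlet and initially clean water
n = 50
profile = transport_1d([1.0] + [0.0] * (n - 1),
                       v=0.1, D=0.01, k=0.0, dx=1.0, dt=1.0, steps=100)
```

After 100 steps the front has advected about 10 cells downstream, giving the classic smeared breakthrough profile; adding reaction terms per species is what turns a solver like this into a biogeochemical benchmark tool.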

  10. Telescience - Concepts And Contributions To The Extreme Ultraviolet Explorer Mission

    NASA Astrophysics Data System (ADS)

    Marchant, Will; Dobson, Carl; Chakrabarti, Supriya; Malina, Roger F.

    1987-10-01

    A goal of the telescience concept is to allow scientists to use remotely located instruments as they would in their laboratory. Another goal is to increase reliability and scientific return of these instruments. In this paper we discuss the role of transparent software tools in development, integration, and postlaunch environments to achieve hands-on access to the instrument. The use of transparent tools helps to reduce the parallel development of capability and to assure that valuable pre-launch experience is not lost in the operations phase. We also discuss the use of simulation as a rapid prototyping technique. Rapid prototyping provides a cost-effective means of using an iterative approach to instrument design. By allowing inexpensive production of testbeds, scientists can quickly tune the instrument to produce the desired scientific data. Using portions of the Extreme Ultraviolet Explorer (EUVE) system, we examine some of the results of preliminary tests in the use of simulation and transparent tools. Additionally, we discuss our efforts to upgrade our software "EUVE electronics" simulator to emulate a full instrument, and give the pros and cons of the simulation facilities we have developed.

  11. Orion Entry, Descent, and Landing Simulation

    NASA Technical Reports Server (NTRS)

    Hoelscher, Brian R.

    2007-01-01

    The Orion Entry, Descent, and Landing simulation was created over the past two years to serve as the primary Crew Exploration Vehicle guidance, navigation, and control (GN&C) design and analysis tool at the National Aeronautics and Space Administration (NASA). The Advanced NASA Technology Architecture for Exploration Studies (ANTARES) simulation is a six degree-of-freedom tool with a unique design architecture which has a high level of flexibility. This paper describes the decision history and motivations that guided the creation of this simulation tool. The capabilities of the models within ANTARES are presented in detail. Special attention is given to features of the highly flexible GN&C architecture and the details of the implemented GN&C algorithms. ANTARES provides a foundation simulation for the Orion Project that has already been successfully used for requirements analysis, system definition analysis, and preliminary GN&C design analysis. ANTARES will find useful application in engineering analysis, mission operations, crew training, avionics-in-the-loop testing, etc. This paper focuses on the entry simulation aspect of ANTARES, which is part of a bigger simulation package supporting the entire mission profile of the Orion vehicle. The unique aspects of entry GN&C design are covered, including how the simulation is being used for Monte Carlo dispersion analysis and for support of linear stability analysis. Sample simulation output from ANTARES is presented in an appendix.

  12. Mean Line Pump Flow Model in Rocket Engine System Simulation

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.; Lavelle, Thomas M.

    2000-01-01

    A mean line pump flow modeling method has been developed to provide a fast capability for modeling turbopumps of rocket engines. Based on this method, a mean line pump flow code PUMPA has been written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The pump code can model axial flow inducers, mixed-flow and centrifugal pumps. The code can model multistage pumps in series. The code features rapid input setup and short computer run times, making it an effective analysis and conceptual design tool. The map generation capability of the code provides the map information needed for interfacing with a rocket engine system modeling code. The off-design and multistage modeling capabilities of the code permit parametric design space exploration of candidate pump configurations and provide pump performance data for engine system evaluation. The PUMPA code has been integrated with the Numerical Propulsion System Simulation (NPSS) code and an expander rocket engine system has been simulated. The mean line pump flow code runs as an integral part of the NPSS rocket engine system simulation and provides key pump performance information directly to the system model at all operating conditions.
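The starting point of any mean line pump analysis is the Euler turbomachine equation, which relates ideal head rise to blade speeds and tangential flow velocities at the mean line. The sketch below is a generic illustration of that relation, not PUMPA's implementation; the stage geometry and velocities are invented.

```python
import math

def euler_head(rpm, r_tip, c_theta2, c_theta1=0.0, r_in=0.0):
    """Ideal pump head rise (m) from the Euler turbomachine equation:
    H = (U2*Ctheta2 - U1*Ctheta1) / g, with U = omega * r."""
    g = 9.80665
    omega = rpm * 2.0 * math.pi / 60.0   # shaft speed, rad/s
    u2 = omega * r_tip                   # blade speed at exit
    u1 = omega * r_in                    # blade speed at inlet
    return (u2 * c_theta2 - u1 * c_theta1) / g

# Hypothetical centrifugal stage: 5000 rpm, 0.1 m tip radius,
# 40 m/s exit tangential velocity, zero inlet swirl
head = euler_head(5000.0, 0.1, 40.0)
```

A mean line code builds on this by subtracting empirical loss models (incidence, friction, diffusion) from the Euler head to produce the actual head-flow map handed to the engine system model.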

  13. Multiscale Data Assimilation for Large-Eddy Simulations

    NASA Astrophysics Data System (ADS)

    Li, Z.; Cheng, X.; Gustafson, W. I., Jr.; Xiao, H.; Vogelmann, A. M.; Endo, S.; Toto, T.

    2017-12-01

    Large-eddy simulation (LES) is a powerful tool for understanding atmospheric turbulence, boundary layer physics and cloud development, and there is a great need for developing data assimilation methodologies that can constrain LES models. The U.S. Department of Energy Atmospheric Radiation Measurement (ARM) User Facility has been developing the capability to routinely generate ensembles of LES. The LES ARM Symbiotic Simulation and Observation (LASSO) project (https://www.arm.gov/capabilities/modeling/lasso) is generating simulations for shallow convection days at the ARM Southern Great Plains site in Oklahoma. One of the major objectives of LASSO is to develop the capability to observationally constrain LES using a hierarchy of ARM observations. We have implemented a multiscale data assimilation (MSDA) scheme, which allows data assimilation to be implemented separately for distinct spatial scales, so that the localized observations can be effectively assimilated to constrain the mesoscale fields in the LES area of about 15 km in width. The MSDA analysis is used to produce forcing data that drive LES. With this LES workflow we have examined 13 days with shallow convection selected from the period May-August 2016. We will describe the implementation of MSDA, present LES results, and address challenges and opportunities for applying data assimilation to LES studies.
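The essence of a scale-separated assimilation step is to split the innovation (observation minus background) into large- and small-scale parts and correct only the scales the observations can constrain. The toy sketch below does this in one dimension with a boxcar low-pass filter; it is an invented illustration of the idea, not the LASSO MSDA implementation.

```python
def moving_average(x, window):
    """Boxcar low-pass filter; edges use a shrinking window."""
    n = len(x)
    half = window // 2
    return [sum(x[max(0, i - half):min(n, i + half + 1)]) /
            (min(n, i + half + 1) - max(0, i - half)) for i in range(n)]

def multiscale_nudge(background, observation, window, weight):
    """Nudge the background toward the observation using only the
    large-scale (low-pass) part of the innovation, leaving the
    small-scale eddy structure of the background untouched."""
    innovation = [o - b for o, b in zip(observation, background)]
    large_scale = moving_average(innovation, window)
    return [b + weight * l for b, l in zip(background, large_scale)]

# Background with eddies but a 2-unit mesoscale bias vs. observations
background = [float(i % 5) for i in range(40)]
observation = [2.0 + float(i % 5) for i in range(40)]
analysis = multiscale_nudge(background, observation, window=5, weight=1.0)
```

Because the innovation here is a pure large-scale offset, the analysis removes the bias exactly while preserving the eddy pattern, which is the behavior wanted when mesoscale forcing, not turbulence, is being constrained.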

  14. Test-Analysis Correlation for Space Shuttle External Tank Foam Impacting RCC Wing Leading Edge Component Panels

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.

    2008-01-01

    The Space Shuttle Columbia Accident Investigation Board recommended that NASA develop, validate, and maintain a modeling tool capable of predicting the damage threshold for debris impacts on the Space Shuttle Reinforced Carbon-Carbon (RCC) wing leading edge and nosecap assembly. The results presented in this paper are one part of a multi-level approach that supported the development of the predictive tool used to recertify the shuttle for flight following the Columbia Accident. The assessment of predictive capability was largely based on test analysis comparisons for simpler component structures. This paper provides comparisons of finite element simulations with test data for external tank foam debris impacts onto 6-in. square RCC flat panels. Both quantitative displacement and qualitative damage assessment correlations are provided. The comparisons show good agreement and provided the Space Shuttle Program with confidence in the predictive tool.

  15. Modeling and Validation of Lithium-ion Automotive Battery Packs (SAE 2013-01-1539)

    EPA Science Inventory

    The Advanced Light-Duty Powertrain and Hybrid Analysis (ALPHA) tool was created by EPA to evaluate the Greenhouse Gas (GHG) emissions of Light-Duty (LD) vehicles. It is a physics-based, forward-looking, full vehicle computer simulator capable of analyzing various vehicle types c...

  16. Benchmarking and Modeling of a Conventional Mid-Size Car Using ALPHA (SAE Paper 2015-01-1140)

    EPA Science Inventory

    The Advanced Light-Duty Powertrain and Hybrid Analysis (ALPHA) modeling tool was created by EPA to estimate greenhouse gas (GHG) emissions of light-duty vehicles. ALPHA is a physics-based, forward-looking, full vehicle computer simulation capable of analyzing various vehicle type...

  17. Teaching the Concept of Gibbs Energy Minimization through Its Application to Phase-Equilibrium Calculation

    ERIC Educational Resources Information Center

    Privat, Romain; Jaubert, Jean-Noël; Berger, Etienne; Coniglio, Lucie; Lemaitre, Cécile; Meimaroglou, Dimitrios; Warth, Valérie

    2016-01-01

    Robust and fast methods for chemical or multiphase equilibrium calculation are routinely needed by chemical-process engineers working on sizing or simulation aspects. Yet, while industrial applications essentially require calculation tools capable of discriminating between stable and nonstable states and converging to nontrivial solutions,…
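The record above is truncated, but the kind of phase-equilibrium calculation it describes can be illustrated with a minimal two-phase flash. The sketch below solves the Rachford-Rice equation (the stationarity condition of the two-phase Gibbs energy at fixed K-values) by bisection; the feed composition and K-values are invented for the example.

```python
def rachford_rice(z, K, tol=1e-12):
    """Vapor fraction beta solving the Rachford-Rice equation
    sum_i z_i (K_i - 1) / (1 + beta (K_i - 1)) = 0 by bisection.
    The objective is monotonically decreasing in beta on [0, 1]."""
    def f(beta):
        return sum(zi * (Ki - 1.0) / (1.0 + beta * (Ki - 1.0))
                   for zi, Ki in zip(z, K))
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            lo = mid            # root lies above mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical binary feed: 60/40 mol%, K-values 2.0 and 0.5
z, K = [0.6, 0.4], [2.0, 0.5]
beta = rachford_rice(z, K)
x = [zi / (1.0 + beta * (Ki - 1.0)) for zi, Ki in zip(z, K)]  # liquid
y = [Ki * xi for Ki, xi in zip(K, x)]                          # vapor
```

For this feed the analytic solution is beta = 0.8 with x = (1/3, 2/3) and y = (2/3, 1/3). A full Gibbs-minimization method additionally tests phase stability, which is exactly the discrimination between stable and nonstable states the abstract refers to.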

  18. A high-fidelity, six-degree-of-freedom batch simulation environment for tactical guidance research and evaluation

    NASA Technical Reports Server (NTRS)

    Goodrich, Kenneth H.

    1993-01-01

    A batch air combat simulation environment, the tactical maneuvering simulator (TMS), is presented. The TMS is a tool for developing and evaluating tactical maneuvering logics, but it can also be used to evaluate the tactical implications of perturbations to aircraft performance or supporting systems. The TMS can simulate air combat between any number of engagement participants, with practical limits imposed by computer memory and processing power. Aircraft are modeled using equations of motion, control laws, aerodynamics, and propulsive characteristics equivalent to those used in high-fidelity piloted simulations. Databases representative of a modern high-performance aircraft with and without thrust-vectoring capability are included. To simplify the task of developing and implementing maneuvering logics in the TMS, an outer-loop control system, the tactical autopilot (TA), is implemented in the aircraft simulation model. The TA converts guidance commands issued by computerized maneuvering logics, in the form of desired angle-of-attack and wind-axis bank-angle inputs, into commands for the inner-loop control augmentation system of the aircraft. The capabilities and operation of the TMS and the TA are described.

  19. Modeling and performance assessment in QinetiQ of EO and IR airborne reconnaissance systems

    NASA Astrophysics Data System (ADS)

    Williams, John W.; Potter, Gary E.

    2002-11-01

    QinetiQ are the technical authority responsible for specifying the performance requirements for the procurement of airborne reconnaissance systems, on behalf of the UK MoD. They are also responsible for acceptance of delivered systems, overseeing and verifying the installed system performance as predicted and then assessed by the contractor. Measures of functional capability are central to these activities. The conduct of these activities utilises the broad technical insight and wide range of analysis tools and models available within QinetiQ. This paper focuses on the tools, methods and models that are applicable to systems based on EO and IR sensors. The tools, methods and models are described, and representative output for systems that QinetiQ has been responsible for is presented. The principal capability applicable to EO and IR airborne reconnaissance systems is the STAR (Simulation Tools for Airborne Reconnaissance) suite of models. STAR generates predictions of performance measures such as GRD (Ground Resolved Distance) and GIQE (General Image Quality Equation) NIIRS (National Imagery Interpretability Rating Scale). It also generates images representing sensor output, using the scene generation software CAMEO-SIM and the imaging sensor model EMERALD. The simulated image 'quality' is fully correlated with the predicted non-imaging performance measures. STAR also generates image and table data that is compliant with STANAG 7023, which may be used to test ground station functionality.
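The NIIRS prediction mentioned above is a published regression. The sketch below implements the commonly cited GIQE version 4 form with its standard coefficients; the sensor inputs are invented, and this is an illustration of the equation rather than STAR's implementation.

```python
import math

def giqe4_niirs(gsd_in, rer, h, g, snr):
    """GIQE version 4 NIIRS prediction (Leachtenauer et al., 1997 form).

    gsd_in: geometric-mean ground sample distance in inches;
    rer:    geometric-mean relative edge response;
    h:      edge-overshoot term; g: noise gain; snr: signal-to-noise.
    Coefficients a, b switch at RER = 0.9 per the published equation.
    """
    a, b = (3.32, 1.559) if rer >= 0.9 else (3.16, 2.817)
    return (10.251 - a * math.log10(gsd_in) + b * math.log10(rer)
            - 0.656 * h - 0.344 * g / snr)

# Hypothetical sensor: 10-inch GSD, RER 0.9, unit overshoot/gain, SNR 50
niirs = giqe4_niirs(10.0, 0.9, 1.0, 1.0, 50.0)
```

The dominant term is the log of GSD, so halving the ground sample distance buys about one NIIRS level, which is why GRD and NIIRS predictions track each other so closely in reconnaissance performance assessment.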

  20. MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories.

    PubMed

    McGibbon, Robert T; Beauchamp, Kyle A; Harrigan, Matthew P; Klein, Christoph; Swails, Jason M; Hernández, Carlos X; Schwantes, Christian R; Wang, Lee-Ping; Lane, Thomas J; Pande, Vijay S

    2015-10-20

    As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python.
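The "minimal root-mean-square-deviation" the abstract mentions is the RMSD after optimal superposition, classically computed with the Kabsch algorithm. The sketch below implements it with NumPy on synthetic coordinates; it is an illustration of the underlying calculation, not MDTraj's optimized code.

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """Minimum RMSD between two (n_atoms, 3) coordinate arrays after
    optimal translation and rotation (Kabsch algorithm) -- the same
    quantity a call like MDTraj's md.rmsd reports for a frame pair."""
    P = P - P.mean(axis=0)                  # remove translation
    Q = Q - Q.mean(axis=0)
    V, _, Wt = np.linalg.svd(P.T @ Q)       # SVD of the covariance
    d = np.sign(np.linalg.det(V @ Wt))      # guard against reflections
    rot = V @ np.diag([1.0, 1.0, d]) @ Wt   # optimal proper rotation
    diff = P @ rot - Q
    return float(np.sqrt((diff ** 2).sum() / len(P)))

# A rigidly rotated and translated copy should superpose exactly
rng = np.random.default_rng(0)
frame = rng.normal(size=(20, 3))
theta = 0.3
rot_z = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0,            0.0,           1.0]])
moved = frame @ rot_z.T + np.array([1.0, -2.0, 0.5])
rmsd = kabsch_rmsd(frame, moved)
```

Since the second frame differs only by a rigid motion, the minimal RMSD is zero to machine precision, a handy sanity check before trusting the metric on real trajectories.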

  1. MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories

    PubMed Central

    McGibbon, Robert T.; Beauchamp, Kyle A.; Harrigan, Matthew P.; Klein, Christoph; Swails, Jason M.; Hernández, Carlos X.; Schwantes, Christian R.; Wang, Lee-Ping; Lane, Thomas J.; Pande, Vijay S.

    2015-01-01

    As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. PMID:26488642
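The root-mean-square-deviation calculation that MDTraj provides can be illustrated in plain Python. Note this sketch computes the unaligned RMSD; MDTraj's minimal-RMSD routine additionally finds the optimal superposition (rotation/translation) first:

```python
import math

def rmsd(frame_a, frame_b):
    """Unaligned RMSD between two conformations, each given as a list of
    (x, y, z) atom coordinates. No superposition is performed."""
    assert len(frame_a) == len(frame_b), "frames must have the same atom count"
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(frame_a, frame_b))
    return math.sqrt(sq / len(frame_a))

# two-atom toy system: second atom displaced by 1.0 along y
frame_ref = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
frame_t = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0)]
value = rmsd(frame_ref, frame_t)
```

Here one of two atoms moves by 1.0, so the RMSD is sqrt(1/2) ≈ 0.707.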

  2. A dynamic simulation based water resources education tool.

    PubMed

    Williams, Alison; Lansey, Kevin; Washburne, James

    2009-01-01

    Educational tools to assist the public in recognizing impacts of water policy in a realistic context are not generally available. This project developed systems with modeling-based educational decision support simulation tools to satisfy this need. The goal of this model is to teach undergraduate students and the general public about the implications of common water management alternatives so that they can better understand or become involved in water policy and make more knowledgeable personal or community decisions. The model is based on Powersim, a dynamic simulation software package capable of producing web-accessible, intuitive, graphic, user-friendly interfaces. Modules are included to represent residential, agricultural, industrial, and turf uses, as well as non-market values, water quality, reservoir, flow, and climate conditions. Supplementary materials emphasize important concepts and lead learners through the model, culminating in an open-ended water management project. The model is used in a University of Arizona undergraduate class and within the Arizona Master Watershed Stewards Program. Evaluation results demonstrated improved understanding of concepts and system interactions, fulfilling the project's objectives.
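A stock-and-flow water balance of the kind Powersim expresses graphically can be sketched in a few lines; the single-reservoir balance, constant demand, and bounds below are illustrative assumptions, not the project's actual module structure:

```python
def simulate_reservoir(storage, inflows, demand, capacity, dt=1.0):
    """Euler-step water balance for one reservoir stock. Overflow above
    capacity spills (is discarded) and the stock cannot go negative."""
    history = [storage]
    for inflow in inflows:
        storage += dt * (inflow - demand)          # net flow into the stock
        storage = max(0.0, min(storage, capacity)) # physical bounds
        history.append(storage)
    return history

# three time steps of 10 units inflow against a 15-unit demand
trajectory = simulate_reservoir(100.0, [10.0, 10.0, 10.0],
                                demand=15.0, capacity=200.0)
```

With demand exceeding inflow by 5 units per step, storage drains from 100 to 85, the kind of drawdown trend a learner would explore interactively.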

  3. Atomdroid: a computational chemistry tool for mobile platforms.

    PubMed

    Feldt, Jonas; Mata, Ricardo A; Dieterich, Johannes M

    2012-04-23

    We present the implementation of a new molecular mechanics program designed for use in mobile platforms, the first specifically built for these devices. The software is designed to run on Android operating systems and is compatible with several modern tablet-PCs and smartphones available in the market. It includes molecular viewer/builder capabilities with integrated routines for geometry optimizations and Monte Carlo simulations. These functionalities allow it to work as a stand-alone tool. We discuss some particular development aspects, as well as the overall feasibility of using computational chemistry software packages in mobile platforms. Benchmark calculations show that through efficient implementation techniques even hand-held devices can be used to simulate midsized systems using force fields.
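The Monte Carlo simulation capability mentioned can be sketched with a Metropolis sampler; the one-dimensional harmonic "energy" below stands in for a force-field evaluation and is not Atomdroid's implementation:

```python
import math
import random

def metropolis(energy, x0, steps, beta=1.0, step_size=0.5, seed=42):
    """Minimal Metropolis Monte Carlo walk over a 1-D coordinate:
    accept downhill moves always, uphill moves with Boltzmann probability."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    samples = []
    for _ in range(steps):
        trial = x + rng.uniform(-step_size, step_size)
        e_trial = energy(trial)
        if e_trial <= e or rng.random() < math.exp(-beta * (e_trial - e)):
            x, e = trial, e_trial
        samples.append(x)
    return samples

# harmonic well as a stand-in potential; chain starts off-center at x = 3
samples = metropolis(lambda x: x * x, x0=3.0, steps=5000)
mean_x = sum(samples) / len(samples)
```

After burn-in the chain samples the Boltzmann distribution of the well, so the sample mean settles near the minimum at zero.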

  4. State-of-the-Art for Hygrothermal Simulation Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boudreaux, Philip R.; New, Joshua Ryan; Shrestha, Som S.

    2017-02-01

    The hygrothermal (heat and moisture) performance of buildings can be assessed by utilizing simulation tools. There are currently a number of hygrothermal calculation tools available, which vary in their degree of sophistication and runtime requirements. This report investigates three of the most commonly used models (WUFI, HAMT, and EMPD) to assess their limitations and potential to generate physically realistic results, in order to prioritize improvements for EnergyPlus (which uses HAMT and EMPD). The outcome of the study shows that, out of these three tools, WUFI has the greatest hygrothermal capabilities. Limitations of these tools were also assessed, including: WUFI’s inability to properly account for air leakage and transfer at surface boundaries; HAMT’s inability to handle air leakage, precipitation-related moisture problems, or condensation problems from high relative humidity; and multiple limitations for EMPD, a simplified method for estimating indoor temperature and humidity levels that is generally not used to estimate the hygrothermal performance of building envelope materials. In conclusion, out of the three investigated simulation tools, HAMT has the greatest modeling potential, is open source, and we have prioritized specific features that can enable EnergyPlus to model all relevant heat and moisture transfer mechanisms that impact the performance of building envelope components.
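The moisture transport these tools solve can be illustrated with a one-dimensional explicit finite-difference diffusion step; the uniform grid, constant diffusivity, and zero-flux boundary treatment are simplifying assumptions for illustration, not WUFI's or HAMT's actual numerics:

```python
def diffuse_1d(w, diffusivity, dx, dt, steps):
    """Explicit finite-difference update for 1-D diffusion of a moisture
    profile `w`, with zero-flux (mirrored) boundaries. The explicit scheme
    is only stable while r = D*dt/dx**2 <= 0.5."""
    r = diffusivity * dt / dx ** 2
    assert r <= 0.5, "explicit scheme would be unstable"
    for _ in range(steps):
        new = w[:]
        for i in range(1, len(w) - 1):
            new[i] = w[i] + r * (w[i - 1] - 2 * w[i] + w[i + 1])
        new[0], new[-1] = new[1], new[-2]  # zero-flux boundaries
        w = new
    return w

# a wetted spot in the middle of a wall layer spreads out over time
profile = diffuse_1d([0.0, 0.0, 1.0, 0.0, 0.0], 1.0, 2.0, 1.0, steps=20)
```

The initial peak flattens toward a uniform profile, the qualitative behavior a hygrothermal solver reproduces with real material properties and coupled heat transport.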

  5. Advanced Simulation and Computing Business Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rummel, E.

    To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with industry partners—partners upon whom the ASC Program relies for today’s and tomorrow’s high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.

  6. Development of the ARISTOTLE webware for cloud-based rarefied gas flow modeling

    NASA Astrophysics Data System (ADS)

    Deschenes, Timothy R.; Grot, Jonathan; Cline, Jason A.

    2016-11-01

    Rarefied gas dynamics are important for a wide variety of applications. An improvement in the ability of general users to predict these gas flows will enable optimization of current processes and discovery of future ones. Despite this potential, most rarefied simulation software is designed by and for experts in the community. This has resulted in low adoption of the methods outside of the immediate rarefied gas dynamics (RGD) community. This paper outlines an ongoing effort to create a rarefied gas dynamics simulation tool that can be used by a general audience. The tool leverages a direct simulation Monte Carlo (DSMC) library that is available to the entire community and a web-based simulation process that will enable all users to take advantage of high performance computing capabilities. First, the DSMC library and simulation architecture are described. Then the DSMC library is used to predict a number of representative transient gas flows that are applicable to the rarefied gas dynamics community. The paper closes with a summary and future direction.

  7. Rule based design of conceptual models for formative evaluation

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.; Chang, Kai; Hale, Joseph P.; Bester, Terri; Rix, Thomas; Wang, Yaowen

    1994-01-01

    A Human-Computer Interface (HCI) Prototyping Environment with embedded evaluation capability has been investigated. This environment will be valuable in developing and refining HCI standards and evaluating program/project interface development, especially Space Station Freedom on-board displays for payload operations. This environment, which allows for rapid prototyping and evaluation of graphical interfaces, includes the following four components: (1) a HCI development tool; (2) a low fidelity simulator development tool; (3) a dynamic, interactive interface between the HCI and the simulator; and (4) an embedded evaluator that evaluates the adequacy of a HCI based on a user's performance. The embedded evaluation tool collects data while the user is interacting with the system and evaluates the adequacy of an interface based on a user's performance. This paper describes the design of conceptual models for the embedded evaluation system using a rule-based approach.
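The embedded, rule-based evaluation described above can be sketched as predicates over a user's interaction log; the specific metrics, thresholds, and findings below are invented for illustration and are not the system's actual rule base:

```python
# Each rule pairs a predicate over the interaction log with the finding
# reported when it fires (all thresholds here are illustrative).
RULES = [
    (lambda log: log["errors"] > 3,
     "high error count: review control labeling"),
    (lambda log: log["task_seconds"] > 120,
     "slow task completion: simplify layout"),
    (lambda log: log["help_requests"] > 0,
     "help consulted: add on-screen cues"),
]

def evaluate(log):
    """Return the findings whose rules fire on one user's interaction log."""
    return [finding for predicate, finding in RULES if predicate(log)]

findings = evaluate({"errors": 5, "task_seconds": 90, "help_requests": 0})
```

Because rules are data, the evaluator's conceptual model can grow by appending (predicate, finding) pairs without touching the collection or scoring code.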

  9. Application of the Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) for Dynamic Systems Analysis

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Zinnecker, Alicia M.

    2014-01-01

    The aircraft engine design process seeks to achieve the best overall system-level performance, weight, and cost for a given engine design. This is achieved by a complex process known as systems analysis, where steady-state simulations are used to identify trade-offs that should be balanced to optimize the system. The steady-state simulations and data on which systems analysis relies may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic Systems Analysis provides the capability for assessing these trade-offs at an earlier stage of the engine design process. The concept of dynamic systems analysis and the type of information available from this analysis are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed. This tool aids a user in the design of a power management controller to regulate thrust, and a transient limiter to protect the engine model from surge at a single flight condition (defined by an altitude and Mach number). Results from simulation of the closed-loop system may be used to estimate the dynamic performance of the model. This enables evaluation of the trade-off between performance and operability, or safety, in the engine, which could not be done with steady-state data alone. A design study is presented to compare the dynamic performance of two different engine models integrated with the TTECTrA software.
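The power-management-plus-limiter pairing described above can be illustrated with a toy proportional-integral regulator; the gains, limits, and first-order "engine" response below are invented for illustration and are not TTECTrA's actual models:

```python
def pi_step(setpoint, measured, state, kp=0.8, ki=0.2,
            u_min=0.0, u_max=1.0, dt=0.1):
    """One update of a PI regulator with output limiting. `state` carries
    the integrator and is clamped at saturation (simple anti-windup)."""
    error = setpoint - measured
    state += ki * error * dt
    u = kp * error + state
    if u > u_max:                       # limiter: saturate, stop integrating
        u, state = u_max, u_max - kp * error
    elif u < u_min:
        u, state = u_min, u_min - kp * error
    return u, state

# closed loop against a toy first-order plant standing in for thrust response
u, state, y = 0.0, 0.0, 0.0
for _ in range(400):
    u, state = pi_step(0.5, y, state)
    y += 0.1 * (u - y)
```

Simulating the closed loop like this, rather than reading steady-state maps, is what exposes the transient performance/operability trade-off the paper discusses.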

  10. Radio-frequency energy quantification in magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Alon, Leeor

    Mapping of radio frequency (RF) energy deposition has been challenging for 50+ years, especially when scanning patients in the magnetic resonance imaging (MRI) environment. As a result, electromagnetic simulation software is often used for estimating the specific absorption rate (SAR), the rate of RF energy deposition in tissue. The thesis work presents challenges associated with aligning information provided by electromagnetic simulation and MRI experiments. As a result of the limitations of simulations, experimental methods for the quantification of SAR were established. A system for quantification of the total RF energy deposition was developed for parallel transmit MRI (a system that uses multiple antennas to excite and image the body). The system is capable of monitoring and predicting channel-by-channel RF energy deposition and whole-body SAR, and of tracking potential hardware failures that occur in the transmit chain and may cause the deposition of excessive energy into patients. Similarly, we demonstrated that local RF power deposition can be mapped and predicted for parallel transmit systems based on a series of MRI temperature mapping acquisitions. As a result of this work, we developed tools for optimal reconstruction of temperature maps from MRI acquisitions. The tools developed for temperature mapping paved the way for utilizing MRI as a diagnostic tool for evaluation of RF/microwave emitting device safety. Quantification of the RF energy was demonstrated for both MRI-compatible and non-MRI-compatible devices (such as cell phones), while having the advantage of being noninvasive, of providing millimeter resolution and high accuracy.

  11. Human and Robotic Mission to Small Bodies: Mapping, Planning and Exploration

    NASA Technical Reports Server (NTRS)

    Neffian, Ara V.; Bellerose, Julie; Beyer, Ross A.; Archinal, Brent; Edwards, Laurence; Lee, Pascal; Colaprete, Anthony; Fong, Terry

    2013-01-01

    This study investigates the requirements, performs a gap analysis and makes a set of recommendations for mapping products and exploration tools required to support operations and scientific discovery for near-term and future NASA missions to small bodies. The mapping products and their requirements are based on the analysis of current mission scenarios (rendezvous, docking, and sample return) and recommendations made by the NEA Users Team (NUT) in the framework of human exploration. The mapping products that satisfy operational, scientific, and public outreach goals include topography, images, albedo, gravity, mass, density, subsurface radar, mineralogical and thermal maps. The gap analysis points to a need for incremental generation of mapping products from low (flyby) to high-resolution data needed for anchoring and docking, real-time spatial data processing for hazard avoidance and astronaut or robot localization in low gravity, high dynamic environments, and motivates a standard for coordinate reference systems capable of describing irregular body shapes. Another aspect investigated in this study is the set of requirements and the gap analysis for exploration tools that support visualization and simulation of operational conditions including soil interactions, environment dynamics, and communications coverage. Building robust, usable data sets and visualisation/simulation tools is the best way for mission designers and simulators to make correct decisions for future missions. In the near term, it is the most useful way to begin building capabilities for small body exploration without needing to commit to specific mission architectures.

  12. Leveraging Modeling Approaches: Reaction Networks and Rules

    PubMed Central

    Blinov, Michael L.; Moraru, Ion I.

    2012-01-01

    We have witnessed an explosive growth in research involving mathematical models and computer simulations of intracellular molecular interactions, ranging from metabolic pathways to signaling and gene regulatory networks. Many software tools have been developed to aid in the study of such biological systems, some of which have a wealth of features for model building and visualization, and powerful capabilities for simulation and data analysis. Novel high resolution and/or high throughput experimental techniques have led to an abundance of qualitative and quantitative data related to the spatio-temporal distribution of molecules and complexes, their interactions kinetics, and functional modifications. Based on this information, computational biology researchers are attempting to build larger and more detailed models. However, this has proved to be a major challenge. Traditionally, modeling tools require the explicit specification of all molecular species and interactions in a model, which can quickly become a major limitation in the case of complex networks – the number of ways biomolecules can combine to form multimolecular complexes can be combinatorially large. Recently, a new breed of software tools has been created to address the problems faced when building models marked by combinatorial complexity. These have a different approach for model specification, using reaction rules and species patterns. Here we compare the traditional modeling approach with the new rule-based methods. We make a case for combining the capabilities of conventional simulation software with the unique features and flexibility of a rule-based approach in a single software platform for building models of molecular interaction networks. PMID:22161349

  13. Leveraging modeling approaches: reaction networks and rules.

    PubMed

    Blinov, Michael L; Moraru, Ion I

    2012-01-01

    We have witnessed an explosive growth in research involving mathematical models and computer simulations of intracellular molecular interactions, ranging from metabolic pathways to signaling and gene regulatory networks. Many software tools have been developed to aid in the study of such biological systems, some of which have a wealth of features for model building and visualization, and powerful capabilities for simulation and data analysis. Novel high-resolution and/or high-throughput experimental techniques have led to an abundance of qualitative and quantitative data related to the spatiotemporal distribution of molecules and complexes, their interactions kinetics, and functional modifications. Based on this information, computational biology researchers are attempting to build larger and more detailed models. However, this has proved to be a major challenge. Traditionally, modeling tools require the explicit specification of all molecular species and interactions in a model, which can quickly become a major limitation in the case of complex networks - the number of ways biomolecules can combine to form multimolecular complexes can be combinatorially large. Recently, a new breed of software tools has been created to address the problems faced when building models marked by combinatorial complexity. These have a different approach for model specification, using reaction rules and species patterns. Here we compare the traditional modeling approach with the new rule-based methods. We make a case for combining the capabilities of conventional simulation software with the unique features and flexibility of a rule-based approach in a single software platform for building models of molecular interaction networks.
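The combinatorial argument above, one rule versus an explicit species list, can be made concrete; the three-site "protein" encoding below is a toy and not any particular rule-based tool's syntax:

```python
from itertools import product

# Explicit modeling must name every species: a protein with n independent
# modification sites ('u' = unmodified, 'p' = phosphorylated) has 2**n forms,
# each needing its own reactions.
n_sites = 3
explicit_species = ["P" + "".join(s) for s in product("up", repeat=n_sites)]

def phosphorylate_site1(species):
    """One rule: any species whose first site is 'u' gets it set to 'p',
    regardless of the state of the other sites."""
    if species[1] == "u":
        return species[0] + "p" + species[2:]
    return None

# a single pattern-rule covers half of the 2**n species in one statement
rule_products = [p for p in map(phosphorylate_site1, explicit_species) if p]
```

One rule generates four reactions here; at n = 10 sites it would cover 512, which is why rule-based tools specify patterns and let software do the expansion (or simulate rules on the fly).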

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Preece, D.S.; Knudsen, S.D.

    The spherical element computer code DMC (Distinct Motion Code) used to model rock motion resulting from blasting has been enhanced to allow routine computer simulations of bench blasting. The enhancements required for bench blast simulation include: (1) modifying the gas flow portion of DMC, (2) adding a new explosive gas equation of state capability, (3) modifying the porosity calculation, and (4) accounting for blastwell spacing parallel to the face. A parametric study performed with DMC shows logical variation of the face velocity as burden, spacing, blastwell diameter and explosive type are varied. These additions represent a significant advance in the capability of DMC, which will not only aid in understanding the physics involved in blasting but will also become a blast design tool. 8 refs., 7 figs., 1 tab.

  15. Integrating Flight Dynamics & Control Analysis and Simulation in Rotorcraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Lawrence, Ben; Berger, Tom; Tischler, Mark B.; Theodore, Colin R; Elmore, Josh; Gallaher, Andrew; Tobias, Eric L.

    2016-01-01

    The development of a toolset, SIMPLI-FLYD ('SIMPLIfied FLight dynamics for conceptual Design') is described. SIMPLI-FLYD is a collection of tools that perform flight dynamics and control modeling and analysis of rotorcraft conceptual designs including a capability to evaluate the designs in an X-Plane-based real-time simulation. The establishment of this framework is now facilitating the exploration of this new capability, in terms of modeling fidelity and data requirements, and the investigation of which stability and control and handling qualities requirements are appropriate for conceptual design. Illustrative design variation studies for single main rotor and tiltrotor vehicle configurations show sensitivity of the stability and control characteristics and an approach to highlight potential weight savings by identifying over-design.

  16. D-VASim: an interactive virtual laboratory environment for the simulation and analysis of genetic circuits.

    PubMed

    Baig, Hasan; Madsen, Jan

    2017-01-15

    Simulation and behavioral analysis of genetic circuits is a standard approach of functional verification prior to their physical implementation. Many software tools have been developed to perform in silico analysis for this purpose, but none of them allow users to interact with the model during runtime. Runtime interaction gives the user the feeling of being in the lab performing a real-world experiment. In this work, we present a user-friendly software tool named D-VASim (Dynamic Virtual Analyzer and Simulator), which provides a virtual laboratory environment to simulate and analyze the behavior of genetic logic circuit models represented in the Systems Biology Markup Language (SBML). Hence, SBML models developed in other software environments can be analyzed and simulated in D-VASim. D-VASim offers deterministic as well as stochastic simulation, and differs from other software tools by being able to extract and validate the Boolean logic from the SBML model. D-VASim is also capable of analyzing the threshold value and propagation delay of a genetic circuit model. D-VASim is available for Windows and Mac OS and can be downloaded from bda.compute.dtu.dk/downloads/. haba@dtu.dk, jama@dtu.dk. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
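Extracting and validating Boolean logic from simulated protein levels, as D-VASim does, can be sketched as a threshold test against an intended truth table; the NOR target, the cutoff, and the level values below are illustrative assumptions, not D-VASim's algorithm:

```python
THRESHOLD = 50.0  # illustrative concentration cutoff separating OFF from ON

def observed_output(level):
    """Digitize a steady-state reporter level into a Boolean."""
    return level >= THRESHOLD

def validate_nor(measurements):
    """Check digitized outputs against the intended NOR truth table.
    `measurements` maps (input_a, input_b) to a steady-state output level."""
    intended = {(a, b): not (a or b)
                for a in (False, True) for b in (False, True)}
    return all(observed_output(level) == intended[inputs]
               for inputs, level in measurements.items())

# simulated steady-state levels for each input combination (invented numbers)
ok = validate_nor({(False, False): 92.0, (False, True): 8.0,
                   (True, False): 11.0, (True, True): 3.0})
```

The circuit validates as a NOR gate only when every digitized output row matches; a single mis-thresholded row makes `validate_nor` return False.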

  17. Towards a Semantically-Enabled Control Strategy for Building Simulations: Integration of Semantic Technologies and Model Predictive Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delgoshaei, Parastoo; Austin, Mark A.; Pertzborn, Amanda J.

    State-of-the-art building simulation control methods incorporate physical constraints into their mathematical models, but omit implicit constraints associated with policies of operation and dependency relationships among rules representing those constraints. To overcome these shortcomings, there is a recent trend toward enabling control strategies with inference-based rule checking capabilities. One solution is to exploit semantic web technologies in building simulation control. Such approaches provide the tools for semantic modeling of domains, and the ability to deduce new information based on the models through use of Description Logic (DL). In a step toward enabling this capability, this paper presents a cross-disciplinary data-driven control strategy for building energy management simulation that integrates semantic modeling and formal rule checking mechanisms into a Model Predictive Control (MPC) formulation. The results show that MPC provides superior levels of performance when initial conditions and inputs are derived from inference-based rules.
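The MPC formulation referenced above can be illustrated with a minimal receding-horizon step: score candidate control inputs against a predictive model and apply only the first move. The toy first-order zone model, horizon, and candidate set below are assumptions for illustration, not the paper's formulation:

```python
def mpc_step(x, setpoint, horizon=5, candidates=(0.0, 0.25, 0.5, 0.75, 1.0)):
    """Pick the best constant control over the horizon for a toy zone model
    x_next = 0.8*x + 0.2*u, by exhaustively scoring a small candidate set.
    Only this first move is applied; the search repeats next step."""
    def cost(u):
        xi, total = x, 0.0
        for _ in range(horizon):
            xi = 0.8 * xi + 0.2 * u          # predictive model rollout
            total += (xi - setpoint) ** 2    # tracking cost
        return total
    return min(candidates, key=cost)
```

Inference-derived rules would enter such a loop by constraining the candidate set or rewriting the setpoint before each solve; a real MPC replaces the grid search with a constrained optimizer.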

  18. Assessment of the Neutronic and Fuel Cycle Performance of the Transatomic Power Molten Salt Reactor Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Sean; Dewan, Leslie; Massie, Mark

    This report presents results from a collaboration between Transatomic Power Corporation (TAP) and Oak Ridge National Laboratory (ORNL) to provide neutronic and fuel cycle analysis of the TAP core design through the Department of Energy Gateway for Accelerated Innovation in Nuclear (GAIN) Nuclear Energy Voucher program. The TAP concept is a molten salt reactor using configurable zirconium hydride moderator rod assemblies to shift the neutron spectrum in the core from mostly epithermal at beginning of life to thermal at end of life. Additional developments in the ChemTriton modeling and simulation tool provide the critical moderator-to-fuel ratio searches and time-dependent parameters necessary to simulate the continuously changing physics in this complex system. The implementation of continuous-energy Monte Carlo transport and depletion tools in ChemTriton provides for full-core three-dimensional modeling and simulation. Results from simulations with these tools show agreement with TAP-calculated performance metrics for core lifetime, discharge burnup, and salt volume fraction, verifying the viability of reducing actinide waste production with this concept. Additional analyses of mass feed rates and enrichments, isotopic removals, tritium generation, core power distribution, core vessel helium generation, moderator rod heat deposition, and reactivity coefficients provide additional information to make informed design decisions. This work demonstrates the capabilities of ORNL modeling and simulation tools for neutronic and fuel cycle analysis of molten salt reactor concepts.

  19. Role-playing simulation as an educational tool for health care personnel: developing an embedded assessment framework.

    PubMed

    Libin, Alexander; Lauderdale, Manon; Millo, Yuri; Shamloo, Christine; Spencer, Rachel; Green, Brad; Donnellan, Joyce; Wellesley, Christine; Groah, Suzanne

    2010-04-01

    Simulation- and video game-based role-playing techniques have been proven effective in changing behavior and enhancing positive decision making in a variety of professional settings, including education, the military, and health care. Although the need for developing assessment frameworks for learning outcomes has been clearly defined, there is a significant gap between the variety of existing multimedia-based instruction and technology-mediated learning systems and the number of reliable assessment algorithms. This study, based on a mixed methodology research design, aims to develop an embedded assessment algorithm, a Knowledge Assessment Module (NOTE), to capture both user interaction with the educational tool and knowledge gained from the training. The study is regarded as the first step in developing an assessment framework for a multimedia educational tool for health care professionals, Anatomy of Care (AOC), that utilizes Virtual Experience Immersive Learning Simulation (VEILS) technology. Ninety health care personnel of various backgrounds took part in online AOC training, choosing from five possible scenarios presenting difficult situations of everyday care. The results suggest that although the simulation-based training tool demonstrated partial effectiveness in improving learners' decision-making capacity, a differential learner-oriented approach might be more effective and capable of synchronizing educational efforts with identifiable relevant individual factors such as sociobehavioral profile and professional background.

  20. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular, expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems as well as to develop system requirements, in order to determine their effect on the sizing of the integrated vehicle. The development of such a framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and possible state measurements and observations to feed into the simulation implementation structure. These models also allow a common environment for the capture of an increasingly complex operational architecture, involving multiple spacecraft, ground stations, and communication networks. In order to address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions amongst spacecraft. This paper describes the development of this framework, and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.

  1. Assessment of the National Combustion Code

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey; Iannetti, Anthony; Shih, Tsan-Hsing

    2007-01-01

    The advancements made during the last decade in the areas of combustion modeling, numerical simulation, and computing platforms have greatly facilitated the use of CFD-based tools in the development of combustion technology. Further development of verification, validation and uncertainty quantification will have a profound impact on the reliability and utility of these CFD-based tools. The objectives of the present effort are to establish a baseline for the National Combustion Code (NCC) against experimental data, as well as to document current capabilities and identify gaps for further improvement.

  2. A computer aided engineering tool for ECLS systems

    NASA Technical Reports Server (NTRS)

    Bangham, Michal E.; Reuter, James L.

    1987-01-01

    The Computer-Aided Systems Engineering and Analysis tool used by NASA for environmental control and life support system design studies is capable of simulating atmospheric revitalization systems, water recovery and management systems, and single-phase active thermal control systems. The designer/analysis interface used is graphics-based, and allows the designer to build a model by constructing a schematic of the system under consideration. Data management functions are performed, and the program is translated into a format that is compatible with the solution routines.

  3. Modular Analytical Multicomponent Analysis in Gas Sensor Arrays

    PubMed Central

    Chaiyboun, Ali; Traute, Rüdiger; Kiesewetter, Olaf; Ahlers, Simon; Müller, Gerhard; Doll, Theodor

    2006-01-01

    A multi-sensor system is a chemical sensor system which quantitatively and qualitatively records gases with a combination of cross-sensitive gas sensor arrays and pattern recognition software. This paper addresses the issue of data analysis for identification of gases in a gas sensor array. We introduce a software tool for gas sensor array configuration and simulation. It is a modular software package for the acquisition of data from different sensors. A signal evaluation algorithm referred to as the matrix method was used for the software tool. This matrix method computes the gas concentrations from the signals of a sensor array. The software tool was used for the simulation of an array of five sensors to determine gas concentrations of CH4, NH3, H2, CO and C2H5OH. The results of the present simulated sensor array indicate that the software tool is capable of the following: (a) identifying a gas independently of its concentration; (b) estimating the concentration of the gas, even if the system was not previously exposed to this concentration; and (c) indicating when a gas concentration exceeds a certain value. A gas sensor database was built for the configuration of the software. With the database one can create, generate and manage scenarios and source files for the simulation. With the gas sensor database and the simulation software, an online Web-based version was developed with which the user can configure and simulate sensor arrays online.
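The matrix method described, recovering gas concentrations from array signals, reduces in the linear case to inverting a calibration matrix. The 2x2 example below, with invented sensitivity values, is a sketch of that idea; a real five-sensor array would use a larger matrix, typically solved by least squares:

```python
def concentrations_from_signals(sensitivity, signals):
    """Invert a 2-sensor / 2-gas linear response model S = A @ c by
    Cramer's rule, where sensitivity[i][j] is sensor i's (assumed
    calibration) response per unit concentration of gas j."""
    (a, b), (c, d) = sensitivity
    det = a * d - b * c
    assert det != 0, "sensor responses too similar to separate the gases"
    s1, s2 = signals
    return [(s1 * d - b * s2) / det,   # concentration of gas 1
            (a * s2 - c * s1) / det]   # concentration of gas 2

# invented calibration matrix and the signals it would produce for
# concentrations (0.5, 1.0)
A = [[2.0, 1.0], [1.0, 3.0]]
conc = concentrations_from_signals(A, [2.0, 3.5])
```

Cross-sensitivity is exactly what the off-diagonal entries encode: each sensor responds to both gases, and the inversion untangles the mixture.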

  4. Multi Sector Planning Tools for Trajectory-Based Operations

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas; Mainini, Matthew; Brasil, Connie

    2010-01-01

    This paper discusses a suite of multi-sector planning tools for trajectory-based operations that were developed and evaluated in the Airspace Operations Laboratory (AOL) at the NASA Ames Research Center. The toolset included tools for traffic load and complexity assessment as well as trajectory planning and coordination. The situation assessment tools included an integrated suite of interactive traffic displays, load tables, load graphs, and dynamic aircraft filters. The planning toolset allowed for single- and multi-aircraft trajectory planning and data-communication-based coordination of trajectories between operators. Also newly introduced into the toolset was a real-time computation of sector complexity that operators could use in lieu of aircraft count to better estimate and manage sector workload, especially in situations with convective weather. The tools were used during a joint NASA/FAA multi-sector planner simulation in the AOL in 2009 that had multiple objectives, one of which was assessing the effectiveness of the tools. Current air traffic control operators who were experienced as area supervisors and traffic management coordinators used the tools throughout the simulation and provided usefulness and usability ratings in post-simulation questionnaires. This paper presents these subjective assessments as well as the actual usage data collected during the simulation. The toolset was rated very useful and usable overall. Many elements received high scores from the operators and were used frequently and successfully. Other functions were not used at all, but various requests for new functions and capabilities were received that could be added to the toolset.

  5. Modeling and Validation of Power-split and P2 Parallel Hybrid Electric Vehicles (SAE 2013-01-1470)

    EPA Science Inventory

    The Advanced Light-Duty Powertrain and Hybrid Analysis tool was created by EPA to evaluate the Greenhouse Gas (GHG) emissions of Light-Duty (LD) vehicles. It is a physics-based, forward-looking, full vehicle computer simulator capable of analyzing various vehicle types combined ...

  6. Spatial Cognition Support for Exploring the Design Mechanics of Building Structures

    ERIC Educational Resources Information Center

    Rudy, Margit; Hauck, Richard

    2008-01-01

    A web-based tool for visualizing the simulated structural behavior of building models was developed to support the teaching of structural design to architecture and engineering students by activating their spatial cognition capabilities. The main didactic issues involved establishing a consistent and complete three-dimensional (3D) vocabulary…

  7. Dynamic Systems Analysis for Turbine Based Aero Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.

    2016-01-01

    The aircraft engine design process seeks to optimize the overall system-level performance, weight, and cost for a given concept. Steady-state simulations and data are used to identify trade-offs that should be balanced to optimize the system in a process known as systems analysis. These systems analysis simulations and data may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic systems analysis provides the capability for assessing the dynamic trade-offs at an earlier stage of the engine design process. The dynamic systems analysis concept, developed tools, and potential benefit are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed to provide the user with an estimate of the closed-loop performance (response time) and operability (high pressure compressor surge margin) for a given engine design and set of control design requirements. TTECTrA, along with engine deterioration information, can be used to develop a more generic relationship between performance and operability that can impact the engine design constraints and potentially lead to a more efficient engine.
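
    As a rough illustration of the closed-loop response-time metric TTECTrA estimates, one can step a toy PI-controlled first-order plant and time the rise to 95% of the setpoint. This is a sketch with invented constants, not the actual tool:

```python
def response_time(tau=1.0, kp=2.0, ki=4.0, dt=0.001, setpoint=1.0):
    """Time for a PI-controlled first-order plant to reach 95% of a step."""
    x = 0.0      # plant state (e.g., normalized fan speed)
    integ = 0.0  # integrator state
    t = 0.0
    while x < 0.95 * setpoint and t < 60.0:
        e = setpoint - x
        integ += e * dt
        u = kp * e + ki * integ          # PI control law
        x += dt * (-x + u) / tau         # plant: tau * dx/dt = -x + u
        t += dt
    return t

print(round(response_time(), 3))
```

    Repeating such a sweep over controller gains and plant parameters is the kind of performance/operability trade study the abstract describes, with surge margin playing the role of the operability constraint.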

  8. Draper Station Analysis Tool

    NASA Technical Reports Server (NTRS)

    Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

    2011-01-01

    Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.

  9. Guided wave energy trapping to detect hidden multilayer delamination damage

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Seebo, Jeffrey P.

    2015-03-01

    Nondestructive Evaluation (NDE) and Structural Health Monitoring (SHM) simulation tools capable of modeling three-dimensional (3D) realistic energy-damage interactions are needed for aerospace composites. Current practice in NDE/SHM simulation for composites commonly involves over-simplification of the material parameters and/or a simplified two-dimensional (2D) approach. The unique damage types that occur in composite materials (delamination, microcracking, etc.) develop as complex 3D geometry features. This paper discusses the application of 3D custom ultrasonic simulation tools to study wave interaction with multilayer delamination damage in carbon-fiber reinforced polymer (CFRP) composites. In particular, simulation-based studies of ultrasonic guided wave energy trapping due to multilayer delamination damage were performed. The simulation results show changes in energy trapping at the composite surface as additional delaminations are added through the composite thickness. The results demonstrate a potential approach for identifying the presence of hidden multilayer delamination damage in applications where only single-sided access to a component is available. The paper also describes recent advancements in optimizing the custom ultrasonic simulation code for increases in computation speed.

  10. Supercomputers ready for use as discovery machines for neuroscience.

    PubMed

    Helias, Moritz; Kunkel, Susanne; Masumoto, Gen; Igarashi, Jun; Eppler, Jochen Martin; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus

    2012-01-01

    NEST is a widely used tool to simulate biological spiking neural networks. Here we explain the improvements, guided by a mathematical model of memory consumption, that enable us to exploit for the first time the computational power of the K supercomputer for neuroscience. Multi-threaded components for wiring and simulation combine 8 cores per MPI process to achieve excellent scaling. K is capable of simulating networks corresponding to a brain area with 10^8 neurons and 10^12 synapses in the worst-case scenario of random connectivity; for larger networks of the brain its hierarchical organization can be exploited to constrain the number of communicating computer nodes. We discuss the limits of the software technology, comparing maximum-filling scaling plots for K and the JUGENE BG/P system. The usability of these machines for network simulations has become comparable to running simulations on a single PC. Turn-around times in the range of minutes even for the largest systems enable a quasi-interactive working style and render simulations on this scale a practical tool for computational neuroscience.
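
    The idea of a memory-consumption model guiding scaling can be sketched as a back-of-envelope inversion: given per-neuron and per-synapse byte costs, solve for the largest network that fits. The byte costs below are invented for illustration and are far simpler than NEST's actual model:

```python
def max_neurons(mem_bytes, n_procs, syn_per_neuron=10_000,
                b_neuron=1_500, b_synapse=50, overhead=2**30):
    """Largest network (total neurons) that fits across n_procs processes,
    given per-process memory and invented per-object byte costs."""
    usable = n_procs * (mem_bytes - overhead)        # memory left for objects
    cost_per_neuron = b_neuron + syn_per_neuron * b_synapse
    return usable // cost_per_neuron

# Hypothetical machine: 80,000 processes with 16 GiB each.
print(f"{max_neurons(16 * 2**30, 80_000):.2e}")
```

    Inverting such a model before a run is what lets the largest-fitting network be chosen up front instead of discovered by trial and error on the machine.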

  11. Range Process Simulation Tool

    NASA Technical Reports Server (NTRS)

    Phillips, Dave; Haas, William; Barth, Tim; Benjamin, Perakath; Graul, Michael; Bagatourova, Olga

    2005-01-01

    Range Process Simulation Tool (RPST) is a computer program that assists managers in rapidly predicting and quantitatively assessing the operational effects of proposed technological additions to, and/or upgrades of, complex facilities and engineering systems such as the Eastern Test Range. Originally designed for application to space transportation systems, RPST is also suitable for assessing effects of proposed changes in industrial facilities and large organizations. RPST follows a model-based approach that includes finite-capacity schedule analysis and discrete-event process simulation. A component-based, scalable, open architecture makes RPST easily and rapidly tailorable for diverse applications. Specific RPST functions include: (1) definition of analysis objectives and performance metrics; (2) selection of process templates from a process-template library; (3) configuration of process models for detailed simulation and schedule analysis; (4) design of operations-analysis experiments; (5) schedule and simulation-based process analysis; and (6) optimization of performance by use of genetic algorithms and simulated annealing. The main benefits afforded by RPST are provision of information that can be used to reduce costs of operation and maintenance, and the capability for affordable, accurate, and reliable prediction and exploration of the consequences of many alternative proposed decisions.
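
    The finite-capacity scheduling idea underlying tools like RPST can be illustrated with a few lines of discrete-event bookkeeping. This sketch (not RPST code) assigns each arriving job to the next free server and records completion times:

```python
import heapq

def simulate(jobs, capacity=1):
    """Finite-capacity sketch: each job is (arrival_time, duration).
    Returns completion times, processing jobs in arrival order."""
    free_at = [0.0] * capacity          # next-free time of each server
    heapq.heapify(free_at)
    done = []
    for arrival, duration in sorted(jobs):
        start = max(arrival, heapq.heappop(free_at))  # wait for a server
        finish = start + duration
        heapq.heappush(free_at, finish)
        done.append(finish)
    return done

# Three jobs contend for one server: (arrival, duration)
print(simulate([(0, 4), (1, 2), (2, 1)]))   # jobs queue behind the first
```

    A full tool layers stochastic durations, resource calendars, and optimization (e.g., the genetic algorithms and simulated annealing mentioned above) on top of exactly this kind of event loop.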

  12. PetriScape - A plugin for discrete Petri net simulations in Cytoscape.

    PubMed

    Almeida, Diogo; Azevedo, Vasco; Silva, Artur; Baumbach, Jan

    2016-06-04

    Systems biology plays a central role in biological network analysis in the post-genomic era. Cytoscape is the standard bioinformatics tool offering the community an extensible platform for computational analysis of the emerging cellular network together with experimental omics data sets. However, only a few apps/plugins/tools are available for simulating network dynamics in Cytoscape 3. Many approaches of varying complexity exist, but none of them has been integrated into Cytoscape as an app/plugin yet. Here, we introduce PetriScape, the first Petri net simulator for Cytoscape. Although discrete Petri nets are quite simplistic models, they are capable of modeling global network properties and simulating their behaviour. In addition, they are easily understood and well suited to visualization. PetriScape comes with the following main functionalities: (1) import of biological networks in SBML format, (2) conversion into a Petri net, (3) visualization as a Petri net, and (4) simulation and visualization of the token flow in Cytoscape. PetriScape is the first Cytoscape plugin for Petri nets. It allows straightforward Petri net model creation, simulation and visualization with Cytoscape, providing clues about the activity of key components in biological networks.
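
    The token-flow simulation PetriScape performs follows the standard discrete Petri net firing rule, which can be sketched in a few lines (illustrative code with made-up place names, not the plugin's implementation):

```python
# Minimal discrete Petri net: places hold integer tokens; a transition fires
# only when every input place holds at least one token, consuming one from
# each input and adding one to each output.
def fire(marking, transition):
    inputs, outputs = transition
    if all(marking[p] >= 1 for p in inputs):
        for p in inputs:
            marking[p] -= 1
        for p in outputs:
            marking[p] += 1
        return True
    return False

marking = {"A": 2, "B": 1, "C": 0}
t1 = (["A", "B"], ["C"])        # transition modeling A + B -> C

fire(marking, t1)    # consumes one token from A and B, produces one in C
print(marking)       # {'A': 1, 'B': 0, 'C': 1}
fire(marking, t1)    # B is now empty, so the transition is not enabled
print(marking)       # unchanged
```

    In a biological reading, places stand for species and tokens for their abundance, which is why even this simple rule can expose which network components stay active.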

  13. PetriScape - A plugin for discrete Petri net simulations in Cytoscape.

    PubMed

    Almeida, Diogo; Azevedo, Vasco; Silva, Artur; Baumbach, Jan

    2016-03-01

    Systems biology plays a central role in biological network analysis in the post-genomic era. Cytoscape is the standard bioinformatics tool offering the community an extensible platform for computational analysis of the emerging cellular network together with experimental omics data sets. However, only a few apps/plugins/tools are available for simulating network dynamics in Cytoscape 3. Many approaches of varying complexity exist, but none of them has been integrated into Cytoscape as an app/plugin yet. Here, we introduce PetriScape, the first Petri net simulator for Cytoscape. Although discrete Petri nets are quite simplistic models, they are capable of modeling global network properties and simulating their behaviour. In addition, they are easily understood and well suited to visualization. PetriScape comes with the following main functionalities: (1) import of biological networks in SBML format, (2) conversion into a Petri net, (3) visualization as a Petri net, and (4) simulation and visualization of the token flow in Cytoscape. PetriScape is the first Cytoscape plugin for Petri nets. It allows straightforward Petri net model creation, simulation and visualization with Cytoscape, providing clues about the activity of key components in biological networks.

  14. Supercomputers Ready for Use as Discovery Machines for Neuroscience

    PubMed Central

    Helias, Moritz; Kunkel, Susanne; Masumoto, Gen; Igarashi, Jun; Eppler, Jochen Martin; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus

    2012-01-01

    NEST is a widely used tool to simulate biological spiking neural networks. Here we explain the improvements, guided by a mathematical model of memory consumption, that enable us to exploit for the first time the computational power of the K supercomputer for neuroscience. Multi-threaded components for wiring and simulation combine 8 cores per MPI process to achieve excellent scaling. K is capable of simulating networks corresponding to a brain area with 10^8 neurons and 10^12 synapses in the worst-case scenario of random connectivity; for larger networks of the brain its hierarchical organization can be exploited to constrain the number of communicating computer nodes. We discuss the limits of the software technology, comparing maximum-filling scaling plots for K and the JUGENE BG/P system. The usability of these machines for network simulations has become comparable to running simulations on a single PC. Turn-around times in the range of minutes even for the largest systems enable a quasi-interactive working style and render simulations on this scale a practical tool for computational neuroscience. PMID:23129998

  15. Simulation of Guided Wave Interaction with In-Plane Fiber Waviness

    NASA Technical Reports Server (NTRS)

    Leckey, Cara A. C.; Juarez, Peter D.

    2016-01-01

    Reducing the timeline for certification of composite materials and enabling the expanded use of advanced composite materials for aerospace applications are two primary goals of NASA's Advanced Composites Project (ACP). A key technical challenge area for accomplishing these goals is the development of rapid composite inspection methods with improved defect characterization capabilities. Ongoing work at NASA Langley is focused on expanding ultrasonic simulation capabilities for composite materials. Simulation tools can be used to guide the development of optimal inspection methods. Custom code based on the elastodynamic finite integration technique is currently being developed and implemented to study ultrasonic wave interaction with manufacturing defects, such as in-plane fiber waviness (marcelling). This paper describes details of validation comparisons performed to enable simulation of guided wave propagation in composites containing fiber waviness. Simulation results for guided wave interaction with in-plane fiber waviness are also discussed. The results show that the wavefield is affected by the presence of waviness on both the surface containing fiber waviness, as well as the opposite surface to the location of waviness.

  16. Simulation of guided wave interaction with in-plane fiber waviness

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Juarez, Peter D.

    2017-02-01

    Reducing the timeline for certification of composite materials and enabling the expanded use of advanced composite materials for aerospace applications are two primary goals of NASA's Advanced Composites Project (ACP). A key technical challenge area for accomplishing these goals is the development of rapid composite inspection methods with improved defect characterization capabilities. Ongoing work at NASA Langley is focused on expanding ultrasonic simulation capabilities for composite materials. Simulation tools can be used to guide the development of optimal inspection methods. Custom code based on the elastodynamic finite integration technique is currently being developed and implemented to study ultrasonic wave interaction with manufacturing defects, such as in-plane fiber waviness (marcelling). This paper describes details of validation comparisons performed to enable simulation of guided wave propagation in composites containing fiber waviness. Simulation results for guided wave interaction with in-plane fiber waviness are also discussed. The results show that the wavefield is affected by the presence of waviness on both the surface containing fiber waviness, as well as the opposite surface to the location of waviness.

  17. Integration of Irma tactical scene generator into directed-energy weapon system simulation

    NASA Astrophysics Data System (ADS)

    Owens, Monte A.; Cole, Madison B., III; Laine, Mark R.

    2003-08-01

    Integrated high-fidelity physics-based simulations that include engagement models, image generation, electro-optical hardware models and control system algorithms have previously been developed by Boeing-SVS for various tracking and pointing systems. These simulations, however, had always used images with featureless or random backgrounds and simple target geometries. With the requirement to engage tactical ground targets in the presence of cluttered backgrounds, a new type of scene generation tool was required to fully evaluate system performance in this challenging environment. To answer this need, Irma was integrated into the existing suite of Boeing-SVS simulation tools, allowing scene generation capabilities with unprecedented realism. Irma is a US Air Force research tool used for high-resolution rendering and prediction of target and background signatures. The MATLAB/Simulink-based simulation achieves closed-loop tracking by running track algorithms on the Irma-generated images, processing the track errors through optical control algorithms, and moving simulated electro-optical elements. The geometry of these elements determines the sensor orientation with respect to the Irma database containing the three-dimensional background and target models. This orientation is dynamically passed to Irma through a Simulink S-function to generate the next image. This integrated simulation provides a test-bed for development and evaluation of tracking and control algorithms against representative images including complex background environments and realistic targets calibrated using field measurements.

  18. University of Washington/ Northwest National Marine Renewable Energy Center Tidal Current Technology Test Protocol, Instrumentation, Design Code, and Oceanographic Modeling Collaboration: Cooperative Research and Development Final Report, CRADA Number CRD-11-452

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Driscoll, Frederick R.

    The University of Washington (UW) - Northwest National Marine Renewable Energy Center (UW-NNMREC) and the National Renewable Energy Laboratory (NREL) will collaborate to advance research and development (R&D) of Marine Hydrokinetic (MHK) renewable energy technology, specifically renewable energy captured from ocean tidal currents. UW-NNMREC is endeavoring to establish infrastructure, capabilities and tools to support in-water testing of marine energy technology. NREL is leveraging its experience and capabilities in field testing of wind systems to develop protocols and instrumentation to advance field testing of MHK systems. Under this work, UW-NNMREC and NREL will work together to develop a common instrumentation system and testing methodologies, standards and protocols. UW-NNMREC is also establishing simulation capabilities for MHK turbines and turbine arrays. NREL has extensive experience in wind turbine array modeling and is developing several computer-based numerical simulation capabilities for MHK systems. Under this CRADA, UW-NNMREC and NREL will work together to augment single-device and array modeling codes. As part of this effort, UW-NNMREC will also work with NREL to run simulations on NREL's high performance computing system.

  19. Test/score/report: Simulation techniques for automating the test process

    NASA Technical Reports Server (NTRS)

    Hageman, Barbara H.; Sigman, Clayton B.; Koslosky, John T.

    1994-01-01

    A Test/Score/Report capability is currently being developed for the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) system which will automate testing of the Goddard Space Flight Center (GSFC) Payload Operations Control Center (POCC) and Mission Operations Center (MOC) software in three areas: telemetry decommutation, spacecraft command processing, and spacecraft memory load and dump processing. Automated computer control of the acceptance test process is one of the primary goals of a test team. With the proper simulation tools and user interface, acceptance testing, regression testing, and repeating specific test procedures of a ground data system become simpler tasks. Ideally, complete automation would mean plugging the operational deliverable into the simulator, pressing the start button, executing the test procedure, accumulating and analyzing the data, scoring the results, and reporting them to the test team along with a go/no-go recommendation. In practice, this may not be possible because of inadequate test tools, schedule pressures, limited resources, etc. Most tests are accomplished using a certain degree of automation and test procedures that are labor intensive. This paper discusses some simulation techniques that can improve the automation of the test process. The TASS system tests the POCC/MOC software and provides a score based on the test results. The TASS system displays statistics on the success of the POCC/MOC system processing in each of the three areas, as well as event messages pertaining to the Test/Score/Report processing. The TASS system also provides formatted reports documenting each step performed during the tests and the results of each step. A prototype of the Test/Score/Report capability is available and currently being used to test some POCC/MOC software deliveries. When this capability is fully operational it should greatly reduce the time necessary to test a POCC/MOC software delivery, as well as improve the quality of the test process.
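
    The score-and-recommend step described above can be sketched as a comparison of expected versus actual results per test step; the threshold, data, and function name here are hypothetical:

```python
# Toy scorer: each step is (expected, actual); the score is the pass
# fraction, and the recommendation is "go" only above a chosen threshold.
def score_report(steps, threshold=0.9):
    passed = sum(1 for expected, actual in steps if expected == actual)
    score = passed / len(steps)
    return {"score": score,
            "recommendation": "go" if score >= threshold else "no-go"}

results = score_report([(1, 1), (2, 2), (3, 4), (5, 5)])
print(results)   # 3 of 4 steps pass -> score 0.75 -> "no-go"
```

    A real harness would compare decommutated telemetry frames, command echoes, and memory dumps rather than bare integers, but the scoring logic is the same shape.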

  20. Modeling and Simulation of III-Nitride-Based Solar Cells using Nextnano®

    NASA Astrophysics Data System (ADS)

    Refaei, Malak

    Nextnano3 software is a well-known package for simulating semiconductor band structures at the nanoscale and predicting the general electronic structure. In this work, it is further demonstrated as a viable tool for the simulation of III-nitride solar cells. To establish this feasibility, the generally accepted solar cell simulation package, PC1D, was chosen for comparison. To check the results from both PC1D and Nextnano3, the fundamental drift-diffusion equations were used to calculate the performance of a simple p-n homojunction solar cell device analytically. Silicon was picked as the material for the comparison between the outputs of the two simulators and the results of the drift-diffusion equations because it is well characterized in both software tools. After substantiating the capabilities of Nextnano3 for the simulation of solar cells, an InGaN single-junction solar cell was simulated. The effects of various indium compositions and device structures on the performance of this InGaN p-n homojunction solar cell were then investigated using Nextnano3 as a simulation tool. For single-junction devices with varying bandgap, an In0.6Ga0.4N device with a bandgap of 1.44 eV was found to be the optimum. The results of this research demonstrate that the Nextnano3 software can be used to usefully simulate solar cells in general, and III-nitride solar cells specifically, for future study of nanoscale structured devices.
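
    The analytic cross-check against the drift-diffusion treatment reduces, for an ideal p-n junction, to the standard single-diode relation; a sketch with illustrative parameters (not fitted to silicon or InGaN):

```python
import math

# Standard single-diode solar cell relation (textbook physics, not Nextnano3
# output): J(V) = J_sc - J_0 * (exp(V / V_T) - 1).
kT_over_q = 0.02585                  # thermal voltage at room temperature, V
J_sc = 30e-3                         # short-circuit current density, A/cm^2
J_0 = 1e-12                          # saturation current density, A/cm^2

def current_density(V):
    return J_sc - J_0 * (math.exp(V / kT_over_q) - 1)

# Open-circuit voltage follows from J(V_oc) = 0:
V_oc = kT_over_q * math.log(J_sc / J_0 + 1)
print(round(V_oc, 3))                # ≈ 0.624 V for these parameters
```

    Comparing quantities like V_oc from a closed-form relation against a numerical simulator's output is exactly the kind of sanity check the abstract describes before moving to the harder InGaN case.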

  1. A Hybrid Parachute Simulation Environment for the Orion Parachute Development Project

    NASA Technical Reports Server (NTRS)

    Moore, James W.

    2011-01-01

    A parachute simulation environment (PSE) has been developed that aims to take advantage of legacy parachute simulation codes and modern object-oriented programming techniques. This hybrid simulation environment provides the parachute analyst with a natural and intuitive way to construct simulation tasks while preserving the pedigree and authority of established parachute simulations. NASA currently employs four simulation tools for developing and analyzing air-drop tests performed by the CEV Parachute Assembly System (CPAS) Project. These tools were developed at different times, in different languages, and with different capabilities in mind. As a result, each tool has a distinct interface and set of inputs and outputs. However, regardless of the simulation code that is most appropriate for the type of test, engineers typically perform similar tasks for each drop test, such as prediction of loads, assessment of altitude, and sequencing of disreefs or cut-aways. An object-oriented approach to simulation configuration allows the analyst to choose models of real physical test articles (parachutes, vehicles, etc.) and sequence them to achieve the desired test conditions. Once configured, these objects are translated into traditional input lists and processed by the legacy simulation codes. This approach minimizes the number of simulation inputs that the engineer must track while configuring an input file. An object-oriented approach to simulation output allows a common set of post-processing functions to perform routine tasks, such as plotting and timeline generation, with minimal sensitivity to the simulation that generated the data. Flight test data may also be translated into the common output class to simplify test reconstruction and analysis.
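
    The object-to-input-list translation described above can be sketched as model classes that serialize themselves into a flat legacy input deck. All class names, keywords, and values here are hypothetical, not the PSE's actual schema:

```python
# Each model object owns its parameters and knows how to flatten itself
# into the keyword lines a legacy simulation input file expects.
class Parachute:
    def __init__(self, name, drag_area_ft2, reef_stages):
        self.name = name
        self.drag_area = drag_area_ft2
        self.reef_stages = reef_stages          # reefing ratios, in order

    def to_input_list(self):
        lines = [f"CHUTE {self.name}", f"CDS {self.drag_area}"]
        lines += [f"REEF {i} {r}" for i, r in enumerate(self.reef_stages, 1)]
        return lines

class Vehicle:
    def __init__(self, name, mass_lb):
        self.name = name
        self.mass = mass_lb

    def to_input_list(self):
        return [f"BODY {self.name}", f"MASS {self.mass}"]

def build_input_file(objects):
    """Flatten a configured test (sequence of model objects) into one deck."""
    return "\n".join(line for obj in objects for line in obj.to_input_list())

deck = build_input_file([Vehicle("capsule", 18000),
                         Parachute("main-1", 5500, [0.1, 0.3])])
print(deck)
```

    The analyst composes objects; only the writer function needs to know each legacy code's input grammar, which is what keeps the number of raw inputs the engineer tracks small.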

  2. A computer simulation of Skylab dynamics and attitude control for performance verification and operational support

    NASA Technical Reports Server (NTRS)

    Buchanan, H.; Nixon, D.; Joyce, R.

    1974-01-01

    A simulation of the Skylab attitude and pointing control system (APCS) is outlined and discussed. Implementation is via a large hybrid computer and includes those factors affecting system momentum management, propellant consumption, and overall vehicle performance. The important features of the flight system are discussed; the mathematical models necessary for this treatment are outlined; and the decisions involved in implementation are discussed. A brief summary of the goals and capabilities of this tool is also included.

  3. Development of Elasto-Acoustic Integral Equation Based Solver to Assess/Simulate Sound Conducting Mechanisms in Human Head

    DTIC Science & Technology

    2013-09-09

    indicates energy flowing into and out of the bone. (b) The average energy flux density through the surface of the cochlear cavity (relative to the incident...simulation tool capable of handling a variety of aspects of wave propagation and the resulting energy flow in a human head subject to an incident...small amounts of energy transferred from air to a dense inhomogeneous object: such small energy flows are relevant only because of the exceedingly high

  4. Evaluation and utilization of beam simulation codes for the SNS ion source and low energy beam transport development

    NASA Astrophysics Data System (ADS)

    Han, B. X.; Welton, R. F.; Stockli, M. P.; Luciano, N. P.; Carmichael, J. R.

    2008-02-01

    Beam simulation codes PBGUNS, SIMION, and LORENTZ-3D were evaluated by modeling the well-diagnosed SNS baseline ion source and low energy beam transport (LEBT) system. An investigation was then conducted using these codes to assist our ion source and LEBT development effort, which is directed at meeting the SNS operational and power-upgrade project goals. A high-efficiency H- extraction system as well as magnetic and electrostatic LEBT configurations capable of transporting up to 100 mA are studied using these simulation tools.

  5. ASTEC and MODEL: Controls software development at Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Downing, John P.; Bauer, Frank H.; Surber, Jeffrey L.

    1993-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software has been under development at the Goddard Space Flight Center (GSFC) for the last three years. The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. ASTEC is meant to be an integrated collection of controls analysis tools for use at the desktop level. MODEL (Multi-Optimal Differential Equation Language) is a translator that converts programs written in the MODEL language to FORTRAN; an upgraded version of MODEL will be merged into ASTEC. MODEL has not been modified since 1981 and has not kept pace with changes in computers or user-interface techniques. This paper describes the changes made to MODEL in order to make it useful in the 90's and how it relates to ASTEC.

  6. Time-Dependent Simulations of Turbopump Flows

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin C.; Kwak, Dochan

    2001-01-01

    The objective of the current effort is to provide a computational framework for design and analysis of the entire fuel supply system of a liquid rocket engine, including high-fidelity unsteady turbopump flow analysis. This capability is needed to support the design of pump sub-systems for advanced space transportation vehicles that are likely to involve liquid propulsion systems. To date, computational tools for design/analysis of turbopump flows are based on relatively lower fidelity methods. An unsteady, three-dimensional viscous flow analysis tool involving stationary and rotational components for the entire turbopump assembly has not been available for real-world engineering applications. The present effort will provide developers with information such as transient flow phenomena at start up, impact of non-uniform inflows, system vibration and impact on the structure. In the proposed paper, the progress toward the capability of complete simulation of the turbopump for a liquid rocket engine is reported. The Space Shuttle Main Engine (SSME) turbopump is used as a test case for evaluation of the hybrid MPI/OpenMP and MLP versions of the INS3D code. The relative motion of the grid systems for the rotor-stator interaction was obtained using overset grid techniques. Time-accuracy of the scheme has been evaluated with simple test cases. Unsteady computations for the SSME turbopump, which contains 114 zones with 34.5 million grid points, are carried out on Origin 2000 systems at NASA Ames Research Center. Results from these time-accurate simulations with moving boundary capability will be presented along with the performance of parallel versions of the code.

  7. A tool to convert CAD models for importation into Geant4

    NASA Astrophysics Data System (ADS)

    Vuosalo, C.; Carlsmith, D.; Dasu, S.; Palladino, K.; LUX-ZEPLIN Collaboration

    2017-10-01

    The engineering design of a particle detector is usually performed in a Computer Aided Design (CAD) program, and simulation of the detector’s performance can be done with a Geant4-based program. However, transferring the detector design from the CAD program to Geant4 can be laborious and error-prone. SW2GDML is a tool that reads a design in the popular SOLIDWORKS CAD program and outputs Geometry Description Markup Language (GDML), used by Geant4 for importing and exporting detector geometries. Other methods for outputting CAD designs are available, such as the STEP format, and tools exist to convert these formats into GDML. However, these conversion methods produce very large and unwieldy designs composed of tessellated solids that can reduce Geant4 performance. In contrast, SW2GDML produces compact, human-readable GDML that employs standard geometric shapes rather than tessellated solids. This paper will describe the development and current capabilities of SW2GDML and plans for its enhancement. The aim of this tool is to automate importation of detector engineering models into Geant4-based simulation programs to support rapid, iterative cycles of detector design, simulation, and optimization.
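
    As a rough illustration of the kind of compact, shape-based output described above, the sketch below emits a GDML-style <box> element with Python's standard XML tools. The element names and attributes follow the general GDML pattern but are simplified, and the solid name is hypothetical:

```python
import xml.etree.ElementTree as ET

def box_solid(name, x, y, z, lunit="mm"):
    """Emit a compact GDML-style <box> element (schema details simplified)."""
    return ET.Element("box", name=name, x=str(x), y=str(y), z=str(z), lunit=lunit)

# A standard-shape solid is a single short element, in contrast to the
# thousands of facets a tessellated-solid export would require.
gdml = ET.Element("gdml")
solids = ET.SubElement(gdml, "solids")
solids.append(box_solid("vessel_wall", 100, 50, 50))
text = ET.tostring(gdml, encoding="unicode")
```

    A full converter would also emit the materials, structure, and setup sections that GDML requires.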

  8. Towards the Integration of APECS with VE-Suite to Create a Comprehensive Virtual Engineering Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCorkle, D.; Yang, C.; Jordan, T.

    2007-06-01

    Modeling and simulation tools are becoming pervasive in the process engineering practice of designing advanced power generation facilities. These tools enable engineers to explore many what-if scenarios before cutting metal or constructing a pilot-scale facility. While such tools enable investigation of crucial plant design aspects, typical commercial process simulation tools such as Aspen Plus®, gPROMS®, and HYSYS® still do not capture some plant design information, including computational fluid dynamics (CFD) models for complex thermal and fluid flow phenomena, economics models for policy decisions, operational data after the plant is constructed, and as-built information for use in as-designed models. Software tools must be created that integrate disparate sources of information if environments are to be constructed where process simulation information can be accessed. At the Department of Energy's (DOE) National Energy Technology Laboratory (NETL), the Advanced Process Engineering Co-Simulator (APECS) has been developed as an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulation (e.g., Fluent® CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper, we discuss the initial phases of integrating APECS with the immersive and interactive virtual engineering software VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite utilizes the ActiveX (OLE Automation) controls in Aspen Plus, wrapped by the CASI library developed by Reaction Engineering International, to run the process simulation and query for unit operation results. This integration permits any application that uses the VE-Open interface to integrate with APECS co-simulations, enabling construction of the comprehensive virtual engineering environment needed for the rapid engineering of advanced power generation facilities.
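
    The process/equipment co-simulation described above can be caricatured as a fixed-point iteration that exchanges a boundary value between two models until it stops changing. Both models below are invented stand-ins, not the APECS, Aspen Plus, or Fluent interfaces:

```python
def process_model(equipment_outlet_temp):
    # Hypothetical lumped process model: returns the stream temperature fed
    # to the equipment model, given the equipment outlet it last received.
    return 300.0 + 0.5 * (equipment_outlet_temp - 300.0)

def equipment_model(inlet_temp):
    # Hypothetical surrogate for a CFD-level equipment model.
    return 0.8 * inlet_temp + 70.0

def co_simulate(tol=1e-6, max_iter=100):
    """Fixed-point iteration between the two models until the exchanged
    boundary value stops changing: the basic loop behind co-simulation."""
    outlet = 300.0
    for _ in range(max_iter):
        inlet = process_model(outlet)
        new_outlet = equipment_model(inlet)
        if abs(new_outlet - outlet) < tol:
            return new_outlet
        outlet = new_outlet
    raise RuntimeError("coupling did not converge")

outlet_temp = co_simulate()
```

    Because the combined update is a contraction (slope 0.4), the loop converges to the unique coupled solution.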

  9. Modeling Production Plant Forming Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhee, M; Becker, R; Couch, R

    2004-09-22

    Engineering has simulation tools and experience in modeling forming processes. Y-12 personnel have expressed interest in validating our tools and experience against their manufacturing process activities, such as rolling, casting, and forging. We have demonstrated numerical capabilities in a collaborative DOE/OIT project with ALCOA that is nearing successful completion. The goal was to use ALE3D to model Alcoa's slab rolling process in order to demonstrate a computational tool that would allow Alcoa to define a rolling schedule that minimizes the probability of ingot fracture, thus reducing waste and energy consumption. This work is intended to lead to long-term collaboration with Y-12 and perhaps involvement with other components of the weapons production complex. Using simulations to aid in the design of forming processes can decrease time to production; reduce forming trials and associated expenses; and guide development of products with greater uniformity and less scrap.

  10. Design and simulation of EVA tools for first servicing mission of HST

    NASA Technical Reports Server (NTRS)

    Naik, Dipak; Dehoff, P. H.

    1993-01-01

    The Hubble Space Telescope (HST) was launched into near-earth orbit by the space shuttle Discovery on April 24, 1990. The payload of two cameras, two spectrographs, and a high-speed photometer is supplemented by three fine-guidance sensors that can be used for astronomy as well as for star tracking. A widely reported spherical aberration in the primary mirror causes HST to produce images of much lower quality than intended. A space shuttle repair mission in late 1993 will install small corrective mirrors that will restore the full intended optical capability of the HST. The first servicing mission (FSM) will involve considerable extravehicular activity (EVA). It is proposed to design special EVA tools for the FSM. This report includes details of the data acquisition system being developed to test the performance of the various EVA tools in ambient as well as simulated space environment.

  11. Surgical scissors extension adds the 7th axis of force feedback to the Freedom 6S.

    PubMed

    Powers, Marilyn J; Sinclair, Ian P W; Brouwer, Iman; Laroche, Denis

    2007-01-01

    A virtual reality surgical simulator ideally allows seamless transition between the real and virtual world. In that respect, all of a surgeon's motions and tools must be simulated. Until now, researchers have been limited to using a pen-like tool in six degrees of freedom. This paper presents the addition of haptically enabled scissors to the end effector of a 6-DOF haptic device, the Freedom 6S. The scissors can apply a maximum pinch torque of 460 mN·m with low inertia and low back-drive friction. The device is balanced so that the user feels as though they are holding nothing more than actual scissors, aside from some added inertia at the load end. The system is interchangeable between the 6-DOF and 7-DOF configurations to allow tools to be switched quickly.

  12. Uncertainty quantification's role in modeling and simulation planning, and credibility assessment through the predictive capability maturity model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rider, William J.; Witkowski, Walter R.; Mousseau, Vincent Andrew

    2016-04-13

    The importance of credible, trustworthy numerical simulations is obvious, especially when the results are used for making high-consequence decisions. Determining the credibility of such numerical predictions is much more difficult and requires a systematic approach to assessing predictive capability, associated uncertainties, and overall confidence in the computational simulation process for the intended use of the model. This process begins with an evaluation of the computational modeling of the identified, important physics of the simulation for its intended use, commonly done through a Phenomena Identification and Ranking Table (PIRT). An assessment of the evidence basis supporting the ability to computationally simulate these physics can then be performed using frameworks such as the Predictive Capability Maturity Model (PCMM). Several critical activities follow in the areas of code and solution verification, validation, and uncertainty quantification, which are described in detail in the following sections. Here, we introduce the subject matter for general applications, but specifics are given for the failure prediction project. The first task that must be completed in the verification and validation procedure is a credibility assessment to fully understand the requirements and limitations of the current computational simulation capability for the specific intended application. The PIRT and PCMM are tools used at Sandia National Laboratories (SNL) to perform such an assessment in a consistent manner. Ideally, all stakeholders should be represented and contribute to an accurate credibility assessment. PIRTs and PCMMs are both described briefly below, and the resulting assessments for an example project are given.
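
    The PIRT screening step described above can be sketched as a simple ranking of phenomena by the gap between importance and knowledge adequacy. The phenomena and scores below are illustrative only:

```python
def pirt_gaps(phenomena):
    """Flag phenomena whose importance outstrips the adequacy of current
    knowledge, the basic screening step of a PIRT exercise.
    Scores are illustrative: 1 (low) to 3 (high)."""
    return sorted(
        (p for p, s in phenomena.items() if s["importance"] > s["knowledge"]),
        key=lambda p: phenomena[p]["importance"] - phenomena[p]["knowledge"],
        reverse=True,
    )

phenomena = {
    "crack initiation":  {"importance": 3, "knowledge": 1},
    "elastic response":  {"importance": 2, "knowledge": 3},
    "plastic hardening": {"importance": 3, "knowledge": 2},
}
gaps = pirt_gaps(phenomena)
```

    The resulting ordered list identifies where validation evidence is most urgently needed.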

  13. PowderSim: Lagrangian Discrete and Mesh-Free Continuum Simulation Code for Cohesive Soils

    NASA Technical Reports Server (NTRS)

    Johnson, Scott; Walton, Otis; Settgast, Randolph

    2013-01-01

    PowderSim is a calculation tool that combines a discrete-element method (DEM) module, including calibrated interparticle-interaction relationships, with a mesh-free, continuum, SPH (smoothed-particle hydrodynamics) based module that utilizes enhanced, calibrated, constitutive models capable of mimicking both large deformations and the flow behavior of regolith simulants and lunar regolith under conditions anticipated during in situ resource utilization (ISRU) operations. The major innovation introduced in PowderSim is to use a mesh-free method (SPH-based) with a calibrated and slightly modified critical-state soil mechanics constitutive model to extend the ability of the simulation tool to also address full-scale engineering systems in the continuum sense. The PowderSim software maintains the ability to address particle-scale problems, like size segregation, in selected regions with a traditional DEM module, which has improved contact physics and electrostatic interaction models.
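
    The DEM module described above resolves particle-scale contacts; a minimal sketch of a DEM normal-contact model (a linear spring-dashpot, with illustrative rather than calibrated parameters) for a head-on collision of two spheres:

```python
import numpy as np

def dem_head_on(v0=1.0, k=1e4, c=5.0, m=0.01, r=0.005, dt=1e-5, steps=4000):
    """Head-on collision of two equal spheres (1D) with a linear
    spring-dashpot normal contact force acting on the overlap."""
    x = np.array([-0.02, 0.02])        # centre positions (m)
    v = np.array([v0, -v0])            # approaching velocities (m/s)
    for _ in range(steps):
        overlap = 2 * r - (x[1] - x[0])
        if overlap > 0:
            rel_v = v[0] - v[1]                # closing speed
            f = k * overlap + c * rel_v        # repulsive normal force
            a = np.array([-f, f]) / m
        else:
            a = np.zeros(2)
        v += a * dt                            # semi-implicit Euler
        x += v * dt
    return v
```

    The dashpot dissipates energy, so the spheres rebound with reduced speed while total momentum is conserved exactly.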

  14. A spectral Poisson solver for kinetic plasma simulation

    NASA Astrophysics Data System (ADS)

    Szeremley, Daniel; Obberath, Jens; Brinkmann, Ralf

    2011-10-01

    Plasma resonance spectroscopy is a well-established plasma diagnostic method, realized in several designs. One of these designs is the multipole resonance probe (MRP). In its idealized, geometrically simplified version, it consists of two dielectrically shielded, hemispherical electrodes to which an RF signal is applied. A numerical tool is under development which is capable of simulating the dynamics of the plasma surrounding the MRP in the electrostatic approximation. In this contribution we concentrate on the specialized Poisson solver for that tool. The plasma is represented by an ensemble of point charges. By expanding both the charge density and the potential into spherical harmonics, a largely analytical solution of the Poisson problem can be employed. For a practical implementation, the expansion must be appropriately truncated. With this spectral solver we are able to efficiently solve the Poisson equation in a kinetic plasma simulation without the need to introduce a spatial discretization.
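
    The spherical-harmonic expansion described above can be illustrated at its lowest orders: outside a localized charge cloud, the monopole and dipole terms already approximate the potential of many point charges. A small numpy sketch (Gaussian-style units with 4*pi*eps0 = 1; charges and positions invented):

```python
import numpy as np

rng = np.random.default_rng(0)
pos = rng.uniform(-0.1, 0.1, size=(50, 3))   # charges confined near the origin
q = rng.uniform(-1.0, 1.0, size=50)

# Lowest-order terms of the spherical-harmonic (multipole) expansion.
Q = q.sum()                                  # monopole moment
p = (q[:, None] * pos).sum(axis=0)           # dipole moment

def phi_multipole(r_vec):
    """Truncated expansion: monopole + dipole, valid far outside the cloud."""
    r = np.linalg.norm(r_vec)
    return Q / r + p @ r_vec / r**3

def phi_direct(r_vec):
    """Direct pairwise Coulomb sum, for comparison."""
    return np.sum(q / np.linalg.norm(r_vec - pos, axis=1))

r_far = np.array([3.0, 2.0, 1.0])            # evaluation point outside the cloud
```

    The truncation error is set by the first neglected (quadrupole) term, which falls off as 1/r^3.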

  15. Cost effective simulation-based multiobjective optimization in the performance of an internal combustion engine

    NASA Astrophysics Data System (ADS)

    Aittokoski, Timo; Miettinen, Kaisa

    2008-07-01

    Solving real-life engineering problems can be difficult because they often have multiple conflicting objectives, the objective functions involved are highly nonlinear, and they contain multiple local minima. Furthermore, function values are often produced via a time-consuming simulation process. These facts suggest the need for an automated optimization tool that is efficient (in terms of the number of objective function evaluations) and capable of solving global and multiobjective optimization problems. In this article, the requirements on a general simulation-based optimization system are discussed and such a system is applied to optimize the performance of a two-stroke combustion engine. In the example problem, the dimensions and shape of the exhaust pipe of a two-stroke engine are altered, and the values of three conflicting objective functions, derived from the power output characteristics of the engine, are optimized. The optimization approach involves interactive multiobjective optimization and provides a convenient tool for balancing between conflicting objectives and finding good solutions.
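
    The multiobjective side of the approach above rests on Pareto dominance; a small sketch that filters sampled designs down to the nondominated set, with mock objective functions standing in for the engine simulator:

```python
import numpy as np

def nondominated(F):
    """Indices of Pareto-optimal rows of F, minimizing every column."""
    keep = []
    for i, fi in enumerate(F):
        dominated = any(
            np.all(fj <= fi) and np.any(fj < fi)
            for j, fj in enumerate(F) if j != i
        )
        if not dominated:
            keep.append(i)
    return keep

# Mock "simulation": three conflicting objectives of a 2D design vector.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(200, 2))
F = np.column_stack([
    X[:, 0],                      # objective 1 (e.g., fuel consumption)
    1 - X[:, 0] + 0.1 * X[:, 1],  # objective 2, conflicting with 1
    (X[:, 1] - 0.5) ** 2,         # objective 3
])
front = nondominated(F)
```

    An interactive method then lets the decision maker trade off along this front rather than search the full design space.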

  16. SpectraPlot.com: Integrated spectroscopic modeling of atomic and molecular gases

    NASA Astrophysics Data System (ADS)

    Goldenstein, Christopher S.; Miller, Victor A.; Mitchell Spearrin, R.; Strand, Christopher L.

    2017-10-01

    SpectraPlot is a web-based application for simulating spectra of atomic and molecular gases. At the time this manuscript was written, SpectraPlot consisted of four primary tools for calculating: (1) atomic and molecular absorption spectra, (2) atomic and molecular emission spectra, (3) transition linestrengths, and (4) blackbody emission spectra. These tools currently employ the NIST ASD, HITRAN2012, and HITEMP2010 databases to perform line-by-line simulations of spectra. SpectraPlot employs a modular, integrated architecture, enabling multiple simulations across multiple databases and/or thermodynamic conditions to be visualized in an interactive plot window. The primary objective of this paper is to describe the architecture and spectroscopic models employed by SpectraPlot in order to provide its users with the knowledge required to understand the capabilities and limitations of simulations performed using SpectraPlot. Further, this manuscript discusses the accuracy of several underlying approximations used to decrease computational time, in particular, the use of far-wing cutoff criteria.
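
    A line-by-line calculation of the kind described above sums one profile per transition, and a far-wing cutoff truncates each line's contribution at a fixed detuning. The sketch below uses Lorentzian profiles and an invented two-line list, not HITRAN data:

```python
import numpy as np

def absorbance(nu, centers, strengths, gamma, wing_cutoff=None):
    """Line-by-line absorbance from Lorentzian profiles; an optional far-wing
    cutoff zeroes each line beyond a fixed detuning, trading wing accuracy
    for speed."""
    A = np.zeros_like(nu)
    for c, s in zip(centers, strengths):
        prof = (s / np.pi) * gamma / ((nu - c) ** 2 + gamma**2)
        if wing_cutoff is not None:
            prof[np.abs(nu - c) > wing_cutoff] = 0.0
        A += prof
    return A

nu = np.linspace(2000.0, 2010.0, 2001)        # wavenumber grid, cm^-1
centers = np.array([2002.5, 2006.0])          # made-up line positions
strengths = np.array([1.0, 0.5])
full = absorbance(nu, centers, strengths, gamma=0.05)
cut = absorbance(nu, centers, strengths, gamma=0.05, wing_cutoff=1.0)
```

    Near line centers the two results agree closely; the cutoff only removes the weak far wings, which is the approximation whose accuracy the paper discusses.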

  17. Utilization of a CRT display light pen in the design of feedback control systems

    NASA Technical Reports Server (NTRS)

    Thompson, J. G.; Young, K. R.

    1972-01-01

    A hierarchical structure of interlinked programs was developed to provide a flexible computer-aided design tool. A graphical input technique and a data structure are considered which provide the capability of entering the control system model description into the computer in block diagram form. An information storage and retrieval system was developed to keep track of the system description and of analysis and simulation results, and to provide them to the correct routines for further manipulation or display. Error analysis and diagnostic capabilities are discussed, and a technique was developed to reduce a transfer function to a set of nested integrals suitable for digital simulation. A general, automated block diagram reduction procedure was set up to prepare the system description for the analysis routines.
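
    Reducing a transfer function to nested integrals, as described above, corresponds to the controllable canonical (integrator-chain) state-space realization; a sketch that builds it from polynomial coefficients and steps a first-order example with forward Euler:

```python
import numpy as np

def tf_to_integrator_chain(num, den):
    """Controllable-canonical (integrator-chain) realization of
    H(s) = num(s)/den(s), with deg(num) < deg(den)."""
    num = np.asarray(num, float)
    den = np.asarray(den, float)
    den = den / den[0]                  # make denominator monic
    n = len(den) - 1
    A = np.zeros((n, n))
    A[:-1, 1:] = np.eye(n - 1)          # chain of integrators
    A[-1, :] = -den[::-1][:-1]          # -a0, -a1, ... along the bottom row
    B = np.zeros(n); B[-1] = 1.0
    C = np.zeros(n); C[: len(num)] = num[::-1]
    return A, B, C

# First-order lag H(s) = 1/(s + 1): unit-step response approaches 1.
A, B, C = tf_to_integrator_chain([1.0], [1.0, 1.0])
x, dt = np.zeros(1), 0.001
for _ in range(10000):
    x = x + dt * (A @ x + B * 1.0)      # forward-Euler digital simulation
y = C @ x
```

    Each state in this form is literally the integral of the next, which is what makes the realization convenient for digital simulation.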

  18. New Integrated Modeling Capabilities: MIDAS' Recent Behavioral Enhancements

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.; Jarvis, Peter A.

    2005-01-01

    The Man-machine Integration Design and Analysis System (MIDAS) is an integrated human performance modeling software tool that is based on mechanisms that underlie and cause human behavior. A PC-Windows version of MIDAS has been created that integrates the anthropometric character "Jack (TM)" with MIDAS' validated perceptual and attention mechanisms. MIDAS now models multiple simulated humans engaging in goal-related behaviors. New capabilities include the ability to predict situations in which errors and/or performance decrements are likely due to a variety of factors including concurrent workload and performance influencing factors (PIFs). This paper describes a new model that predicts the effects of microgravity on a mission specialist's performance, and its first application to simulating the task of conducting a Life Sciences experiment in space according to a sequential or parallel schedule of performance.

  19. 3D Multispecies Nonlinear Perturbative Particle Simulation of Intense Nonneutral Particle Beams (Research supported by the Department of Energy and the Short Pulse Spallation Source Project and LANSCE Division of LANL.)

    NASA Astrophysics Data System (ADS)

    Qin, Hong; Davidson, Ronald C.; Lee, W. Wei-Li

    1999-11-01

    The Beam Equilibrium Stability and Transport (BEST) code, a 3D multispecies nonlinear perturbative particle simulation code, has been developed to study collective effects in intense charged particle beams described self-consistently by the Vlasov-Maxwell equations. A Darwin model is adopted for transverse electromagnetic effects. As a 3D multispecies perturbative particle simulation code, it provides several unique capabilities. Since the simulation particles are used to represent only the perturbed distribution function and self-fields, the simulation noise is reduced significantly. The perturbative approach also enables the code to investigate different physics effects separately, as well as simultaneously. The code can be easily switched between linear and nonlinear operation, and used to study both linear stability properties and nonlinear beam dynamics. These features, combined with 3D and multispecies capabilities, provide an effective tool to investigate the electron-ion two-stream instability, periodically focused solutions in alternating focusing fields, and many other important problems in nonlinear beam dynamics and accelerator physics. Applications to the two-stream instability are presented.
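
    The noise-reduction argument for the perturbative (delta-f) approach can be demonstrated on a toy moment estimate: markers sampled from the equilibrium carry weights representing only the perturbation, so the statistical error scales with the perturbation rather than the full distribution. The perturbation below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
eps = 0.01                 # small perturbation amplitude: f = f0 * (1 + eps*v^2)
reps, n = 200, 2000
full_f, delta_f = [], []
for _ in range(reps):
    v = rng.standard_normal(n)          # markers sampled from f0 = N(0, 1)
    w = eps * v**2                      # delta-f weight: (f - f0)/f0 at marker
    # Perturbed part of the second moment; exact value is 3*eps.
    full_f.append(np.mean((1 + w) * v**2) - 1.0)  # full-f minus analytic f0 part
    delta_f.append(np.mean(w * v**2))             # markers carry only delta-f
```

    Both estimators are unbiased, but the delta-f variance is smaller by roughly a factor of eps squared, which is the noise reduction the abstract refers to.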

  20. Integrated Simulation Design Challenges to Support TPS Repair Operations

    NASA Technical Reports Server (NTRS)

    Quiocho, Leslie J.; Crues, Edwin Z.; Huynh, An; Nguyen, Hung T.; MacLean, John

    2005-01-01

    During the Orbiter Repair Maneuver (ORM) operations planned for Return to Flight (RTF), the Shuttle Remote Manipulator System (SRMS) must grapple the International Space Station (ISS), undock the Orbiter, maneuver it through a long-duration trajectory, and orient it to an EVA crewman poised at the end of the Space Station Remote Manipulator System (SSRMS) to facilitate the repair of the Thermal Protection System (TPS). Once the repair has been completed and confirmed, the SRMS proceeds back through the trajectory to dock the Orbiter to the Orbiter Docking System. In order to support analysis of the complex dynamic interactions of the integrated system formed by the Orbiter, ISS, SRMS, and SSRMS during the ORM, simulation tools used for previous 'nominal' mission support required substantial enhancements. These upgrades were necessary to provide analysts with the capabilities needed to study integrated system performance. This paper discusses the simulation design challenges encountered while developing simulation capabilities to mirror the ORM operations. The paper also describes the incremental build approach that was utilized, starting with the subsystem simulation elements and integrating them into increasingly complex simulations until the full ORM worksite dynamics simulation had been assembled. Furthermore, the paper presents an overall integrated simulation V&V methodology based upon subsystem-level testing, integrated comparisons, and phased checkout.

  1. Monte Carlo simulations of precise timekeeping in the Milstar communication satellite system

    NASA Technical Reports Server (NTRS)

    Camparo, James C.; Frueholz, R. P.

    1995-01-01

    The Milstar communications satellite system will provide secure antijam communication capabilities for DOD operations into the next century. In order to accomplish this task, the Milstar system will employ precise timekeeping on its satellites and at its ground control stations. The constellation will consist of four satellites in geosynchronous orbit, each carrying a set of four rubidium (Rb) atomic clocks. Several times a day, during normal operation, the Mission Control Element (MCE) will collect timing information from the constellation, and after several days use this information to update the time and frequency of the satellite clocks. The MCE will maintain precise time with a cesium (Cs) atomic clock, synchronized to UTC(USNO) via a GPS receiver. We have developed a Monte Carlo simulation of Milstar's space segment timekeeping. The simulation includes the effects of: uplink/downlink time transfer noise; satellite crosslink time transfer noise; satellite diurnal temperature variations; satellite and ground station atomic clock noise; and also quantization limits regarding satellite time and frequency corrections. The Monte Carlo simulation capability has proven to be an invaluable tool in assessing the performance characteristics of various timekeeping algorithms proposed for Milstar, and also in highlighting the timekeeping capabilities of the system. Here, we provide a brief overview of the basic Milstar timekeeping architecture as it is presently envisioned. We then describe the Monte Carlo simulation of space segment timekeeping, and provide examples of the simulation's efficacy in resolving timekeeping issues.
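
    The timekeeping Monte Carlo described above can be sketched as a clock with random-walk frequency noise that is periodically steered using noisy time-transfer measurements, with corrections quantized to the smallest commandable step. All noise levels below are illustrative, not Milstar parameters:

```python
import numpy as np

def simulate_time_error(days=30, dt=300.0, update_every_days=1.0,
                        rw_freq=1e-13, transfer_noise=5e-9, quantum=1e-9,
                        seed=0):
    """Worst-case satellite time error (s) over one Monte Carlo run of a
    clock with random-walk frequency noise and periodic quantized steering."""
    rng = np.random.default_rng(seed)
    steps = int(days * 86400 / dt)
    update_every = int(update_every_days * 86400 / dt)
    y, x = 0.0, 0.0                    # fractional frequency, time error (s)
    worst = 0.0
    for k in range(1, steps + 1):
        y += rw_freq * np.sqrt(dt) * rng.standard_normal()
        x += y * dt
        if k % update_every == 0:
            measured = x + transfer_noise * rng.standard_normal()
            correction = quantum * round(measured / quantum)
            x -= correction            # steer time by the quantized estimate
        worst = max(worst, abs(x))
    return worst
```

    Comparing frequent against infrequent steering illustrates how the update schedule bounds the accumulated time error.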

  2. Analyzing the effectiveness of flare dispensing programs against pulse width modulation seekers using self-organizing maps

    NASA Astrophysics Data System (ADS)

    Şahingil, Mehmet C.; Aslan, Murat Ş.

    2013-10-01

    Infrared guided missile seekers that utilize pulse width modulation in target tracking are one of the threats against air platforms. To achieve "soft-kill" protection of one's own platform against these types of threats, one needs to examine carefully the seeker operating principle, with its special electronic counter-countermeasure (ECCM) capability. One of the cost-effective ways of soft-kill protection is to use flare decoys in accordance with an optimized dispensing program. Such an optimization requires a good understanding of the threat seeker, the capabilities of the air platform, and the engagement scenario between them. Modeling and simulation is a very powerful tool for achieving valuable insight and understanding the underlying phenomenology. A careful interpretation of simulation results is crucial to infer valuable conclusions from the data. In such an interpretation there are many factors (features) which affect the results. Therefore, powerful statistical tools and pattern recognition algorithms are of special interest in the analysis. In this paper, we show how self-organizing maps (SOMs), one of those powerful tools, can be used in analyzing the effectiveness of various flare dispensing programs against a PWM seeker. We perform several Monte Carlo runs for a typical engagement scenario in a MATLAB-based simulation environment. In each run, we randomly change the flare dispensing program and obtain the corresponding class, "successful" or "unsuccessful", depending on whether the flare dispensing program deceives the seeker or not. Then, in the analysis phase, we use SOMs to interpret and visualize the results.
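
    A minimal SOM of the kind used above: grid prototypes are pulled toward each sample, with a neighborhood that shrinks over training. The "flare program" feature vectors below are random stand-ins for the real simulation outputs:

```python
import numpy as np

def train_som(data, grid=(6, 6), iters=2000, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal self-organizing map: each grid node holds a prototype vector;
    the best-matching unit and its neighbours are pulled toward each sample."""
    rng = np.random.default_rng(seed)
    H, W = grid
    weights = rng.random((H, W, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(H), np.arange(W),
                                  indexing="ij"), axis=-1)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        d = np.linalg.norm(weights - x, axis=2)
        bmu = np.unravel_index(np.argmin(d), d.shape)   # best-matching unit
        lr = lr0 * np.exp(-t / iters)                   # decaying learning rate
        sigma = sigma0 * np.exp(-t / iters)             # shrinking neighbourhood
        dist2 = ((coords - np.array(bmu)) ** 2).sum(axis=2)
        h = np.exp(-dist2 / (2 * sigma**2))
        weights += lr * h[..., None] * (x - weights)
    return weights

def bmu_of(weights, x):
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

# Stand-in feature vectors for flare programs, e.g. [number of flares,
# dispense interval, start range], scaled to [0, 1].
rng = np.random.default_rng(3)
programs = rng.random((100, 3))
som = train_som(programs)
```

    Coloring each map cell by the fraction of "successful" programs it attracts then visualizes which regions of the program space defeat the seeker.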

  3. Full 3D opto-electronic simulation tool for nanotextured solar cells (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Michallon, Jérôme; Collin, Stéphane

    2017-04-01

    Increasing efforts in photovoltaics research have recently been devoted to material savings, leading to the emergence of new designs based on nanotextured and nanowire-based solar cells. The use of small absorber volumes, light-trapping nanostructures, and unconventional carrier collection schemes (radial nanowire junctions, point contacts in planar structures, etc.) increases the impact of surface recombination and induces inhomogeneity in the photogenerated carrier concentrations. Investigating their impact on device performance requires full 3D coupled opto-electrical modeling. In this context, we have developed a new tool for full 3D opto-electrical simulation using the most advanced optical and electrical simulation techniques. We will present an overview of its simulation capabilities and the key issues that have been solved to make it fully operational and reliable. We will provide various examples of opto-electronic simulation of (i) nanostructured solar cells with localized contacts and (ii) nanowire solar cells. We will also show how opto-electronic simulation can be used to simulate light- and electron-beam-induced current (LBIC/EBIC) experiments, targeting quantitative analysis of the passivation properties of surfaces.

  4. Flooding Capability for River-based Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L.; Prescott, Steven; Ryan, Emerald

    2015-10-01

    This report describes the initial investigation into modeling and simulation tools for representing riverine flooding as part of the Risk-Informed Safety Margin Characterization (RISMC) Pathway external hazards evaluations. The report provides examples of different flooding conditions and scenarios that could impact river and watershed systems. Both 2D and 3D modeling approaches are described.

  5. Training Community Modeling and Simulation Business Plan: 2008 Edition

    DTIC Science & Technology

    2009-12-01

    Collaborative information environment. Collaborative tools will help CCDRs and joint staffs plan and disseminate operations, link the staffs to subject matter...anticipating direct and indirect effects as they propagate through political, military, economic, sociological, and information infrastructures. Capabilities...will also enhance training for joint staffs and task forces; crisis management; JUO; information warfare; interagency, intergovernmental, and

  6. 50 Years of Army Computing From ENIAC to MSRC

    DTIC Science & Technology

    2000-09-01

    processing capability. The scientific visualization program was started in 1984 to provide tools and expertise to help researchers graphically...and materials, forces modeling, nanoelectronics, electromagnetics and acoustics, signal and image processing, and simulation and modeling. The ARL...mechanical and electrical calculating equipment, punch card data processing equipment, analog computers, and early digital machines. Before beginning, we

  7. Electronic Design Automation: Integrating the Design and Manufacturing Functions

    NASA Technical Reports Server (NTRS)

    Bachnak, Rafic; Salkowski, Charles

    1997-01-01

    As the complexity of electronic systems grows, the traditional design practice, a sequential process, is replaced by concurrent design methodologies. A major advantage of concurrent design is that the feedback from software and manufacturing engineers can be easily incorporated into the design. The implementation of concurrent engineering methodologies is greatly facilitated by employing the latest Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and support virtual prototyping, rapid prototyping, and hardware-software co-design. This report presents recommendations for enhancing the electronic design and manufacturing capabilities and procedures at JSC based on a concurrent design methodology that employs EDA tools.

  8. Development and evaluation of the Screening Trajectory Ozone Prediction System (STOPS, version 1.0)

    NASA Astrophysics Data System (ADS)

    Czader, B. H.; Percell, P.; Byun, D.; Choi, Y.

    2014-11-01

    A hybrid Lagrangian-Eulerian modeling tool has been developed using the Eulerian framework of the Community Multiscale Air Quality (CMAQ) model. It is a moving nest that utilizes saved original CMAQ simulation results to provide the boundary conditions, initial conditions, emissions, and meteorological parameters necessary for a simulation. Given that these files are available, this tool can run independently of the whole-domain CMAQ simulation, and it is designed to simulate source-receptor relationships upon changes in emissions. In this tool, the original CMAQ horizontal domain is reduced to a small sub-domain that follows a trajectory defined by the mean mixed-layer wind. It has the same vertical structure and physical and chemical interactions as CMAQ, except for the advection calculation. The advantage of this tool compared to other Lagrangian models is its capability of utilizing realistic boundary conditions that change with space and time, as well as its detailed chemistry treatment. The correctness of the algorithms and the overall performance were evaluated against CMAQ simulation results. Its performance depends on the atmospheric conditions occurring during the simulation period, with the comparisons being most similar to CMAQ results under uniform wind conditions. The mean bias varies between -0.03 and -0.78 and the slope is between 0.99 and 1.01 for the different analyzed cases. For complicated meteorological conditions, such as wind circulation, the simulated mixing ratios deviate from CMAQ values as a result of the Lagrangian approach of using the mean wind for the nest's movement, but are still close, with the mean bias varying between 0.07 and -4.29 and the slope between 0.95 and 1.063 for the different analyzed cases.
For historical reasons, this hybrid Lagrangian-Eulerian tool is named the Screening Trajectory Ozone Prediction System (STOPS), but its use is not limited to ozone prediction since, like CMAQ, it can simulate concentrations of many species, including particulate matter and some toxic compounds, such as formaldehyde and 1,3-butadiene.
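
    The two summary statistics quoted above (mean bias and slope) can be computed as follows; the model and reference values are invented for illustration:

```python
import numpy as np

def bias_and_slope(model, reference):
    """Mean bias and least-squares regression slope of model vs. reference,
    the summary statistics used in the STOPS-vs-CMAQ comparison."""
    bias = np.mean(model - reference)
    slope, _intercept = np.polyfit(reference, model, 1)
    return bias, slope

ref = np.linspace(20.0, 80.0, 50)          # hypothetical CMAQ ozone, ppb
mod = 1.01 * ref - 0.5                     # hypothetical moving-nest output
b, s = bias_and_slope(mod, ref)
```

    A slope near 1 with a small bias indicates the nest tracks the full-domain model closely.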

  9. Next generation simulation tools: the Systems Biology Workbench and BioSPICE integration.

    PubMed

    Sauro, Herbert M; Hucka, Michael; Finney, Andrew; Wellock, Cameron; Bolouri, Hamid; Doyle, John; Kitano, Hiroaki

    2003-01-01

    Researchers in quantitative systems biology make use of a large number of different software packages for modelling, analysis, visualization, and general data manipulation. In this paper, we describe the Systems Biology Workbench (SBW), a software framework that allows heterogeneous application components--written in diverse programming languages and running on different platforms--to communicate and use each other's capabilities via a fast, binary encoded-message system. Our goal was to create a simple, high-performance, open-source software infrastructure which is easy to implement and understand. SBW enables applications (potentially running on separate, distributed computers) to communicate via a simple network protocol. The interfaces to the system are encapsulated in client-side libraries that we provide for different programming languages. We describe in this paper the SBW architecture, a selection of current modules, including Jarnac, JDesigner, and SBWMeta-tool, and the close integration of SBW into BioSPICE, which enables both frameworks to share tools and complement and strengthen each other's capabilities.
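
    The fast binary encoded-message idea can be sketched as simple length-prefixed framing; this is an illustration only, not the actual SBW wire format:

```python
import struct

def encode(service, method, payload):
    """Frame a call as binary: two length-prefixed UTF-8 strings (service and
    method names) followed by the raw payload bytes."""
    s, m = service.encode(), method.encode()
    return struct.pack(f"!HH{len(s)}s{len(m)}s", len(s), len(m), s, m) + payload

def decode(msg):
    """Invert encode(): recover service name, method name, and payload."""
    ls, lm = struct.unpack_from("!HH", msg)
    service = msg[4:4 + ls].decode()
    method = msg[4 + ls:4 + ls + lm].decode()
    return service, method, msg[4 + ls + lm:]
```

    Because the framing is plain bytes, components written in any language can produce and consume it, which is the interoperability point the paper makes.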

  10. Physics-based interactive volume manipulation for sharing surgical process.

    PubMed

    Nakao, Megumi; Minato, Kotaro

    2010-05-01

    This paper presents a new set of techniques by which surgeons can interactively manipulate patient-specific volumetric models in order to share the surgical process. To handle physical interaction between surgical tools and organs, we propose a simple surface-constraint-based manipulation algorithm to consistently simulate common surgical manipulations such as grasping, holding, and retraction. Our computational model is capable of simulating soft-tissue deformation and incision in real time. We also present visualization techniques to rapidly render time-varying volumetric information on the deformed image. This paper demonstrates the success of the proposed methods in enabling the simulation of surgical processes, and the ways in which this simulation facilitates preoperative planning and rehearsal.

  11. Monte Carlo Methodology Serves Up a Software Success

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Widely used for the modeling of gas flows through the computation of the motion and collisions of representative molecules, the Direct Simulation Monte Carlo method has become the gold standard for producing research and engineering predictions in the field of rarefied gas dynamics. Direct Simulation Monte Carlo was first introduced in the early 1960s by Dr. Graeme Bird, a professor at the University of Sydney, Australia. It has since proved to be a valuable tool to the aerospace and defense industries in providing design and operational support data, as well as flight data analysis. In 2002, NASA brought to the forefront a software product that maintains the same basic physics formulation of Dr. Bird's method, but provides effective modeling of complex, three-dimensional, real vehicle simulations and parallel processing capabilities to handle additional computational requirements, especially in areas where computational fluid dynamics (CFD) is not applicable. NASA's Direct Simulation Monte Carlo Analysis Code (DAC) software package is now considered the Agency's premier high-fidelity simulation tool for predicting vehicle aerodynamics and aerothermodynamic environments in rarefied, or low-density, gas flows.
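    The core DSMC idea, free flight of representative molecules punctuated by statistically sampled collisions, can be illustrated by a single, highly simplified hard-sphere collision step. This sketch omits cell sorting, collision-rate sampling, and particle weighting, all of which a production code like DAC requires; it only shows the elastic pair-collision kernel:

```python
import random, math

def dsmc_collision_step(velocities, n_pairs):
    """One simplified hard-sphere DSMC collision step in a single cell:
    randomly chosen pairs scatter their relative velocity into an
    isotropic random direction. For equal-mass elastic collisions the
    pair's centre-of-mass velocity and relative speed are conserved."""
    for _ in range(n_pairs):
        i, j = random.sample(range(len(velocities)), 2)
        vi, vj = velocities[i], velocities[j]
        vcm = [(a + b) / 2.0 for a, b in zip(vi, vj)]
        g = math.dist(vi, vj)                  # relative speed (conserved)
        cos_t = random.uniform(-1.0, 1.0)      # isotropic scattering angle
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        phi = random.uniform(0.0, 2.0 * math.pi)
        gr = [g * sin_t * math.cos(phi), g * sin_t * math.sin(phi), g * cos_t]
        velocities[i] = [c + 0.5 * d for c, d in zip(vcm, gr)]
        velocities[j] = [c - 0.5 * d for c, d in zip(vcm, gr)]

random.seed(1)
v = [[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, -2.0, 0.0]]
energy = lambda vs: sum(x * x for vel in vs for x in vel)
e0 = energy(v)
dsmc_collision_step(v, 2)
assert abs(energy(v) - e0) < 1e-9   # elastic collisions conserve kinetic energy
```

    Because the centre-of-mass velocity and relative speed are preserved, momentum and kinetic energy are conserved exactly per collision, which the final assertion checks.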

  12. A flexible object-oriented software framework for developing complex multimedia simulations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sydelko, P. J.; Dolph, J. E.; Christiansen, J. H.

    Decision makers involved in brownfields redevelopment and long-term stewardship must consider environmental conditions, future-use potential, site ownership, area infrastructure, funding resources, cost recovery, regulations, risk and liability management, community relations, and expected return on investment in a comprehensive and integrated fashion to achieve desired results. Successful brownfields redevelopment requires the ability to assess the impacts of redevelopment options on multiple interrelated aspects of the ecosystem, both natural and societal. Computer-based tools, such as simulation models, databases, and geographical information systems (GISs) can be used to address brownfields planning and project execution. The transparent integration of these tools into a comprehensive and dynamic decision support system would greatly enhance the brownfields assessment process. Such a system needs to be able to adapt to shifting and expanding analytical requirements and contexts. The Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-oriented framework for developing and maintaining complex multidisciplinary simulations of a wide variety of application domains. The modeling domain of a specific DIAS-based simulation is determined by (1) software objects that represent the real-world entities that comprise the problem space (atmosphere, watershed, human), and (2) simulation models and other data processing applications that express the dynamic behaviors of the domain entities. Models and applications used to express dynamic behaviors can be either internal or external to DIAS, including existing legacy models written in various languages (FORTRAN, C, etc.). The flexible design framework of DIAS makes the objects adjustable to the context of the problem without a great deal of recoding.
The DIAS Spatial Data Set facility allows parameters to vary spatially depending on the simulation context according to any of a number of 1-D, 2-D, or 3-D topologies. DIAS is also capable of interacting with other GIS packages and can import many standard spatial data formats. DIAS simulation capabilities can also be extended by including societal process models. Models that implement societal behaviors of individuals and organizations within larger DIAS-based natural systems simulations allow for interaction and feedback among natural and societal processes. The ability to simulate the complex interplay of multimedia processes makes DIAS a promising tool for constructing applications for comprehensive community planning, including the assessment of multiple development and redevelopment scenarios.

  13. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    PubMed

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

    The omnipresent need for optimisation requires constant improvement of companies' business processes (BPs). Minimising the risk of implementing an inappropriate BP is usually done by simulating the newly developed BP under various initial conditions and "what-if" scenarios. An effectual business process simulation software (BPSS) tool is a prerequisite for accurate analysis of a BP. Characterising a BPSS tool is a challenging task due to the complex selection criteria, which include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and the qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results.
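    DEX itself works with qualitative (symbolic) utility functions, so the numeric weighted-average sketch below only illustrates the hierarchical-aggregation idea behind such a decision model; the criteria tree, weights, and 1-5 scores are invented for illustration:

```python
def score(node):
    """Weighted-average roll-up over a criteria tree: leaf nodes carry
    raw scores, internal nodes aggregate their children's weighted
    scores into a single value."""
    if "score" in node:
        return node["score"]
    total_w = sum(child["weight"] for child in node["children"])
    return sum(child["weight"] * score(child)
               for child in node["children"]) / total_w

# invented criteria tree for one hypothetical BPSS tool
tool = {"children": [
    {"weight": 0.4, "children": [          # presentation quality
        {"weight": 0.5, "score": 4},       # visual aspects
        {"weight": 0.5, "score": 3},       # reporting
    ]},
    {"weight": 0.6, "score": 5},           # simulation capabilities
]}
assert abs(score(tool) - 4.4) < 1e-9
```

    Ranking several tools then reduces to evaluating the same tree with each tool's leaf scores and sorting the roll-up values, which is the extensibility the paper highlights: adding a criterion only adds a leaf.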

  14. The development of an autonomous rendezvous and docking simulation using rapid integration and prototyping technology

    NASA Technical Reports Server (NTRS)

    Shackelford, John H.; Saugen, John D.; Wurst, Michael J.; Adler, James

    1991-01-01

    A generic planar 3-degree-of-freedom simulation was developed that supports hardware-in-the-loop simulations and guidance and control analysis, and can directly generate flight software. This simulation was developed in a short amount of time using rapid prototyping techniques. The approach taken to develop this simulation tool, the benefits seen in using this approach, and ongoing efforts to improve and extend this capability are described. The simulation is composed of three major elements: (1) a docker dynamics model, (2) a dockee dynamics model, and (3) a docker control system. The docker and dockee models are based on simple planar orbital dynamics equations using a spherical-Earth gravity model. The docker control system is based on a phase-plane approach to error correction.
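    A phase-plane controller of the kind mentioned for the docker can be sketched as a bang-bang law that fires thrusters only when the (error, error-rate) state leaves a deadband corridor; the switching function, time constant, and deadband below are hypothetical, not the paper's actual controller:

```python
def phase_plane_thrust(err, err_rate, deadband=0.1, tau=2.0):
    """Bang-bang phase-plane controller sketch: thrust opposes the
    switching function s = err + tau * err_rate once the state leaves
    the +/- deadband corridor; inside the corridor the vehicle coasts,
    saving propellant and avoiding thruster chatter."""
    s = err + tau * err_rate
    if s > deadband:
        return -1.0   # retro thrust drives s back toward zero
    if s < -deadband:
        return +1.0
    return 0.0
```

    Including the rate term `tau * err_rate` makes the controller anticipate where the error is heading, so thrust switches off before the state overshoots the target.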

  15. An Overview of the Distributed Space Exploration Simulation (DSES) Project

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Chung, Victoria I.; Blum, Michael G.; Bowman, James D.

    2007-01-01

    This paper describes the Distributed Space Exploration Simulation (DSES) Project, a research and development collaboration between NASA centers which investigates technologies and processes related to integrated, distributed simulation of complex space systems in support of NASA's Exploration Initiative. In particular, it describes the three major components of DSES: network infrastructure, software infrastructure, and simulation development. With regard to network infrastructure, DSES is developing a Distributed Simulation Network for use by all NASA centers. With regard to software, DSES is developing software models, tools, and procedures that streamline distributed simulation development and provide an interoperable infrastructure for agency-wide integrated simulation. Finally, with regard to simulation development, DSES is developing an integrated end-to-end simulation capability to support NASA development of new exploration spacecraft and missions. This paper presents the current status and plans for these three areas, including examples of specific simulations.

  16. Structural Optimization for Reliability Using Nonlinear Goal Programming

    NASA Technical Reports Server (NTRS)

    El-Sayed, Mohamed E.

    1999-01-01

    This report details the development of a reliability-based, multi-objective design tool for solving structural optimization problems. Based on two different optimization techniques, namely sequential unconstrained minimization and nonlinear goal programming, the developed design method can take into account the effects of variability on the proposed design through a user-specified reliability design criterion. In its sequential unconstrained minimization mode, the developed design tool uses a composite objective function, in conjunction with weight-ordered design objectives, in order to take into account conflicting and multiple design criteria. Design criteria of interest include structural weight, load-induced stress and deflection, and mechanical reliability. The nonlinear goal programming mode, on the other hand, provides a design method that eliminates the difficulty of having to define an objective function and constraints, while at the same time having the capability of handling rank-ordered design objectives or goals. For simulation purposes, the design of a pressure vessel cover plate was undertaken as a test bed for the newly developed design tool. The formulation of this structural optimization problem in sequential unconstrained minimization and goal programming form is presented. The resulting optimization problem was solved using (i) the linear extended interior penalty function method algorithm and (ii) Powell's conjugate directions method. Both single- and multi-objective numerical test cases are included, demonstrating the design tool's capabilities as applied to this design problem.
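    The sequential unconstrained minimization idea, folding constraints into a penalty term and re-minimizing with a growing penalty weight, can be sketched on a toy sizing problem. This uses a simple exterior penalty and ternary search rather than the extended interior penalty function and Powell's method used in the report; the "plate thickness" problem is invented:

```python
def minimize_1d(phi, lo, hi, tol=1e-8):
    """Ternary search for the minimum of a unimodal 1-D function."""
    while hi - lo > tol:
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        if phi(m1) < phi(m2):
            hi = m2
        else:
            lo = m1
    return 0.5 * (lo + hi)

def sumt(f, g, lo, hi, r0=1.0, growth=10.0, outer=6):
    """Exterior-penalty SUMT: minimise phi(t) = f(t) + r * max(0, g(t))**2
    for a sequence of growing penalty weights r; as r increases, the
    unconstrained minimiser approaches the constrained optimum."""
    r, x = r0, None
    for _ in range(outer):
        x = minimize_1d(lambda t: f(t) + r * max(0.0, g(t)) ** 2, lo, hi)
        r *= growth
    return x

# toy plate sizing: minimise weight f(t) = t subject to a stress limit
# g(t) = 1/t - 0.5 <= 0, i.e. thickness t >= 2 (constrained optimum t = 2)
t_opt = sumt(lambda t: t, lambda t: 1.0 / t - 0.5, lo=0.5, hi=3.0)
assert abs(t_opt - 2.0) < 1e-2
```

    Each outer iteration is an unconstrained problem, which is exactly why the report can swap in different unconstrained solvers (penalty-function methods, Powell's conjugate directions) for the inner loop.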

  17. Simulation services and analysis tools at the CCMC to study multi-scale structure and dynamics of Earth's magnetopause

    NASA Astrophysics Data System (ADS)

    Kuznetsova, M. M.; Liu, Y. H.; Rastaetter, L.; Pembroke, A. D.; Chen, L. J.; Hesse, M.; Glocer, A.; Komar, C. M.; Dorelli, J.; Roytershteyn, V.

    2016-12-01

    The presentation will provide an overview of new tools, services, and models implemented at the Community Coordinated Modeling Center (CCMC) to facilitate analysis of MMS dayside results. We will provide updates on the implementation of Particle-in-Cell (PIC) simulations at the CCMC and on opportunities for online visualization and analysis of results of PIC simulations of asymmetric magnetic reconnection for different guide fields and boundary conditions. Fields, plasma parameters, and particle distribution moments, as well as particle distribution functions calculated in selected regions in the vicinity of reconnection sites, can be analyzed through the web-based interactive visualization system. In addition, there are options to request distribution functions in user-selected regions of interest and to fly through simulated magnetic reconnection configurations with a map of distributions, facilitating comparisons with observations. A broad collection of global magnetosphere models hosted at the CCMC provides an opportunity to put MMS observations and local PIC simulations into global context. We recently implemented the RECON-X post-processing tool (Glocer et al., 2016), which allows users to determine the location of the separator surface around closed field lines and between open field lines and solar wind field lines. The tool also finds the separatrix line where the two surfaces touch and the positions of magnetic nulls. The surfaces and the separatrix line can be visualized relative to satellite positions in the dayside magnetosphere using an interactive HTML5 visualization for each time step processed. To validate global magnetosphere models' capability to simulate locations of dayside magnetosphere boundaries, we will analyze the proximity of MMS to simulated separatrix locations for a set of MMS diffusion region crossing events.

  18. From Particles and Point Clouds to Voxel Models: High Resolution Modeling of Dynamic Landscapes in Open Source GIS

    NASA Astrophysics Data System (ADS)

    Mitasova, H.; Hardin, E. J.; Kratochvilova, A.; Landa, M.

    2012-12-01

    Multitemporal data acquired by modern mapping technologies provide unique insights into the processes driving land surface dynamics. These high-resolution data also offer an opportunity to improve the theoretical foundations and accuracy of process-based simulations of evolving landforms. We discuss the development of a new generation of visualization and analytics tools for GRASS GIS designed for 3D multitemporal data from repeated lidar surveys and from landscape process simulations. We focus on data and simulation methods that are based on point sampling of continuous fields and lead to representation of evolving surfaces as series of raster map layers or voxel models. For multitemporal lidar data, we present workflows that combine open source point cloud processing tools with GRASS GIS and custom Python scripts to model and analyze the dynamics of coastal topography (Figure 1), and we outline the development of a coastal analysis toolbox. The simulations focus on the particle sampling method for solving continuity equations and its application to geospatial modeling of landscape processes. In addition to water and sediment transport models already implemented in GIS, the new capabilities under development combine OpenFOAM for wind shear stress simulation with a new module for aeolian sand transport and dune evolution simulations. Comparison of observed dynamics with the results of simulations is supported by a new, integrated 2D and 3D visualization interface that provides highly interactive and intuitive access to the redesigned and enhanced visualization tools. Several case studies illustrate the presented methods and tools, demonstrate the power of workflows built with FOSS, and highlight their interoperability. Figure 1: Isosurfaces representing the evolution of the shoreline and the z = 4.5 m contour between the years 1997-2011 at Cape Hatteras, NC, extracted from a voxel model derived from a series of lidar-based DEMs.

  19. The Osseus platform: a prototype for advanced web-based distributed simulation

    NASA Astrophysics Data System (ADS)

    Franceschini, Derrick; Riecken, Mark

    2016-05-01

    Recent technological advances in web-based distributed computing and database technology have made possible a deeper and more transparent integration of some modeling and simulation applications. Despite these advances towards true integration of capabilities, disparate systems, architectures, and protocols will remain in the inventory for some time to come. These disparities present interoperability challenges for distributed modeling and simulation, whether the application is training, experimentation, or analysis. Traditional approaches call for building gateways to bridge between disparate protocols and retaining interoperability specialists. Challenges in reconciling data models also persist. These challenges and their traditional mitigation approaches directly contribute to higher costs, schedule delays, and frustration for the end users. Osseus is a prototype software platform originally funded as a research project by the Defense Modeling & Simulation Coordination Office (DMSCO) to examine interoperability alternatives using modern, web-based technology and taking inspiration from the commercial sector. Osseus provides tools and services for nonexpert users to connect simulations, reducing the time and skill set needed to successfully connect disparate systems. The Osseus platform presents a web services interface that allows simulation applications to efficiently exchange data using modern techniques over local or wide area networks. Further, it provides Service Oriented Architecture capabilities so that finer-granularity components, such as individual models, can contribute to a simulation with minimal effort.

  20. VERSE - Virtual Equivalent Real-time Simulation

    NASA Technical Reports Server (NTRS)

    Zheng, Yang; Martin, Bryan J.; Villaume, Nathaniel

    2005-01-01

    Distributed real-time simulations provide important timing validation and hardware-in-the-loop results for the spacecraft flight software development cycle. Occasionally, the need for higher fidelity modeling and more comprehensive debugging capabilities - combined with a limited amount of computational resources - calls for a non-real-time simulation environment that mimics the real-time environment. By creating a non-real-time environment that accommodates simulations and flight software designed for a multi-CPU real-time system, we can save development time, cut mission costs, and reduce the likelihood of errors. This paper presents such a solution: the Virtual Equivalent Real-time Simulation Environment (VERSE). VERSE turns the real-time operating system RTAI (Real-time Application Interface) into an event-driven simulator that runs in virtual real time. Designed to keep the original RTAI architecture as intact as possible, and therefore inheriting RTAI's many capabilities, VERSE was implemented with remarkably little change to the RTAI source code. This small footprint, together with use of the same API, allows users to easily run the same application in both real-time and virtual-time environments. VERSE has been used to build a workstation testbed for NASA's Space Interferometry Mission (SIM PlanetQuest) instrument flight software. With its flexible simulation controls and inexpensive setup and replication costs, VERSE will become an invaluable tool in future mission development.

  1. Coral Reef Remote Sensing Using Simulated VIIRS and LDCM Imagery

    NASA Technical Reports Server (NTRS)

    Estep, Leland; Spruce, Joseph P.; Blonski, Slawomir; Moore, Roxzana

    2008-01-01

    The Rapid Prototyping Capability (RPC) node at NASA Stennis Space Center, MS, was used to simulate NASA next-generation sensor imagery over well-known coral reef areas: Looe Key, FL, and Kaneohe Bay, HI. The objective was to assess the degree to which next-generation sensor systems, the Visible/Infrared Imager/Radiometer Suite (VIIRS) and the Landsat Data Continuity Mission (LDCM), might provide key input to the National Oceanographic and Atmospheric Administration (NOAA) Integrated Coral Observing Network (ICON)/Coral Reef Early Warning System (CREWS) Decision Support Tool (DST). The DST data layers produced from the simulated imagery concerned water quality and benthic classification map layers. The water optical parameters of interest were chlorophyll (Chl) and the absorption coefficient (a). The input imagery used by the RPC for simulation included spaceborne (Hyperion) and airborne (AVIRIS) hyperspectral data. Specific field data to complement and aid in validation of the overflight data were used when available. The results of the experiment show that the next-generation sensor systems are capable of providing valuable data layer resources to NOAA's ICON/CREWS DST.

  2. Coral Reef Remote Sensing using Simulated VIIRS and LDCM Imagery

    NASA Technical Reports Server (NTRS)

    Estep, Leland; Spruce, Joseph P.

    2007-01-01

    The Rapid Prototyping Capability (RPC) node at NASA Stennis Space Center, MS, was used to simulate NASA next-generation sensor imagery over well-known coral reef areas: Looe Key, FL, and Kaneohe Bay, HI. The objective was to assess the degree to which next-generation sensor systems, the Visible/Infrared Imager/Radiometer Suite (VIIRS) and the Landsat Data Continuity Mission (LDCM), might provide key input to the National Oceanographic and Atmospheric Administration (NOAA) Integrated Coral Observing Network (ICON)/Coral Reef Early Warning System (CREWS) Decision Support Tool (DST). The DST data layers produced from the simulated imagery concerned water quality and benthic classification map layers. The water optical parameters of interest were chlorophyll (Chl) and the absorption coefficient (a). The input imagery used by the RPC for simulation included spaceborne (Hyperion) and airborne (AVIRIS) hyperspectral data. Specific field data to complement and aid in validation of the overflight data were used when available. The results of the experiment show that the next-generation sensor systems are capable of providing valuable data layer resources to NOAA's ICON/CREWS DST.

  3. Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.

    2016-01-01

    An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.
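    The cost argument above, one adjoint solve yields sensitivities with respect to all parameters, versus one extra solve per parameter for finite differences, can be seen on a tiny linear model J(p) = c·u with A u = p. The 2x2 system and numbers are invented for illustration; a real CFD adjoint involves the linearized flow equations rather than a fixed matrix:

```python
def solve2(A, b):
    """Direct 2x2 linear solve via Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

A = [[4.0, 1.0], [2.0, 3.0]]      # fixed "simulation" matrix
c = [1.0, 2.0]                    # output functional J = c . u
p = [1.0, 5.0]                    # design parameters (here: the RHS)

def J(params):
    u = solve2(A, params)         # one forward "simulation"
    return sum(ci * ui for ci, ui in zip(c, u))

# adjoint: ONE extra solve A^T lam = c gives dJ/dp for ALL parameters,
# since J = c . A^{-1} p  implies  dJ/dp = A^{-T} c = lam
At = [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]
lam = solve2(At, c)

# finite differences: one extra forward solve PER parameter
h = 1e-6
fd = []
for i in range(len(p)):
    q = list(p)
    q[i] += h
    fd.append((J(q) - J(p)) / h)

assert all(abs(a - b) < 1e-5 for a, b in zip(lam, fd))
```

    With millions of design parameters the finite-difference column would mean millions of forward solves, while the adjoint column stays at one, which is the scaling property the overview describes.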

  4. L3:PHI.CMD.P13.02 Support for CILC L1 Milestone Using STAR-CCM+

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slattery, Stuart R.; Gurecky, William L.

    2016-10-07

    This report documents work performed to support Consortium for the Advanced Simulation of LWRs (CASL) modeling of Chalk River Unidentified Deposit (CRUD) Induced Power Shift (CIPS) and CRUD Induced Local Corrosion (CILC) using the Cicada package. The work documented here is intended to complement current and future CIPS and CILC modeling activities in CASL. We provide tools for CRUD- and corrosion-related simulation and analysis by developing a better understanding of the interplay between the coupled physics that describe the phenomena at different time and length scales. We intend to use these models to better inform future simulation capability and development.

  5. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGES

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  6. 3CE Methodology for Conducting a Modeling, Simulation, and Instrumentation Tool Capability Analysis

    DTIC Science & Technology

    2010-05-01

    ...a modeling, simulation, and instrumentation (MS&I) environment. This methodology uses the DoDAF product set to document operational and systems... engineering process were identified and resolved, such as duplication of data elements derived from DoDAF operational and system views used to...

  7. Evaluation of CFD Methods for Simulation of Two-Phase Boiling Flow Phenomena in a Helical Coil Steam Generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pointer, William David; Shaver, Dillon; Liu, Yang

    The U.S. Department of Energy, Office of Nuclear Energy charges participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with the development of advanced modeling and simulation capabilities that can be used to address design, performance, and safety challenges in the development and deployment of advanced reactor technology. The NEAMS has established a high impact problem (HIP) team to demonstrate the applicability of these tools to identification and mitigation of sources of steam generator flow induced vibration (SGFIV). The SGFIV HIP team is working to evaluate vibration sources in an advanced helical coil steam generator using computational fluid dynamics (CFD) simulations of the turbulent primary coolant flow over the outside of the tubes and CFD simulations of the turbulent multiphase boiling secondary coolant flow inside the tubes, integrated with high resolution finite element method assessments of the tubes and their associated structural supports. This report summarizes the demonstration of a methodology for the multiphase boiling flow analysis inside the helical coil steam generator tube. A helical coil steam generator configuration has been defined based on the experiments completed by Politecnico di Milano in the SIET helical coil steam generator tube facility. Simulations of the defined problem have been completed using the Eulerian-Eulerian multi-fluid modeling capabilities of the commercial CFD code STAR-CCM+. Simulations suggest that the two phases will quickly stratify in the slightly inclined pipe of the helical coil steam generator. These results have been successfully benchmarked against both empirical correlations for pressure drop and simulations using an alternate CFD methodology, the dispersed phase mixture modeling capabilities of the open source CFD code Nek5000.

  8. Computational Materials Science and Chemistry: Accelerating Discovery and Innovation through Simulation-Based Engineering and Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crabtree, George; Glotzer, Sharon; McCurdy, Bill

    This report is based on a SC Workshop on Computational Materials Science and Chemistry for Innovation on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software.
This rate of improvement, which shows no sign of abating, has enabled the development of computer simulations and models of unprecedented fidelity. We are at the threshold of a new era where the integrated synthesis, characterization, and modeling of complex materials and chemical processes will transform our ability to understand and design new materials and chemistries with predictive power. In turn, this predictive capability will transform technological innovation by accelerating the development and deployment of new materials and processes in products and manufacturing. Harnessing the potential of computational science and engineering for the discovery and development of materials and chemical processes is essential to maintaining leadership in these foundational fields that underpin energy technologies and industrial competitiveness. Capitalizing on the opportunities presented by simulation-based engineering and science in materials and chemistry will require an integration of experimental capabilities with theoretical and computational modeling; the development of a robust and sustainable infrastructure to support the development and deployment of advanced computational models; and the assembly of a community of scientists and engineers to implement this integration and infrastructure. This community must extend to industry, where incorporating predictive materials science and chemistry into design tools can accelerate the product development cycle and drive economic competitiveness. The confluence of new theories, new materials synthesis capabilities, and new computer platforms has created an unprecedented opportunity to implement a "materials-by-design" paradigm with wide-ranging benefits in technological innovation and scientific discovery. The Workshop on Computational Materials Science and Chemistry for Innovation was convened in Bethesda, Maryland, on July 26-27, 2010. 
Sponsored by the Department of Energy (DOE) Offices of Advanced Scientific Computing Research and Basic Energy Sciences, the workshop brought together 160 experts in materials science, chemistry, and computational science representing more than 65 universities, laboratories, and industries, and four agencies. The workshop examined seven foundational challenge areas in materials science and chemistry: materials for extreme conditions, self-assembly, light harvesting, chemical reactions, designer fluids, thin films and interfaces, and electronic structure. Each of these challenge areas is critical to the development of advanced energy systems, and each can be accelerated by the integrated application of predictive capability with theory and experiment. The workshop concluded that emerging capabilities in predictive modeling and simulation have the potential to revolutionize the development of new materials and chemical processes. Coupled with world-leading materials characterization and nanoscale science facilities, this predictive capability provides the foundation for an innovation ecosystem that can accelerate the discovery, development, and deployment of new technologies, including advanced energy systems. Delivering on the promise of this innovation ecosystem requires the following: Integration of synthesis, processing, characterization, theory, and simulation and modeling. Many of the newly established Energy Frontier Research Centers and Energy Hubs are exploiting this integration. Achieving/strengthening predictive capability in foundational challenge areas. Predictive capability in the seven foundational challenge areas described in this report is critical to the development of advanced energy technologies. Developing validated computational approaches that span vast differences in time and length scales. This fundamental computational challenge crosscuts all of the foundational challenge areas. 
Similarly challenging is the coupling of analytical data from the multiple instruments and techniques required to link these length and time scales. Experimental validation and quantification of uncertainty in simulation and modeling. Uncertainty quantification becomes increasingly challenging as simulations become more complex. Robust and sustainable computational infrastructure, including software and applications. For modeling and simulation, software equals infrastructure; likewise, the software that validates computational tools by translating huge arrays of experimental data into useful scientific understanding is critical infrastructure. An integrated approach for managing this infrastructure is essential. Efficient transfer and incorporation of simulation-based engineering and science in industry. Strategies are needed for bridging the gap between research and industrial applications and for widespread industry adoption of integrated computational materials engineering.

  9. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Cetiner, Mustafa Sacit; Flanagan, George F.

    2014-07-30

An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster-than-real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (DLL) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the DLL. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C+, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.

  10. Development of a Robust and Efficient Parallel Solver for Unsteady Turbomachinery Flows

    NASA Technical Reports Server (NTRS)

    West, Jeff; Wright, Jeffrey; Thakur, Siddharth; Luke, Ed; Grinstead, Nathan

    2012-01-01

    The traditional design and analysis practice for advanced propulsion systems relies heavily on expensive full-scale prototype development and testing. Over the past decade, use of high-fidelity analysis and design tools such as CFD early in the product development cycle has been identified as one way to alleviate testing costs and to develop these devices better, faster and cheaper. In the design of advanced propulsion systems, CFD plays a major role in defining the required performance over the entire flight regime, as well as in testing the sensitivity of the design to the different modes of operation. Increased emphasis is being placed on developing and applying CFD models to simulate the flow field environments and performance of advanced propulsion systems. This necessitates the development of next generation computational tools which can be used effectively and reliably in a design environment. The turbomachinery simulation capability presented here is being developed in a computational tool called Loci-STREAM [1]. It integrates proven numerical methods for generalized grids and state-of-the-art physical models in a novel rule-based programming framework called Loci [2] which allows: (a) seamless integration of multidisciplinary physics in a unified manner, and (b) automatic handling of massively parallel computing. The objective is to be able to routinely simulate problems involving complex geometries requiring large unstructured grids and complex multidisciplinary physics. An immediate application of interest is simulation of unsteady flows in rocket turbopumps, particularly in cryogenic liquid rocket engines. 
The key components of the overall methodology presented in this paper are the following: (a) high fidelity unsteady simulation capability based on Detached Eddy Simulation (DES) in conjunction with second-order temporal discretization, (b) compliance with Geometric Conservation Law (GCL) in order to maintain conservative property on moving meshes for second-order time-stepping scheme, (c) a novel cloud-of-points interpolation method (based on a fast parallel kd-tree search algorithm) for interfaces between turbomachinery components in relative motion which is demonstrated to be highly scalable, and (d) demonstrated accuracy and parallel scalability on large grids (approx 250 million cells) in full turbomachinery geometries.
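The cloud-of-points interface treatment in item (c) can be sketched compactly: find the k nearest donor points with a kd-tree, then blend their values with inverse-distance weights. The function name, the choice of k, and the weighting scheme below are illustrative assumptions, not the Loci-STREAM implementation (which is parallel and tied to the moving-mesh machinery):

```python
# kd-tree nearest-donor interpolation across a sliding turbomachinery interface.
import numpy as np
from scipy.spatial import cKDTree

def interpolate_interface(donor_xyz, donor_vals, receiver_xyz, k=4, eps=1e-12):
    """Interpolate donor-side values onto receiver points using the k nearest
    donors (kd-tree search) with inverse-distance weights."""
    tree = cKDTree(donor_xyz)
    dist, idx = tree.query(receiver_xyz, k=k)
    w = 1.0 / (dist + eps)              # inverse-distance weights
    w /= w.sum(axis=1, keepdims=True)   # normalize per receiver point
    return np.sum(w * donor_vals[idx], axis=1)

# Toy check: donors on a line carrying f(x) = x; a receiver midway between them.
donors = np.array([[0.0, 0, 0], [1.0, 0, 0], [2.0, 0, 0], [3.0, 0, 0]])
vals = donors[:, 0]
receivers = np.array([[1.5, 0, 0]])
print(interpolate_interface(donors, vals, receivers))
```

The kd-tree makes the donor search O(log N) per receiver point, which is what allows the scheme to stay scalable on the ~250-million-cell grids the paper reports.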

  11. The layered sensing operations center: a modeling and simulation approach to developing complex ISR networks

    NASA Astrophysics Data System (ADS)

    Curtis, Christopher; Lenzo, Matthew; McClure, Matthew; Preiss, Bruce

    2010-04-01

    In order to anticipate the constantly changing landscape of global warfare, the United States Air Force must acquire new capabilities in the field of Intelligence, Surveillance, and Reconnaissance (ISR). To meet this challenge, the Air Force Research Laboratory (AFRL) is developing a unifying construct of "Layered Sensing" which will provide military decision-makers at all levels with the timely, actionable, and trusted information necessary for complete battlespace awareness. Layered Sensing is characterized by the appropriate combination of sensors and platforms (including those for persistent sensing), infrastructure, and exploitation capabilities to enable this synergistic awareness. To achieve the Layered Sensing vision, AFRL is pursuing a Modeling & Simulation (M&S) strategy through the Layered Sensing Operations Center (LSOC). An experimental ISR system-of-systems test-bed, the LSOC integrates DoD standard simulation tools with commercial, off-the-shelf video game technology for rapid scenario development and visualization. These tools will help facilitate sensor management performance characterization, system development, and operator behavioral analysis. Flexible and cost-effective, the LSOC will implement a non-proprietary, open-architecture framework with well-defined interfaces. This framework will incentivize the transition of current ISR performance models to service-oriented software design for maximum re-use and consistency. This paper will present the LSOC's development and implementation thus far as well as a summary of lessons learned and future plans for the LSOC.

  12. Adaptive Planning: Understanding Organizational Workload to Capability/ Capacity through Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Hase, Chris

    2010-01-01

In August 2003, the Secretary of Defense (SECDEF) established the Adaptive Planning (AP) initiative [1] with the objectives of reducing the time necessary to develop and revise Combatant Commander (COCOM) contingency plans and increasing SECDEF plan visibility. In addition to reducing the traditional plan development timeline from twenty-four months to less than twelve months (with a goal of six months) [2], AP increased plan visibility to Department of Defense (DoD) leadership through In-Progress Reviews (IPRs). The IPR process, as well as the increased number of campaign and contingency plans COCOMs had to develop, increased the workload while the number of planners remained fixed. Several efforts, from collaborative planning tools to streamlined processes, were initiated to compensate for the increased workload, enabling COCOMs to better meet shorter planning timelines. This paper examines the Joint Strategic Capabilities Plan (JSCP) directed contingency planning and staffing requirements assigned to a combatant commander staff through the lens of modeling and simulation. The dynamics of developing a COCOM plan are captured with an ExtendSim [3] simulation. The resulting analysis provides a quantifiable means by which to measure a combatant commander staff's workload, associated with developing and staffing JSCP [4] directed contingency plans, against COCOM capability/capacity. Modeling and simulation bring significant opportunities in measuring the sensitivity of key variables in the assessment of workload against capability/capacity. Understanding the relationship of plan complexity, number of plans, planning processes, and number of planners to the time required for plan development provides valuable information to DoD leadership. Through modeling and simulation, AP leadership can gain greater insight for key decisions on where best to allocate scarce resources in an effort to meet DoD planning objectives.
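The workload-to-capacity question above is a queueing problem, and a toy discrete-event version of it fits in a few lines: plans arrive, a fixed pool of planners works them, and we measure average time from tasking to completion. All rates and figures below are invented placeholders, not JSCP data, and the paper's actual model was built in ExtendSim, not Python.

```python
# Minimal discrete-event sketch of contingency-plan development:
# plans arrive at random, the earliest-free planner takes each one.
import heapq
import random

def simulate(num_planners, num_plans, mean_interarrival, mean_dev_months, seed=1):
    random.seed(seed)
    free_at = [0.0] * num_planners            # time each planner becomes free
    heapq.heapify(free_at)
    t, completions = 0.0, []
    for _ in range(num_plans):
        t += random.expovariate(1.0 / mean_interarrival)   # plan tasking arrives
        start = max(t, heapq.heappop(free_at))             # wait for a planner
        finish = start + random.expovariate(1.0 / mean_dev_months)
        heapq.heappush(free_at, finish)
        completions.append(finish - t)                     # tasking-to-done time
    return sum(completions) / len(completions)

# With the same plan stream, more planners shorten average development time.
print(simulate(num_planners=3, num_plans=200, mean_interarrival=1.0, mean_dev_months=6.0))
```

Because the seed fixes the plan stream, varying only `num_planners` isolates the capacity effect, which is the sensitivity analysis the abstract describes.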

  13. Science based integrated approach to advanced nuclear fuel development - integrated multi-scale multi-physics hierarchical modeling and simulation framework Part III: cladding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tome, Carlos N; Caro, J A; Lebensohn, R A

    2010-01-01

Advancing the performance of Light Water Reactors, Advanced Nuclear Fuel Cycles, and Advanced Reactors, such as the Next Generation Nuclear Power Plants, requires enhancing our fundamental understanding of fuel and materials behavior under irradiation. The capability to accurately model nuclear fuel systems in order to develop predictive tools is critical. Not only are fabrication and performance models needed to understand specific aspects of the nuclear fuel, but fully coupled fuel simulation codes are also required to achieve licensing of specific nuclear fuel designs for operation. The backbone of these codes, models, and simulations is a fundamental understanding and predictive capability for simulating the phase and microstructural behavior of the nuclear fuel system materials and matrices. In this paper we review the current status of advanced modeling and simulation of nuclear reactor cladding, with emphasis on what is available and what is to be developed at each scale of the project, how we propose to pass information from one scale to the next, and what experimental information is required for benchmarking and advancing the modeling at each scale level.

  14. Vacuum system transient simulator and its application to TFTR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sredniawski, J.

The vacuum system transient simulator (VSTS) models transient gas transport throughout complex networks of ducts, valves, traps, vacuum pumps, and other related vacuum system components. VSTS is capable of treating gas models of up to 10 species, for all flow regimes from pure molecular to continuum. Viscous interactions between species are considered, as well as non-uniform temperature of a system. Although this program was specifically developed for use on the Tokamak Fusion Test Reactor (TFTR) project at Princeton, it is a generalized tool capable of handling a broad range of vacuum system problems. During the TFTR engineering design phase, VSTS has been used in many applications. Two applications selected for presentation are: torus vacuum pumping system performance between 400 Ci tritium pulses and tritium backstreaming to neutral beams during pulses.
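The transient gas balance that a tool like VSTS solves across whole duct/pump networks reduces, for a single pumped volume and single species, to the textbook equation V dP/dt = Q − S·P (volume V, gas load Q, pumping speed S). A minimal explicit-Euler sketch with illustrative numbers, nothing like the 10-species, all-flow-regime network treatment in VSTS itself:

```python
# Single-volume pumpdown transient: V dP/dt = Q - S*P, explicit Euler.
def pumpdown(P0, V, S, Q, dt, t_end):
    """Return the pressure history of one pumped volume."""
    P, t, hist = P0, 0.0, [P0]
    while t < t_end:
        P += dt * (Q - S * P) / V    # gas load in, pumping out
        t += dt
        hist.append(P)
    return hist

# After many time constants (V/S), pressure settles at the ultimate value Q/S.
hist = pumpdown(P0=1.0, V=10.0, S=2.0, Q=0.02, dt=0.01, t_end=50.0)
print(hist[-1], 0.02 / 2.0)
```

The toy run uses a time constant V/S of 5 s, so by t = 50 s the numerical solution has converged to the analytic ultimate pressure Q/S.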

  15. POLLUX: a program for simulated cloning, mutagenesis and database searching of DNA constructs.

    PubMed

    Dayringer, H E; Sammons, S A

    1991-04-01

    Computer support for research in biotechnology has developed rapidly and has provided several tools to aid the researcher. This report describes the capabilities of new computer software developed in this laboratory to aid in the documentation and planning of experiments in molecular biology. The program, POLLUX, provides a graphical medium for the entry, edit and manipulation of DNA constructs and a textual format for display and edit of construct descriptive data. Program operation and procedures are designed to mimic the actual laboratory experiments with respect to capability and the order in which they are performed. Flexible control over the content of the computer-generated displays and program facilities is provided by a mouse-driven menu interface. Programmed facilities for mutagenesis, simulated cloning and searching of the database from networked workstations are described.

  16. Automated Extraction of Flow Features

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne (Technical Monitor); Haimes, Robert

    2005-01-01

Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation, including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest is the detection of features such as shocks, re-circulation zones, and vortices, which highlight areas of stress and loss. As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the more traditional tools, such as iso-surfaces, cuts, and streamlines, were interactive and easily abstracted, so their results could be represented to the investigator. These tools worked, and properly conveyed the collected information, at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure in co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.
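One classical automated test for the vortices mentioned above is the Q-criterion: Q = ½(‖Ω‖² − ‖S‖²), where S and Ω are the symmetric (strain-rate) and antisymmetric (rotation) parts of the velocity-gradient tensor, and Q > 0 flags rotation-dominated regions. This is a generic textbook detector, not necessarily the one used in the work above:

```python
# Pointwise Q-criterion from a 3x3 velocity-gradient tensor du_i/dx_j.
import numpy as np

def q_criterion(grad_u):
    S = 0.5 * (grad_u + grad_u.T)       # strain-rate (symmetric) part
    Omega = 0.5 * (grad_u - grad_u.T)   # rotation (antisymmetric) part
    return 0.5 * (np.sum(Omega**2) - np.sum(S**2))

rotation = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 0.]])  # solid-body vortex
strain = np.array([[1., 0., 0.], [0., -1., 0.], [0., 0., 0.]])    # pure straining
print(q_criterion(rotation), q_criterion(strain))
```

Applied at every grid point of an unsteady solution, thresholding Q > 0 yields exactly the kind of cheap, non-interactive flagging of "areas of potential interest" the abstract calls for.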

  17. Automated Extraction of Flow Features

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne (Technical Monitor); Haimes, Robert

    2004-01-01

Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation, including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest is the detection of features such as shocks, recirculation zones, and vortices, which highlight areas of stress and loss. As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the more traditional tools, such as iso-surfaces, cuts, and streamlines, were interactive and easily abstracted, so their results could be represented to the investigator. These tools worked, and properly conveyed the collected information, at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure in co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.

  18. NEIGHBORHOOD SCALE AIR QUALITY MODELING IN HOUSTON USING URBAN CANOPY PARAMETERS IN MM5 AND CMAQ WITH IMPROVED CHARACTERIZATION OF MESOSCALE LAKE-LAND BREEZE CIRCULATION

    EPA Science Inventory

    Advanced capability of air quality simulation models towards accurate performance at finer scales will be needed for such models to serve as tools for performing exposure and risk assessments in urban areas. It is recognized that the impact of urban features such as street and t...

  19. Exoplanet Yield Estimation for Decadal Study Concepts using EXOSIMS

    NASA Astrophysics Data System (ADS)

    Morgan, Rhonda; Lowrance, Patrick; Savransky, Dmitry; Garrett, Daniel

    2016-01-01

The anticipated upcoming large mission study concepts for the direct imaging of exo-earths present an exciting opportunity for exoplanet discovery and characterization. While these telescope concepts would also be capable of conducting a broad range of astrophysical investigations, the most difficult technology challenges are driven by the requirements for imaging exo-earths. The exoplanet science yield for these mission concepts will drive design trades and mission concept comparisons. To assist in these trade studies, the Exoplanet Exploration Program Office (ExEP) is developing a yield estimation tool that emphasizes transparency and consistent comparison of various design concepts. The tool will provide a parametric estimate of the science yield of various mission concepts using contrast curves from physics-based model codes and Monte Carlo simulations of design reference missions under realistic constraints, such as solar avoidance angles, the observatory orbit, propulsion limitations of star shades, the accessibility of candidate targets, local and background zodiacal light levels, and background confusion by stars and galaxies. The Python tool utilizes Dmitry Savransky's EXOSIMS (Exoplanet Open-Source Imaging Mission Simulator) design reference mission simulator, which is being developed for the WFIRST Preliminary Science program. ExEP is extending and validating the tool for future mission concepts under consideration for the upcoming 2020 decadal review. We present a validation plan and preliminary yield results for a point design.
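The Monte Carlo yield idea above can be caricatured in a few lines: draw planet separations and contrasts, and count those that clear the instrument's contrast floor outside its inner working angle. Every distribution and limit below is an invented placeholder, not an EXOSIMS model or any mission's parameters:

```python
# Toy Monte Carlo exoplanet-yield estimate against a flat contrast floor.
import random

def estimate_yield(n_stars, iwa_arcsec, contrast_floor, trials=2000, seed=7):
    random.seed(seed)
    detected = 0
    for _ in range(trials * n_stars):
        sep = random.uniform(0.01, 0.5)           # apparent separation (arcsec)
        contrast = 10 ** random.uniform(-11, -8)  # planet/star flux ratio
        if sep > iwa_arcsec and contrast > contrast_floor:
            detected += 1
    return detected / trials                      # mean detections per survey

# A larger inner working angle can only reduce the yield, all else equal.
print(estimate_yield(50, 0.05, 1e-10), estimate_yield(50, 0.1, 1e-10))
```

The real tool replaces the flat floor with instrument contrast curves and adds orbital geometry, target accessibility, and background confusion, but the trade-study logic (vary a design parameter, compare yields) is the same.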

  20. The Trick Simulation Toolkit: A NASA/Opensource Framework for Running Time Based Physics Models

    NASA Technical Reports Server (NTRS)

    Penn, John M.

    2016-01-01

    The Trick Simulation Toolkit is a simulation development environment used to create high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. Its purpose is to generate a simulation executable from a collection of user-supplied models and a simulation definition file. For each Trick-based simulation, Trick automatically provides job scheduling, numerical integration, the ability to write and restore human readable checkpoints, data recording, interactive variable manipulation, a run-time interpreter, and many other commonly needed capabilities. This allows simulation developers to concentrate on their domain expertise and the algorithms and equations of their models. Also included in Trick are tools for plotting recorded data and various other supporting utilities and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX computer operating systems. This paper describes Trick's design and use at NASA Johnson Space Center.
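The executive services described above (job scheduling plus numerical integration of user-supplied models) can be illustrated with a toy scheduler. The `Executive` class, its method names, and the midpoint integrator are invented for illustration; Trick's actual C/C++ interfaces and integration options differ.

```python
# Toy simulation executive: periodic jobs plus RK2 integration of model state.
class Executive:
    def __init__(self, dt):
        self.dt = dt
        self.jobs = []          # (period, callback) pairs
        self.time = 0.0

    def add_job(self, period, fn):
        self.jobs.append((period, fn))

    def run(self, t_end, state, deriv):
        steps = int(round(t_end / self.dt))
        for _ in range(steps):
            # RK2 (midpoint) step of the user model's state
            k1 = deriv(state)
            mid = [s + 0.5 * self.dt * k for s, k in zip(state, k1)]
            k2 = deriv(mid)
            state = [s + self.dt * k for s, k in zip(state, k2)]
            self.time += self.dt
            step_idx = int(round(self.time / self.dt))
            for period, fn in self.jobs:
                if step_idx % int(round(period / self.dt)) == 0:
                    fn(self.time, state)   # e.g. a data-recording job
        return state

# Falling body: state = [height, velocity], with dv/dt = -9.81 m/s^2.
log = []
exe = Executive(dt=0.01)
exe.add_job(0.5, lambda t, s: log.append((t, s[0])))   # record every 0.5 s
final = exe.run(1.0, [100.0, 0.0], lambda s: [s[1], -9.81])
print(final[0])   # ~ 100 - 0.5*9.81*1^2 = 95.095
```

The point of the pattern, as in Trick, is that the model author supplies only the derivative function and recording jobs; scheduling and integration come from the framework.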

  1. Two-Dimensional Neutronic and Fuel Cycle Analysis of the Transatomic Power Molten Salt Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Betzler, Benjamin R.; Powers, Jeffrey J.; Worrall, Andrew

    2017-01-15

This status report presents the results from the first phase of the collaboration between Transatomic Power Corporation (TAP) and Oak Ridge National Laboratory (ORNL) to provide neutronic and fuel cycle analysis of the TAP core design through the Department of Energy Gateway for Accelerated Innovation in Nuclear, Nuclear Energy Voucher program. The TAP design is a molten salt reactor using movable moderator rods to shift the neutron spectrum in the core from mostly epithermal at beginning of life to thermal at end of life. Additional developments in the ChemTriton modeling and simulation tool provide the critical moderator-to-fuel ratio searches and time-dependent parameters necessary to simulate the continuously changing physics in this complex system. Results from simulations with these tools show agreement with TAP-calculated performance metrics for core lifetime, discharge burnup, and salt volume fraction, verifying the viability of reducing actinide waste production with this design. Additional analyses of time step sizes, mass feed rates and enrichments, and isotopic removals provide further information to support informed design decisions. This work further demonstrates the capabilities of ORNL modeling and simulation tools for analysis of molten salt reactor designs and strongly positions this effort for the upcoming three-dimensional core analysis.
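A "critical moderator-to-fuel ratio search" of the kind mentioned above is, structurally, a root find: adjust the ratio until the multiplication factor k_eff equals 1. A bisection sketch with a made-up, monotonic stand-in for k_eff (real searches call a neutronics solver, and ChemTriton's algorithm may differ):

```python
# Bisection search for the ratio r with k_eff(r) = 1, assuming k_eff
# increases monotonically with the moderator-to-fuel ratio.
def critical_search(k_eff, lo, hi, tol=1e-6):
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if k_eff(mid) < 1.0:
            lo = mid        # undermoderated: need more moderator
        else:
            hi = mid        # overmoderated or critical: back off
    return 0.5 * (lo + hi)

# Placeholder reactivity curve, not reactor physics: k_eff = 0.8 + 0.5*r.
fake_k = lambda r: 0.8 + 0.5 * r
r_crit = critical_search(fake_k, 0.0, 1.0)
print(r_crit)   # ~0.4, since 0.8 + 0.5*0.4 = 1.0
```

Each bisection step in a real search costs a full transport or depletion calculation, which is why making these searches efficient is a genuine tool-development task.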

  2. A simple teaching tool for training the pelvic organ prolapse quantification system.

    PubMed

    Geiss, Ingrid M; Riss, Paul A; Hanzal, Engelbert; Dungl, Andrea

    2007-09-01

The pelvic organ prolapse quantification (POPQ) system is currently the most common and specific system describing different prolapse stages. Nevertheless, its use is not yet accepted worldwide in routine care. Our aim was to develop a simple teaching tool for the POPQ system capable of simulating different stages of uterovaginal prolapse for use in medical education with hands-on training. We constructed a moveable and flexible tool from an inverted Santa Claus cap, which simulated the vaginal cuff, with the tassel at the end representing the cervix. A wooden embroidery frame fixed the cap and served as the hymen, the reference point for all measurements. Inside the cap, we sewed buttons to define the anatomic landmark points Aa and Ap, located 3 cm distal from the frame. After explaining the device to the students, we used the three-by-three grid for recording the quantitative description of pelvic organ support. First, each student had to demonstrate a specific prolapse with the cap device. Then, a prolapse was simulated on the cap, and the student had to take the relevant measurements and record them in the POPQ grid. The main training effect in understanding the POPQ system seems to be the possibility for each trainee to simulate a three-dimensional prolapse with this flexible vagina model.

3. Time-efficient simulations of tight-binding electronic structures with Intel Xeon Phi™ many-core processors

    NASA Astrophysics Data System (ADS)

    Ryu, Hoon; Jeong, Yosang; Kang, Ji-Hoon; Cho, Kyu Nam

    2016-12-01

Modelling of multi-million-atom semiconductor structures is important as it not only predicts the properties of physically realizable novel materials but can also accelerate advanced device design. This work describes a new Technology Computer-Aided Design (TCAD) tool for nanoelectronics modelling, which uses a sp3d5s* tight-binding approach to describe multi-million-atom structures and simulate their electronic structures with high performance computing (HPC), including atomic effects such as alloy and dopant disorder. Named the Quantum simulation tool for Advanced Nanoscale Devices (Q-AND), the tool shows good scalability on traditional multi-core HPC clusters, implying a strong capability for large-scale electronic structure simulations, with particularly notable performance enhancement on recent clusters of Intel Xeon Phi™ coprocessors. A review of a recent modelling study, conducted to understand experimental work on highly phosphorus-doped silicon nanowires, is presented to demonstrate the utility of Q-AND. Having been developed via an Intel Parallel Computing Center project, Q-AND will be opened to the public to establish a sound framework for nanoelectronics modelling with advanced many-core HPC clusters. With details of the development methodology and an exemplary study of dopant electronics, this work presents a practical guideline for TCAD development for researchers in the field of computational nanoelectronics.
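The eigenproblem at the heart of such a code can be shown at toy scale: a one-orbital tight-binding chain with onsite energy e0 and nearest-neighbour hopping t, diagonalized directly. Q-AND solves the same kind of problem with a 20-band sp3d5s* basis and millions of atoms, where direct diagonalization is no longer an option:

```python
# One-orbital, N-site tight-binding chain: build H, diagonalize, read off bands.
import numpy as np

def chain_bands(n_sites, e0=0.0, t=-1.0):
    H = np.zeros((n_sites, n_sites))
    np.fill_diagonal(H, e0)
    for i in range(n_sites - 1):
        H[i, i + 1] = H[i + 1, i] = t    # nearest-neighbour hopping
    return np.linalg.eigvalsh(H)         # sorted eigenenergies

E = chain_bands(100)
print(E.min(), E.max())   # band edges approach e0 - 2|t| and e0 + 2|t| as N grows
```

The analytic spectrum of the open chain is e0 + 2t·cos(kπ/(N+1)), so the numerical band edges should sit just inside ±2|t|.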

  4. Functional and real-time requirements of a multisensor data fusion (MSDF) situation and threat assessment (STA) resource management (RM) system

    NASA Astrophysics Data System (ADS)

    Duquet, Jean Remi; Bergeron, Pierre; Blodgett, Dale E.; Couture, Jean; Macieszczak, Maciej; Mayrand, Michel; Chalmers, Bruce A.; Paradis, Stephane

    1998-03-01

    The Research and Development group at Lockheed Martin Canada, in collaboration with the Defence Research Establishment Valcartier, has undertaken a research project in order to capture and analyze the real-time and functional requirements of a next generation Command and Control System (CCS) for the Canadian Patrol Frigates, integrating Multi- Sensor Data Fusion (MSDF), Situation and Threat Assessment (STA) and Resource Management (RM). One important aspect of the project is to define how the use of Artificial Intelligence may optimize the performance of an integrated, real-time MSDF/STA/RM system. A closed-loop simulation environment is being developed to facilitate the evaluation of MSDF/STA/RM concepts, algorithms and architectures. This environment comprises (1) a scenario generator, (2) complex sensor, hardkill and softkill weapon models, (3) a real-time monitoring tool, (4) a distributed Knowledge-Base System (KBS) shell. The latter is being completely redesigned and implemented in-house since no commercial KBS shell could adequately satisfy all the project requirements. The closed- loop capability of the simulation environment, together with its `simulated real-time' capability, allows the interaction between the MSDF/STA/RM system and the environment targets during the execution of a scenario. This capability is essential to measure the performance of many STA and RM functionalities. Some benchmark scenarios have been selected to demonstrate quantitatively the capabilities of the selected MSDF/STA/RM algorithms. The paper describes the simulation environment and discusses the MSDF/STA/RM functionalities currently implemented and their performance as an automatic CCS.

  5. Simulations of binary black hole mergers

    NASA Astrophysics Data System (ADS)

    Lovelace, Geoffrey

    2017-01-01

    Advanced LIGO's observations of merging binary black holes have inaugurated the era of gravitational wave astronomy. Accurate models of binary black holes and the gravitational waves they emit are helping Advanced LIGO to find as many gravitational waves as possible and to learn as much as possible about the waves' sources. These models require numerical-relativity simulations of binary black holes, because near the time when the black holes merge, all analytic approximations break down. Following breakthroughs in 2005, many research groups have built numerical-relativity codes capable of simulating binary black holes. In this talk, I will discuss current challenges in simulating binary black holes for gravitational-wave astronomy, and I will discuss the tremendous progress that has already enabled such simulations to become an essential tool for Advanced LIGO.

  6. The structure of aqueous sodium hydroxide solutions: a combined solution x-ray diffraction and simulation study.

    PubMed

    Megyes, Tünde; Bálint, Szabolcs; Grósz, Tamás; Radnai, Tamás; Bakó, Imre; Sipos, Pál

    2008-01-28

    To determine the structure of aqueous sodium hydroxide solutions, results obtained from x-ray diffraction and computer simulation (molecular dynamics and Car-Parrinello) have been compared. The capabilities and limitations of the methods in describing the solution structure are discussed. For the solutions studied, diffraction methods were found to perform very well in describing the hydration spheres of the sodium ion and yield structural information on the anion's hydration structure. Classical molecular dynamics simulations were not able to correctly describe the bulk structure of these solutions. However, Car-Parrinello simulation proved to be a suitable tool in the detailed interpretation of the hydration sphere of ions and bulk structure of solutions. The results of Car-Parrinello simulations were compared with the findings of diffraction experiments.
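One of the simplest structural quantities used to compare simulation against diffraction in studies like the one above is a coordination number: the average count of neighbours within a cutoff radius, i.e. the population of the first hydration shell. The positions below are a toy cubic lattice, not an actual NaOH solution trajectory:

```python
# Mean coordination number: average neighbours within `cutoff` of each particle.
import itertools
import numpy as np

def coordination_number(positions, cutoff):
    n = len(positions)
    # All pairwise distances via broadcasting (fine for small n; no periodic images).
    d = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    neighbours = (d > 0) & (d < cutoff)
    return neighbours.sum() / n

# 4x4x4 cubic lattice with unit spacing: interior sites have 6 nearest neighbours,
# faces/edges/corners fewer, so the lattice-wide average is below 6.
pts = np.array(list(itertools.product(range(4), repeat=3)), dtype=float)
print(coordination_number(pts, cutoff=1.1))
```

In a real analysis the cutoff is placed at the first minimum of the radial distribution function g(r), which is also the quantity compared directly against the x-ray diffraction data.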

  7. gadfly: A pandas-based Framework for Analyzing GADGET Simulation Data

    NASA Astrophysics Data System (ADS)

    Hummel, Jacob A.

    2016-11-01

    We present the first public release (v0.1) of the open-source gadget Dataframe Library: gadfly. The aim of this package is to leverage the capabilities of the broader python scientific computing ecosystem by providing tools for analyzing simulation data from the astrophysical simulation codes gadget and gizmo using pandas, a thoroughly documented, open-source library providing high-performance, easy-to-use data structures that is quickly becoming the standard for data analysis in python. Gadfly is a framework for analyzing particle-based simulation data stored in the HDF5 format using pandas DataFrames. The package enables efficient memory management, includes utilities for unit handling, coordinate transformations, and parallel batch processing, and provides highly optimized routines for visualizing smoothed-particle hydrodynamics data sets.
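The DataFrame-centric workflow gadfly enables can be illustrated with plain pandas. The columns and values below are synthetic stand-ins for an HDF5 GADGET snapshot; gadfly itself handles the snapshot I/O, unit conversions, and coordinate transformations.

```python
# Particle data as a pandas DataFrame keyed by particle ID, analyzed with
# ordinary pandas operations (select, then aggregate).
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 1000
snap = pd.DataFrame({
    "x": rng.uniform(0, 10, n),
    "y": rng.uniform(0, 10, n),
    "z": rng.uniform(0, 10, n),
    "mass": np.full(n, 1.5e-4),
    "density": rng.lognormal(0.0, 1.0, n),
}, index=pd.RangeIndex(n, name="particle_id"))

# Typical analysis step: select the denser half of the gas, sum its mass.
dense = snap[snap["density"] > snap["density"].median()]
total_mass = dense["mass"].sum()
print(len(dense), total_mass)
```

The appeal of the approach is exactly what the abstract claims: once particles live in a DataFrame, selections, groupbys, and joins come for free from pandas rather than bespoke simulation-analysis code.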

  8. A Tool for Longitudinal Beam Dynamics in Synchrotrons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ostiguy, J.-F.; Lebedev, V. A.

    2017-05-01

A number of codes are available to simulate longitudinal dynamics in synchrotrons. Some established ones include TIBETAN, LONG1D, ESME and ORBIT. While they embody a wealth of accumulated wisdom and experience, most of these codes were written decades ago, and to some extent they reflect the constraints of their time. As a result, there is interest in updated tools taking better advantage of modern software and hardware capabilities. At Fermilab, the PIP-II project has provided the impetus for development of such a tool. In this contribution, we discuss design decisions and code architecture. A selection of test cases based on an initial prototype is also presented.
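The core of any such longitudinal code is a turn-by-turn map for each particle's RF phase and energy deviation. A minimal stationary-bucket sketch of that standard map follows; the constants are illustrative, not PIP-II parameters, and the function is not the architecture of the code described above:

```python
# One-particle longitudinal tracking in a stationary RF bucket:
# each turn, an RF energy kick followed by a phase slip.
import math

def track(phi0, dE0, n_turns, a=0.02, b=0.02):
    """a: RF kick strength; b: phase-slip factor (both dimensionless here)."""
    phi, dE, orbit = phi0, dE0, []
    for _ in range(n_turns):
        dE += -a * math.sin(phi)   # RF cavity kick toward the synchronous phase
        phi += b * dE              # phase advance from the energy error
        orbit.append((phi, dE))
    return orbit

# A particle launched inside the bucket executes bounded synchrotron oscillations.
orbit = track(phi0=0.5, dE0=0.0, n_turns=5000)
print(max(abs(p) for p, _ in orbit))
```

Production codes track millions of such macro-particles, add space-charge and beam-loading kicks, and evolve the RF program in time, but this two-line map is the kernel they all share.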

  9. Biomaterial science meets computational biology.

    PubMed

    Hutmacher, Dietmar W; Little, J Paige; Pettet, Graeme J; Loessner, Daniela

    2015-05-01

    There is a pressing need for a predictive tool capable of revealing a holistic understanding of fundamental elements in the normal and pathological cell physiology of organoids in order to decipher the mechanoresponse of cells. Therefore, the integration of a systems bioengineering approach into a validated mathematical model is necessary to develop a new simulation tool. This tool can only be innovative by combining biomaterials science with computational biology. Systems-level and multi-scale experimental data are incorporated into a single framework, thus representing both single cells and collective cell behaviour. Such a computational platform needs to be validated in order to discover key mechano-biological factors associated with cell-cell and cell-niche interactions.

  10. VERCE: a productive e-Infrastructure and e-Science environment for data-intensive seismology research

    NASA Astrophysics Data System (ADS)

    Vilotte, J. P.; Atkinson, M.; Spinuso, A.; Rietbrock, A.; Michelini, A.; Igel, H.; Frank, A.; Carpené, M.; Schwichtenberg, H.; Casarotti, E.; Filgueira, R.; Garth, T.; Germünd, A.; Klampanos, I.; Krause, A.; Krischer, L.; Leong, S. H.; Magnoni, F.; Matser, J.; Moguilny, G.

    2015-12-01

    Seismology addresses fundamental problems in understanding the Earth's internal wave sources and structures, as well as societal applications such as earthquake and tsunami hazard assessment and risk mitigation, and it puts a premium on open data accessible through the Federated Digital Seismological Networks. The VERCE project, "Virtual Earthquake and seismology Research Community e-science environment in Europe", has initiated a virtual research environment to support complex orchestrated workflows combining state-of-the-art wave simulation codes and data analysis tools on distributed computing and data infrastructures (DCIs), along with multiple sources of observational data and new capabilities to combine simulation results with observational data. The VERCE Science Gateway provides a view of all the available resources, supporting collaboration with shared data and methods, with data access controls. The mapping to DCIs handles identity management, authority controls, transformations between representations and controls, and access to resources. The framework for computational science that provides simulation codes, like SPECFEM3D, democratizes their use by getting data from multiple sources, managing Earth models and meshes, distilling them into input data, and capturing results with metadata. The dispel4py data-intensive framework allows for developing data-analysis applications using Python and the ObsPy library, which can be executed on different DCIs. A set of tools allows coupling with seismology and external data services. Provenance-driven tools validate results and show relationships between data to facilitate method improvement. Lessons learned from VERCE training lead us to conclude that solid-Earth scientists could make significant progress by using the VERCE e-science environment. VERCE has already contributed to the European Plate Observing System (EPOS), and its cross-disciplinary capabilities are being extended as part of the EPOS implementation phase.
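    The composable, streaming style that data-intensive workflow frameworks such as dispel4py encourage can be suggested in pure Python (this is not the dispel4py or ObsPy API; the processing elements and trace values are invented for illustration):

```python
# Two toy "processing elements" chained into a streaming pipeline, in the
# style of data-intensive workflow frameworks: each stage consumes and
# yields items lazily, so stages can be mapped onto different back ends.
def detrend(stream):
    """Remove the mean from each trace."""
    for trace in stream:
        mean = sum(trace) / len(trace)
        yield [s - mean for s in trace]

def peak_amplitude(stream):
    """Reduce each detrended trace to its peak absolute amplitude."""
    for trace in stream:
        yield max(abs(s) for s in trace)

def pipeline(source, *stages):
    for stage in stages:
        source = stage(source)
    return source

traces = [[1.0, 2.0, 3.0], [10.0, 10.0, 16.0]]  # invented sample "waveforms"
peaks = list(pipeline(iter(traces), detrend, peak_amplitude))
print(peaks)
```

    Because every stage is a generator, the same composition can be executed sequentially on a laptop or dispatched stage-by-stage onto distributed resources, which is the design idea behind such frameworks.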

  11. The GLEaMviz computational tool, a publicly available software to explore realistic epidemic spreading scenarios at the global scale

    PubMed Central

    2011-01-01

    Background Computational models play an increasingly important role in the assessment and control of public health crises, as demonstrated during the 2009 H1N1 influenza pandemic. Much research has been done in recent years in the development of sophisticated data-driven models for realistic computer-based simulations of infectious disease spreading. However, only a few computational tools are presently available for assessing scenarios, predicting epidemic evolutions, and managing health emergencies that can benefit a broad audience of users including policy makers and health institutions. Results We present "GLEaMviz", a publicly available software system that simulates the spread of emerging human-to-human infectious diseases across the world. The GLEaMviz tool comprises three components: the client application, the proxy middleware, and the simulation engine. The latter two components constitute the GLEaMviz server. The simulation engine leverages the Global Epidemic and Mobility (GLEaM) framework, a stochastic computational scheme that integrates worldwide high-resolution demographic and mobility data to simulate disease spread on the global scale. The GLEaMviz design aims at maximizing flexibility in defining the disease compartmental model and configuring the simulation scenario; it allows the user to set a variety of parameters including compartment-specific features, transition values, and environmental effects. The output is a dynamic map and a corresponding set of charts that quantitatively describe the geo-temporal evolution of the disease. The software is designed as a client-server system. The multi-platform client, which can be installed on the user's local machine, is used to set up simulations that will be executed on the server, thus avoiding specific requirements for large computational capabilities on the user side. 
Conclusions The user-friendly graphical interface of the GLEaMviz tool, along with its high level of detail and the realism of its embedded modeling approach, opens up the platform to simulate realistic epidemic scenarios. These features make the GLEaMviz computational tool a convenient teaching/training tool as well as a first step toward the development of a computational tool aimed at facilitating the use and exploitation of computational models for the policy making and scenario analysis of infectious disease outbreaks. PMID:21288355
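    The compartmental dynamics at the heart of such tools can be sketched with a toy single-population stochastic SIR model (pure Python; the chain-binomial scheme and all parameter values are illustrative assumptions, not the GLEaM mobility model or GLEaMviz defaults):

```python
import math
import random

# Toy stochastic SIR compartmental model: one population, chain-binomial
# transitions, no mobility layer. beta, gamma, and the population size
# are invented for illustration.
def run_sir(n=10_000, i0=10, beta=0.3, gamma=0.1, seed=42):
    rng = random.Random(seed)
    s, i, r = n - i0, i0, 0
    history = [(s, i, r)]
    while i > 0:
        # Each susceptible is infected with prob 1 - exp(-beta * i / n);
        # each infectious individual recovers with prob gamma per time step.
        p_inf = 1.0 - math.exp(-beta * i / n)
        new_i = sum(rng.random() < p_inf for _ in range(s))
        new_r = sum(rng.random() < gamma for _ in range(i))
        s, i, r = s - new_i, i + new_i - new_r, r + new_r
        history.append((s, i, r))
    return history

hist = run_sir()
print(f"epidemic died out after {len(hist) - 1} steps; final recovered: {hist[-1][2]}")
```

    Each (s, i, r) snapshot is the kind of time series that feeds the geo-temporal maps and charts such a tool renders; a full simulator couples many such populations through mobility data.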

  12. Hierarchical Testing with Automated Document Generation for Amanzi, ASCEM's Subsurface Flow and Reactive Transport Simulator

    NASA Astrophysics Data System (ADS)

    Moulton, J. D.; Steefel, C. I.; Yabusaki, S.; Castleton, K.; Scheibe, T. D.; Keating, E. H.; Freedman, V. L.

    2013-12-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments use a graded and iterative approach, beginning with simplified, highly abstracted models and adding geometric and geologic complexity as understanding is gained. To build confidence in this assessment capability, extensive testing of the underlying tools is needed. Since the tools themselves, such as the subsurface flow and reactive-transport simulator Amanzi, are under active development, testing must be both hierarchical and highly automated. In this presentation we show how we have met these requirements by leveraging the Python-based open-source documentation system Sphinx together with several other open-source tools. Sphinx builds on the reStructuredText tool docutils, with important extensions that include high-quality formatting of equations and integrated plotting through matplotlib. This allows the documentation, as well as the input files for test, benchmark, and tutorial problems, to be maintained with the source code under a version control system. In addition, it enables developers to build documentation in several different formats (e.g., html and pdf) from a single source. We will highlight these features and discuss important benefits of this approach for Amanzi. In addition, we will show that some of ASCEM's other tools, such as the sampling provided by the Uncertainty Quantification toolset, are naturally leveraged to enable more comprehensive testing. Finally, we will highlight the integration of this hierarchical testing and documentation framework with our build system and tools (CMake, CTest, and CDash).

  13. Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Richard Edward; Fugate, David L.; Cetiner, Sacit M.

    2015-05-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light-water-cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the full power reactor innovative small module design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  14. TU-A-17A-02: In Memoriam of Ben Galkin: Virtual Tools for Validation of X-Ray Breast Imaging Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myers, K; Bakic, P; Abbey, C

    2014-06-15

    This symposium will explore simulation methods for the preclinical evaluation of novel 3D and 4D x-ray breast imaging systems, the subject of AAPM task group TG234. Given the complex design of modern imaging systems, simulations offer significant advantages over long and costly clinical studies in terms of reproducibility, reduced radiation exposures, a known reference standard, and the capability for studying patient and disease subpopulations through appropriate choice of simulation parameters. Our focus will be on testing the realism of software anthropomorphic phantoms and virtual clinical trials tools developed for the optimization and validation of breast imaging systems. The symposium will review the state of the science, as well as the advantages and limitations of various approaches to testing the realism of phantoms and simulated breast images. Approaches based upon the visual assessment of synthetic breast images by expert observers will be contrasted with approaches based upon comparing statistical properties between synthetic and clinical images. The role of observer models in the assessment of realism will be considered. Finally, an industry perspective will be presented, summarizing the role and importance of virtual tools and simulation methods in product development. The challenges and conditions that must be satisfied in order for computational modeling and simulation to play a significantly increased role in the design and evaluation of novel breast imaging systems will be addressed. Learning Objectives: Review the state of the science in testing the realism of software anthropomorphic phantoms and virtual clinical trials tools; compare approaches based upon visual assessment by expert observers vs. the analysis of statistical properties of synthetic images; discuss the role of observer models in the assessment of realism; summarize the industry perspective on virtual methods for breast imaging.

  15. Design and Application of the Exploration Maintainability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Stromgren, Chel; Terry, Michelle; Crillo, William; Goodliff, Kandyce; Maxwell, Andrew

    2012-01-01

    Conducting human exploration missions beyond Low Earth Orbit (LEO) will present unique challenges in the areas of supportability and maintainability. The durations of proposed missions can be relatively long and re-supply of logistics, including maintenance and repair items, will be limited or non-existent. In addition, mass and volume constraints in the transportation system will limit the total amount of logistics that can be flown along with the crew. These constraints will require that new strategies be developed with regards to how spacecraft systems are designed and maintained. NASA is currently developing Design Reference Missions (DRMs) as an initial step in defining future human missions. These DRMs establish destinations and concepts of operation for future missions, and begin to define technology and capability requirements. Because of the unique supportability challenges, historical supportability data and models are not directly applicable for establishing requirements for beyond LEO missions. However, supportability requirements could have a major impact on the development of the DRMs. The mass, volume, and crew resources required to support the mission could all be first order drivers in the design of missions, elements, and operations. Therefore, there is a need for enhanced analysis capabilities to more accurately establish mass, volume, and time requirements for supporting beyond LEO missions. Additionally, as new technologies and operations are proposed to reduce these requirements, it is necessary to have accurate tools to evaluate the efficacy of those approaches. In order to improve the analysis of supportability requirements for beyond LEO missions, the Space Missions Analysis Branch at the NASA Langley Research Center is developing the Exploration Maintainability Analysis Tool (EMAT). 
This tool is a probabilistic simulator that evaluates the need for repair and maintenance activities during space missions and the logistics and crew requirements to support those activities. Using a Monte Carlo approach, the tool simulates potential failures in defined systems, based on established component reliabilities, and then evaluates the capability of the crew to repair those failures given a defined store of spares and maintenance items. Statistical analysis of Monte Carlo runs provides probabilistic estimates of overall mission safety and reliability. This paper will describe the operation of the EMAT, including historical data sources used to populate the model, simulation processes, and outputs. Analysis results are provided for a candidate exploration system, including baseline estimates of required sparing mass and volume. Sensitivity analysis regarding the effectiveness of proposed strategies to reduce mass and volume requirements and improve mission reliability is included in these results.
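    The Monte Carlo sparing logic described above can be sketched as follows (a simplified illustration with invented component failure rates and spare counts, not EMAT's historical reliability data):

```python
import random

# Estimate the probability that a mission survives on a given spares
# manifest: a trial succeeds if no component fails more times than it
# has spares. All rates, durations, and counts are assumptions.
def mission_success_prob(spares, rates, duration=500.0, trials=4000, seed=1):
    rng = random.Random(seed)
    successes = 0
    for _ in range(trials):
        ok = True
        for comp, lam in rates.items():
            # Failures follow a Poisson process with rate lam; sample the
            # count over the mission via exponential inter-failure gaps.
            t, failures = 0.0, 0
            while True:
                t += rng.expovariate(lam)
                if t > duration:
                    break
                failures += 1
            if failures > spares.get(comp, 0):
                ok = False
                break
        successes += ok
    return successes / trials

rates = {"pump": 1 / 400.0, "valve": 1 / 900.0}  # invented failure rates [1/h]
p0 = mission_success_prob({"pump": 0, "valve": 0}, rates)
p2 = mission_success_prob({"pump": 2, "valve": 2}, rates)
print(f"success probability: no spares {p0:.3f}, two spares each {p2:.3f}")
```

    Sweeping the spares manifest in this way is how such a simulator trades sparing mass and volume against mission reliability.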

  16. SSSFD manipulator engineering using statistical experiment design techniques

    NASA Technical Reports Server (NTRS)

    Barnes, John

    1991-01-01

    The Satellite Servicer System Flight Demonstration (SSSFD) program is a series of Shuttle flights designed to verify major on-orbit satellite servicing capabilities, such as rendezvous and docking of free flyers, Orbital Replacement Unit (ORU) exchange, and fluid transfer. A major part of this system is the manipulator system that will perform the ORU exchange. The manipulator must possess adequate toolplate dexterity to maneuver a variety of EVA-type tools into position to interface with ORU fasteners, connectors, latches, and handles on the satellite, and to move workpieces and ORUs through 6-degree-of-freedom (DOF) space from the Target Vehicle (TV) to the Support Module (SM) and back. Two cost-efficient tools were combined to perform a study of robot manipulator design parameters. These tools are graphical computer simulations and Taguchi Design of Experiments methods. Using a graphics platform, an off-the-shelf robot simulation software package, and an experiment designed with Taguchi's approach, the sensitivities of various manipulator kinematic design parameters to performance characteristics are determined with minimal cost.
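    The Taguchi screening approach can be illustrated with the smallest orthogonal array, L4, applied to three hypothetical two-level manipulator design factors (the factor names and response values below are invented, not SSSFD data):

```python
# L4 orthogonal array: 4 runs cover the main effects of 3 two-level factors,
# instead of the 8 runs a full factorial would need.
L4 = [  # each row: (link_length, joint_offset, wrist_pitch) coded as 0/1
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]
# Simulated performance (e.g. a reachability score) for each run -- synthetic.
response = [7.0, 5.0, 9.0, 7.0]

def main_effect(factor):
    """Average response at level 1 minus average response at level 0."""
    hi = [r for row, r in zip(L4, response) if row[factor] == 1]
    lo = [r for row, r in zip(L4, response) if row[factor] == 0]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {name: main_effect(i)
           for i, name in enumerate(["link_length", "joint_offset", "wrist_pitch"])}
print(effects)
```

    Because each factor's levels are balanced against the others, the four simulation runs separate the main effects cleanly, which is why pairing an orthogonal array with a robot simulator keeps the study cheap.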

  17. RF Models for Plasma-Surface Interactions in VSim

    NASA Astrophysics Data System (ADS)

    Jenkins, Thomas G.; Smithe, D. N.; Pankin, A. Y.; Roark, C. M.; Zhou, C. D.; Stoltz, P. H.; Kruger, S. E.

    2014-10-01

    An overview of ongoing enhancements to the Plasma Discharge (PD) module of Tech-X's VSim software tool is presented. A sub-grid kinetic sheath model, developed for the accurate computation of sheath potentials near metal and dielectric-coated walls, enables the physical effects of DC and RF sheath physics to be included in macroscopic-scale plasma simulations that need not explicitly resolve sheath scale lengths. Sheath potential evolution, together with particle behavior near the sheath, can thus be simulated in complex geometries. Generalizations of the model to include sputtering, secondary electron emission, and effects from multiple ion species and background magnetic fields are summarized; related numerical results are also presented. In addition, improved tools for plasma chemistry and IEDF/EEDF visualization and modeling are discussed, as well as our initial efforts toward the development of hybrid fluid/kinetic transition capabilities within VSim. Ultimately, we aim to establish VSimPD as a robust, efficient computational tool for modeling industrial plasma processes. Supported by US DoE SBIR-I/II Award DE-SC0009501.

  18. Visualization of the Eastern Renewable Generation Integration Study: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruchalla, Kenny; Novacheck, Joshua; Bloom, Aaron

    The Eastern Renewable Generation Integration Study (ERGIS) explores the operational impacts of the widespread adoption of wind and solar photovoltaic (PV) resources in the U.S. Eastern Interconnection and Quebec Interconnection (collectively, EI). In order to understand some of the economic and reliability challenges of managing hundreds of gigawatts of wind and PV generation, we developed state-of-the-art tools, data, and models for simulating power system operations using hourly unit commitment and 5-minute economic dispatch over an entire year. Using NREL's high-performance computing capabilities and new methodologies to model operations, we found that the EI, as simulated with evolutionary change in 2026, could balance the variability and uncertainty of wind and PV at a 5-minute level under a variety of conditions. A large-scale display and a combination of multiple coordinated views and small multiples were used to visually analyze the four large, highly multivariate scenarios with high spatial and temporal resolutions.

  19. Mechanical Property Analysis in the Retracted Pin-Tool (RPT) Region of Friction Stir Welded (FSW) Aluminum Lithium 2195

    NASA Technical Reports Server (NTRS)

    Ding, R. Jeffrey; Oelgoetz, Peter A.

    1999-01-01

    The "Auto-Adjustable Pin Tool for Friction Stir Welding" was developed at the Marshall Space Flight Center to address process deficiencies unique to the FSW process. The auto-adjustable pin tool, also called the retractable pin tool (RPT), automatically withdraws the welding probe of the pin tool into the pin tool's shoulder. The primary function of the auto-adjustable pin tool is to allow for keyhole closeout, necessary for circumferential welding and localized weld repair, and automated pin-length adjustment for the welding of tapered material thickness. An overview of the RPT hardware is presented. The paper follows with studies conducted using the RPT. The RPT was used to simulate two capabilities: welding tapered material thickness and closing out the keyhole in a circumferential weld. The retracted pin-tool regions in aluminum-lithium 2195 friction stir weldments were studied through mechanical property testing and metallurgical sectioning. Correlations can be made between retractable pin-tool programmed parameters, process parameters, microstructure, and resulting weld quality.

  20. Manufacturing Process Simulation of Large-Scale Cryotanks

    NASA Technical Reports Server (NTRS)

    Babai, Majid; Phillips, Steven; Griffin, Brian

    2003-01-01

    NASA's Space Launch Initiative (SLI) is an effort to research and develop the technologies needed to build a second-generation reusable launch vehicle. It is required that this new launch vehicle be 100 times safer and 10 times cheaper to operate than current launch vehicles. Part of the SLI includes the development of reusable composite and metallic cryotanks. The size of these reusable tanks is far greater than anything ever developed and exceeds the design limits of current manufacturing tools. Several design and manufacturing approaches have been formulated, but many factors must be weighed during the selection process. Among these factors are tooling reachability, cycle times, feasibility, and facility impacts. The manufacturing process simulation capabilities available at NASA's Marshall Space Flight Center have played a key role in down-selecting between the various manufacturing approaches. By creating 3-D manufacturing process simulations, the varying approaches can be analyzed in a virtual world before any hardware or infrastructure is built. This analysis can detect and eliminate costly flaws in the various manufacturing approaches. The simulations check for collisions between devices, verify that design limits on joints are not exceeded, and provide cycle times which aid in the development of an optimized process flow. In addition, new ideas and concerns are often raised after seeing the visual representation of a manufacturing process flow. The output of the manufacturing process simulations allows for cost and safety comparisons to be performed between the various manufacturing approaches. This output helps determine which manufacturing process options reach the safety and cost goals of the SLI. As part of the SLI, The Boeing Company was awarded a basic period contract to research and propose options for both a metallic and a composite cryotank. 
Boeing then entered into a task agreement with the Marshall Space Flight Center to provide manufacturing simulation support. This paper highlights the accomplishments of this task agreement, while also introducing the capabilities of simulation software.

  1. New Tool Released for Engine-Airframe Blade-Out Structural Simulations

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles

    2004-01-01

    Researchers at the NASA Glenn Research Center have enhanced a general-purpose finite element code, NASTRAN, for engine-airframe structural simulations during steady-state and transient operating conditions. For steady-state simulations, the code can predict critical operating speeds, natural modes of vibration, and forced response (e.g., cabin noise and component fatigue). The code can be used to perform static analysis to predict engine-airframe response and component stresses due to maneuver loads. For transient response, the simulation code can be used to predict response due to blade-out events and subsequent engine shutdown and windmilling conditions. In addition, the code can be used as a pretest analysis tool to predict the results of the blade-out test required for FAA certification of new and derivative aircraft engines. Before the present analysis code was developed, all the major aircraft engine and airframe manufacturers in the United States and overseas were performing similar types of analyses to ensure the structural integrity of engine-airframe systems. Although there were many similarities among the analysis procedures, each manufacturer was developing and maintaining its own structural analysis capabilities independently. This situation led to high software development and maintenance costs, complications with manufacturers exchanging models and results, and limitations in predicting the structural response to the desired degree of accuracy. An industry-NASA team was formed to overcome these problems by developing a common analysis tool that would satisfy all the structural analysis needs of the industry and that would be available and supported by a commercial software vendor so that the team members would be relieved of maintenance and development responsibilities. Input from all the team members was used to ensure that everyone's requirements were satisfied and that the best technology was incorporated into the code. 
Furthermore, because the code would be distributed by a commercial software vendor, it would be more readily available to engine and airframe manufacturers, as well as to nonaircraft companies that did not previously have access to this capability.

  2. NASA Virtual Glovebox (VBX): Emerging Simulation Technology for Space Station Experiment Design, Development, Training and Troubleshooting

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey D.; Twombly, I. Alexander; Maese, A. Christopher; Cagle, Yvonne; Boyle, Richard

    2003-01-01

    The International Space Station demonstrates the greatest capabilities of human ingenuity, international cooperation and technology development. The complexity of this space structure is unprecedented; and training astronaut crews to maintain all its systems, as well as perform a multitude of research experiments, requires the most advanced training tools and techniques. Computer simulation and virtual environments are currently used by astronauts to train for robotic arm manipulations and extravehicular activities; but now, with the latest computer technologies and recent successes in areas of medical simulation, the capability exists to train astronauts for more hands-on research tasks using immersive virtual environments. We have developed a new technology, the Virtual Glovebox (VGX), for simulation of experimental tasks that astronauts will perform aboard the Space Station. The VGX may also be used by crew support teams for design of experiments, testing equipment integration capability and optimizing the procedures astronauts will use. This is done through the 3D, desk-top sized, reach-in virtual environment that can simulate the microgravity environment in space. Additional features of the VGX allow for networking multiple users over the internet and operation of tele-robotic devices through an intuitive user interface. Although the system was developed for astronaut training and assisting support crews, Earth-bound applications, many emphasizing homeland security, have also been identified. Examples include training experts to handle hazardous biological and/or chemical agents in a safe simulation, operation of tele-robotic systems for assessing and defusing threats such as bombs, and providing remote medical assistance to field personnel through a collaborative virtual environment. Thus, the emerging VGX simulation technology, while developed for space-based applications, can serve a dual use facilitating homeland security here on Earth.

  3. Advanced techniques in reliability model representation and solution

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.; Nicol, David M.

    1992-01-01

    The current tendency of flight control system designs is towards increased integration of applications and increased distribution of computational elements. The reliability analysis of such systems is difficult because subsystem interactions are increasingly interdependent. Researchers at NASA Langley Research Center have been working for several years to extend the capability of Markov modeling techniques to address these problems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG) is a software tool that uses as input a graphical object-oriented block diagram of the system. RMG uses a failure-effects algorithm to produce the reliability model from the graphical description. The ASSURE software tool is a parallel processing program that uses the semi-Markov unreliability range evaluator (SURE) solution technique and the abstract semi-Markov specification interface to the SURE tool (ASSIST) modeling language. A failure modes-effects simulation is used by ASSURE. These tools were used to analyze a significant portion of a complex flight control system. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that distributed fault-tolerant system architectures can now be analyzed.
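    The Markov reliability machinery the paragraph describes can be illustrated on a tiny example: a two-component redundant system whose unreliability is obtained by integrating the Markov state equations (a hand-rolled sketch, not RMG/SURE/ASSIST output; all rates are invented):

```python
# 3-state Markov reliability model: state 0 = both components working,
# state 1 = one failed (repairable), state 2 = system failed (absorbing).
# Failure rate lam and repair rate mu are illustrative assumptions.
lam, mu = 1e-3, 1e-1   # per-hour failure and repair rates
dt, T = 0.1, 1000.0    # forward-Euler step and mission time [h]

p = [1.0, 0.0, 0.0]    # start with both components working
t = 0.0
while t < T:
    d0 = -2 * lam * p[0] + mu * p[1]
    d1 = 2 * lam * p[0] - (lam + mu) * p[1]
    d2 = lam * p[1]
    p = [p[0] + dt * d0, p[1] + dt * d1, p[2] + dt * d2]
    t += dt

print(f"P(system failed by t={T:.0f} h) = {p[2]:.6f}")
```

    Tools like SURE solve far larger (semi-)Markov models with rigorous bounds, but the structure is the same: state probabilities flow along failure and repair transitions, and the absorbing state accumulates the unreliability.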

  4. Toxic release consequence analysis tool (TORCAT) for inherently safer design plant.

    PubMed

    Shariff, Azmi Mohd; Zaini, Dzulkarnain

    2010-10-15

    Many major accidents due to toxic releases in the past have caused many fatalities, such as the tragedy of the MIC release in Bhopal, India (1984). One approach is to use the inherently safer design technique, which applies inherent safety principles to eliminate or minimize accidents rather than to control the hazard. This technique is best implemented in the preliminary design stage, where the consequences of a toxic release can be evaluated and necessary design improvements can be implemented to eliminate or minimize accidents to as low as reasonably practicable (ALARP) without resorting to costly protective systems. However, currently there is no commercial tool available that has such capability. This paper reports preliminary findings on the development of a prototype tool for consequence analysis and design improvement via inherent safety principles, which couples an integrated process design simulator with a toxic release consequence analysis model. Consequence analyses based on worst-case scenarios during the process flowsheeting stage were conducted as case studies. The preliminary findings show that the toxic release consequence analysis tool (TORCAT) has the capability to eliminate or minimize potential toxic release accidents by adopting inherent safety principles early in the preliminary design stage.
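    A consequence screen of this kind typically rests on a dispersion estimate; a textbook Gaussian plume sketch (a generic formula with assumed neutral-stability coefficients and an invented release scenario, not the actual TORCAT model) is:

```python
import math

def ground_level_conc(Q, u, x):
    """Centerline ground-level concentration [kg/m^3] at downwind distance x [m].

    Q: release rate [kg/s], u: wind speed [m/s]. Dispersion coefficients use
    a simple power-law fit for neutral (class D) stability -- an assumption.
    """
    sigma_y = 0.08 * x / math.sqrt(1 + 0.0001 * x)
    sigma_z = 0.06 * x / math.sqrt(1 + 0.0015 * x)
    return Q / (math.pi * u * sigma_y * sigma_z)

Q, u = 0.5, 3.0                       # 0.5 kg/s release into a 3 m/s wind
c100 = ground_level_conc(Q, u, 100.0)
c1000 = ground_level_conc(Q, u, 1000.0)
print(f"concentration at 100 m: {c100:.2e}, at 1000 m: {c1000:.2e} kg/m^3")
```

    Comparing such concentrations against toxicity thresholds at the flowsheeting stage is what lets a designer judge whether an inventory or operating-condition change removes the hazard outright.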

  5. A high performance scientific cloud computing environment for materials simulations

    NASA Astrophysics Data System (ADS)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.

  6. Development of a Three-Dimensional, Unstructured Material Response Design Tool

    NASA Technical Reports Server (NTRS)

    Schulz, Joseph C.; Stern, Eric C.; Muppidi, Suman; Palmer, Grant E.; Schroeder, Olivia

    2017-01-01

    A preliminary verification and validation of a new material response model is presented. This model, Icarus, is intended to serve as a design tool for the thermal protection systems of re-entry vehicles. Currently, the capability of the model is limited to simulating the pyrolysis of a material as a result of the radiative and convective surface heating imposed on the material by the surrounding high-enthalpy gas. Since the major focus behind the development of Icarus has been model extensibility, the hope is that additional physics can be quickly added. This extensibility is critical since thermal protection systems are becoming increasingly complex, e.g., woven carbon polymers. Additionally, as a three-dimensional, unstructured, finite-volume model, Icarus is capable of modeling complex geometries. In this paper, the mathematical and numerical formulation is presented, followed by a discussion of the software architecture and some preliminary verification and validation studies.

  7. The Distributed Space Exploration Simulation (DSES)

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Chung, Victoria I.; Blum, Mike G.; Bowman, James D.

    2007-01-01

    The paper describes the Distributed Space Exploration Simulation (DSES) Project, a research and development collaboration between NASA centers that focuses on the investigation and development of technologies, processes, and integrated simulations related to the collaborative distributed simulation of complex space systems in support of NASA's Exploration Initiative. This paper describes the three major components of DSES: network infrastructure, software infrastructure, and simulation development. In the network work area, DSES is developing a Distributed Simulation Network that will provide agency-wide support for distributed simulation between all NASA centers. In the software work area, DSES is developing a collection of software models, tools, and procedures that eases the burden of developing distributed simulations and provides a consistent interoperability infrastructure for agency-wide participation in integrated simulation. Finally, for simulation development, DSES is developing an integrated end-to-end simulation capability to support NASA development of new exploration spacecraft and missions. This paper presents the current status and plans for each of these work areas, with specific examples of simulations that support NASA's exploration initiatives.

  8. OsiriX software as a preoperative planning tool in cranial neurosurgery: A step-by-step guide for neurosurgical residents

    PubMed Central

    Spiriev, Toma; Nakov, Vladimir; Laleva, Lili; Tzekov, Christo

    2017-01-01

    Background: OsiriX (Pixmeo, Switzerland) is an open-source Digital Imaging and Communications in Medicine (DICOM) viewer that is gaining more and more attention in the neurosurgical community because of its user-friendly interface, powerful three-dimensional (3D) volumetric rendering capabilities, and various options for data integration. This paper presents in detail the use of OsiriX software as a preoperative planning tool in cranial neurosurgery. Methods: In January 2013, OsiriX software was introduced into our clinical practice as a preoperative planning tool. Its capabilities are being evaluated on an ongoing basis in routine elective cranial cases. Results: The program has proven to be highly effective at volumetrically representing data from radiological examinations in 3D. Among its benefits in preoperative planning are simulating the position and exact location of the lesion in 3D, tailoring the skin incision and craniotomy bone flap, enhancing the representation of normal and pathological anatomy, and aiding in planning the reconstruction of the affected area. Conclusion: OsiriX is a useful tool for preoperative planning and visualization in neurosurgery. The software greatly facilitates the surgeon's understanding of the relationship between normal and pathological anatomy and can be used as a teaching tool. PMID:29119039

  9. Update on SLD Engineering Tools Development

    NASA Technical Reports Server (NTRS)

    Miller, Dean R.; Potapczuk, Mark G.; Bond, Thomas H.

    2004-01-01

    The airworthiness authorities (FAA, JAA, Transport Canada) will be releasing a draft rule in the 2006 timeframe concerning the operation of aircraft in a Supercooled Large Droplet (SLD) environment aloft. The draft rule will require aircraft manufacturers to demonstrate that their aircraft can operate safely in an SLD environment for a period of time to facilitate a safe exit from the condition. It is anticipated that aircraft manufacturers will require a capability to demonstrate compliance with this rule via experimental means (icing tunnels or tankers) and by analytical means (ice prediction codes). Since existing icing research facilities and analytical codes were not developed to account for SLD conditions, current engineering tools are not adequate to support compliance activities in SLD conditions. Therefore, existing capabilities need to be augmented to include SLD conditions. In response to this need, NASA and its partners conceived a strategy or Roadmap for developing experimental and analytical SLD simulation tools. Following review and refinement by the airworthiness authorities and other international research partners, this technical strategy has been crystallized into a project plan to guide the SLD Engineering Tool Development effort. This paper will provide a brief overview of the latest version of the project plan and technical rationale, and provide a status of selected SLD Engineering Tool Development research tasks which are currently underway.

  10. The Distributed Geothermal Market Demand Model (dGeo): Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCabe, Kevin; Mooney, Meghan E; Sigrin, Benjamin O

    The National Renewable Energy Laboratory (NREL) developed the Distributed Geothermal Market Demand Model (dGeo) as a tool to explore the potential role of geothermal distributed energy resources (DERs) in meeting thermal energy demands in the United States. The dGeo model simulates the potential for deployment of geothermal DERs in the residential and commercial sectors of the continental United States for two specific technologies: ground-source heat pumps (GHP) and geothermal direct use (DU) for district heating. To quantify the opportunity space for these technologies, dGeo leverages a highly resolved geospatial database and a robust bottom-up, agent-based modeling framework. This design is consistent with others in the family of Distributed Generation Market Demand models (dGen; Sigrin et al. 2016), including the Distributed Solar Market Demand (dSolar) and Distributed Wind Market Demand (dWind) models. dGeo is intended to serve as a long-term scenario-modeling tool. It has the capability to simulate the technical potential, economic potential, market potential, and technology deployment of GHP and DU through the year 2050 under a variety of user-defined input scenarios. Through these capabilities, dGeo can provide substantial analytical value to stakeholders interested in exploring the effects of techno-economic, macroeconomic, financial, and policy factors related to the opportunity for GHP and DU in the United States. This report documents the dGeo modeling design, methodology, assumptions, and capabilities.
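    The bottom-up, agent-based framework can be illustrated with a minimal sketch in which each agent decides to adopt based on a payback-period screen. All numbers and the willingness-to-adopt curve below are hypothetical, not NREL's calibration:

```python
import math
import random

def payback_years(capex, annual_savings):
    """Simple payback period; a stand-in for dGeo's cash-flow economics."""
    return capex / annual_savings if annual_savings > 0 else float("inf")

def adoption_probability(payback):
    # Hypothetical logistic willingness-to-adopt curve (not NREL's fit):
    # short paybacks adopt with high probability, long ones rarely.
    return 1.0 / (1.0 + math.exp(0.8 * (payback - 7.0)))

random.seed(42)
# Each 'agent' is one residential site with hypothetical GHP economics.
agents = [{"capex": random.uniform(8e3, 25e3),        # installed cost ($)
           "savings": random.uniform(400.0, 2500.0)}  # yearly savings ($)
          for _ in range(1000)]

adopters = sum(
    1 for a in agents
    if random.random() < adoption_probability(
        payback_years(a["capex"], a["savings"])))
market_share = adopters / len(agents)
```

    Sweeping the cost and savings distributions over time is what turns this one-shot calculation into a deployment scenario through 2050.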

  11. Simulation of Lunar Surface Communications Network Exploration Scenarios

    NASA Technical Reports Server (NTRS)

    Linsky, Thomas W.; Bhasin, Kul B.; White, Alex; Palangala, Srihari

    2006-01-01

    Simulation and modeling of surface-based communications networks provide a rapid and cost-effective means of requirements analysis, protocol assessment, and tradeoff studies. Robust testing is especially important for exploration systems, where the cost of deployment is high and systems cannot be easily replaced or repaired. However, simulation of the envisioned exploration networks cannot be achieved using commercial off-the-shelf network simulation software, since models for the nonstandard, non-COTS protocols used aboard space systems are not readily available. This paper addresses the simulation of realistic scenarios representative of the activities that will take place on the surface of the Moon, including the selection of candidate network architectures and the development of an integrated simulation tool, using OPNET Modeler, capable of faithfully modeling those communications scenarios in variable-delay, dynamic surface environments. Scenarios for exploration missions, OPNET development, limitations, and simulation results are provided and discussed.
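    The kind of packet-level modeling described can be sketched as a minimal discrete-event queue over one delayed link. The interval, delay, and service values are illustrative only; OPNET models are far more detailed:

```python
import heapq

def simulate(packets, send_interval, prop_delay, service_time):
    """Tiny discrete-event model: packets queued on one delayed link.

    Each packet waits for the link to free, occupies it for
    service_time, then arrives after the propagation delay.
    """
    events = []        # heap of (arrival_time, sequence_number)
    link_free = 0.0
    for seq in range(packets):
        t_ready = seq * send_interval        # generation time
        t_start = max(t_ready, link_free)    # queue behind earlier packets
        link_free = t_start + service_time
        heapq.heappush(events, (link_free + prop_delay, seq))
    return [heapq.heappop(events) for _ in range(len(events))]

# prop_delay of 1.3 s approximates the one-way Earth-Moon light time.
deliveries = simulate(packets=5, send_interval=0.1,
                      prop_delay=1.3, service_time=0.05)
last_arrival = deliveries[-1][0]
```

    Replacing the single link with a routed topology and protocol state machines is where a full simulator such as OPNET takes over.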

  12. Numerical modeling of hydrology to support watershed management, using geographic information systems together with the finite element method: a new tool for sustainable development (SAGESS)

    NASA Astrophysics Data System (ADS)

    Bel Hadj Kacem, Mohamed Salah

    All hydrological processes are affected by the spatial variability of the physical parameters of the watershed, as well as by human intervention on the landscape. The water outflow from a watershed depends strictly on the spatial and temporal variability of these physical parameters. It is now apparent that the integration of mathematical models into GISs can benefit both GIS and three-dimensional environmental models: a true modeling capability can help the modeling community bridge the gap between planners, scientists, decision-makers, and end-users. The main goal of this research is to design a practical tool that simulates surface run-off using Geographic Information Systems together with simulation of the hydrological behavior by the Finite Element Method.

  13. A new statistical model for subgrid dispersion in large eddy simulations of particle-laden flows

    NASA Astrophysics Data System (ADS)

    Muela, Jordi; Lehmkuhl, Oriol; Pérez-Segarra, Carles David; Oliva, Asensi

    2016-09-01

    Dispersed multiphase turbulent flows are present in many industrial and commercial applications such as internal combustion engines, turbofans, dispersion of contaminants, and steam turbines. There is therefore a clear interest in developing models and numerical tools capable of performing detailed and reliable simulations of these kinds of flows. Large Eddy Simulation (LES) offers good accuracy and reliable results together with reasonable computational requirements, making it an attractive basis for numerical tools for particle-laden turbulent flows. Nonetheless, in dispersed multiphase flows additional difficulties arise in LES, since the effect of the unresolved scales of the continuous phase on the dispersed phase is lost through the filtering procedure. Solving this issue requires a model able to reconstruct the subgrid velocity seen by the particles. In this work a new model for reconstructing the subgrid-scale effects on the dispersed phase is presented and assessed. The methodology is based on the reconstruction of statistics via Probability Density Functions (PDFs).
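    A common baseline for reconstructing the subgrid velocity seen by particles, which PDF-based models such as this one refine, is to sample Gaussian fluctuations whose variance is set by the subgrid kinetic energy. A sketch of that baseline (not the authors' scheme):

```python
import numpy as np

def subgrid_velocity_seen(u_filtered, k_sgs, rng):
    """Add a stochastic subgrid fluctuation to the resolved velocity.

    Gaussian sampling with per-component variance (2/3) k_sgs is a
    standard baseline; the paper builds richer PDFs from LES statistics.
    """
    sigma = np.sqrt(2.0 * k_sgs / 3.0)   # per-component rms fluctuation
    return u_filtered + rng.normal(0.0, sigma, size=u_filtered.shape)

rng = np.random.default_rng(0)
u_res = np.array([1.0, 0.0, 0.0])        # resolved velocity at particle (m/s)
samples = np.array([subgrid_velocity_seen(u_res, k_sgs=0.06, rng=rng)
                    for _ in range(20000)])
mean_u = samples.mean(axis=0)            # should recover the resolved value
var_u = samples.var(axis=0)              # should recover (2/3) k_sgs = 0.04
```

    In a particle tracker, one such sample per particle per step supplies the "velocity seen" that drives the drag force.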

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dorier, Matthieu; Sisneros, Roberto; Bautista Gomez, Leonard

    While many parallel visualization tools now provide in situ visualization capabilities, the trend has been to feed such tools with large amounts of unprocessed output data and let them render everything at the highest possible resolution. This leads to an increased run time of simulations that still have to complete within a fixed-length job allocation. In this paper, we tackle the challenge of enabling in situ visualization under performance constraints. Our approach shuffles data across processes according to its content and filters out part of it in order to feed a visualization pipeline with only a reorganized subset of the data produced by the simulation. Our framework leverages fast, generic evaluation procedures to score blocks of data, using information theory, statistics, and linear algebra. It monitors its own performance and adapts dynamically to achieve appropriate visual fidelity within predefined performance constraints. Experiments on the Blue Waters supercomputer with the CM1 simulation show that our approach enables a 5x speedup with respect to the initial visualization pipeline and is able to meet performance constraints.
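    The block-scoring idea can be sketched with one of the information-theoretic measures mentioned, Shannon entropy of per-block histograms. This is a minimal stand-in for the framework's generic scoring procedures:

```python
import numpy as np

def block_entropy(block, bins=32):
    """Shannon entropy (bits) of a block's value histogram."""
    hist, _ = np.histogram(block, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def select_blocks(field, block, keep_fraction=0.25):
    """Split a 2-D field into blocks and keep the highest-entropy ones."""
    n = field.shape[0] // block
    blocks = [field[i*block:(i+1)*block, j*block:(j+1)*block]
              for i in range(n) for j in range(n)]
    order = sorted(range(len(blocks)),
                   key=lambda i: block_entropy(blocks[i]), reverse=True)
    keep = max(1, int(keep_fraction * len(blocks)))
    return [blocks[i] for i in order[:keep]]

rng = np.random.default_rng(1)
field = np.zeros((64, 64))
field[:32, :32] = rng.normal(size=(32, 32))  # only one quadrant has structure
kept = select_blocks(field, block=32, keep_fraction=0.25)
```

    Only the kept blocks would be forwarded to the rendering pipeline; the constant regions are filtered out, which is what buys the speedup under a time budget.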

  15. Toward Interactive Scenario Analysis and Exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gayle, Thomas R.; Summers, Kenneth Lee; Jungels, John

    2015-01-01

    As Modeling and Simulation (M&S) tools have matured, their applicability and importance have increased across many national security challenges. In particular, they provide a way to test how something may behave without the need for real-world testing. However, current and future changes across several factors, including capabilities, policy, and funding, are driving a need for rapid response or evaluation in ways that many M&S tools cannot address. Issues around large data, computational requirements, delivery mechanisms, and analyst involvement already exist and pose significant challenges. Furthermore, rising expectations, rising input complexity, and increasing depth of analysis will only increase the difficulty of these challenges. In this study we examine whether innovations in M&S software coupled with advances in "cloud" computing and "big-data" methodologies can overcome many of these challenges. In particular, we propose a simple, horizontally scalable distributed computing environment that could provide the foundation (i.e., "cloud") for next-generation M&S-based applications based on the notion of "parallel multi-simulation". In our context, the goal of parallel multi-simulation is to consider as many simultaneous paths of execution as possible. Therefore, with sufficient resources, the complexity is dominated by the cost of single scenario runs as opposed to the number of runs required. We show the feasibility of this architecture through a stable prototype implementation coupled with the Umbra Simulation Framework [6]. Finally, we highlight the utility through multiple novel analysis tools and by showing the performance improvement compared to existing tools.

  16. A Coupled Multiphysics Approach for Simulating Induced Seismicity, Ground Acceleration and Structural Damage

    NASA Astrophysics Data System (ADS)

    Podgorney, Robert; Coleman, Justin; Wilkins, Andrew; Huang, Hai; Veeraraghavan, Swetha; Xia, Yidong; Permann, Cody

    2017-04-01

    Numerical modeling has played an important role in understanding the behavior of coupled subsurface thermal-hydro-mechanical (THM) processes associated with a number of energy and environmental applications since as early as the 1970s. While the ability to rigorously describe all key tightly coupled controlling physics remains a challenge, there have been significant advances in recent decades. These advances relate primarily to the exponential growth of computational power, the development of more accurate equations of state, improvements in the ability to represent heterogeneity and reservoir geometry, and more robust nonlinear solution schemes. The work described in this paper documents the development and linkage of several fully coupled and fully implicit modeling tools. These tools simulate: (1) the dynamics of fluid flow, heat transport, and quasi-static rock mechanics; (2) seismic wave propagation from the sources of energy release through heterogeneous material; and (3) the soil-structural damage resulting from ground acceleration. The tools are developed in Idaho National Laboratory's parallel Multiphysics Object Oriented Simulation Environment and are integrated using a global implicit approach. The governing equations are presented, the numerical approach for simultaneously solving and coupling the three physics tools is discussed, and the data input and output methodology is outlined. An example is presented to demonstrate the capabilities of the coupled multiphysics approach. The example involves simulating a system conceptually similar to the geothermal development in Basel, Switzerland; the resulting induced seismicity, ground motion, and structural damage are predicted.
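    A global implicit approach assembles the residuals of all physics into one Newton system solved simultaneously, rather than passing fields back and forth between solvers. A toy two-field sketch, with hypothetical linearized "flow" and "heat" equations standing in for the actual governing equations:

```python
import numpy as np

def residual(x):
    """Two tightly coupled 'physics' residuals R(p, T) = 0."""
    p, T = x
    return np.array([p + 0.5 * T - 2.0,    # hypothetical 'flow' equation
                     0.3 * p + T - 1.0])   # hypothetical 'heat' equation

def jacobian(x):
    """Full coupled Jacobian, including the cross-physics terms."""
    return np.array([[1.0, 0.5],
                     [0.3, 1.0]])

x = np.array([0.0, 0.0])
for _ in range(20):                        # Newton: solve J dx = -R, update
    r = residual(x)
    if np.linalg.norm(r) < 1e-12:
        break
    x = x + np.linalg.solve(jacobian(x), -r)
p_sol, T_sol = x
```

    Because this toy system is linear, Newton converges in one step; the point is the structure: the off-diagonal Jacobian entries are exactly the tight coupling that operator-splitting schemes approximate.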

  17. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model

    PubMed Central

    2016-01-01

    The omnipresent need for optimisation requires constant improvements to companies’ business processes (BPs). Minimising the risk of implementing an inappropriate BP is usually achieved by simulating the newly developed BP under various initial conditions and “what-if” scenarios. Effective business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include the quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for the ranking of BPSS tools based on their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results. PMID:26871694
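    The hierarchical aggregation can be illustrated with a simple weighted additive stand-in. Note that DEX/QQ actually aggregates through qualitative utility rules rather than numeric weights, and the criteria, weights, and tool names here are hypothetical:

```python
# Hypothetical two-level criteria hierarchy with numeric weights.
criteria_tree = {
    "usability":  {"weight": 0.4,
                   "children": {"gui": 0.6, "reporting": 0.4}},
    "simulation": {"weight": 0.6,
                   "children": {"capability": 0.7, "statistics": 0.3}},
}

def score(tool_ratings):
    """Aggregate leaf ratings (0-1) bottom-up through the hierarchy."""
    total = 0.0
    for group in criteria_tree.values():
        sub = sum(w * tool_ratings[leaf]
                  for leaf, w in group["children"].items())
        total += group["weight"] * sub
    return total

tools = {  # hypothetical expert ratings for two candidate BPSS tools
    "BPSS-A": {"gui": 0.9, "reporting": 0.6, "capability": 0.7, "statistics": 0.8},
    "BPSS-B": {"gui": 0.5, "reporting": 0.7, "capability": 0.9, "statistics": 0.6},
}
ranking = sorted(tools, key=lambda t: score(tools[t]), reverse=True)
```

    Extending the model, as the paper emphasizes, amounts to adding a node to `criteria_tree` without touching the aggregation code.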

  18. Validation of highly reliable, real-time knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1988-01-01

    Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.

  19. Modeling a Wireless Network for International Space Station

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Yaprak, Ece; Lamouri, Saad

    2000-01-01

    This paper describes the application of wireless local area network (LAN) simulation modeling methods to the hybrid LAN architecture designed to support crew-computing tools aboard the International Space Station (ISS). These crew-computing tools, such as wearable computers and portable advisory systems, will provide crew members with real-time vehicle and payload status information and access to digital technical and scientific libraries, significantly enhancing human capabilities in space. A wireless network, therefore, will provide wearable computers and remote instruments with the high-performance computational power needed by next-generation 'intelligent' software applications. Wireless network performance in such simulated environments is characterized by the sustainable throughput of data under different traffic conditions. This data will be used to help plan the addition of more access points supporting new modules and more nodes for increased network capacity as the ISS grows.

  20. DesignSafe-CI: A Cyberinfrastructure for Natural Hazard Simulation and Data

    NASA Astrophysics Data System (ADS)

    Dawson, C.; Rathje, E.; Stanzione, D.; Padgett, J.; Pinelli, J. P.

    2017-12-01

    DesignSafe is the web-based research platform of the Natural Hazards Engineering Research Infrastructure (NHERI) network that provides the computational tools needed to manage and analyze critical data for natural hazards research, with wind and storm surge related hazards being a primary focus. One of the simulation tools under DesignSafe is the Advanced Circulation (ADCIRC) model, a coastal ocean model used in storm surge analysis. ADCIRC is an unstructured, finite element model with high resolution capabilities for studying storm surge impacts, and has long been used in storm surge hind-casting and forecasting. In this talk, we will demonstrate the use of ADCIRC within the DesignSafe platform and its use for forecasting Hurricane Harvey. We will also demonstrate how to analyze, visualize and archive critical storm surge related data within DesignSafe.

  1. Prediction of Thermal Transport Properties of Materials with Microstructural Complexity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Youping

    This project aims at overcoming the major obstacle standing in the way of progress in dynamic multiscale simulation: the lack of a concurrent atomistic-continuum method that allows phonons, heat, and defects to pass through the atomistic-continuum interface. The research has led to the development of a concurrent atomistic-continuum (CAC) methodology for multiscale simulations of materials' microstructural, mechanical, and thermal transport behavior. Its efficacy has been tested and demonstrated through simulations of dislocation dynamics and phonon transport coupled with microstructural evolution in a variety of materials, and through providing visual evidence of the nature of phonon transport, such as showing that the propagation of heat pulses in single and polycrystalline solids is partially ballistic and partially diffusive. In addition to providing understanding of phonon scattering with phase interfaces and grain boundaries, the research has contributed a multiscale simulation tool for understanding the behavior of complex materials and has demonstrated the capability of the tool in simulating the dynamic, in situ experimental studies of nonequilibrium transient transport processes in material samples at length scales typically inaccessible to atomistically resolved methods.

  2. MuSim, a Graphical User Interface for Multiple Simulation Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, Thomas; Cummings, Mary Anne; Johnson, Rolland

    2016-06-01

    MuSim is a new user-friendly program designed to interface to many different particle simulation codes, regardless of their data formats or geometry descriptions. It presents the user with a compelling graphical user interface that includes a flexible 3-D view of the simulated world plus powerful editing and drag-and-drop capabilities. All aspects of the design can be parametrized so that parameter scans and optimizations are easy. It is simple to create plots and display events in the 3-D viewer (with a slider to vary the transparency of solids), allowing for an effortless comparison of different simulation codes. Simulation codes: G4beamline, MAD-X, and MCNP; more coming. Many accelerator design tools and beam optics codes were written long ago, with primitive user interfaces by today's standards. MuSim is specifically designed to make it easy to interface to such codes, providing a common user experience for all, and permitting the construction and exploration of models with very little overhead. For today's technology-driven students, graphical interfaces meet their expectations far better than text-based tools, and education in accelerator physics is one of our primary goals.

  3. Coupling the System Analysis Module with SAS4A/SASSYS-1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fanning, T. H.; Hu, R.

    2016-09-30

    SAS4A/SASSYS-1 is a simulation tool used to perform deterministic analysis of anticipated events as well as design basis and beyond-design-basis accidents for advanced reactors, with an emphasis on sodium fast reactors. SAS4A/SASSYS-1 has been under development and in active use for nearly forty-five years and is currently maintained by the U.S. Department of Energy under the Office of Advanced Reactor Technology. Although SAS4A/SASSYS-1 contains a very capable primary and intermediate system modeling component, PRIMAR-4, it also has some shortcomings: outdated data management and code structure make extension of the PRIMAR-4 module somewhat difficult. The user input format for PRIMAR-4 also limits the number of volumes and segments that can be used to describe a given system. The System Analysis Module (SAM) is a fairly new code development effort being carried out under the U.S. DOE Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. SAM is being developed with advanced physical models, numerical methods, and software engineering practices; however, it is currently somewhat limited in the system components and phenomena that can be represented. For example, component models for electromagnetic pumps and multi-layer stratified volumes have not yet been developed, nor is there support for a balance-of-plant model. Similarly, system-level phenomena such as control-rod driveline expansion and vessel elongation are not represented. This report documents fiscal year 2016 work that was carried out to couple the transient safety analysis capabilities of SAS4A/SASSYS-1 with the system modeling capabilities of SAM under the joint support of the ART and NEAMS programs. The coupling effort was successful and is demonstrated by evaluating an unprotected loss-of-flow transient for the Advanced Burner Test Reactor (ABTR) design.
There are differences between the stand-alone SAS4A/SASSYS-1 simulations and the coupled SAS/SAM simulations, but these are mainly attributed to the limited maturity of the SAM development effort. The severe accident modeling capabilities in SAS4A/SASSYS-1 (sodium boiling, fuel melting and relocation) will continue to play a vital role for a long time. Therefore, the SAS4A/SASSYS-1 modernization effort should remain a high-priority task under the ART program to ensure continued participation in domestic and international SFR safety collaborations and design optimizations. On the other hand, SAM provides an advanced system analysis tool, with improved numerical solution schemes, data management, code flexibility, and accuracy. SAM is still in the early stages of development and will require continued support from NEAMS to fulfill its potential and to mature into a production tool for advanced reactor safety analysis. The effort to couple SAS4A/SASSYS-1 and SAM is the first step in the integration of these modeling capabilities.

  4. CONFIG: Integrated engineering of systems and their operation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Ryan, Dan; Fleming, Land

    1994-01-01

    This article discusses CONFIG 3, a prototype software tool that supports integrated conceptual design evaluation from early in the product life cycle, by supporting isolated or integrated modeling, simulation, and analysis of the function, structure, behavior, failures and operations of system designs. Integration and reuse of models is supported in an object-oriented environment providing capabilities for graph analysis and discrete event simulation. CONFIG supports integration among diverse modeling approaches (component view, configuration or flow path view, and procedure view) and diverse simulation and analysis approaches. CONFIG is designed to support integrated engineering in diverse design domains, including mechanical and electro-mechanical systems, distributed computer systems, and chemical processing and transport systems.

  5. Aviation Trends Related to Atmospheric Environment Safety Technologies Project Technical Challenges

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Withrow, Colleen A.; Barr, Lawrence C.; Evans, Joni K.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    Current and future aviation safety trends related to the National Aeronautics and Space Administration's Atmospheric Environment Safety Technologies Project's three technical challenges (engine icing characterization and simulation capability; airframe icing simulation and engineering tool capability; and atmospheric hazard sensing and mitigation technology capability) were assessed by examining the National Transportation Safety Board (NTSB) accident database (1989 to 2008), incidents from the Federal Aviation Administration (FAA) accident/incident database (1989 to 2006), and literature from various industry and government sources. The accident and incident data were examined for events involving fixed-wing airplanes operating under Federal Aviation Regulation (FAR) Parts 121, 135, and 91 for atmospheric conditions related to airframe icing, ice-crystal engine icing, turbulence, clear air turbulence, wake vortex, lightning, and low visibility (fog, low ceiling, clouds, precipitation, and low lighting). Five future aviation safety risk areas associated with the three AEST technical challenges were identified after an exhaustive survey of a variety of sources and include: approach and landing accident reduction, icing/ice detection, loss of control in flight, super density operations, and runway safety.

  6. Sustainable Human Presence on the Moon using In Situ Resources

    NASA Technical Reports Server (NTRS)

    McLemore, Carol A.; Fikes, John C.; McCarley, Kevin S.; Darby, Charles A.; Curreri, Peter A.; Kennedy, James P.; Good, James E.; Gilley, Scott D.

    2008-01-01

    New capabilities, technologies and infrastructure must be developed to enable a sustained human presence on the moon and beyond. The key to having this permanent presence is the utilization of in situ resources. To this end, NASA is investigating how in situ resources can be utilized to improve mission success by reducing up-mass, improving safety, reducing risk, and bringing down costs for the overall mission. To ensure that this capability is available when needed, technology development is required now. NASA/Marshall Space Flight Center (MSFC) is supporting this endeavor, along with other NASA centers, by exploring how lunar regolith can be mined for uses such as construction, life support, propulsion, power, and fabrication. Efforts at MSFC include development of lunar regolith simulant for hardware testing and development, extraction of oxygen and other materials from the lunar regolith, production of parts and tools on the moon from local materials or from provisioned feedstocks, and capabilities to show that produced parts are "ready for use". This paper discusses the lunar regolith, how the regolith is being replicated in the development of simulants and possible uses of the regolith.

  7. Performance analysis of a parallel Monte Carlo code for simulating solar radiative transfer in cloudy atmospheres using CUDA-enabled NVIDIA GPU

    NASA Astrophysics Data System (ADS)

    Russkova, Tatiana V.

    2017-11-01

    One tool to improve the performance of Monte Carlo methods for numerical simulation of light transport in the Earth's atmosphere is parallel computing technology. A new algorithm oriented to parallel execution on a CUDA-enabled NVIDIA graphics processor is discussed. The efficiency of parallelization is analyzed on the basis of calculating the upward and downward fluxes of solar radiation in both vertically homogeneous and inhomogeneous models of the atmosphere. The results of testing the new code under various atmospheric conditions, including continuous single-layered and multilayered clouds and selective molecular absorption, are presented. The results of testing the code using video cards with different compute capability are analyzed. It is shown that the changeover of computing from conventional PCs to the architecture of graphics processors gives more than a hundredfold increase in performance and fully reveals the capabilities of the technology used.
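    The speedup described above rests on the fact that photon histories are independent, so each GPU thread can trace its own photon. Below is a minimal vectorized sketch of that idea, with NumPy arrays standing in for per-thread parallelism; it uses a single homogeneous slab with isotropic scattering and illustrative parameter names, and is a toy model, not the paper's algorithm.

```python
import numpy as np

def slab_transmittance(tau_star, omega0, n_photons=100_000, seed=0):
    """Toy plane-parallel Monte Carlo: fraction of photons injected at the
    top of a homogeneous slab (optical depth tau_star, single-scattering
    albedo omega0, isotropic scattering) that exits through the bottom.
    All photons are advanced in lockstep, mimicking GPU thread parallelism."""
    rng = np.random.default_rng(seed)
    tau = np.zeros(n_photons)          # vertical optical-depth coordinate
    mu = np.ones(n_photons)            # direction cosine, +1 = downward
    alive = np.ones(n_photons, dtype=bool)
    transmitted = 0
    while alive.any():
        idx = np.flatnonzero(alive)
        # sample a free path (in optical-depth units) and move each photon
        tau[idx] += mu[idx] * -np.log(1.0 - rng.random(idx.size))
        out_bottom = alive & (tau >= tau_star)   # exits below: transmitted
        out_top = alive & (tau < 0.0)            # exits above: reflected
        transmitted += int(out_bottom.sum())
        alive &= ~(out_bottom | out_top)
        # survivors interact: absorbed with probability 1-omega0,
        # otherwise scattered into a new isotropic direction
        idx = np.flatnonzero(alive)
        absorbed = rng.random(idx.size) >= omega0
        alive[idx[absorbed]] = False
        live = idx[~absorbed]
        mu[live] = rng.uniform(-1.0, 1.0, live.size)
    return transmitted / n_photons
```

In a CUDA implementation each history would run in its own thread; here the whole photon population is advanced one scattering order per loop iteration, which is the same embarrassingly parallel structure.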

  8. The distributed production system of the SuperB project: description and results

    NASA Astrophysics Data System (ADS)

    Brown, D.; Corvo, M.; Di Simone, A.; Fella, A.; Luppi, E.; Paoloni, E.; Stroili, R.; Tomassetti, L.

    2011-12-01

    The SuperB experiment needs large samples of Monte Carlo simulated events in order to finalize the detector design and to estimate the data analysis performance. The requirements are beyond the capabilities of a single computing farm, so a distributed production model capable of exploiting the existing HEP worldwide distributed computing infrastructure is needed. In this paper we describe the set of tools that have been developed to manage the production of the required simulated events. The production of events follows three main phases: distribution of input data files to the remote site Storage Elements (SE); job submission, via the SuperB GANGA interface, to all available remote sites; and output file transfer to the CNAF repository. The job workflow includes procedures for consistency checking, monitoring, data handling and bookkeeping. A replication mechanism allows storing the job output on the local site SE. Results from the 2010 official productions are reported.

  9. Highly Automated Arrival Management and Control System Suitable for Early NextGen

    NASA Technical Reports Server (NTRS)

    Swenson, Harry N.; Jung, Jaewoo

    2013-01-01

    This is a presentation of previously published work conducted in the development of the Terminal Area Precision Scheduling and Spacing (TAPSS) system. Included are concept and technical descriptions of the TAPSS system and results from human-in-the-loop simulations conducted at Ames Research Center. The Terminal Area Precision Scheduling and Spacing system has been shown, through research and extensive high-fidelity simulation studies, to provide benefits in airport arrival throughput, support efficient arrival descents, and enable mixed aircraft navigation capability operations during periods of high congestion. NASA is currently porting the TAPSS system into the FAA TBFM and STARS system prototypes to ensure its ability to operate in the FAA automation infrastructure. The NASA ATM Demonstration Project is using the TAPSS technologies to provide the ground-based automation tools to enable airborne Interval Management (IM) capabilities. NASA and the FAA have initiated a Research Transition Team to enable potential TAPSS and IM technology transfer.

  10. Impact of the Columbia Supercomputer on NASA Space and Exploration Mission

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Kwak, Dochan; Kiris, Cetin; Lawrence, Scott

    2006-01-01

    NASA's 10,240-processor Columbia supercomputer gained worldwide recognition in 2004 for increasing the space agency's computing capability ten-fold, and enabling U.S. scientists and engineers to perform significant, breakthrough simulations. Columbia has amply demonstrated its capability to accelerate NASA's key missions, including space operations, exploration systems, science, and aeronautics. Columbia is part of an integrated high-end computing (HEC) environment comprised of massive storage and archive systems, high-speed networking, high-fidelity modeling and simulation tools, application performance optimization, and advanced data analysis and visualization. In this paper, we illustrate the impact Columbia is having on NASA's numerous space and exploration applications, such as the development of the Crew Exploration and Launch Vehicles (CEV/CLV), effects of long-duration human presence in space, and damage assessment and repair recommendations for remaining shuttle flights. We conclude by discussing HEC challenges that must be overcome to solve space-related science problems in the future.

  11. Performance optimization for space-based sensors: simulation and modelling at Fraunhofer IOSB

    NASA Astrophysics Data System (ADS)

    Schweitzer, Caroline; Stein, Karin

    2014-10-01

    The prediction of the effectiveness of a space-based sensor for its designated application in space (e.g. special earth surface observations or missile detection) can help to reduce the expenses, especially during the phases of mission planning and instrumentation. In order to optimize the performance of such systems we simulate and analyse the entire operational scenario, including:
    - optional waveband
    - various orbit heights and viewing angles
    - system design characteristics, e.g. pixel size and filter transmission
    - atmospheric effects, e.g. different cloud types, climate zones and seasons
    In the following, an evaluation of the appropriate infrared (IR) waveband for the designated sensor application is given. The simulation environment is also capable of simulating moving objects like aircraft or missiles. Therefore, the spectral signature of the object/missile as well as its track along a flight path is implemented. The resulting video sequence is then analysed by a tracking algorithm and an estimation of the effectiveness of the sensor system can be simulated. This paper summarizes the work carried out at Fraunhofer IOSB in the field of simulation and modelling for the performance optimization of space-based sensors. The paper is structured as follows: First, an overview of the applied simulation and modelling software is given. Then, the capability of those tools is illustrated by means of a hypothetical threat scenario for space-based early warning (launch of a long-range ballistic missile (BM)).

  12. Development and evaluation of the Screening Trajectory Ozone Prediction System (STOPS, version 1.0)

    NASA Astrophysics Data System (ADS)

    Czader, B. H.; Percell, P.; Byun, D.; Kim, S.; Choi, Y.

    2015-05-01

    A hybrid Lagrangian-Eulerian modeling tool has been developed using the Eulerian framework of the Community Multiscale Air Quality (CMAQ) model. It is a moving nest that utilizes saved original CMAQ simulation results to provide the boundary conditions, initial conditions, emissions, and meteorological parameters necessary for a simulation. Provided these files are available, the tool can run independently of the CMAQ whole-domain simulation, and it is designed to simulate source-receptor relationships upon changes in emissions. In this tool, the original CMAQ horizontal domain is reduced to a small sub-domain that follows a trajectory defined by the mean mixed-layer wind. It has the same vertical structure and physical and chemical interactions as CMAQ, except for the advection calculation. The advantage of this tool compared to other Lagrangian models is its ability to utilize realistic boundary conditions that change with space and time, as well as its detailed chemistry treatment. The correctness of the algorithms and the overall performance were evaluated against CMAQ simulation results. Its performance depends on the atmospheric conditions occurring during the simulation period, with the comparisons being most similar to CMAQ results under uniform wind conditions. The mean bias for surface ozone mixing ratios varies between -0.03 and -0.78 ppbV and the slope is between 0.99 and 1.01 for the different analyzed cases. For complicated meteorological conditions, such as wind circulation, the simulated mixing ratios deviate from CMAQ values as a result of the Lagrangian approach of using the mean wind for its movement, but are still close, with the mean bias for ozone varying between 0.07 and -4.29 ppbV and the slope varying between 0.95 and 1.06 for the different analyzed cases.
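    Evaluation metrics of the kind quoted above can be computed with a small helper; the function below is an illustrative sketch, not part of STOPS. Mean bias is the average of the model-minus-reference differences, and the slope comes from an ordinary least-squares fit of the tool's values against the reference (CMAQ) values.

```python
import numpy as np

def mean_bias_and_slope(model, reference):
    """Mean bias = mean(model - reference); slope = ordinary least-squares
    slope of model regressed on reference. Hypothetical helper for
    model-vs-model comparison statistics."""
    model = np.asarray(model, dtype=float)
    reference = np.asarray(reference, dtype=float)
    bias = float(np.mean(model - reference))
    slope, _intercept = np.polyfit(reference, model, 1)
    return bias, float(slope)

# identical series give zero bias and unit slope
b, s = mean_bias_and_slope([40.0, 50.0, 60.0], [40.0, 50.0, 60.0])
```

A bias near zero with a slope near one (as in the uniform-wind cases above) indicates the moving nest tracks the full-domain solution closely.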
For historical reasons, this hybrid Lagrangian-Eulerian tool is named the Screening Trajectory Ozone Prediction System (STOPS), but its use is not limited to ozone prediction: similarly to CMAQ, it can simulate the concentrations of many species, including particulate matter and some toxic compounds, such as formaldehyde and 1,3-butadiene.

  13. Multidisciplinary model-based-engineering for laser weapon systems: recent progress

    NASA Astrophysics Data System (ADS)

    Coy, Steve; Panthaki, Malcolm

    2013-09-01

    We are working to develop a comprehensive, integrated software framework and toolset to support model-based engineering (MBE) of laser weapon systems. MBE has been identified by the Office of the Director, Defense Science and Engineering as one of four potentially "game-changing" technologies that could bring about revolutionary advances across the entire DoD research, development, and procurement cycle. To be effective, however, MBE requires robust underlying modeling and simulation technologies capable of modeling all the pertinent systems, subsystems, components, effects, and interactions at any level of fidelity that may be required in order to support crucial design decisions at any point in the system development lifecycle. Very often the greatest technical challenges are posed by systems involving interactions that cut across two or more distinct scientific or engineering domains; even in cases where there are excellent tools available for modeling each individual domain, generally none of these domain-specific tools can be used to model the cross-domain interactions. In the case of laser weapon systems R&D, these tools need to be able to support modeling of systems involving combined interactions among structures, thermal, and optical effects, including both ray optics and wave optics, controls, atmospheric effects, target interaction, computational fluid dynamics, and spatiotemporal interactions between lasing light and the laser gain medium. To address this problem we are working to extend Comet™ to add the additional modeling and simulation capabilities required for this particular application area. In this paper we will describe our progress to date.

  14. Closed Environment Module - modularization and extension of the V-HAB

    NASA Astrophysics Data System (ADS)

    Plötner, Peter; Czupalla, M. Markus; Zhukov, Anton

    2012-07-01

    The `Virtual Habitat' (V-HAB) is a Life Support System (LSS) simulation created to provide the possibility of dynamic simulation of LSS for future human spaceflight missions. V-HAB creates the option to optimize LSS during early design phases. Furthermore, it allows simulating e.g. worst-case scenarios which cannot be tested in reality. In a nutshell, the tool allows the testing of LSS robustness by means of computer simulations. V-HAB is a modular simulation consisting of a:
    - Closed Environment Module (CEM)
    - Crew Module
    - Biological Module
    - Physio-Chemical Module
    The focus of the paper will be the Closed Environment Module (CEM), which is the core of V-HAB. The main function of the CEM is the embedding of all modules in the entire simulation and the control of the LSS. The CEM includes the possibility to simulate an arbitrary number of compartments and tanks, with interaction between connected compartments. Furthermore, a control program to actuate the LSS technologies was implemented in the CEM and is also introduced. In this paper the capabilities of the CEM are introduced based on selected test cases. In particular the following capabilities are demonstrated:
    - Supply
    - Leakage
    - ON/OFF controller
    - Power management
    - Un-/docking
    - Controller for tanks with maximum filling degree
    The CEM of the V-HAB simulation was verified by simulating the Atmosphere Revitalization part of the ISS and comparing it to actual measurement data. The results of this analysis are also presented in the paper.

  15. Using GTO-Velo to Facilitate Communication and Sharing of Simulation Results in Support of the Geothermal Technologies Office Code Comparison Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Signe K.; Purohit, Sumit; Boyd, Lauren W.

    The Geothermal Technologies Office Code Comparison Study (GTO-CCS) aims to support the DOE Geothermal Technologies Office in organizing and executing a model comparison activity. This project is directed at testing, diagnosing differences, and demonstrating modeling capabilities of a worldwide collection of numerical simulators for evaluating geothermal technologies. Teams of researchers are collaborating in this code comparison effort, and it is important to be able to share results in a forum where technical discussions can easily take place without requiring teams to travel to a common location. Pacific Northwest National Laboratory has developed an open-source, flexible framework called Velo that provides a knowledge management infrastructure and tools to support modeling and simulation for a variety of types of projects in a number of scientific domains. GTO-Velo is a customized version of the Velo Framework that is being used as the collaborative tool in support of the GTO-CCS project. Velo is designed around a novel integration of a collaborative Web-based environment and a scalable enterprise Content Management System (CMS). The underlying framework provides a flexible and unstructured data storage system that allows for easy upload of files in any format. Data files are organized in hierarchical folders, and each folder and file has a corresponding wiki page for metadata. The user interacts with Velo through a web-browser-based wiki technology, providing the benefit of familiarity and ease of use. High-level folders have been defined in GTO-Velo for the benchmark problem descriptions, descriptions of simulator/code capabilities, a project notebook, and folders for participating teams. Each team has a subfolder with write access limited to the team members, where they can upload their simulation results.
The GTO-CCS participants are charged with defining the benchmark problems for the study, and as each GTO-CCS Benchmark problem is defined, the problem creator can provide a description using a template on the metadata page corresponding to the benchmark problem folder. Project documents, references and videos of the weekly online meetings are shared via GTO-Velo. A results comparison tool allows users to plot their uploaded simulation results on the fly, along with those of other teams, to facilitate weekly discussions of the benchmark problem results being generated by the teams. GTO-Velo is an invaluable tool providing the project coordinators and team members with a framework for collaboration among geographically dispersed organizations.

  16. Study of Kapton Degradation under Simulated Shuttle Environment

    NASA Technical Reports Server (NTRS)

    Eck, T. G.; Hoffman, R. W.

    1985-01-01

    Weight loss and severe degradation of the surface of Kapton that occurs in low Earth orbit is studied. Atomic oxygen, the major ambient species at low Earth altitude and incident with approximately 5 eV energy in ram conditions, is the primary suspect, but a thorough study of oxygen-Kapton interactions has not yet been carried out. A low-energy ion source is used to simulate the shuttle low Earth orbit environment. This source, together with diagnostic tools including surface analysis and mass spectroscopic capability, is being used to carry out experiments from which quantum yields may be obtained.

  17. Coupled electromagnetic-thermodynamic simulations of microwave heating problems using the FDTD algorithm.

    PubMed

    Kopyt, Paweł; Celuch, Małgorzata

    2007-01-01

    A practical implementation of a hybrid simulation system capable of modeling coupled electromagnetic-thermodynamic problems typical of microwave heating is described. The paper presents two approaches to modeling such problems. Both are based on an FDTD-based commercial electromagnetic solver coupled to an external thermodynamic analysis tool required for calculations of heat diffusion. The first approach utilizes a simple FDTD-based thermal solver, while in the second it is replaced by a universal commercial CFD solver. The accuracy of the two modeling systems is verified against the original experimental data as well as measurement results available in the literature.
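    The thermal half of such a coupling advances a heat-diffusion equation whose volumetric source term is the microwave power deposition supplied by the electromagnetic solver. A one-dimensional explicit finite-difference sketch of that step is shown below; it is a toy illustration with assumed material-parameter names, not the scheme of the commercial solvers discussed in the paper.

```python
import numpy as np

def heat_step(T, q, dx, dt, k, rho, cp):
    """One explicit finite-difference step of the 1-D heat equation
    rho*cp*dT/dt = k*d2T/dx2 + q, where q [W/m^3] is the volumetric
    microwave heating term handed over by the EM solver. Explicit
    stability requires dt <= dx**2 * rho * cp / (2 * k)."""
    alpha = k / (rho * cp)                 # thermal diffusivity [m^2/s]
    T_new = T.copy()
    T_new[1:-1] = (T[1:-1]
                   + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
                   + dt * q[1:-1] / (rho * cp))
    return T_new                           # boundaries held fixed (Dirichlet)
```

In a full coupled loop the EM solver would recompute q whenever the temperature-dependent material properties change appreciably, which is the feedback the hybrid system models.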

  18. Towards a supported common NEAMS software stack

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cormac Garvey

    2012-04-01

    The NEAMS IPSCs are developing multidimensional, multiphysics, multiscale simulation codes based on first principles that will be capable of predicting all aspects of current and future nuclear reactor systems. This new breed of simulation codes will include rigorous verification, validation, and uncertainty quantification checks to quantify the accuracy and quality of the simulation results. The resulting NEAMS IPSC simulation codes will be an invaluable tool in designing the next generation of nuclear reactors and will also contribute to a speedier process in the acquisition of licenses from the NRC for new reactor designs. Due to the high resolution of the models, the complexity of the physics, and the added computational resources needed to quantify the accuracy and quality of the results, the NEAMS IPSC codes will require large HPC resources to carry out the production simulation runs.

  19. In-flight crew training

    NASA Technical Reports Server (NTRS)

    Gott, Charles; Galicki, Peter; Shores, David

    1990-01-01

    The Helmet Mounted Display system and Part Task Trainer are two projects currently underway that are closely related to the in-flight crew training concept. The first project is a training simulator and an engineering analysis tool. The simulator's unique helmet mounted display actually projects the wearer into the simulated environment of 3-D space. Miniature monitors are mounted in front of the wearer's eyes. The Part Task Trainer is a kinematic simulator for the Shuttle Remote Manipulator System (RMS). The simulator consists of a high-end graphics workstation with a high-resolution color screen and a number of input peripherals that create a functional equivalent of the RMS control panel in the back of the Orbiter. It is being used in the training cycle for Shuttle crew members. Activities are underway to expand the capability of the Helmet Mounted Display system and the Part Task Trainer.

  20. Anatomy and Physiology of Multiscale Modeling and Simulation in Systems Medicine.

    PubMed

    Mizeranschi, Alexandru; Groen, Derek; Borgdorff, Joris; Hoekstra, Alfons G; Chopard, Bastien; Dubitzky, Werner

    2016-01-01

    Systems medicine is the application of systems biology concepts, methods, and tools to medical research and practice. It aims to integrate data and knowledge from different disciplines into biomedical models and simulations for the understanding, prevention, cure, and management of complex diseases. Complex diseases arise from the interactions among disease-influencing factors across multiple levels of biological organization from the environment to molecules. To tackle the enormous challenges posed by complex diseases, we need a modeling and simulation framework capable of capturing and integrating information originating from multiple spatiotemporal and organizational scales. Multiscale modeling and simulation in systems medicine is an emerging methodology and discipline that has already demonstrated its potential in becoming this framework. The aim of this chapter is to present some of the main concepts, requirements, and challenges of multiscale modeling and simulation in systems medicine.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zitney, S.E.; McCorkle, D.; Yang, C.

    Process modeling and simulation tools are widely used for the design and operation of advanced power generation systems. These tools enable engineers to solve the critical process systems engineering problems that arise throughout the lifecycle of a power plant, such as designing a new process, troubleshooting a process unit or optimizing operations of the full process. To analyze the impact of complex thermal and fluid flow phenomena on overall power plant performance, the Department of Energy’s (DOE) National Energy Technology Laboratory (NETL) has developed the Advanced Process Engineering Co-Simulator (APECS). The APECS system is an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulations such as those based on computational fluid dynamics (CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper we discuss the initial phases of the integration of the APECS system with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite uses the ActiveX (OLE Automation) controls in the Aspen Plus process simulator wrapped by the CASI library developed by Reaction Engineering International to run process/CFD co-simulations and query for results. This integration represents a necessary step in the development of virtual power plant co-simulations that will ultimately reduce the time, cost, and technical risk of developing advanced power generation systems.

  2. Simulation based planning of surgical interventions in pediatric cardiology

    NASA Astrophysics Data System (ADS)

    Marsden, Alison L.

    2013-10-01

    Hemodynamics plays an essential role in the progression and treatment of cardiovascular disease. However, while medical imaging provides increasingly detailed anatomical information, clinicians often have limited access to hemodynamic data that may be crucial to patient risk assessment and treatment planning. Computational simulations can now provide detailed hemodynamic data to augment clinical knowledge in both adult and pediatric applications. There is a particular need for simulation tools in pediatric cardiology, due to the wide variation in anatomy and physiology in congenital heart disease patients, necessitating individualized treatment plans. Despite great strides in medical imaging, enabling extraction of flow information from magnetic resonance and ultrasound imaging, simulations offer predictive capabilities that imaging alone cannot provide. Patient specific simulations can be used for in silico testing of new surgical designs, treatment planning, device testing, and patient risk stratification. Furthermore, simulations can be performed at no direct risk to the patient. In this paper, we outline the current state of the art in methods for cardiovascular blood flow simulation and virtual surgery. We then step through pressing challenges in the field, including multiscale modeling, boundary condition selection, optimization, and uncertainty quantification. Finally, we summarize simulation results of two representative examples from pediatric cardiology: single ventricle physiology, and coronary aneurysms caused by Kawasaki disease. These examples illustrate the potential impact of computational modeling tools in the clinical setting.

  3. AN-CASE NET-CENTRIC modeling and simulation

    NASA Astrophysics Data System (ADS)

    Baskinger, Patricia J.; Chruscicki, Mary Carol; Turck, Kurt

    2009-05-01

    The objective of mission training exercises is to immerse the trainees into an environment that enables them to train like they would fight. The integration of modeling and simulation environments that can seamlessly leverage Live systems and Virtual or Constructive models (LVC) as they are available offers a flexible and cost-effective solution to extending the "war-gaming" environment to a realistic mission experience while evolving the development of the net-centric enterprise. From concept to full production, the impact of new capabilities on the infrastructure and concept of operations can be assessed in the context of the enterprise, while also exposing them to the warfighter. Training is extended to tomorrow's tools, processes, and Tactics, Techniques and Procedures (TTPs). This paper addresses the challenges of a net-centric modeling and simulation environment that is capable of representing a net-centric enterprise. An overview of the Air Force Research Laboratory's (AFRL) Airborne Networking Component Architecture Simulation Environment (AN-CASE) is provided, as well as a discussion of how it is being used to assess technologies for the purpose of experimenting with new infrastructure mechanisms that enhance the scalability and reliability of the distributed mission operations environment.

  4. TU-EF-204-07: Add Tube Current Modulation to a Low Dose Simulation Tool for CT Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Y.; Department of Physics, University of Arizona, Tucson, AZ; Wen, G.

    2015-06-15

    Purpose: We extended the capabilities of a low dose simulation tool to model Tube-Current Modulation (TCM). TCM is widely used in clinical practice to reduce radiation dose in CT scans. We expect the tool to be valuable for various clinical applications (e.g., optimize protocols, compare reconstruction techniques and evaluate TCM methods). Methods: The tube current is input as a function of z location, instead of a fixed value. Starting from the line integrals of a scan, a new Poisson noise realization at a lower dose is generated for each view. To validate the new functionality, we compared simulated scans with real scans in image space. Results: First we assessed noise in the difference between the low-dose simulations and the original high-dose scan. When the simulated tube current is a step function of z location, the noise at each segment matches the noise of 3 separate constant-tube-current-simulations. Secondly, with a phantom that forces TCM, we compared a low-dose simulation with an equivalent real low-dose scan. The mean CT number of the simulated scan and the real low-dose scan were 137.7±0.6 and 137.8±0.5 respectively. Furthermore, with 240 ROIs, the noise of the simulated scan and the real low-dose scan were 24.03±0.45 and 23.99±0.43 respectively, and they were not statistically different (2-sample t-test, p-value=0.28). The facts that the noise reflected the trend of the TCM curve, and that the absolute noise measurements were not statistically different validated the TCM function. Conclusion: We successfully added tube-current modulation functionality in an existing low dose simulation tool. We demonstrated that the noise reflected an input tube-current modulation curve. In addition, we verified that the noise and mean CT number of our simulation agreed with a real low dose scan. The authors are all employees of Philips. Yijun Ding is also supported by NIBIB P41EB002035 and NIBIB R01EB000803.
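    The general recipe behind such tools can be sketched briefly: scale each view's expected transmitted photon count by a per-view tube-current ratio, draw a Poisson realization, and convert back to line integrals. The function and parameter names below are illustrative assumptions, not the validated implementation described in the abstract.

```python
import numpy as np

def simulate_low_dose(line_integrals, i0_full, mA_ratio_per_view, seed=0):
    """Toy low-dose CT simulation with tube-current modulation.
    line_integrals: views x bins array of measured line integrals p.
    i0_full: full-dose incident photons per detector bin.
    mA_ratio_per_view: per-view tube-current ratio (varies with z)."""
    rng = np.random.default_rng(seed)
    p = np.asarray(line_integrals, dtype=float)
    ratios = np.asarray(mA_ratio_per_view, dtype=float)[:, None]
    expected = i0_full * ratios * np.exp(-p)   # mean transmitted counts
    counts = rng.poisson(expected).astype(float)
    counts = np.maximum(counts, 1.0)           # guard against log(0)
    return -np.log(counts / (i0_full * ratios))

sino = np.full((4, 8), 2.0)                    # toy constant sinogram
low = simulate_low_dose(sino, i0_full=1e5,
                        mA_ratio_per_view=[1.0, 0.5, 0.25, 0.5])
```

Views with a lower tube-current ratio receive fewer photons and therefore noisier line integrals, which is how the noise tracks the TCM curve in the validation described above.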

  5. Future missions for observing Earth's changing gravity field: a closed-loop simulation tool

    NASA Astrophysics Data System (ADS)

    Visser, P. N.

    2008-12-01

    The GRACE mission has successfully demonstrated the observation from space of the changing Earth's gravity field at length and time scales of typically 1000 km and 10-30 days, respectively. Many scientific communities strongly advertise the need for continuity of observing Earth's gravity field from space. Moreover, a strong interest is being expressed to have gravity missions that allow a more detailed sampling of the Earth's gravity field both in time and in space. Designing a gravity field mission for the future is a complicated process that involves making many trade-offs, such as trade-offs between spatial, temporal resolution and financial budget. Moreover, it involves the optimization of many parameters, such as orbital parameters (height, inclination), distinction between which gravity sources to observe or correct for (for example are gravity changes due to ocean currents a nuisance or a signal to be retrieved?), observation techniques (low-low satellite-to-satellite tracking, satellite gravity gradiometry, accelerometers), and satellite control systems (drag-free?). A comprehensive tool has been developed and implemented that allows the closed-loop simulation of gravity field retrievals for different satellite mission scenarios. This paper provides a description of this tool. Moreover, its capabilities are demonstrated by a few case studies. Acknowledgments. The research that is being done with the closed-loop simulation tool is partially funded by the European Space Agency (ESA). An important component of the tool is the GEODYN software, kindly provided by NASA Goddard Space Flight Center in Greenbelt, Maryland.

  6. Ionospheric Simulation System for Satellite Observations and Global Assimilative Modeling Experiments (ISOGAME)

    NASA Technical Reports Server (NTRS)

    Pi, Xiaoqing; Mannucci, Anthony J.; Verkhoglyadova, Olga P.; Stephens, Philip; Wilson, Brian D.; Akopian, Vardan; Komjathy, Attila; Lijima, Byron A.

    2013-01-01

    ISOGAME is designed and developed to assess quantitatively the impact of new observation systems on the capability of imaging and modeling the ionosphere. With ISOGAME, one can perform observation system simulation experiments (OSSEs). A typical OSSE using ISOGAME would involve: (1) simulating various ionospheric conditions on global scales; (2) simulating ionospheric measurements made from a constellation of low-Earth-orbiters (LEOs), particularly Global Navigation Satellite System (GNSS) radio occultation data, and from ground-based global GNSS networks; (3) conducting ionospheric data assimilation experiments with the Global Assimilative Ionospheric Model (GAIM); and (4) analyzing modeling results with visualization tools. ISOGAME can provide quantitative assessment of the accuracy of assimilative modeling with the observation system of interest. Observation systems other than those based on GNSS can also be analyzed. The system is composed of a suite of software that combines the GAIM, including a 4D first-principles ionospheric model and data assimilation modules, an International Reference Ionosphere (IRI) model that has been developed by international ionospheric research communities, an observation simulator, visualization software, and orbit design, simulation, and optimization software. The core GAIM model used in ISOGAME is based on the GAIM++ code (written in C++) that includes a new high-fidelity geomagnetic field representation (multi-dipole). New visualization tools and analysis algorithms for the OSSEs are now part of ISOGAME.

  7. Cyberwar XXI: quantifying the unquantifiable: adaptive AI for next-generation conflict simulations

    NASA Astrophysics Data System (ADS)

    Miranda, Joseph; von Kleinsmid, Peter; Zalewski, Tony

    2004-08-01

    The era of the "Revolution in Military Affairs," "4th Generation Warfare," and "Asymmetric War" requires novel approaches to modeling warfare at the operational and strategic levels of modern conflict. For example: "What if, in response to our planned actions, the adversary reacts in such-and-such a manner? What will our response be? What are the possible unintended consequences?" Next-generation conflict simulation tools are required to help create and test novel courses of action (COAs) in support of real-world operations. Conflict simulations allow non-lethal and cost-effective exploration of the "what-if" of COA development. The challenge has been to develop an automated decision-support software tool which allows competing COAs to be compared in simulated dynamic environments. Principal Investigator Joseph Miranda's research is based on modeling an integrated political, military, economic, social, infrastructure, and information (PMESII) environment. The main effort was to develop an adaptive AI engine which models agents operating within an operational-strategic conflict environment. This was implemented in Cyberwar XXI, a simulation which models COA selection in a PMESII environment. Within this framework, agents simulate decision-making processes and provide predictive capability regarding the potential behavior of Command Entities. The 2003 Iraq scenario is the first ready for V&V testing.

  8. Simulation of the space station information system in Ada

    NASA Technical Reports Server (NTRS)

    Spiegel, James R.

    1986-01-01

    The Flexible Ada Simulation Tool (FAST) is a discrete event simulation language written in Ada. FAST has been used to simulate a number of options for ground data distribution of Space Station payload data. The use of the Ada language for implementation has allowed a number of useful interactive features to be built into FAST and has facilitated quick enhancement of its capabilities to support new modeling requirements. General simulation concepts are discussed, along with how these concepts are implemented in FAST. The FAST design is discussed, and it is pointed out how the use of the Ada language enabled some significant advantages over classical FORTRAN-based simulation languages. The advantages discussed are in the areas of efficiency, ease of debugging, and ease of integrating user code. The specific Ada language features which enable these advantages are discussed.
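The heart of any discrete event simulation language like FAST is an event queue ordered by timestamp and a clock that jumps from one event to the next. A minimal sketch of that general concept in Python (not FAST's Ada implementation):

```python
import heapq

class Simulator:
    """Minimal discrete event engine: events are (time, seq, action) tuples."""
    def __init__(self):
        self._queue = []
        self._seq = 0      # tie-breaker for events scheduled at the same time
        self.now = 0.0

    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self):
        while self._queue:
            self.now, _, action = heapq.heappop(self._queue)
            action()

# Example: two data packets arriving at a ground distribution node.
log = []
sim = Simulator()
sim.schedule(1.0, lambda: log.append(("arrival", sim.now)))
sim.schedule(0.5, lambda: log.append(("arrival", sim.now)))
sim.run()
```

Events fire strictly in time order regardless of the order in which they were scheduled — the property any discrete event engine must guarantee.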

  9. Building complex simulations rapidly using MATRIX(x): The Space Station redesign

    NASA Technical Reports Server (NTRS)

    Carrington, C. K.

    1994-01-01

    MSFC's quick response to the Space Station redesign effort last year required the development of a computer simulation to model the attitude and station-keeping dynamics of a complex body with rotating solar arrays in orbit around the Earth. The simulation was written using a rapid-prototyping graphical simulation and design tool called MATRIX(x) and provided the capability to quickly remodel complex configuration changes by icon manipulation using a mouse. The simulation determines time-dependent inertia properties, and models forces and torques from gravity-gradient, solar radiation, and aerodynamic disturbances. Surface models are easily built from a selection of beams, plates, tetrahedrons, and cylinders. An optimization scheme was written to determine the torque equilibrium attitudes that balance gravity-gradient and aerodynamic torques over an orbit, and propellant-usage estimates were determined. The simulation has been adapted to model the attitude dynamics for small spacecraft.
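The gravity-gradient torque balanced in that optimization has a standard closed form, T = 3n²(û × Jû), where n is the orbital rate, û the nadir unit vector in body axes, and J the inertia matrix. A quick sketch of the computation — the inertia values are illustrative, not the Station's:

```python
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def gravity_gradient_torque(n, u, J):
    """T = 3 n^2 (u x J u) for a diagonal inertia tensor J = (Jx, Jy, Jz)."""
    Ju = (J[0]*u[0], J[1]*u[1], J[2]*u[2])
    return tuple(3.0 * n**2 * c for c in cross(u, Ju))

mu, r = 3.986e14, 6.778e6                # Earth GM (m^3/s^2), ~400 km radius
n = math.sqrt(mu / r**3)                 # orbital rate, rad/s
u = (math.sin(0.1), 0.0, math.cos(0.1))  # nadir tilted 0.1 rad off the z axis
T = gravity_gradient_torque(n, u, (9.0e6, 8.0e6, 5.0e6))  # inertias in kg*m^2
# The torque vanishes when a principal axis is aligned with nadir, which is
# why a torque equilibrium attitude exists only at an off-nadir tilt that
# lets this torque cancel the aerodynamic torque.
```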

  10. Advanced helmet mounted display (AHMD)

    NASA Astrophysics Data System (ADS)

    Sisodia, Ashok; Bayer, Michael; Townley-Smith, Paul; Nash, Brian; Little, Jay; Cassarly, William; Gupta, Anurag

    2007-04-01

    Due to significantly increased U.S. military involvement in deterrent, observer, security, peacekeeping, and combat roles around the world, the military expects significant future growth in the demand for deployable virtual-reality trainers with networked simulation capability for battle-space visualization. The use of HMD technology in simulated virtual environments has been driven by the demand for more effective training tools. The AHMD overlays computer-generated data (symbology, synthetic imagery, enhanced imagery) onto the actual or simulated visible environment. The AHMD can be used to support deployable reconfigurable training solutions as well as traditional simulation requirements, UAV augmented reality, air traffic control, and Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR) applications. This paper describes the design improvements implemented for production of the AHMD System.

  11. Initial Data Analysis Results for ATD-2 ISAS HITL Simulation

    NASA Technical Reports Server (NTRS)

    Lee, Hanbong

    2017-01-01

    To evaluate the operational procedures and information requirements for the core functional capabilities of the ATD-2 project, such as the tactical surface metering tool, the APREQ-CFR procedure, and data element exchanges between ramp and tower, human-in-the-loop (HITL) simulations were performed in March 2017. This presentation shows the initial data analysis results from the HITL simulations. Various airport performance metrics were analyzed and compared across the different runway configurations and metering values used in the tactical surface scheduler. These metrics include gate holding time, taxi-out time, runway throughput, queue size and wait time in queue, and TMI flight compliance. In addition to the metering value, other factors affecting airport performance in the HITL simulation, including run duration, runway changes, and TMI constraints, are also discussed.

  12. Microgrid Design Toolkit (MDT) Technical Documentation and Component Summaries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arguello, Bryan; Gearhart, Jared Lee; Jones, Katherine A.

    2015-09-01

    The Microgrid Design Toolkit (MDT) is a decision support software tool for microgrid designers to use during the microgrid design process. The models that support the two main capabilities in MDT are described. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new microgrid in the early stages of the design process. MSC is a mixed-integer linear program that is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on refining a microgrid design for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and the Performance Reliability Model (PRM). TMO uses a genetic algorithm to create and refine a collection of candidate microgrid designs. It uses PRM, a simulation-based reliability model, to assess the performance of these designs. TMO produces a collection of microgrid designs that perform well with respect to one or more performance metrics.
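The TMO/PRM pairing described above — a genetic algorithm proposing designs, and a simulation-based model scoring them — can be illustrated with a toy loop. Everything below (the uniform load model, the three-generator encoding, and all parameter values) is a hypothetical stand-in, not MDT's actual formulation:

```python
import random

def reliability(design, rng, trials=200):
    """Toy stand-in for a PRM-style simulation: the probability that the
    installed generation covers a random islanded-mode load (kW)."""
    capacity = sum(design)
    ok = sum(rng.uniform(50.0, 150.0) <= capacity for _ in range(trials))
    return ok / trials

def evolve(generations=30, pop_size=12, seed=1):
    """Toy TMO-style loop: keep the better half, mutate it, repeat."""
    rng = random.Random(seed)
    pop = [[rng.uniform(10.0, 60.0) for _ in range(3)]  # 3 generator sizes
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda d: reliability(d, rng), reverse=True)
        parents = scored[: pop_size // 2]                  # elitist selection
        children = [[max(0.0, g + rng.gauss(0.0, 5.0)) for g in p]
                    for p in parents]                      # mutation
        pop = parents + children
    return max(pop, key=lambda d: reliability(d, rng))

best = evolve()  # sizes (kW) of the three generators in the best design found
```

The real TMO trades reliability against cost and other metrics; this sketch keeps only the simulation-in-the-loop scoring idea.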

  13. Model-based system-of-systems engineering for space-based command, control, communication, and information architecture design

    NASA Astrophysics Data System (ADS)

    Sindiy, Oleg V.

    This dissertation presents a model-based system-of-systems engineering (SoSE) approach as a design philosophy for architecting in system-of-systems (SoS) problems. SoS refers to a special class of systems in which numerous systems with operational and managerial independence interact to generate new capabilities that satisfy societal needs. Design decisions are more complicated in a SoS setting. A revised Process Model for SoSE is presented to support three phases in SoS architecting: defining the scope of the design problem, abstracting key descriptors and their interrelations in a conceptual model, and implementing computer-based simulations for architectural analyses. The Process Model enables improved decision support considering multiple SoS features and develops computational models capable of highlighting configurations of organizational, policy, financial, operational, and/or technical features. Further, processes for verification and validation of SoS models and simulations are also important due to potential impact on critical decision-making and, thus, are addressed. Two research questions frame the research efforts described in this dissertation. The first concerns how the four key sources of SoS complexity---heterogeneity of systems, connectivity structure, multi-layer interactions, and the evolutionary nature---influence the formulation of SoS models and simulations, trade space, and solution performance and structure evaluation metrics. The second question pertains to the implementation of SoSE architecting processes to inform decision-making for a subset of SoS problems concerning the design of information exchange services in the space-based operations domain. These questions motivate and guide the dissertation's contributions.
A formal methodology for drawing relationships within a multi-dimensional trade space, forming simulation case studies from applications of candidate architecture solutions to a campaign of notional mission use cases, and executing multi-purpose analysis studies is presented. These efforts are coupled to the generation of aggregate and time-dependent solution performance metrics via the hierarchical decomposition of objectives and the analytical recomposition of multi-attribute qualitative program drivers from quantifiable measures. This methodology was also applied to generate problem-specific solution structure evaluation metrics that facilitate the comparison of alternate solutions at a high level of aggregation, at lower levels of abstraction, and to relate options for design variables with associated performance values. For proof-of-capability demonstration, the selected application problem concerns the design of command, control, communication, and information (C3I) architecture services for a notional campaign of crewed and robotic lunar surface missions. The impetus for the work was the demonstration of using model-based SoSE for design of sustainable interoperability capabilities between all data and communication assets in extended lunar campaigns. A comprehensive Lunar C3I simulation tool was developed by a team of researchers at Purdue University in support of NASA's Constellation Program; the author of this dissertation was a key contributor to the creation of this tool and made modifications and extensions to key components relevant to the methodological concepts presented in this dissertation. The dissertation concludes with a presentation of example results based on the interrogation of the constructed Lunar C3I computational model. 
The results are based on a family of studies, structured around a trade-tree of architecture options, which were conducted to test the hypothesis that the SoSE approach is efficacious for information-exchange architecture design in the space exploration domain. Included in the family of proof-of-capability studies is a simulation of the Apollo 17 mission, which not only allows for partial verification and validation of the model, but also provides insights for prioritizing future model design iterations to make it a more realistic representation of the "real world." A caveat within the results presented is that they serve within the capacity of a proof-of-capability demonstration, and as such, they are a product of models and analyses that need further development before the tool's results can be employed for decision-making. Additional discussion is provided on how to further develop and validate the Lunar C3I tool and how to make it extensible to other SoS design problems of similar nature in space exploration and other application domains.

  14. Sharp Interface Tracking in Rotating Microflows of Solvent Extraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glimm, James; Almeida, Valmor de; Jiao, Xiangmin

    2013-01-08

    The objective of this project is to develop a specialized sharp interface tracking simulation capability for predicting the interaction of micron-sized drops and bubbles in rotating flows relevant to the optimized design of contactor devices used in solvent extraction processes of spent nuclear fuel reprocessing. The primary outcomes of this project include the capability to resolve drop and bubble micro-hydrodynamics in solvent extraction contactors, determining from first-principles continuum fluid mechanics how micro-drops and bubbles interact with each other and the surrounding shearing fluid for realistic flows. In the near term, this effort will play a central role in providing parameters and insight into the flow dynamics of models that average over coarser scales, say at the millimeter unit length. In the longer term, it will prove to be the platform to conduct full-device, detailed simulations as parallel computing power reaches the exaflop level. The team will develop an accurate simulation tool for flows containing interacting droplets and bubbles with sharp interfaces under conditions that mimic those found in realistic contactor operations. The main objective is to create an off-line simulation capability to model drop and bubble interactions in a domain representative of the averaged length scale. The technical approach is to combine robust interface tracking software, subgrid modeling, validation-quality experiments, powerful computational hardware, and a team with simulation modeling, physical modeling, and technology integration experience. Simulations will then fully resolve the microflow of drops and bubbles at the microsecond time scale. This approach is computationally intensive but very accurate in treating important coupled physical phenomena in the vicinity of interfaces. The method makes it possible to resolve spatial scales smaller than the typical distance between bubbles and to model some non-equilibrium thermodynamic features such as finite critical tension in cavitating liquids.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gustafson, William I.; Vogelmann, Andrew M.; Cheng, Xiaoping

    The Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility began a pilot project in May 2015 to design a routine, high-resolution modeling capability to complement ARM’s extensive suite of measurements. This modeling capability has been named the Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) project. The initial focus of LASSO is on shallow convection at the ARM Southern Great Plains (SGP) Climate Research Facility. The availability of LES simulations with concurrent observations will serve many purposes. LES helps bridge the scale gap between DOE ARM observations and models, and the use of routine LES adds value to observations. It provides a self-consistent representation of the atmosphere and a dynamical context for the observations. Further, it elucidates unobservable processes and properties. LASSO will generate a simulation library for researchers that enables statistical approaches beyond a single-case mentality. It will also provide the tools necessary for modelers to reproduce the LES and conduct their own sensitivity experiments. Many different uses are envisioned for the combined LASSO LES and observational library. For an observationalist, LASSO can help inform instrument remote sensing retrievals, conduct Observation System Simulation Experiments (OSSEs), and test the implications of radar scan strategies or flight paths. For a theoretician, LASSO will help calculate estimates of fluxes and the co-variability of values, and test relationships without having to run the model oneself. For a modeler, LASSO will help one know ahead of time which days have good forcing, have co-registered observations at high-resolution scales, and have simulation inputs and corresponding outputs to test parameterizations. Further details on the overall LASSO project are available at https://www.arm.gov/capabilities/modeling/lasso.

  16. Integrated Measurements and Characterization | Photovoltaic Research | NREL

    Science.gov Websites

    The Integrated Measurements and Characterization cluster tool offers powerful capabilities with integrated measurement tools. Because sample handling uses ultra-high-vacuum connections, a sample can be interchanged between tools, such as the Copper Indium Gallium Diselenide cluster tool.

  17. Simulation and analysis of differential global positioning system for civil helicopter operations

    NASA Technical Reports Server (NTRS)

    Denaro, R. P.; Cabak, A. R.

    1983-01-01

    A Differential Global Positioning System (DGPS) computer simulation was developed to provide a versatile tool for assessing DGPS-referenced civil helicopter navigation. The civil helicopter community will probably be an early user of the GPS capability because of its unique mission requirements, which include offshore exploration and low-altitude transport into remote areas not currently served by ground-based Navaids. The Monte Carlo simulation provided a sufficiently high-fidelity dynamic motion and propagation environment to enable accurate comparisons of alternative differential GPS implementations and navigation filter tradeoffs. The analyst is provided the capability to adjust most aspects of the system, the helicopter flight profile, the receiver Kalman filter, and the signal propagation environment to assess differential GPS performance and parameter sensitivities. Preliminary analysis was conducted to evaluate alternative implementations of the differential navigation algorithm in both the position and measurement domains. Results are presented to show that significant performance gains are achieved when compared with conventional GPS, but that differences due to DGPS implementation techniques were small. System performance was relatively insensitive to the update rates of the error correction information.
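The position-domain implementation mentioned above is the simpler of the two alternatives: a reference receiver at a surveyed location computes the error in its own GPS fix and broadcasts it, and the rover subtracts that correction on the assumption that both receivers see the same correlated errors. A schematic sketch with made-up numbers (not the study's actual simulation or filter):

```python
def position_domain_correction(surveyed, reference_fix):
    """Correction = surveyed (true) position minus the reference receiver's fix."""
    return tuple(s - f for s, f in zip(surveyed, reference_fix))

def apply_correction(rover_fix, correction):
    return tuple(r + c for r, c in zip(rover_fix, correction))

# Both receivers see the same correlated error (e.g., ionospheric delay)
# plus small uncorrelated receiver noise; all numbers are illustrative.
common_error = (4.0, -2.5, 7.0)            # metres, shared by both receivers
surveyed = (0.0, 0.0, 0.0)                 # surveyed reference position
ref_fix = common_error                     # reference fix = truth + shared error
rover_true = (1000.0, 2000.0, 50.0)
rover_noise = (0.3, -0.2, 0.4)             # uncorrelated receiver noise
rover_fix = tuple(t + e + n for t, e, n in
                  zip(rover_true, common_error, rover_noise))

corr = position_domain_correction(surveyed, ref_fix)
corrected = apply_correction(rover_fix, corr)
# The correlated error cancels; only the uncorrelated noise remains.
```

The measurement-domain alternative applies corrections per satellite pseudorange before the position solution, which is why the two implementations can differ when the receivers track different satellite sets.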

  18. In-Situ Visualization Experiments with ParaView Cinema in RAGE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kares, Robert John

    2015-10-15

    A previous paper described some numerical experiments performed using the ParaView/Catalyst in-situ visualization infrastructure deployed in the Los Alamos RAGE radiation-hydrodynamics code to produce images from a running large-scale 3D ICF simulation. One challenge of the in-situ approach apparent in these experiments was the difficulty of choosing parameters like isosurface values for the visualizations to be produced from the running simulation without the benefit of prior knowledge of the simulation results, and the resultant cost of recomputing in-situ generated images when parameters are chosen suboptimally. A proposed method of addressing this difficulty is to simply render multiple images at runtime with a range of possible parameter values to produce a large database of images and to provide the user with a tool for managing the resulting database of imagery. Recently, ParaView/Catalyst has been extended to include such a capability via the so-called Cinema framework. Here I describe some initial experiments with the first delivery of Cinema and make some recommendations for future extensions of Cinema’s capabilities.

  19. A dynamic motion simulator for future European docking systems

    NASA Technical Reports Server (NTRS)

    Brondino, G.; Marchal, PH.; Grimbert, D.; Noirault, P.

    1990-01-01

    Europe's first confrontation with docking in space will require extensive testing to verify design and performance and to qualify hardware. For this purpose, a Docking Dynamics Test Facility (DDTF) was developed. It allows reproduction on the ground of the same impact loads and relative motion dynamics which would occur in space during docking. It uses a 9-degree-of-freedom servo-motion system, controlled by a real-time computer, which simulates the docking spacecraft in a zero-g environment. The test technique involves an active loop based on six-axis force and torque detection, a mathematical simulation of individual spacecraft dynamics, and a 9-degree-of-freedom servo-motion system of which 3 DOFs allow extension of the kinematic range to 5 m. The configuration was checked out by closed-loop tests involving spacecraft control models and real sensor hardware. The test facility at present has an extensive configuration that allows evaluation of both proximity control and docking systems. It provides a versatile tool to verify system design, hardware items, and performance capabilities in the ongoing HERMES and COLUMBUS programs. The test system is described and its capabilities are summarized.

  20. ROMUSE 2.0 User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khuwaileh, Bassam; Turinsky, Paul; Williams, Brian J.

    2016-10-04

    ROMUSE (Reduced Order Modeling Based Uncertainty/Sensitivity Estimator) is an effort within the Consortium for Advanced Simulation of Light Water Reactors (CASL) to provide an analysis tool to be used in conjunction with reactor core simulators, especially the Virtual Environment for Reactor Applications (VERA). ROMUSE is written in C++ and is currently capable of performing various types of parameter perturbations, uncertainty quantification, surrogate model construction, and subspace analysis. Version 2.0 has the capability to interface with DAKOTA, which gives ROMUSE access to the various algorithms implemented within DAKOTA. ROMUSE is mainly designed to interface with VERA and the Comprehensive Modeling and Simulation Suite for Nuclear Safety Analysis and Design (SCALE) [1,2,3]; however, ROMUSE can interface with any general model (e.g., Python and MATLAB) whose Input/Output (I/O) format follows the Hierarchical Data Format 5 (HDF5). In this brief user manual, the use of ROMUSE is overviewed and example problems are presented and briefly discussed. The algorithms provided here range from algorithms inspired by those discussed in Ref. [4] to nuclear-specific algorithms discussed in Ref. [3].

  1. Exploring JWST's Capability to Constrain Habitability on Simulated Terrestrial TESS Planets

    NASA Astrophysics Data System (ADS)

    Tremblay, Luke; Britt, Amber; Batalha, Natasha; Schwieterman, Edward; Arney, Giada; Domagal-Goldman, Shawn; Mandell, Avi; Planetary Systems Laboratory; Virtual Planetary Laboratory

    2017-01-01

    In the following, we have worked to develop a flexible "observability" scale of biologically relevant molecules in the atmospheres of newly discovered exoplanets for the instruments aboard NASA's next flagship mission, the James Webb Space Telescope (JWST). We sought to create such a scale in order to provide the community with a tool with which to optimize target selection for JWST observations based on detections by the upcoming Transiting Exoplanet Survey Satellite (TESS). Current literature has laid the groundwork for defining both biologically relevant molecules as well as what characteristics would make a new world "habitable," but it has so far lacked a cohesive analysis of JWST's capabilities to observe these molecules in exoplanet atmospheres and thereby constrain habitability. In developing our Observability Scale, we utilized a range of hypothetical planets (over planetary radii and stellar insolation) and generated three self-consistent atmospheric models (of different molecular compositions) for each of our simulated planets. With these planets and their corresponding atmospheres, we utilized the most accurate JWST instrument simulator, created specifically to process transiting exoplanet spectra. Through careful analysis of these simulated outputs, we were able to determine the relevant parameters that affected JWST's ability to constrain each individual molecular band with statistical accuracy and therefore generate a scale based on those key parameters. As a preliminary test of our Observability Scale, we have also applied it to the list of TESS candidate stars in order to determine JWST's observational capabilities for any soon-to-be-detected planet in those solar systems.

  2. FireStem2D — A two-dimensional heat transfer model for simulating tree stem injury in fires

    Treesearch

    Efthalia K. Chatziefstratiou; Gil Bohrer; Anthony S. Bova; Ravishankar Subramanian; Renato P.M. Frasson; Amy Scherzer; Bret W. Butler; Matthew B. Dickinson

    2013-01-01

    FireStem2D, a software tool for predicting tree stem heating and injury in forest fires, is a physically-based, two-dimensional model of stem thermodynamics that results from heating at the bark surface. It builds on an earlier one-dimensional model (FireStem) and provides improved capabilities for predicting fire-induced mortality and injury before a fire occurs by...
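A stem-heating model of this kind rests on the 2D heat conduction equation. A minimal explicit finite-difference sketch of that core physics — uniform properties and illustrative numbers, far simpler than the published model, which handles moisture, variable bark properties, and the fire's boundary heat flux:

```python
def step_heat_2d(T, alpha, dx, dt):
    """One explicit (FTCS) step of dT/dt = alpha * (T_xx + T_yy).
    Boundary cells are held fixed (Dirichlet: the heated bark surface)."""
    n, m = len(T), len(T[0])
    new = [row[:] for row in T]
    r = alpha * dt / dx**2
    for i in range(1, n - 1):
        for j in range(1, m - 1):
            new[i][j] = T[i][j] + r * (T[i+1][j] + T[i-1][j]
                                       + T[i][j+1] + T[i][j-1] - 4.0*T[i][j])
    return new

# An 11x11 cross-section: 20 C interior, 400 C fire-heated boundary.
N = 11
T = [[400.0 if i in (0, N - 1) or j in (0, N - 1) else 20.0
      for j in range(N)] for i in range(N)]
alpha, dx, dt = 1.0e-7, 0.002, 5.0  # wood-like diffusivity (m^2/s), 2 mm grid
assert alpha * dt / dx**2 <= 0.25   # FTCS stability criterion in 2D
for _ in range(200):
    T = step_heat_2d(T, alpha, dx, dt)
# Heat diffuses inward: cells just under the surface warm first, which is
# why tissue necrosis predictions depend on depth below the bark.
```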

  3. Purity and Cleanness of Aerogel as a Cosmic Dust Capture Medium

    NASA Technical Reports Server (NTRS)

    Tsou, P.; Fleming, R.; Lindley, P.; Craig, A.; Blake, D.

    1994-01-01

    The capability for capturing micrometeoroids intact through laboratory simulations [Tsou 1988] and in space [Tsou 1993] in passive underdense silica aerogel offers a valuable tool for cosmic dust research. The integrity of the sample handling medium can substantially modify the integrity of the sample. Intact capture is a violent hypervelocity event: the integrity of the capturing medium can cause even greater modification of the sample.

  4. Improved Load Alleviation Capability for the KC-135

    DTIC Science & Technology

    1997-09-01

    software, such as Matlab, Mathematica, Simulink, and the Robotica Front End for Mathematica, available in the simulation laboratory ... This thesis report is ... outlined in Spong's text in order to utilize the Robotica system development software, which automates the process of calculating the kinematic and ... kinematic and dynamic equations can be accomplished using a computer tool called Robotica Front End (RFE) [15], developed by Doctor Spong.

  5. Parameterization of the 3-PG model for Pinus elliottii stands using alternative methods to estimate fertility rating, biomass partitioning and canopy closure

    Treesearch

    Carlos A. Gonzalez-Benecke; Eric J. Jokela; Wendell P. Cropper; Rosvel Bracho; Daniel J. Leduc

    2014-01-01

    The forest simulation model, 3-PG, has been widely applied as a useful tool for predicting growth of forest species in many countries. The model has the capability to estimate the effects of management, climate and site characteristics on many stand attributes using easily available data. Currently, there is an increasing interest in estimating biomass and assessing...

  6. Instantiating the art of war for effects-based operations

    NASA Astrophysics Data System (ADS)

    Burns, Carla L.

    2002-07-01

    Effects-Based Operations (EBO) is a mindset, a philosophy, and an approach for planning, executing, and assessing military operations for the effects they produce rather than the targets or even objectives they deal with. An EBO approach strives to provide economy of force, dynamic tasking, and reduced collateral damage. The notion of EBO is not new. Military Commanders certainly have desired effects in mind when conducting military operations. However, to date EBO has been an art of war that lacks automated techniques and tools that enable effects-based analysis and assessment. Modeling and simulation is at the heart of this challenge. The Air Force Research Laboratory (AFRL) EBO Program is developing modeling techniques and corresponding tool capabilities that can be brought to bear against the challenges presented by effects-based analysis and assessment. Effects-based course-of-action development, center of gravity/target system analysis, and wargaming capabilities are being developed and integrated to help give Commanders the information and decision support required to achieve desired national security objectives. This paper presents an introduction to effects-based operations, discusses the benefits of an EBO approach, and focuses on modeling and analysis for effects-based strategy development. An overview of modeling and simulation challenges for EBO is presented, setting the stage for the detailed technical papers in the subject session.

  7. Proposed Facility Modifications to Support Propulsion Systems Testing Under Simulated Space Conditions at Plum Brook Station's Spacecraft Propulsion Research Facility (B-2)

    NASA Technical Reports Server (NTRS)

    Edwards, Daryl A.

    2008-01-01

    Preparing NASA's Plum Brook Station's Spacecraft Propulsion Research Facility (B-2) to support NASA's new generation of launch vehicles has raised many challenges for B-2's support staff. The facility provides a unique capability to test chemical propulsion systems/vehicles while simulating space thermal and vacuum environments. Designed and constructed in the early 1960s to support upper stage cryogenic engine/vehicle system development, the Plum Brook Station B-2 facility will require modifications to support the larger, more powerful, and more advanced engine systems for the next generation of vehicles leaving earth's orbit. Engine design improvements over the years have included large area expansion ratio nozzles, greater combustion chamber pressures, and advanced materials. Consequently, it has become necessary to determine what facility changes are required and how the facility can be adapted to support varying customers and their specific test needs. Exhaust system performance, including understanding the present facility capabilities, is the primary focus of this work. A variety of approaches and analytical tools are being employed to gain this understanding. This presentation discusses some of the challenges in applying these tools to this project and expected facility configuration to support the varying customer needs.

  8. Proposed Facility Modifications to Support Propulsion Systems Testing Under Simulated Space Conditions at Plum Brook Station's Spacecraft Propulsion Research Facility (B-2)

    NASA Technical Reports Server (NTRS)

    Edwards, Daryl A.

    2007-01-01

    Preparing NASA's Plum Brook Station's Spacecraft Propulsion Research Facility (B-2) to support NASA's new generation of launch vehicles has raised many challenges for B-2's support staff. The facility provides a unique capability to test chemical propulsion systems/vehicles while simulating space thermal and vacuum environments. Designed and constructed 4 decades ago to support upper stage cryogenic engine/vehicle system development, the Plum Brook Station B-2 facility will require modifications to support the larger, more powerful, and more advanced engine systems for the next generation of vehicles leaving earth's orbit. Engine design improvements over the years have included large area expansion ratio nozzles, greater combustion chamber pressures, and advanced materials. Consequently, it has become necessary to determine what facility changes are required and how the facility can be adapted to support varying customers and their specific test needs. Instrumental in this task is understanding the present facility capabilities and identifying what reasonable changes can be implemented. A variety of approaches and analytical tools are being employed to gain this understanding. This paper discusses some of the challenges in applying these tools to this project and expected facility configuration to support the varying customer needs.

  9. Peri-Elastodynamic Simulations of Guided Ultrasonic Waves in Plate-Like Structure with Surface Mounted PZT.

    PubMed

    Patra, Subir; Ahmed, Hossain; Banerjee, Sourav

    2018-01-18

    A peridynamics-based elastodynamic computation tool named Peri-elastodynamics is proposed herein to simulate the three-dimensional (3D) Lamb wave modes in materials for the first time. Peri-elastodynamics is a nonlocal, meshless approach, a scale-independent generalized technique to visualize the acoustic and ultrasonic waves in plate-like structures, micro-electro-mechanical systems (MEMS), and nanodevices for their respective characterization. In this article, the characteristics of the fundamental Lamb wave modes are simulated in a sample plate-like structure. Lamb wave modes are generated using a surface-mounted piezoelectric (PZT) transducer which is actuated from the top surface. The proposed generalized Peri-elastodynamics method is not only capable of simulating two-dimensional (2D) in-plane waves under plane-strain conditions, as formulated previously, but is also capable of accurately simulating the out-of-plane symmetric and antisymmetric Lamb wave modes in plate-like structures in 3D. For structural health monitoring (SHM) of plate-like structures and nondestructive evaluation (NDE) of MEMS devices, it is necessary to simulate the 3D wave-damage interaction scenarios and visualize the different wave features due to damage. Hence, in addition to simulating the guided ultrasonic wave modes in pristine material, Lamb waves were also simulated in a damaged plate. The accuracy of the proposed technique is verified by comparing the modes generated in the plate and the mode shapes across the thickness of the plate with theoretical wave analysis.
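The nonlocal character that distinguishes peridynamics from classical elastodynamics is easiest to see in one dimension: each node interacts with every neighbor within a finite horizon rather than only with adjacent material. A 1D bond-based toy (not the authors' 3D Peri-elastodynamics formulation; the micro-modulus `c`, horizon, and grid are illustrative):

```python
def pd_forces(x, u, horizon, c, vol):
    """Bond-based peridynamic force density: at each node, the sum over
    neighbors within the horizon of c * (bond stretch) * direction * volume."""
    n = len(x)
    f = [0.0] * n
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            xi = x[j] - x[i]                  # bond in the reference state
            if abs(xi) > horizon:
                continue                      # nonlocal, but finite range
            eta = u[j] - u[i]                 # relative displacement
            stretch = (abs(xi + eta) - abs(xi)) / abs(xi)
            direction = 1.0 if xi + eta > 0 else -1.0
            f[i] += c * stretch * direction * vol
    return f

# Nodes along a 1D bar carrying a uniform strain field u = eps * x.
dx, eps = 0.01, 1.0e-3
x = [i * dx for i in range(21)]
u = [eps * xi for xi in x]
f = pd_forces(x, u, horizon=3.5 * dx, c=1.0e9, vol=dx)
# Interior nodes have symmetric bonds whose contributions cancel; only
# nodes near the ends (an incomplete horizon) feel a net force — the 1D
# analogue of the surface effect peridynamic simulations must correct for.
```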

  10. TopoGromacs: Automated Topology Conversion from CHARMM to GROMACS within VMD.

    PubMed

    Vermaas, Josh V; Hardy, David J; Stone, John E; Tajkhorshid, Emad; Kohlmeyer, Axel

    2016-06-27

    Molecular dynamics (MD) simulation engines use a variety of different approaches for modeling molecular systems with force fields that govern their dynamics and describe their topology. These different approaches introduce incompatibilities between engines, and previously published software bridges the gaps between many popular MD packages, such as between CHARMM and AMBER or GROMACS and LAMMPS. While there are many structure building tools available that generate topologies and structures in CHARMM format, only recently have mechanisms been developed to convert their results into GROMACS input. We present an approach to convert CHARMM-formatted topology and parameters into a format suitable for simulation with GROMACS by expanding the functionality of TopoTools, a plugin integrated within the widely used molecular visualization and analysis software VMD. The conversion process was diligently tested on a comprehensive set of biological molecules in vacuo. The resulting comparison between energy terms shows that the translation performed was lossless as the energies were unchanged for identical starting configurations. By applying the conversion process to conventional benchmark systems that mimic typical modestly sized MD systems, we explore the effect of the implementation choices made in CHARMM, NAMD, and GROMACS. The newly available automatic conversion capability breaks down barriers between simulation tools and user communities and allows users to easily compare simulation programs and leverage their unique features without the tedium of constructing a topology twice.
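    The lossless-translation check described above reduces to comparing per-term energies for an identical starting configuration; a minimal sketch, with term names and values purely hypothetical:

```python
def energies_match(charmm_terms, gromacs_terms, tol=1e-6):
    """Compare per-term energies (bond, angle, dihedral, ...) reported by two
    MD engines for the same configuration. Term names and units are assumed
    to have been reconciled upstream; tolerance is relative."""
    if set(charmm_terms) != set(gromacs_terms):
        return False
    return all(
        abs(charmm_terms[k] - gromacs_terms[k]) <= tol * max(1.0, abs(charmm_terms[k]))
        for k in charmm_terms)

# Hypothetical per-term energies (kcal/mol) for one configuration:
a = {"bond": 12.345678, "angle": 8.765432, "dihedral": -3.210987}
b = {"bond": 12.345678, "angle": 8.765432, "dihedral": -3.210987}
assert energies_match(a, b)            # identical energies: translation lossless
```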

  11. TNA4OptFlux – a software tool for the analysis of strain optimization strategies

    PubMed Central

    2013-01-01

    Background Rational approaches to Metabolic Engineering (ME) deal with the identification of modifications that improve a microbe's capability to produce target compounds. One of the major challenges raised by the strain optimization algorithms used in these ME problems is the interpretation of the changes that lead to a given overproduction. Often, a single gene knockout induces changes in the fluxes of several reactions, as compared with the wild type, and it is therefore difficult to evaluate the physiological differences of the in silico mutant. This is aggravated by the fact that genome-scale models per se are difficult to visualize, given the high number of reactions and metabolites involved. Findings We introduce a software tool, the Topological Network Analysis for OptFlux (TNA4OptFlux), a plug-in which adds to the open-source ME platform OptFlux the capability of creating and performing topological analysis over metabolic networks. One of the tool's major advantages is the possibility of applying these analyses to the comparison of simulated phenotypes, namely those coming from the results of strain optimization algorithms. We illustrate the capabilities of the tool by using it to aid the interpretation of two E. coli strains designed in OptFlux for the overproduction of succinate and glycine. Conclusions Besides adding new topological-analysis functionalities to the OptFlux software, TNA4OptFlux methods greatly facilitate the interpretation of non-intuitive ME strategies by automating the comparison between perturbed and non-perturbed metabolic networks. The plug-in is available on the web site http://www.optflux.org, together with extensive documentation. PMID:23641878
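    The automated comparison of perturbed and non-perturbed networks can be illustrated at the topological level; the reaction and metabolite names below are invented stand-ins, not entries from the E. coli models:

```python
# Toy metabolite-reaction network as adjacency lists. Reaction and metabolite
# names are invented for illustration only.
wild_type = {
    "R_pyk": {"pep", "pyr", "atp"},
    "R_ppc": {"pep", "oaa"},
    "R_sdh": {"succ", "fum"},
}

def metabolite_degrees(network):
    """Degree of each metabolite = number of reactions it participates in."""
    deg = {}
    for rxn, mets in network.items():
        for m in mets:
            deg[m] = deg.get(m, 0) + 1
    return deg

def compare_knockout(network, knocked_out):
    """Return metabolites whose degree changes when a reaction is removed,
    mimicking a topological wild-type vs. mutant comparison."""
    mutant = {r: m for r, m in network.items() if r != knocked_out}
    wt, mu = metabolite_degrees(network), metabolite_degrees(mutant)
    return {m: (wt[m], mu.get(m, 0)) for m in wt if wt[m] != mu.get(m, 0)}

changed = compare_knockout(wild_type, "R_ppc")
# 'pep' drops from two connections to one; 'oaa' is disconnected entirely.
```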

  12. The expected results method for data verification

    NASA Astrophysics Data System (ADS)

    Monday, Paul

    2016-05-01

    The credibility of United States Army analytical experiments using distributed simulation depends on the quality of the simulation, the pedigree of the input data, and the appropriateness of the simulation system to the problem. The second of these factors is best met by using classified performance data from the Army Materiel Systems Analysis Activity (AMSAA) for essential battlefield behaviors such as sensors, weapon fire, and damage assessment. Until recently, using classified data has been a time-consuming and expensive endeavor: it requires significant technical expertise to load, and it is difficult to verify that it works correctly. Fortunately, new capabilities, tools, and processes are available that greatly reduce these costs. This paper discusses these developments, a new method to verify that all of the components are configured and operate properly, and the application to recent Army Capabilities Integration Center (ARCIC) experiments. Recent developments have focused on improving the process of loading the data. OneSAF has redesigned its input data file formats and structures so that they correspond exactly with the Standard File Format (SFF) defined by AMSAA, ARCIC has developed a library of supporting configurations that correlate directly to the AMSAA nomenclature, and the Entity Validation Tool was designed to quickly execute the essential models with a test-jig approach to identify problems with the loaded data. The missing part of the process is provided by the new Expected Results Method. Instead of the usual subjective assessment of quality (e.g., "It looks about right to me"), this new approach compares the performance of a combat model with authoritative expectations to quickly verify that the model, data, and simulation are all working correctly. Integrated together, these developments now make it possible to use AMSAA classified performance data with minimal time and maximum assurance that the experiment's analytical results will be of the highest quality possible.
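    The Expected Results Method's tolerance-based comparison can be sketched as follows; the metric names and values are hypothetical, since the actual AMSAA data are classified:

```python
def verify_expected_results(observed, expected, rel_tol=0.05):
    """Flag any metric whose simulated value falls outside a relative
    tolerance band around the authoritative expectation. Metric names,
    values, and the 5% tolerance are illustrative assumptions."""
    failures = {}
    for metric, exp_val in expected.items():
        obs_val = observed.get(metric)
        if obs_val is None or abs(obs_val - exp_val) > rel_tol * abs(exp_val):
            failures[metric] = (obs_val, exp_val)
    return failures

# Hypothetical authoritative expectations vs. simulated performance:
expected = {"p_hit_1500m": 0.62, "p_kill_given_hit": 0.48}
observed = {"p_hit_1500m": 0.61, "p_kill_given_hit": 0.30}
bad = verify_expected_results(observed, expected)
# p_hit_1500m is within tolerance; p_kill_given_hit is not, so the loaded
# data or model configuration would be suspect.
```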

  13. Update: Advancement of Contact Dynamics Modeling for Human Spaceflight Simulation Applications

    NASA Technical Reports Server (NTRS)

    Brain, Thomas A.; Kovel, Erik B.; MacLean, John R.; Quiocho, Leslie J.

    2017-01-01

    Pong is a new software tool developed at the NASA Johnson Space Center that advances interference-based geometric contact dynamics based on 3D graphics models. The Pong software consists of three parts: a set of scripts to extract geometric data from 3D graphics models, a contact dynamics engine that provides collision detection and force calculations based on the extracted geometric data, and a set of scripts for visualizing the dynamic response with the 3D graphics models. The contact dynamics engine can be linked with an external multibody dynamics engine to provide an integrated multibody contact dynamics simulation. This paper provides a detailed overview of Pong, including the overall approach and modeling capabilities, ranging from force generation with contact primitives and friction to computational performance. Two specific Pong-based examples of International Space Station applications are discussed, and the related verification and validation using this new tool are also addressed.
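    Interference-based contact of the kind described above is often implemented as a penalty force on a geometric primitive; a minimal sphere-versus-plane sketch, with spring-damper gains that are illustrative assumptions rather than Pong's actual model:

```python
def sphere_plane_contact(z_center, radius, vz, k=5.0e4, c=2.0e2):
    """Penalty-style normal contact force for a sphere against the plane z=0.
    The spring-damper gains k and c are invented illustrative values."""
    penetration = radius - z_center      # > 0 when the shapes interfere
    if penetration <= 0.0:
        return 0.0                       # no interference, no force
    f = k * penetration - c * vz         # stiffness plus damping on approach
    return max(f, 0.0)                   # contact can only push, never pull

assert sphere_plane_contact(0.20, 0.10, 0.0) == 0.0    # clear of the plane
assert sphere_plane_contact(0.09, 0.10, -0.5) > 0.0    # penetrating, approaching
```

A force law like this is what a contact engine evaluates after collision detection reports an interference depth; a multibody integrator then applies the force to both bodies.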

  14. Design and simulation of EVA tools for first servicing mission of HST

    NASA Technical Reports Server (NTRS)

    Naik, Dipak; Dehoff, P. H.

    1994-01-01

    The Hubble Space Telescope (HST) was launched into near-Earth orbit by the Space Shuttle Discovery on April 24, 1990. The payload of two cameras, two spectrographs, and a high-speed photometer is supplemented by three fine-guidance sensors that can be used for astronomy as well as for star tracking. A widely reported spherical aberration in the primary mirror caused HST to produce images of much lower quality than intended. A Space Shuttle repair mission in January 1994 installed small corrective mirrors that restored the full intended optical capability of the HST. The First Servicing Mission (FSM) involved considerable Extra Vehicular Activity (EVA), and special EVA tools were designed and developed for this specific purpose. In an earlier report, the details of the data acquisition system developed to test the performance of the various EVA tools in ambient as well as simulated space environments were presented; the general schematic of the test setup is reproduced in this report for continuity. Although the data acquisition system was used extensively to test a number of fasteners, only the results of one test each, carried out on the various fasteners and on the Power Ratchet Tool, are included in this report.

  15. Using a simulation assistant in modeling manufacturing systems

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.

    1988-01-01

    Numerous simulation languages exist for modeling discrete-event processes, and many have now been ported to microcomputers. Graphics and animation capabilities have been added to many of these languages to help users build models and evaluate simulation results. Despite all these languages and added features, the user is still faced with learning the simulation language. Furthermore, the time to construct and then validate the simulation model is always greater than originally anticipated. One approach to minimizing the time requirement is to use predefined macros that describe common processes or operations in a system. The development of a simulation assistant for modeling discrete-event manufacturing processes is presented. A simulation assistant is defined as an interactive, intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on its overall structure, its elements, and the five manufacturing simulation generators. A typical manufacturing system is modeled using the simulation assistant, and the advantages and disadvantages are discussed.

  16. The use of a virtual reality surgical simulator for cataract surgical skill assessment with 6 months of intervening operating room experience.

    PubMed

    Sikder, Shameema; Luo, Jia; Banerjee, P Pat; Luciano, Cristian; Kania, Patrick; Song, Jonathan C; Kahtani, Eman S; Edward, Deepak P; Towerki, Abdul-Elah Al

    2015-01-01

    To evaluate a haptic-based simulator, MicroVisTouch™, as an assessment tool for capsulorhexis performance in cataract surgery. The study is a prospective, unmasked, nonrandomized dual academic institution study conducted at the Wilmer Eye Institute at Johns Hopkins Medical Center (Baltimore, MD, USA) and King Khaled Eye Specialist Hospital (Riyadh, Saudi Arabia). This prospective study evaluated capsulorhexis simulator performance in 78 ophthalmology residents in the US and Saudi Arabia in the first round of testing and 40 residents in a second round for follow-up. Four variables (circularity, accuracy, fluency, and overall) were tested by the simulator and graded on a 0-100 scale. Circularity (42%), accuracy (55%), and fluency (3%) were weighted to give an overall score. Capsulorhexis performance was retested in the original cohort 6 months after baseline assessment. Average scores in all measured metrics demonstrated statistically significant improvement (except for circularity, which trended toward improvement) after baseline assessment. A reduction in standard deviation and an improvement in process capability indices over the 6-month period were also observed. An interval objective improvement in capsulorhexis skill on a haptic-enabled cataract surgery simulator was associated with intervening operating room experience. Further work investigating formalized simulator training programs requiring independent simulator use is needed to determine the simulator's usefulness as an evaluation tool.
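    The weighting reported above (42% circularity, 55% accuracy, 3% fluency) implies a simple composite score; the resident scores in the example are hypothetical:

```python
def overall_score(circularity, accuracy, fluency):
    """Composite capsulorhexis score on a 0-100 scale, using the weights
    reported in the abstract: 42% circularity, 55% accuracy, 3% fluency."""
    return 0.42 * circularity + 0.55 * accuracy + 0.03 * fluency

# Hypothetical resident scores on the three component metrics:
s = overall_score(circularity=70, accuracy=80, fluency=90)
```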

  17. Simulating Humans as Integral Parts of Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Bruins, Anthony C.; Rice, Robert; Nguyen, Lac; Nguyen, Heidi; Saito, Tim; Russell, Elaine

    2006-01-01

    The Collaborative-Virtual Environment Simulation Tool (C-VEST) software was developed for use in a NASA project entitled "3-D Interactive Digital Virtual Human." The project is oriented toward the use of a comprehensive suite of advanced software tools in computational simulations for the purposes of human-centered design of spacecraft missions and of the spacecraft, space suits, and other equipment to be used on the missions. The C-VEST software affords an unprecedented suite of capabilities for three-dimensional virtual-environment simulations with plug-in interfaces for physiological data, haptic interfaces, plug-and-play software, real-time control, and/or playback control. Mathematical models of the mechanics of the human body and of the aforementioned equipment are implemented in software and integrated to simulate forces exerted on and by astronauts as they work. The computational results can then support the iterative processes of design, building, and testing in applied systems engineering and integration. The results of the simulations provide guidance for devising measures to counteract effects of microgravity on the human body and for the rapid development of virtual (that is, simulated) prototypes of advanced space suits, cockpits, and robots to enhance the productivity, comfort, and safety of astronauts. The unique ability to implement human-in-the-loop immersion also makes the C-VEST software potentially valuable for use in commercial and academic settings beyond the original space-mission setting.

  18. Matlab Geochemistry: An open source geochemistry solver based on MRST

    NASA Astrophysics Data System (ADS)

    McNeece, C. J.; Raynaud, X.; Nilsen, H.; Hesse, M. A.

    2017-12-01

    The study of geological systems often requires the solution of complex geochemical relations. To address this need, we present an open-source geochemical solver based on the Matlab Reservoir Simulation Toolbox (MRST) developed by SINTEF. The implementation supports non-isothermal multicomponent aqueous complexation, surface complexation, ion exchange, and dissolution/precipitation reactions. The suite of tools available in MRST allows for rapid model development, in particular the incorporation of geochemical calculations into transport simulations with multiple phases, complex domain geometry, and geomechanics. Different numerical schemes and additional physics can be easily incorporated into the existing tools through the object-oriented framework employed by MRST. The solver leverages the automatic differentiation tools available in MRST to solve arbitrarily complex geochemical systems with any choice of species or element concentration as input. Four mathematical approaches make the solver robust: 1) the choice of chemical elements as the basis components makes all entries in the composition matrix positive, thus preserving convexity; 2) a log variable transformation is used, which transfers the nonlinearity to the convex composition matrix; 3) a priori bounds on variables are calculated from the structure of the problem, constraining Newton's path; and 4) an initial guess is calculated implicitly by sequentially adding model complexity. As a benchmark, we compare the model to experimental and semi-analytic solutions of the coupled salinity-acidity transport system. Together with the reservoir simulation capabilities of MRST, the solver offers a promising tool for geochemical simulations in reservoir domains, with applications in fields ranging from enhanced oil recovery to radionuclide storage.
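    Approach 2) above, solving in log concentration variables with Newton's method, can be illustrated on a tiny invented speciation problem (a weak acid in water); this is a generic sketch in Python, not MRST code:

```python
import numpy as np

# Newton solve of a small aqueous speciation problem in log concentration
# variables. The weak-acid system and its constants are an invented example.
Kw, Ka, T = 1e-14, 1.8e-5, 0.01     # water and acetic-acid equilibria; total acid [M]
lnKw, lnKa = np.log(Kw), np.log(Ka)

def residual(y):                    # y = ln[H+], ln[OH-], ln[HA], ln[A-]
    h, oh, ha, a = np.exp(y)
    return np.array([
        y[0] + y[1] - lnKw,         # mass action [H+][OH-] = Kw (linear in logs)
        y[0] + y[3] - y[2] - lnKa,  # mass action [H+][A-]/[HA] = Ka
        ha + a - T,                 # mole balance on total acid
        h - a - oh,                 # charge balance
    ])

def jacobian(y):
    h, oh, ha, a = np.exp(y)
    return np.array([
        [1.0, 1.0, 0.0, 0.0],
        [1.0, 0.0, -1.0, 1.0],
        [0.0, 0.0, ha, a],
        [h, -oh, 0.0, -a],
    ])

y = np.log([1e-4, 1e-10, T, 1e-4])      # rough initial guess
for _ in range(50):
    step = np.linalg.solve(jacobian(y), residual(y))
    y -= np.clip(step, -5.0, 5.0)       # damp huge steps, echoing approach 3)

pH = -np.log10(np.exp(y[0]))            # ~3.4 for 0.01 M acetic acid
```

The log transform guarantees positive concentrations and moves the nonlinearity into the balance equations, which is the same motivation given for approach 2) above.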

  19. CONFIG - Adapting qualitative modeling and discrete event simulation for design of fault management systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Basham, Bryan D.

    1989-01-01

    CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.

  20. Unsteady flow simulations around complex geometries using stationary or rotating unstructured grids

    NASA Astrophysics Data System (ADS)

    Sezer-Uzol, Nilay

    In this research, the computational analysis of three-dimensional, unsteady, separated, vortical flows around complex geometries is studied by using stationary or moving unstructured grids. Two main engineering problems are investigated. The first is the unsteady simulation of a ship airwake, in which helicopter operations become even more challenging, using stationary unstructured grids. The second is the unsteady simulation of wind turbine rotor flow fields using moving unstructured grids that rotate with the whole three-dimensional rigid rotor geometry. The three-dimensional, unsteady, parallel, unstructured, finite-volume flow solver PUMA2 is used for the computational fluid dynamics (CFD) simulations considered in this research. The code is modified to add a moving-grid capability for three-dimensional, time-dependent rotor simulations. An instantaneous log-law wall model for Large Eddy Simulation is also implemented in PUMA2 to investigate the very high Reynolds number flow fields of rotating blades. To verify the code modifications, several sample test cases are considered. In addition, interdisciplinary studies aimed at providing new tools and insights to the aerospace and wind energy communities were conducted during this research, focusing on the coupling of ship airwake CFD simulations with helicopter flight dynamics and control analysis, the coupling of wind turbine rotor CFD simulations with aeroacoustic analysis, and the analysis of these time-dependent, large-scale CFD simulations with the help of a computational monitoring, steering, and visualization tool, POSSE.
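    A log-law wall model of the kind mentioned above ultimately solves the log law for the friction velocity from the tangential velocity at the first off-wall point; a hedged Newton sketch using the standard constants κ = 0.41 and B = 5.2, which are not necessarily the values used in PUMA2:

```python
import math

def friction_velocity(u_p, y_p, nu, kappa=0.41, B=5.2):
    """Solve the log law  u_p = u_tau * ((1/kappa) ln(y_p u_tau / nu) + B)
    for the friction velocity u_tau by Newton iteration. Constants are the
    textbook values; an actual wall-model implementation may differ."""
    u_tau = max(1e-6, 0.05 * u_p)            # crude initial guess
    for _ in range(100):
        f = u_tau * (math.log(y_p * u_tau / nu) / kappa + B) - u_p
        df = math.log(y_p * u_tau / nu) / kappa + B + 1.0 / kappa
        step = f / df
        u_tau -= step
        if abs(step) < 1e-12:
            break
    return u_tau

# Example: air-like viscosity, first grid point 1 mm off the wall (made up)
ut = friction_velocity(u_p=10.0, y_p=1e-3, nu=1.5e-5)
tau_wall = 1.2 * ut**2                       # wall shear stress with rho = 1.2
```

In an "instantaneous" wall model this solve is performed from the local, instantaneous velocity every time step, giving the LES a wall shear stress boundary condition without resolving the near-wall layer.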

  1. A high order approach to flight software development and testing

    NASA Technical Reports Server (NTRS)

    Steinbacher, J.

    1981-01-01

    The use of a software development facility is discussed as a means of producing a reliable and maintainable ECS software system, and as a means of providing efficient use of the ECS hardware test facility. Principles applied to software design are given, including modularity, abstraction, hiding, and uniformity. The general objectives of each phase of the software life cycle are also given, including testing, maintenance, code development, and requirement specifications. Software development facility tools are summarized, and tool deficiencies recognized in the code development and testing phases are considered. Due to limited lab resources, the functional simulation capabilities may be indispensable in the testing phase.

  2. Aerospace Toolbox---a flight vehicle design, analysis, simulation, and software development environment: I. An introduction and tutorial

    NASA Astrophysics Data System (ADS)

    Christian, Paul M.; Wells, Randy

    2001-09-01

    This paper presents a demonstrated approach to significantly reduce the cost and schedule of non-real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provides a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program, from quick prototyping of concept developments that leverage built-in point-of-departure simulations through detailed design, analysis, and testing. Some of the attributes addressed include its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics covered in this part include flight vehicle models and algorithms and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this paper, to be published at a later date, will conclude with a description of how the Aerospace Toolbox is an integral part of developing embedded code directly from the simulation models by using the MathWorks Real-Time Workshop and optimization tools. It will also address how the Toolbox can be used as a design hub for Internet-based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).

  3. Rule-based modeling with Virtual Cell

    PubMed Central

    Schaff, James C.; Vasilescu, Dan; Moraru, Ion I.; Loew, Leslie M.; Blinov, Michael L.

    2016-01-01

    Summary: Rule-based modeling is invaluable when the number of possible species and reactions in a model becomes too large to allow convenient manual specification. The popular rule-based software tools BioNetGen and NFSim provide powerful modeling and simulation capabilities at the cost of learning a complex scripting language that is used to specify these models. Here, we introduce a modeling tool that combines new graphical rule-based model specification with existing simulation engines in a seamless way within the familiar Virtual Cell (VCell) modeling environment. A mathematical model can be built integrating explicit reaction networks with reaction rules. In addition to offering a large choice of ODE and stochastic solvers, a model can be simulated using a network-free approach through the NFSim simulation engine. Availability and implementation: Available as VCell (versions 6.0 and later) at the Virtual Cell web site (http://vcell.org/). The application installs and runs on all major platforms and does not require registration for use on the user's computer. Tutorials are available at the Virtual Cell website and Help is provided within the software. Source code is available at Sourceforge. Contact: vcell_support@uchc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27497444

  4. Multiobjective optimization of low impact development stormwater controls

    NASA Astrophysics Data System (ADS)

    Eckart, Kyle; McPhee, Zach; Bolisetti, Tirupati

    2018-07-01

    Green infrastructure such as Low Impact Development (LID) controls is being employed to manage urban stormwater and restore predevelopment hydrological conditions, besides improving stormwater runoff quality. Since runoff generation and infiltration processes are nonlinear, there is a need to identify an optimal combination of LID controls. A coupled optimization-simulation model was developed by linking the U.S. EPA Stormwater Management Model (SWMM) to the Borg Multiobjective Evolutionary Algorithm (Borg MOEA). The coupled model performs multiobjective optimization using SWMM simulations to evaluate potential solutions to the optimization problem. The optimization-simulation tool was used to evaluate LID stormwater controls. A SWMM model was developed, calibrated, and validated for a sewershed in Windsor, Ontario, and LID stormwater controls were tested for three different return periods. LID implementation strategies were optimized using the optimization-simulation model for five different implementation scenarios for each of the three storm events, with the objectives of minimizing peak flow in the storm sewers, reducing total runoff, and minimizing cost. For the sewershed in Windsor, Ontario, the peak runoff and total runoff volume were found to be reduced by 13% and 29%, respectively.
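    The coupled optimization-simulation pattern (an optimizer proposing LID placements, a hydrologic simulator scoring them) can be sketched with stand-ins; the toy response surface below is invented, and a simple random search with a Pareto archive takes the place of the Borg MOEA:

```python
import random

# Stand-in for a SWMM run: maps an LID decision vector (fraction of each of
# three subcatchments treated) to (peak flow, total runoff, cost). The
# response surface is invented purely to exercise the optimization loop.
def simulate(x):
    peak = 100.0 * (1.0 - 0.4 * (0.5 * x[0] + 0.3 * x[1] + 0.2 * x[2]))  # m^3/s
    vol = 5000.0 * (1.0 - 0.3 * (0.2 * x[0] + 0.3 * x[1] + 0.5 * x[2]))  # m^3
    cost = 2.0e5 * sum(x)                                                # $
    return (peak, vol, cost)

def dominates(a, b):
    """True if objective vector a is at least as good in every objective and
    strictly better in at least one (all objectives minimized)."""
    return all(ai <= bi for ai, bi in zip(a, b)) and any(
        ai < bi for ai, bi in zip(a, b))

random.seed(0)
archive = []                        # nondominated (decision, objectives) pairs
for _ in range(500):                # random search in place of the Borg MOEA
    x = [random.random() for _ in range(3)]
    obj = simulate(x)
    if any(dominates(o, obj) for _, o in archive):
        continue                    # an archived solution already beats this one
    archive = [(xa, o) for xa, o in archive if not dominates(obj, o)]
    archive.append((x, obj))
```

The archive ends holding the trade-off front between hydrologic performance and cost; a real MOEA differs only in how it proposes the next decision vector.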

  5. Utility of Emulation and Simulation Computer Modeling of Space Station Environmental Control and Life Support Systems

    NASA Technical Reports Server (NTRS)

    Yanosy, James L.

    1988-01-01

    Over the years, computer modeling has been used extensively in many disciplines to solve engineering problems. A set of computer program tools is proposed to assist the engineer in the various phases of the Space Station program, from technology selection through flight operations. The development and application of emulation and simulation transient performance modeling tools for life support systems are examined, and the results of the development and the demonstration of the utility of three computer models are presented. The first model is a detailed computer model (emulation) of a solid amine water desorbed (SAWD) CO2 removal subsystem combined with much less detailed models (simulations) of a cabin, crew, and heat exchangers; this model was used in parallel with the hardware design and test of this CO2 removal subsystem. The second model is a simulation of an air revitalization system combined with a wastewater processing system, used to demonstrate the capability to study subsystem integration. The third model is that of a Space Station total air revitalization system, with a station configuration consisting of a habitat module, a lab module, two crews, and four connecting nodes.

  6. Simulation Testing of Embedded Flight Software

    NASA Technical Reports Server (NTRS)

    Shahabuddin, Mohammad; Reinholtz, William

    2004-01-01

    Virtual Real Time (VRT) is a computer program for testing embedded flight software by computational simulation in a workstation, in contradistinction to testing it in its target central processing unit (CPU). The disadvantages of testing in the target CPU include the need for an expensive test bed, the necessity for testers and programmers to take turns using the test bed, and the lack of software tools for debugging in a real-time environment. By virtue of its architecture, most of the flight software of the type in question is amenable to development and testing on workstations, for which there is an abundance of commercially available debugging and analysis software tools. Unfortunately, the timing of a workstation differs from that of a target CPU in a test bed. VRT, in conjunction with closed-loop simulation software, provides a capability for executing embedded flight software on a workstation in a close-to-real-time environment. A scale factor is used to convert between execution time in VRT on a workstation and execution time on a target CPU. VRT includes high-resolution operating-system timers that enable the synchronization of flight software with simulation software and ground software, all running on different workstations.
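    The scale-factor conversion mentioned above amounts to a single multiplication in each direction; the factor itself would be measured by benchmarking, and the 4.2 below is a made-up value:

```python
# Scale-factor conversion between workstation execution time and target-CPU
# execution time, as described above. The value 4.2 is an assumed,
# benchmark-derived ratio, not a figure from VRT.
SCALE = 4.2   # target CPU takes ~4.2x the workstation's execution time (assumed)

def to_target_time(workstation_seconds, scale=SCALE):
    return workstation_seconds * scale

def to_workstation_time(target_seconds, scale=SCALE):
    return target_seconds / scale

# Round-tripping a duration through both conversions recovers the original:
assert abs(to_workstation_time(to_target_time(0.25)) - 0.25) < 1e-12
```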

  7. The Numerical Propulsion System Simulation: A Multidisciplinary Design System for Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    1999-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of the design process and to provide the designer with critical information about the components early in the design process. This paper describes the development of the Numerical Propulsion System Simulation (NPSS), a multidisciplinary system of analysis tools that is focused on extending the simulation capability from components to the full system. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  8. Secure Large-Scale Airport Simulations Using Distributed Computational Resources

    NASA Technical Reports Server (NTRS)

    McDermott, William J.; Maluf, David A.; Gawdiak, Yuri; Tran, Peter; Clancy, Dan (Technical Monitor)

    2001-01-01

    To fully conduct research that will support the far-term concepts, technologies, and methods required to improve the safety of air transportation, a simulation environment of the requisite degree of fidelity must first be in place. The Virtual National Airspace Simulation (VNAS) will provide the underlying infrastructure necessary for such a simulation system. Aerospace-specific knowledge management services, such as intelligent data-integration middleware, will support the management of information associated with this complex and critically important operational environment. This simulation environment, in conjunction with a distributed network of supercomputers and high-speed network connections to aircraft and to Federal Aviation Administration (FAA), airline, and other data sources, will provide the capability to continuously monitor and measure operational performance against expected performance. The VNAS will also provide the tools to use this performance baseline to obtain a perspective of what is happening today and of the potential impact of proposed changes before they are introduced into the system.

  9. Genetic Adaptive Control for PZT Actuators

    NASA Technical Reports Server (NTRS)

    Kim, Jeongwook; Stover, Shelley K.; Madisetti, Vijay K.

    1995-01-01

    A piezoelectric transducer (PZT) is capable of providing linear motion if controlled correctly and could provide a replacement for traditional heavy and large motor-driven servo systems. This paper focuses on a genetic model reference adaptive control (GMRAC) technique for a PZT that is moving a mirror, where the goal is to keep the mirror velocity constant. Genetic Algorithms (GAs) are an integral part of the GMRAC technique, acting as the search engine for an optimal PID controller. Two methods of controlling the actuator are suggested in this research: the first is to change the PID parameters, and the other is to add an additional reference input to the system. The simulation results of these two methods are compared. Simulated Annealing (SA) is also used to solve the problem, and the GA and SA results are compared after simulation, with the GAs showing the best results. The entire model is designed using the MathWorks' Simulink tool.
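    The GA-as-search-engine idea can be sketched by evolving PID gains against a toy first-order plant standing in for the PZT/mirror dynamics; all plant constants and GA settings below are invented, and this is not the paper's Simulink model:

```python
import random

# Toy stand-in for the PZT/mirror plant: first-order velocity dynamics
# v' = (-v + K*u)/tau, with a constant-velocity setpoint. Plant constants
# and GA settings are invented for illustration.
def cost(gains, v_ref=1.0, dt=1e-3, steps=500, K=2.0, tau=0.05):
    kp, ki, kd = gains
    v, integ, prev_err, err_sum = 0.0, 0.0, v_ref, 0.0
    for _ in range(steps):
        err = v_ref - v
        integ += err * dt
        u = kp * err + ki * integ + kd * (err - prev_err) / dt   # PID law
        prev_err = err
        v += dt * (-v + K * u) / tau
        err_sum += abs(err) * dt
        if abs(v) > 1e6:          # unstable gain set: bail out with a penalty
            return 1e9
    return err_sum                # integral of absolute velocity error

random.seed(1)
pop = [[random.uniform(0.0, 5.0) for _ in range(3)] for _ in range(20)]
for _ in range(30):               # generations: elitist selection + variation
    pop.sort(key=cost)
    parents = pop[:10]
    children = []
    for _ in range(10):           # blend crossover with Gaussian mutation
        a, b = random.sample(parents, 2)
        children.append([max(0.0, (ga + gb) / 2 + random.gauss(0.0, 0.1))
                         for ga, gb in zip(a, b)])
    pop = parents + children
best = min(pop, key=cost)         # fittest PID gains found by the search
```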

  10. Toward high-speed 3D nonlinear soft tissue deformation simulations using Abaqus software.

    PubMed

    Idkaidek, Ashraf; Jasiuk, Iwona

    2015-12-01

    We aim to achieve a fast and accurate three-dimensional (3D) simulation of a porcine liver deformation under a surgical tool pressure using the commercial finite element software Abaqus. The liver geometry is obtained using magnetic resonance imaging, and a nonlinear constitutive law is employed to capture large deformations of the tissue. Effects of implicit versus explicit analysis schemes, element type, and mesh density on computation time are studied. We find that Abaqus explicit and implicit solvers are capable of simulating nonlinear soft tissue deformations accurately using first-order tetrahedral elements in a relatively short time by optimizing the element size. This study provides new insights and guidance on accurate and relatively fast nonlinear soft tissue simulations. Such simulations can provide force feedback during robotic surgery and allow visualization of tissue deformations for surgery planning and training of surgical residents.
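    The implicit-versus-explicit trade-off the study examines can be illustrated on a stiff scalar test equation; this is a generic numerical-analysis sketch, not an Abaqus simulation:

```python
# Stiff linear test problem y' = -lam*y, illustrating why implicit schemes
# tolerate much larger time steps than explicit ones. Values are arbitrary.
lam, y0, t_end = 1000.0, 1.0, 0.1
dt = 0.004                    # exceeds the explicit stability limit 2/lam = 0.002

def explicit_euler(dt):
    y = y0
    for _ in range(int(t_end / dt)):
        y = y + dt * (-lam * y)          # forward Euler update
    return y

def implicit_euler(dt):
    y = y0
    for _ in range(int(t_end / dt)):
        y = y / (1.0 + dt * lam)         # backward Euler, solved exactly here
    return y

# At this step size the explicit result blows up, while the implicit result
# decays toward the true solution exp(-lam*t) ~ 0; the price of the implicit
# scheme in a real FE code is solving a (nonlinear) system at every step.
```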

  11. An online tool for tracking soil nitrogen

    NASA Astrophysics Data System (ADS)

    Wang, J.; Umar, M.; Banger, K.; Pittelkow, C. M.; Nafziger, E. D.

    2016-12-01

    Near real-time crop models can be useful tools for optimizing agricultural management practices. For example, model simulations can potentially provide current estimates of nitrogen availability in soil, helping growers decide whether more nitrogen needs to be applied in a given season. Traditionally, crop models have been used at point locations (i.e., single fields) with homogeneous soil, climate, and initial conditions. However, estimating nitrogen availability across fields with varied weather and soil conditions at a regional or national level is necessary to guide better management decisions. This study presents the development of a publicly available online tool that automates the integration of high-spatial-resolution forecast and historical weather and soil data into DSSAT to estimate nitrogen availability for individual fields in Illinois. The model has been calibrated with field experiments from the previous year at six research corn fields across Illinois, treated with different N fertilizer timings and amounts. The tool requires minimal management information from growers, yet it can simulate nitrogen-water-crop interactions with calibrated parameters appropriate for Illinois. The results from the tool will be combined with incoming field-experiment data from 2016 for model validation and further improvement of the model's predictive accuracy. The tool has the potential to help guide nitrogen management practices that maximize economic and environmental benefits.
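
    The kind of nitrogen bookkeeping such a tool performs can be illustrated with a deliberately simple daily balance; every rate constant below is invented for illustration and is not a DSSAT parameter.

```python
def soil_n_balance(n0, days, fert_events, rain_mm, uptake_per_day=1.2,
                   mineralization=0.8, leach_frac_per_mm=0.002):
    """Daily soil mineral-N bookkeeping (kg N/ha) with assumed rates."""
    n = n0
    history = []
    for d in range(days):
        n += fert_events.get(d, 0.0)             # fertilizer applied on day d
        n += mineralization                      # N released from organic matter
        n -= min(n, uptake_per_day)              # crop uptake, cannot go negative
        n -= n * leach_frac_per_mm * rain_mm[d]  # rain-driven leaching loss
        history.append(n)
    return history

hist = soil_n_balance(n0=50.0, days=30,
                      fert_events={10: 60.0},   # one side-dress application
                      rain_mm=[5.0] * 30)
```

    A real crop model replaces each of these terms with process-based submodels driven by weather, soil properties, and crop stage; the value of the online tool is automating those inputs per field.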

  12. A High Fidelity Approach to Data Simulation for Space Situational Awareness Missions

    NASA Astrophysics Data System (ADS)

    Hagerty, S.; Ellis, H., Jr.

    2016-09-01

    Space Situational Awareness (SSA) is vital to maintaining our Space Superiority. A high fidelity, time-based simulation tool, PROXOR™ (Proximity Operations and Rendering), supports SSA by generating realistic mission scenarios including sensor frame data with corresponding truth. This is a unique and critical tool for supporting mission architecture studies, new capability (algorithm) development, current/future capability performance analysis, and mission performance prediction. PROXOR™ provides a flexible architecture for sensor and resident space object (RSO) orbital motion and attitude control that simulates SSA, rendezvous and proximity operations scenarios. The major elements of interest are based on the ability to accurately simulate all aspects of the RSO model, viewing geometry, imaging optics, sensor detector, and environmental conditions. These capabilities enhance the realism of mission scenario models and generated mission image data. As an input, PROXOR™ uses a library of 3-D satellite models containing 10+ satellites, including low-earth orbit (e.g., DMSP) and geostationary (e.g., Intelsat) spacecraft, where the spacecraft surface properties are those of actual materials and include Phong and Maxwell-Beard bidirectional reflectance distribution function (BRDF) coefficients for accurate radiometric modeling. We calculate the inertial attitude, the changing solar and Earth illumination angles of the satellite, and the viewing angles from the sensor as we propagate the RSO in its orbit. The synthetic satellite image is rendered at high resolution and aggregated to the focal plane resolution resulting in accurate radiometry even when the RSO is a point source. 
The sensor model includes optical effects from the imaging system [point spread function (PSF) includes aberrations, obscurations, support structures, defocus], detector effects (CCD blooming, left/right bias, fixed pattern noise, image persistence, shot noise, read noise, and quantization noise), and environmental effects (radiation hits with selectable angular distributions and 4-layer atmospheric turbulence model for ground based sensors). We have developed an accurate flash Light Detection and Ranging (LIDAR) model that supports reconstruction of 3-dimensional information on the RSO. PROXOR™ contains many important imaging effects such as intra-frame smear, realized by oversampling the image in time and capturing target motion and jitter during the integration time.
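
    A toy version of such a detector noise chain, applied to a few ideal pixel values; the read noise, gain, full well, and bit depth are generic CCD numbers assumed for illustration, not PROXOR's sensor model.

```python
import random

def detector_chain(photons, read_noise_e=5.0, full_well=50000.0,
                   adc_bits=12, gain_e_per_dn=12.2, seed=0):
    """Apply a simplified shot/read/quantization noise chain to an ideal
    photoelectron image (values in electrons)."""
    rng = random.Random(seed)
    max_dn = 2 ** adc_bits - 1
    out = []
    for s in photons:
        e = rng.gauss(s, s ** 0.5)                # shot noise ~ sqrt(signal)
        e += rng.gauss(0.0, read_noise_e)         # read noise
        e = min(max(e, 0.0), full_well)           # clip to full well
        dn = min(int(e / gain_e_per_dn), max_dn)  # quantize to ADC counts
        out.append(dn)
    return out

# dim pixel, bright pixel, and a pixel beyond full well (saturates)
frame = detector_chain([100.0, 10000.0, 60000.0])
```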

  13. Mixed-Dimensionality VLSI-Type Configurable Tools for Virtual Prototyping of Biomicrofluidic Devices and Integrated Systems

    NASA Astrophysics Data System (ADS)

    Makhijani, Vinod B.; Przekwas, Andrzej J.

    2002-10-01

    This report presents results of a DARPA/MTO Composite CAD Project aimed to develop a comprehensive microsystem CAD environment, CFD-ACE+ Multiphysics, for bio and microfluidic devices and complete microsystems. The project began in July 1998, and was a three-year team effort between CFD Research Corporation, California Institute of Technology (CalTech), University of California, Berkeley (UCB), and Tanner Research, with Mr. Don Verlee from Abbott Labs participating as a consultant on the project. The overall objective of this project was to develop, validate and demonstrate several applications of a user-configurable VLSI-type mixed-dimensionality software tool for design of biomicrofluidics devices and integrated systems. The developed tool would provide high fidelity 3-D multiphysics modeling capability, 1-D fluidic circuits modeling, and SPICE interface for system level simulations, and mixed-dimensionality design. It would combine tools for layouts and process fabrication, geometric modeling, and automated grid generation, and interfaces to EDA tools (e.g. Cadence) and MCAD tools (e.g. ProE).

  14. Performance of technology-driven simulators for medical students--a systematic review.

    PubMed

    Michael, Michael; Abboudi, Hamid; Ker, Jean; Shamim Khan, Mohammed; Dasgupta, Prokar; Ahmed, Kamran

    2014-12-01

    Simulation-based education has evolved as a key training tool in high-risk industries such as aviation and the military. In parallel with these industries, the benefits of incorporating specialty-oriented simulation training within medical schools are vast. Adoption of simulators into medical school education programs has shown great promise and has the potential to revolutionize modern undergraduate education. An English literature search was carried out using the MEDLINE, EMBASE, and PsycINFO databases to identify all randomized controlled studies pertaining to "technology-driven" simulators used in undergraduate medical education. A validity framework incorporating the "framework for technology enhanced learning" report by the Department of Health, United Kingdom, was used to evaluate the capabilities of each technology-driven simulator. Information was collected regarding the simulator type, characteristics, and brand name. Where possible, we extracted information from the studies on the simulators' performance with respect to validity status, reliability, feasibility, educational impact, acceptability, and cost effectiveness. We identified 19 studies, analyzing simulators for medical students across a variety of procedure-based specialties, including: cardiovascular (n = 2), endoscopy (n = 3), laparoscopic surgery (n = 8), vascular access (n = 2), ophthalmology (n = 1), obstetrics and gynecology (n = 1), anesthesia (n = 1), and pediatrics (n = 1). Incorporation of simulators has so far been on an institutional level; no national or international trends have yet emerged. Simulators are capable of providing a highly educational and realistic experience for medical students within a variety of specialty-oriented teaching sessions. Further research is needed to establish how best to incorporate simulators into the earlier stages of medical education: preclinical and clinical undergraduate medicine. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. Data-Informed Large-Eddy Simulation of Coastal Land-Air-Sea Interactions

    NASA Astrophysics Data System (ADS)

    Calderer, A.; Hao, X.; Fernando, H. J.; Sotiropoulos, F.; Shen, L.

    2016-12-01

    The study of atmospheric flows in coastal areas has not been fully addressed, owing to the complex processes emerging from land-air-sea interactions, e.g., abrupt changes in land topography, strong current shear, wave shoaling, and depth-limited wave breaking. The computational tools that have been applied to study such littoral regions are mostly based on open-ocean assumptions, which often do not yield reliable solutions. The goal of the present study is to better understand some of these near-shore processes, employing advanced computational tools developed in our research group. Our computational framework combines a large-eddy simulation (LES) flow solver for atmospheric flows, a sharp-interface immersed boundary method that can handle real complex topographies (Calderer et al., J. Comp. Physics 2014), and a phase-resolved, depth-dependent wave model (Yang and Shen, J. Comp. Physics 2011). Using real measured data from the FRF station in Duck, North Carolina, we validate and demonstrate the predictive capabilities of the present computational framework, whose results are in overall good agreement with the measured data under different wind-wave scenarios. We also analyse the effects of some of the complex processes captured by our simulation tools.

  16. Micro-Vibration Performance Prediction of SEPTA24 Using SMeSim (RUAG Space Mechanism Simulator Tool)

    NASA Astrophysics Data System (ADS)

    Omiciuolo, Manolo; Lang, Andreas; Wismer, Stefan; Barth, Stephan; Szekely, Gerhard

    2013-09-01

    Scientific space missions are placing ever more demanding performance requirements on their payloads. Performance can be dramatically restricted by micro-vibration loads generated by any moving parts of the satellite, including Solar Array Drive Assemblies (SADAs). Micro-vibration prediction of SADAs is therefore very important to support their design and optimization in the early stages of a programme. The Space Mechanism Simulator (SMeSim) tool, developed by RUAG, enhances the capability of analysing the micro-vibration emissivity of a SADA under a specified set of boundary conditions. The tool is developed in the Matlab/Simulink® environment as a library of blocks simulating the different components a SADA is made of. The modular architecture of the blocks, assembled by the user, and the set-up of the boundary conditions allow time-domain and frequency-domain analyses of a rigid multi-body model with concentrated flexibilities and coupled electronic control of the mechanism. SMeSim is used to model the SEPTA24 Solar Array Drive Mechanism and predict its micro-vibration emissivity. SMeSim, and the experience gained throughout its development and use, can now support activities such as verification by analysis of micro-vibration emissivity requirements and design optimization to minimize the micro-vibration emissivity of a SADA.

  17. Overview of Experimental Capabilities - Supersonics

    NASA Technical Reports Server (NTRS)

    Banks, Daniel W.

    2007-01-01

    This viewgraph presentation gives an overview of experimental capabilities applicable to the area of supersonic research. The contents include: 1) EC Objectives; 2) SUP.11: Elements; 3) NRA; 4) Advanced Flight Simulator Flexible Aircraft Simulation Studies; 5) Advanced Flight Simulator Flying Qualities Guideline Development for Flexible Supersonic Transport Aircraft; 6) Advanced Flight Simulator Rigid/Flex Flight Control; 7) Advanced Flight Simulator Rapid Sim Model Exchange; 8) Flight Test Capabilities Advanced In-Flight Infrared (IR) Thermography; 9) Flight Test Capabilities In-Flight Schlieren; 10) Flight Test Capabilities CLIP Flow Calibration; 11) Flight Test Capabilities PFTF Flowfield Survey; 12) Ground Test Capabilities Laser-Induced Thermal Acoustics (LITA); 13) Ground Test Capabilities Doppler Global Velocimetry (DGV); 14) Ground Test Capabilities Doppler Global Velocimetry (DGV); and 15) Ground Test Capabilities EDL Optical Measurement Capability (PIV) for Rigid/Flexible Decelerator Models.

  18. Reacting Multi-Species Gas Capability for USM3D Flow Solver

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Schuster, David M.

    2012-01-01

    The USM3D Navier-Stokes flow solver contributed heavily to the NASA Constellation Project (CxP) as a highly productive computational tool for generating the aerodynamic databases for the Ares I and V launch vehicles and Orion launch abort vehicle (LAV). USM3D is currently limited to ideal-gas flows, which are not adequate for modeling the chemistry or temperature effects of hot-gas jet flows. This task was initiated to create an efficient implementation of multi-species gas and equilibrium chemistry into the USM3D code to improve its predictive capabilities for hot jet impingement effects. The goal of this NASA Engineering and Safety Center (NESC) assessment was to implement and validate a simulation capability to handle real-gas effects in the USM3D code. This document contains the outcome of the NESC assessment.
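
    The step from an ideal gas to a multi-species gas amounts to replacing single-species constants with composition-weighted mixture properties. A minimal sketch, using standard species data but an assumed, illustrative exhaust-like composition (this is not the NESC implementation):

```python
def mixture_gas_properties(mass_fractions, molar_masses, cp_values):
    """Mass-fraction-weighted properties of a multi-species ideal-gas
    mixture: specific gas constant R, cp, and gamma = cp / (cp - R)."""
    r_univ = 8.314462618  # J/(mol K)
    r_mix = sum(y * r_univ / m for y, m in zip(mass_fractions, molar_masses))
    cp_mix = sum(y * cp for y, cp in zip(mass_fractions, cp_values))
    gamma = cp_mix / (cp_mix - r_mix)
    return r_mix, cp_mix, gamma

# hot exhaust-like mix of H2O and N2 (mass fractions assumed for illustration)
species_m = [0.018015, 0.028013]   # kg/mol: H2O, N2
species_cp = [1996.0, 1040.0]      # J/(kg K), approximate low-temperature values
r_mix, cp_mix, gamma = mixture_gas_properties([0.3, 0.7], species_m, species_cp)
```

    Equilibrium chemistry then makes the mass fractions themselves functions of local pressure and temperature, which is what distinguishes hot-jet predictions from the ideal-gas case.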

  19. Distributed collaborative environments for predictive battlespace awareness

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2003-09-01

    The past decade has produced significant changes in the conduct of military operations: asymmetric warfare, the reliance on dynamic coalitions, stringent rules of engagement, increased concern about collateral damage, and the need for sustained air operations. Mission commanders need to assimilate a tremendous amount of information, make quick-response decisions, and quantify the effects of those decisions in the face of uncertainty. Situational assessment is crucial in understanding the battlespace. Decision support tools in a distributed collaborative environment offer the capability of decomposing complex multitask processes and distributing them over a dynamic set of execution assets that include modeling, simulation, and analysis tools. Decision support technologies can semi-automate activities, such as analysis and planning, that have a reasonably well-defined process, and provide machine-level interfaces to refine the myriad of information that the commander must fuse. Collaborative environments provide the framework and integrate models, simulations, and domain-specific decision support tools for the sharing and exchanging of data, information, knowledge, and actions. This paper describes ongoing AFRL research efforts in applying distributed collaborative environments to predictive battlespace awareness.

  20. High Fidelity Simulations of Large-Scale Wireless Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onunkwo, Uzoma; Benz, Zachary

    The worldwide proliferation of wirelessly connected devices continues to accelerate. There are tens of billions of wireless links across the planet, with an explosion of new wireless usage anticipated as the Internet of Things develops. Wireless technologies not only provide convenience for mobile applications but are also extremely cost-effective to deploy. This trend towards wireless connectivity will therefore continue, and Sandia must develop the simulation technology needed to proactively analyze the associated emerging vulnerabilities. Wireless networks are marked by mobility and proximity-based connectivity. The de facto standard for exploratory studies of wireless networks is discrete event simulation (DES). However, simulating large-scale wireless networks is extremely difficult due to prohibitively long turnaround times. A path forward is to expedite simulations with parallel discrete event simulation (PDES) techniques. The mobility and distance-based connectivity associated with wireless simulations, however, typically doom PDES to poor scaling (e.g., in the OPNET and ns-3 simulators). We propose a PDES-based tool aimed at reducing the communication overhead between processors. The proposed solution will use light-weight processes to dynamically distribute computation workload while mitigating the communication overhead associated with synchronization. This work is vital to the analytics and validation capabilities of simulation and emulation at Sandia. We have years of experience in Sandia’s simulation and emulation projects (e.g., MINIMEGA and FIREWHEEL). Sandia’s current highly regarded capabilities in large-scale emulation have focused on wired networks, where two assumptions prevent scalable wireless studies: (a) the connections between objects are mostly static, and (b) the nodes have fixed locations.
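
    At the core of any DES, sequential or parallel, is a time-ordered event queue. Below is a minimal sequential sketch of flooding one message through a tiny topology; PDES would partition the nodes across processes and synchronize their local clocks. The topology and latencies are made up for illustration.

```python
import heapq

def run_des(links, end_time):
    """Minimal sequential discrete-event simulation of message flooding.
    links: {node: [(neighbor, latency), ...]}. Each node delivers the
    message once, at the earliest arrival time."""
    events = [(0.0, 0, 'A')]        # (time, tie-break seq, receiving node)
    seq = 1
    delivered, seen = [], set()
    while events:
        t, _, node = heapq.heappop(events)
        if t > end_time or node in seen:
            continue
        seen.add(node)
        delivered.append((round(t, 3), node))
        for nbr, lat in links.get(node, []):
            heapq.heappush(events, (t + lat, seq, nbr))
            seq += 1
    return delivered

net = {'A': [('B', 1.0), ('C', 2.5)], 'B': [('C', 1.0)], 'C': []}
trace = run_des(net, end_time=10.0)
```

    Mobility is what breaks the parallel version: when `links` changes with node positions, the latency bounds that PDES uses to let processors run ahead of each other shrink, forcing frequent synchronization.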

  1. Towards reasoning and coordinating action in the mental space.

    PubMed

    Mohan, Vishwanathan; Morasso, Pietro

    2007-08-01

    Unlike a purely reactive system, where the motor output is exclusively controlled by the actual sensory input, a cognitive system must be capable of running mental processes which virtually simulate action sequences aimed at achieving a goal. The mental process either attempts to find a feasible course of action compatible with a number of constraints (internal, environmental, task-specific, etc.) or selects one from a repertoire of previously learned actions, according to the parameters of the task. If neither reasoning process succeeds, a typical backup strategy is to look for a tool that might allow the agent to satisfy all the task constraints. This further necessitates the capability to alter one's own goal structures to generate sub-goals which must be successfully accomplished in order to achieve the primary goal. In this paper, we introduce a forward/inverse motor control architecture (FMC/IMC) that relaxes an internal model of the overall kinematic chain to a virtual force field applied to the end effector, in the intended direction of movement. This is analogous to the mechanism of coordinating the motion of a wooden marionette by means of attached strings. The relaxation of the FMC/IMC pair provides a general solution for mentally simulating the action of reaching a target position, taking into consideration a range of geometric constraints (range of motion in the joint space, internal and external constraints in the workspace) as well as effort-related constraints (range of torque of the actuators, etc.). If the forward simulation is successful, the movement is executed; otherwise the residual "error", or measure of inconsistency, is taken as a starting point for breaking the action plan into a sequence of sub-actions.
This process is achieved using a recurrent neural network (RNN) which coordinates the overall reasoning process of framing and issuing goals to the forward/inverse models, searching for alternative tools in the solution space, and forming sub-goals based on past context knowledge and present inputs. The RNN + FMC/IMC system is able to successfully reason about and coordinate a diverse range of reaching and grasping sequences, with or without tools. Using a simple robotic platform (5-DOF Scorbot arm + stereo vision), we present results of reasoning and coordination of arm/tool movements (real and mentally simulated), specifically directed towards solving the classical two-stick paradigm from animal reasoning at a nonlinguistic level.
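
    The flavor of relaxing an internal model under a virtual end-effector force can be illustrated with a 2-link planar arm driven by Jacobian-transpose descent: if the residual error stays large after relaxation, the mental simulation judges the target unreachable. This is a deliberately simplified analog, not the authors' FMC/IMC network.

```python
import math

def mental_reach(target, lengths=(1.0, 1.0), iters=500, step=0.05):
    """Relax a 2-link planar arm toward `target` by applying a virtual
    spring force at the end effector and mapping it to joint motion via
    the Jacobian transpose (gradient descent on the reaching error).
    Returns the residual end-effector error."""
    l1, l2 = lengths
    q1, q2 = 0.3, 0.3                            # initial joint angles (rad)
    for _ in range(iters):
        x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
        y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
        fx, fy = target[0] - x, target[1] - y    # virtual force on end effector
        # Jacobian of (x, y) w.r.t. (q1, q2); transpose maps force -> torques
        j11 = -l1 * math.sin(q1) - l2 * math.sin(q1 + q2)
        j12 = -l2 * math.sin(q1 + q2)
        j21 = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
        j22 = l2 * math.cos(q1 + q2)
        q1 += step * (j11 * fx + j21 * fy)
        q2 += step * (j12 * fx + j22 * fy)
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return math.hypot(target[0] - x, target[1] - y)

reachable_err = mental_reach((1.2, 0.8))      # inside the arm's workspace
unreachable_err = mental_reach((3.0, 0.0))    # beyond the max reach of 2.0
```

    In the paper's terms, the large residual in the second case is the inconsistency measure that would trigger sub-goal formation, e.g. first reaching for a stick that extends the kinematic chain.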

  2. NASA HPCC Technology for Aerospace Analysis and Design

    NASA Technical Reports Server (NTRS)

    Schulbach, Catherine H.

    1999-01-01

    The Computational Aerosciences (CAS) Project is part of NASA's High Performance Computing and Communications Program. Its primary goal is to accelerate the availability of high-performance computing technology to the US aerospace community, providing it with key tools to reduce design cycle times and increase fidelity in order to improve the safety, efficiency, and capability of future aerospace vehicles. A complementary goal is to hasten the emergence of a viable commercial market within the aerospace community, to the advantage of the domestic computer hardware and software industry. The CAS Project selects representative aerospace problems (especially design problems) and uses them to focus efforts on advancing aerospace algorithms and applications, systems software, and computing machinery, to demonstrate large improvements in system performance and capability over the life of the program. Recent demonstrations have served to assess the benefits of possible performance improvements while reducing the risk of adopting high-performance computing technology. This talk will discuss past accomplishments in providing technology to the aerospace community, present efforts, and future goals. For example, the times to run full combustor and compressor simulations of aircraft engines have been reduced by factors of 320:1 and 400:1, respectively. While this has enabled new capabilities in engine simulation, the goal of an overnight, dynamic, multi-disciplinary, 3-dimensional simulation of an aircraft engine is still years away and will require new generations of high-end technology.

  3. Faster Aerodynamic Simulation With Cart3D

    NASA Technical Reports Server (NTRS)

    2003-01-01

    A NASA-developed aerodynamic simulation tool is ensuring the safety of future space operations while providing designers and engineers with an automated, highly accurate computer simulation suite. Cart3D, co-winner of NASA's 2002 Software of the Year award, is the result of over 10 years of research and software development conducted by Michael Aftosmis and Dr. John Melton of Ames Research Center and Professor Marsha Berger of the Courant Institute at New York University. Cart3D offers a revolutionary approach to computational fluid dynamics (CFD), the computer simulation of how fluids and gases flow around an object of a particular design. By fusing technological advancements in diverse fields such as mineralogy, computer graphics, computational geometry, and fluid dynamics, the software provides a new industrial geometry processing and fluid analysis capability with unsurpassed automation and efficiency.

  4. On a simulation study for reliable and secured smart grid communications

    NASA Astrophysics Data System (ADS)

    Mallapuram, Sriharsha; Moulema, Paul; Yu, Wei

    2015-05-01

    Demand response is one of the key smart grid applications; it aims to reduce power generation at peak hours and maintain a balance between supply and demand. With the support of communication networks, energy consumers can become active actors in the energy management process by adjusting or rescheduling their electricity usage during peak hours in response to utilities' pricing incentives. Nonetheless, the integration of communication networks exposes the smart grid to cyber-attacks. In this paper, we developed a smart grid simulation test-bed and designed evaluation scenarios. By leveraging the capabilities of the Matlab and ns-3 simulation tools, we conducted a simulation study to evaluate the impact of cyber-attacks on the demand response application. Our data show that cyber-attacks could seriously disrupt smart grid operations, confirming the need for secure and resilient communication networks to support smart grid operations.
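
    The demand-response behaviour being simulated, shifting consumption out of peak hours, can be shown with a toy schedule; the load profile, peak window, and 30% shift fraction are invented numbers, not the paper's Matlab/ns-3 test-bed.

```python
def shift_peak_load(load, peak_hours, frac=0.3):
    """Move `frac` of the demand in each peak hour to the off-peak hours,
    spread evenly -- a toy demand-response schedule."""
    new = list(load)
    moved = 0.0
    for h in peak_hours:
        cut = new[h] * frac
        new[h] -= cut
        moved += cut
    off_peak = [h for h in range(len(load)) if h not in peak_hours]
    share = moved / len(off_peak)
    for h in off_peak:
        new[h] += share                  # shifted energy, not shed energy
    return new

hourly = [3, 3, 3, 4, 5, 8, 10, 9, 6, 4, 3, 3]   # kW over 12 hours (assumed)
flattened = shift_peak_load(hourly, peak_hours=[5, 6, 7])
```

    A cyber-attack on the pricing signal inverts this logic: a spoofed off-peak price during the true peak would concentrate load instead of flattening it, which is the disruption the study quantifies.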

  5. GTKDynamo: a PyMOL plug-in for QC/MM hybrid potential simulations

    PubMed Central

    Bachega, José Fernando R.; Timmers, Luís Fernando S.M.; Assirati, Lucas; Bachega, Leonardo R.; Field, Martin J.; Wymore, Troy

    2014-01-01

    Hybrid quantum chemical (QC)/molecular mechanical (MM) potentials are very powerful tools for molecular simulation. They are especially useful for studying processes in condensed phase systems, such as chemical reactions, that involve a relatively localized change in electronic structure and where the surrounding environment contributes to these changes but can be represented with more computationally efficient functional forms. Despite their utility, however, these potentials are not always straightforward to apply since the extent of significant electronic structure changes occurring in the condensed phase process may not be intuitively obvious. To facilitate their use we have developed an open-source graphical plug-in, GTKDynamo, that links the PyMOL visualization program and the pDynamo QC/MM simulation library. This article describes the implementation of GTKDynamo and its capabilities and illustrates its application to QC/MM simulations. PMID:24137667

  6. Process Modeling of Composite Materials for Wind-Turbine Rotor Blades: Experiments and Numerical Modeling

    PubMed Central

    Wieland, Birgit; Ropte, Sven

    2017-01-01

    The production of rotor blades for wind turbines is still a predominantly manual process. Process simulation is an adequate way of improving blade quality without a significant increase in production costs. This paper introduces a module for tolerance simulation for rotor-blade production processes. The investigation focuses on the simulation of temperature distribution for one-sided, self-heated tooling and thick laminates. Experimental data from rotor-blade production and down-scaled laboratory tests are presented. Based on influencing factors that are identified, a physical model is created and implemented as a simulation. This provides an opportunity to simulate temperature and cure-degree distribution for two-dimensional cross sections. The aim of this simulation is to support production processes. Hence, it is modelled as an in situ simulation with direct input of temperature data and real-time capability. A monolithic part of the rotor blade, the main girder, is used as an example for presenting the results. PMID:28981458
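
    The through-thickness temperature simulation can be sketched as an explicit finite-difference solution of the 1-D heat equation, with the tool side held hot and the top face at ambient. The diffusivity, thickness, and boundary handling are crude assumed values, not the paper's calibrated model, and a real cure simulation adds an exothermic source term for the resin reaction.

```python
def heat_1d(n=20, thickness=0.04, alpha=1.5e-7, t_tool=80.0, t_air=20.0,
            t0=20.0, dt=1.0, steps=3600):
    """Explicit 1-D through-thickness conduction in a laminate heated from
    one side. Returns the temperature profile (deg C) after `steps` seconds."""
    dx = thickness / (n - 1)
    r = alpha * dt / dx ** 2            # must be <= 0.5 for stability
    assert r <= 0.5, "explicit scheme unstable"
    temp = [t0] * n
    temp[0], temp[-1] = t_tool, t_air   # fixed tool face / ambient top face
    for _ in range(steps):
        new = temp[:]
        for i in range(1, n - 1):
            new[i] = temp[i] + r * (temp[i - 1] - 2 * temp[i] + temp[i + 1])
        temp = new
    return temp

profile = heat_1d()   # monotone profile from hot tool face to cool top face
```

    The in-situ aspect of the paper's module corresponds to feeding measured tool-face temperatures into the boundary condition at each step instead of a constant.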

  7. Process Modeling of Composite Materials for Wind-Turbine Rotor Blades: Experiments and Numerical Modeling.

    PubMed

    Wieland, Birgit; Ropte, Sven

    2017-10-05

    The production of rotor blades for wind turbines is still a predominantly manual process. Process simulation is an adequate way of improving blade quality without a significant increase in production costs. This paper introduces a module for tolerance simulation for rotor-blade production processes. The investigation focuses on the simulation of temperature distribution for one-sided, self-heated tooling and thick laminates. Experimental data from rotor-blade production and down-scaled laboratory tests are presented. Based on influencing factors that are identified, a physical model is created and implemented as a simulation. This provides an opportunity to simulate temperature and cure-degree distribution for two-dimensional cross sections. The aim of this simulation is to support production processes. Hence, it is modelled as an in situ simulation with direct input of temperature data and real-time capability. A monolithic part of the rotor blade, the main girder, is used as an example for presenting the results.

  8. Process Simulation and Modeling for Advanced Intermetallic Alloys.

    DTIC Science & Technology

    1994-06-01

    calorimetry, using a Stanton Redcroft/Omnitherm DSC 1500 thermal analysis system, was the primary experimental tool for this investigation...samples during both heating and cooling in a high purity argon atmosphere at a rate of 20 K/min. The DSC instrumental baseline was obtained using both empty...that is capable of fitting the observed data to given cell structures using a least squares procedure. RESULTS The results of the DSC observations are

  9. Satellite Constellation Optimization for Turkish Armed Forces

    DTIC Science & Technology

    2013-03-01

    capability. 29 III. OPTIMIZATION WITH STK A. ANALYSIS The goal was to minimize the number of satellites and then minimize the number of planes...www.oosa.unvienna.org/pdf/reports/ac105/AC105_1005E.pdf. Wertz, James R. and Larson, Wiley J. “Space Mission Analysis and Design (Third Edition).” Space...Systems Tool Kit software for simulation and analysis of several possible communications and remote sensing satellite constellations covering Turkish

  10. Prediction of car cabin environment by means of 1D and 3D cabin model

    NASA Astrophysics Data System (ADS)

    Fišer, J.; Pokorný, J.; Jícha, M.

    2012-04-01

    Thermal comfort, and the reduction of the energy requirements of air-conditioning systems in vehicle cabins, are currently the subject of intensive investigation. The article deals with two approaches to modelling the car cabin environment: the first model was created in the simulation language Modelica (a typical 1D approach without cabin geometry), and the second was created in the specialized software Theseus-FE (a 3D approach with cabin geometry). The performance and capabilities of these tools are demonstrated on an example car cabin, and the simulation results are compared with climate-chamber measurements of a real car cabin.
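
    The geometry-free 1D approach can be illustrated with a single lumped cabin air node driven by HVAC supply air and envelope heat exchange; all coefficients are rough assumed values, not the Modelica or Theseus-FE models.

```python
def cabin_temperature(t_out, t_supply, minutes, t0=40.0,
                      ua=40.0, m_dot_cp=120.0, cap=80e3):
    """Lumped cabin-air temperature model (deg C, one-minute steps).
    ua: envelope conductance (W/K); m_dot_cp: supply-air mass flow times
    cp (W/K); cap: cabin air thermal capacity (J/K). All values assumed."""
    t = t0
    trace = [t]
    for _ in range(minutes):
        q_hvac = m_dot_cp * (t_supply - t)   # W, supply-air heat flow
        q_env = ua * (t_out - t)             # W, envelope gain/loss
        t += 60.0 * (q_hvac + q_env) / cap   # integrate over one minute
        trace.append(t)
    return trace

# pull-down of a hot-soaked cabin on a 35 deg C day with 10 deg C supply air
trace = cabin_temperature(t_out=35.0, t_supply=10.0, minutes=30)
```

    The 3D model resolves exactly what this lumping hides: stratification, radiant surfaces, and local comfort at the occupant positions.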

  11. Study of Some Planetary Atmospheres Features by Probe Entry and Descent Simulations

    NASA Technical Reports Server (NTRS)

    Gil, P. J. S.; Rosa, P. M. B.

    2005-01-01

    Planetary atmospheres are characterized through their effects on the entry and descent trajectories of probes. Emphasis is on the most important variables that characterize an atmosphere, e.g., the density profile with altitude. Probe trajectories are numerically determined with ENTRAP, a multi-purpose computational tool under development for entry and descent trajectory simulations, capable of taking many features and perturbations into account. Real data from the Mars Pathfinder mission are used. The goal is to determine the atmospheric structure more accurately by observing real trajectories, and to establish what changes to expect in probe descent trajectories if atmospheres have different properties than those assumed initially.
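
    A sketch of the kind of entry simulation described: ballistic descent through an exponential atmosphere at a fixed flight-path angle, where changing the density scale height visibly shifts the deceleration pulse. The Mars-like scale height, ballistic coefficient, and entry state are illustrative assumptions, not ENTRAP's models or Pathfinder's actual parameters.

```python
import math

def entry_profile(v0=7500.0, h0=120e3, gamma_deg=-15.0, beta=60.0,
                  rho0=0.015, hs=11.1e3, dt=0.1):
    """Integrate dV/dt = -D/m - g*sin(gamma) and dh/dt = V*sin(gamma) for a
    ballistic entry with rho = rho0*exp(-h/hs) and ballistic coefficient
    beta = m/(Cd*A). Gamma is held fixed (straight-line approximation).
    Returns (peak deceleration m/s^2, its altitude m, final speed m/s)."""
    g = 3.71                                      # Mars surface gravity, m/s^2
    gamma = math.radians(gamma_deg)
    v, h = v0, h0
    peak_decel, peak_h = 0.0, h0
    while h > 10e3 and v > 100.0:
        rho = rho0 * math.exp(-h / hs)
        decel = 0.5 * rho * v * v / beta          # drag deceleration, m/s^2
        if decel > peak_decel:
            peak_decel, peak_h = decel, h
        v += dt * (-decel - g * math.sin(gamma))  # gamma < 0: gravity adds speed
        h += dt * v * math.sin(gamma)
    return peak_decel, peak_h, v

peak_decel, peak_h, v_final = entry_profile()
```

    The inverse problem the abstract describes runs this the other way: given a measured deceleration history, the altitude and magnitude of the pulse constrain rho0 and hs.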

  12. High-power graphic computers for visual simulation: a real-time--rendering revolution

    NASA Technical Reports Server (NTRS)

    Kaiser, M. K.

    1996-01-01

    Advances in high-end graphics computers in the past decade have made it possible to render visual scenes of incredible complexity and realism in real time. These new capabilities make it possible to manipulate and investigate the interactions of observers with their visual world in ways once only dreamed of. This paper reviews how these developments have affected two preexisting domains of behavioral research (flight simulation and motion perception) and have created a new domain (virtual environment research) which provides tools and challenges for the perceptual psychologist. Finally, the current limitations of these technologies are considered, with an eye toward how perceptual psychologists might shape future developments.

  13. Minerva: User-Centered Science Operations Software Capability for Future Human Exploration

    NASA Technical Reports Server (NTRS)

    Deans, Matthew; Marquez, Jessica J.; Cohen, Tamar; Miller, Matthew J.; Deliz, Ivonne; Hillenius, Steven; Hoffman, Jeffrey; Lee, Yeon Jin; Lees, David; Norheim, Johannes; et al.

    2017-01-01

    In June of 2016, the Biologic Analog Science Associated with Lava Terrains (BASALT) research project conducted its first field deployment, which we call BASALT-1. BASALT-1 consisted of a science-driven field campaign in a volcanic field in Idaho as a simulated human mission to Mars. Scientists and mission operators were provided a suite of ground software tools that we refer to collectively as Minerva to carry out their work. Minerva provides capabilities for traverse planning and route optimization, timeline generation and display, procedure management, execution monitoring, data archiving, visualization, and search. This paper describes the Minerva architecture, constituent components, use cases, and some preliminary findings from the BASALT-1 campaign.

  14. Computer aided systems human engineering: A hypermedia tool

    NASA Technical Reports Server (NTRS)

    Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

    1992-01-01

    The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

  15. Aeroelastic Simulation Tool for Inflatable Ballute Aerocapture

    NASA Technical Reports Server (NTRS)

    Liever, P. A.; Sheta, E. F.; Habchi, S. D.

    2006-01-01

    A multidisciplinary analysis tool is under development for predicting the impact of aeroelastic effects on the functionality of inflatable ballute aeroassist vehicles in both the continuum and rarefied flow regimes. High-fidelity modules for continuum and rarefied aerodynamics, structural dynamics, heat transfer, and computational grid deformation are coupled in an integrated multi-physics, multi-disciplinary computing environment. This flexible and extensible approach allows the integration of state-of-the-art, stand-alone NASA and industry-leading continuum and rarefied flow solvers and structural analysis codes into a computing environment in which the modules can run concurrently with synchronized data transfer. Coupled fluid-structure continuum flow demonstrations were conducted on a clamped ballute configuration. The feasibility of implementing a DSMC flow solver in the simulation framework was demonstrated, and loosely coupled rarefied flow aeroelastic demonstrations were performed. A NASA and industry technology survey identified CFD, DSMC and structural analysis codes capable of modeling non-linear shape and material response of thin-film inflated aeroshells. The simulation technology will find direct and immediate applications with NASA and industry in ongoing aerocapture technology development programs.
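
    The loosely coupled strategy described above, with fluid and structural solvers exchanging synchronized data, can be illustrated with a minimal fixed-point sketch. The load and deflection functions below are hypothetical surrogates standing in for the actual flow and structural solvers, not the tool's real interfaces:

    ```python
    def loose_fsi(pressure_of, deflect_of, x0=0.0, iters=20, relax=0.5):
        """Fixed-point loop of a loosely coupled fluid-structure scheme:
        fluid load -> structural deflection -> updated load, with
        under-relaxation of the deflection for stability."""
        x = x0
        for _ in range(iters):
            load = pressure_of(x)        # fluid solve (surrogate)
            x_new = deflect_of(load)     # structural solve (surrogate)
            x = (1 - relax) * x + relax * x_new
        return x

    # Surrogate physics: aerodynamic load relaxes as the shell deflects,
    # and deflection is proportional to load; the loop finds equilibrium.
    x = loose_fsi(lambda d: 1.0 - 0.3 * d, lambda q: 0.8 * q)
    ```

    With these surrogates the iteration converges to the equilibrium deflection d satisfying d = 0.8(1 - 0.3d); under-relaxation is what keeps such staggered schemes stable when the coupling is strong.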

  16. Non-Gaussian spatiotemporal simulation of multisite daily precipitation: downscaling framework

    NASA Astrophysics Data System (ADS)

    Ben Alaya, M. A.; Ouarda, T. B. M. J.; Chebana, F.

    2018-01-01

    Probabilistic regression approaches for downscaling daily precipitation are very useful. They provide the whole conditional distribution at each forecast step to better represent the temporal variability. The question addressed in this paper is: how to simulate spatiotemporal characteristics of multisite daily precipitation from probabilistic regression models? Recent publications point out the complexity of multisite properties of daily precipitation and highlight the need for using a non-Gaussian flexible tool. This work proposes a reasonable compromise between simplicity and flexibility avoiding model misspecification. A suitable nonparametric bootstrapping (NB) technique is adopted. A downscaling model which merges a vector generalized linear model (VGLM as a probabilistic regression tool) and the proposed bootstrapping technique is introduced to simulate realistic multisite precipitation series. The model is applied to data sets from the southern part of the province of Quebec, Canada. It is shown that the model is capable of reproducing both at-site properties and the spatial structure of daily precipitation. Results indicate the superiority of the proposed NB technique over a multivariate autoregressive Gaussian framework (i.e., a Gaussian copula).
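
    The VGLM itself is beyond a short sketch, but the core idea of a nonparametric bootstrap that preserves multisite spatial structure can be illustrated by resampling entire days, so that the cross-site dependence on every simulated day is exactly an observed pattern. The 3-site record below is invented for illustration:

    ```python
    import numpy as np

    def multisite_day_bootstrap(historical, n_days, rng=None):
        """Resample whole days (rows), so the cross-site dependence on
        each simulated day is exactly an observed spatial pattern.
        `historical` has shape (n_observed_days, n_sites)."""
        rng = np.random.default_rng(rng)
        idx = rng.integers(0, historical.shape[0], size=n_days)
        return historical[idx]

    # Invented 3-site record: wet days tend to be wet at all sites.
    hist = np.array([
        [0.0, 0.0, 0.0],
        [5.2, 4.1, 6.3],
        [0.0, 0.2, 0.0],
        [12.4, 9.8, 11.0],
    ])
    sim = multisite_day_bootstrap(hist, n_days=1000, rng=42)
    ```

    Because simulated days are drawn whole, inter-site correlation and joint wet/dry occurrence are reproduced by construction; the paper's contribution is to condition such resampling on the VGLM's predictive distributions.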

  17. Army-NASA aircrew/aircraft integration program (A3I) software detailed design document, phase 3

    NASA Technical Reports Server (NTRS)

    Banda, Carolyn; Chiu, Alex; Helms, Gretchen; Hsieh, Tehming; Lui, Andrew; Murray, Jerry; Shankar, Renuka

    1990-01-01

    The capabilities and design approach of the MIDAS (Man-machine Integration Design and Analysis System) computer-aided engineering (CAE) workstation under development by the Army-NASA Aircrew/Aircraft Integration Program is detailed. This workstation uses graphic, symbolic, and numeric prototyping tools and human performance models as part of an integrated design/analysis environment for crewstation human engineering. Developed incrementally, the requirements and design for Phase 3 (Dec. 1987 to Jun. 1989) are described. Software tools/models developed or significantly modified during this phase included: an interactive 3-D graphic cockpit design editor; multiple-perspective graphic views to observe simulation scenarios; symbolic methods to model the mission decomposition, equipment functions, pilot tasking and loading, as well as control the simulation; a 3-D dynamic anthropometric model; an intermachine communications package; and a training assessment component. These components were successfully used during Phase 3 to demonstrate the complex interactions and human engineering findings involved with a proposed cockpit communications design change in a simulated AH-64A Apache helicopter/mission that maps to empirical data from a similar study and AH-1 Cobra flight test.

  18. Development of an Unstructured, Three-Dimensional Material Response Design Tool

    NASA Technical Reports Server (NTRS)

    Schulz, Joseph; Stern, Eric; Palmer, Grant; Muppidi, Suman; Schroeder, Olivia

    2017-01-01

    A preliminary verification and validation of a new material response model is presented. This model, Icarus, is intended to serve as a design tool for the thermal protection systems of re-entry vehicles. Currently, the capability of the model is limited to simulating the pyrolysis of a material as a result of the radiative and convective surface heating imposed on the material from the surrounding high enthalpy gas. Since the major focus behind the development of Icarus has been model extensibility, the hope is that additional physics can be quickly added. This extensibility is critical since thermal protection systems are becoming increasingly complex, e.g. woven carbon polymers. Additionally, as a three-dimensional, unstructured, finite-volume model, Icarus is capable of modeling complex geometries as well as multi-dimensional physics, which have been shown to be important in some scenarios and are not captured by one-dimensional models. In this paper, the mathematical and numerical formulation is presented followed by a discussion of the software architecture and some preliminary verification and validation studies.
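
    Icarus itself is three-dimensional and unstructured; as a hedged, one-dimensional illustration of the finite-volume energy balance such models build on, the sketch below conducts an imposed surface heat flux into a slab. All material properties and heating values are invented, not Icarus inputs:

    ```python
    import numpy as np

    # Hypothetical material properties and heating (not Icarus inputs).
    k, rho, cp = 0.5, 1400.0, 1200.0   # conductivity, density, specific heat
    dx, dt = 1e-3, 0.05                # cell size (m), time step (s)
    q_surf = 5e4                       # imposed surface heat flux (W/m2)
    n = 50
    T = np.full(n, 300.0)              # initial temperature (K)

    for _ in range(400):
        flux = -k * np.diff(T) / dx            # conductive flux between cells
        dTdt = np.empty(n)
        dTdt[0] = (q_surf - flux[0]) / (rho * cp * dx)        # heated surface
        dTdt[1:-1] = (flux[:-1] - flux[1:]) / (rho * cp * dx)
        dTdt[-1] = flux[-1] / (rho * cp * dx)                 # adiabatic back
        T += dt * dTdt
    ```

    Each cell's temperature changes according to the net flux across its faces, which is the finite-volume statement of energy conservation; the real model adds pyrolysis source terms, unstructured 3-D meshes, and implicit time integration on top of this balance.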

  19. A novel methodology for in-process monitoring of flow forming

    NASA Astrophysics Data System (ADS)

    Appleby, Andrew; Conway, Alastair; Ion, William

    2017-10-01

    Flow forming (FF) is an incremental cold working process with near-net-shape forming capability. Failures by fracture due to high deformation can be unexpected and sometimes catastrophic, causing tool damage. If process failures can be identified in real time, an automatic cut-out could prevent costly tool damage. Sound and vibration monitoring is well established and commercially viable in the machining sector to detect current and incipient process failures, but not for FF. A broad-frequency microphone was used to record the sound signature of the manufacturing cycle for a series of FF parts. Parts were flow formed using single and multiple passes, and flaws were introduced into some of the parts to simulate the presence of spontaneously initiated cracks. The results show that this methodology is capable of identifying both introduced defects and spontaneous failures during flow forming. Further investigation is needed to categorise and identify different modes of failure and identify further potential applications in rotary forming.
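
    The paper's actual signature analysis is not reproduced here; as a crude illustration of the underlying idea, flagging windows of a sound recording whose energy departs from a quiet baseline, a sketch might look like this (signal and threshold are synthetic):

    ```python
    import math

    def rms(window):
        """Root-mean-square energy of a window of samples."""
        return math.sqrt(sum(x * x for x in window) / len(window))

    def detect_events(signal, win, threshold):
        """Flag window start indices whose RMS energy exceeds a baseline
        threshold: a crude stand-in for acoustic failure detection."""
        return [i for i in range(0, len(signal) - win + 1, win)
                if rms(signal[i:i + win]) > threshold]

    # Synthetic quiet signal with a burst injected to mimic a crack event.
    sig = [0.1] * 100 + [2.0] * 10 + [0.1] * 90
    events = detect_events(sig, win=10, threshold=1.0)
    ```

    A production system would work on band-limited spectral features rather than raw RMS, and would have to learn the baseline from normal forming cycles, but the detect-and-cut-out logic is the same.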

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shahidehpour, Mohammad

    Integrating 20% or more wind energy into the system and transmitting large amounts of wind energy over long distances will require a decision-making capability that can handle very large scale power systems with tens of thousands of buses and lines. There is a need to explore innovative analytical and implementation solutions for continuing reliable operations with the most economical integration of additional wind energy in power systems. A number of wind integration solution paths involve the adoption of new operating policies, dynamic scheduling of wind power across interties, pooling integration services, and adopting new transmission scheduling practices. Such practices can be examined by the decision tool developed by this project. This project developed a very efficient decision tool called Wind INtegration Simulator (WINS) and applied WINS to facilitate wind energy integration studies. WINS focused on augmenting the existing power utility capabilities to support collaborative planning, analysis, and wind integration project implementations. WINS also had the capability of simulating energy storage facilities so that feasibility studies of integrated wind energy system applications can be performed for systems with high wind energy penetrations. The development of WINS represents a major expansion of a very efficient decision tool called POwer Market Simulator (POMS), which was developed by IIT and has been used extensively for power system studies for decades.
Specifically, WINS provides the following advantages: (1) An integrated framework is included in WINS for the comprehensive modeling of DC transmission configurations, including mono-pole, bi-pole, tri-pole, back-to-back, and multi-terminal connections, as well as AC/DC converter models including current source converters (CSC) and voltage source converters (VSC); (2) An existing shortcoming of traditional decision tools for wind integration is the limited availability of user interfaces, i.e., decision results are often text-based demonstrations. WINS includes a powerful visualization tool and user interface capability for transmission analyses, planning, and assessment, which will be of great interest to power market participants, power system planners and operators, and state and federal regulatory entities; and (3) WINS can handle extended transmission models for wind integration studies. WINS models include limitations on transmission flow as well as bus voltage for analyzing power system states. The existing decision tools often consider transmission flow constraints (dc power flow) alone, which could result in the over-utilization of existing resources when analyzing wind integration. WINS can be used to assist power market participants, including transmission companies, independent system operators, power system operators in vertically integrated utilities, wind energy developers, and regulatory agencies, to analyze economics, security, and reliability of various options for wind integration, including transmission upgrades and the planning of new transmission facilities. WINS can also be used by industry for the offline training of reliability and operation personnel when analyzing wind integration uncertainties, identifying critical spots in power system operation, analyzing power system vulnerabilities, and providing credible decisions for examining operation and planning options for wind integration.
Research in this project on wind integration included (1) Development of WINS; (2) Transmission Congestion Analysis in the Eastern Interconnection; (3) Analysis of 2030 Large-Scale Wind Energy Integration in the Eastern Interconnection; (4) Large-scale Analysis of 2018 Wind Energy Integration in the Eastern U.S. Interconnection. The research resulted in 33 papers, 9 presentations, 9 PhD degrees, 4 MS degrees, and 7 awards. The education activities in this project on wind energy included (1) Wind Energy Training Facility Development; (2) Wind Energy Course Development.
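
    As a sketch of the dc power flow approximation that the abstract contrasts with WINS's extended models, the snippet below solves bus angles and line flows for a hypothetical 3-bus network (susceptances and injections are invented, not WINS data):

    ```python
    import numpy as np

    # Hypothetical 3-bus network (bus 0 is the slack bus).
    lines = [(0, 1, 10.0), (1, 2, 10.0), (0, 2, 5.0)]  # (from, to, susceptance)
    P = np.array([1.0, -0.6])      # net injections at buses 1 and 2 (p.u.)

    # Assemble the reduced susceptance matrix with the slack bus removed.
    B = np.zeros((2, 2))
    for f, t, b in lines:
        for i in (f, t):
            if i > 0:
                B[i - 1, i - 1] += b
        if f > 0 and t > 0:
            B[f - 1, t - 1] -= b
            B[t - 1, f - 1] -= b

    theta = np.linalg.solve(B, P)  # bus voltage angles (slack angle = 0)

    def ang(i):
        return 0.0 if i == 0 else theta[i - 1]

    # Dc power flow on each line: flow = b * (theta_from - theta_to)
    flows = {(f, t): b * (ang(f) - ang(t)) for f, t, b in lines}
    ```

    The dc model is linear in the angles and ignores voltage magnitudes and losses, which is exactly the limitation the abstract notes: WINS layers bus-voltage constraints on top of this flow-only picture.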

  1. A Multi-Paradigm Modeling Framework to Simulate Dynamic Reciprocity in a Bioreactor

    PubMed Central

    Kaul, Himanshu; Cui, Zhanfeng; Ventikos, Yiannis

    2013-01-01

    Despite numerous technology advances, bioreactors are still mostly utilized as functional black-boxes where trial and error eventually leads to the desirable cellular outcome. Investigators have applied various computational approaches to understand the impact the internal dynamics of such devices has on overall cell growth, but such models cannot provide a comprehensive perspective regarding the system dynamics, due to limitations inherent to the underlying approaches. In this study, a novel multi-paradigm modeling platform capable of simulating the dynamic bidirectional relationship between cells and their microenvironment is presented. Designing the modeling platform entailed fully coupling an agent-based modeling platform with a transport phenomena computational modeling framework. To demonstrate its capability, the platform was used to study the impact of bioreactor parameters on the overall cell population behavior and vice versa. In order to achieve this, virtual bioreactors were constructed and seeded. The virtual cells, guided by a set of rules involving the simulated mass transport inside the bioreactor, as well as cell-related probabilistic parameters, were capable of displaying an array of behaviors such as proliferation, migration, chemotaxis and apoptosis. In this way the platform was shown to capture not only the impact of bioreactor transport processes on cellular behavior but also the influence that cellular activity wields on that very same local mass transport, thereby influencing overall cell growth. The platform was validated by simulating cellular chemotaxis in a virtual direct visualization chamber and comparing the simulation with its experimental analogue. The results presented in this paper are in agreement with published models of similar flavor. The modeling platform can be used as a concept selection tool to optimize bioreactor design specifications. PMID:23555740
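
    A minimal sketch of the bidirectional coupling described above, assuming a toy 1-D nutrient field and point-like agents (all parameters invented, far simpler than the paper's platform), could look like:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 20                       # 1-D nutrient grid
    c = np.full(n, 1.0)          # nutrient concentration per cell
    cells = [5, 14]              # agent positions (grid indices)
    D, dt, uptake = 0.2, 1.0, 0.05

    for step in range(50):
        # Transport sub-step: explicit diffusion with no-flux boundaries.
        lap = np.empty(n)
        lap[1:-1] = c[:-2] - 2 * c[1:-1] + c[2:]
        lap[0] = c[1] - c[0]
        lap[-1] = c[-2] - c[-1]
        c = c + D * dt * lap
        # Agent sub-step: agents consume nutrient and may divide when fed,
        # so cellular activity feeds back on the local transport field.
        newborn = []
        for pos in cells:
            taken = min(uptake, c[pos])
            c[pos] -= taken
            if taken >= uptake and rng.random() < 0.1:
                newborn.append(min(n - 1, pos + 1))
        cells.extend(newborn)
    ```

    The essential feature is that the two sub-models alternate and each reads the state the other just wrote: transport shapes where agents thrive, and agent consumption reshapes the concentration field that transport then redistributes.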

  2. Numerical simulation of in-situ chemical oxidation (ISCO) and biodegradation of petroleum hydrocarbons using a coupled model for bio-geochemical reactive transport

    NASA Astrophysics Data System (ADS)

    Marin, I. S.; Molson, J. W.

    2013-05-01

    Petroleum hydrocarbons (PHCs) are a major and well-known source of groundwater contamination worldwide. They are formed by a complex mixture of hundreds of organic compounds (including BTEX - benzene, toluene, ethylbenzene and xylenes), many of which are toxic and persistent in the subsurface and pose a serious risk to human health. Several remediation technologies can be used to clean up PHC contamination. In-situ chemical oxidation (ISCO) and intrinsic bioremediation (IBR) are two promising techniques that can be applied in this case. However, the interaction of these processes with the background aquifer geochemistry and the design of an efficient treatment presents a challenge. Here we show the development and application of BIONAPL/Phreeqc, a modeling tool capable of simulating groundwater flow and contaminant transport with coupled biological and geochemical processes in porous or fractured porous media. BIONAPL/Phreeqc is based on the well-tested BIONAPL/3D model, using a powerful finite element simulation engine, capable of simulating non-aqueous phase liquid (NAPL) dissolution and density-dependent advective-dispersive transport, and solving the geochemical and kinetic processes with the library Phreeqc. To validate the model, we compared BIONAPL/Phreeqc with results from the literature for different biodegradation processes and different geometries, with good agreement. We then used the model to simulate the behavior of sodium persulfate (Na2S2O8) as an oxidant for BTEX degradation, coupled with sequential biodegradation in a 2D case, and to evaluate the effect of inorganic geochemistry reactions. The results show the advantages of a treatment train remediation scheme based on ISCO and IBR. The numerical performance and stability of the integrated BIONAPL/Phreeqc model was also verified.
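
    BIONAPL/Phreeqc couples a finite element transport engine to the Phreeqc geochemistry library; as a hedged stand-in for that coupling, the sketch below uses operator splitting: an explicit upwind advection sub-step followed by first-order decay in place of the full geochemical solve (all values invented):

    ```python
    import numpy as np

    def advect_react(c, v, k, dx, dt, c_in=0.0):
        """One operator-splitting step: explicit upwind advection,
        then first-order decay standing in for the geochemical solve."""
        c_new = c.copy()
        c_new[1:] -= v * dt / dx * (c[1:] - c[:-1])
        c_new[0] -= v * dt / dx * (c[0] - c_in)
        return c_new * np.exp(-k * dt)   # reaction sub-step

    # Hypothetical 1-D column with an initial contaminant slug at the inlet.
    c = np.zeros(50)
    c[0] = 1.0
    for _ in range(100):
        c = advect_react(c, v=0.01, k=0.02, dx=0.1, dt=1.0)
    ```

    In the real model the decay factor is replaced by a call into Phreeqc that equilibrates the full inorganic and kinetic chemistry in each cell, but the alternation of a transport step and a reaction step is the standard structure of such coupled codes.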

  3. Transport, Acceleration and Spatial Access of Solar Energetic Particles

    NASA Astrophysics Data System (ADS)

    Borovikov, D.; Sokolov, I.; Effenberger, F.; Jin, M.; Gombosi, T. I.

    2017-12-01

    Solar Energetic Particles (SEPs) are a major branch of space weather. Often driven by Coronal Mass Ejections (CMEs), SEPs have a very high destructive potential, which includes but is not limited to disrupting communication systems on Earth, inflicting harmful and potentially fatal radiation doses to crew members onboard spacecraft and, in extreme cases, to people aboard high altitude flights. However, currently the research community lacks efficient tools to predict such hazardous SEP events. Such a tool would serve as the first step towards improving humanity's preparedness for SEP events and ultimately its ability to mitigate their effects. The main goal of the presented research is to develop a computational tool that provides these capabilities and meets the community's demand. Our model has forecasting capability and can be the basis for an operational system that provides live information on current potential threats posed by SEPs, based on observations of the Sun. The tool comprises several numerical models, which are designed to simulate different physical aspects of SEPs. The background conditions in the interplanetary medium, in particular the Coronal Mass Ejection driving the particle acceleration, play a defining role and are simulated with the state-of-the-art MHD solver, Block-Adaptive-Tree Solar-wind Roe-type Upwind Scheme (BATS-R-US). The newly developed particle code, Multiple-Field-Line-Advection Model for Particle Acceleration (M-FLAMPA), simulates the actual transport and acceleration of SEPs and is coupled to the MHD code. The special property of SEPs, the tendency to follow magnetic lines of force, is fully taken advantage of in the computational model, which replaces a complicated 3-D model with a multitude of 1-D models. This approach significantly simplifies computations and improves the time performance of the overall model.
Also, it plays an important role of mapping the affected region by connecting it with the origin of SEPs at the solar surface. Our model incorporates the effects of the near-Sun field line meandering that affects the perpendicular transport of SEPs and can explain the occurrence of large longitudinal spread observed even in the early phases of such events.
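
    The multitude-of-1-D-models idea can be sketched directly: each magnetic field line is advanced as an independent 1-D advection problem. The grid, speed, and injection below are invented for illustration, not M-FLAMPA parameters:

    ```python
    import numpy as np

    def advance_line(f, u, dx, dt):
        """Explicit upwind advection of particle density f along one line."""
        g = f.copy()
        g[1:] -= u * dt / dx * (f[1:] - f[:-1])
        return g

    # A bundle of independent 1-D problems standing in for a 3-D domain.
    n_lines, n_cells = 8, 100
    lines = np.zeros((n_lines, n_cells))
    lines[:, 0] = 1.0      # particles continuously injected near the Sun
    for _ in range(200):
        for i in range(n_lines):
            lines[i] = advance_line(lines[i], u=0.4, dx=1.0, dt=1.0)
    ```

    Because the lines never exchange information, the loop over lines is trivially parallel, which is the performance advantage the abstract describes; cross-field effects such as line meandering enter through how the bundle of lines is constructed, not through coupling terms.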

  4. CFD simulation of coaxial injectors

    NASA Technical Reports Server (NTRS)

    Landrum, D. Brian

    1993-01-01

    The development of improved performance models for the Space Shuttle Main Engine (SSME) is an important, ongoing program at NASA MSFC. These models allow prediction of overall system performance, as well as analysis of run-time anomalies which might adversely affect engine performance or safety. Due to the complexity of the flow fields associated with the SSME, NASA has increasingly turned to Computational Fluid Dynamics (CFD) techniques as modeling tools. An important component of the SSME system is the fuel preburner, which consists of a cylindrical chamber with a plate containing 264 coaxial injector elements at one end. A fuel rich mixture of gaseous hydrogen and liquid oxygen is injected and combusted in the chamber. This process preheats the hydrogen fuel before it enters the main combustion chamber, powers the hydrogen turbo-pump, and provides a heat dump for nozzle cooling. Issues of interest include the temperature and pressure fields at the turbine inlet and the thermal compatibility between the preburner chamber and injector plate. Performance anomalies can occur due to incomplete combustion, blocked injector ports, etc. The performance model should include the capability to simulate the effects of these anomalies. The current approach to the numerical simulation of the SSME fuel preburner flow field is to use a global model based on the MSFC-sponsored FDNS code. This code does not have the capability to model several aspects of the problem, such as detailed modeling of the coaxial injectors. Therefore, an effort has been initiated to develop a detailed simulation of the preburner coaxial injectors and provide gas phase boundary conditions just downstream of the injector face as input to the FDNS code. This simulation should include three-dimensional geometric effects such as proximity of injectors to baffles and chamber walls and interaction between injectors.
This report describes an investigation into the numerical simulation of GH2/LOX coaxial injectors. The following sections will discuss the physical aspects of injectors, the CFD code employed, and preliminary results of a simulation of a single coaxial injector for which experimental data is available. It is hoped that this work will lay the foundation for the development of a unique and useful tool to support the SSME program.

  5. Sensor-scheduling simulation of disparate sensors for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Hobson, T.; Clarkson, I.

    2011-09-01

    The art and science of space situational awareness (SSA) has been practised and developed from the time of Sputnik. However, recent developments, such as the accelerating pace of satellite launch, the proliferation of launch-capable agencies, both commercial and sovereign, and recent well-publicised collisions involving man-made space objects, has further magnified the importance of timely and accurate SSA. The United States Strategic Command (USSTRATCOM) operates the Space Surveillance Network (SSN), a global network of sensors tasked with maintaining SSA. The rapidly increasing number of resident space objects will require commensurate improvements in the SSN. Sensors are scarce resources that must be scheduled judiciously to obtain measurements of maximum utility. Improvements in sensor scheduling and fusion can serve to reduce the number of additional sensors that may be required. Recently, Hill et al. [1] have proposed and developed a simulation environment named TASMAN (Tasking Autonomous Sensors in a Multiple Application Network) to enable testing of alternative scheduling strategies within a simulated multi-sensor, multi-target environment. TASMAN simulates a high-fidelity, hardware-in-the-loop system by running multiple machines with different roles in parallel. At present, TASMAN is limited to simulations involving electro-optic sensors. Its high fidelity is at once a feature and a limitation, since supercomputing is required to run simulations of appreciable scale. In this paper, we describe an alternative, modular and scalable SSA simulation system that can extend the work of Hill et al. with reduced complexity, albeit also with reduced fidelity. The tool has been developed in MATLAB and therefore can be run on a very wide range of computing platforms. It can also make use of MATLAB's parallel processing capabilities to obtain considerable speed-up.
The speed and flexibility so obtained can be used to quickly test scheduling algorithms even with a relatively large number of space objects. We further describe an application of the tool by exploring how the relative mixture of electro-optical and radar sensors can impact the scheduling, fusion and achievable accuracy of an SSA system. By varying the mixture of sensor types, we are able to characterise the main advantages and disadvantages of each configuration.
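
    As a toy illustration of the scheduling problem such a simulator is built to study, a greedy rule that always tasks the next observation to the most uncertain object is a common baseline. The target names, uncertainty values, and halving rule below are invented:

    ```python
    def greedy_schedule(uncertainty, n_obs):
        """Task each observation to the currently most uncertain target;
        assume an observation halves that target's uncertainty."""
        u = dict(uncertainty)
        tasking = []
        for _ in range(n_obs):
            target = max(u, key=u.get)
            tasking.append(target)
            u[target] /= 2.0
        return tasking, u

    # Invented catalogue of three objects with position uncertainties.
    tasking, u = greedy_schedule({"sat-A": 8.0, "sat-B": 3.0, "sat-C": 5.0}, 3)
    ```

    Real schedulers must also respect sensor visibility windows, slew times, and weather, which is precisely what makes a fast, scriptable simulation environment useful for comparing strategies.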

  6. Development and deployment of a water-crop-nutrient simulation model embedded in a web application

    NASA Astrophysics Data System (ADS)

    Langella, Giuliano; Basile, Angelo; Coppola, Antonio; Manna, Piero; Orefice, Nadia; Terribile, Fabio

    2016-04-01

    Scientific research on environmental and agricultural issues has long invested substantial effort in the development and application of models for prediction and simulation in spatial and temporal domains. This is achieved by studying and observing natural processes (e.g., rainfall, water and chemical transport in soils, crop growth) whose spatiotemporal behavior can be reproduced, for instance, to predict irrigation and fertilizer requirements and yield quantity/quality. In this work a mechanistic model to simulate water flow and solute transport in the soil-plant-atmosphere continuum is presented. This desktop computer program was written according to the specific requirements of web application development. The model addresses the following issues together: (a) water balance; (b) solute transport; (c) crop modelling; (d) GIS interoperability; (e) embeddability in web-based geospatial Decision Support Systems (DSS); (f) adaptability at different scales of application; and (g) ease of code modification. We maintained the desktop character of the program in order to further develop (e.g., integrate novel features) and run the key program modules for testing and validation purposes, but we also developed a middleware component that allows the model to run simulations directly over the web, with no software installation required. The GIS capabilities allow the web application to run simulations in a user-defined region of interest (delimited on a geographical map) without the need to specify the proper combination of model parameters. This is possible because the geospatial database collects information on pedology, climate, crop parameters, and soil hydraulic characteristics. Pedological attributes include the spatial distribution of key soil data such as soil profile horizons and texture. Further, hydrological parameters are selected according to knowledge of the spatial distribution of soils.
The availability and definition of these attributes in the geospatial domain allow simulation outputs at different spatial scales. Two applications were implemented using the same framework but with different configurations of the software components making up the physically based modelling chain: an irrigation tool that simulates water requirements and their timing, and a fertilization tool for optimizing mineral nitrogen additions in particular.
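
    The water-balance component can be caricatured by a single-bucket model. This is a hedged simplification for illustration, not the paper's mechanistic soil-plant-atmosphere formulation, and all forcing values are invented:

    ```python
    def bucket_water_balance(rain, et, capacity, s0=0.0):
        """Daily soil-water bucket: storage gains rain, loses ET, and
        spills to drainage once it exceeds the bucket capacity."""
        s, storage, drainage = s0, [], []
        for r, e in zip(rain, et):
            s = max(0.0, s + r - e)      # storage cannot drop below zero
            d = max(0.0, s - capacity)   # excess water drains away
            s -= d
            storage.append(s)
            drainage.append(d)
        return storage, drainage

    # Invented 4-day forcing (mm): a dry spell around one heavy rain day.
    storage, drainage = bucket_water_balance(
        rain=[10, 0, 25, 0], et=[2, 2, 2, 2], capacity=20.0)
    ```

    An irrigation tool built on such a balance would schedule water whenever simulated storage falls below a crop-specific threshold; the mechanistic model replaces the bucket with Richards-equation flow and crop water uptake.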

  7. Robotic Assisted Microsurgery - RAMS FY'97

    NASA Technical Reports Server (NTRS)

    1997-01-01

    JPL and Microdexterity Systems collaborated to develop new surgical capabilities. They developed a Robot Assisted Microsurgery (RAM) tool that lets surgeons operate on the eye, ear, brain, and blood vessels with unprecedented dexterity. A surgeon can guide the surgical instrument in 6 degrees of freedom with an accuracy of 25 microns in a 70 cu cm workspace. In 1996 a demonstration was performed to remove a microscopic particle from a simulated eyeball. In 1997, tests were performed at UCLA to compare telerobotic with manual operations. In 5 out of 7 tests, the RAM tool showed a significant improvement in precision over manual operation. New design features include: (1) amplified force feedback; (2) simultaneous slave robot instrumentation; (3) an index control switch on the master handle; and (4) tool control switches. Upgrades include: (1) an increase in computational power; and (2) installation of a hard-disk storage device for independent operation, including independent operation of the forceps. In 1997 a final demonstration used two telerobotic systems simultaneously in a microsurgery suture procedure to close a slit in a thin sheet of latex rubber, extending the capabilities of microsurgery procedures. After trials and demonstrations for the FDA are complete, the potential benefits for thousands of operations can be realized.

  8. Human-In-The-Loop Investigation of Interoperability Between Terminal Sequencing and Spacing, Automated Terminal Proximity Alert, and Wake-Separation Recategorization

    NASA Technical Reports Server (NTRS)

    Callantine, Todd J.; Bienert, Nancy; Borade, Abhay; Gabriel, Conrad; Gujral, Vimmy; Jobe, Kim; Martin, Lynne; Omar, Faisal; Prevot, Thomas; Mercer, Joey

    2016-01-01

    A human-in-the-loop simulation study addressed terminal-area controller-workstation interface variations for interoperability between three new capabilities being introduced by the FAA. The capabilities are Terminal Sequencing and Spacing (TSAS), Automated Terminal Proximity Alert (ATPA), and wake-separation recategorization, or 'RECAT.' TSAS provides controllers with Controller-Managed Spacing (CMS) tools, including slot markers, speed advisories, and early/late indications, together with runway assignments and sequence numbers. ATPA provides automatic monitor, warning, and alert cones to inform controllers about spacing between aircraft on approach. ATPA cones are sized according to RECAT, an improved method of specifying wake-separation standards. The objective of the study was to identify potential issues and provide recommendations for integrating TSAS with ATPA and RECAT. Participants controlled arrival traffic under seven different display configurations, then tested an 'exploratory' configuration developed with participant input. All the display conditions were workable and acceptable, but controllers strongly preferred having the CMS tools available on Feeder positions, and both CMS tools and ATPA available on Final positions. Controllers found the integrated systems favorable and liked being able to tailor configurations to individual preferences.

  9. Scientific Benefits of Space Science Models Archiving at Community Coordinated Modeling Center

    NASA Technical Reports Server (NTRS)

    Kuznetsova, Maria M.; Berrios, David; Chulaki, Anna; Hesse, Michael; MacNeice, Peter J.; Maddox, Marlo M.; Pulkkinen, Antti; Rastaetter, Lutz; Taktakishvili, Aleksandre

    2009-01-01

    The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. CCMC provides a web-based Run-on-Request system, by which the interested scientist can request simulations for a broad range of space science problems. To allow the models to be driven by data relevant to particular events, CCMC developed a tool that automatically downloads data from data archives and transforms them into the required formats. CCMC also provides a tailored web-based visualization interface for the model output, as well as the capability to download the simulation output in a portable format. CCMC offers a variety of visualization and output analysis tools to aid scientists in the interpretation of simulation results. In the eight years since the Run-on-Request system became available, CCMC has archived the results of almost 3000 runs covering significant space weather events and time intervals of interest identified by the community. The simulation results archived at CCMC also include a library of general-purpose runs with modeled conditions that are used for education and research. Archiving results of simulations performed in support of several Modeling Challenges helps to evaluate the progress in space weather modeling over time. We will highlight the scientific benefits of the CCMC space science model archive and discuss plans for further development of advanced methods to interact with simulation results.

  10. HF Propagation sensitivity study and system performance analysis with the Air Force Coverage Analysis Program (AFCAP)

    NASA Astrophysics Data System (ADS)

    Caton, R. G.; Colman, J. J.; Parris, R. T.; Nickish, L.; Bullock, G.

    2017-12-01

    The Air Force Research Laboratory, in collaboration with NorthWest Research Associates, is developing advanced software capabilities for high-fidelity simulations of high frequency (HF) sky wave propagation and performance analysis of HF systems. Based on the HiCIRF (High-frequency Channel Impulse Response Function) platform [Nickisch et al., doi:10.1029/2011RS004928], the new Air Force Coverage Analysis Program (AFCAP) provides the modular capabilities necessary for a comprehensive sensitivity study of the large number of variables that define simulations of HF propagation modes. In this paper, we report on an initial exercise of AFCAP to analyze the sensitivity of the tool to various environmental and geophysical parameters. Through examination of the channel scattering function and amplitude-range-Doppler output on two-way propagation paths with injected target signals, we compare simulated returns over a range of geophysical conditions as well as varying definitions for environmental noise, meteor clutter, and sea state models for Bragg backscatter. We also investigate the impact of including clutter effects due to field-aligned backscatter from small-scale ionization structures at varied levels of severity as defined climatologically by the WideBand Model (WBMOD). In the absence of additional user-provided information, AFCAP relies on the International Reference Ionosphere (IRI) model to define the ionospheric state for use in 2D ray tracing algorithms. Because the AFCAP architecture includes the option of inserting a user-defined gridded ionospheric representation, we compare output from the tool using the IRI and ionospheric definitions from assimilative models such as GPSII (GPS Ionospheric Inversion).

  11. Making objective summaries of climate model behavior more accessible

    NASA Astrophysics Data System (ADS)

    Gleckler, P. J.

    2016-12-01

    For multiple reasons, a more efficient and systematic evaluation of publicly available climate model simulations is urgently needed. The IPCC, national assessments, and an assortment of other public and policy-driven needs place taxing demands on researchers. While cutting-edge research is essential to meeting these needs, so too are results from well-established analysis, and these should be more efficiently produced, widely accessible, and highly traceable. Furthermore, the number of simulations used by the research community is already large and expected to increase dramatically with the 6th phase of the Coupled Model Intercomparison Project (CMIP6). To help meet the demands on the research community and synthesize results from the rapidly expanding number and complexity of model simulations, well-established characteristics from all CMIP DECK (Diagnosis, Evaluation and Characterization of Klima) experiments will be routinely produced and made accessible. This presentation highlights the PCMDI Metrics Package (PMP), a capability designed to provide a diverse suite of objective summary statistics across spatial and temporal scales, gauging the agreement between models and observations. In addition to the PMP, ESMValTool is being developed to broadly diagnose CMIP simulations, and a variety of other packages target specialized sets of analysis. The challenges and opportunities of working towards coordinating these community-based capabilities will be discussed.
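    As a rough illustration of the kind of objective summary statistic such packages compute when gauging model-observation agreement (the function, field values, and statistic names below are hypothetical, not the actual PMP API):

```python
import numpy as np

def summary_stats(model, obs):
    """Objective summary statistics comparing a model field to observations.

    Illustrative sketch only -- not the PCMDI Metrics Package API.
    """
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    diff = model - obs
    return {
        "bias": float(diff.mean()),                  # mean error
        "rmse": float(np.sqrt((diff ** 2).mean())),  # root-mean-square error
        "corr": float(np.corrcoef(model.ravel(), obs.ravel())[0, 1]),
    }

# Hypothetical annual-mean values at four grid points
stats = summary_stats([1.0, 2.0, 3.0, 4.0], [1.5, 2.0, 2.5, 4.5])
print(stats)
```

    Real metrics packages aggregate such statistics across many variables, regions, and seasons, but each entry reduces to comparisons of this form.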

  12. A Cellular Automata-based Model for Simulating Restitution Property in a Single Heart Cell.

    PubMed

    Sabzpoushan, Seyed Hojjat; Pourhasanzade, Fateme

    2011-01-01

    Ventricular fibrillation is the cause of most sudden cardiac deaths. Restitution is one of the specific properties of the ventricular cell, and recent findings have clearly demonstrated a correlation between the slope of the restitution curve and ventricular fibrillation; modeling cellular restitution is therefore of high importance. A cellular automaton is a powerful tool for simulating complex phenomena in a simple language: a lattice of cells in which the behavior of each cell is determined by the behavior of its neighboring cells together with the automata rule. In this paper, a simple model is presented for simulating the restitution property of a single cardiac cell using cellular automata. First, two state variables, action potential and recovery, are introduced into the automata model. Next, the automata rule is determined, and the recovery variable is defined in such a way that restitution emerges. To evaluate the proposed model, the restitution curve generated in our study is compared with restitution curves from the experimental findings of established sources. Our findings indicate that the presented model is not only capable of simulating restitution in a cardiac cell but also of regulating the restitution curve.
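    The two-variable automaton idea above can be sketched as follows. This is a loose illustration of the mechanism, not the paper's model: the update rule, thresholds, and time constants are invented for demonstration, with the action-potential duration (APD) made to grow with the recovery time so that a restitution-like relation appears.

```python
# Minimal cellular-automaton sketch of a single cardiac cell with two state
# variables (action potential v and recovery). All parameters are illustrative.

def simulate(stimulus_times, n_steps=300, apd_max=30, tau=40):
    """Return the APD of each beat; APD shortens after short diastolic
    intervals, producing a restitution-like relation."""
    v = 0           # action potential state (0 = rest, 1 = excited)
    recovery = tau  # recovery variable: time since last repolarization
    apds, fire_step = [], None
    for t in range(n_steps):
        if v == 0:
            recovery += 1
            if t in stimulus_times and recovery >= 5:   # refractory gate
                # Automata rule: APD grows with recovery (restitution)
                current_apd = int(apd_max * recovery / (recovery + tau))
                v, fire_step = 1, t
        elif t - fire_step >= current_apd:              # repolarize
            v, recovery = 0, 0
            apds.append(current_apd)
    return apds

# Shorter pacing interval -> shorter second APD (restitution behavior)
print(simulate({10, 120}), simulate({10, 50}))  # → [16, 21] [16, 11]
```

    The second stimulus arriving after a shorter diastolic interval yields a shorter action potential, which is the qualitative shape of a restitution curve.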

  13. A Generalized Decision Framework Using Multi-objective Optimization for Water Resources Planning

    NASA Astrophysics Data System (ADS)

    Basdekas, L.; Stewart, N.; Triana, E.

    2013-12-01

    Colorado Springs Utilities (CSU) is currently engaged in an Integrated Water Resource Plan (IWRP) to address the complex planning scenarios, across multiple time scales, currently faced by CSU. The modeling framework developed for the IWRP uses a flexible, data-centered Decision Support System (DSS) with a MODSIM-based modeling system representing the operation of the current CSU raw water system, coupled with a state-of-the-art multi-objective optimization algorithm. Three basic components are required for the framework, which can be implemented for planning horizons ranging from seasonal to interdecadal. First, a water resources system model is required that is capable of reasonable system simulation and can resolve performance metrics at the appropriate temporal and spatial scales of interest. The system model should be an existing simulation model, or one developed during the planning process with stakeholders, so that 'buy-in' has already been achieved. Second, a hydrologic scenario tool capable of generating a range of plausible inflows for the planning period of interest is required. This may include paleo-informed or climate-change-informed sequences. Third, a multi-objective optimization model that can be wrapped around the system simulation model is required. The new generation of multi-objective optimization models does not require parameterization, which greatly reduces problem complexity. Bridging the gap between research and practice will be evident as we use a case study from CSU's planning process to demonstrate this framework with specific competing water management objectives. Careful formulation of objective functions, choice of decision variables, and system constraints will be discussed. Rather than treating results as theoretically Pareto optimal in a planning process, we use the powerful multi-objective optimization models as tools to move more efficiently and effectively out of the inferior decision space.
The use of this framework will help CSU evaluate tradeoffs in a continually changing world.
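    The core of the multi-objective idea referenced above is Pareto dominance: a candidate plan is eliminated only if another plan is at least as good in every objective and strictly better in one. A minimal sketch (plan names and objective values are hypothetical; real frameworks wrap this around a full system simulation):

```python
# Filter candidate plans down to the non-dominated (Pareto) set.
# All objectives are assumed to be minimized (e.g., cost, shortage risk).

def dominates(a, b):
    """True if plan a is at least as good as b in every objective
    and strictly better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(plans):
    return [p for p in plans
            if not any(dominates(q, p) for q in plans if q is not p)]

# (cost, shortage risk) for four hypothetical supply portfolios
plans = [(4, 1), (3, 3), (5, 2), (2, 4)]
print(pareto_front(plans))  # → [(4, 1), (3, 3), (2, 4)]
```

    Here (5, 2) is dropped because (4, 1) beats it on both objectives; the remaining plans represent genuine tradeoffs for decision makers to weigh.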

  14. Dynamics modeling and loads analysis of an offshore floating wind turbine

    NASA Astrophysics Data System (ADS)

    Jonkman, Jason Mark

    The vast deepwater wind resource represents a potential to use offshore floating wind turbines to power much of the world with renewable energy. Many floating wind turbine concepts have been proposed, but dynamics models, which account for the wind inflow, aerodynamics, elasticity, and controls of the wind turbine, along with the incident waves, sea current, hydrodynamics, and platform and mooring dynamics of the floater, were needed to determine their technical and economic feasibility. This work presents the development of a comprehensive simulation tool for modeling the coupled dynamic response of offshore floating wind turbines, the verification of the simulation tool through model-to-model comparisons, and the application of the simulation tool to an integrated loads analysis for one of the promising system concepts. A fully coupled aero-hydro-servo-elastic simulation tool was developed with enough sophistication to address the limitations of previous frequency- and time-domain studies and to have the features required to perform loads analyses for a variety of wind turbine, support platform, and mooring system configurations. The simulation capability was tested using model-to-model comparisons. The favorable results of all of the verification exercises provided confidence to perform more thorough analyses. The simulation tool was then applied in a preliminary loads analysis of a wind turbine supported by a barge with catenary moorings. A barge platform was chosen because of its simplicity in design, fabrication, and installation. The loads analysis aimed to characterize the dynamic response and to identify potential loads and instabilities resulting from the dynamic couplings between the turbine and the floating barge in the presence of combined wind and wave excitation. 
The coupling between the wind turbine response and the barge-pitch motion, in particular, produced larger extreme loads in the floating turbine than experienced by an equivalent land-based turbine. Instabilities were also found in the system. The influence of conventional wind turbine blade-pitch control actions on the pitch damping of the floating turbine was also assessed. Design modifications for reducing the platform motions, improving the turbine response, and eliminating the instabilities are suggested. These suggestions are aimed at obtaining cost-effective designs that achieve favorable performance while maintaining structural integrity.

  15. Integrated surface/subsurface permafrost thermal hydrology: Model formulation and proof-of-concept simulations

    DOE PAGES

    Painter, Scott L.; Coon, Ethan T.; Atchley, Adam L.; ...

    2016-08-11

    The need to understand potential climate impacts and feedbacks in Arctic regions has prompted recent interest in modeling of permafrost dynamics in a warming climate. A new fine-scale integrated surface/subsurface thermal hydrology modeling capability is described and demonstrated in proof-of-concept simulations. The new modeling capability combines a surface energy balance model with recently developed three-dimensional subsurface thermal hydrology models and new models for nonisothermal surface water flows and snow distribution in the microtopography. Surface water flows are modeled using the diffusion wave equation extended to include energy transport and phase change of ponded water. Variation of snow depth in the microtopography, physically the result of wind scour, is also modeled heuristically with a diffusion wave equation. The multiple surface and subsurface processes are implemented by leveraging highly parallel community software. Fully integrated thermal hydrology simulations on the tilted open book catchment, an important test case for integrated surface/subsurface flow modeling, are presented. Fine-scale 100-year projections of the integrated permafrost thermal hydrological system on an ice wedge polygon at Barrow, Alaska, in a warming climate are also presented. Finally, these simulations demonstrate the feasibility of microtopography-resolving, process-rich simulations as a tool to help understand the possible future evolution of the carbon-rich Arctic tundra in a warming climate.
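    For reference, a standard form of the diffusion wave approximation for overland flow with a Manning-type flux is shown below. The symbols are generic assumptions, not necessarily the paper's notation: h is ponded depth, z is surface elevation, n is the Manning roughness coefficient, and S is a source term (the paper extends such an equation with energy transport and phase change).

```latex
\frac{\partial h}{\partial t} + \nabla \cdot \mathbf{q} = S,
\qquad
\mathbf{q} = -\,\frac{h^{5/3}}{n\,\sqrt{\lvert \nabla (h+z) \rvert}}\;\nabla (h+z)
```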

  16. Computer-Aided System Engineering and Analysis (CASE/A) Programmer's Manual, Version 5.0

    NASA Technical Reports Server (NTRS)

    Knox, J. C.

    1996-01-01

    The Computer Aided System Engineering and Analysis (CASE/A) Version 5.0 Programmer's Manual provides the programmer and user with information regarding the internal structure of the CASE/A 5.0 software system. CASE/A 5.0 is a trade study tool that provides modeling/simulation capabilities for analyzing environmental control and life support systems and active thermal control systems. CASE/A has been successfully used in studies such as the evaluation of carbon dioxide removal in the space station. CASE/A provides a graphical and command-driven interface for the user. This interface allows the user to construct a model by placing equipment components in a graphical layout of the system hardware, then connecting the components via flow streams and defining their operating parameters. Once the equipment is placed, the simulation time and other control parameters can be set to run the simulation based on the model constructed. After completion of the simulation, graphical plots or text files can be obtained for evaluation of the simulation results over time. Additionally, users can control the simulation and extract information at various times during the simulation (e.g., control equipment operating parameters over the simulation time or extract plot data) by using "User Operations (OPS) Code." This OPS code is written in FORTRAN with a canned set of utility subroutines for performing common tasks. CASE/A version 5.0 runs under the VAX VMS(Trademark) environment. It utilizes the Tektronix 4014(Trademark) graphics display system and the VT100(Trademark) text manipulation/display system.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stimpson, Shane G; Powers, Jeffrey J; Clarno, Kevin T

    The Consortium for Advanced Simulation of Light Water Reactors (CASL) aims to provide high-fidelity, multiphysics simulations of light water reactors (LWRs) by coupling a variety of codes within the Virtual Environment for Reactor Analysis (VERA). One of the primary goals of CASL is to predict local cladding failure through pellet-clad interaction (PCI). This capability is currently being pursued through several different approaches, such as with Tiamat, a simulation tool within VERA that more tightly couples the MPACT neutron transport solver, the CTF thermal hydraulics solver, and the MOOSE-based Bison-CASL fuel performance code. However, the process in this paper focuses on running fuel performance calculations with Bison-CASL to predict PCI using the multicycle output data from coupled neutron transport/thermal hydraulics simulations. In recent work within CASL, Watts Bar Unit 1 has been simulated over 12 cycles using the VERA core simulator capability based on MPACT and CTF. Using the output from these simulations, Bison-CASL results can be obtained without rerunning all 12 cycles, while providing some insight into PCI indicators. Multicycle Bison-CASL results are presented and compared against results from the FRAPCON fuel performance code. There are several quantities of interest in considering PCI and subsequent fuel rod failures, such as the clad hoop stress and maximum centerline fuel temperature, particularly as a function of time. Bison-CASL performs single-rod simulations using representative power and temperature distributions, providing high-resolution results for these and a number of other quantities. This will assist in identifying fuel rods as potential failure locations for use in further analyses.

  18. High Fidelity Simulations of Plume Impingement to the International Space Station

    NASA Technical Reports Server (NTRS)

    Lumpkin, Forrest E., III; Marichalar, Jeremiah; Stewart, Benedicte D.

    2012-01-01

    With the retirement of the Space Shuttle, the United States now depends on recently developed commercial spacecraft to supply the International Space Station (ISS) with cargo. These new vehicles supplement ones from international partners, including the Russian Progress, the European Autonomous Transfer Vehicle (ATV), and the Japanese H-II Transfer Vehicle (HTV). Furthermore, to carry crew to the ISS and supplement the capability currently provided exclusively by the Russian Soyuz, new designs and a refinement of a cargo vehicle design are in work. Many of these designs include features such as nozzle scarfing or simultaneous firing of multiple thrusters, resulting in a wide variety of complex plumes impinging upon the ISS. Therefore, to ensure safe "proximity operations" near the ISS, the need for accurate and efficient high-fidelity simulation of plume impingement to the ISS is as high as ever. A capability combining computational fluid dynamics (CFD) and Direct Simulation Monte Carlo (DSMC) techniques has been developed to properly model the large density variations encountered as the plume expands from the high pressure in the combustion chamber to the near-vacuum conditions at the orbiting altitude of the ISS. Details of the computational tools employed by this method, including recent software enhancements and the best practices needed to achieve accurate simulations, are discussed. Several recent examples of the application of this high-fidelity capability are presented. These examples highlight many of the real-world, complex features of plume impingement that occur when "visiting vehicles" operate in the vicinity of the ISS.

  19. Progress in and prospects for fluvial flood modelling.

    PubMed

    Wheater, H S

    2002-07-15

    Recent floods in the UK have raised public and political awareness of flood risk. There is an increasing recognition that flood management and land-use planning are linked, and that decision-support modelling tools are required to address issues of climate and land-use change for integrated catchment management. In this paper, the scientific context for fluvial flood modelling is discussed, current modelling capability is considered and research challenges are identified. Priorities include (i) appropriate representation of spatial precipitation, including scenarios of climate change; (ii) development of a national capability for continuous hydrological simulation of ungauged catchments; (iii) improved scientific understanding of impacts of agricultural land-use and land-management change, and the development of new modelling approaches to represent those impacts; (iv) improved representation of urban flooding, at both local and catchment scale; (v) appropriate parametrizations for hydraulic simulation of in-channel and flood-plain flows, assimilating available ground observations and remotely sensed data; and (vi) a flexible decision-support modelling framework, incorporating developments in computing, data availability, data assimilation and uncertainty analysis.

  20. Developing the User Experience for a Next Generation Nuclear Fuel Cycle Simulator (NGFCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, Paul H.; Schneider, Erich; Pascucci, Valerio

    This project made substantial progress on its original aim of providing a modern user experience for nuclear fuel cycle analysis while also creating a robust and functional next-generation fuel cycle simulator. The Cyclus kernel experienced a dramatic clarification of its interfaces and data model, becoming a full-fledged agent-based framework with strong support for third-party developers of novel archetypes. The most important contribution of this project to the development of Cyclus was the introduction of tools to facilitate archetype development. These include automated code generation of routine archetype components, metadata annotations to provide reflection and rich description of each data member's purpose, and mechanisms for input validation and output of complex data. A comprehensive social science investigation of decision makers' interests in nuclear fuel cycles, and specifically their interests in nuclear fuel cycle simulators (NFCSs) as tools for understanding nuclear fuel cycle options, was conducted. This included document review and analysis, stakeholder interviews, and a survey of decision makers. This information was used to study the role of visualization formats and features in communicating information about nuclear fuel cycles. A flexible and user-friendly tool, Cycic, was developed for building Cyclus analysis models, featuring a drag-and-drop interface and automatic input form generation for novel archetypes. Cycic allows users to design fuel cycles from arbitrary collections of facilities for the first time, with mechanisms that contribute to consistency within that fuel cycle. Interacting with some of the metadata capabilities introduced in the above-mentioned tools to support archetype development, Cycic also automates the generation of user input forms for novel archetypes with little to no special knowledge required of the archetype developers. 
    Translation of the fundamental metrics of Cyclus into more interesting quantities is accomplished in the Cymetric Python package. This package is specifically designed to support the introduction of new metrics by building upon existing metrics. This concept allows for multiple dependencies and encourages building complex metrics out of incremental transformations of those prior metrics. New archetype developers can contribute their own archetype-specific metrics using the same capability. A simple demonstration of this capability focused on generating time-dependent cash flows for reactor deployment that could then be analyzed in different ways. Cyclist, a dedicated application for exploration of Cyclus results, was developed. Its primary capabilities at this stage are best suited to experienced fuel cycle analysts, but it provides a basic platform for simpler visualizations for other audiences. An important part of its interface is the ability to fluidly examine different slices of what is fundamentally a five-dimensional sparse data set. A drag-and-drop interface simplifies the process of selecting which data are displayed in the plot as well as which dimensions are being used.
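    The incremental-metric idea described above, new metrics built as transformations of previously defined ones, can be sketched as a small dependency registry. The registry design, metric names, and data layout below are illustrative assumptions, not the actual Cymetric API.

```python
# Each metric declares the metrics it depends on; evaluation resolves the
# dependency chain so complex metrics build on simpler ones incrementally.

METRICS = {}

def metric(name, depends=()):
    """Register a metric computed from raw output plus its dependencies."""
    def wrap(fn):
        METRICS[name] = (depends, fn)
        return fn
    return wrap

def evaluate(name, raw):
    depends, fn = METRICS[name]
    return fn(raw, *[evaluate(d, raw) for d in depends])

@metric("power")                      # base metric taken from raw output
def power(raw):
    return raw["power"]

@metric("cash_flow", depends=("power",))
def cash_flow(raw, power):            # built on top of the "power" metric
    return [p * raw["price"] for p in power]

# Hypothetical per-timestep reactor output and electricity price
raw = {"power": [10, 20, 30], "price": 2.0}
print(evaluate("cash_flow", raw))     # → [20.0, 40.0, 60.0]
```

    An archetype developer can register further metrics on top of `cash_flow` (e.g., discounted cash flow) without touching the metrics beneath it, which is the composition pattern the abstract describes.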

Top