Sample records for distributed simulation environment

  1. Theory of quantized systems: formal basis for DEVS/HLA distributed simulation environment

    NASA Astrophysics Data System (ADS)

    Zeigler, Bernard P.; Lee, J. S.

    1998-08-01

In the context of a DARPA ASTT project, we are developing an HLA-compliant distributed simulation environment based on the DEVS formalism. This environment will provide a user-friendly, high-level tool-set for developing interoperable discrete and continuous simulation models. One application is the study of contract-based predictive filtering. This paper presents a new approach to predictive filtering based on a process called 'quantization' to reduce state update transmission. Quantization, which generates state updates only at quantum level crossings, abstracts a sender model into a DEVS representation. This affords an alternative, efficient approach to embedding continuous models within distributed discrete event simulations. Applications of quantization to message traffic reduction are discussed. The theory has been validated by DEVSJAVA simulations of test cases. It will be subject to further test in actual distributed simulations using the DEVS/HLA modeling and simulation environment.
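The core idea in this abstract, transmitting a state update only when the state crosses a quantum-level boundary, can be sketched in a few lines. This is an illustrative stand-in, not the DEVS/HLA implementation; the function name and signal are invented for the example.

```python
def quantized_updates(samples, quantum):
    """Yield (index, value) only when the state crosses a quantum level."""
    last_level = None
    for i, x in enumerate(samples):
        level = int(x // quantum)      # which quantum band the state is in
        if level != last_level:        # boundary crossing -> transmit update
            last_level = level
            yield i, x

# A slowly rising state: 50 samples, but only a few level crossings to send.
signal = [k / 10 for k in range(50)]
updates = list(quantized_updates(signal, quantum=1.0))
```

For this signal the 50 samples collapse to 5 transmitted updates, one per level crossing, which is the message-traffic reduction the abstract describes.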

  2. A Distributed Simulation Facility to Support Human Factors Research in Advanced Air Transportation Technology

    NASA Technical Reports Server (NTRS)

    Amonlirdviman, Keith; Farley, Todd C.; Hansman, R. John, Jr.; Ladik, John F.; Sherer, Dana Z.

    1998-01-01

A distributed real-time simulation of the civil air traffic environment developed to support human factors research in advanced air transportation technology is presented. The distributed environment is based on a custom simulation architecture designed for simplicity and flexibility in human experiments. Standard Internet protocols are used to create the distributed environment, linking an advanced cockpit simulator, an Air Traffic Control simulator, and a pseudo-aircraft control and simulation management station. The pseudo-aircraft control station also functions as a scenario design tool for coordinating human factors experiments. This station incorporates a pseudo-pilot interface designed to reduce workload for human operators piloting multiple aircraft simultaneously in real time. The application of this distributed simulation facility to support a study of the effect of shared information (via air-ground datalink) on pilot/controller shared situation awareness and re-route negotiation is also presented.

  3. Using IMPRINT to Guide Experimental Design with Simulated Task Environments

    DTIC Science & Technology

    2015-06-18

USING IMPRINT TO GUIDE EXPERIMENTAL DESIGN WITH SIMULATED TASK ENVIRONMENTS. Thesis, AFIT-ENG-MS-15-J-052, June 2015. DISTRIBUTION STATEMENT A: APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED.

  4. Visual Computing Environment

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Putt, Charles W.

    1997-01-01

    The Visual Computing Environment (VCE) is a NASA Lewis Research Center project to develop a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis. The objectives of VCE are to (1) develop a visual computing environment for controlling the execution of individual simulation codes that are running in parallel and are distributed on heterogeneous host machines in a networked environment, (2) develop numerical coupling algorithms for interchanging boundary conditions between codes with arbitrary grid matching and different levels of dimensionality, (3) provide a graphical interface for simulation setup and control, and (4) provide tools for online visualization and plotting. VCE was designed to provide a distributed, object-oriented environment. Mechanisms are provided for creating and manipulating objects, such as grids, boundary conditions, and solution data. This environment includes parallel virtual machine (PVM) for distributed processing. 
Users can interactively select and couple any set of codes that have been modified to run in a parallel distributed fashion on a cluster of heterogeneous workstations. A scripting facility allows users to dictate the sequence of events that make up the particular simulation.
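Coupling codes "with arbitrary grid matching and different levels of dimensionality", as the VCE objectives state, implies interpolating boundary data from one code's grid onto another's. A minimal 1-D sketch, with all names and values invented for illustration:

```python
def interp_boundary(src_x, src_v, dst_x):
    """Linearly interpolate values src_v, given at sorted points src_x,
    onto the destination boundary points dst_x."""
    out = []
    for x in dst_x:
        # index of the source interval bracketing x (clamped at the ends)
        j = max(i for i, sx in enumerate(src_x) if sx <= x)
        j = min(j, len(src_x) - 2)
        t = (x - src_x[j]) / (src_x[j + 1] - src_x[j])
        out.append(src_v[j] * (1 - t) + src_v[j + 1] * t)
    return out

# A coarse CFD boundary mapped onto a finer structural-analysis boundary.
coarse_x = [0.0, 0.5, 1.0]
coarse_v = [10.0, 20.0, 30.0]
fine_x = [0.0, 0.25, 0.5, 0.75, 1.0]
vals = interp_boundary(coarse_x, coarse_v, fine_x)
```

Production couplers also handle conservation and higher dimensions; this only shows the basic grid-mismatch interchange the objective refers to.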

  5. Full immersion simulation: validation of a distributed simulation environment for technical and non-technical skills training in Urology.

    PubMed

    Brewin, James; Tang, Jessica; Dasgupta, Prokar; Khan, Muhammad S; Ahmed, Kamran; Bello, Fernando; Kneebone, Roger; Jaye, Peter

    2015-07-01

To evaluate the face, content and construct validity of the distributed simulation (DS) environment for technical and non-technical skills training in endourology. To evaluate the educational impact of DS for urology training. DS offers a portable, low-cost simulated operating room environment that can be set up in any open space. A prospective mixed methods design using established validation methodology was conducted in this simulated environment with 10 experienced and 10 trainee urologists. All participants performed a simulated prostate resection in the DS environment. Outcome measures included surveys to evaluate the DS, as well as comparative analyses of experienced and trainee urologists' performance using real-time and 'blinded' video analysis and validated performance metrics. Non-parametric statistical methods were used to compare differences between groups. The DS environment demonstrated face, content and construct validity for both non-technical and technical skills. Kirkpatrick level 1 evidence for the educational impact of the DS environment was shown. Further studies are needed to evaluate the effect of simulated operating room training on real operating room performance. This study has shown the validity of the DS environment for non-technical, as well as technical skills training. DS-based simulation appears to be a valuable addition to traditional classroom-based simulation training. © 2014 The Authors. BJU International © 2014 BJU International. Published by John Wiley & Sons Ltd.

  6. A Java-Enabled Interactive Graphical Gas Turbine Propulsion System Simulator

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Afjeh, Abdollah A.

    1997-01-01

This paper describes a gas turbine simulation system which utilizes the newly developed Java language environment software system. The system provides an interactive graphical environment which allows the quick and efficient construction and analysis of arbitrary gas turbine propulsion systems. The simulation system couples a graphical user interface, developed using the Java Abstract Window Toolkit, and a transient, space-averaged, aero-thermodynamic gas turbine analysis method, both entirely coded in the Java language. The combined package provides analytical, graphical and data management tools which allow the user to construct and control engine simulations by manipulating graphical objects on the computer display screen. Distributed simulations, including parallel processing and distributed database access across the Internet and World-Wide Web (WWW), are made possible through services provided by the Java environment.

  7. Log-Normal Distribution of Cosmic Voids in Simulations and Mocks

    NASA Astrophysics Data System (ADS)

    Russell, E.; Pycke, J.-R.

    2017-01-01

    Following up on previous studies, we complete here a full analysis of the void size distributions of the Cosmic Void Catalog based on three different simulation and mock catalogs: dark matter (DM), haloes, and galaxies. Based on this analysis, we attempt to answer two questions: Is a three-parameter log-normal distribution a good candidate to satisfy the void size distributions obtained from different types of environments? Is there a direct relation between the shape parameters of the void size distribution and the environmental effects? In an attempt to answer these questions, we find here that all void size distributions of these data samples satisfy the three-parameter log-normal distribution whether the environment is dominated by DM, haloes, or galaxies. In addition, the shape parameters of the three-parameter log-normal void size distribution seem highly affected by environment, particularly existing substructures. Therefore, we show two quantitative relations given by linear equations between the skewness and the maximum tree depth, and between the variance of the void size distribution and the maximum tree depth, directly from the simulated data. In addition to this, we find that the percentage of voids with nonzero central density in the data sets has a critical importance. If the number of voids with nonzero central density reaches ≥3.84% in a simulation/mock sample, then a second population is observed in the void size distributions. This second population emerges as a second peak in the log-normal void size distribution at larger radius.
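The paper's central object, a three-parameter log-normal (a log-normal in ln(x - x0) with a location shift x0), is easy to demonstrate with synthetic data. The sketch below is purely illustrative and is not the paper's fitting pipeline: it draws synthetic "void radii" and recovers the shape parameters by taking logs after subtracting an assumed-known location.

```python
import math
import random
import statistics

def sample_lognormal3(mu, sigma, x0, n, seed=1):
    """Draw n samples of X = x0 + exp(mu + sigma * Z), Z standard normal."""
    rng = random.Random(seed)
    return [x0 + math.exp(rng.gauss(mu, sigma)) for _ in range(n)]

def fit_lognormal3(xs, x0):
    """Estimate (mu, sigma) of ln(x - x0), assuming the location x0 is known."""
    logs = [math.log(x - x0) for x in xs]
    return statistics.mean(logs), statistics.pstdev(logs)

radii = sample_lognormal3(mu=2.0, sigma=0.5, x0=1.0, n=20000)
mu_hat, sigma_hat = fit_lognormal3(radii, x0=1.0)
```

In practice the location parameter is fitted as well (e.g. by maximum likelihood over all three parameters); fixing it here keeps the example short.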

  8. Collaborative modeling: the missing piece of distributed simulation

    NASA Astrophysics Data System (ADS)

    Sarjoughian, Hessam S.; Zeigler, Bernard P.

    1999-06-01

    The Department of Defense overarching goal of performing distributed simulation by overcoming geographic and time constraints has brought the problem of distributed modeling to the forefront. The High Level Architecture standard is primarily intended for simulation interoperability. However, as indicated, the existence of a distributed modeling infrastructure plays a fundamental and central role in supporting the development of distributed simulations. In this paper, we describe some fundamental distributed modeling concepts and their implications for constructing successful distributed simulations. In addition, we discuss the Collaborative DEVS Modeling environment that has been devised to enable graphically dispersed modelers to collaborate and synthesize modular and hierarchical models. We provide an actual example of the use of Collaborative DEVS Modeler in application to a project involving corporate partners developing an HLA-compliant distributed simulation exercise.

  9. An IP-Based Software System for Real-time, Closed Loop, Multi-Spacecraft Mission Simulations

    NASA Technical Reports Server (NTRS)

    Cary, Everett; Davis, George; Higinbotham, John; Burns, Richard; Hogie, Keith; Hallahan, Francis

    2003-01-01

This viewgraph presentation provides information on the architecture of a computerized testbed for simulating Distributed Space Systems (DSS) for controlling spacecraft flying in formation. The presentation also discusses and diagrams the Distributed Synthesis Environment (DSE) for simulating and planning DSS missions.

  10. Coupling indoor airflow, HVAC, control and building envelope heat transfer in the Modelica Buildings library

    DOE PAGES

    Zuo, Wangda; Wetter, Michael; Tian, Wei; ...

    2015-07-13

Here, this paper describes a coupled dynamic simulation of an indoor environment with heating, ventilation, and air conditioning (HVAC) systems, controls and building envelope heat transfer. The coupled simulation can be used for the design and control of ventilation systems with stratified air distributions. Those systems are commonly used to reduce building energy consumption while improving the indoor environment quality. The indoor environment was simulated using the fast fluid dynamics (FFD) simulation programme. The building fabric heat transfer, HVAC and control system were modelled using the Modelica Buildings library. After presenting the concept, the mathematical algorithm and the implementation of the coupled simulation were introduced. The coupled FFD–Modelica simulation was then evaluated using three examples of room ventilation with complex flow distributions with and without feedback control. Lastly, further research and development needs were also discussed.
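A coupled dynamic simulation of this kind advances two models with a shared step and exchanges interface data at each synchronization point. The sketch below is a toy illustration of that loop, not the FFD–Modelica coupling itself: both "solvers" are invented one-line stand-ins (a proportional heater and a lumped room heat balance).

```python
def hvac_step(room_temp, setpoint=21.0, gain=500.0):
    """Toy controller/HVAC model: heating power from a proportional law (W)."""
    return max(0.0, gain * (setpoint - room_temp))

def room_step(room_temp, power, dt, capacity=1e5, loss=50.0, t_out=5.0):
    """Toy room-air model: explicit-Euler lumped heat balance (degC)."""
    dT = (power - loss * (room_temp - t_out)) / capacity
    return room_temp + dT * dt

temp, dt = 15.0, 60.0          # initial air temperature (degC), step (s)
for _ in range(240):           # four simulated hours
    power = hvac_step(temp)            # controller sees the room state...
    temp = room_step(temp, power, dt)  # ...room sees the control action
```

The room settles near the balance point of heating gain and envelope loss (about 19.5 degC here); in the real coupling each "step" is a full CFD/Modelica solve rather than one arithmetic update.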

  11. Coupling indoor airflow, HVAC, control and building envelope heat transfer in the Modelica Buildings library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zuo, Wangda; Wetter, Michael; Tian, Wei

Here, this paper describes a coupled dynamic simulation of an indoor environment with heating, ventilation, and air conditioning (HVAC) systems, controls and building envelope heat transfer. The coupled simulation can be used for the design and control of ventilation systems with stratified air distributions. Those systems are commonly used to reduce building energy consumption while improving the indoor environment quality. The indoor environment was simulated using the fast fluid dynamics (FFD) simulation programme. The building fabric heat transfer, HVAC and control system were modelled using the Modelica Buildings library. After presenting the concept, the mathematical algorithm and the implementation of the coupled simulation were introduced. The coupled FFD–Modelica simulation was then evaluated using three examples of room ventilation with complex flow distributions with and without feedback control. Lastly, further research and development needs were also discussed.

  12. LOG-NORMAL DISTRIBUTION OF COSMIC VOIDS IN SIMULATIONS AND MOCKS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Russell, E.; Pycke, J.-R., E-mail: er111@nyu.edu, E-mail: jrp15@nyu.edu

    2017-01-20

Following up on previous studies, we complete here a full analysis of the void size distributions of the Cosmic Void Catalog based on three different simulation and mock catalogs: dark matter (DM), haloes, and galaxies. Based on this analysis, we attempt to answer two questions: Is a three-parameter log-normal distribution a good candidate to satisfy the void size distributions obtained from different types of environments? Is there a direct relation between the shape parameters of the void size distribution and the environmental effects? In an attempt to answer these questions, we find here that all void size distributions of these data samples satisfy the three-parameter log-normal distribution whether the environment is dominated by DM, haloes, or galaxies. In addition, the shape parameters of the three-parameter log-normal void size distribution seem highly affected by environment, particularly existing substructures. Therefore, we show two quantitative relations given by linear equations between the skewness and the maximum tree depth, and between the variance of the void size distribution and the maximum tree depth, directly from the simulated data. In addition to this, we find that the percentage of voids with nonzero central density in the data sets has a critical importance. If the number of voids with nonzero central density reaches ≥3.84% in a simulation/mock sample, then a second population is observed in the void size distributions. This second population emerges as a second peak in the log-normal void size distribution at larger radius.

  13. Secure Large-Scale Airport Simulations Using Distributed Computational Resources

    NASA Technical Reports Server (NTRS)

    McDermott, William J.; Maluf, David A.; Gawdiak, Yuri; Tran, Peter; Clancy, Dan (Technical Monitor)

    2001-01-01

To fully conduct research that will support the far-term concepts, technologies and methods required to improve the safety of Air Transportation, a simulation environment of the requisite degree of fidelity must first be in place. The Virtual National Airspace Simulation (VNAS) will provide the underlying infrastructure necessary for such a simulation system. Aerospace-specific knowledge management services such as intelligent data-integration middleware will support the management of information associated with this complex and critically important operational environment. This simulation environment, in conjunction with a distributed network of supercomputers, and high-speed network connections to aircraft, and to Federal Aviation Administration (FAA), airline and other data sources, will provide the capability to continuously monitor and measure operational performance against expected performance. The VNAS will also provide the tools to use this performance baseline to obtain a perspective of what is happening today and of the potential impact of proposed changes before they are introduced into the system.

  14. Arcade: A Web-Java Based Framework for Distributed Computing

    NASA Technical Reports Server (NTRS)

    Chen, Zhikai; Maly, Kurt; Mehrotra, Piyush; Zubair, Mohammad; Bushnell, Dennis M. (Technical Monitor)

    2000-01-01

    Distributed heterogeneous environments are being increasingly used to execute a variety of large size simulations and computational problems. We are developing Arcade, a web-based environment to design, execute, monitor, and control distributed applications. These targeted applications consist of independent heterogeneous modules which can be executed on a distributed heterogeneous environment. In this paper we describe the overall design of the system and discuss the prototype implementation of the core functionalities required to support such a framework.

  15. Distributed dynamic simulations of networked control and building performance applications.

    PubMed

    Yahiaoui, Azzedine

    2018-02-01

    The use of computer-based automation and control systems for smart sustainable buildings, often so-called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the minimum energy consumption possible, and in doing so generally refers to Building Automation and Control Systems (BACS) architecture. Instead of costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and improve the functions of the BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment with the capability of representing the BACS architecture in simulation by run-time coupling two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design in this paper.
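"Run-time coupling two or more different software tools over a network", as this abstract puts it, reduces at its core to a lock-step request/reply exchange between processes. The sketch below is a minimal, invented illustration (toy message format, toy temperature values), not the paper's coupling middleware: one thread serves a sensor reading per step over TCP, and the client requests one reading per simulation step.

```python
import json
import socket
import threading

# Bind and listen in the main thread so the client cannot race the listener.
srv = socket.socket()
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind(("127.0.0.1", 0))           # port 0: let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]

def sensor_server(values):
    """Serve one sensor value per 'step' request, then close."""
    conn, _ = srv.accept()
    for v in values:
        conn.recv(64)                # wait for the peer's step request
        conn.sendall(json.dumps({"temp": v}).encode())
    conn.close()
    srv.close()

t = threading.Thread(target=sensor_server, args=([20.0, 20.5, 21.0],))
t.start()

cli = socket.socket()
cli.connect(("127.0.0.1", port))
temps = []
for _ in range(3):
    cli.sendall(b"step")             # lock-step: one request, one reply
    temps.append(json.loads(cli.recv(64))["temp"])
cli.close()
t.join()
```

Real co-simulation middleware adds time management, message framing, and fault handling on top of this bare exchange.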

  16. Distributed dynamic simulations of networked control and building performance applications

    PubMed Central

    Yahiaoui, Azzedine

    2017-01-01

    The use of computer-based automation and control systems for smart sustainable buildings, often so-called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the minimum energy consumption possible, and in doing so generally refers to Building Automation and Control Systems (BACS) architecture. Instead of costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and improve the functions of the BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment with the capability of representing the BACS architecture in simulation by run-time coupling two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design in this paper. PMID:29568135

  17. A Process for Comparing Dynamics of Distributed Space Systems Simulations

    NASA Technical Reports Server (NTRS)

    Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.

    2009-01-01

The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.

  18. An Integrated Modeling and Simulation Methodology for Intelligent Systems Design and Testing

    DTIC Science & Technology

    2002-08-01

simulation and actual execution. KEYWORDS: Model Continuity, Modeling, Simulation, Experimental Frame, Real Time Systems, Intelligent Systems ... the methodology for a stand-alone real time system. Then it will scale up to distributed real time systems. For both systems, step-wise simulation ... MODEL CONTINUITY: Intelligent real time systems monitor, respond to, or control an external environment. This environment is connected to the digital

  19. Flexible Simulation E-Learning Environment for Studying Digital Circuits and Possibilities for It Deployment as Semantic Web Service

    ERIC Educational Resources Information Center

    Radoyska, P.; Ivanova, T.; Spasova, N.

    2011-01-01

    In this article we present a partially realized project for building a distributed learning environment for studying digital circuits Test and Diagnostics at TU-Sofia. We describe the main requirements for this environment, substantiate the developer platform choice, and present our simulation and circuit parameter calculation tools.…

  20. About Distributed Simulation-based Optimization of Forming Processes using a Grid Architecture

    NASA Astrophysics Data System (ADS)

    Grauer, Manfred; Barth, Thomas

    2004-06-01

Permanently increasing complexity of products and their manufacturing processes combined with a shorter "time-to-market" leads to more and more use of simulation and optimization software systems for product design. Finding a "good" design of a product implies the solution of computationally expensive optimization problems based on the results of simulation. Due to the computational load caused by the solution of these problems, the requirements on the Information & Telecommunication (IT) infrastructure of an enterprise or research facility are shifting from stand-alone resources towards the integration of software and hardware resources in a distributed environment for high-performance computing. Resources can either comprise software systems, hardware systems, or communication networks. An appropriate IT-infrastructure must provide the means to integrate all these resources and enable their use even across a network to cope with requirements from geographically distributed scenarios, e.g. in computational engineering and/or collaborative engineering. Integrating experts' knowledge into the optimization process is essential in order to reduce the complexity caused by the number of design variables and the high dimensionality of the design space. Hence, utilization of knowledge-based systems must be supported by providing data management facilities as a basis for knowledge extraction from product data. In this paper, the focus is put on a distributed problem solving environment (PSE) capable of providing access to a variety of necessary resources and services. A distributed approach integrating simulation and optimization on a network of workstations and cluster systems is presented. For geometry generation the CAD-system CATIA is used which is coupled with the FEM-simulation system INDEED for simulation of sheet-metal forming processes and the problem solving environment OpTiX for distributed optimization.
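The pattern described here, farming out expensive simulation runs across a pool of machines to drive design optimization, can be sketched with a worker pool standing in for the grid. Everything below is invented for illustration (the "simulation" is a toy cost function over a made-up thickness/radius design space), not the CATIA/INDEED/OpTiX coupling:

```python
from concurrent.futures import ThreadPoolExecutor

def forming_simulation(design):
    """Toy stand-in for an expensive FEM run: cost of a (thickness, radius)
    sheet-metal design, minimized at (1.2, 4.0)."""
    thickness, radius = design
    return (thickness - 1.2) ** 2 + (radius - 4.0) ** 2

# Candidate designs to evaluate (a simple grid over the design space).
candidates = [(t / 10, r) for t in range(8, 17) for r in (3.0, 4.0, 5.0)]

# The pool plays the role of the distributed workstations/cluster nodes.
with ThreadPoolExecutor(max_workers=4) as pool:
    costs = list(pool.map(forming_simulation, candidates))

best = candidates[costs.index(min(costs))]
```

A real PSE would dispatch each evaluation to a remote solver and feed results back into an iterative optimizer rather than scanning a fixed grid.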

  1. Virtual Collaborative Simulation Environment for Integrated Product and Process Development

    NASA Technical Reports Server (NTRS)

    Gulli, Michael A.

    1997-01-01

Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and entire factories and enterprises in one seamless simulation environment.

  2. Distributed and collaborative synthetic environments

    NASA Technical Reports Server (NTRS)

    Bajaj, Chandrajit L.; Bernardini, Fausto

    1995-01-01

    Fast graphics workstations and increased computing power, together with improved interface technologies, have created new and diverse possibilities for developing and interacting with synthetic environments. A synthetic environment system is generally characterized by input/output devices that constitute the interface between the human senses and the synthetic environment generated by the computer; and a computation system running a real-time simulation of the environment. A basic need of a synthetic environment system is that of giving the user a plausible reproduction of the visual aspect of the objects with which he is interacting. The goal of our Shastra research project is to provide a substrate of geometric data structures and algorithms which allow the distributed construction and modification of the environment, efficient querying of objects attributes, collaborative interaction with the environment, fast computation of collision detection and visibility information for efficient dynamic simulation and real-time scene display. In particular, we address the following issues: (1) A geometric framework for modeling and visualizing synthetic environments and interacting with them. We highlight the functions required for the geometric engine of a synthetic environment system. (2) A distribution and collaboration substrate that supports construction, modification, and interaction with synthetic environments on networked desktop machines.
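The "fast computation of collision detection" that the Shastra substrate must provide typically begins with a broad-phase test such as axis-aligned bounding-box (AABB) overlap. The sketch below is a generic illustration of that first pass, not Shastra's actual data structure:

```python
def aabb_overlap(a, b):
    """True if two axis-aligned boxes overlap.
    Each box is ((xmin, ymin, zmin), (xmax, ymax, zmax))."""
    (a_lo, a_hi), (b_lo, b_hi) = a, b
    # Boxes overlap iff their extents intersect on every axis.
    return all(a_lo[i] <= b_hi[i] and b_lo[i] <= a_hi[i] for i in range(3))

box1 = ((0, 0, 0), (1, 1, 1))
box2 = ((0.5, 0.5, 0.5), (2, 2, 2))   # overlaps box1
box3 = ((3, 3, 3), (4, 4, 4))         # disjoint from box1
```

Pairs that pass this cheap test are then handed to an exact narrow-phase check against the actual geometry, which keeps dynamic simulation and display interactive.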

  3. A Distributed Simulation Software System for Multi-Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Burns, Richard; Davis, George; Cary, Everett

    2003-01-01

    The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture, and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements, with functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archival and distribution addressed.

  4. Particle Simulations of the Guard Electrode Effects on the Photoelectron Distribution Around an Electric Field Sensor

    NASA Astrophysics Data System (ADS)

    Miyake, Y.; Usui, H.; Kojima, H.

    2010-12-01

    In tenuous space plasma environment, photoelectrons emitted due to solar illumination produce a high-density photoelectron cloud localized in the vicinity of a spacecraft body and an electric field sensor. The photoelectron current emitted from the sensor has also received considerable attention because it becomes a primary factor in determining floating potentials of the sunlit spacecraft and sensor bodies. Considering the fact that asymmetric photoelectron distribution between sunlit and sunless sides of the spacecraft occasionally causes a spurious sunward electric field, we require quantitative evaluation of the photoelectron distribution around the spacecraft and its influence on electric field measurements by means of a numerical approach. In the current study, we applied the Particle-in-Cell plasma simulation to the analysis of the photoelectron environment around spacecraft. By using the PIC modeling, we can self-consistently consider the plasma kinetics. This enables us to simulate the formation of the photoelectron cloud as well as the spacecraft and sensor charging in a self-consistent manner. We report the progress of an analysis on photoelectron environment around MEFISTO, which is an electric field instrument for the BepiColombo/MMO spacecraft to Mercury’s magnetosphere. The photoelectron guard electrode is a key technology for ensuring an optimum photoelectron environment. We show some simulation results on the guard electrode effects on surrounding photoelectrons and discuss a guard operation condition for producing the optimum photoelectron environment. We also deal with another important issue, that is, how the guard electrode can mitigate an undesirable influence of an asymmetric photoelectron distribution on electric field measurements.

  5. Development of Virtual Airspace Simulation Technology - Real-Time (VAST-RT) Capability 2 and Experimental Plans

    NASA Technical Reports Server (NTRS)

    Lehmer, R.; Ingram, C.; Jovic, S.; Alderete, J.; Brown, D.; Carpenter, D.; LaForce, S.; Panda, R.; Walker, J.; Chaplin, P.

    2006-01-01

    The Virtual Airspace Simulation Technology - Real-Time (VAST-RT) Project, an element of NASA's Virtual Airspace Modeling and Simulation (VAMS) Project, has been developing a distributed simulation capability that supports an extensible and expandable real-time, human-in-the-loop airspace simulation environment. The VAST-RT system architecture is based on the DoD High Level Architecture (HLA) and the VAST-RT HLA Toolbox, a common interface implementation that incorporates a number of novel design features. The scope of the initial VAST-RT integration activity (Capability 1) included the high-fidelity human-in-the-loop simulation facilities located at NASA/Ames Research Center and medium-fidelity pseudo-piloted target generators, such as the Airspace Traffic Generator (ATG) being developed as part of VAST-RT, as well as other real-time tools. This capability has been demonstrated in a gate-to-gate simulation. VAST-RT's Capability 2A has recently been completed, and this paper discusses the improved integration of the real-time assets into VAST-RT, including the development of tools that integrate data collected across the simulation environment into a single data set for the researcher. Current plans for the completion of the VAST-RT distributed simulation environment (Capability 2B) and its use to evaluate future airspace capacity-enhancing concepts being developed by VAMS are discussed. Additionally, the simulation environment's application to other airspace and airport research projects is addressed.

  6. An Internet Protocol-Based Software System for Real-Time, Closed-Loop, Multi-Spacecraft Mission Simulation Applications

    NASA Technical Reports Server (NTRS)

    Davis, George; Cary, Everett; Higinbotham, John; Burns, Richard; Hogie, Keith; Hallahan, Francis

    2003-01-01

    The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture, and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements, with functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archival and distribution addressed.

  7. Distributed Engine Control Empirical/Analytical Verification Tools

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of surviving the harsh engine operating environment while providing decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to easily assemble a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration of all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating distributed engine control system (DCS) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and the hardware simulator provide the capability to simulate virtual subcomponents, as well as to swap in actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the C-MAPSS engine simulator and controlled through Simulink.
The distributed engine control simulation, evaluation, and analysis technology provides unique capabilities to study the effects of a given change to the control system in the context of the distributed paradigm. The simulation tool supports treatment of all components within the control system, both virtual and real; these include the communication data network, smart sensor and actuator nodes, the centralized control system (the FADEC, or full-authority digital engine control), and the aircraft engine itself. The DECsim tool allows simulation-based prototyping of control laws, control architectures, and decentralization strategies before hardware is integrated into the system. With the configuration specified, the simulator allows a variety of key factors to be systematically assessed, including control system performance, reliability, weight, and bandwidth utilization.
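The kind of question this simulator supports — for example, how data-network latency between a smart sensor node and the controller degrades closed-loop behavior — can be sketched with a toy first-order plant under a PI controller. The plant, gains, and delay model below are invented for illustration and have nothing to do with C-MAPSS40K specifics:

```python
import collections

class DelayLine:
    """Toy network link: delivers a sensor sample `delay` control ticks late."""
    def __init__(self, delay, initial=0.0):
        self.delay = delay
        self.buf = collections.deque([initial] * delay)

    def send(self, value):
        if self.delay == 0:
            return value
        self.buf.append(value)
        return self.buf.popleft()

def run_loop(network_delay, steps=600, dt=0.01, setpoint=1.0):
    """First-order plant dx/dt = (u - x)/tau under a PI controller whose
    feedback measurement crosses a delayed 'network' link. Returns final x."""
    tau, kp, ki = 0.1, 2.0, 5.0   # invented plant and controller constants
    x, integ = 0.0, 0.0
    link = DelayLine(network_delay)
    for _ in range(steps):
        measured = link.send(x)          # smart-sensor sample crosses the network
        err = setpoint - measured
        integ += err * dt
        u = kp * err + ki * integ
        x += dt * (u - x) / tau          # Euler step of the plant
    return x
```

With zero delay the loop settles at the setpoint; increasing `network_delay` makes the same gains ring or go unstable, which is the trade-off a DHIL environment lets designers quantify before committing to hardware.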

  8. Studying distributed cognition of simulation-based team training with DiCoT.

    PubMed

    Rybing, Jonas; Nilsson, Heléne; Jonson, Carl-Oscar; Bang, Magnus

    2016-03-01

    Health care organizations employ simulation-based team training (SBTT) to improve skill, communication and coordination in a broad range of critical care contexts. Quantitative approaches, such as team performance measurements, are predominantly used to measure SBTTs' effectiveness. However, a practical evaluation method that examines how this approach supports cognition and teamwork is missing. We have applied Distributed Cognition for Teamwork (DiCoT), a method for analysing the cognition and collaboration aspects of work settings, with the purpose of assessing the methodology's usefulness for evaluating SBTTs. In a case study, we observed and analysed four Emergo Train System® simulation exercises in which medical professionals trained emergency response routines. The study suggests that DiCoT is an applicable and learnable tool for determining key distributed cognition attributes of SBTTs that are important for the simulation validity of training environments. Moreover, we discuss and exemplify how DiCoT supports the design of SBTTs with a focus on transfer and validity characteristics. Practitioner Summary: In this study, we have evaluated a method to assess simulation-based team training environments from a cognitive ergonomics perspective. In a case study, we analysed Distributed Cognition for Teamwork (DiCoT) by applying it to the Emergo Train System®. We conclude that DiCoT is useful for SBTT evaluation and simulator (re)design.

  9. Distributed collaborative environments for predictive battlespace awareness

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2003-09-01

    The past decade has produced significant changes in the conduct of military operations: asymmetric warfare, the reliance on dynamic coalitions, stringent rules of engagement, increased concern about collateral damage, and the need for sustained air operations. Mission commanders need to assimilate a tremendous amount of information, make quick-response decisions, and quantify the effects of those decisions in the face of uncertainty. Situational assessment is crucial in understanding the battlespace. Decision support tools in a distributed collaborative environment offer the capability of decomposing complex multitask processes and distributing them over a dynamic set of execution assets that include modeling, simulations, and analysis tools. Decision support technologies can semi-automate activities, such as analysis and planning, that have a reasonably well-defined process, and can provide machine-level interfaces to refine the myriad of information that the commander must fuse. Collaborative environments provide the framework and integrate models, simulations, and domain-specific decision support tools for the sharing and exchanging of data, information, knowledge, and actions. This paper describes ongoing AFRL research efforts in applying distributed collaborative environments to predictive battlespace awareness.

  10. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    NASA Astrophysics Data System (ADS)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that harness the computing power of millions of computers on the Internet, and to use them to run large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to those of native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily allow visitors to volunteer their computing resources to help run advanced hydrological models and simulations. Because the system is web based, users can start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational units. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
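The record's platform runs JavaScript in browsers against a relational-database queue; the core server-side idea — hand out work units under a lease and re-issue them when an unreliable volunteer disappears — can be sketched language-agnostically. The sketch below is in Python with invented names, not the paper's implementation:

```python
import collections
import time

class VolunteerQueue:
    """Toy volunteer-computing task queue with leases: a task handed to a
    volunteer node is re-issued if no result returns before the lease expires."""
    def __init__(self, tasks, lease_seconds=30.0):
        self.pending = collections.deque(tasks)
        self.leased = {}                  # task -> lease deadline
        self.lease_seconds = lease_seconds
        self.results = {}

    def checkout(self, now=None):
        """Hand the next task to a volunteer; reclaim expired leases first."""
        now = time.time() if now is None else now
        for task, deadline in list(self.leased.items()):
            if deadline < now:            # volunteer dropped out: re-queue
                del self.leased[task]
                self.pending.append(task)
        if not self.pending:
            return None
        task = self.pending.popleft()
        self.leased[task] = now + self.lease_seconds
        return task

    def submit(self, task, result):
        """Accept a result from a volunteer and release its lease."""
        self.leased.pop(task, None)
        self.results[task] = result

    def done(self):
        return not self.pending and not self.leased
```

In the real platform the queue would live in the relational database and the "tasks" would be small spatial tiles of the hydrologic model domain; the lease-and-requeue discipline is what makes thousands of anonymous, unreliable browser nodes usable.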

  11. Separation and reconstruction of high pressure water-jet reflective sound signal based on ICA

    NASA Astrophysics Data System (ADS)

    Yang, Hongtao; Sun, Yuling; Li, Meng; Zhang, Dongsu; Wu, Tianfeng

    2011-12-01

    The impact of a high-pressure water-jet on targets of different materials produces different reflected, mixed sound. In order to reconstruct the distribution of reflected sound signals along the linear detecting line accurately, and to separate environmental noise effectively, the mixed sound signals acquired by a linear microphone array were processed with ICA. The basic principle of ICA and the FastICA algorithm are described in detail. A simulation experiment was designed: the environmental noise was simulated using band-limited white noise, and the reflected sound signal was simulated using a pulse signal. The attenuation of the reflected sound signal over different transmission distances was simulated by weighting the signal with different coefficients. The mixed signals acquired by the linear microphone array were synthesized from the above simulated signals, then whitened and separated by ICA. The final results verified that environmental noise separation and reconstruction of the sound distribution along the detecting line can be realized effectively.
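The whiten-then-unmix pipeline the record describes can be sketched end to end in plain Python. Below, a pulse train stands in for the reflected signal and plain uniform noise simplifies the band-limited noise; the 2x2 mixing matrix is invented, and the "FastICA" step is replaced by a brute-force scan of rotations maximizing total |kurtosis| (the same non-Gaussianity contrast, not the paper's fixed-point iteration):

```python
import math
import random

def center(x):
    m = sum(x) / len(x)
    return [v - m for v in x]

def whiten(x1, x2):
    """Closed-form 2x2 whitening: project onto covariance eigenvectors and
    scale each direction to unit variance (inputs must be centered)."""
    n = len(x1)
    a = sum(v * v for v in x1) / n
    c = sum(v * v for v in x2) / n
    b = sum(u * v for u, v in zip(x1, x2)) / n
    d = math.sqrt(((a - c) / 2) ** 2 + b * b)
    l1, l2 = (a + c) / 2 + d, (a + c) / 2 - d
    t = 0.5 * math.atan2(2 * b, a - c)
    ct, st = math.cos(t), math.sin(t)
    z1 = [(ct * u + st * v) / math.sqrt(l1) for u, v in zip(x1, x2)]
    z2 = [(-st * u + ct * v) / math.sqrt(l2) for u, v in zip(x1, x2)]
    return z1, z2

def kurt(x):
    n = len(x)
    return sum(v ** 4 for v in x) / n - 3 * (sum(v * v for v in x) / n) ** 2

def separate(x1, x2, steps=180):
    """After whitening, scan rotations and keep the most non-Gaussian one
    (max total |kurtosis|); that rotation recovers the independent sources."""
    z1, z2 = whiten(center(x1), center(x2))
    best, pair = -1.0, (z1, z2)
    for k in range(steps):
        t = math.pi * k / (2 * steps)
        ct, st = math.cos(t), math.sin(t)
        y1 = [ct * u + st * v for u, v in zip(z1, z2)]
        y2 = [-st * u + ct * v for u, v in zip(z1, z2)]
        score = abs(kurt(y1)) + abs(kurt(y2))
        if score > best:
            best, pair = score, (y1, y2)
    return pair

def corr(a, b):
    """Pearson correlation, used to check recovery up to sign/permutation."""
    da, db = center(a), center(b)
    num = sum(u * v for u, v in zip(da, db))
    return num / math.sqrt(sum(u * u for u in da) * sum(v * v for v in db))
```

ICA recovers the sources only up to permutation and sign, so a recovered component is validated by its |correlation| with each simulated source.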

  12. EDMC: An enhanced distributed multi-channel anti-collision algorithm for RFID reader system

    NASA Astrophysics Data System (ADS)

    Zhang, YuJing; Cui, Yinghua

    2017-05-01

    In this paper, we propose an enhanced distributed multi-channel reader anti-collision algorithm (EDMC) for RFID environments, based on the existing distributed multi-channel reader anti-collision algorithm for RFID environments (DiMCA). We propose a monitoring method that lets a reader decide whether it has received the latest control message after selecting a data channel. Simulation results show that EDMC improves interrogation delay.
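The record does not give EDMC's mechanics, but the underlying motivation — more data channels mean fewer reader collisions and shorter interrogation delay — can be illustrated with a generic slotted random-channel model (this is an invented illustration, not DiMCA or EDMC):

```python
import random

def mean_interrogation_delay(n_readers, n_channels, trials=400, seed=1):
    """Slotted sketch: each pending reader picks a random data channel every
    slot and succeeds when it is alone on that channel. Returns the mean
    number of slots until all readers have finished interrogating."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        pending = n_readers
        slots = 0
        while pending:
            slots += 1
            picks = [rng.randrange(n_channels) for _ in range(pending)]
            # readers alone on their chosen channel succeed this slot
            pending -= sum(1 for ch in picks if picks.count(ch) == 1)
        total += slots
    return total / trials
```

Running it with 8 readers shows a large delay gap between 2 and 8 available channels; algorithms like DiMCA/EDMC aim to approach the multi-channel limit while coordinating readers through a control channel.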

  13. Pedestrian simulation and distribution in urban space based on visibility analysis and agent simulation

    NASA Astrophysics Data System (ADS)

    Ying, Shen; Li, Lin; Gao, Yurong

    2009-10-01

    Spatial visibility analysis is an important approach to pedestrian behavior because visual perception of space is the most direct way we obtain environment information and navigate our actions. Based on agent modeling and a top-down method, this paper develops a framework for analyzing visibility-dependent pedestrian flow. We use viewsheds in the visibility analysis and impose the resulting parameters on agent simulation to direct agent motion in urban space. We analyze pedestrian behavior at the micro- and macro-scales of urban open space. An individual agent uses visual affordance to determine its direction of motion in a micro-scale urban street or district. At the macro-scale, we compare the distribution of pedestrian flow with the urban configuration, and mine the relationship between pedestrian flow and the distribution of urban facilities and functions. The paper first computes visibility at vantage points in urban open space, such as a street network, and quantifies the visibility parameters. Multiple agents then use these visibility parameters to decide their directions of motion, and through multi-agent simulation the pedestrian flow finally reaches a stable state in the urban environment. We then compare the morphology of the visibility parameters and the pedestrian distribution with the urban function and facility layout to confirm their consistency, which can support decision making in urban design.
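The "visual affordance steers motion" idea can be caricatured on a toy grid: an agent repeatedly steps toward the direction with the deepest unobstructed view. The grid, the eight-direction ray casting, and the tie-breaking rule below are all invented for illustration, not the paper's viewshed model:

```python
# 8 candidate directions: N, S, W, E and the four diagonals
DIRS = [(-1, 0), (1, 0), (0, -1), (0, 1), (-1, -1), (-1, 1), (1, -1), (1, 1)]

def visible_depth(grid, r, c, dr, dc):
    """Number of free cells ('.') seen from (r, c) along direction (dr, dc)
    before a wall ('#') or the grid boundary blocks the view."""
    depth = 0
    r, c = r + dr, c + dc
    while 0 <= r < len(grid) and 0 <= c < len(grid[0]) and grid[r][c] == '.':
        depth += 1
        r, c = r + dr, c + dc
    return depth

def step(grid, r, c):
    """Move one cell toward the most open (deepest visible) direction."""
    best = max(DIRS, key=lambda d: visible_depth(grid, r, c, d[0], d[1]))
    if visible_depth(grid, r, c, best[0], best[1]) == 0:
        return r, c                      # boxed in: stay put
    return r + best[0], c + best[1]

def walk(grid, r, c, steps):
    """Trace one agent; the visited cells are a toy 'pedestrian flow'."""
    path = [(r, c)]
    for _ in range(steps):
        r, c = step(grid, r, c)
        path.append((r, c))
    return path
```

Accumulating the visit counts of many such agents over a street-network grid gives the kind of stable pedestrian distribution the paper compares against facility layout.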

  14. Distributed collaborative environments for virtual capability-based planning

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2003-09-01

    Distributed collaboration is an emerging technology that will significantly change how decisions are made in the 21st century. Collaboration involves two or more geographically dispersed individuals working together to share and exchange data, information, knowledge, and actions. The marriage of information, collaboration, and simulation technologies provides the decision maker with a collaborative virtual environment for planning and decision support. This paper reviews research focused on applying an open-standards, agent-based framework with integrated modeling and simulation to a new Air Force initiative in capability-based planning, and on the ability to implement it in a distributed virtual environment. The Virtual Capability Planning effort will provide decision-quality knowledge for Air Force resource allocation and investment planning, including examination of proposed capabilities and the cost of alternative approaches, the impact of technologies, identification of primary risk drivers, and creation of executable acquisition strategies. The transformed Air Force business processes are enabled by iterative use of constructive and virtual modeling, simulation, and analysis together with information technology. These tools are applied collaboratively via a technical framework by all the affected stakeholders - warfighter, laboratory, product center, logistics center, test center, and primary contractor.

  15. Evolution of A Distributed Live, Virtual, Constructive Environment for Human in the Loop Unmanned Aircraft Testing

    NASA Technical Reports Server (NTRS)

    Murphy, James R.; Otto, Neil M.

    2017-01-01

    NASA's Unmanned Aircraft Systems Integration in the National Airspace System Project is conducting human in the loop simulations and flight testing intended to reduce barriers associated with enabling routine airspace access for unmanned aircraft. The primary focus of these tests is interaction of the unmanned aircraft pilot with the display of detect and avoid alerting and guidance information. The project's integrated test and evaluation team was charged with developing the test infrastructure. As with any development effort, compromises in the underlying system architecture and design were made to allow for the rapid prototyping and open-ended nature of the research. In order to accommodate these design choices, a distributed test environment was developed incorporating Live, Virtual, Constructive (LVC) concepts. The LVC components form the core infrastructure supporting simulation of UAS operations by integrating live and virtual aircraft in a realistic air traffic environment. This LVC infrastructure enables efficient testing by leveraging existing assets distributed across multiple NASA Centers. Using standard LVC concepts enables future integration with existing simulation infrastructure.

  16. Evolution of A Distributed Live, Virtual, Constructive Environment for Human in the Loop Unmanned Aircraft Testing

    NASA Technical Reports Server (NTRS)

    Murphy, Jim; Otto, Neil

    2017-01-01

    NASA's Unmanned Aircraft Systems Integration in the National Airspace System Project is conducting human in the loop simulations and flight testing intended to reduce barriers associated with enabling routine airspace access for unmanned aircraft. The primary focus of these tests is interaction of the unmanned aircraft pilot with the display of detect and avoid alerting and guidance information. The project's integrated test and evaluation team was charged with developing the test infrastructure. As with any development effort, compromises in the underlying system architecture and design were made to allow for the rapid prototyping and open-ended nature of the research. In order to accommodate these design choices, a distributed test environment was developed incorporating Live, Virtual, Constructive (LVC) concepts. The LVC components form the core infrastructure supporting simulation of UAS operations by integrating live and virtual aircraft in a realistic air traffic environment. This LVC infrastructure enables efficient testing by leveraging existing assets distributed across multiple NASA Centers. Using standard LVC concepts enables future integration with existing simulation infrastructure.

  17. The Evolution of Constructivist Learning Environments: Immersion in Distributed, Virtual Worlds.

    ERIC Educational Resources Information Center

    Dede, Chris

    1995-01-01

    Discusses the evolution of constructivist learning environments and examines the collaboration of simulated software models, virtual environments, and evolving mental models via immersion in artificial realities. A sidebar gives a realistic example of a student navigating through cyberspace. (JMV)

  18. Online compensation for target motion with scanned particle beams: simulation environment.

    PubMed

    Li, Qiang; Groezinger, Sven Oliver; Haberer, Thomas; Rietzel, Eike; Kraft, Gerhard

    2004-07-21

    Target motion is one of the major limitations of any high-precision radiation therapy. With advanced active beam delivery techniques, such as the magnetic raster scanning system for particle irradiation, the interplay between the time-dependent beam and target positions heavily distorts the applied dose distribution. This paper presents a simulation environment in which the time-dependent effect of target motion on heavy-ion irradiation can be calculated with dynamically scanned ion beams. In an extension of the existing treatment planning software for ion irradiation of static targets (TRiP) at GSI, the expected dose distribution is calculated as the sum of several sub-distributions for single target motion states. To investigate active compensation for target motion by adapting the position of the therapeutic beam during irradiation, the planned beam positions can be altered during the calculation. Applying realistic parameters to the planned motion-compensation methods at GSI, the effect of target motion on the expected dose uniformity can be simulated for different target configurations and motion conditions. For the dynamic dose calculation, experimentally measured profiles of the beam extraction in time were used. Initial simulations show the feasibility and consistency of an active motion compensation with the magnetic scanning system and reveal some strategies to improve the dose homogeneity inside the moving target. The simulation environment presented here provides an effective means for evaluating the dose distribution for a moving target volume with and without motion compensation. It contributes a substantial basis for the experimental research on the irradiation of moving target volumes with scanned ion beams at GSI, which will be presented in upcoming papers.
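The interplay effect and its active compensation can be reduced to a one-dimensional toy: spots are scanned in order while the target oscillates, and compensation shifts each spot by the current target offset. All motion parameters below are invented; the actual system works from measured extraction-time profiles and full treatment plans:

```python
import math

def target_offset(t, amplitude=3, period=4.0):
    """Hypothetical breathing-like 1-D target motion, in whole cells."""
    return round(amplitude * math.sin(2 * math.pi * t / period))

def deliver(n_spots=40, dt=0.1, compensate=False):
    """Scan beam spots left to right, one spot per time step dt. Each spot
    deposits 1 dose unit; the target moves during delivery. Returns the
    dose per target cell, in the target's own (moving) frame."""
    dose = [0] * n_spots
    for i in range(n_spots):
        off = target_offset(i * dt)
        beam = i + off if compensate else i   # compensation tracks the target
        hit = beam - off                      # cell actually hit, target frame
        if 0 <= hit < n_spots:
            dose[hit] += 1
    return dose

def homogeneity(dose):
    """Fraction of cells receiving exactly the planned single dose unit."""
    return sum(1 for d in dose if d == 1) / len(dose)
```

Without compensation the moving target receives double hits and cold spots (the interplay pattern); with perfect tracking every cell receives exactly its planned dose.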

  19. Distributed Observer Network

    NASA Technical Reports Server (NTRS)

    2008-01-01

    NASA's advanced visual simulations are essential for analyses associated with life cycle planning, design, training, testing, operations, and evaluation. Kennedy Space Center, in particular, uses simulations for ground services and space exploration planning in an effort to reduce risk and costs while improving safety and performance. However, it has been difficult to circulate and share the results of simulation tools among the field centers, and distance and travel expenses have made timely collaboration even harder. In response, NASA joined with Valador Inc. to develop the Distributed Observer Network (DON), a collaborative environment that leverages game technology to bring 3-D simulations to conventional desktop and laptop computers. DON enables teams of engineers working on design and operations to view and collaborate on 3-D representations of data generated by authoritative tools. DON takes models and telemetry from these sources and, using commercial game engine technology, displays the simulation results in a 3-D visual environment. Multiple widely dispersed users, working individually or in groups, can view and analyze simulation results on desktop and laptop computers in real time.

  20. Parallel-distributed mobile robot simulator

    NASA Astrophysics Data System (ADS)

    Okada, Hiroyuki; Sekiguchi, Minoru; Watanabe, Nobuo

    1996-06-01

    The aim of this project is to achieve an autonomous learning and growth function based on active interaction with the real world. The system should also be able to autonomously acquire knowledge about the context in which jobs take place and how the jobs are executed. This article describes a parallel-distributed mobile robot system simulator with an autonomous learning and growth function. The autonomous learning and growth function we are proposing is characterized by its ability to learn and grow through interaction with the real world. When the mobile robot interacts with the real world, the system compares the virtual environment simulation with the interaction result in the real world. The system then improves the virtual environment to match the real-world result more closely. In this way the system learns and grows. It is very important that such a simulation be time-realistic. The parallel-distributed mobile robot simulator was developed to simulate the space of a mobile robot system with an autonomous learning and growth function. The simulator constructs a virtual space faithful to the real world and also integrates the interfaces between the user, the actual mobile robot, and the virtual mobile robot. Using an ultrafast CG (computer graphics) system (FUJITSU AG series), time-realistic 3D CG is displayed.
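The compare-and-improve loop this record describes — run the virtual model, compare with the real-world outcome, adjust the model — can be caricatured as a one-parameter calibration. The linear "simulator" and update rate below are invented, purely to show the shape of the loop:

```python
def calibrate(real_outcome, simulate, param, rate=0.5, iters=50):
    """Iteratively nudge a simulator parameter so the simulated outcome
    tracks the observed real-world outcome (a toy 'learning and growth'
    loop: simulate, measure the mismatch, correct the virtual model)."""
    for _ in range(iters):
        err = real_outcome - simulate(param)
        param += rate * err
    return param

# Example: a toy 'distance traveled' simulator that is linear in an unknown
# slip parameter; calibrate() recovers the parameter that reproduces reality.
```

A real system would adjust many coupled model parameters from noisy sensor data, but the principle is the same: the virtual environment is treated as a model to be fitted against real interaction results.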

  1. Distributed interactive communication in simulated space-dwelling groups.

    PubMed

    Brady, Joseph V; Hienz, Robert D; Hursh, Steven R; Ragusa, Leonard C; Rouse, Charles O; Gasior, Eric D

    2004-03-01

    This report describes the development and preliminary application of an experimental test bed for modeling human behavior in the context of a computer generated environment to analyze the effects of variations in communication modalities, incentives and stressful conditions. In addition to detailing the methodological development of a simulated task environment that provides for electronic monitoring and recording of individual and group behavior, the initial substantive findings from an experimental analysis of distributed interactive communication in simulated space dwelling groups are described. Crews of three members each (male and female) participated in simulated "planetary missions" based upon a synthetic scenario task that required identification, collection, and analysis of geologic specimens with a range of grade values. The results of these preliminary studies showed clearly that cooperative and productive interactions were maintained between individually isolated and distributed individuals communicating and problem-solving effectively in a computer-generated "planetary" environment over extended time intervals without benefit of one another's physical presence. Studies on communication channel constraints confirmed the functional interchangeability between available modalities with the highest degree of interchangeability occurring between Audio and Text modes of communication. The effects of task-related incentives were determined by the conditions under which they were available with Positive Incentives effectively attenuating decrements in performance under stressful time pressure.

  2. Research on Intelligent Synthesis Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Lobeck, William E.

    2002-01-01

    Four research activities related to Intelligent Synthesis Environment (ISE) have been performed under this grant. The four activities are: 1) non-deterministic approaches that incorporate technologies such as intelligent software agents, visual simulations and other ISE technologies; 2) virtual labs that leverage modeling, simulation and information technologies to create an immersive, highly interactive virtual environment tailored to the needs of researchers and learners; 3) advanced learning modules that incorporate advanced instructional, user interface and intelligent agent technologies; and 4) assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments.

  3. Research on Intelligent Synthesis Environments

    NASA Astrophysics Data System (ADS)

    Noor, Ahmed K.; Loftin, R. Bowen

    2002-12-01

    Four research activities related to Intelligent Synthesis Environment (ISE) have been performed under this grant. The four activities are: 1) non-deterministic approaches that incorporate technologies such as intelligent software agents, visual simulations and other ISE technologies; 2) virtual labs that leverage modeling, simulation and information technologies to create an immersive, highly interactive virtual environment tailored to the needs of researchers and learners; 3) advanced learning modules that incorporate advanced instructional, user interface and intelligent agent technologies; and 4) assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments.

  4. A methodology towards virtualisation-based high performance simulation platform supporting multidisciplinary design of complex products

    NASA Astrophysics Data System (ADS)

    Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin

    2012-08-01

    Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise a HPS platform. This research is driven by the issues concerning large-scale simulation resources deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework of virtualisation-based simulation platform (VSIM) is first proposed. Then the article investigates and discusses key approaches in VSIM, including simulation resources modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) greatly reduce deployment time and increase flexibility in simulation environment construction, and (3) achieve fault-tolerant simulation.

  5. Environments for online maritime simulators with cloud computing capabilities

    NASA Astrophysics Data System (ADS)

    Raicu, Gabriel; Raicu, Alexandra

    2016-12-01

    This paper presents cloud computing environments, network principles, and methods for graphical development in realistic naval simulation, naval robotics, and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open source solutions designed for educational purposes. Realistic rendering of maritime environments requires near real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts coupled with the latest achievements in virtual and augmented reality will enhance the overall experience, leading to new developments and innovations. We have to deal with a multiprocessing situation using advanced technologies and distributed applications, with remote ship scenarios and automation of ship operations.

  6. On validating remote sensing simulations using coincident real data

    NASA Astrophysics Data System (ADS)

    Wang, Mingming; Yao, Wei; Brown, Scott; Goodenough, Adam; van Aardt, Jan

    2016-05-01

    The remote sensing community often requires data simulation, either via spectral/spatial downsampling or through virtual, physics-based models, to assess systems and algorithms. The Digital Imaging and Remote Sensing Image Generation (DIRSIG) model is one such first-principles, physics-based model for simulating imagery for a range of modalities. Complex simulation of vegetation environments has subsequently become possible as scene rendering technology and software advanced. This in turn has created questions about the validity of such complex models, since phenomena such as multiple scattering and the bidirectional reflectance distribution function (BRDF) could impact results in the case of complex vegetation scenes. We selected three sites, located in the Pacific Southwest domain (Fresno, CA) of the National Ecological Observatory Network (NEON). These sites represent oak savanna, hardwood forests, and conifer-manzanita-mixed forests. We constructed corresponding virtual scenes, using airborne LiDAR and imaging spectroscopy data from NEON, ground-based LiDAR data, and field-collected spectra to characterize the scenes. Imaging spectroscopy data for these virtual sites then were generated using the DIRSIG simulation environment. This simulated imagery was compared to real AVIRIS imagery (15m spatial resolution; 12 pixels/scene) and NEON Airborne Observation Platform (AOP) data (1m spatial resolution; 180 pixels/scene). These tests were performed using a distribution-comparison approach for selected spectral statistics, e.g., statistics that establish the spectra's shape, for each simulated-versus-real distribution pair. The initial comparison results indicated that the shapes of the spectral distributions at the virtual and real sites were closely matched.
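The record does not name its distribution-comparison statistic; a two-sample Kolmogorov-Smirnov statistic is one plausible concrete choice for scoring how closely a simulated distribution of a spectral statistic matches the real one. A minimal stdlib implementation, offered only as an illustration of the approach:

```python
import bisect

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical gap
    between the empirical CDFs of the two samples (0 = identical empirical
    distributions, 1 = completely non-overlapping)."""
    a, b = sorted(sample_a), sorted(sample_b)
    d = 0.0
    for v in sorted(set(a) | set(b)):
        cdf_a = bisect.bisect_right(a, v) / len(a)
        cdf_b = bisect.bisect_right(b, v) / len(b)
        d = max(d, abs(cdf_a - cdf_b))
    return d
```

Here each sample would be the per-pixel values of one spectral statistic (e.g., a shape descriptor) drawn from the simulated scene and from the coincident AVIRIS or AOP pixels; a small statistic across all chosen descriptors supports the "closely matched" conclusion.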

  7. Guidelines for developing distributed virtual environment applications

    NASA Astrophysics Data System (ADS)

    Stytz, Martin R.; Banks, Sheila B.

    1998-08-01

    We have conducted a variety of projects investigating the limits of virtual environment and distributed virtual environment (DVE) technology for the military and medical professions. The projects include an application that allows the user to interactively explore a high-fidelity, dynamic scale model of the Solar System and a high-fidelity, photorealistic, rapidly reconfigurable aircraft simulator. Additional projects are a system for observing, analyzing, and understanding the activity in a military distributed virtual environment, a distributed threat simulator for training Air Force pilots, a virtual spaceplane to determine user interface requirements for a planned military spaceplane system, and an automated wingman for supplementing or replacing human-controlled systems in a DVE. The last two projects are a virtual environment user interface framework and a project for training hospital emergency department personnel. In the process of designing and assembling the DVE applications in support of these projects, we developed rules of thumb and insights into assembling DVE applications and the environments themselves. In this paper, we open with a brief review of the applications that were the source of our insights and then present the lessons learned from these projects. These lessons fall primarily into five areas: requirements development, software architecture, human-computer interaction, graphical database modeling, and construction of computer-generated forces.

  8. Application of the TEMPEST computer code for simulating hydrogen distribution in model containment structures. [PWR; BWR]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trent, D.S.; Eyler, L.L.

    In this study several aspects of simulating hydrogen distribution in geometric configurations relevant to reactor containment structures were investigated using the TEMPEST computer code. Of particular interest was the performance of the TEMPEST turbulence model in a density-stratified environment. Computed results illustrated that the TEMPEST numerical procedures predicted the measured phenomena with good accuracy under a variety of conditions and that the turbulence model used is a viable approach in complex turbulent flow simulation.

  9. Cryospheric Research in China

    DTIC Science & Technology

    2015-03-30

    marine monitoring for environment and security, using satellite Earth observation technologies), the WCRP/CliC Project (an international cooperative...BIOME4) to simulate the responses of biome distribution to future climate change in China. The simulation results suggest that regional climate

  10. Optimizing NEURON Simulation Environment Using Remote Memory Access with Recursive Doubling on Distributed Memory Systems.

    PubMed

    Shehzad, Danish; Bozkuş, Zeki

    2016-01-01

    Increasing complexity of neuronal network models has escalated efforts to make the NEURON simulation environment efficient. Computational neuroscientists divide the equations into subnets among multiple processors to achieve better hardware performance. On parallel machines, interprocessor spike exchange consumes a large share of the overall simulation time for neuronal networks. NEURON uses the Message Passing Interface (MPI) for communication between processors, and the MPI_Allgather collective is exercised for spike exchange after each interval across distributed memory systems. Although increasing the number of processors yields more concurrency and better performance, it adversely affects MPI_Allgather, increasing communication time between processors. This necessitates improving the communication methodology to decrease spike-exchange time over distributed memory systems. This work improves on MPI_Allgather using Remote Memory Access (RMA), moving from two-sided to one-sided communication; the use of a recursive doubling mechanism achieves efficient communication between processors in a logarithmic number of steps. This approach enhances communication concurrency and improves overall runtime, making NEURON more efficient for simulating large neuronal network models.
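
    The recursive-doubling exchange pattern can be sketched in a single-process simulation. This is illustrative only: the real implementation uses MPI one-sided RMA across distributed memory, which is not modeled here.

    ```python
    import math

    def recursive_doubling_allgather(rank_data):
        """Simulate an allgather over P ranks (P a power of two) with
        recursive doubling: in step k, rank r exchanges everything it has
        accumulated so far with partner r XOR 2**k, so all per-rank data
        spreads to every rank in log2(P) steps."""
        p = len(rank_data)
        assert p & (p - 1) == 0, "P must be a power of two"
        # Each rank starts holding only its own spike buffer.
        acc = {r: {r: rank_data[r]} for r in range(p)}
        for k in range(int(math.log2(p))):
            new_acc = {}
            for r in range(p):
                partner = r ^ (1 << k)
                merged = dict(acc[r])
                merged.update(acc[partner])   # one-sided "get" from partner
                new_acc[r] = merged
            acc = new_acc
        return [[acc[r][s] for s in range(p)] for r in range(p)]

    spikes = [["a"], ["b"], ["c"], ["d"]]      # per-rank spike buffers
    gathered = recursive_doubling_allgather(spikes)
    print(gathered[0])   # every rank ends with all four buffers
    ```

    With P ranks, each rank completes the gather in log2(P) exchange steps instead of P-1, which is the source of the communication savings the abstract describes.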

  11. Optimizing NEURON Simulation Environment Using Remote Memory Access with Recursive Doubling on Distributed Memory Systems

    PubMed Central

    Bozkuş, Zeki

    2016-01-01

    Increasing complexity of neuronal network models has escalated efforts to make the NEURON simulation environment efficient. Computational neuroscientists divide the equations into subnets among multiple processors to achieve better hardware performance. On parallel machines, interprocessor spike exchange consumes a large share of the overall simulation time for neuronal networks. NEURON uses the Message Passing Interface (MPI) for communication between processors, and the MPI_Allgather collective is exercised for spike exchange after each interval across distributed memory systems. Although increasing the number of processors yields more concurrency and better performance, it adversely affects MPI_Allgather, increasing communication time between processors. This necessitates improving the communication methodology to decrease spike-exchange time over distributed memory systems. This work improves on MPI_Allgather using Remote Memory Access (RMA), moving from two-sided to one-sided communication; the use of a recursive doubling mechanism achieves efficient communication between processors in a logarithmic number of steps. This approach enhances communication concurrency and improves overall runtime, making NEURON more efficient for simulating large neuronal network models. PMID:27413363

  12. Improving the Aircraft Design Process Using Web-Based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)

    2000-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  13. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  14. Distributed Observer Network (DON), Version 3.0, User's Guide

    NASA Technical Reports Server (NTRS)

    Mazzone, Rebecca A.; Conroy, Michael P.

    2015-01-01

    The Distributed Observer Network (DON) is a data presentation tool developed by the National Aeronautics and Space Administration (NASA) to distribute and publish simulation results. Leveraging the display capabilities inherent in modern gaming technology, DON places users in a fully navigable 3-D environment containing graphical models and allows the users to observe how those models evolve and interact over time in a given scenario. Each scenario is driven with data that has been generated by authoritative NASA simulation tools and exported in accordance with a published data interface specification. This decoupling of the data from the source tool enables DON to faithfully display a simulator's results and ensure that every simulation stakeholder will view the exact same information every time.

  15. Multi-AUV autonomous task planning based on the scroll time domain quantum bee colony optimization algorithm in uncertain environment

    PubMed Central

    Zhang, Rubo; Yang, Yu

    2017-01-01

    This work researches a distributed task planning model for multiple autonomous underwater vehicles (MAUV). A scroll time domain quantum artificial bee colony (STDQABC) optimization algorithm is proposed to solve the multi-AUV optimal task planning problem. In an uncertain marine environment, the rolling time domain control technique is used to perform numerical optimization over a narrowed time range. Rolling time domain control is one of the better task planning techniques: it can greatly reduce the computational workload and realize the tradeoff between AUV dynamics, environment, and cost. Finally, a simulation experiment was performed to evaluate the distributed task planning performance of the scroll time domain quantum bee colony optimization algorithm. The simulation results demonstrate that the STDQABC algorithm converges faster than the QABC and ABC algorithms in terms of both iterations and running time. The STDQABC algorithm can effectively improve MAUV distributed task planning performance, complete the task goal, and obtain an approximately optimal solution. PMID:29186166
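
    The rolling (receding) time domain idea can be sketched generically: optimize only over a short window, execute the first action, then re-plan from the new state. The 1-D state and greedy window search below are illustrative stand-ins for the AUV dynamics and the quantum bee colony optimizer, which are not modeled here.

    ```python
    def rolling_horizon_plan(start, goal, horizon, max_steps=50):
        """Generic receding-horizon loop: at each step, evaluate candidate
        moves by a crude rollout to the end of the window, commit only the
        first action, then re-plan from the resulting state."""
        state, path = start, [start]
        for _ in range(max_steps):
            if state == goal:
                break
            best = None
            for first in (-1, 0, 1):
                end = state + first * horizon      # crude window rollout
                cost = abs(goal - end)             # distance-to-goal cost
                if best is None or cost < best[0]:
                    best = (cost, first)
            state += best[1]                       # execute first action only
            path.append(state)
        return path

    print(rolling_horizon_plan(0, 5, horizon=3))
    ```

    Because only the short window is optimized at each step, the per-step computational workload stays bounded even as the mission length grows, which is the tradeoff the abstract highlights.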

  16. Multi-AUV autonomous task planning based on the scroll time domain quantum bee colony optimization algorithm in uncertain environment.

    PubMed

    Li, Jianjun; Zhang, Rubo; Yang, Yu

    2017-01-01

    This work researches a distributed task planning model for multiple autonomous underwater vehicles (MAUV). A scroll time domain quantum artificial bee colony (STDQABC) optimization algorithm is proposed to solve the multi-AUV optimal task planning problem. In an uncertain marine environment, the rolling time domain control technique is used to perform numerical optimization over a narrowed time range. Rolling time domain control is one of the better task planning techniques: it can greatly reduce the computational workload and realize the tradeoff between AUV dynamics, environment, and cost. Finally, a simulation experiment was performed to evaluate the distributed task planning performance of the scroll time domain quantum bee colony optimization algorithm. The simulation results demonstrate that the STDQABC algorithm converges faster than the QABC and ABC algorithms in terms of both iterations and running time. The STDQABC algorithm can effectively improve MAUV distributed task planning performance, complete the task goal, and obtain an approximately optimal solution.

  17. Immersive Simulations for Smart Classrooms: Exploring Evolutionary Concepts in Secondary Science

    ERIC Educational Resources Information Center

    Lui, Michelle; Slotta, James D.

    2014-01-01

    This article presents the design of an immersive simulation and inquiry activity for technology-enhanced classrooms. Using a co-design method, researchers worked with a high school biology teacher to create a rainforest simulation, distributed across several large displays in the room to immerse students in the environment. The authors created and…

  18. Analysis of Regolith Simulant Ejecta Distributions from Normal Incident Hypervelocity Impact

    NASA Technical Reports Server (NTRS)

    Edwards, David L.; Cooke, William; Suggs, Rob; Moser, Danielle E.

    2008-01-01

    The National Aeronautics and Space Administration (NASA) has established the Constellation Program, which has defined long-term lunar habitation as one of its many goals. Critical to the design of a lunar habitat is an understanding of the lunar surface environment; of specific importance is the primary meteoroid and subsequent ejecta environment. The document NASA SP-8013, 'Meteoroid Environment Model Near Earth to Lunar Surface', was developed for the Apollo program in 1969 and contains the latest definition of the lunar ejecta environment. There is concern that NASA SP-8013 may overestimate the lunar ejecta environment. NASA's Meteoroid Environment Office (MEO) has initiated several tasks to improve the accuracy of our understanding of the lunar surface ejecta environment. This paper reports the results of experiments on projectile impact into powdered pumice and unconsolidated JSC-1A Lunar Mare Regolith simulant targets. Projectiles were accelerated to velocities between 2.45 and 5.18 km/s at normal incidence using the Ames Vertical Gun Range (AVGR). The ejected particles were detected by thin aluminum foil targets strategically placed around the impact site, and angular ejecta distributions were determined. The analysis assumes spherical symmetry of the ejecta resulting from normal impact and that all ejecta particles were of the mean target particle size. It produces a hemispherical flux density distribution of ejecta with sufficient velocity to penetrate the aluminum foil detectors.

  19. Distributed collaborative decision support environments for predictive awareness

    NASA Astrophysics Data System (ADS)

    McQuay, William K.; Stilman, Boris; Yakhnis, Vlad

    2005-05-01

    The past decade has produced significant changes in the conduct of military operations: asymmetric warfare, reliance on dynamic coalitions, stringent rules of engagement, increased concern about collateral damage, and the need for sustained air operations. Mission commanders need to assimilate a tremendous amount of information, rapidly assess the enemy's course of action (eCOA) or possible actions, and promulgate their own course of action (COA): a need for predictive awareness. Decision support tools in a distributed collaborative environment offer the capability of decomposing complex multitask processes and distributing them over a dynamic set of execution assets that include modeling, simulation, and analysis tools. Revolutionary new approaches to strategy generation and assessment such as Linguistic Geometry (LG) permit the rapid development of COAs versus eCOAs. LG tools automatically generate winning strategies and tactics, and permit operators to take advantage of them for mission planning and execution in near real time. LG is predictive, employs deep "look-ahead" from the current state, and provides a realistic, reactive model of adversary reasoning and behavior. Collaborative environments provide the framework and integrate models, simulations, and domain-specific decision support tools for sharing and exchanging data, information, knowledge, and actions. This paper describes ongoing research efforts in applying distributed collaborative environments to decision support for predictive mission awareness.

  20. NASA's Information Power Grid: Large Scale Distributed Computing and Data Management

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)

    2001-01-01

    Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole system aircraft simulation and whole system living cell simulation - require integrating applications and data that are developed by different teams of researchers, frequently in different locations. The research teams are the only ones with the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.

  1. MULTIMEDIA ENVIRONMENTAL DISTRIBUTION OF TOXICS (MEND-TOX): PART II, SOFTWARE IMPLEMENTATION AND CASE STUDIES

    EPA Science Inventory

    An integrated hybrid spatial-compartmental simulator is presented for analyzing the dynamic distribution of chemicals in the multimedia environment. Information obtained from such analysis, which includes temporal chemical concentration profiles in various media, mass distribu...

  2. A THREE-DIMENSIONAL MODEL ASSESSMENT OF THE GLOBAL DISTRIBUTION OF HEXACHLOROBENZENE

    EPA Science Inventory

    The distributions of persistent organic pollutants (POPs) in the global environment have typically been studied with box/fugacity models with simplified treatments of atmospheric transport processes. Such models are incapable of simulating the complex three-dimensional mechanis...

  3. Large Scale Simulation Platform for NODES Validation Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sotorrio, P.; Qin, Y.; Min, L.

    2017-04-27

    This report summarizes the Large Scale (LS) simulation platform created for the Eaton NODES project. The simulation environment consists of both a wholesale market simulator and a distribution simulator; it includes the CAISO wholesale market model and a PG&E footprint of 25-75 feeders to validate scalability under a scenario of 33% RPS in California, with an additional 17% of DERs coming from distribution and customers. The simulator can generate hourly unit commitment, 5-minute economic dispatch, and 4-second AGC regulation signals, and is capable of simulating more than 10k individual controllable devices. Simulated DERs include water heaters, EVs, residential and light commercial HVAC/buildings, and residential-level battery storage. Feeder-level voltage regulators and capacitor banks are also simulated for feeder-level real and reactive power management and Volt/VAR control.

  4. Mission Simulation Facility: Simulation Support for Autonomy Development

    NASA Technical Reports Server (NTRS)

    Pisanich, Greg; Plice, Laura; Neukom, Christian; Flueckiger, Lorenzo; Wagner, Michael

    2003-01-01

    The Mission Simulation Facility (MSF) supports research in autonomy technology for planetary exploration vehicles. Using HLA (High Level Architecture) across distributed computers, the MSF connects users' autonomy algorithms with provided or third-party simulations of robotic vehicles and planetary surface environments, including onboard components and scientific instruments. Simulation fidelity is variable to meet changing needs as autonomy technology advances in Technology Readiness Level (TRL). A virtual robot operating in a virtual environment offers numerous advantages over actual hardware, including availability, simplicity, and risk mitigation. The MSF is in use by researchers at NASA Ames Research Center (ARC) and has demonstrated basic functionality. Continuing work will support the needs of a broader user base.

  5. Distributed intelligent scheduling of FMS

    NASA Astrophysics Data System (ADS)

    Wu, Zuobao; Cheng, Yaodong; Pan, Xiaohong

    1995-08-01

    In this paper, a distributed scheduling approach for a flexible manufacturing system (FMS) is presented. A new class of Petri nets, called networked time Petri nets (NTPN), is proposed for modeling systems in a networked environment. Distributed intelligent scheduling is implemented by three schedulers that combine NTPN models with expert system techniques. Simulation results are presented.

  6. Modeling and Simulation in Support of Testing and Evaluation

    DTIC Science & Technology

    1997-03-01

    contains standardized automated test methodology, synthetic stimuli and environments based on TECOM Ground Truth data and physics. The VPG is a distributed...Systems Acquisition Management (FSAM) coursebook, Defense Systems Management College, January 1994. Crocker, Charles M. "Application of the Simulation

  7. A Fast-Time Simulation Environment for Airborne Merging and Spacing Research

    NASA Technical Reports Server (NTRS)

    Bussink, Frank J. L.; Doble, Nathan A.; Barmore, Bryan E.; Singer, Sharon

    2005-01-01

    As part of NASA's Distributed Air/Ground Traffic Management (DAG-TM) effort, NASA Langley Research Center is developing concepts and algorithms for merging multiple aircraft arrival streams and precisely spacing aircraft over the runway threshold. An airborne tool has been created for this purpose, called Airborne Merging and Spacing for Terminal Arrivals (AMSTAR). To evaluate the performance of AMSTAR and complement human-in-the-loop experiments, a simulation environment has been developed that enables fast-time studies of AMSTAR operations. The environment is based on TMX, a multiple aircraft desktop simulation program created by the Netherlands National Aerospace Laboratory (NLR). This paper reviews the AMSTAR concept, discusses the integration of the AMSTAR algorithm into TMX and the enhancements added to TMX to support fast-time AMSTAR studies, and presents initial simulation results.

  8. Background and Pickup Ion Velocity Distribution Dynamics in Titan's Plasma Environment: 3D Hybrid Simulation and Comparison with CAPS T9 Observations

    NASA Technical Reports Server (NTRS)

    Lipatov, A. S.; Sittler, E. C., Jr.; Hartle, R. E.; Cooper, J. F.; Simpson, D. G.

    2011-01-01

    In this report we discuss ion velocity distribution dynamics from a 3D hybrid simulation. In our model the background, pickup, and ionospheric ions are treated as particles, whereas the electrons are described as a fluid. Inhomogeneous photoionization, electron-impact ionization, and charge exchange are included in the model, and we also take into account collisions between ions and neutrals. The current simulation shows that mass loading by the pickup ions H(+), H2(+), CH4(+) and N2(+) is stronger than in previous simulations when O(+) ions are introduced into the background plasma. In our hybrid simulations we use Chamberlain profiles for the atmospheric components. We also include a simple ionosphere model with average-mass M = 28 amu ions generated inside the ionosphere. The moon is treated as a weakly conducting body. Special attention is paid to comparing the simulated pickup ion velocity distribution with CAPS T9 observations. Our simulation shows an asymmetry of the ion density distribution and the magnetic field, including the formation of Alfvén wing-like structures. The simulation also shows that the ring-like velocity distribution of pickup ions relaxes to a Maxwellian core and a shell-like halo.

  9. Distributed simulation using a real-time shared memory network

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Mattern, Duane L.; Wong, Edmond; Musgrave, Jeffrey L.

    1993-01-01

    The Advanced Control Technology Branch of the NASA Lewis Research Center performs research in advanced digital controls for aeronautic and space propulsion systems. This work requires the real-time implementation of both control software and complex dynamical models of the propulsion system. We are implementing these systems in a distributed, multi-vendor computer environment, so a need exists for real-time communication and synchronization between the distributed multi-vendor computers. A shared memory network is a potential solution that offers several advantages over other real-time communication approaches. A candidate shared memory network was tested for basic performance and then used to implement a distributed simulation of a ramjet engine. The accuracy and execution time of the distributed simulation were measured and compared to the performance of the non-partitioned simulation. The ease of partitioning the simulation, the minimal time required to develop the interprocessor communication, and the resulting execution time all indicate that the shared memory network is a real-time communication technique worthy of serious consideration.
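
    The coupling pattern behind such a partitioned simulation can be sketched with a toy controller/engine pair exchanging state through a shared region each frame. All models and names below are illustrative stand-ins, not the actual NASA Lewis implementation, and the Python dict merely mimics a reflective shared-memory region:

    ```python
    # Two simulation partitions coupled through a "shared memory" region:
    # each partition writes its outputs into, and reads its inputs from,
    # the same region every frame, mimicking a shared memory network.
    shared = {"fuel_flow": 0.0, "thrust": 0.0}

    def controller_step(mem, setpoint):
        # toy proportional control law writing into shared memory
        mem["fuel_flow"] = 0.1 * (setpoint - mem["thrust"])

    def engine_step(mem):
        # toy first-order engine model reading fuel flow from shared memory
        mem["thrust"] += 5.0 * mem["fuel_flow"]

    for frame in range(200):          # lockstep real-time frames
        controller_step(shared, setpoint=100.0)
        engine_step(shared)

    print(round(shared["thrust"], 1))
    ```

    The appeal of a real shared memory network is that each partition reads and writes the coupled variables as ordinary memory, with the network hardware replicating them between machines, so no explicit message-passing code is needed.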

  10. Design of compact freeform LED flashlight capable of two different light distributions

    NASA Astrophysics Data System (ADS)

    Isaac, Annie Shalom; Neumann, Cornelius

    2016-04-01

    Free-form optical surfaces are designed to meet intensity requirements in applications ranging from general to automotive lighting, but a single compact free-form optic that satisfies two different intensity distributions has not previously been presented. In this work, a compact LED flashlight fulfilling two different intensity requirements, suitable for use in potentially explosive atmospheres, is designed and validated. The first target was selected after a study of visibility in fog, dust, and smoke environments; studies showed that a ring-like distribution (5°-10°) gives better visual recognition at short distances in smoky environments. The second target was selected to have maximum intensity at the peak, providing visibility at longer distances. These two different intensity requirements are realized by moving the LED with respect to the optics along the optical axis. To fulfill the required intensity distributions, a hybrid TIR optic was designed from free-form curves calculated by combining several geometric-optics methods. The free-form TIR hybrid optic was validated using Monte Carlo ray-trace simulation. The optic is 29 mm in overall diameter and 10 mm thick. The simulated results showed an optical efficiency of about 84% in realizing both target light distributions with a single optic. A complete flashlight was then designed, consisting of the LED, the PMMA hybrid optic, a PC glass casing, and a housing including the critical thermal management for explosive environments. To validate the results, a prototype of the designed optic was made; the measured results showed overall agreement with the simulation.

  11. Computer simulations of transport through membranes: passive diffusion, pores, channels and transporters.

    PubMed

    Tieleman, D Peter

    2006-10-01

    A key function of biological membranes is to provide mechanisms for the controlled transport of ions, nutrients, metabolites, peptides and proteins between a cell and its environment. We are using computer simulations to study several processes involved in transport. In model membranes, the distribution of small molecules can be accurately calculated; we are making progress towards understanding the factors that determine the partitioning behaviour in the inhomogeneous lipid environment, with implications for drug distribution, membrane protein folding and the energetics of voltage gating. Lipid bilayers can be simulated at a scale that is sufficiently large to study significant defects, such as those caused by electroporation. Computer simulations of complex membrane proteins, such as potassium channels and ATP-binding cassette (ABC) transporters, can give detailed information about the atomistic dynamics that form the basis of ion transport, selectivity, conformational change and the molecular mechanism of ATP-driven transport. This is illustrated in the present review with recent simulation studies of the voltage-gated potassium channel KvAP and the ABC transporter BtuCD.

  12. Providing a parallel and distributed capability for JMASS using SPEEDES

    NASA Astrophysics Data System (ADS)

    Valinski, Maria; Driscoll, Jonathan; McGraw, Robert M.; Meyer, Bob

    2002-07-01

    The Joint Modeling And Simulation System (JMASS) is a Tri-Service simulation environment that supports engineering and engagement-level simulations. As JMASS is expanded to support other Tri-Service domains, the current set of modeling services must be extended for High Performance Computing (HPC) applications by adding support for advanced time-management algorithms, parallel and distributed topologies, and high-speed communications. By providing these services, JMASS can better address modeling domains requiring parallel, computationally intensive calculations, such as clutter, vulnerability, and lethality calculations, and underwater-based scenarios. A risk reduction effort implementing some HPC services for JMASS using the SPEEDES (Synchronous Parallel Environment for Emulation and Discrete Event Simulation) simulation framework has recently concluded. As an artifact of the JMASS-SPEEDES integration, not only can HPC functionality be brought to the JMASS program through SPEEDES, but an additional HLA-based capability can be demonstrated that further addresses interoperability issues. The JMASS-SPEEDES integration provides a means of adding HLA capability to preexisting JMASS scenarios through an implementation of the standard JMASS port communication mechanism that allows players to communicate.

  13. Building A Simulation Model For The Prediction Of Temperature Distribution In Pulsed Laser Spot Welding Of Dissimilar Low Carbon Steel 1020 To Aluminum Alloy 6061

    NASA Astrophysics Data System (ADS)

    Yousef, Adel K. M.; Taha, Ziad. A.; Shehab, Abeer A.

    2011-01-01

    This paper describes the development of a computer model used to analyze heat flow during pulsed Nd:YAG laser spot welding of dissimilar metals: low carbon steel (1020) to aluminum alloy (6061). The model is built using ANSYS FLUENT 3.6 software, with the simulated conditions set to match the experimental ones as closely as possible. The simulation analysis is based on conduction heat transfer outside the keyhole, where no melting occurs. The effects of laser power and pulse duration were studied: three peak powers (1, 1.66, and 2.5 kW) were applied during pulsed laser spot welding (keeping the energy constant), and the effect of two pulse durations (4 and 8 ms, with constant peak power) on the transient temperature distribution and weld pool dimensions was predicted using the present simulation. It was found that the present simulation model can give an indication for choosing suitable laser parameters (i.e., pulse duration, peak power, and interaction time) during pulsed laser spot welding of dissimilar metals.
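
    A conduction-only temperature model of this general kind can be sketched with a 1-D explicit finite-difference scheme. All material parameters and boundary values below are illustrative, not those of the steel/aluminum weld in the study:

    ```python
    def heat_conduction_1d(n=21, steps=500, alpha=1e-5, dx=1e-4, dt=1e-4,
                           t_init=300.0, t_pulse=1500.0):
        """Explicit FTCS scheme for 1-D conduction:
        T_i' = T_i + r * (T_{i-1} - 2*T_i + T_{i+1}),  r = alpha*dt/dx**2,
        stable for r <= 0.5. The left boundary is held at the pulse
        temperature, standing in for the laser-heated spot."""
        r = alpha * dt / dx**2
        assert r <= 0.5, "explicit scheme unstable for this r"
        T = [t_init] * n
        T[0] = t_pulse                      # heated spot (fixed)
        for _ in range(steps):
            Tn = T[:]
            for i in range(1, n - 1):
                Tn[i] = T[i] + r * (T[i-1] - 2*T[i] + T[i+1])
            T = Tn
        return T

    T = heat_conduction_1d()
    print(round(T[1]), round(T[10]))   # temperature falls off with depth
    ```

    Varying `t_pulse` and `steps` plays the same role as varying peak power and pulse duration in the study: they shift how far the elevated-temperature region penetrates into the material.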

  14. SimBox: a simulation-based scalable architecture for distributed command and control of spaceport and service constellations

    NASA Astrophysics Data System (ADS)

    Prasad, Guru; Jayaram, Sanjay; Ward, Jami; Gupta, Pankaj

    2004-09-01

    In this paper, Aximetric proposes a decentralized Command and Control (C2) architecture, called SimBox, for distributed control of a cluster of on-board health monitoring and software-enabled control systems; it reuses some of the real-time infrastructure (RTI) functionality of current military real-time simulation architectures. The uniqueness of the approach is to provide a "plug and play" environment for system components that run at various data rates (Hz) and the ability to replicate or transfer C2 operations to various subsystems in a scalable manner. This is made possible by a communication bus called the "Distributed Shared Data Bus" and a distributed computing environment that scales to the control needs by providing a self-contained computing, data logging, and control function module that can be rapidly reconfigured to perform different functions. This kind of software-enabled control is very much needed to meet the needs of future aerospace command and control functions.

  15. An approach for modelling snowcover ablation and snowmelt runoff in cold region environments

    NASA Astrophysics Data System (ADS)

    Dornes, Pablo Fernando

Reliable hydrological model simulations are the result of numerous complex interactions among hydrological inputs, landscape properties, and initial conditions. Determination of the effects of these factors is one of the main challenges in hydrological modelling. This situation becomes even more difficult in cold regions due to the ungauged nature of subarctic and arctic environments. This research work is an attempt to apply a new approach for modelling snowcover ablation and snowmelt runoff in complex subarctic environments with limited data while retaining integrity in the process representations. The modelling strategy is based on the incorporation of both detailed process understanding and inputs along with information gained from observations of basin-wide streamflow; essentially a combination of deductive and inductive approaches. The study was conducted in the Wolf Creek Research Basin, Yukon Territory, using three models: a small-scale physically based hydrological model, a land surface scheme, and a land surface hydrological model. The spatial representation was based on previous research studies and observations, and was accomplished by incorporating landscape units, defined according to topography and vegetation, as the spatial model elements. Comparisons between distributed and aggregated modelling approaches showed that simulations incorporating distributed initial snowcover and corrected solar radiation were able to properly simulate snowcover ablation and snowmelt runoff, whereas the aggregated modelling approaches were unable to represent the differential snowmelt rates and complex snowmelt runoff dynamics. Similarly, the inclusion of spatially distributed information in a land surface scheme clearly improved simulations of snowcover ablation.
Application of the same modelling approach at a larger scale using the same landscape-based parameterisation showed satisfactory results in simulating snowcover ablation and snowmelt runoff with minimal calibration. Verification of this approach in an arctic basin illustrated that landscape-based parameters are a feasible regionalisation framework for distributed and physically based models. In summary, the proposed modelling philosophy, based on the combination of inductive and deductive reasoning, is a suitable strategy for reliable predictions of snowcover ablation and snowmelt runoff in cold regions and complex environments.

  16. VERSE - Virtual Equivalent Real-time Simulation

    NASA Technical Reports Server (NTRS)

    Zheng, Yang; Martin, Bryan J.; Villaume, Nathaniel

    2005-01-01

Distributed real-time simulations provide important timing validation and hardware-in-the-loop results for the spacecraft flight software development cycle. Occasionally, the need for higher fidelity modeling and more comprehensive debugging capabilities - combined with a limited amount of computational resources - calls for a non-real-time simulation environment that mimics the real-time environment. By creating a non-real-time environment that accommodates simulations and flight software designed for a multi-CPU real-time system, we can save development time, cut mission costs, and reduce the likelihood of errors. This paper presents such a solution: the Virtual Equivalent Real-time Simulation Environment (VERSE). VERSE turns the real-time operating system RTAI (Real-time Application Interface) into an event-driven simulator that runs in virtual real time. Designed to keep the original RTAI architecture as intact as possible, and therefore inheriting RTAI's many capabilities, VERSE was implemented with remarkably little change to the RTAI source code. This small footprint, together with use of the same API, allows users to easily run the same application in both real-time and virtual-time environments. VERSE has been used to build a workstation testbed for NASA's Space Interferometry Mission (SIM PlanetQuest) instrument flight software. With its flexible simulation controls and inexpensive setup and replication costs, VERSE will become an invaluable tool in future mission development.

  17. A Distributed Snow Evolution Modeling System (SnowModel)

    NASA Astrophysics Data System (ADS)

    Liston, G. E.; Elder, K.

    2004-12-01

    A spatially distributed snow-evolution modeling system (SnowModel) has been specifically designed to be applicable over a wide range of snow landscapes, climates, and conditions. To reach this goal, SnowModel is composed of four sub-models: MicroMet defines the meteorological forcing conditions, EnBal calculates surface energy exchanges, SnowMass simulates snow depth and water-equivalent evolution, and SnowTran-3D accounts for snow redistribution by wind. While other distributed snow models exist, SnowModel is unique in that it includes a well-tested blowing-snow sub-model (SnowTran-3D) for application in windy arctic, alpine, and prairie environments where snowdrifts are common. These environments comprise 68% of the seasonally snow-covered Northern Hemisphere land surface. SnowModel also accounts for snow processes occurring in forested environments (e.g., canopy interception related processes). SnowModel is designed to simulate snow-related physical processes occurring at spatial scales of 5-m and greater, and temporal scales of 1-hour and greater. These include: accumulation from precipitation; wind redistribution and sublimation; loading, unloading, and sublimation within forest canopies; snow-density evolution; and snowpack ripening and melt. To enhance its wide applicability, SnowModel includes the physical calculations required to simulate snow evolution within each of the global snow classes defined by Sturm et al. (1995), e.g., tundra, taiga, alpine, prairie, maritime, and ephemeral snow covers. The three, 25-km by 25-km, Cold Land Processes Experiment (CLPX) mesoscale study areas (MSAs: Fraser, North Park, and Rabbit Ears) are used as SnowModel simulation examples to highlight model strengths, weaknesses, and features in forested, semi-forested, alpine, and shrubland environments.

  18. Airborne Chemical Sensing with Mobile Robots

    PubMed Central

    Lilienthal, Achim J.; Loutfi, Amy; Duckett, Tom

    2006-01-01

    Airborne chemical sensing with mobile robots has been an active research area since the beginning of the 1990s. This article presents a review of research work in this field, including gas distribution mapping, trail guidance, and the different subtasks of gas source localisation. Due to the difficulty of modelling gas distribution in a real world environment with currently available simulation techniques, we focus largely on experimental work and do not consider publications that are purely based on simulations.

  19. Particle Acceleration and Fractional Transport in Turbulent Reconnection

    NASA Astrophysics Data System (ADS)

    Isliker, Heinz; Pisokas, Theophilos; Vlahos, Loukas; Anastasiadis, Anastasios

    2017-11-01

    We consider a large-scale environment of turbulent reconnection that is fragmented into a number of randomly distributed unstable current sheets (UCSs), and we statistically analyze the acceleration of particles within this environment. We address two important cases of acceleration mechanisms when particles interact with the UCS: (a) electric field acceleration and (b) acceleration by reflection at contracting islands. Electrons and ions are accelerated very efficiently, attaining an energy distribution of power-law shape with an index 1-2, depending on the acceleration mechanism. The transport coefficients in energy space are estimated from test-particle simulation data, and we show that the classical Fokker-Planck (FP) equation fails to reproduce the simulation results when the transport coefficients are inserted into it and it is solved numerically. The cause for this failure is that the particles perform Levy flights in energy space, while the distributions of the energy increments exhibit power-law tails. We then use the fractional transport equation (FTE) derived by Isliker et al., whose parameters and the order of the fractional derivatives are inferred from the simulation data, and solving the FTE numerically, we show that the FTE successfully reproduces the kinetic energy distribution of the test particles. We discuss in detail the analysis of the simulation data and the criteria that allow one to judge the appropriateness of either an FTE or a classical FP equation as a transport model.
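The distinction the abstract draws between Fokker-Planck transport and Levy flights in energy space comes down to the tail of the energy-increment distribution. A small sketch, with arbitrary parameters not fitted to the paper's simulations, contrasts Gaussian increments (FP-compatible) with power-law increments of the kind that require a fractional transport equation:

```python
# Illustrative contrast between Gaussian (diffusive) and power-law
# (Levy-flight) energy increments. Parameters are arbitrary stand-ins.
import random

random.seed(1)

def pareto_increment(alpha=1.5, xmin=1.0):
    """Inverse-CDF sample of a Pareto tail: P(X > x) = (xmin/x)**alpha."""
    u = random.random()
    return xmin / (1.0 - u) ** (1.0 / alpha)

gauss = [abs(random.gauss(0.0, 1.0)) for _ in range(20000)]  # FP-like steps
levy = [pareto_increment() for _ in range(20000)]            # Levy-flight steps

def tail_fraction(samples, threshold):
    """Fraction of increments exceeding a large threshold."""
    return sum(s > threshold for s in samples) / len(samples)

# Power-law increments put far more weight in the tail than Gaussian ones,
# which is what breaks the classical diffusion (FP) description.
print(tail_fraction(gauss, 10.0), tail_fraction(levy, 10.0))
```

For a Pareto tail with index 1.5, a few percent of increments exceed ten standard-deviation-scale jumps, while the Gaussian essentially never does; rare large jumps then dominate the transport, as the abstract's Levy-flight argument states.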

  20. Particle Acceleration and Fractional Transport in Turbulent Reconnection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Isliker, Heinz; Pisokas, Theophilos; Vlahos, Loukas

We consider a large-scale environment of turbulent reconnection that is fragmented into a number of randomly distributed unstable current sheets (UCSs), and we statistically analyze the acceleration of particles within this environment. We address two important cases of acceleration mechanisms when particles interact with the UCS: (a) electric field acceleration and (b) acceleration by reflection at contracting islands. Electrons and ions are accelerated very efficiently, attaining an energy distribution of power-law shape with an index 1–2, depending on the acceleration mechanism. The transport coefficients in energy space are estimated from test-particle simulation data, and we show that the classical Fokker–Planck (FP) equation fails to reproduce the simulation results when the transport coefficients are inserted into it and it is solved numerically. The cause for this failure is that the particles perform Levy flights in energy space, while the distributions of the energy increments exhibit power-law tails. We then use the fractional transport equation (FTE) derived by Isliker et al., whose parameters and the order of the fractional derivatives are inferred from the simulation data, and solving the FTE numerically, we show that the FTE successfully reproduces the kinetic energy distribution of the test particles. We discuss in detail the analysis of the simulation data and the criteria that allow one to judge the appropriateness of either an FTE or a classical FP equation as a transport model.

  1. Shielding Effectiveness in a Two-Dimensional Reverberation Chamber Using Finite-Element Techniques

    NASA Technical Reports Server (NTRS)

    Bunting, Charles F.

    2006-01-01

Reverberation chambers are attaining an increased importance in the determination of electromagnetic susceptibility of avionics equipment. Given the nature of the variable boundary conditions, the ability of a given source to couple energy into certain modes, and the passband characteristic due to the chamber Q, the fields are typically characterized by statistical means. The emphasis of this work is to apply finite-element techniques at cutoff to the analysis of a two-dimensional structure, to examine shielding-effectiveness issues in a reverberating environment. Simulated mechanical stirring is used to obtain the appropriate statistical field distribution. The shielding effectiveness (SE) in a simulated reverberating environment is compared to measurements in a reverberation chamber. A log-normal distribution for the SE is observed, with implications for system designers. The work is intended to provide further refinement in the consideration of SE in a complex electromagnetic environment.

  2. [Development of a microenvironment test chamber for airborne microbe research].

    PubMed

    Zhan, Ningbo; Chen, Feng; Du, Yaohua; Cheng, Zhi; Li, Chenyu; Wu, Jinlong; Wu, Taihu

    2017-10-01

One of the most important environmental cleanliness indicators is the airborne microbe count. However, the particularities of clean operating environments and controlled experimental environments often limit airborne microbe research. This paper describes the design and implementation of a microenvironment test chamber for airborne microbe research under normal test conditions. Numerical simulation with Fluent showed that airborne microbes were evenly dispersed in the upper part of the test chamber, with a bottom-up concentration growth distribution. Based on the simulation results, a verification experiment was carried out at 5 sampling points in different spatial positions in the test chamber. Experimental results showed that the average particle concentration at all sampling points reached 10^7 counts/m^3 after 5 minutes of dispersal of Staphylococcus aureus, and all sampling points showed consistent concentration distributions. The concentration of airborne microbes in the upper chamber was slightly higher than in the middle chamber, which in turn was slightly higher than in the bottom chamber. These results agree with the numerical simulation and show that the system is well suited for airborne microbe research.

  3. Evaporation Flux Distribution of Drops on a Hydrophilic or Hydrophobic Flat Surface by Molecular Simulations.

    PubMed

    Xie, Chiyu; Liu, Guangzhi; Wang, Moran

    2016-08-16

The evaporation flux distribution of sessile drops is investigated by molecular dynamics simulations. Three evaporating modes are classified: the diffusion-dominant mode, the substrate-heating mode, and the environment-heating mode. Both hydrophilic and hydrophobic drop-substrate interactions are considered. To quantify the position-dependent evaporation flux distribution, we propose an azimuthal-angle-based division method under the assumption that drops take a spherical-crown shape. The modeling results show that edge evaporation, i.e., near the contact line, is enhanced for hydrophilic drops in all three modes. For hydrophilic cases, the surface diffusion of liquid molecules adsorbed on the solid substrate, together with diffusion in the vapor space, plays an important role in the enhanced evaporation rate at the edge. For hydrophobic drops, the edge evaporation flux is higher for the substrate-heating mode but lower than elsewhere on the drop for the diffusion-dominant mode; a nearly uniform distribution is found for the environment-heating mode. The evidence shows that the temperature distribution inside drops plays a key role in the position-dependent evaporation flux.
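The azimuthal-angle-based division mentioned above can be sketched as pure geometry: the spherical-cap surface is divided into equal-angle bands measured from the sphere centre, evaporation events are counted per band, and counts are normalised by band area. No molecular dynamics is performed here, and all parameter choices are illustrative:

```python
# Toy geometric version of an angle-based division of a spherical-cap drop
# surface into bands, giving a position-dependent flux from event counts.
import math

def band_index(x, y, z, contact_angle_deg, n_bands, centre_z):
    """Map a surface point to an angular band (0 = apex, last = contact line)."""
    r = math.sqrt(x*x + y*y)
    theta = math.atan2(r, z - centre_z)          # angle from symmetry axis
    width = math.radians(contact_angle_deg) / n_bands
    return min(int(theta / width), n_bands - 1)

def flux_per_band(events, contact_angle_deg=90.0, n_bands=6,
                  centre_z=0.0, R=1.0):
    """Count evaporation events per band and normalise by band area."""
    counts = [0] * n_bands
    for (x, y, z) in events:                     # evaporation event positions
        counts[band_index(x, y, z, contact_angle_deg, n_bands, centre_z)] += 1
    width = math.radians(contact_angle_deg) / n_bands
    fluxes = []
    for i, c in enumerate(counts):
        # spherical band area: 2*pi*R^2*(cos(theta_i) - cos(theta_{i+1}))
        area = 2 * math.pi * R * R * (math.cos(i * width)
                                      - math.cos((i + 1) * width))
        fluxes.append(c / area)
    return fluxes
```

Events clustered near the contact line then produce an elevated flux in the last band, which is the "enhanced edge evaporation" signature the abstract reports for hydrophilic drops.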

  4. Leveraging Simulation Against the F-16 Flying Training Gap

    DTIC Science & Technology

    2005-11-01

    must leverage emerging simulation technology into combined flight training to counter mission employment complexity created by technology itself...two or more of these stand-alone simulators creates a mission training center (MTC), which when further networked create distributed mission...operations (DMO). Ultimately, the grand operational vision of DMO is to interconnect non-collocated users creating a “virtual” joint training environment

  5. Fractional Transport in Strongly Turbulent Plasmas.

    PubMed

    Isliker, Heinz; Vlahos, Loukas; Constantinescu, Dana

    2017-07-28

We analyze statistically the energization of particles in a large-scale environment of strong turbulence that is fragmented into a large number of distributed current filaments. The turbulent environment is generated through strongly perturbed, 3D, resistive magnetohydrodynamics simulations, and it emerges naturally from the nonlinear evolution, without a specific reconnection geometry being set up. Based on test-particle simulations, we estimate the transport coefficients in energy space for use in the classical Fokker-Planck (FP) equation, and we show that the latter fails to reproduce the simulation results. The reason is that transport in energy space is highly anomalous (strange): the particles perform Levy flights, and the energy distributions show extended power-law tails. We then motivate the use of, and derive the specific form of, a fractional transport equation (FTE); we determine its parameters and the order of the fractional derivatives from the simulation data, and we show that the FTE is able to reproduce the high-energy part of the simulation data very well. The procedure for determining the FTE parameters also makes clear that it is the analysis of the simulation data that allows us to decide whether a classical FP equation or an FTE is appropriate.

  6. Fractional Transport in Strongly Turbulent Plasmas

    NASA Astrophysics Data System (ADS)

    Isliker, Heinz; Vlahos, Loukas; Constantinescu, Dana

    2017-07-01

We analyze statistically the energization of particles in a large-scale environment of strong turbulence that is fragmented into a large number of distributed current filaments. The turbulent environment is generated through strongly perturbed, 3D, resistive magnetohydrodynamics simulations, and it emerges naturally from the nonlinear evolution, without a specific reconnection geometry being set up. Based on test-particle simulations, we estimate the transport coefficients in energy space for use in the classical Fokker-Planck (FP) equation, and we show that the latter fails to reproduce the simulation results. The reason is that transport in energy space is highly anomalous (strange): the particles perform Levy flights, and the energy distributions show extended power-law tails. We then motivate the use of, and derive the specific form of, a fractional transport equation (FTE); we determine its parameters and the order of the fractional derivatives from the simulation data, and we show that the FTE is able to reproduce the high-energy part of the simulation data very well. The procedure for determining the FTE parameters also makes clear that it is the analysis of the simulation data that allows us to decide whether a classical FP equation or an FTE is appropriate.
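Estimating Fokker-Planck transport coefficients from test-particle data, as described above, typically means computing conditional moments of the energy increments over a short lag: drift from the mean increment and diffusion from its variance. The sketch below uses synthetic Gaussian increments with known drift and diffusion in place of the paper's MHD test-particle runs, purely to illustrate the estimator:

```python
# Sketch: estimate FP drift and diffusion coefficients from energy
# increments, a = <dE>/dt and D = Var(dE)/(2*dt). Synthetic data with a
# known ground truth stand in for real test-particle tracks.
import random

random.seed(7)
dt = 0.1
a_true, D_true = 0.5, 0.2       # ground-truth drift and diffusion
increments = [a_true * dt + random.gauss(0.0, (2 * D_true * dt) ** 0.5)
              for _ in range(50000)]

mean = sum(increments) / len(increments)
drift = mean / dt                                            # <dE>/dt
diff = (sum((x - mean) ** 2 for x in increments)
        / len(increments) / (2 * dt))                        # Var(dE)/(2*dt)

print(drift, diff)   # should be close to a_true and D_true
```

For genuinely diffusive data this recovers the coefficients well; the abstract's point is that when increments have power-law tails, these finite moments no longer characterize the transport, and a fractional equation is needed instead.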

  7. Simulating the decentralized processes of the human immune system in a virtual anatomy model.

    PubMed

    Sarpe, Vladimir; Jacob, Christian

    2013-01-01

Many physiological processes within the human body can be perceived and modeled as large systems of interacting particles or swarming agents. The complex processes of the human immune system prove to be challenging to capture and illustrate without proper reference to the spatial distribution of immune-related organs and systems. Our work focuses on physical aspects of immune system processes, which we implement through swarms of agents. This is our first prototype for integrating different immune processes into one comprehensive virtual physiology simulation. Using agent-based methodology and a 3-dimensional modeling and visualization environment (LINDSAY Composer), we present an agent-based simulation of the decentralized processes in the human immune system. The agents in our model - such as immune cells, viruses, and cytokines - interact through simulated physics in two different, compartmentalized and decentralized 3-dimensional environments: (1) within the tissue and (2) inside a lymph node. While the two environments are separated and perform their computations asynchronously, an abstract form of communication is allowed in order to replicate the exchange, transportation, and interaction of immune system agents between these sites. The distribution of simulated processes, which can communicate across multiple local CPUs or through a network of machines, provides a starting point for building decentralized systems that replicate larger-scale processes within the human body, thus creating integrated simulations with other physiological systems, such as the circulatory, endocrine, or nervous system. Ultimately, this system integration across scales is our goal for the LINDSAY Virtual Human project. Our current immune system simulations extend our previous work on agent-based simulations by introducing advanced visualizations within the context of a virtual human anatomy model.
We also demonstrate how to distribute a collection of connected simulations over a network of computers. As a future endeavour, we plan to use parameter tuning techniques on our model to further enhance its biological credibility. We consider these in silico experiments and their associated modeling and optimization techniques as essential components in further enhancing our capabilities of simulating a whole-body, decentralized immune system, to be used both for medical education and research as well as for virtual studies in immunoinformatics.

  8. IS THE SIZE DISTRIBUTION OF URBAN AEROSOLS DETERMINED BY THERMODYNAMIC EQUILIBRIUM? (R826371C005)

    EPA Science Inventory

    A size-resolved equilibrium model, SELIQUID, is presented and used to simulate the size–composition distribution of semi-volatile inorganic aerosol in an urban environment. The model uses the efflorescence branch of aerosol behavior to predict the equilibrium partitioni...

  9. On the anisotropic satellite distribution around Milky-way-like galaxies in cosmological simulations.

    NASA Astrophysics Data System (ADS)

    Kihm, Seoneui; Seo, Seongu; Yoon, Suk-jin

    2018-01-01

The presence of an "anisotropic satellite distribution (ASD)" around massive galaxies is often taken as evidence against the ΛCDM cosmology. To address whether such anisotropy can be reconciled with the standard cosmology, we examine the spatial distributions of satellites around central galaxies in the hydrodynamic cosmological simulation Illustris. In an attempt to understand the ASD of our Galaxy, we limit our analysis to systems consisting of a MW-sized host and at least 11 satellites. We find that ASDs are a rather common feature in the simulation and that ASD systems tend to possess a larger fraction of recently accreted satellites than isotropic systems. We discuss a possible link between ASD formation and the surrounding environment in the ΛCDM setting.

  10. Minimization of Blast furnace Fuel Rate by Optimizing Burden and Gas Distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dr. Chenn Zhou

    2012-08-15

The goal of the research is to improve the competitive edge of steel mills by using advanced CFD technology to optimize the gas and burden distributions inside a blast furnace to achieve the best gas utilization. A state-of-the-art 3-D CFD model has been developed for simulating the gas distribution inside a blast furnace at given burden conditions, burden distribution, and blast parameters. The comprehensive 3-D CFD model has been validated against plant measurement data from an actual blast furnace, and validation of the sub-models has also been achieved. A user-friendly software package named Blast Furnace Shaft Simulator (BFSS) has been developed to simulate the blast furnace shaft process. The research brings significant benefits to the steel industry: higher productivity, lower energy consumption, and reduced environmental impact.

  11. Distributed interactive virtual environments for collaborative experiential learning and training independent of distance over Internet2.

    PubMed

    Alverson, Dale C; Saiki, Stanley M; Jacobs, Joshua; Saland, Linda; Keep, Marcus F; Norenberg, Jeffrey; Baker, Rex; Nakatsu, Curtis; Kalishman, Summers; Lindberg, Marlene; Wax, Diane; Mowafi, Moad; Summers, Kenneth L; Holten, James R; Greenfield, John A; Aalseth, Edward; Nickles, David; Sherstyuk, Andrei; Haines, Karen; Caudell, Thomas P

    2004-01-01

Medical knowledge and skills essential for tomorrow's healthcare professionals continue to change faster than ever before, creating new demands in medical education. Project TOUCH (Telehealth Outreach for Unified Community Health) has been developing methods to enhance learning by coupling innovations in medical education with advanced technology in high performance computing and next generation Internet2 embedded in virtual reality environments (VRE), artificial intelligence and experiential active learning. Simulations have been used in education and training to allow learners to make mistakes safely in lieu of real-life situations, learn from those mistakes and ultimately improve performance by subsequent avoidance of those mistakes. Distributed virtual interactive environments are used over distance to enable learning and participation in dynamic, problem-based, clinical, artificial intelligence rules-based, virtual simulations. The virtual reality patient is programmed to dynamically change over time and respond to the manipulations by the learner. Participants are fully immersed within the VRE platform using a head-mounted display and tracker system. Navigation, locomotion and handling of objects are accomplished using a joy-wand. Distribution is managed via the Internet2 Access Grid using point-to-point or multi-casting connectivity through which the participants can interact. Medical students in Hawaii and New Mexico (NM) participated collaboratively in problem solving and management of a simulated patient with a closed head injury in the VRE, dividing tasks, handing off objects, and functioning as a team. Students stated that opportunities to make mistakes and repeat actions in the VRE were extremely helpful in learning specific principles. VRE created higher performance expectations and some anxiety among VRE users. VRE orientation was adequate but students needed time to adapt and practice in order to improve efficiency.
This was also demonstrated successfully between Western Australia and UNM. We successfully demonstrated the ability to fully immerse participants in a distributed virtual environment independent of distance for collaborative team interaction in medical simulation designed for education and training. The ability to make mistakes in a safe environment is well received by students and has a positive impact on their understanding, as well as memory of the principles involved in correcting those mistakes. Bringing people together as virtual teams for interactive experiential learning and collaborative training, independent of distance, provides a platform for distributed "just-in-time" training, performance assessment and credentialing. Further validation is necessary to determine the potential value of the distributed VRE in knowledge transfer and improved future performance, and should entail training participants to competence in using these tools.

  12. Encapsulating model complexity and landscape-scale analyses of state-and-transition simulation models: an application of ecoinformatics and juniper encroachment in sagebrush steppe ecosystems

    USGS Publications Warehouse

    O'Donnell, Michael

    2015-01-01

State-and-transition simulation modeling relies on knowledge of vegetation composition and structure (states) that describe community conditions, mechanistic feedbacks such as fire that can affect vegetation establishment, and ecological processes that drive community conditions as well as the transitions between these states. However, as the need for modeling larger and more complex landscapes increases, a more advanced awareness of computing resources becomes essential. The objectives of this study include identifying challenges of executing state-and-transition simulation models, identifying common bottlenecks in computing resources, developing a workflow and software that enable parallel processing of Monte Carlo simulations, and identifying the advantages and disadvantages of different computing resources. To address these objectives, this study used the ApexRMS® SyncroSim software and embarrassingly parallel tasks of Monte Carlo simulations on a single multicore computer and on distributed computing systems. The results demonstrated that state-and-transition simulation models scale best in distributed computing environments, such as high-throughput and high-performance computing, because these environments disseminate the workloads across many compute nodes, thereby supporting analysis of larger landscapes, higher spatial resolution vegetation products, and more complex models. Using a case study and five different computing environments, the best result (high-throughput computing versus serial computation) showed an approximately 96.6% decrease in computing time. With a single multicore compute node (the slowest parallel option), computing time decreased by 81.8% relative to serial computation. These results provide insight into the tradeoffs of using different computing resources when research necessitates advanced integration of ecoinformatics incorporating large and complicated data inputs and models.
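The "embarrassingly parallel" structure described above, where each Monte Carlo replicate is fully independent, can be sketched with a process pool. The stand-in "simulation" below is a minimal two-state transition model, not SyncroSim, and all parameter values are illustrative:

```python
# Embarrassingly parallel Monte Carlo replicates: each replicate is
# independent, so replicates can be farmed out to separate processes (or,
# at larger scale, to high-throughput computing nodes).
import random
from concurrent.futures import ProcessPoolExecutor

def one_replicate(seed, steps=1000, p_transition=0.001):
    """One stochastic state-and-transition trajectory; returns final state."""
    rng = random.Random(seed)
    state = 0                      # e.g. 0 = sagebrush steppe, 1 = juniper
    for _ in range(steps):
        if state == 0 and rng.random() < p_transition:
            state = 1              # one-way encroachment transition
    return state

def run_replicates(n, workers=2):
    """Dispatch n independent replicates across a pool of worker processes."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(one_replicate, range(n)))

if __name__ == "__main__":
    results = run_replicates(8)
    print("fraction transitioned:", sum(results) / len(results))
```

Because replicates share no state, the speedup scales with the number of workers until I/O or scheduling overhead dominates, which matches the high-throughput-versus-single-node tradeoff the study reports.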

  13. LiveInventor: An Interactive Development Environment for Robot Autonomy

    NASA Technical Reports Server (NTRS)

    Neveu, Charles; Shirley, Mark

    2003-01-01

    LiveInventor is an interactive development environment for robot autonomy developed at NASA Ames Research Center. It extends the industry-standard OpenInventor graphics library and scenegraph file format to include kinetic and kinematic information, a physics-simulation library, an embedded Scheme interpreter, and a distributed communication system.

  14. BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments

    DOE PAGES

    Thomas, Brandon R.; Chylek, Lily A.; Colvin, Joshua; ...

    2015-11-09

Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. Here we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e., optimization of parameter values for consistency with data) when simulations are computationally expensive.
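The fitting workflow such a tool automates can be sketched as: propose parameter sets, run the (expensive) simulator for each, score against data, and keep the best. Each scoring call is independent, which is what makes distributed execution pay off. The "simulator" below is a stand-in exponential-decay model, not BioNetGen/NFsim, and the random-search strategy is only one simple choice:

```python
# Hedged sketch of a distributed-friendly fitting loop: independent
# objective evaluations over candidate parameter sets. Illustrative only.
import math
import random

random.seed(3)
times = [0.0, 1.0, 2.0, 3.0, 4.0]
k_true = 0.7
data = [math.exp(-k_true * t) for t in times]   # noiseless target "data"

def simulate(k):
    """Stand-in for an expensive rule-based model simulation run."""
    return [math.exp(-k * t) for t in times]

def objective(k):
    """Sum-of-squares distance between model output and data."""
    return sum((m - d) ** 2 for m, d in zip(simulate(k), data))

# Random search: in a distributed fitting workflow, these independent
# objective() calls would be dispatched to separate cluster workers.
candidates = [random.uniform(0.0, 2.0) for _ in range(200)]
best_k = min(candidates, key=objective)
```

With enough candidates, `best_k` lands close to the decay rate used to generate the data; real tools replace blind random search with evolutionary or swarm strategies, but the parallel structure is the same.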

  15. The Structure of Vertical Wind Shear in Tropical Cyclone Environments: Implications for Forecasting and Predictability

    NASA Astrophysics Data System (ADS)

    Finocchio, Peter M.

    The vertical wind shear measured between 200 and 850 hPa is commonly used to diagnose environmental interactions with a tropical cyclone (TC) and to forecast the storm's intensity and structural evolution. More often than not, stronger vertical shear within this deep layer prohibits the intensification of TCs and leads to predictable asymmetries in precipitation. But such bulk measures of vertical wind shear can occasionally mislead the forecaster. In the first part of this dissertation, we use a series of idealized numerical simulations to examine how a TC responds to changing the structure of unidirectional vertical wind shear while fixing the 200-850-hPa shear magnitude. These simulations demonstrate a significant intensity response, in which shear concentrated in shallow layers of the lower troposphere prevents vortex intensification. We attribute the arrested development of TCs in lower-level shear to the intrusion of mid-level environmental air over the surface vortex early in the simulations. Convection developing on the downshear side of the storm interacts with the intruding air so as to enhance the downward flux of low-entropy air into the boundary layer. We also construct a two-dimensional intensity response surface from a set of simulations that sparsely sample the joint shear height-depth parameter space. This surface reveals regions of the two-parameter space for which TC intensity is particularly sensitive. We interpret these parameter ranges as those which lead to reduced intensity predictability. Despite the robust response to changing the shape of a sheared wind profile in idealized simulations, we do not encounter such sensitivity within a large set of reanalyzed TCs in the Northern Hemisphere. Instead, there is remarkable consistency in the structure of reanalyzed wind profiles around TCs. 
This is evident in the distributions of two new parameters describing the height and depth of vertical wind shear, which highlight a clear preference for shallow layers of upper-level shear. Many of the wind profiles tested in the idealized simulations have shear height or depth values on the tails of these distributions, suggesting that the environmental wind profiles around real TCs do not exhibit enough structural variability to have the clear statistical relationship to intensity change that we expected. In the final part of this dissertation, we use the reanalyzed TC environments to initialize ensembles of idealized simulations. Using a new modeling technique that allows for time-varying environments, these simulations examine the predictability implications of exposing a TC to different structures and magnitudes of vertical wind shear during its life cycle. We find that TCs in more deeply distributed vertical wind shear environments have a more uncertain intensity evolution than TCs exposed to shallower layers of upper-level shear. This higher uncertainty arises from a more marginal boundary layer environment that the deeply distributed shear establishes, which enhances the TC sensitivity to the magnitude of deep-layer shear. Simulated radar reflectivity also appears to evolve in a more uncertain fashion in environments with deeply distributed vertical shear. However, structural predictability timescales, computed as the time it takes for errors in the amplitude or phase of azimuthal asymmetries of reflectivity to saturate, are similar for wind profiles with shallow upper-level shear and deeply distributed shear. Both ensembles demonstrate predictability timescales of two to three days for the lowest azimuthal wavenumbers of amplitude and phase. As the magnitude of vertical wind shear increases to universally destructive levels, structural and intensity errors begin to decrease. 
Shallow upper-level shear primes the TC for a more pronounced recovery in the predictability of the wavenumber-one precipitation structure in stronger shear. The recovered low-wavenumber predictability of TC precipitation structure and the collapse in intensity spread in strong shear suggest that vertical wind shear is most effective at reducing TC predictability when its magnitude is near the threshold between favorable and unfavorable values and when it is deeply distributed through the troposphere. By isolating the effect of the environmental flow, the simulations and analyses in this dissertation offer a unique understanding of how vertical wind shear affects TCs. In particular, the results have important implications for designing and implementing future environmental observing strategies that will be critical for improving forecasts of these destructive storms.
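The bulk measure discussed in the abstract is simply the magnitude of the vector wind difference between the 200- and 850-hPa levels. A minimal sketch (the function name and sample winds are illustrative, not taken from the dissertation):

```python
import math

def bulk_shear(u850, v850, u200, v200):
    """Magnitude (m/s) of the 200-850-hPa vector wind difference,
    the deep-layer bulk shear measure discussed in the abstract."""
    return math.hypot(u200 - u850, v200 - v850)

# Example: 5 m/s westerly at 850 hPa, 20 m/s westerly at 200 hPa
# gives 15 m/s of unidirectional deep-layer shear.
shear = bulk_shear(5.0, 0.0, 20.0, 0.0)
```

Note that this single number says nothing about where in the column the shear is concentrated, which is exactly the structural information the dissertation argues matters.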

  16. Measurement and Validation of Bidirectional Reflectance of Space Shuttle and Space Station Materials for Computerized Lighting Models

    NASA Technical Reports Server (NTRS)

    Fletcher, Lauren E.; Aldridge, Ann M.; Wheelwright, Charles; Maida, James

    1997-01-01

Task illumination has a major impact on human performance: What a person can perceive in his environment significantly affects his ability to perform tasks, especially in space's harsh environment. Training for lighting conditions in space has long depended on physical models and simulations to emulate the effect of lighting, but such tests are expensive and time-consuming. To evaluate lighting conditions not easily simulated on Earth, personnel at NASA Johnson Space Center's (JSC) Graphics Research and Analysis Facility (GRAF) have been developing computerized simulations of various illumination conditions using the ray-tracing program, Radiance, developed by Greg Ward at Lawrence Berkeley Laboratory. Because these computer simulations are only as accurate as the data used, accurate information about the reflectance properties of materials and light distributions is needed. JSC's Lighting Environment Test Facility (LETF) personnel gathered material reflectance properties for a large number of paints, metals, and cloths used in the Space Shuttle and Space Station programs, and processed these data into reflectance parameters needed for the computer simulations. They also gathered lamp distribution data for most of the light sources used, and validated the ability to accurately simulate lighting levels by comparing predictions with measurements for several ground-based tests. The result of this study is a database of material reflectance properties for a wide variety of materials, and lighting information for most of the standard light sources used in the Shuttle/Station programs. The combination of the Radiance program and GRAF's graphics capability forms a validated computerized lighting simulation capability for NASA.
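Lighting simulation of this kind evaluates a reflectance model from measured material parameters. As an illustrative sketch, the isotropic Ward model (the BRDF family associated with Greg Ward and Radiance) can be evaluated as below; the parameter values and function name are hypothetical, and this is not the paper's actual measurement pipeline:

```python
import math

def ward_isotropic(rho_d, rho_s, alpha, theta_i, theta_o, theta_h):
    """Isotropic Ward reflectance model (sr^-1); angles in radians.
    rho_d: diffuse albedo, rho_s: specular albedo, alpha: roughness.
    theta_i/theta_o: incident/outgoing zenith angles; theta_h: half-angle."""
    diffuse = rho_d / math.pi
    specular = (rho_s * math.exp(-math.tan(theta_h) ** 2 / alpha ** 2)
                / (4.0 * math.pi * alpha ** 2
                   * math.sqrt(math.cos(theta_i) * math.cos(theta_o))))
    return diffuse + specular
```

At normal incidence the specular lobe peaks at rho_s / (4 * pi * alpha**2), so smoother surfaces (smaller alpha) yield a sharper, brighter highlight for the same measured albedo.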

  17. mGrid: A load-balanced distributed computing environment for the remote execution of the user-defined Matlab code

    PubMed Central

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-01-01

Background: Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results: mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion: Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple.
Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet. PMID:16539707
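The automated load balancing described above can be illustrated with a greedy least-loaded scheduler: each incoming task goes to whichever worker currently carries the lightest load. This is a generic sketch of the idea, not mGrid's actual PHP/Matlab implementation, and all names are illustrative:

```python
import heapq

def assign_tasks(task_costs, n_workers):
    """Greedy least-loaded scheduling: task i (with estimated cost
    task_costs[i]) is assigned to the currently lightest worker.
    Returns {worker_id: [task indices]}."""
    heap = [(0.0, w) for w in range(n_workers)]  # (accumulated load, worker)
    heapq.heapify(heap)
    assignment = {w: [] for w in range(n_workers)}
    for task, cost in enumerate(task_costs):
        load, w = heapq.heappop(heap)       # lightest worker so far
        assignment[w].append(task)
        heapq.heappush(heap, (load + cost, w))
    return assignment
```

With costs [5, 4, 3, 2, 1] and two workers, the heavier tasks alternate between workers and the final loads stay within one unit of each other.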

  18. mGrid: a load-balanced distributed computing environment for the remote execution of the user-defined Matlab code.

    PubMed

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-03-15

Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple.
Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet.

  19. Simulation modeling for the health care manager.

    PubMed

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
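The Monte Carlo approach these tools share can be sketched with a single-server wait-time model: interarrival and service times are sampled from probability distributions, and each patient's wait follows from the backlog left by the previous patient (the Lindley recursion). A toy illustration, not drawn from any specific package; the distributions chosen are arbitrary:

```python
import random

def simulate_waits(interarrival, service):
    """Waiting times at a single service point via the Lindley recursion:
    W[k] = max(0, W[k-1] + S[k-1] - A[k]), where A[k] is the gap before
    patient k arrives. interarrival[0] is unused (first patient waits 0)."""
    waits = [0.0]
    for k in range(1, len(service)):
        waits.append(max(0.0, waits[-1] + service[k - 1] - interarrival[k]))
    return waits

# Monte Carlo run: sample arrival gaps and service times from distributions.
random.seed(1)
n = 1000
gaps = [random.expovariate(1 / 10) for _ in range(n)]   # ~10 min apart
svc = [random.triangular(5, 15, 9) for _ in range(n)]   # 5-15 min service
mean_wait = sum(simulate_waits(gaps, svc)) / n
```

Replacing the sampling distributions and adding resources turns this skeleton into the staffing and patient-flow questions the article describes.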

  20. Electrical properties study under radiation of the 3D-open-shell-electrode detector

    NASA Astrophysics Data System (ADS)

    Liu, Manwen; Li, Zheng

    2018-05-01

Since the 3D-Open-Shell-Electrode Detector (3DOSED) has been proposed and its structure optimized, it is important to study the 3DOSED's electrical properties to determine the detector's working performance, especially in heavy radiation environments such as the Large Hadron Collider (LHC) and its upgrade, the High Luminosity LHC (HL-LHC) at CERN. In this work, full 3D technology computer-aided design (TCAD) simulations have been performed on this novel silicon detector structure. Simulated detector properties include the electric field distribution, the electric potential distribution, current-voltage (I-V) characteristics, capacitance-voltage (C-V) characteristics, charge collection properties, and the full depletion voltage. Through analysis of the calculations and simulation results, we find that the 3DOSED's electric field and potential distributions are very uniform, with only small perturbations even in the tiny region near the shell openings. The novel detector fits its design purpose of collecting charges generated by particles/light, with a well-defined funnel-shaped electric potential distribution that drives these charges toward the central collection electrode. Furthermore, by analyzing the I-V and C-V characteristics, charge collection properties, and full depletion voltage, we can expect that the novel detector will perform well, even in heavy radiation environments.

  1. Modelling the effect of diffuse light on canopy photosynthesis in controlled environments

    NASA Technical Reports Server (NTRS)

    Cavazzoni, James; Volk, Tyler; Tubiello, Francesco; Monje, Oscar; Janes, H. W. (Principal Investigator)

    2002-01-01

A layered canopy model was used to analyze the effects of diffuse light on canopy gross photosynthesis in controlled environment plant growth chambers, where, in contrast to the field, highly diffuse light can occur at high irradiance. The model suggests that high diffuse light fractions (approximately 0.7) and irradiance (1400 micromoles m-2 s-1) may enhance crop life-cycle canopy gross photosynthesis for hydroponic wheat by about 20% compared to direct light at the same irradiance. Our simulations suggest that high accuracy is not needed in specifying diffuse light fractions in chambers between approximately 0.7 and 1, because simulated photosynthesis for closed canopies plateaus in this range. We also examined the effect of leaf angle distribution on canopy photosynthesis under growth chamber conditions, as these distributions determine canopy extinction coefficients for direct and diffuse light. We show that the spherical leaf angle distribution is not suitable for modeling photosynthesis of planophile canopies (e.g., soybean and peanut) in growth chambers. Also, the absorption of the light reflected from the surface below the canopy should generally be included in model simulations, as the corresponding albedo values in the photosynthetically active range may be quite high in growth chambers (e.g., approximately 0.5). In addition to the modeling implications, our results suggest that diffuse light conditions should be considered when drawing conclusions from experiments in controlled environments.
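A layered canopy model of this general kind can be sketched as Beer's-law light attenuation with separate extinction coefficients for direct and diffuse light, plus a saturating leaf photosynthesis response, summed over layers. All parameter values below are illustrative defaults, not those of the paper's model:

```python
import math

def canopy_gross_photosynthesis(I0, f_diffuse, LAI, k_dir=0.5, k_dif=0.7,
                                Pmax=30.0, K=300.0, n_layers=50):
    """Layered-canopy sketch: irradiance I0 is split into direct and
    diffuse streams, each attenuated by Beer's law with its own
    extinction coefficient; leaf response is a rectangular hyperbola.
    Returns canopy gross photosynthesis (arbitrary units)."""
    dL = LAI / n_layers
    total = 0.0
    for i in range(n_layers):
        L = (i + 0.5) * dL  # cumulative leaf area index above this layer
        I_dir = I0 * (1 - f_diffuse) * k_dir * math.exp(-k_dir * L)
        I_dif = I0 * f_diffuse * k_dif * math.exp(-k_dif * L)
        I_leaf = I_dir + I_dif                   # absorbed per leaf area
        total += Pmax * I_leaf / (I_leaf + K) * dL
    return total
```

Because the leaf response saturates, spreading the same irradiance more evenly over the canopy (higher diffuse fraction) can raise the whole-canopy total, which is the qualitative effect the abstract reports.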

  2. Design Patterns for Learning and Assessment: Facilitating the Introduction of a Complex Simulation-Based Learning Environment into a Community of Instructors

    NASA Astrophysics Data System (ADS)

    Frezzo, Dennis C.; Behrens, John T.; Mislevy, Robert J.

    2010-04-01

    Simulation environments make it possible for science and engineering students to learn to interact with complex systems. Putting these capabilities to effective use for learning, and assessing learning, requires more than a simulation environment alone. It requires a conceptual framework for the knowledge, skills, and ways of thinking that are meant to be developed, in order to design activities that target these capabilities. The challenges of using simulation environments effectively are especially daunting in dispersed social systems. This article describes how these challenges were addressed in the context of the Cisco Networking Academies with a simulation tool for computer networks called Packet Tracer. The focus is on a conceptual support framework for instructors in over 9,000 institutions around the world for using Packet Tracer in instruction and assessment, by learning to create problem-solving scenarios that are at once tuned to the local needs of their students and consistent with the epistemic frame of "thinking like a network engineer." We describe a layered framework of tools and interfaces above the network simulator that supports the use of Packet Tracer in the distributed community of instructors and students.

  3. AN-CASE NET-CENTRIC modeling and simulation

    NASA Astrophysics Data System (ADS)

    Baskinger, Patricia J.; Chruscicki, Mary Carol; Turck, Kurt

    2009-05-01

The objective of mission training exercises is to immerse the trainees into an environment that enables them to train like they would fight. The integration of modeling and simulation environments that can seamlessly leverage Live systems and Virtual or Constructive models (LVC) as they become available offers a flexible and cost-effective solution to extending the "war-gaming" environment to a realistic mission experience while evolving the development of the net-centric enterprise. From concept to full production, the impact of new capabilities on the infrastructure and concept of operations can be assessed in the context of the enterprise, while also exposing them to the warfighter. Training is extended to tomorrow's tools, processes, and Tactics, Techniques and Procedures (TTPs). This paper addresses the challenges of a net-centric modeling and simulation environment that is capable of representing a net-centric enterprise. An overview of the Air Force Research Laboratory's (AFRL) Airborne Networking Component Architecture Simulation Environment (AN-CASE) is provided, as well as a discussion of how it is being used to assess technologies for the purpose of experimenting with new infrastructure mechanisms that enhance the scalability and reliability of the distributed mission operations environment.

  4. Online production validation in a HEP environment

    NASA Astrophysics Data System (ADS)

    Harenberg, T.; Kuhl, T.; Lang, N.; Mättig, P.; Sandhoff, M.; Schwanenberger, C.; Volkmer, F.

    2017-03-01

In high energy physics (HEP) event simulations, petabytes of data are processed and stored, requiring millions of CPU-years. This enormous demand for computing resources is handled by centers distributed worldwide, which form part of the LHC computing grid. The consumption of such a large amount of resources demands efficient production of simulations and early detection of potential errors. In this article we present a new monitoring framework for grid environments, which polls a measure of data quality during job execution. This online monitoring facilitates the early detection of configuration errors (especially in simulation parameters), and may thus contribute to significant savings in computing resources.

  5. Experimental and computational study of the effect of 1 atm background gas on nanoparticle generation in femtosecond laser ablation of metals

    NASA Astrophysics Data System (ADS)

    Wu, Han; Wu, Chengping; Zhang, Nan; Zhu, Xiaonong; Ma, Xiuquan; Zhigilei, Leonid V.

    2018-03-01

    Laser ablation of metal targets is actively used for generation of chemically clean nanoparticles for a broad range of practical applications. The processes involved in the nanoparticle formation at all relevant spatial and temporal scales are still not fully understood, making the precise control of the size and shape of the nanoparticles challenging. In this paper, a combination of molecular dynamics simulations and experiments is applied to investigate femtosecond laser ablation of aluminum targets in vacuum and in 1 atm argon background gas. The results of the simulations reveal a strong effect of the background gas environment on the initial plume expansion and evolution of the nanoparticle size distribution. The suppression of the generation of small/medium-size Al clusters and formation of a dense layer at the front of the expanding ablation plume, observed during the first nanosecond of the plume expansion in a simulation performed in the gas environment, have important implications on the characteristics of the nanoparticles deposited on a substrate and characterized in the experiments. The nanoparticles deposited in the gas environment are found to be more round-shaped and less flattened as compared to those deposited in vacuum. The nanoparticle size distributions exhibit power-law dependences with similar values of exponents obtained from fitting experimental and simulated data. Taken together, the results of this study suggest that the gas environment may be effectively used to control size and shape of nanoparticles generated by laser ablation.
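The power-law size distributions mentioned above are commonly fit by least squares in log-log space, where a power law N(d) ~ d**(-a) becomes a straight line of slope -a. A generic sketch (this is a standard estimator, not the specific fitting procedure of the paper):

```python
import math

def power_law_exponent(sizes, counts):
    """Estimate the exponent a of a power-law size distribution
    N(d) ~ d**(-a) by ordinary least squares in log-log space."""
    xs = [math.log(s) for s in sizes]
    ys = [math.log(c) for c in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope  # the exponent a
```

Maximum-likelihood estimators are preferred for noisy tails, but the log-log slope is the quickest way to compare exponents between experimental and simulated distributions as the abstract does.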

  6. Comparisons between GRNTRN simulations and beam measurements of proton lateral broadening distributions

    NASA Astrophysics Data System (ADS)

    Mertens, Christopher; Moyers, Michael; Walker, Steven; Tweed, John

    Recent developments in NASA's High Charge and Energy Transport (HZETRN) code have included lateral broadening of primary ion beams due to small-angle multiple Coulomb scattering, and coupling of the ion-nuclear scattering interactions with energy loss and straggling. The new version of HZETRN based on Green function methods, GRNTRN, is suitable for modeling transport with both space environment and laboratory boundary conditions. Multiple scattering processes are a necessary extension to GRNTRN in order to accurately model ion beam experiments, to simulate the physical and biological-effective radiation dose, and to develop new methods and strategies for light ion radiation therapy. In this paper we compare GRNTRN simulations of proton lateral scattering distributions with beam measurements taken at Loma Linda Medical University. The simulated and measured lateral proton distributions will be compared for a 250 MeV proton beam on aluminum, polyethylene, polystyrene, bone, iron, and lead target materials.
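Small-angle multiple Coulomb scattering of the kind modeled here is commonly estimated with the PDG Highland formula for the rms projected scattering angle. A sketch for a 250 MeV (kinetic energy) proton in aluminum; the 1 cm thickness is an arbitrary example, not a value from the paper:

```python
import math

def highland_theta0(p_mev, beta, z, x_over_X0):
    """PDG Highland estimate of the rms projected multiple-Coulomb-
    scattering angle (radians) after traversing x/X0 radiation lengths.
    p_mev: momentum in MeV/c; beta: v/c; z: projectile charge number."""
    return (13.6 / (beta * p_mev)) * z * math.sqrt(x_over_X0) \
        * (1.0 + 0.038 * math.log(x_over_X0))

# Relativistic kinematics for a 250 MeV-kinetic-energy proton
m_p = 938.272                       # proton rest mass, MeV
E = m_p + 250.0                     # total energy
p = math.sqrt(E * E - m_p * m_p)    # momentum, ~729 MeV/c
beta = p / E                        # ~0.61
theta0 = highland_theta0(p, beta, 1, 1.0 / 8.897)  # 1 cm of Al, X0 = 8.897 cm
```

Over a slab of thickness t the rms lateral displacement is roughly t * theta0 / sqrt(3), which is the lateral-broadening quantity compared against the beam measurements.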

  7. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME.

    PubMed

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
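The computational cost motivating MOLNs comes from Monte Carlo sampling of stochastic trajectories. The well-mixed special case of such models is simulated with Gillespie's stochastic simulation algorithm (SSA), sketched below for a birth-death process; the spatial (reaction-diffusion) machinery that PyURDME adds is omitted, and all rates are illustrative:

```python
import math
import random

def ssa(birth_rate, death_rate, x0, t_end, rng):
    """Gillespie SSA for a birth-death process: constant-rate birth,
    per-capita death. Returns the (time, population) trajectory; the
    final event may slightly overshoot t_end."""
    t, x = 0.0, x0
    history = [(t, x)]
    while t < t_end:
        a1, a2 = birth_rate, death_rate * x   # reaction propensities
        a0 = a1 + a2
        if a0 == 0:
            break
        t += -math.log(rng.random()) / a0     # exponential waiting time
        if rng.random() * a0 < a1:            # pick which reaction fired
            x += 1
        else:
            x -= 1
        history.append((t, x))
    return history

hist = ssa(10.0, 0.1, 0, 100.0, random.Random(42))
```

Each trajectory is one Monte Carlo sample; thousands of independent runs (the workload MOLNs distributes across a cloud cluster) are needed to estimate distributions of outcomes.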

  8. Analysis on flood generation processes by means of a continuous simulation model

    NASA Astrophysics Data System (ADS)

    Fiorentino, M.; Gioia, A.; Iacobellis, V.; Manfreda, S.

    2006-03-01

In the present research, we exploited continuous hydrological simulation to investigate the key variables responsible for flood peak formation. For this purpose, a distributed hydrological model (DREAM) is used in cascade with a rainfall generator (IRP, Iterated Random Pulse) to simulate a large number of extreme events, providing insight into the main controls of flood generation mechanisms. The investigated variables are those used in theoretically derived probability distributions of floods based on the concept of partial contributing area (e.g. Iacobellis and Fiorentino, 2000). The continuous simulation model is used to investigate the hydrological losses occurring during extreme events, the variability of the source area contributing to the flood peak, and its lag-time. Results suggest interesting simplifications of the theoretical probability distribution of floods according to the different climatic and geomorphologic environments. The study is applied to two basins located in Southern Italy with different climatic characteristics.

  9. Securing Sensitive Flight and Engine Simulation Data Using Smart Card Technology

    NASA Technical Reports Server (NTRS)

    Blaser, Tammy M.

    2003-01-01

NASA Glenn Research Center has developed a smart card prototype capable of encrypting and decrypting disk files required to run a distributed aerospace propulsion simulation. Triple Data Encryption Standard (3DES) encryption is used to secure the sensitive intellectual property on disk before, during, and after simulation execution. The prototype operates as a secure system and maintains its authorized state by safely storing and permanently retaining the encryption keys only on the smart card. The prototype is capable of authenticating a single smart card user and includes pre-simulation and post-simulation tools for analysis and training purposes. The prototype's design is highly generic and can be used to protect any sensitive disk files, with growth capability to run multiple simulations. The NASA computer engineer developed the prototype on an interoperable programming environment to enable porting to other Numerical Propulsion System Simulation (NPSS) capable operating system environments.

  10. We are not the 99 percent: quantifying asphericity in the distribution of Local Group satellites

    NASA Astrophysics Data System (ADS)

    Forero-Romero, Jaime E.; Arias, Verónica

    2018-05-01

    We use simulations to build an analytic probability distribution for the asphericity in the satellite distribution around Local Group (LG) type galaxies in the Lambda Cold Dark Matter (LCDM) paradigm. We use this distribution to estimate the atypicality of the satellite distributions in the LG even when the underlying simulations do not have enough systems fully resembling the LG in terms of its typical masses, separation and kinematics. We demonstrate the method using three different simulations (Illustris-1, Illustris-1-Dark and ELVIS) and a number of satellites ranging from 11 to 15. Detailed results differ greatly among the simulations suggesting a strong influence of the typical DM halo mass, the number of satellites and the simulated baryonic effects. However, there are three common trends. First, at most 2% of the pairs are expected to have satellite distributions with the same asphericity as the LG; second, at most 80% of the pairs have a halo with a satellite distribution as aspherical as in M31; and third, at most 4% of the pairs have a halo with satellite distribution as planar as in the MW. These quantitative results place the LG at the level of a 3σ outlier in the LCDM paradigm. We suggest that understanding the reasons for this atypicality requires quantifying the asphericity probability distribution as a function of halo mass and large scale environment. The approach presented here can facilitate that kind of study and other comparisons between different numerical setups and choices to study satellites around LG pairs in simulations.

  11. Distributed Observer Network

    NASA Technical Reports Server (NTRS)

    Conroy, Michael; Mazzone, Rebecca; Little, William; Elfrey, Priscilla; Mann, David; Mabie, Kevin; Cuddy, Thomas; Loundermon, Mario; Spiker, Stephen; McArthur, Frank

    2010-01-01

    The Distributed Observer network (DON) is a NASA-collaborative environment that leverages game technology to bring three-dimensional simulations to conventional desktop and laptop computers in order to allow teams of engineers working on design and operations, either individually or in groups, to view and collaborate on 3D representations of data generated by authoritative tools such as Delmia Envision, Pro/Engineer, or Maya. The DON takes models and telemetry from these sources and, using commercial game engine technology, displays the simulation results in a 3D visual environment. DON has been designed to enhance accessibility and user ability to observe and analyze visual simulations in real time. A variety of NASA mission segment simulations [Synergistic Engineering Environment (SEE) data, NASA Enterprise Visualization Analysis (NEVA) ground processing simulations, the DSS simulation for lunar operations, and the Johnson Space Center (JSC) TRICK tool for guidance, navigation, and control analysis] were experimented with. Desired functionalities, [i.e. Tivo-like functions, the capability to communicate textually or via Voice-over-Internet Protocol (VoIP) among team members, and the ability to write and save notes to be accessed later] were targeted. The resulting DON application was slated for early 2008 release to support simulation use for the Constellation Program and its teams. Those using the DON connect through a client that runs on their PC or Mac. This enables them to observe and analyze the simulation data as their schedule allows, and to review it as frequently as desired. DON team members can move freely within the virtual world. Preset camera points can be established, enabling team members to jump to specific views. This improves opportunities for shared analysis of options, design reviews, tests, operations, training, and evaluations, and improves prospects for verification of requirements, issues, and approaches among dispersed teams.

  12. C3H7NO2S effect on concrete steel-rebar corrosion in 0.5 M H2SO4 simulating industrial/microbial environment

    NASA Astrophysics Data System (ADS)

    Okeniyi, Joshua Olusegun; Nwadialo, Christopher Chukwuweike; Olu-Steven, Folusho Emmanuel; Ebinne, Samaru Smart; Coker, Taiwo Ebenezer; Okeniyi, Elizabeth Toyin; Ogbiye, Adebanji Samuel; Durotoye, Taiwo Omowunmi; Badmus, Emmanuel Omotunde Oluwasogo

    2017-02-01

This paper investigates the effect of C3H7NO2S (cysteine) on the inhibition of reinforcing steel corrosion in concrete immersed in 0.5 M H2SO4, simulating an industrial/microbial environment. Different C3H7NO2S concentrations were admixed, in duplicates, in steel-reinforced concrete samples that were partially immersed in the acidic sulphate environment. Electrochemical monitoring techniques of open circuit potential, as per ASTM C876-91 R99, and corrosion rate, by linear polarization resistance, were then employed for studying the anticorrosion effect of the organic hydrocarbon admixture in the steel-reinforced concrete samples. Analyses of electrochemical test-data followed ASTM G16-95 R04 prescriptions, including probability distribution modeling with significance testing by Kolmogorov-Smirnov and Student's t-test statistics. Results established that all datasets of corrosion potential distributed like the Normal, the Gumbel and the Weibull distributions, but that only the Weibull model described all the corrosion rate datasets in the study, as per the Kolmogorov-Smirnov test statistics. Results of the Student's t-test showed that differences of corrosion test-data between duplicated samples with the same C3H7NO2S concentrations were not statistically significant. These results indicated that 0.06878 M C3H7NO2S exhibited an optimal inhibition efficiency of η = 90.52±1.29% on reinforcing steel corrosion in the concrete samples immersed in 0.5 M H2SO4, simulating an industrial/microbial service-environment.
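The Kolmogorov-Smirnov goodness-of-fit test used in such studies compares the empirical CDF of the data against a candidate model such as the Weibull distribution. A minimal sketch of the test statistic (critical-value lookup and the paper's actual fitting procedure omitted):

```python
import math

def weibull_cdf(x, shape, scale):
    """CDF of the two-parameter Weibull distribution, x >= 0."""
    return 1.0 - math.exp(-((x / scale) ** shape))

def ks_statistic(data, shape, scale):
    """One-sample Kolmogorov-Smirnov distance between the empirical CDF
    of `data` and a Weibull(shape, scale) model: the largest gap between
    the step-function empirical CDF and the model CDF."""
    xs = sorted(data)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs, start=1):
        f = weibull_cdf(x, shape, scale)
        d = max(d, i / n - f, f - (i - 1) / n)
    return d
```

The fit is rejected when the statistic exceeds the tabulated critical value for the sample size (approximately 1.36 / sqrt(n) at the 5% level for large n).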

  13. Analysis of exposure to electromagnetic fields in a healthcare environment: simulation and experimental study.

    PubMed

    de Miguel-Bilbao, Silvia; Martín, Miguel Angel; Del Pozo, Alejandro; Febles, Victor; Hernández, José A; de Aldecoa, José C Fernández; Ramos, Victoria

    2013-11-01

    Recent advances in wireless technologies have led to an increase in wireless instrumentation present in healthcare centers. This paper presents an analytical method for characterizing electric field (E-field) exposure within these environments. The E-field levels of the different wireless communications systems have been measured in two floors of the Canary University Hospital Consortium (CUHC). The electromagnetic (EM) conditions detected with the experimental measurements have been estimated using the software EFC-400-Telecommunications (Narda Safety Test Solutions, Sandwiesenstrasse 7, 72793 Pfullingen, Germany). The experimental and simulated results are represented through 2D contour maps, and have been compared with the recommended safety and exposure thresholds. The maximum value obtained is much lower than the 3 V m(-1) that is established in the International Electrotechnical Commission Standard of Electromedical Devices. Results show a high correlation in terms of E-field cumulative distribution function (CDF) between the experimental and simulation results. In general, the CDFs of each pair of experimental and simulated samples follow a lognormal distribution with the same mean.
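The lognormal comparison reported above can be sketched generically: if ln(E) is normally distributed, the field-strength CDF follows from the normal CDF in log space, and the parameters can be moment-matched from samples. A minimal illustration, not the paper's EFC-400 workflow:

```python
import math
from statistics import NormalDist

def lognormal_cdf(x, mu, sigma):
    """CDF of a lognormal field-strength distribution: ln(E) is
    Normal(mu, sigma), so F(x) = Phi((ln x - mu) / sigma)."""
    return NormalDist(mu, sigma).cdf(math.log(x))

def fit_lognormal(samples):
    """Moment-match mu and sigma from positive samples in log space."""
    logs = [math.log(s) for s in samples]
    mu = sum(logs) / len(logs)
    var = sum((v - mu) ** 2 for v in logs) / len(logs)
    return mu, math.sqrt(var)
```

Fitting measured and simulated E-field samples separately and overlaying the two CDFs is the kind of agreement check the abstract describes.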

  14. Implementation of Parallel Dynamic Simulation on Shared-Memory vs. Distributed-Memory Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Shuangshuang; Chen, Yousu; Wu, Di

    2015-12-09

    Power system dynamic simulation computes the system response to a sequence of large disturbances, such as sudden changes in generation or load, or a network short circuit followed by protective branch switching operations. It consists of a large set of differential and algebraic equations, which is computationally intensive and challenging to solve using a single-processor-based dynamic simulation solution. High-performance computing (HPC) based parallel computing is a promising technology to speed up the computation and facilitate the simulation process. This paper presents two different parallel implementations of power grid dynamic simulation, using Open Multi-Processing (OpenMP) on a shared-memory platform and Message Passing Interface (MPI) on distributed-memory clusters, respectively. The differences between the parallel simulation algorithms and architectures of the two HPC technologies are illustrated, and their performances for running parallel dynamic simulation are compared and demonstrated.

  15. Molecular dynamics simulations of electrostatics and hydration distributions around RNA and DNA motifs

    NASA Astrophysics Data System (ADS)

    Marlowe, Ashley E.; Singh, Abhishek; Semichaevsky, Andrey V.; Yingling, Yaroslava G.

    2009-03-01

    Nucleic acid nanoparticles can self-assemble through the formation of complementary loop-loop interactions or stem-stem interactions. The presence and concentration of ions can significantly affect the self-assembly process and the stability of the nanostructure. In this presentation we use explicit molecular dynamics simulations to examine the variations in cationic distributions and the hydration environment around DNA and RNA helices and loop-loop interactions. Our simulations show that the potassium and sodium ionic distributions are different around RNA and DNA motifs, which could be indicative of ion-mediated relative stability of loop-loop complexes. Moreover, in RNA loop-loop motifs, ions are consistently present and exchanged through a distinct electronegative channel. We will also show how we used the specific RNA loop-loop motif to design an RNA hexagonal nanoparticle.

  16. Numerical Computation of Electric Field and Potential Along Silicone Rubber Insulators Under Contaminated and Dry Band Conditions

    NASA Astrophysics Data System (ADS)

    Arshad; Nekahi, A.; McMeekin, S. G.; Farzaneh, M.

    2016-09-01

    Electric field distribution along the insulator surface is considered one of the important parameters for the performance evaluation of outdoor insulators. In this paper, numerical simulations were carried out to investigate the electric field and potential distribution along silicone rubber insulators under various polluted and dry band conditions. Simulations were performed using the commercially available simulation package COMSOL Multiphysics, based on the finite element method. Various pollution severity levels were simulated by changing the conductivity of the pollution layer. Dry bands of 2 cm width were inserted at the high voltage end, ground end, middle part, shed, sheath, and at the junction of shed and sheath to investigate the effect of dry band location and width on electric field and potential distribution. Partial pollution conditions were simulated by applying the pollution layer to the top and bottom surfaces, respectively. It was observed from the simulation results that electric field intensity was higher at the metal electrode ends and at the junctions of dry bands. Simulation results showed that the potential distribution is nonlinear in the case of a clean or partially polluted insulator and linear for a uniform pollution layer. Dry band formation affects both the potential and electric field distributions. Power dissipated along the insulator surface and the resultant heat generation were also studied. The results of this study could be useful in the selection of polymeric insulators for contaminated environments.

  17. Particle simulation on heterogeneous distributed supercomputers

    NASA Technical Reports Server (NTRS)

    Becker, Jeffrey C.; Dagum, Leonardo

    1993-01-01

    We describe the implementation and performance of a three-dimensional particle simulation distributed between a Thinking Machines CM-2 and a Cray Y-MP. These are connected by a combination of two high-speed networks: a high-performance parallel interface (HIPPI) and an optical network (UltraNet). This is the first application to use this configuration at NASA Ames Research Center. We describe our experience implementing and using the application and report the results of several timing measurements. We show that the distribution of applications across disparate supercomputing platforms is feasible and has reasonable performance. In addition, several practical aspects of the computing environment are discussed.

  18. Modeling and Simulation of Upset-Inducing Disturbances for Digital Systems in an Electromagnetic Reverberation Chamber

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2014-01-01

    This report describes a modeling and simulation approach for disturbance patterns representative of the environment experienced by a digital system in an electromagnetic reverberation chamber. The disturbance is modeled by a multi-variate statistical distribution based on empirical observations. Extended versions of the Rejection Sampling and Inverse Transform Sampling techniques are developed to generate multi-variate random samples of the disturbance. The results show that Inverse Transform Sampling returns samples with higher fidelity relative to the empirical distribution. This work is part of an ongoing effort to develop a resilience assessment methodology for complex safety-critical distributed systems.
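    The one-dimensional form of Inverse Transform Sampling from an empirical distribution can be sketched as follows. This is the simpler analogue of the multi-variate technique the report develops; the observations here are synthetic stand-ins, not reverberation-chamber data:

```python
import numpy as np

# Stand-in empirical observations (illustrative 1-D data only).
rng = np.random.default_rng(3)
observed = rng.normal(5.0, 2.0, size=1000)

# Empirical CDF: sorted values paired with cumulative probabilities.
sorted_obs = np.sort(observed)
cdf = np.arange(1, sorted_obs.size + 1) / sorted_obs.size

# Inverse Transform Sampling: draw u ~ U(0,1) and map it through F^{-1},
# approximated here by linear interpolation of the empirical CDF.
u = rng.uniform(size=500)
samples = np.interp(u, cdf, sorted_obs)

# The generated samples should reproduce the empirical statistics.
print(round(samples.mean(), 1), round(samples.std(), 1))
```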

  19. BIRD: A general interface for sparse distributed memory simulators

    NASA Technical Reports Server (NTRS)

    Rogers, David

    1990-01-01

    Kanerva's sparse distributed memory (SDM) has now been implemented for at least six different computers, including SUN3 workstations, the Apple Macintosh, and the Connection Machine. A common interface for input of commands would both aid testing of programs on a broad range of computer architectures and assist users in transferring results from research environments to applications. A common interface also allows secondary programs to generate command sequences for a sparse distributed memory, which may then be executed on the appropriate hardware. The BIRD program is an attempt to create such an interface. Simplifying access to different simulators should assist developers in finding appropriate uses for SDM.

  20. Efficient calculation of the energy of a molecule in an arbitrary electric field

    NASA Astrophysics Data System (ADS)

    Pulay, Peter; Janowski, Tomasz

    In thermodynamic (e.g., Monte Carlo) simulations with electronic embedding, the energy of the active site or solute must be calculated for millions of configurations of the environment (solvent or protein matrix) to obtain reliable statistics. This precludes the use of accurate but expensive ab initio and density functional techniques. Except for the immediate neighbors, the effect of the environment is electrostatic. We show that the energy of a molecule in the irregular field of the environment can be determined very efficiently by expanding the electric potential in known functions, and precalculating the first and second order response of the molecule to the components of the potential. These generalized multipole moments and polarizabilities allow the calculation of the energy of the system without further ab initio calculations. Several expansion functions were explored: polynomials, distributed inverse powers, and sine functions. The latter provide the numerically most stable fit but require new types of integrals. Distributed inverse powers can be simulated using dummy atoms, and energies calculated this way provide a very good approximation to the actual energies in the field of the environment.
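    The precompute-then-evaluate idea can be sketched as a quadratic form: once the generalized moments and polarizabilities are tabulated for a fixed expansion basis, the energy in any environment field is a cheap matrix-vector evaluation. All numbers below are illustrative placeholders, and the minus sign on the second-order term is an assumed convention (polarization lowering the energy), not taken from the paper:

```python
import numpy as np

# Hypothetical precomputed quantities (in a real calculation these come
# from ab initio response calculations done once, not from this sketch):
E0 = -76.4                          # energy of the isolated molecule
m = np.array([0.12, -0.05, 0.30])   # generalized multipole moments
P = np.array([[1.1, 0.0, 0.1],      # generalized polarizability matrix
              [0.0, 0.9, 0.0],
              [0.1, 0.0, 1.3]])

def energy_in_field(v):
    """Second-order energy for a potential expanded with coefficients v:
    E(v) = E0 + m.v - (1/2) v.P.v. Evaluating this quadratic form replaces
    a full ab initio calculation for each environment configuration."""
    return E0 + m @ v - 0.5 * (v @ P @ v)

v = np.array([0.02, -0.01, 0.005])  # expansion coefficients of the potential
print(energy_in_field(v))
```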

  1. Spatial Structures of the Environment and of Dispersal Impact Species Distribution in Competitive Metacommunities

    PubMed Central

    Ai, Dexiecuo; Gravel, Dominique; Chu, Chengjin; Wang, Gang

    2013-01-01

    The correspondence between species distribution and the environment depends on species’ ability to track favorable environmental conditions (via dispersal) and to maintain competitive hierarchy against the constant influx of migrants (mass effect) and demographic stochasticity (ecological drift). Here we report a simulation study of the influence of landscape structure on species distribution. We consider lottery competition for space in a spatially heterogeneous environment, where the landscape is represented as a network of localities connected by dispersal. We quantified the contribution of neutrality and species sorting to their spatial distribution. We found that neutrality increases and the strength of species-sorting decreases with the centrality of a community in the landscape when the average dispersal among communities is low, whereas the opposite was found at elevated dispersal. We also found that the strength of species-sorting increases with environmental heterogeneity. Our results illustrate that spatial structure of the environment and of dispersal must be taken into account for understanding species distribution. We stress the importance of spatial geographic structure on the relative importance of niche vs. neutral processes in controlling community dynamics. PMID:23874815

  2. Spatial structures of the environment and of dispersal impact species distribution in competitive metacommunities.

    PubMed

    Ai, Dexiecuo; Gravel, Dominique; Chu, Chengjin; Wang, Gang

    2013-01-01

    The correspondence between species distribution and the environment depends on species' ability to track favorable environmental conditions (via dispersal) and to maintain competitive hierarchy against the constant influx of migrants (mass effect) and demographic stochasticity (ecological drift). Here we report a simulation study of the influence of landscape structure on species distribution. We consider lottery competition for space in a spatially heterogeneous environment, where the landscape is represented as a network of localities connected by dispersal. We quantified the contribution of neutrality and species sorting to their spatial distribution. We found that neutrality increases and the strength of species-sorting decreases with the centrality of a community in the landscape when the average dispersal among communities is low, whereas the opposite was found at elevated dispersal. We also found that the strength of species-sorting increases with environmental heterogeneity. Our results illustrate that spatial structure of the environment and of dispersal must be taken into account for understanding species distribution. We stress the importance of spatial geographic structure on the relative importance of niche vs. neutral processes in controlling community dynamics.

  3. Electron distribution functions in electric field environments

    NASA Technical Reports Server (NTRS)

    Rudolph, Terence H.

    1991-01-01

    The amount of current carried by an electric discharge in its early stages of growth is strongly dependent on its geometrical shape. Discharges with a large number of branches, each funnelling current to a common stem, tend to carry more current than those with fewer branches. The fractal character of typical discharges was simulated using stochastic models based on solutions of the Laplace equation. Extension of these models requires the use of electron distribution functions to describe the behavior of electrons in the undisturbed medium ahead of the discharge. These electrons, interacting with the electric field, determine the propagation of branches in the discharge and the way in which further branching occurs. The first phase in the extension of the referenced models, the calculation of simple electron distribution functions in an air/electric field medium, is discussed. Two techniques are investigated: (1) the solution of the Boltzmann equation in homogeneous, steady state environments, and (2) the use of Monte Carlo simulations. Distribution functions calculated from both techniques are illustrated. Advantages and disadvantages of each technique are discussed.

  4. Acoustic simulation in architecture with parallel algorithm

    NASA Astrophysics Data System (ADS)

    Li, Xiaohong; Zhang, Xinrong; Li, Dan

    2004-03-01

    To address the complexity of architectural environments and the need for real-time simulation of architectural acoustics, a parallel radiosity algorithm was developed. The distribution of sound energy in a scene is solved with this method. The impulse responses between sources and receivers in each frequency segment, calculated with multiple processes, are then combined into the whole frequency response. The numerical experiment shows that the parallel algorithm can improve the acoustic simulation efficiency for complex scenes.

  5. Monte Carlo simulation of photon migration in a cloud computing environment with MapReduce

    PubMed Central

    Pratx, Guillem; Xing, Lei

    2011-01-01

    Monte Carlo simulation is considered the most reliable method for modeling photon migration in heterogeneous media. However, its widespread use is hindered by the high computational cost. The purpose of this work is to report on our implementation of a simple MapReduce method for performing fault-tolerant Monte Carlo computations in a massively-parallel cloud computing environment. We ported the MC321 Monte Carlo package to Hadoop, an open-source MapReduce framework. In this implementation, Map tasks compute photon histories in parallel while a Reduce task scores photon absorption. The distributed implementation was evaluated on a commercial compute cloud. The simulation time was found to be linearly dependent on the number of photons and inversely proportional to the number of nodes. For a cluster size of 240 nodes, the simulation of 100 billion photon histories took 22 min, a 1258 × speed-up compared to the single-threaded Monte Carlo program. The overall computational throughput was 85,178 photon histories per node per second, with a latency of 100 s. The distributed simulation produced the same output as the original implementation and was resilient to hardware failure: the correctness of the simulation was unaffected by the shutdown of 50% of the nodes. PMID:22191916
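    The map/reduce split described in this abstract can be sketched in pure Python: map tasks generate photon histories independently under distinct seeds, and a reduce step accumulates the absorption score. This is a toy stand-in for the Hadoop deployment, and the single-interaction physics below is an illustrative exponential-attenuation model, not the MC321 package:

```python
import random
from functools import reduce

# Toy optical coefficients (illustrative; not MC321's geometry or physics).
MU_A, MU_S = 0.1, 1.0   # absorption and scattering coefficients, 1/cm

def map_task(seed, n_photons):
    """One map task: simulate photon histories with an independent seed
    and return the locally scored absorbed weight."""
    rng = random.Random(seed)
    absorbed = 0.0
    for _ in range(n_photons):
        weight = 1.0
        # The photon keeps scattering with probability mu_s / (mu_a + mu_s);
        # a tiny per-step loss stands in for partial absorption en route.
        while rng.random() > MU_A / (MU_A + MU_S):
            weight *= 0.999
        absorbed += weight          # history ends in absorption
    return absorbed

def reduce_task(partials):
    """Reduce step: sum the absorption scores from all map tasks."""
    return reduce(lambda a, b: a + b, partials, 0.0)

# Four "nodes" with 10,000 photons each; seeds keep the streams independent.
partials = [map_task(seed, 10_000) for seed in range(4)]
total = reduce_task(partials)
print(f"absorbed weight: {total:.0f} of 40000 photon histories")
```

    Because each map task depends only on its seed and photon count, a failed task can be rerun on any node without affecting the others, which is the fault-tolerance property the MapReduce deployment exploits.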

  6. Minimizing communication cost among distributed controllers in software defined networks

    NASA Astrophysics Data System (ADS)

    Arlimatti, Shivaleela; Elbreiki, Walid; Hassan, Suhaidi; Habbal, Adib; Elshaikh, Mohamed

    2016-08-01

    Software Defined Networking (SDN) is a new paradigm that increases the flexibility of today's networks by promising a programmable network. The fundamental idea behind this new architecture is to simplify network complexity by decoupling the control plane and data plane of the network devices and by making the control plane centralized. Recently, controllers have been distributed to solve the problem of a single point of failure and to increase scalability and flexibility during workload distribution. Even though distributed controllers are flexible and scalable enough to accommodate a larger number of network switches, the intercommunication cost between distributed controllers is still a challenging issue in the Software Defined Network environment. This paper aims to fill that gap by proposing a new mechanism that minimizes intercommunication cost with a graph partitioning algorithm (an NP-hard problem). The proposed methodology swaps network elements between controller domains to minimize communication cost by calculating a communication gain. The swapping of elements minimizes inter- and intra-domain communication cost among network domains. We validate our work with the OMNeT++ simulation environment. Simulation results show that the proposed mechanism minimizes the inter-domain communication cost among controllers compared to traditional distributed controllers.
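    The swap-with-gain idea resembles a Kernighan-Lin-style heuristic. The sketch below illustrates the mechanism on a toy four-switch topology; the switch names, edge weights, and the single greedy pass are all illustrative assumptions, not the paper's exact algorithm:

```python
# Traffic weights between switches and their initial controller domains.
edges = {("s1", "s2"): 4, ("s1", "s3"): 1, ("s2", "s3"): 6, ("s3", "s4"): 5}
domain = {"s1": 0, "s2": 0, "s3": 1, "s4": 1}

def gain(switch):
    """External minus internal traffic for `switch`: a positive gain means
    moving it to the other domain reduces inter-controller communication."""
    ext = sum(w for (a, b), w in edges.items()
              if switch in (a, b) and domain[a] != domain[b])
    int_ = sum(w for (a, b), w in edges.items()
               if switch in (a, b) and domain[a] == domain[b])
    return ext - int_

def inter_domain_cost():
    """Total traffic crossing controller-domain boundaries."""
    return sum(w for (a, b), w in edges.items() if domain[a] != domain[b])

before = inter_domain_cost()
for s in list(domain):          # single greedy pass over all switches
    if gain(s) > 0:
        domain[s] = 1 - domain[s]
after = inter_domain_cost()
print(before, "->", after)      # prints 7 -> 5
```

    Here only s2 has a positive gain (6 external vs. 4 internal), so it migrates to the other controller's domain and the inter-domain cost drops from 7 to 5.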

  7. Beam distribution reconstruction simulation for electron beam probe

    NASA Astrophysics Data System (ADS)

    Feng, Yong-Chun; Mao, Rui-Shi; Li, Peng; Kang, Xin-Cai; Yin, Yan; Liu, Tong; You, Yao-Yao; Chen, Yu-Cong; Zhao, Tie-Cheng; Xu, Zhi-Guo; Wang, Yan-Yu; Yuan, You-Jin

    2017-07-01

    An electron beam probe (EBP) is a detector which makes use of a low-intensity, low-energy electron beam to measure the transverse profile, bunch shape, beam neutralization and beam wake field of an intense beam with small dimensions. While it can be applied in many ways, we limit our analysis to beam distribution reconstruction. This kind of detector is almost non-interceptive for the whole beam and does not disturb the machine environment. In this paper, we present the theoretical aspects behind this technique for beam distribution measurement and some simulation results for the detector involved. First, a method to obtain a parallel electron beam is introduced and a simulation code is developed. An EBP as a profile monitor for dense beams is then simulated using the fast scan method for various target beam profiles, including KV, waterbag, parabolic, Gaussian and halo distributions. Profile reconstruction from the deflected electron beam trajectory is implemented and compared with the actual profile, and the expected agreement is achieved. Furthermore, in addition to the fast scan, a slow (step-by-step) scan is considered, which lowers the requirements on the hardware, i.e., the radio frequency deflector. We calculate the three-dimensional electric field of a Gaussian distribution and simulate the electron motion in this field. In addition, a fast scan along the target beam direction combined with a slow scan across the beam is also presented, which can provide a measurement of the longitudinal distribution as well as the transverse profile simultaneously. As an example, simulation results for the China Accelerator Driven Sub-critical System (CADS) and the High Intensity Heavy Ion Accelerator Facility (HIAF) are given. Finally, a potential system design for an EBP is described.

  8. Examining System-Wide Impacts of Solar PV Control Systems with a Power Hardware-in-the-Loop Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Tess L.; Fuller, Jason C.; Schneider, Kevin P.

    2014-06-08

    High penetration levels of distributed solar PV power generation can lead to adverse power quality impacts, such as excessive voltage rise, voltage flicker, and reactive power values that result in unacceptable voltage levels. Advanced inverter control schemes have been developed that have the potential to mitigate many power quality concerns. However, local closed-loop control may lead to unintended behavior in deployed systems, as complex interactions can occur between numerous operating devices. To enable the study of the performance of advanced control schemes in a detailed distribution system environment, a test platform has been developed that integrates Power Hardware-in-the-Loop (PHIL) with concurrent time-series electric distribution system simulation. In the test platform, GridLAB-D, a distribution system simulation tool, runs a detailed simulation of a distribution feeder in real-time mode at the Pacific Northwest National Laboratory (PNNL) and supplies power system parameters at a point of common coupling. At the National Renewable Energy Laboratory (NREL), a hardware inverter interacts with grid and PV simulators emulating an operational distribution system. Power output from the inverters is measured and sent to PNNL to update the real-time distribution system simulation. The platform is described and initial test cases are presented. The platform is used to study the system-wide impacts and the interactions of inverter control modes (constant power factor and active Volt/VAr control) when integrated into a simulated IEEE 8500-node test feeder. We demonstrate that this platform is well-suited to the study of advanced inverter controls and their impacts on the power quality of a distribution feeder. Additionally, results are used to validate GridLAB-D simulations of advanced inverter controls.

  9. New Educational Modules Using a Cyber-Distribution System Testbed

    DOE PAGES

    Xie, Jing; Bedoya, Juan Carlos; Liu, Chen-Ching; ...

    2018-03-30

    At Washington State University (WSU), a modern cyber-physical system testbed has been implemented based on an industry-grade distribution management system (DMS) that is integrated with remote terminal units (RTUs), smart meters, and solar photovoltaics (PV). In addition, the real model of the Avista Utilities distribution system in Pullman, WA, is modeled in the DMS. The proposed testbed environment allows students and instructors to utilize these facilities for innovations in learning and teaching. For power engineering education, this testbed helps students understand the interaction between a cyber system and a physical distribution system through industrial-level visualization. The testbed provides a distribution system monitoring and control environment for students. Compared with a simulation-based approach, the testbed brings the students' learning environment a step closer to the real world. The educational modules allow students to learn the concepts of a cyber-physical system and an electricity market through an integrated testbed. Furthermore, the testbed provides a platform in study mode for students to practice working on a real distribution system model. This paper describes the new educational modules based on the testbed environment. Three modules are described, together with the underlying educational principles and associated projects.

  10. New Educational Modules Using a Cyber-Distribution System Testbed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Jing; Bedoya, Juan Carlos; Liu, Chen-Ching

    At Washington State University (WSU), a modern cyber-physical system testbed has been implemented based on an industry-grade distribution management system (DMS) that is integrated with remote terminal units (RTUs), smart meters, and solar photovoltaics (PV). In addition, the real model of the Avista Utilities distribution system in Pullman, WA, is modeled in the DMS. The proposed testbed environment allows students and instructors to utilize these facilities for innovations in learning and teaching. For power engineering education, this testbed helps students understand the interaction between a cyber system and a physical distribution system through industrial-level visualization. The testbed provides a distribution system monitoring and control environment for students. Compared with a simulation-based approach, the testbed brings the students' learning environment a step closer to the real world. The educational modules allow students to learn the concepts of a cyber-physical system and an electricity market through an integrated testbed. Furthermore, the testbed provides a platform in study mode for students to practice working on a real distribution system model. This paper describes the new educational modules based on the testbed environment. Three modules are described, together with the underlying educational principles and associated projects.

  11. The Vehicular Information Space Framework

    NASA Astrophysics Data System (ADS)

    Prinz, Vivian; Schlichter, Johann; Schweiger, Benno

    Vehicular networks are distributed, self-organizing and highly mobile ad hoc networks. They allow for providing drivers with up-to-the-minute information about their environment. Therefore, they are expected to be a decisive future enabler for enhancing driving comfort and safety. This article introduces the Vehicular Information Space framework (VIS). Vehicles running the VIS form a kind of distributed database. It enables them to provide information such as existing hazards, parking spaces or traffic densities in a location-aware and fully distributed manner. In addition, vehicles can retrieve, modify and delete these information items. The underlying algorithm is based on features derived from existing structured Peer-to-Peer algorithms, extended to suit the specific characteristics of highly mobile ad hoc networks. We present, implement and simulate the VIS using a motorway and an urban traffic environment. Simulation studies of VIS message occurrence show that the VIS implies reasonable traffic overhead. Also, overall VIS message traffic is independent of the number of information items provided.

  12. Lipopolysaccharide Membrane Building and Simulation

    PubMed Central

    Jo, Sunhwan; Wu, Emilia L.; Stuhlsatz, Danielle; Klauda, Jeffery B.; Widmalm, Göran; Im, Wonpil

    2015-01-01

    While membrane simulations are widely employed to study the structure and dynamics of various lipid bilayers and membrane proteins in the bilayers, simulations of lipopolysaccharides (LPS) in membrane environments have been limited due to their structural complexity, difficulties in building LPS-membrane systems, and the lack of an appropriate molecular force field. In this work, as a first step to extend CHARMM-GUI Membrane Builder to incorporate LPS molecules and to explore their structures and dynamics in membrane environments using molecular dynamics simulations, we describe step-by-step procedures to build LPS bilayer systems using CHARMM and the recently developed CHARMM carbohydrate and lipid force fields. These procedures are illustrated by building various bilayers of Escherichia coli O6 LPS, and their preliminary simulation results are given in terms of per-LPS area and density distributions of various components along the membrane normal. PMID:25753722

  13. Overview Presentation

    NASA Technical Reports Server (NTRS)

    Lytle, John

    2001-01-01

    This report provides an overview presentation of the 2000 NPSS (Numerical Propulsion System Simulation) Review and Planning Meeting. Topics include: 1) a background of the program; 2) 1999 Industry Feedback; 3) FY00 Status, including resource distribution and major accomplishments; 4) FY01 Major Milestones; and 5) Future direction for the program. Specifically, simulation environment/production software and NPSS CORBA Security Development are discussed.

  14. Radar and microphysical characteristics of convective storms simulated from a numerical model using a new microphysical parameterization

    NASA Technical Reports Server (NTRS)

    Ferrier, Brad S.; Tao, Wei-Kuo; Simpson, Joanne

    1991-01-01

    The basic features of a new and improved bulk-microphysical parameterization capable of simulating the hydrometeor structure of convective systems in all types of large-scale environments (with minimal adjustment of coefficients) are studied. Reflectivities simulated from the model are compared with radar observations of an intense midlatitude convective system. Simulated reflectivities using the new four-class ice scheme, together with the parameterized rain distribution at 105 min, are illustrated. Preliminary results indicate that this new ice scheme works efficiently in simulating midlatitude continental storms.

  15. Simulation Environment Synchronizing Real Equipment for Manufacturing Cell

    NASA Astrophysics Data System (ADS)

    Inukai, Toshihiro; Hibino, Hironori; Fukuda, Yoshiro

    Recently, manufacturing industries have faced various problems, such as shorter product life cycles and more diversified customer needs. In this situation, it is very important to reduce the lead-time of manufacturing system construction. At the manufacturing system implementation stage, it is important to rapidly create and evaluate facility control programs for a manufacturing cell, such as ladder programs for programmable logic controllers (PLCs). However, methods to evaluate the facility control programs while mixing and synchronizing real equipment and virtual factory models on computers, before the manufacturing systems are implemented, have not been developed. This difficulty is caused by the complexity of manufacturing systems, which are composed of a great variety of equipment, and it has hindered precise and rapid support of the manufacturing engineering process. In this paper, a manufacturing engineering environment (MEE) that supports manufacturing engineering processes using simulation technologies is proposed. MEE consists of a manufacturing cell simulation environment (MCSE) and a distributed simulation environment (DSE). MCSE, which consists of a manufacturing cell simulator and a soft-wiring system, is described in detail. MCSE enables facility control programs to be created and evaluated using virtual factory models on computers before manufacturing systems are implemented.

  16. The characteristics simulation of FMCW laser backscattering signals

    NASA Astrophysics Data System (ADS)

    Liu, Bohu; Song, Chengtian; Duan, Yabo

    2018-04-01

    A Monte Carlo simulation model of FMCW laser transmission in a smoke interference environment was established in this paper. The aerosol extinction coefficient and scattering coefficient changed dynamically in the simulation according to the smoke concentration variation, aerosol particle distributions and photon spatial positions. The simulation results showed that the smoke backscattering interference produced a number of amplitude peaks in the beat signal spectrum; the SNR of the target echo signal relative to the smoke interference was related to the transmitted laser wavelength and the aerosol particle size distribution; and a better SNR could be obtained when the laser wavelength was in the range of 560-1660 nm. The characteristics of the FMCW laser backscattering signals generated by simulation are consistent with the theoretical analysis. Therefore, this study is helpful for improving target identification and anti-interference capability in further research.

  17. Distributed virtual environment for emergency medical training

    NASA Astrophysics Data System (ADS)

    Stytz, Martin R.; Banks, Sheila B.; Garcia, Brian W.; Godsell-Stytz, Gayl M.

    1997-07-01

    In many professions where individuals must work as a team in a high-stress environment to accomplish a time-critical task, individual and team performance can benefit from joint training using distributed virtual environments (DVEs). One professional field that lacks but needs a high-fidelity team training environment is the field of emergency medicine. Currently, emergency department (ED) medical personnel train by using words to create a mental picture of a situation for the physician and staff, who then cooperate to solve the problems portrayed by the word picture. The need in emergency medicine for realistic virtual team training is critical because ED staff typically encounter rarely occurring but life-threatening situations only once in their careers and because ED teams currently have no realistic environment in which to practice their team skills. The resulting lack of experience and teamwork makes diagnosis and treatment more difficult. Virtual environment based training has the potential to redress these shortfalls. The objective of our research is to develop a state-of-the-art virtual environment for emergency medicine team training. The virtual emergency room (VER) allows ED physicians and medical staff to realistically prepare for emergency medical situations by performing triage, diagnosis, and treatment on virtual patients within an environment that provides them with the tools they require and the team environment they need to realistically perform these three tasks. There are several issues that must be addressed before this vision is realized. The key issues deal with the distribution of computations; the doctor and staff interface to the virtual patient and ED equipment; the accurate simulation of individual patient organs' response to injury, medication, and treatment; and accurate modeling of the symptoms and appearance of the patient while maintaining a real-time interaction capability. Our ongoing work addresses all of these issues.
In this paper we report on our prototype VER system and its distributed system architecture for emergency medical staff training. The virtual environment enables emergency department physicians and staff to develop their diagnostic and treatment skills using the virtual tools they need to perform diagnostic and treatment tasks. Virtual human imagery and real-time virtual human response are used to create the virtual patient and present a scenario. Patient vital signs are available to the emergency department team as they manage the virtual case. The work reported here consists of the system architectures we developed for the distributed components of the virtual emergency room: the network-level architecture as well as the software architecture for each actor within the virtual emergency room. We describe the role of distributed interactive simulation and other enabling technologies within the virtual emergency room project.

  18. An Overview of the Formation and Attitude Control System for the Terrestrial Planet Finder Formation Flying Interferometer

    NASA Technical Reports Server (NTRS)

    Scharf, Daniel P.; Hadaegh, Fred Y.; Rahman, Zahidul H.; Shields, Joel F.; Singh, Gurkipal; Wette, Matthew R.

    2004-01-01

    The Terrestrial Planet Finder formation flying Interferometer (TPF-I) will be a five-spacecraft, precision formation operating near the second Sun-Earth Lagrange point. As part of technology development for TPF-I, a formation and attitude control system (FACS) is being developed that achieves the precision and functionality needed for the TPF-I formation and that will be demonstrated in a distributed, real-time simulation environment. In this paper we present an overview of FACS and discuss in detail its formation estimation, guidance and control architectures and algorithms. Since FACS is currently being integrated into a high-fidelity simulation environment, component simulations demonstrating algorithm performance are presented.

  19. An Overview of the Formation and Attitude Control System for the Terrestrial Planet Finder Formation Flying Interferometer

    NASA Technical Reports Server (NTRS)

    Scharf, Daniel P.; Hadaegh, Fred Y.; Rahman, Zahidul H.; Shields, Joel F.; Singh, Gurkipal

    2004-01-01

    The Terrestrial Planet Finder formation flying Interferometer (TPF-I) will be a five-spacecraft, precision formation operating near a Sun-Earth Lagrange point. As part of technology development for TPF-I, a formation and attitude control system (FACS) is being developed that achieves the precision and functionality associated with the TPF-I formation. This FACS will be demonstrated in a distributed, real-time simulation environment. In this paper we present an overview of the FACS and discuss in detail its constituent formation estimation, guidance and control architectures and algorithms. Since the FACS is currently being integrated into a high-fidelity simulation environment, component simulations demonstrating algorithm performance are presented.

  20. NASA Virtual Glovebox: An Immersive Virtual Desktop Environment for Training Astronauts in Life Science Experiments

    NASA Technical Reports Server (NTRS)

    Twombly, I. Alexander; Smith, Jeffrey; Bruyns, Cynthia; Montgomery, Kevin; Boyle, Richard

    2003-01-01

    The International Space Station will soon provide an unparalleled research facility for studying the near- and longer-term effects of microgravity on living systems. Using the Space Station Glovebox Facility - a compact, fully contained reach-in environment - astronauts will conduct technically challenging life sciences experiments. Virtual environment technologies are being developed at NASA Ames Research Center to help realize the scientific potential of this unique resource by facilitating the experimental hardware and protocol designs and by assisting the astronauts in training. The Virtual GloveboX (VGX) integrates high-fidelity graphics, force-feedback devices and real- time computer simulation engines to achieve an immersive training environment. Here, we describe the prototype VGX system, the distributed processing architecture used in the simulation environment, and modifications to the visualization pipeline required to accommodate the display configuration.

  1. Mock Data Challenge for the MPD/NICA Experiment on the HybriLIT Cluster

    NASA Astrophysics Data System (ADS)

    Gertsenberger, Konstantin; Rogachevsky, Oleg

    2018-02-01

    Simulation of data processing before the first experimental data are received is an important issue in high-energy physics experiments. This article presents the current Event Data Model and the Mock Data Challenge for the MPD experiment at the NICA accelerator complex, which uses ongoing simulation studies to stress-test the distributed computing infrastructure and experiment software in the full production environment, from simulated data through physics analysis.

  2. Modeling and Simulation of the Transient Response of Temperature and Relative Humidity Sensors with and without Protective Housing

    PubMed Central

    Rocha, Keller Sullivan Oliveira; Martins, José Helvecio; Martins, Marcio Arêdes; Ferreira Tinôco, Ilda de Fátima; Saraz, Jairo Alexander Osorio; Filho, Adílio Flauzino Lacerda; Fernandes, Luiz Henrique Martins

    2014-01-01

    Based on the necessity of enclosure protection for temperature and relative humidity sensors installed in a hostile environment, a wind tunnel was used to quantify the time that the sensors take to reach equilibrium with the environmental conditions to which they are exposed. Two treatments were used: (1) sensors with polyvinyl chloride (PVC) enclosure protection, and (2) sensors with no enclosure protection. The primary objective of this study was to develop and validate a 3-D computational fluid dynamics (CFD) model for analyzing the temperature and relative humidity distribution in a wind tunnel using sensors with PVC enclosure protection and sensors with no enclosure protection. A CFD simulation model was developed to describe the temperature distribution and the physics of mass transfer related to the airflow relative humidity. The first results demonstrate the applicability of the simulation. For verification, a sensor device was successfully assembled and tested in an environment optimized to ensure fast-changing conditions. The quantification setup presented in this paper is thus considered to be adequate for testing different materials and morphologies for enclosure protection. The results show that the boundary layer flow regime has a significant impact on the heat flux distribution. The results indicate that the CFD technique is a powerful tool which provides a detailed description of the flow and temperature fields as well as the time that the relative humidity takes to reach equilibrium with the environment in which the sensors are inserted. PMID:24851994
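
    A sensor's approach to equilibrium of the sort quantified in the wind tunnel is commonly idealized as a first-order lag. The sketch below uses an invented time constant and step change (not the paper's CFD model or measured values) to show how a response time, e.g. time to 95% of a step, falls out of that model.

```python
import numpy as np

def sensor_response(t, t_env, t0, tau):
    """First-order lag response of a sensor with time constant tau (s)."""
    return t_env + (t0 - t_env) * np.exp(-t / tau)

# Hypothetical example: a sensor at 20 C stepped into a 30 C airstream.
tau = 8.0                        # assumed time constant, seconds
t = np.linspace(0.0, 60.0, 601)
temp = sensor_response(t, 30.0, 20.0, tau)

# Time to reach 95% of the step is analytically -tau*ln(0.05), about 3*tau.
t95 = -tau * np.log(0.05)
print(f"95% response time: {t95:.1f} s")
```

    The same fit in reverse, regressing log residuals against time, is a common way to extract tau from wind-tunnel measurements.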

  3. Surgical model-view-controller simulation software framework for local and collaborative applications

    PubMed Central

    Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2010-01-01

    Purpose Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933

  4. Surgical model-view-controller simulation software framework for local and collaborative applications.

    PubMed

    Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2011-07-01

    Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.
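
    The decoupled-rate idea, haptics serviced near 1,000 Hz while slower processes run at their own frame rates, can be sketched as two loops sharing state under a lock. This is a generic illustration, not the framework's actual architecture; the class, rates, and state variable are placeholders.

```python
import threading
import time

class DecoupledSim:
    """Sketch of decoupled simulation: a fast 'haptics' loop and a slower
    'physics' loop share state through a lock, each at its own rate."""
    def __init__(self):
        self.lock = threading.Lock()
        self.state = 0.0
        self.haptic_reads = 0
        self.physics_steps = 0

    def haptics_loop(self, stop, dt=0.001):      # ~1000 Hz target
        while not stop.is_set():
            with self.lock:
                _ = self.state                   # render force from latest state
                self.haptic_reads += 1
            time.sleep(dt)

    def physics_loop(self, stop, dt=0.05):       # ~20 Hz target
        while not stop.is_set():
            with self.lock:
                self.state += dt                 # advance simulation state
                self.physics_steps += 1
            time.sleep(dt)

sim = DecoupledSim()
stop = threading.Event()
threads = [threading.Thread(target=sim.haptics_loop, args=(stop,)),
           threading.Thread(target=sim.physics_loop, args=(stop,))]
for th in threads:
    th.start()
time.sleep(0.5)                                  # run both loops briefly
stop.set()
for th in threads:
    th.join()
print(sim.haptic_reads, sim.physics_steps)
```

    In a production framework each loop would be a controller in the MVC pattern, and the shared state would be double-buffered rather than locked, but the rate decoupling is the same.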

  5. Use of EPANET solver to manage water distribution in Smart City

    NASA Astrophysics Data System (ADS)

    Antonowicz, A.; Brodziak, R.; Bylka, J.; Mazurkiewicz, J.; Wojtecki, S.; Zakrzewski, P.

    2018-02-01

    The paper presents a method of using the EPANET solver to support management of a water distribution system in a Smart City. The main task is to develop an application that allows remote access to the simulation model of the water distribution network developed in the EPANET environment. The application can perform both single and cyclic simulations with a specified step for changing the values of selected process variables. The paper shows the architecture of the application, which supports the selection of the best device control algorithm using optimization methods. Optimization procedures are available with the following methods: brute force, SLSQP (Sequential Least SQuares Programming), and the modified Powell method. The article is supplemented by an example of using the developed computer tool.
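
    A hedged sketch of the optimization step: the paper couples its optimizers to EPANET simulations, but the same pattern can be shown with SciPy's SLSQP method driving a stand-in cost function. The pump/pressure model below is invented for illustration; a real run would replace `simulate_cost` with a call into the EPANET solver.

```python
import numpy as np
from scipy.optimize import minimize

def simulate_cost(speeds):
    """Stand-in for a network simulation: pumping energy cost for given pump
    speed settings, penalized when pressure falls below a required minimum."""
    speeds = np.asarray(speeds)
    energy = np.sum(speeds ** 2)                      # energy grows with speed
    pressure = 20.0 * np.sum(speeds) / speeds.size    # crude pressure model
    penalty = max(0.0, 25.0 - pressure) ** 2 * 100.0  # require >= 25 m head
    return energy + penalty

x0 = np.array([1.0, 1.0])            # initial pump speed settings
bounds = [(0.0, 2.0)] * 2
res = minimize(simulate_cost, x0, method="SLSQP", bounds=bounds)
print(res.x, res.fun)
```

    The analytic optimum of this toy problem is nearly symmetric speeds summing to 2.5 (just meeting the pressure requirement), which SLSQP recovers from the infeasible start.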

  6. Massively Parallel Processing for Fast and Accurate Stamping Simulations

    NASA Astrophysics Data System (ADS)

    Gress, Jeffrey J.; Xu, Siguang; Joshi, Ramesh; Wang, Chuan-tao; Paul, Sabu

    2005-08-01

    The competitive automotive market drives automotive manufacturers to speed up vehicle development cycles and reduce lead-time. Fast tooling development is one of the key areas supporting fast and short vehicle development programs (VDP). In the past ten years, stamping simulation has become the most effective validation tool for predicting and resolving all potential formability and quality problems before the dies are physically made. Stamping simulation and formability analysis has become a critical business segment in GM's math-based die engineering process. As simulation becomes one of the major production tools in the engineering factory, simulation speed and accuracy are the two most important measures of stamping simulation technology. The speed and time-in-system of forming analysis become even more critical to support fast VDPs and tooling readiness. Since 1997, the General Motors Die Center has been working jointly with our software vendor to develop and implement a parallel version of simulation software for mass-production analysis applications. By 2001, this technology had matured in the form of distributed memory processing (DMP) of draw die simulations in a networked distributed-memory computing environment. In 2004, this technology was refined to massively parallel processing (MPP) and extended to line die forming analysis (draw, trim, flange, and associated spring-back) running on a dedicated computing environment. The evolution of this technology and the insight gained through the implementation of DMP/MPP technology, as well as performance benchmarks, are discussed in this publication.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Timothy M.; Palmintier, Bryan; Suryanarayanan, Siddharth

    As more Smart Grid technologies (e.g., distributed photovoltaics, spatially distributed electric vehicle charging) are integrated into distribution grids, static distribution simulations are no longer sufficient for performing modeling and analysis. GridLAB-D is an agent-based distribution system simulation environment that allows fine-grained end-user models, including geospatial and network topology detail. A problem exists in that, without outside intervention, once a GridLAB-D simulation begins execution, it will run to completion without allowing real-time interaction of Smart Grid controls, such as home energy management systems and aggregator control. We address this lack of runtime interaction by designing a flexible communication interface, Bus.py (pronounced bus-dot-pie), that uses Python to pass messages between one or more GridLAB-D instances and a Smart Grid simulator. This work describes the design and implementation of Bus.py, discusses its usefulness in terms of some Smart Grid scenarios, and provides an example of an aggregator-based residential demand response system interacting with GridLAB-D through Bus.py. The small-scale example demonstrates the validity of the interface and shows that an aggregator using the interface is able to control residential loads in GridLAB-D during runtime, reducing the peak load on the distribution system in (a) peak reduction and (b) time-of-use pricing cases.
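
    The message-passing role that an interface like Bus.py plays between a grid simulator and an aggregator can be sketched with in-process queues. All class and function names below are hypothetical, not Bus.py's actual API; the example only illustrates the publish/consume pattern of a single co-simulation step.

```python
import queue

class CoSimBus:
    """Minimal sketch of a co-simulation message broker (names hypothetical):
    passes messages between a grid simulator and a controller each timestep."""
    def __init__(self):
        self.to_controller = queue.Queue()
        self.to_grid = queue.Queue()

    def publish_load(self, t, load_kw):           # called by the grid side
        self.to_controller.put((t, load_kw))

    def publish_setpoint(self, t, setpoint_kw):   # called by the aggregator
        self.to_grid.put((t, setpoint_kw))

def aggregator(bus, peak_limit_kw):
    """Toy demand-response aggregator: caps reported load at a peak limit."""
    t, load = bus.to_controller.get()
    bus.publish_setpoint(t, min(load, peak_limit_kw))

# One co-simulation step: grid reports 120 kW, aggregator caps it to 100 kW.
bus = CoSimBus()
bus.publish_load(0, 120.0)
aggregator(bus, peak_limit_kw=100.0)
result = bus.to_grid.get()
print(result)   # (0, 100.0)
```

    In the real system the two queue endpoints would sit in separate processes (GridLAB-D instance and Smart Grid simulator), with the broker synchronizing them at each simulation timestep.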

  8. Applying Multivariate Discrete Distributions to Genetically Informative Count Data.

    PubMed

    Kirkpatrick, Robert M; Neale, Michael C

    2016-03-01

    We present a novel method of conducting biometric analysis of twin data when the phenotypes are integer-valued counts, which often show an L-shaped distribution. Monte Carlo simulation is used to compare five likelihood-based approaches to modeling: our multivariate discrete method when its distributional assumptions are correct, the same method when they are incorrect, and three other methods in common use. With data simulated from a skewed discrete distribution, recovery of twin correlations and of the proportions of additive-genetic and common-environment variance was generally poor for the Normal, Lognormal, and Ordinal models, but good for the two discrete models. Sex-separate applications to substance-use data from twins in the Minnesota Twin Family Study showed superior performance of the two discrete models. The new methods are implemented using R and OpenMx and are freely available.
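
    The biometric intuition, that a shared discrete component induces the twin correlation, can be illustrated by simulating correlated counts as a shared Poisson component plus individual components (a trivariate-reduction construction; the parameter values are arbitrary, not the study's estimates).

```python
import numpy as np

rng = np.random.default_rng(42)

def bivariate_poisson(n, lam_shared, lam_unique):
    """Correlated twin counts: each twin's count is a shared Poisson component
    (e.g. additive-genetic/common-environment) plus an individual component."""
    shared = rng.poisson(lam_shared, n)
    t1 = shared + rng.poisson(lam_unique, n)
    t2 = shared + rng.poisson(lam_unique, n)
    return t1, t2

t1, t2 = bivariate_poisson(200_000, lam_shared=2.0, lam_unique=1.0)
r = np.corrcoef(t1, t2)[0, 1]
# Theoretical correlation: var(shared)/(var(shared)+var(unique)) = 2/3.
print(round(r, 3))
```

    Fitting a model of this form by maximum likelihood, rather than forcing the counts through a Normal or Ordinal model, is what lets the discrete methods recover the variance components.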

  9. Effects of Kinetic Processes in Shaping Io's Global Plasma Environment: A 3D Hybrid Model

    NASA Technical Reports Server (NTRS)

    Lipatov, Alexander S.; Combi, Michael R.

    2004-01-01

    The global dynamics of the ionized and neutral components in the environment of Io plays an important role in the interaction of Jupiter's corotating magnetospheric plasma with Io. Stationary simulations of this problem have been performed with MHD and electrodynamic approaches. One of the most significant results from the simplified two-fluid model simulations was the reproduction of the double-peak structure in the magnetic field signature of the Io flyby, which could not be explained by standard MHD models. In this paper, we develop a method of kinetic ion simulation. This method employs a fluid description for electrons and neutrals, whereas for ions multilevel drift-kinetic and particle approaches are used. We also take into account charge-exchange and photoionization processes. Our model provides a much more accurate description of ion dynamics and allows us to take into account the realistic anisotropic ion distribution, which cannot be done in fluid simulations. The first results of such simulations of ion dynamics in Io's environment are discussed in this paper.
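
    For the particle side of such kinetic-ion models, a standard building block is the Boris mover. The sketch below is a generic Boris push in uniform fields, not the paper's multilevel scheme; it illustrates the per-particle treatment that fluid models cannot provide. The gyration test exploits the mover's exact speed conservation in a pure magnetic field.

```python
import numpy as np

def boris_push(x, v, q_m, E, B, dt):
    """One Boris step for a charged particle (uniform E, B assumed here):
    half electric kick, magnetic rotation, half electric kick."""
    v_minus = v + 0.5 * q_m * E * dt
    t = 0.5 * q_m * B * dt
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)
    v_new = v_plus + 0.5 * q_m * E * dt
    return x + v_new * dt, v_new

# Gyration test: E = 0, B along z; speed should be conserved exactly.
x = np.zeros(3)
v = np.array([1.0, 0.0, 0.0])
E = np.zeros(3)
B = np.array([0.0, 0.0, 1.0])
for _ in range(1000):
    x, v = boris_push(x, v, q_m=1.0, E=E, B=B, dt=0.01)
print(np.linalg.norm(v))
```

    A hybrid code advances many such ions per cell while electrons enter only through fluid moments, which is what permits the anisotropic ion distributions described above.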

  10. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME

    PubMed Central

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2017-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948
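
    As an illustration of the kind of computation such a platform distributes, here is a minimal Gillespie direct-method sketch of a two-voxel reaction-diffusion (RDME) model. This is not PyURDME's API, just the underlying algorithm in miniature, with molecules hopping between two well-mixed voxels at an assumed rate d.

```python
import numpy as np

rng = np.random.default_rng(1)

def rdme_two_voxels(n0, d, t_end):
    """Gillespie direct method for pure diffusion on two voxels: each molecule
    hops between voxels with rate d, so total event rate is d*(n0 counts)."""
    n = [n0, 0]
    t = 0.0
    while True:
        rates = [d * n[0], d * n[1]]        # hop left->right, right->left
        total = rates[0] + rates[1]
        if total == 0.0:
            break
        t += rng.exponential(1.0 / total)   # exponential waiting time
        if t > t_end:
            break
        if rng.random() < rates[0] / total:
            n[0] -= 1; n[1] += 1
        else:
            n[1] -= 1; n[0] += 1
    return n

counts = rdme_two_voxels(n0=1000, d=1.0, t_end=10.0)
print(counts)   # roughly [500, 500] at equilibrium
```

    Monte Carlo workflows repeat runs like this thousands of times over fine spatial meshes, which is the computational cost the cloud appliance is built to distribute.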

  11. Real-Time Hardware-in-the-Loop Simulation of Ares I Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Tobbe, Patrick; Matras, Alex; Walker, David; Wilson, Heath; Fulton, Chris; Alday, Nathan; Betts, Kevin; Hughes, Ryan; Turbe, Michael

    2009-01-01

    The Ares Real-Time Environment for Modeling, Integration, and Simulation (ARTEMIS) has been developed for use by the Ares I launch vehicle System Integration Laboratory at the Marshall Space Flight Center. The primary purpose of the Ares System Integration Laboratory is to test the vehicle avionics hardware and software in a hardware-in-the-loop environment to certify that the integrated system is prepared for flight. ARTEMIS has been designed to be the real-time simulation backbone to stimulate all required Ares components for verification testing. ARTEMIS provides high-fidelity dynamics, actuator, and sensor models to simulate an accurate flight trajectory in order to ensure realistic test conditions. ARTEMIS has been designed to take advantage of the advances in underlying computational power now available to support hardware-in-the-loop testing, achieving real-time simulation with unprecedented model fidelity. A modular real-time design relying on a fully distributed computing architecture has been implemented.

  12. Simulation of Optical Resonators for Vertical-Cavity Surface-Emitting Lasers (vcsel)

    NASA Astrophysics Data System (ADS)

    Mansour, Mohy S.; Hassen, Mahmoud F. M.; El-Nozahey, Adel M.; Hafez, Alaa S.; Metry, Samer F.

    2010-04-01

    Simulation and modeling of the reflectivity and transmissivity of the multilayer DBRs of a VCSEL, as well as of the quantum well inside the active region, are analyzed using the characteristic matrix method. The electric field intensity distributions inside such a vertical-cavity structure are calculated. A software program in the MATLAB environment was constructed for the simulation. The study was performed for two specific Bragg wavelengths, 980 nm and 370 nm, to achieve resonant periodic gain (RPG).
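
    The characteristic matrix method mentioned above can be sketched for the DBR reflectivity calculation. The code below computes the normal-incidence reflectance of a quarter-wave stack; the refractive indices, pair count, and ambient/substrate values are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def dbr_reflectivity(n_hi, n_lo, pairs, wavelength, design_wl):
    """Characteristic-matrix reflectance of a quarter-wave DBR at normal
    incidence; ambient and substrate indices are assumed values below."""
    n_in, n_sub = 1.0, 3.5                      # assumed air / GaAs substrate
    M = np.eye(2, dtype=complex)
    for n in [n_hi, n_lo] * pairs:
        # Quarter-wave layer: thickness design_wl/(4n), phase at 'wavelength'.
        phi = 2 * np.pi * n * (design_wl / (4 * n)) / wavelength
        layer = np.array([[np.cos(phi), 1j * np.sin(phi) / n],
                          [1j * n * np.sin(phi), np.cos(phi)]])
        M = M @ layer
    num = n_in * M[0, 0] + n_in * n_sub * M[0, 1] - M[1, 0] - n_sub * M[1, 1]
    den = n_in * M[0, 0] + n_in * n_sub * M[0, 1] + M[1, 0] + n_sub * M[1, 1]
    return abs(num / den) ** 2

# At the design wavelength, a 20-pair GaAs/AlAs-like mirror is highly
# reflective, as required for a VCSEL cavity.
R = dbr_reflectivity(n_hi=3.5, n_lo=2.9, pairs=20, wavelength=980e-9,
                     design_wl=980e-9)
print(round(R, 4))
```

    Sweeping `wavelength` around the design value with the same function traces out the stopband, and multiplying in additional layer matrices models the cavity and quantum-well region.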

  13. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    NASA Technical Reports Server (NTRS)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers, enables scalable computation and storage, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.

  14. Microgravity effects on water flow and distribution in unsaturated porous media: Analyses of flight experiments

    NASA Astrophysics Data System (ADS)

    Jones, Scott B.; Or, Dani

    1999-04-01

    Plants grown in porous media are part of a bioregenerative life support system designed for long-duration space missions. Reduced gravity conditions of orbiting spacecraft (microgravity) alter several aspects of liquid flow and distribution within partially saturated porous media. The objectives of this study were to evaluate the suitability of conventional capillary flow theory in simulating water distribution in porous media measured in a microgravity environment. Data from experiments aboard the Russian space station Mir and a U.S. space shuttle were simulated by elimination of the gravitational term from the Richards equation. Qualitative comparisons with media hydraulic parameters measured on Earth suggest narrower pore size distributions and inactive or nonparticipating large pores in microgravity. Evidence of accentuated hysteresis, altered soil-water characteristic, and reduced unsaturated hydraulic conductivity from microgravity simulations may be attributable to a number of proposed secondary mechanisms. These are likely spawned by enhanced and modified paths of interfacial flows and an altered force ratio of capillary to body forces in microgravity.
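
    The gravity-term manipulation described above can be sketched numerically. Below is a minimal 1-D finite-difference sketch of the Richards equation in water-content form with a constant diffusivity and a linear conductivity, purely for illustration (the coefficients, grid, and boundary conditions are invented, not the media parameters of the Mir or shuttle experiments); setting `gravity=False` mimics the microgravity simulation by dropping the gravitational flux term.

```python
import numpy as np

def richards_1d(theta0, dz, dt, steps, D, K, gravity=True):
    """Explicit sketch of the 1-D Richards equation in water-content form,
    d(theta)/dt = d/dz (D d(theta)/dz) - dK(theta)/dz, with constant D and
    K(theta) ~ K*theta as toy closures (not a real soil)."""
    theta = theta0.copy()
    for _ in range(steps):
        flux = -D * np.diff(theta) / dz                   # capillary flux
        if gravity:
            flux += K * 0.5 * (theta[:-1] + theta[1:])    # gravity drainage
        flux = np.concatenate(([0.0], flux, [0.0]))       # no-flow boundaries
        theta -= dt * np.diff(flux) / dz
    return theta

z = np.linspace(0, 1, 51)
theta0 = np.where(z < 0.2, 0.4, 0.1)             # wet layer at the surface
with_g = richards_1d(theta0, dz=0.02, dt=1e-4, steps=2000, D=0.01, K=0.05)
no_g = richards_1d(theta0, dz=0.02, dt=1e-4, steps=2000, D=0.01, K=0.05,
                   gravity=False)
# Without gravity (microgravity), water only spreads by capillarity; with
# gravity it also drains toward larger z. Total water is conserved either way.
print(with_g.sum(), no_g.sum())
```

    Comparing the two profiles' centers of mass shows the purely capillary redistribution that the flight-data simulations isolate.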

  15. A statistical simulation model for field testing of non-target organisms in environmental risk assessment of genetically modified plants.

    PubMed

    Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore

    2014-04-01

    Genetic modification of plants may result in unintended effects causing potentially adverse impacts on the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effects on non-target organisms are compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these tests, for example the power to detect environmental change of a given magnitude, before the start of an experiment. Such a prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions, possibly with excess zeros. In addition, the model supports completely randomized or randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype-by-environment interaction by adding random variety effects, and includes repeated measures in time following a constant, linear, or quadratic pattern, possibly with some form of autocorrelation. The model also allows adding a set of reference varieties to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail, and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided.
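
    The kind of count data the framework generates, with different distributions, excess zeros, and block structure, can be sketched directly. The generator below is a simplified zero-inflated Poisson with random block effects; it is an illustration of the data-generating idea, not the paper's Supplementary Material implementation, and all parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(7)

def zip_counts(n_blocks, reps, mu, zero_prob, block_sd):
    """Zero-inflated Poisson counts from a randomized block design: the
    log-mean is log(mu) plus a random block effect, and a fraction
    zero_prob of observations is forced to zero (structural zeros)."""
    counts = []
    for _ in range(n_blocks):
        block = rng.normal(0.0, block_sd)          # random block effect
        lam = mu * np.exp(block)
        y = rng.poisson(lam, reps)
        y[rng.random(reps) < zero_prob] = 0        # excess zeros
        counts.append(y)
    return np.array(counts)                        # shape (n_blocks, reps)

y = zip_counts(n_blocks=8, reps=50, mu=5.0, zero_prob=0.2, block_sd=0.3)
zero_frac = float(np.mean(y == 0))
print(y.shape, round(zero_frac, 2))
```

    A prospective power analysis then wraps a generator like this in a loop: simulate many datasets under an assumed effect size and count how often the difference or equivalence test reaches the intended conclusion.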

  16. A statistical simulation model for field testing of non-target organisms in environmental risk assessment of genetically modified plants

    PubMed Central

    Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore

    2014-01-01

    Genetic modification of plants may result in unintended effects causing potentially adverse impacts on the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effects on non-target organisms are compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these tests, for example the power to detect environmental change of a given magnitude, before the start of an experiment. Such a prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions, possibly with excess zeros. In addition, the model supports completely randomized or randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype-by-environment interaction by adding random variety effects, and includes repeated measures in time following a constant, linear, or quadratic pattern, possibly with some form of autocorrelation. The model also allows adding a set of reference varieties to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail, and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided. PMID:24834325

  17. Molecular dynamics simulations of polarizable DNA in crystal environment

    NASA Astrophysics Data System (ADS)

    Babin, Volodymyr; Baucom, Jason; Darden, Thomas A.; Sagui, Celeste

    We have investigated the role of the electrostatic description and cell environment in molecular dynamics (MD) simulations of DNA. Multiple unrestrained MD simulations of the DNA duplex d(CCAACGTTGG)2 have been carried out using two different force fields: a traditional description based on atomic point charges and a polarizable force field. For the time scales probed, and given the 'right' distribution of divalent ions, the latter performs better than the nonpolarizable force field. In particular, by imposing the experimental unit cell environment, an initial configuration with ideal B-DNA duplexes in the unit cell acquires sequence-dependent features that very closely resemble the crystallographic ones. Simultaneously, the all-atom root-mean-square coordinate deviation (RMSD) with respect to the crystallographic structure is seen to decay. At later times, the polarizable force field is able to maintain this lower RMSD, while the nonpolarizable force field starts to drift away.

  18. Live Virtual Constructive Distributed Test Environment Characterization Report

    NASA Technical Reports Server (NTRS)

    Murphy, Jim; Kim, Sam K.

    2013-01-01

    This report documents message latencies observed over various Live, Virtual, Constructive (LVC) simulation environment configurations designed to emulate possible system architectures for the Unmanned Aircraft Systems (UAS) Integration in the National Airspace System (NAS) Project integrated tests. For each configuration, four scenarios with progressively increasing air traffic loads were used to determine system throughput and bandwidth impacts on message latency.
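
    A latency characterization of this kind boils down to timestamping messages at the sender and differencing at the receiver. The sketch below shows only the measurement pattern; the in-process "link" is a stand-in for the LVC middleware and network hops of the real test environment, and clocks are assumed synchronized (trivially true on one host).

```python
import statistics
import time

def measure_latencies(send, n=1000):
    """Timestamp each message before sending; compute one-way latency in ms
    when the message comes back out of the link under test."""
    latencies = []
    for i in range(n):
        t_sent = time.perf_counter()
        payload = send((i, t_sent))                # deliver through the link
        _, t_orig = payload
        latencies.append((time.perf_counter() - t_orig) * 1000.0)
    return latencies

# Stand-in "link": an in-process pass-through. A real LVC characterization
# would substitute a gateway/middleware send-and-receive here.
lat = measure_latencies(lambda msg: msg)
print(f"mean={statistics.mean(lat):.4f} ms  p99={sorted(lat)[989]:.4f} ms")
```

    Repeating the run under each traffic-load scenario and comparing the latency distributions (mean, tail percentiles) is exactly the throughput/bandwidth-impact analysis the report describes.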

  19. Constructing Virtual Training Demonstrations

    DTIC Science & Technology

    2008-12-01

    virtual environments have been shown to be effective for training, and distributed game-based architectures contribute an added benefit of wide...investigation of how a demonstration authoring toolset can be constructed from existing virtual training environments using 3-D multiplayer gaming...intelligent agents project to create AI middleware for simulations and videogames. The result was SimBionic®, which enables users to graphically author

  20. Numerical Simulations of Flow Separation Control in Low-Pressure Turbines using Plasma Actuators

    NASA Technical Reports Server (NTRS)

    Suzen, Y. B.; Huang, P. G.; Ashpis, D. E.

    2007-01-01

    A recently introduced phenomenological model for simulating flow control applications using plasma actuators has been further developed and improved in order to expand its use to complicated actuator geometries. The new modeling approach eliminates the requirement of an empirical charge density distribution shape by using the embedded electrode as a source for the charge density. The modeling approach incorporates the effect of the plasma actuators on the external flow into Navier-Stokes computations as a body force vector, obtained as the product of the net charge density and the electric field. The model solves the Maxwell equation to obtain the electric field due to the applied AC voltage at the electrodes and an additional equation for the charge density distribution representing the plasma density. The new approach solves the charge density equation in the computational domain with the embedded electrode as a source, thereby automatically generating a charge density distribution on the surface exposed to the flow similar to that observed in experiments, without explicitly specifying an empirical distribution. The model is validated against a flat-plate experiment in a quiescent environment.
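
    The body-force coupling described above, a force per unit volume f = rho_c * E added to the Navier-Stokes momentum equation, can be illustrated on a grid. The Gaussian charge cloud and uniform field below are invented placeholders for the fields the model actually obtains by solving the charge-density and Maxwell equations.

```python
import numpy as np

# Grid over a small region above the actuator surface (model units assumed).
nx, ny = 64, 32
x, y = np.meshgrid(np.linspace(0, 1, nx), np.linspace(0, 0.5, ny),
                   indexing="ij")

# Placeholder fields: a Gaussian net charge cloud near the exposed electrode
# edge and a uniform electric field (the real model solves for both).
rho_c = np.exp(-((x - 0.3) ** 2 + y ** 2) / 0.01)     # net charge density
Ex = np.full_like(x, 1.0e5)                           # V/m
Ey = np.zeros_like(x)

# Body force per unit volume, f = rho_c * E, added as a source term to the
# Navier-Stokes momentum equations in the flow solver.
fx, fy = rho_c * Ex, rho_c * Ey
print(fx.max(), float(np.abs(fy).max()))
```

    In a flow solver these `fx, fy` arrays would simply be added to the momentum-equation right-hand side at each cell, which is how the actuator's effect enters the external-flow computation.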

  1. Influence of savanna fire on Australian monsoon season precipitation and circulation as simulated using a distributed computing environment

    NASA Astrophysics Data System (ADS)

    Lynch, Amanda H.; Abramson, David; Görgen, Klaus; Beringer, Jason; Uotila, Petteri

    2007-10-01

    Fires in the Australian savanna have been hypothesized to affect monsoon evolution, but the hypothesis is controversial and the effects have not been quantified. A distributed computing approach allows a challenging experimental design that permits simultaneous variation of all fire attributes. The climate model simulations are distributed across multiple independent computer clusters in six countries, an approach with potential for a range of other large simulation applications in the earth sciences. The experiment clarifies that savanna burning can shape the monsoon through two mechanisms. Boundary-layer circulation and large-scale convergence are intensified monotonically with increasing fire intensity and area burned. However, thresholds of fire timing and area are evident in the consequent influence on monsoon rainfall. In the optimal band of late, high-intensity fires of somewhat limited extent, the wet season can be significantly enhanced.

  2. Modeling and simulation of satellite subsystems for end-to-end spacecraft modeling

    NASA Astrophysics Data System (ADS)

    Schum, William K.; Doolittle, Christina M.; Boyarko, George A.

    2006-05-01

    During the past ten years, the Air Force Research Laboratory (AFRL) has been simultaneously developing high-fidelity spacecraft payload models and a robust distributed simulation environment for modeling spacecraft subsystems. Much of this research has occurred in the Distributed Architecture Simulation Laboratory (DASL). AFRL developers working in the DASL have effectively combined satellite power, attitude pointing, and communication link analysis subsystem models with robust satellite sensor models to create a first-order end-to-end satellite simulation capability. The merging of these two simulation areas has advanced the field of spacecraft simulation, design, and analysis, and enabled more in-depth mission and satellite utility analyses. A core capability of the DASL is the support of a variety of modeling and analysis efforts, ranging from physics- and engineering-level modeling to mission- and campaign-level analysis. The flexibility and agility of this simulation architecture will be used to support space mission analysis, military utility analysis, and various integrated exercises with other military and space organizations via direct integration, or through DOD standards such as Distributed Interactive Simulation. This paper discusses the results and lessons learned in modeling satellite communication link analysis, power, and attitude control subsystems for an end-to-end satellite simulation. It also discusses how these spacecraft subsystem simulations feed into and support military utility and space mission analyses.

  3. PCSIM: A Parallel Simulation Environment for Neural Circuits Fully Integrated with Python

    PubMed Central

    Pecevski, Dejan; Natschläger, Thomas; Schuch, Klaus

    2008-01-01

    The Parallel Circuit SIMulator (PCSIM) is a software package for simulation of neural circuits. It is primarily designed for distributed simulation of large-scale networks of spiking point neurons. Although its computational core is written in C++, PCSIM's primary interface is implemented in the Python programming language, a powerful programming environment that allows the user to easily integrate the neural circuit simulator with data analysis and visualization tools to manage the full neural modeling life cycle. The main focus of this paper is to describe PCSIM's full integration into Python and the benefits thereof. In particular, we investigate how the automatically generated bidirectional interface and PCSIM's object-oriented modular framework enable the user to adopt a hybrid modeling approach, using and extending PCSIM's functionality in either pure Python or C++ and thus combining the advantages of both worlds. Furthermore, we describe several supplementary PCSIM packages written in pure Python and tailored towards setting up and analyzing neural simulations. PMID:19543450

  4. Effects of dispersal on total biomass in a patchy, heterogeneous system: analysis and experiment.

    USGS Publications Warehouse

    Zhang, Bo; Liu, Xin; DeAngelis, Donald L.; Ni, Wei-Ming; Wang, G Geoff

    2015-01-01

    An intriguing recent result from mathematics is that a population diffusing at an intermediate rate in an environment in which resources vary spatially will reach a higher total equilibrium biomass than the population in an environment in which the same total resources are distributed homogeneously. We extended the current mathematical theory to apply to logistic growth and showed that the result applies to patchy systems with dispersal among patches, in both continuous and discrete time. This allowed us to make specific predictions, through simulations, concerning the biomass dynamics, which were verified by a laboratory experiment. The experiment studied biomass growth of duckweed (Lemna minor Linn.), with the resources (nutrients added to water) distributed homogeneously among a discrete series of water-filled containers in one treatment and heterogeneously in another. The experimental results showed that total biomass peaked at an intermediate, relatively low diffusion rate, exceeding the total carrying capacity of the system, in agreement with the simulation model. The implications of the experiment for source, sink, and pseudo-sink dynamics are discussed.
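    The patchy-system result can be reproduced in miniature, assuming the u(K − u) logistic form used in the underlying mathematical theory; the two-patch carrying capacities and the dispersal rate below are illustrative values, not the experiment's parameters.

```python
import numpy as np

def equilibrium_total(K, d, dt=0.01, steps=20_000):
    """Integrate du_i/dt = u_i*(K_i - u_i) + d*(u_j - u_i) to equilibrium."""
    u = np.full(len(K), 0.5)            # small initial population in each patch
    for _ in range(steps):
        growth = u * (K - u)
        dispersal = d * (u[::-1] - u)   # two-patch exchange
        u = u + dt * (growth + dispersal)
    return u.sum()

K_hetero = np.array([1.0, 3.0])  # same total resources, unevenly distributed
K_homo = np.array([2.0, 2.0])    # homogeneous distribution of the same total
d = 0.5                          # intermediate dispersal rate

total_hetero = equilibrium_total(K_hetero, d)
total_homo = equilibrium_total(K_homo, d)
# total_hetero exceeds both total_homo and the summed carrying capacity of 4
```

    Sweeping d over a range reproduces the qualitative finding: total biomass equals the summed carrying capacity at d = 0 and in the large-d limit, and peaks above it at an intermediate dispersal rate.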

  5. Instrumentation and Methodology Development for Mars Mission

    NASA Technical Reports Server (NTRS)

    Chen, Yuan-Liang Albert

    2002-01-01

    The Mars environment comprises a dry, cold, low-pressure atmosphere, low gravity (0.38 g), and highly resistive soil. Global dust storms covering a large portion of Mars are often observed from Earth. This environment provides ideal conditions for triboelectric charging. The extremely dry conditions on the Martian surface have raised concerns that electrostatic charge buildup will not be dissipated easily. If triboelectrically generated charge cannot be dissipated or avoided, dust will accumulate on charged surfaces and electrostatic discharge may cause hazards for future exploration missions. The low surface temperature on Mars also prolongs charge decay on dust particles and soil. A better understanding of the physics of Martian charged dust particles is essential to future Mars missions. We researched and designed two sensors, a velocity/charge sensor and a PZT momentum sensor, to detect the velocity, charge, and mass distributions of Martian charged dust particles. These sensors are fabricated at the NASA Kennedy Space Center Electromagnetic Physics Testbed, where they will be tested and calibrated under simulated Mars atmospheric conditions with JSC MARS-1 Martian regolith simulant.

  6. Distributed decision support for the 21st century mission space

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2002-07-01

    The past decade has produced significant changes in the conduct of military operations: increased humanitarian missions, asymmetric warfare, reliance on coalitions and allies, stringent rules of engagement, concern about casualties, and the need for sustained air operations. Future mission commanders will need to assimilate a tremendous amount of information, make quick-response decisions, and quantify the effects of those decisions in the face of uncertainty. Integral to this process are situational assessment (understanding the mission space), simulation to analyze alternative futures, assessments of current capabilities, plans, and courses of action, and a common operational picture that keeps everyone on the same sheet of paper. Decision support tools in a distributed collaborative environment offer the capability of decomposing these complex multitask processes and distributing them over a dynamic set of execution assets. Decision support technologies can semi-automate activities that have a reasonably well-defined process, such as planning an operation, and provide machine-level interfaces to refine the myriad of information that is not currently fused. The marriage of information and simulation technologies provides the mission commander with a collaborative virtual environment for planning and decision support.

  7. Development of ex vivo model for determining temperature distribution in tumor tissue during photothermal therapy

    NASA Astrophysics Data System (ADS)

    Liu, Shaojie; Doughty, Austin; Mesiya, Sana; Pettitt, Alex; Zhou, Feifan; Chen, Wei R.

    2017-02-01

    Temperature distribution in tissue is a crucial factor in determining the outcome of photothermal therapy for cancer treatment. To investigate the temperature distribution in tumor tissue during laser irradiation, we developed a novel ex vivo device to simulate photothermal therapy on tumors. A thermostatic incubator at 35°C provided a simulated body-temperature environment for live animals. Different biological tissues (chicken breast and bovine liver) were embedded inside a tissue-simulating gel and treated as tumor tissue. An 805-nm laser was used to irradiate the target tissue. A fiber with an interstitial cylindrical diffuser (10 mm) was inserted directly into the center of the tissue, and the needle probes of a thermocouple were inserted into the tissue parallel to the laser fiber at different distances to measure the temperature distribution. All procedures were performed in the incubator. The results show that the temperature distribution in bovine liver is similar to that of tumor tissue under photothermal therapy at the same doses. Therefore, the developed model using bovine liver can be used to determine temperature distribution during interstitial photothermal therapy.

  8. Job Superscheduler Architecture and Performance in Computational Grid Environments

    NASA Technical Reports Server (NTRS)

    Shan, Hongzhang; Oliker, Leonid; Biswas, Rupak

    2003-01-01

    Computational grids hold great promise in utilizing geographically separated heterogeneous resources to solve large-scale complex scientific problems. However, a number of major technical hurdles, including distributed resource management and effective job scheduling, stand in the way of realizing these gains. In this paper, we propose a novel grid superscheduler architecture and three distributed job migration algorithms. We also model the critical interaction between the superscheduler and autonomous local schedulers. Extensive performance comparisons with ideal, central, and local schemes using real workloads from leading computational centers are conducted in a simulation environment. Additionally, synthetic workloads are used to perform a detailed sensitivity analysis of our superscheduler. Several key metrics demonstrate that substantial performance gains can be achieved via smart superscheduling in distributed computational grids.
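    As a toy illustration of the superscheduling idea (not necessarily one of the paper's three migration algorithms), the sketch below routes each job to the grid site whose queue offers the earliest estimated start time; site names and runtimes are invented.

```python
def pick_site(job_runtime, site_queues):
    """Route a job to the site with the least total queued work.

    site_queues maps site name -> list of queued job runtimes (seconds).
    The chosen site's queue is updated in place. This greedy
    estimated-start-time rule is a hypothetical sketch, not the paper's
    algorithm.
    """
    # Estimated start time at a site = total runtime already queued there
    waits = {site: sum(q) for site, q in site_queues.items()}
    best = min(waits, key=waits.get)
    site_queues[best].append(job_runtime)
    return best

queues = {"local": [300.0, 600.0], "siteB": [120.0], "siteC": []}
first = pick_site(100.0, queues)   # the empty queue at siteC wins
```

    A real superscheduler would also weigh data-transfer cost and the autonomy of local schedulers, which is exactly the interaction the paper models.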

  9. Proposing "the burns suite" as a novel simulation tool for advancing the delivery of burns education.

    PubMed

    Sadideen, Hazim; Wilson, David; Moiemen, Naiem; Kneebone, Roger

    2014-01-01

    Educational theory highlights the importance of contextualized simulation for effective learning. We explored this concept in a burns scenario in a novel, low-cost, high-fidelity, portable, immersive simulation environment (referred to as distributed simulation). This contextualized simulation/distributed simulation combination was named "The Burns Suite" (TBS). A pediatric burn resuscitation scenario was selected after high trainee demand. It was designed on Advanced Trauma and Life Support and Emergency Management of Severe Burns principles and refined using expert opinion through cognitive task analysis. TBS contained "realism" props, briefed nurses, and a simulated patient. Novices and experts were recruited. Five-point Likert-type questionnaires were developed for face and content validity. Cronbach's α was calculated for scale reliability. Semistructured interviews captured responses for qualitative thematic analysis, allowing for data triangulation. Twelve participants completed the TBS scenario. Mean face and content validity ratings were high (4.6 and 4.5, respectively; range, 4-5). The internal consistency of questions was high. Qualitative data analysis revealed that participants felt 1) the experience was "real" and they were "able to behave as if in a real resuscitation environment," and 2) TBS "addressed what Advanced Trauma and Life Support and Emergency Management of Severe Burns didn't" (including the efficacy of incorporating nontechnical skills). TBS provides a novel, effective simulation tool to significantly advance the delivery of burns education. Recreating clinical challenge is crucial to optimizing simulation training. This low-cost approach also has major implications for surgical education, particularly during increasing financial austerity. Alternative scenarios and/or procedures can be recreated within TBS, providing a diverse, immersive educational simulation experience.

  10. Corrosion Behavior of Low-C Medium-Mn Steel in Simulated Marine Immersion and Splash Zone Environment

    NASA Astrophysics Data System (ADS)

    Zhang, Dazheng; Gao, Xiuhua; Su, Guanqiao; Du, Linxiu; Liu, Zhenguang; Hu, Jun

    2017-05-01

    The corrosion behavior of low-C medium-Mn steel in simulated marine immersion and splash zone environments was studied by static immersion and wet-dry cyclic corrosion experiments, respectively. Corrosion rate, corrosion products, surface morphology, cross-sectional morphology, elemental distribution, potentiodynamic polarization curves, and electrochemical impedance spectra were used to elucidate the corrosion behavior. The results show that the corrosion rate in the immersion zone is much lower than that in the splash zone owing to its relatively mild environment. Manganese compounds are detected in the corrosion products only in the splash zone environment, where they can deteriorate the protective effect of the rust layer. With increasing exposure time, the loose, porous corrosion products are gradually transformed into a dense, thick rust layer in both environments. In the splash zone environment, however, Mn is significantly enriched in the rust layer, which decreases the corrosion resistance of the steel.

  11. Proceedings of the 3rd Annual Conference on Aerospace Computational Control, volume 2

    NASA Technical Reports Server (NTRS)

    Bernard, Douglas E. (Editor); Man, Guy K. (Editor)

    1989-01-01

    This volume of the conference proceedings contains papers and discussions in the following topical areas: Parallel processing; Emerging integrated capabilities; Low order controllers; Real time simulation; Multibody component representation; User environment; and Distributed parameter techniques.

  12. Particle Filter Based Tracking in a Detection Sparse Discrete Event Simulation Environment

    DTIC Science & Technology

    2007-03-01

    ...obtained by disqualifying a large number of particles. Figure 31 (a-c): Particle Disqualification via Sanitization ... B. Research Approach ... C. Thesis Organization ... b. Detection Distribution Sampling ... c. Estimated Position Calculation

  13. The TAVERNS emulator: An Ada simulation of the space station data communications network and software development environment

    NASA Technical Reports Server (NTRS)

    Howes, Norman R.

    1986-01-01

    The Space Station DMS (Data Management System) is the onboard component of the Space Station Information System (SSIS) that includes the computers, networks and software that support the various core and payload subsystems of the Space Station. TAVERNS (Test And Validation Environment for Remote Networked Systems) is a distributed approach for development and validation of application software for Space Station. The TAVERNS concept assumes that the different subsystems will be developed by different contractors who may be geographically separated. The TAVERNS Emulator is an Ada simulation of a TAVERNS on the ASD VAX. The software services described in the DMS Test Bed User's Manual are being emulated on the VAX together with simulations of some of the core subsystems and a simulation of the DCN. The TAVERNS Emulator will be accessible remotely from any VAX that can communicate with the ASD VAX.

  14. A Bayesian Poisson-lognormal Model for Count Data for Multiple-Trait Multiple-Environment Genomic-Enabled Prediction

    PubMed Central

    Montesinos-López, Osval A.; Montesinos-López, Abelardo; Crossa, José; Toledo, Fernando H.; Montesinos-López, José C.; Singh, Pawan; Juliana, Philomin; Salinas-Ruiz, Josafhat

    2017-01-01

    When a plant scientist wishes to make genomic-enabled predictions of multiple traits measured in multiple individuals in multiple environments, the most common strategy is to analyze a single trait at a time while taking into account genotype × environment interaction (G × E), because comprehensive models that simultaneously account for correlated count traits and G × E are lacking. For this reason, in this study we propose a multiple-trait and multiple-environment model for count data. The proposed model was developed under the Bayesian paradigm, for which we developed a Markov Chain Monte Carlo (MCMC) scheme with noninformative priors. This yields all required full conditional distributions of the parameters, leading to an exact Gibbs sampler for the posterior distribution. Our model was tested with simulated data and a real data set. Results show that the proposed multi-trait, multi-environment model is an attractive alternative for modeling multiple count traits measured in multiple environments. PMID:28364037
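    The observation side of a Poisson-lognormal count model of this kind can be sketched as follows; the intercept and the effect magnitudes are invented for illustration, standing in for the model's latent genotype and environment terms.

```python
import numpy as np

rng = np.random.default_rng(2)

# Sketch of a Poisson-lognormal observation model: a latent Gaussian linear
# predictor (intercept plus illustrative genotype and environment effects)
# is exponentiated to give the Poisson rate of each observed count.
n = 10_000
intercept = 0.6
genotype_effect = rng.normal(0.0, 0.3, size=n)     # hypothetical g effects
environment_effect = rng.normal(0.0, 0.4, size=n)  # hypothetical E effects
eta = intercept + genotype_effect + environment_effect

counts = rng.poisson(np.exp(eta))
# The marginal counts are overdispersed: their variance exceeds their mean,
# unlike a pure Poisson model.
```

    In the paper's full model the latent predictor carries correlated trait and G × E terms whose full conditionals are sampled by the Gibbs sampler; this sketch only shows why the resulting counts are overdispersed.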

  15. Proceedings of the Organization of 1990 Meeting of International Neural Network Society Jointed with IEEE Held in Washington, DC on January 15 - 19, 1990. Volume 2. Applications Track.

    DTIC Science & Technology

    1990-11-30

    Simonotto, Università di Genova, Learning from Natural Selection in an Artificial Environment ... Ethem Alpaydin, Swiss Federal Institute of Technology, Framework for Distributed Artificial Neural System Simulation ... David Y. Fong, Lockheed Missiles and Space Co., and Christopher Tocci, Raytheon Co., Simulation of Artificial Neural

  16. The effects of solid rocket motor effluents on selected surfaces and solid particle size, distribution, and composition for simulated shuttle booster separation motors

    NASA Technical Reports Server (NTRS)

    Jex, D. W.; Linton, R. C.; Russell, W. M.; Trenkle, J. J.; Wilkes, D. R.

    1976-01-01

    A series of three tests was conducted using solid rocket propellants to determine the effects a solid rocket plume would have on thermal protective surfaces (TPS). The surfaces tested were those which are baselined for the shuttle vehicle. The propellants used were to simulate the separation solid rocket motors (SSRM) that separate the solid rocket boosters (SRB) from the shuttle launch vehicle. Data cover: (1) the optical effects of the plume environment on spacecraft related surfaces, and (2) the solid particle size, distribution, and composition at TPS sample locations.

  17. Bio-inspired sensing and control for disturbance rejection and stabilization

    NASA Astrophysics Data System (ADS)

    Gremillion, Gregory; Humbert, James S.

    2015-05-01

    The successful operation of small unmanned aircraft systems (sUAS) in dynamic environments demands robust stability in the presence of exogenous disturbances. Flying insects are sensor-rich platforms, with highly redundant arrays of sensors distributed across the insect body that are integrated to extract rich information with diminished noise. This work presents a novel sensing framework in which measurements from an array of accelerometers distributed across a simulated flight vehicle are linearly combined to directly estimate the applied forces and torques with improvements in SNR. In simulation, the estimation performance is quantified as a function of sensor noise level, position estimate error, and sensor quantity.
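    A minimal planar sketch of this sensing framework, with invented geometry and noise levels: each accelerometer at position r_i reads a_i = F/m + (τ/I)·r_i plus noise, and a linear least-squares combination of the whole array recovers the force and torque terms with reduced noise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical planar rigid body with N accelerometers along its length.
N = 50
r = np.linspace(-0.5, 0.5, N)            # sensor positions (m)
true_lin, true_ang = 2.0, 4.0            # true F/m (m/s^2) and tau/I (rad/s^2)

# Measurement model: a = A @ [F/m, tau/I] + noise
A = np.column_stack([np.ones(N), r])
a = A @ np.array([true_lin, true_ang]) + 0.1 * rng.standard_normal(N)

# Linear combination of all sensors: the least-squares (pseudo-inverse)
# solution. Averaging across the array shrinks the estimation noise
# roughly as 1/sqrt(N), i.e. the SNR improvement the record describes.
est_lin, est_ang = np.linalg.lstsq(A, a, rcond=None)[0]
```

    Extending the model matrix to full 3-D positions and orientations gives the same pseudo-inverse structure for all six force and torque components.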

  18. skeleSim: an extensible, general framework for population genetic simulation in R.

    PubMed

    Parobek, Christian M; Archer, Frederick I; DePrenger-Levin, Michelle E; Hoban, Sean M; Liggins, Libby; Strand, Allan E

    2017-01-01

    Simulations are a key tool in molecular ecology for inference and forecasting, as well as for evaluating new methods. Due to growing computational power and a diversity of software with different capabilities, simulations are becoming increasingly powerful and useful. However, the widespread use of simulations by geneticists and ecologists is hindered by difficulties in understanding the complex capabilities of these software packages, composing code and input files, a daunting bioinformatics barrier and a steep conceptual learning curve. skeleSim (an R package) guides users in choosing appropriate simulations, setting parameters, calculating genetic summary statistics and organizing data output, in a reproducible pipeline within the R environment. skeleSim is designed to be an extensible framework that can 'wrap' around any simulation software (inside or outside the R environment) and be extended to calculate and graph any genetic summary statistics. Currently, skeleSim implements coalescent and forward-time models available in the fastsimcoal2 and rmetasim simulation engines to produce null distributions for multiple population genetic statistics and marker types, under a variety of demographic conditions. skeleSim is intended to make simulations easier while still allowing full model complexity to ensure that simulations play a fundamental role in molecular ecology investigations. skeleSim can also serve as a teaching tool: demonstrating the outcomes of stochastic population genetic processes; teaching general concepts of simulations; and providing an introduction to the R environment with a user-friendly graphical user interface (using shiny). © 2016 John Wiley & Sons Ltd.

  19. skeleSim: an extensible, general framework for population genetic simulation in R

    PubMed Central

    Parobek, Christian M.; Archer, Frederick I.; DePrenger-Levin, Michelle E.; Hoban, Sean M.; Liggins, Libby; Strand, Allan E.

    2016-01-01

    Simulations are a key tool in molecular ecology for inference and forecasting, as well as for evaluating new methods. Due to growing computational power and a diversity of software with different capabilities, simulations are becoming increasingly powerful and useful. However, the widespread use of simulations by geneticists and ecologists is hindered by difficulties in understanding the complex capabilities of these software packages, composing code and input files, a daunting bioinformatics barrier, and a steep conceptual learning curve. skeleSim (an R package) guides users in choosing appropriate simulations, setting parameters, calculating genetic summary statistics, and organizing data output, in a reproducible pipeline within the R environment. skeleSim is designed to be an extensible framework that can ‘wrap’ around any simulation software (inside or outside the R environment) and be extended to calculate and graph any genetic summary statistics. Currently, skeleSim implements coalescent and forward-time models available in the fastsimcoal2 and rmetasim simulation engines to produce null distributions for multiple population genetic statistics and marker types, under a variety of demographic conditions. skeleSim is intended to make simulations easier while still allowing full model complexity to ensure that simulations play a fundamental role in molecular ecology investigations. skeleSim can also serve as a teaching tool: demonstrating the outcomes of stochastic population genetic processes; teaching general concepts of simulations; and providing an introduction to the R environment with a user-friendly graphical user interface (using shiny). PMID:27736016

  20. Formation Algorithms and Simulation Testbed

    NASA Technical Reports Server (NTRS)

    Wette, Matthew; Sohl, Garett; Scharf, Daniel; Benowitz, Edward

    2004-01-01

    Formation flying for spacecraft is a rapidly developing field that will enable a new era of space science. For one of its missions, the Terrestrial Planet Finder (TPF) project has selected a formation flying interferometer design to detect earth-like planets orbiting distant stars. In order to advance technology needed for the TPF formation flying interferometer, the TPF project has been developing a distributed real-time testbed to demonstrate end-to-end operation of formation flying with TPF-like functionality and precision. This is the Formation Algorithms and Simulation Testbed (FAST). FAST was conceived to bring out issues in timing, data fusion, inter-spacecraft communication, inter-spacecraft sensing, and system-wide formation robustness. In this paper we describe FAST and show results from a two-spacecraft formation scenario. The two-spacecraft simulation is the first time that precision end-to-end formation flying operation has been demonstrated in a distributed real-time simulation environment.

  1. A New Simulation Framework for Autonomy in Robotic Missions

    NASA Technical Reports Server (NTRS)

    Flueckiger, Lorenzo; Neukom, Christian

    2003-01-01

    Autonomy is a key factor in remote robotic exploration, and there is significant activity addressing the application of autonomy to remote robots. It has become increasingly important to have simulation tools available to test autonomy algorithms. While industrial robotics benefits from a variety of high-quality simulation tools, researchers developing autonomous software are still dependent primarily on block-world simulations. The Mission Simulation Facility (MSF) project addresses this shortcoming with a simulation toolkit that will enable developers of autonomous control systems to test their systems' performance against a set of integrated, standardized simulations of NASA mission scenarios. MSF provides a distributed architecture that connects the autonomous system to a set of simulated components replacing the robot hardware and its environment.

  2. High-performing simulations of the space radiation environment for the International Space Station and Apollo Missions

    NASA Astrophysics Data System (ADS)

    Lund, Matthew Lawrence

    The space radiation environment is a significant challenge to future manned and unmanned space travel. Future missions will rely increasingly on accurate simulations of radiation transport in space and through spacecraft to predict astronaut dose and energy deposition within spacecraft electronics. The International Space Station provides long-term measurements of the radiation environment in Low Earth Orbit (LEO); however, only the Apollo missions provided dosimetry data beyond LEO. Dosimetry analysis for deep space missions is therefore poorly supported by currently available data, and there is a need to develop dosimetry-predicting models for extended deep space missions. GEANT4, a Monte Carlo toolkit written in C++, enables simulation of radiation transport in arbitrary media, including spacecraft and the space environment. The newest version of GEANT4 supports multithreading and MPI, allowing faster distributed processing of simulations on high-performance computing clusters. This thesis introduces a new application based on GEANT4 that greatly reduces computational time, using the Kingspeak and Ember clusters at the Center for High Performance Computing (CHPC) to simulate radiation transport through full spacecraft geometry and reducing simulation time to hours instead of weeks, without post-simulation processing. Additionally, this thesis introduces a new set of detectors beyond the historically used International Commission on Radiation Units and Measurements (ICRU) spheres for calculating dose distribution, including a thermoluminescent detector (TLD), a tissue-equivalent proportional counter (TEPC), and a human phantom, combined with a series of new primitive scorers in GEANT4 to calculate dose equivalent based on International Commission on Radiological Protection (ICRP) standards.
    The models developed in this thesis predict dose depositions in the International Space Station and during the Apollo missions in good agreement with experimental measurements. According to these models, the greatest contributor to radiation dose for the Apollo missions was Galactic Cosmic Rays, owing to the short time spent within the radiation belts. The Apollo 14 dose measurements were an order of magnitude higher than those of the other Apollo missions. The GEANT4 model of the Apollo Command Module shows consistent doses from Galactic Cosmic Rays and the radiation belts across all missions, with small variation in dose distribution across the capsule. The model also predicts well the dose depositions and equivalent dose values in various human organs for the International Space Station and the Apollo Command Module.

  3. Risk, individual differences, and environment: an Agent-Based Modeling approach to sexual risk-taking.

    PubMed

    Nagoski, Emily; Janssen, Erick; Lohrmann, David; Nichols, Eric

    2012-08-01

    Risky sexual behaviors, including the decision to have unprotected sex, result from interactions between individuals and their environment. The current study explored the use of Agent-Based Modeling (ABM), a methodological approach in which computer-generated artificial societies simulate human sexual networks, to assess the influence of heterogeneity of sexual motivation on the risk of contracting HIV. The models successfully simulated some characteristics of human sexual systems, such as the relationship between individual differences in sexual motivation (sexual excitation and inhibition) and sexual risk, but failed to reproduce the scale-free distribution of number of partners observed in the real world. ABM has the potential to inform intervention strategies that target the interaction between an individual and his or her social environment.

  4. Proton Lateral Broadening Distribution Comparisons Between GRNTRN, MCNPX, and Laboratory Beam Measurements

    NASA Technical Reports Server (NTRS)

    Mertens, Christopher J.; Moyers, Michael F.; Walker, Steven A.; Tweed, John

    2010-01-01

    Recent developments in NASA's deterministic High charge (Z) and Energy TRaNsport (HZETRN) code have included lateral broadening of primary ion beams due to small-angle multiple Coulomb scattering, and coupling of the ion-nuclear scattering interactions with energy loss and straggling. This new version of HZETRN, based on Green's function methods and called GRNTRN, is suitable for modeling transport with both space environment and laboratory boundary conditions. Multiple scattering processes are a necessary extension to GRNTRN in order to accurately model ion beam experiments, to simulate the physical and biologically effective radiation dose, and to develop new methods and strategies for light ion radiation therapy. In this paper we compare GRNTRN simulations of proton lateral broadening distributions with beam measurements taken at the Loma Linda University Proton Therapy Facility. The simulated and measured lateral broadening distributions are compared for a 250 MeV proton beam on aluminum, polyethylene, polystyrene, bone substitute, iron, and lead targets. The GRNTRN results are also compared with simulations from the Monte Carlo MCNPX code for the same projectile-target combinations.

  5. Computational modeling and statistical analyses on individual contact rate and exposure to disease in complex and confined transportation hubs

    NASA Astrophysics Data System (ADS)

    Wang, W. L.; Tsui, K. L.; Lo, S. M.; Liu, S. B.

    2018-01-01

    Crowded transportation hubs such as metro stations are thought to be ideal places for the development and spread of epidemics. However, owing to their complex spatial layouts and confined environments with large numbers of highly mobile individuals, human contacts in such settings are difficult to quantify, and disease-spreading dynamics there were little explored in previous studies. Because human interactions are heterogeneous and dynamic, a growing number of studies have demonstrated the importance of contact distance and contact duration for transmission probabilities. In this study, we show how detailed information on contact and exposure patterns can be obtained by statistical analyses of microscopic crowd-simulation data. Specifically, a pedestrian simulation model, CityFlow, was employed to reproduce individuals' movements in a metro station based on site-survey data, and the values and distributions of individual contact rate and exposure in different simulation cases were obtained and analyzed. Interestingly, a Weibull distribution fitted the histogram values of individual-based exposure very well in each case. Moreover, we found that both individual contact rate and exposure had a linear relationship with the average crowd density of the environment. The results can serve as a reference for epidemic studies in complex and confined transportation hubs and help refine existing disease-spreading models.
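    The basic quantity here, a per-individual contact rate extracted from microscopic trajectories, can be sketched as follows. This is a toy random-walk crowd standing in for CityFlow output; the 2 m contact threshold, hall size, and motion model are all assumptions.

```python
import math
import random

random.seed(7)

CONTACT_RADIUS = 2.0  # metres; assumed threshold for a "contact"

def contacts_per_step(positions, radius=CONTACT_RADIUS):
    """Count, for each individual, how many others are within `radius`."""
    n = len(positions)
    counts = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            dx = positions[i][0] - positions[j][0]
            dy = positions[i][1] - positions[j][1]
            if math.hypot(dx, dy) <= radius:
                counts[i] += 1
                counts[j] += 1
    return counts

# Toy random-walk crowd in a 20 m x 20 m hall (stand-in for simulated
# pedestrian trajectories).
n, steps = 50, 100
pos = [[random.uniform(0, 20), random.uniform(0, 20)] for _ in range(n)]
total = [0] * n
for _ in range(steps):
    for p in pos:
        p[0] = min(20, max(0, p[0] + random.uniform(-0.5, 0.5)))
        p[1] = min(20, max(0, p[1] + random.uniform(-0.5, 0.5)))
    for i, c in enumerate(contacts_per_step(pos)):
        total[i] += c

rates = [t / steps for t in total]  # mean contacts per time step
print(min(rates), max(rates))
```

    Shrinking the hall (raising the density) raises the mean contact rate, which is the qualitative linear density relationship the paper reports; exposure would additionally weight each contact by its duration.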

  6. Low Fidelity Simulation of a Zero-G Robot

    NASA Technical Reports Server (NTRS)

    Sweet, Adam

    2001-01-01

    The item to be cleared is a low-fidelity software simulation model of a hypothetical free-flying robot designed for use in zero-gravity environments. This simulation model works with the HCC simulation system that was developed by Xerox PARC and NASA Ames Research Center. HCC has previously been cleared for distribution. When used with the HCC software, the model computes the location and orientation of the simulated robot over time. Failures (such as a broken motor) can be injected into the simulation to produce simulated behavior corresponding to the failure. Release of this simulation will allow researchers to test their software diagnosis systems by attempting to diagnose the simulated failure from the simulated behavior. This model does not contain any encryption software, nor can it perform any control tasks that might be export controlled.

  7. GEANT4 distributed computing for compact clusters

    NASA Astrophysics Data System (ADS)

    Harrawood, Brian P.; Agasthya, Greeshma A.; Lakshmanan, Manu N.; Raterman, Gretchen; Kapadia, Anuj J.

    2014-11-01

    A new technique for distribution of GEANT4 processes is introduced to simplify running a simulation in a parallel environment such as a tightly coupled computer cluster. Using a new C++ class derived from the GEANT4 toolkit, multiple runs forming a single simulation are managed across a local network of computers with a simple inter-node communication protocol. The class is integrated with the GEANT4 toolkit and is designed to scale from a single symmetric multiprocessing (SMP) machine to compact clusters ranging in size from tens to thousands of nodes. User-designed 'work tickets' are distributed to clients using a client-server work flow model to specify the parameters for each individual run of the simulation. The new g4DistributedRunManager class was developed and thoroughly tested in the course of our Neutron Stimulated Emission Computed Tomography (NSECT) experiments. It will be useful for anyone running GEANT4 for large discrete data sets, such as covering a range of angles in computed tomography, calculating dose delivery with multiple fractions, or simply speeding the throughput of a single model.
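    The client-server work-ticket flow can be sketched in miniature. The real g4DistributedRunManager is a C++ class using a socket protocol across nodes; the thread-and-queue version below is only an assumed, simplified analogue, and the `serve_tickets`/`worker` names and ticket fields are hypothetical.

```python
import queue
import threading

def serve_tickets(tickets):
    """Server side: a queue of per-run parameter sets ('work tickets')."""
    q = queue.Queue()
    for t in tickets:
        q.put(t)
    return q

def worker(q, results, lock):
    """Client side: pull tickets until the queue is empty, run each one."""
    while True:
        try:
            ticket = q.get_nowait()
        except queue.Empty:
            return
        # Stand-in for one GEANT4 run configured by the ticket.
        outcome = (ticket["run_id"], ticket["angle_deg"] * 2)
        with lock:
            results.append(outcome)
        q.task_done()

# One ticket per projection angle, as in a CT-style angular sweep.
tickets = [{"run_id": i, "angle_deg": i * 10} for i in range(36)]
q = serve_tickets(tickets)
results, lock = [], threading.Lock()
threads = [threading.Thread(target=worker, args=(q, results, lock))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(results))  # every ticket processed exactly once
```

    The design choice, pulling tickets rather than pushing them, gives natural load balancing: fast nodes simply consume more tickets.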

  8. Development of MCNPX-ESUT computer code for simulation of neutron/gamma pulse height distribution

    NASA Astrophysics Data System (ADS)

    Abolfazl Hosseini, Seyed; Vosoughi, Naser; Zangian, Mehdi

    2015-05-01

    In this paper, the development of the MCNPX-ESUT (MCNPX-Energy Engineering of Sharif University of Technology) computer code for simulation of neutron/gamma pulse height distributions is reported. Since liquid organic scintillators like NE-213 are well suited and routinely used for spectrometry in mixed neutron/gamma fields, this type of detector was selected for simulation in the present study. The proposed simulation algorithm includes four main steps. The first step is the modeling of neutron/gamma particle transport and the particles' interactions with the materials in the environment and the detector volume. In the second step, the number of scintillation photons generated by charged particles such as electrons, alphas, protons and carbon nuclei in the scintillator material is calculated. In the third step, the transport of scintillation photons in the scintillator and light guide is simulated. Finally, the resolution corresponding to the experiment is applied in the last step of the simulation. Unlike similar computer codes such as SCINFUL, NRESP7 and PHRESP, the developed computer code is applicable to both neutron and gamma sources. Hence, the discrimination of neutrons and gammas in mixed fields may be performed using the MCNPX-ESUT computer code. The main feature of MCNPX-ESUT is that the neutron/gamma pulse height simulation may be performed without any post-processing. In the present study, the pulse height distributions due to a monoenergetic neutron/gamma source in an NE-213 detector are simulated using MCNPX-ESUT. The simulated neutron pulse height distributions are validated through comparison with experimental data (Gohil et al., Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 664 (2012) 304-309) and the results obtained from similar computer codes such as SCINFUL, NRESP7 and Geant4. The simulated gamma pulse height distribution for a 137Cs source is also compared with experimental data.
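    The last of the four steps, folding an experimental resolution into an ideal response, can be sketched in isolation. The Gaussian resolution function and its coefficients below are assumed for illustration and are not the MCNPX-ESUT parameterization.

```python
import random

random.seed(3)

def broaden(light_output, a=0.05, b=0.02):
    """Apply an assumed Gaussian energy resolution to an ideal light
    output L (arbitrary units): sigma = a*L + b*sqrt(L)."""
    sigma = a * light_output + b * light_output ** 0.5
    return max(0.0, random.gauss(light_output, sigma))

# Toy ideal light outputs for a monoenergetic source (every event deposits
# the same light, as for a full-energy peak); broadening spreads the peak.
events = [broaden(1.0) for _ in range(10000)]
mean = sum(events) / len(events)
var = sum((e - mean) ** 2 for e in events) / len(events)
print(round(mean, 2), round(var ** 0.5, 2))
```

    Histogramming `events` would give a broadened full-energy peak; the full code performs this after transporting the particles and the scintillation photons, so no separate post-processing pass is needed.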

  9. Toward an Integrated Executable Architecture and M&S Based Analysis for Counter Terrorism and Homeland Security

    DTIC Science & Technology

    2006-09-01

    Lavoie, D. Kurts, SYNTHETIC ENVIRONMENTS AT THE ENTREPRISE LEVEL: OVERVIEW OF A GOVERNMENT OF CANADA (GOC), ACADEMIA and INDUSTRY DISTRIBUTED...vehicle (UAV) focused to locate the radiological source, and by comparing the performance of these assets in terms of various capability based...framework to analyze homeland security capabilities • Illustrate how a rapidly configured distributed simulation involving academia, industry and

  10. Reducing acquisition risk through integrated systems of systems engineering

    NASA Astrophysics Data System (ADS)

    Gross, Andrew; Hobson, Brian; Bouwens, Christina

    2016-05-01

    In the fall of 2015, the Joint Staff J7 (JS J7) sponsored the Bold Quest (BQ) 15.2 event and conducted planning and coordination to combine this event into a joint event with the Army Warfighting Assessment (AWA) 16.1 sponsored by the U.S. Army. This multipurpose event combined a Joint/Coalition exercise (JS J7) with components of testing, training, and experimentation required by the Army. In support of Assistant Secretary of the Army for Acquisition, Logistics, and Technology (ASA(ALT)) System of Systems Engineering and Integration (SoSE&I), Always On-On Demand (AO-OD) used a system of systems (SoS) engineering approach to develop a live, virtual, constructive distributed environment (LVC-DE) to support risk mitigation utilizing this complex and challenging exercise environment for a system preparing to enter limited user test (LUT). AO-OD executed a requirements-based SoS engineering process starting with user needs and objectives from Army Integrated Air and Missile Defense (AIAMD), Patriot units, Coalition Intelligence, Surveillance and Reconnaissance (CISR), Focused End State 4 (FES4) Mission Command (MC) Interoperability with Unified Action Partners (UAP), and Mission Partner Environment (MPE) Integration and Training, Tactics and Procedures (TTP) assessment. The SoS engineering process decomposed the common operational, analytical, and technical requirements, while utilizing the Institute of Electrical and Electronics Engineers (IEEE) Distributed Simulation Engineering and Execution Process (DSEEP) to provide structured accountability for the integration and execution of the AO-OD LVC-DE. As a result of this process implementation, AO-OD successfully planned for, prepared, and executed a distributed simulation support environment that responsively satisfied user needs and objectives, demonstrating the viability of an LVC-DE environment to support multiple user objectives and support risk mitigation activities for systems in the acquisition process.

  11. The Planetary and Space Simulation Facilities at DLR Cologne

    NASA Astrophysics Data System (ADS)

    Rabbow, Elke; Parpart, André; Reitz, Günther

    2016-06-01

    Astrobiology strives to increase our knowledge of the origin, evolution and distribution of life, on Earth and beyond. In the past centuries, life has been found on Earth in environments with extreme conditions that were expected to be uninhabitable. Scientific investigations of the underlying metabolic mechanisms and strategies that lead to the high adaptability of these extremophile organisms increase our understanding of the evolution and distribution of life on Earth. Life as we know it depends on the availability of liquid water. Exposure of organisms to defined and complex extreme environmental conditions, in particular those that limit water availability, allows the investigation of survival mechanisms as well as an estimation of the possibility of the distribution to, and survivability on, other celestial bodies of selected organisms. Space missions in low Earth orbit (LEO) give experiments access to complex environmental conditions not available on Earth, but studies of the molecular and cellular mechanisms of adaptation to these hostile conditions and of the limits of life cannot be performed exclusively in space experiments. Experimental space is limited and allows only the investigation of selected endpoints. An additional intensive ground-based program is required, with easily accessible facilities capable of simulating space and planetary environments, in particular with focus on temperature, pressure, atmospheric composition and short-wavelength solar ultraviolet radiation (UV). DLR Cologne operates a number of Planetary and Space Simulation facilities (PSI) in which microorganisms from extreme terrestrial environments, or known for their high adaptability, are exposed for mechanistic studies. Space or planetary parameters are simulated individually or in combination in temperature-controlled vacuum facilities equipped with a variety of defined and calibrated irradiation sources. The PSI facilities support basic research and have been used recurrently in pre-flight test programs for several astrobiological space missions. Parallel ground experiments provided essential complementary data supporting the scientific interpretation of the data received from the space missions.

  12. METAL AEROSOL FORMATION IN A LABORATORY SWIRL FLAME INCINERATOR

    EPA Science Inventory

    The paper describes experiments performed using an 82 kW (280,000 Btu/hr) refractory-lined horizontal tunnel combustor to examine the aerosol particle size distribution (PSD) produced by simulated nickel, cadmium, and lead wastes injected into an incineration environment. Metal c...

  13. Impact of Land Cover Characterization and Properties on Snow Albedo in Climate Models

    NASA Astrophysics Data System (ADS)

    Wang, L.; Bartlett, P. A.; Chan, E.; Montesano, P.

    2017-12-01

    The simulation of winter albedo in boreal and northern environments has been a particular challenge for land surface modellers. Assessments of output from CMIP3 and CMIP5 climate models have revealed that many simulations are characterized by overestimation of albedo in the boreal forest. Recent studies suggest that inaccurate representation of vegetation distribution, improper simulation of leaf area index, and poor treatment of canopy-snow processes are the primary causes of albedo errors. While several land cover datasets are commonly used to derive plant functional types (PFTs) for use in climate models, new land cover and vegetation datasets with higher spatial resolution have become available in recent years. In this study, we compare the spatial distribution of the dominant PFTs and canopy cover fractions based on different land cover datasets, and present results from offline simulations with the latest version of the Canadian Land Surface Scheme (CLASS) over Northern Hemisphere land. We discuss the impact of land cover representation and surface properties on winter albedo simulations in climate models.

  14. A parametric study of surface roughness and bonding mechanisms of aluminum alloys with epoxies: a molecular dynamics simulation

    NASA Astrophysics Data System (ADS)

    Timilsina, Rajendra; Termaath, Stephanie

    The marine environment is highly aggressive towards most materials. However, aluminum-magnesium alloys (Al-Mg, specifically the 5xxx series) have exceptionally long service lives in such aggressive marine environments. For instance, the Al-Mg alloy AA5083 is extensively used in naval structures because of its good mechanical strength, formability, seawater corrosion resistance and weldability. However, the bonding mechanisms of these alloys with epoxies on rough surfaces are not yet fully understood and require rigorous investigation at the molecular or atomic level. We performed molecular dynamics simulations to study adherend surface preparation and surface-bonding mechanisms of the Al-Mg alloy AA5083 with different epoxies by developing several computer models. Various distributions of surface roughness were introduced into the models, and molecular dynamics simulations were performed. Formation of a beta phase (Al3Mg2), microstructures, bonding energies at the interface, bonding strengths and durability are investigated. Office of Naval Research.

  15. Evaluation of acoustic testing techniques for spacecraft systems

    NASA Technical Reports Server (NTRS)

    Cockburn, J. A.

    1971-01-01

    External acoustic environments, structural responses, noise reductions, and the internal acoustic environments have been predicted for a typical shroud/spacecraft system during lift-off and various critical stages of flight. Spacecraft responses caused by energy transmission from the shroud via mechanical and acoustic paths have been compared and the importance of the mechanical path has been evaluated. Theoretical predictions have been compared extensively with available laboratory and in-flight measurements. Equivalent laboratory acoustic fields for simulation of shroud response during the various phases of flight have been derived and compared in detail. Techniques for varying the time-space correlations of laboratory acoustic fields have been examined, together with methods for varying the time and spatial distribution of acoustic amplitudes. Possible acoustic testing configurations for shroud/spacecraft systems have been suggested and trade-off considerations have been reviewed. The problem of simulating the acoustic environments versus simulating the structural responses has been considered and techniques for testing without the shroud installed have been discussed.

  16. Pigeon interaction mode switch-based UAV distributed flocking control under obstacle environments.

    PubMed

    Qiu, Huaxin; Duan, Haibin

    2017-11-01

    Unmanned aerial vehicle (UAV) flocking control is a serious and challenging problem due to local interactions and changing environments. In this paper, a pigeon flocking model and a pigeon coordinated obstacle-avoiding model are proposed, based on the behavior of pigeon flocks, which switch between hierarchical and egalitarian interaction modes at different flight phases. Owing to the essential similarity between bird flocks and UAV swarms, a distributed flocking control algorithm based on the proposed pigeon flocking and coordinated obstacle-avoiding models is designed to coordinate a heterogeneous UAV swarm to fly through obstacle environments with few informed individuals. Comparative simulation results are elaborated to show the feasibility, validity and superiority of the proposed algorithm. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
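    The mode-switching idea, hierarchical (leader-weighted) versus egalitarian (equal-weight) velocity alignment, can be sketched as a toy consensus rule. This is not the paper's pigeon model: the weights, update rule, and mode schedule below are all assumed for illustration.

```python
import math
import random

random.seed(5)

def flock_step(pos, vel, mode, leader=0, radius=10.0, dt=0.1):
    """One distributed flocking update. In 'egalitarian' mode each UAV
    aligns equally with all neighbours within `radius`; in 'hierarchical'
    mode the leader's velocity is weighted more strongly."""
    new_vel = []
    for i, (p, v) in enumerate(zip(pos, vel)):
        ax = ay = wsum = 0.0
        for j, (q, w) in enumerate(zip(pos, vel)):
            if i != j and math.dist(p, q) <= radius:
                wt = 3.0 if (mode == "hierarchical" and j == leader) else 1.0
                ax += wt * w[0]
                ay += wt * w[1]
                wsum += wt
        if wsum:
            v = (0.7 * v[0] + 0.3 * ax / wsum, 0.7 * v[1] + 0.3 * ay / wsum)
        new_vel.append(v)
    new_pos = [(p[0] + v[0] * dt, p[1] + v[1] * dt)
               for p, v in zip(pos, new_vel)]
    return new_pos, new_vel

n = 10
pos = [(random.uniform(0, 4), random.uniform(0, 4)) for _ in range(n)]
vel = [(random.uniform(-0.5, 0.5), random.uniform(-0.5, 0.5))
       for _ in range(n)]
for t in range(50):
    # Mode switch mid-flight, mimicking phase-dependent interaction.
    mode = "hierarchical" if t < 25 else "egalitarian"
    pos, vel = flock_step(pos, vel, mode)
spread = max(math.dist(u, v) for u in vel for v in vel)
print(spread < 0.1)  # headings have converged to a common direction
```

    Because only local (within-radius) information is used, the rule is distributed: each UAV could run it on board with neighbor broadcasts alone.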

  17. Running climate model on a commercial cloud computing environment: A case study using Community Earth System Model (CESM) on Amazon AWS

    NASA Astrophysics Data System (ADS)

    Chen, Xiuhong; Huang, Xianglei; Jiao, Chaoyi; Flanner, Mark G.; Raeker, Todd; Palen, Brock

    2017-01-01

    The suites of numerical models used for simulating the climate of our planet are usually run on dedicated high-performance computing (HPC) resources. This study investigates an alternative to the usual approach: carrying out climate model simulations in a commercially available cloud computing environment. We test the performance and reliability of running the CESM (Community Earth System Model), a flagship climate model in the United States developed by the National Center for Atmospheric Research (NCAR), on Amazon Web Services (AWS) EC2, the cloud computing environment of Amazon.com, Inc. StarCluster is used to create a virtual computing cluster on AWS EC2 for the CESM simulations. The wall-clock time for one year of CESM simulation on the AWS EC2 virtual cluster is comparable to the time spent for the same simulation on a local dedicated high-performance computing cluster with InfiniBand connections. The CESM simulation can be efficiently scaled with the number of CPU cores on the AWS EC2 virtual cluster environment up to 64 cores. For the standard configuration of the CESM at a spatial resolution of 1.9° latitude by 2.5° longitude, increasing the number of cores from 16 to 64 reduces the wall-clock running time by more than 50%, and the scaling is nearly linear. Beyond 64 cores, the communication latency starts to outweigh the benefit of distributed computing and the parallel speedup becomes nearly unchanged.
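    The reported scaling can be framed with the usual speedup and parallel-efficiency arithmetic. The wall-clock times below are assumed illustrative numbers consistent with "more than 50% reduction" from 16 to 64 cores, not the paper's measurements.

```python
def speedup(t_base, t_scaled):
    """Ratio of baseline to scaled wall-clock time."""
    return t_base / t_scaled

def efficiency(sp, core_ratio):
    """Parallel efficiency relative to ideal (linear) scaling."""
    return sp / core_ratio

# Hypothetical hours per simulated year at 16 and 64 cores.
t16, t64 = 10.0, 3.0
sp = speedup(t16, t64)
eff = efficiency(sp, 64 / 16)
print(round(sp, 2), round(eff, 2))  # → 3.33 0.83
```

    An efficiency near 1.0 corresponds to the "nearly linear" regime; the drop beyond 64 cores described in the abstract would show up as efficiency falling off as communication latency dominates.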

  18. Implementation of quantum key distribution network simulation module in the network simulator NS-3

    NASA Astrophysics Data System (ADS)

    Mehic, Miralem; Maurhart, Oliver; Rass, Stefan; Voznak, Miroslav

    2017-10-01

    As research in quantum key distribution (QKD) technology grows larger and more complex, highly accurate and scalable simulation technologies become important for assessing practical feasibility and foreseeing difficulties in the practical implementation of theoretical achievements. Because a QKD link requires both an optical and an Internet connection between the network nodes, deploying a complete testbed containing multiple network hosts and links to validate and verify a network algorithm or protocol would be very costly. Network simulators in these circumstances save vast amounts of money and time in accomplishing such a task. The simulation environment offers the creation of complex network topologies, a high degree of control, and repeatable experiments, which in turn allows researchers to conduct experiments and confirm their results. In this paper, we describe the design of a QKD network simulation module developed for version 3 of the network simulator NS-3. The module supports simulation of a QKD network in an overlay mode or in a single TCP/IP mode, and can therefore also be used to simulate other network technologies independently of QKD.
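    At the protocol level, the key material such a network distributes comes from a QKD protocol like BB84. The sifting stage can be sketched minimally; this is a textbook toy (ideal channel, no eavesdropper, no error correction or privacy amplification), not the NS-3 module itself.

```python
import random

random.seed(11)

def bb84_sift(n):
    """Toy BB84 sifting: Alice sends random bits in random bases, Bob
    measures in random bases; both keep only the positions where the
    bases happened to agree."""
    alice_bits = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.choice("XZ") for _ in range(n)]
    bob_bases = [random.choice("XZ") for _ in range(n)]
    # Matching basis -> Bob reads Alice's bit; otherwise his result
    # is random (a measurement in the wrong basis).
    bob_bits = [b if ab == bb else random.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    keep = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
    return [alice_bits[i] for i in keep], [bob_bits[i] for i in keep]

ka, kb = bb84_sift(1000)
print(ka == kb, len(ka))  # sifted keys agree; about half the bits survive
```

    In a network simulation, the interesting questions sit one layer up: how sifted-key material is buffered, routed, and consumed by encryptors across many such links, which is what the NS-3 module models.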

  19. Design and simulation of sensor networks for tracking Wifi users in outdoor urban environments

    NASA Astrophysics Data System (ADS)

    Thron, Christopher; Tran, Khoi; Smith, Douglas; Benincasa, Daniel

    2017-05-01

    We present a proof-of-concept investigation into the use of sensor networks for tracking WiFi users in outdoor urban environments. Sensors are fixed and are capable of measuring signal power from users' WiFi devices. We derive a maximum likelihood estimate of user location based on instantaneous sensor power measurements. The algorithm takes into account the effects of power control and is self-calibrating, in that the signal power model used by the location algorithm is adjusted and improved as part of the operation of the network. Simulation results verifying the system's performance are presented. The simulation scenario is based on a 1.5 km² area of lower Manhattan. The self-calibration mechanism was verified for initial rms (root mean square) errors of up to 12 dB in the channel power estimates: rms errors were reduced by over 60% in 300 track-hours, in systems with limited power control. Under typical operating conditions with (without) power control, location rms errors are about 8.5 (5) meters, with 90% accuracy within 9 (13) meters, for both pedestrian and vehicular users. The distance-error distributions for smaller distances (<30 m) are well approximated by an exponential distribution, while the distributions for large distance errors have fat tails. The issue of optimal sensor placement in the sensor network is also addressed. We specify a linear programming algorithm for determining sensor placement for networks with a reduced number of sensors. In our test case, the algorithm produces a network with 18.5% fewer sensors and comparable location-estimation accuracy. Finally, we discuss future research directions for improving the accuracy and capabilities of sensor network systems in urban environments.
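    The maximum-likelihood location step can be sketched under standard simplifying assumptions: a log-distance path-loss model with i.i.d. Gaussian shadowing, in which case the ML estimate reduces to least squares over predicted versus measured powers. The channel model, transmit power, and grid search below are illustrative assumptions, not the paper's calibrated model.

```python
import math

def rx_power(tx_dbm, d, n_exp=3.0, d0=1.0):
    """Assumed log-distance path-loss model (received power in dBm)."""
    return tx_dbm - 10.0 * n_exp * math.log10(max(d, d0) / d0)

def ml_locate(sensors, measured_dbm, tx_dbm=20.0, step=5.0, size=200.0):
    """Grid-search ML location: minimize squared error between the
    modelled and measured powers over candidate user positions."""
    best, best_err = None, float("inf")
    x = 0.0
    while x <= size:
        y = 0.0
        while y <= size:
            err = sum((m - rx_power(tx_dbm, math.dist((x, y), s))) ** 2
                      for s, m in zip(sensors, measured_dbm))
            if err < best_err:
                best, best_err = (x, y), err
            y += step
        x += step
    return best

sensors = [(0.0, 0.0), (200.0, 0.0), (0.0, 200.0), (200.0, 200.0)]
true_user = (60.0, 115.0)
measured = [rx_power(20.0, math.dist(true_user, s)) for s in sensors]
print(ml_locate(sensors, measured))  # → (60.0, 115.0) in the noiseless case
```

    Adding Gaussian noise to `measured` and re-estimating many times would reproduce the kind of distance-error distribution the paper analyzes; self-calibration corresponds to jointly refining `n_exp` and per-link offsets from accumulated tracks.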

  20. A Markov Environment-dependent Hurricane Intensity Model and Its Comparison with Multiple Dynamic Models

    NASA Astrophysics Data System (ADS)

    Jing, R.; Lin, N.; Emanuel, K.; Vecchi, G. A.; Knutson, T. R.

    2017-12-01

    A Markov environment-dependent hurricane intensity model (MeHiM) is developed to simulate the climatology of hurricane intensity given the surrounding large-scale environment. The model considers three unobserved discrete states representing, respectively, the storm's slow, moderate, and rapid intensification (or deintensification). Each state is associated with a probability distribution of intensity change. The storm's movement from one state to another, regarded as a Markov chain, is described by a transition probability matrix. The initial state is estimated with a Bayesian approach. All three model components (initial intensity, state transition, and intensity change) depend on environmental variables including potential intensity, vertical wind shear, midlevel relative humidity, and ocean mixing characteristics. This environment-dependent Markov model of hurricane intensity shows a significant improvement over previous statistical models (e.g., linear, nonlinear, and finite mixture models) in estimating the distributions of 6-h and 24-h intensity change, lifetime maximum intensity, and landfall intensity. Here we compare MeHiM with various dynamical models, including a global climate model [the High-Resolution Forecast-Oriented Low Ocean Resolution model (HiFLOR)], a regional hurricane model [the Geophysical Fluid Dynamics Laboratory (GFDL) hurricane model], and a simplified hurricane dynamical model [the Coupled Hurricane Intensity Prediction System (CHIPS)] and its newly developed fast simulator. The MeHiM, developed from reanalysis data, is applied to estimate the intensity of simulated storms for comparison with the dynamical-model predictions under the current climate. The dependence of hurricanes on the environment under current and projected future climates in the various models will also be compared statistically.
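    The model's skeleton, hidden states with state-dependent intensity-change distributions linked by a transition matrix, can be sketched as below. The states, transition probabilities, and Gaussian change distributions are all assumed toy values, not MeHiM's fitted, environment-dependent components.

```python
import random

random.seed(9)

# Toy 3-state Markov intensity model (illustrative stand-in for MeHiM).
STATES = ["slow", "moderate", "rapid"]
TRANSITION = {               # rows sum to 1
    "slow":     [0.80, 0.15, 0.05],
    "moderate": [0.20, 0.60, 0.20],
    "rapid":    [0.10, 0.30, 0.60],
}
CHANGE = {                   # 6-h intensity change (kt): mean, std
    "slow": (0.5, 2.0), "moderate": (3.0, 4.0), "rapid": (8.0, 5.0),
}

def simulate_track(v0=35.0, steps=40, state="slow"):
    """Simulate one intensity track: draw a state-dependent intensity
    change, then transition to the next state."""
    v, track = v0, [v0]
    for _ in range(steps):
        mu, sd = CHANGE[state]
        v = max(0.0, v + random.gauss(mu, sd))
        track.append(v)
        state = random.choices(STATES, weights=TRANSITION[state])[0]
    return track

track = simulate_track()
print(len(track), round(max(track), 1))
```

    In MeHiM, the transition matrix and change distributions are themselves functions of potential intensity, shear, humidity, and ocean mixing, so each storm effectively walks through an environment-modulated version of this chain.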

  1. Self-Organizing Distributed Architecture Supporting Dynamic Space Expanding and Reducing in Indoor LBS Environment

    PubMed Central

    Jeong, Seol Young; Jo, Hyeong Gon; Kang, Soon Ju

    2015-01-01

    Indoor location-based services (iLBS) are extremely dynamic and changeable, and include numerous resources and mobile devices. In particular, the network infrastructure requires support for high scalability in the indoor environment, and various resource lookups are requested concurrently and frequently from several locations based on the dynamic network environment. A traditional map-based centralized approach for iLBSs has several disadvantages: it requires global knowledge to maintain a complete geographic indoor map; the central server is a single point of failure; it can also cause low scalability and traffic congestion; and it is hard to adapt to a change of service area in real time. This paper proposes a self-organizing and fully distributed platform for iLBSs. The proposed self-organizing distributed platform provides a dynamic reconfiguration of locality accuracy and service coverage by expanding and contracting dynamically. In order to verify the suggested platform, scalability performance according to the number of inserted or deleted nodes composing the dynamic infrastructure was evaluated through a simulation similar to the real environment. PMID:26016908

  2. Simulation on the internal structure of three-dimensional proximal tibia under different mechanical environments.

    PubMed

    Fang, Juan; Gong, He; Kong, Lingyan; Zhu, Dong

    2013-12-20

    Bone can adjust its morphological structure to adapt to changes in its mechanical environment, i.e. bone structure change is related to mechanical loading. This implies that osteoarthritis may be closely associated with knee joint deformity. The purposes of this paper were to simulate the internal bone mineral density (BMD) change in the three-dimensional (3D) proximal tibia under different mechanical environments, and to explore the relationship between mechanical environment and bone morphological abnormality. The right proximal tibia was scanned with CT to reconstruct a 3D proximal tibia model in MIMICS, which was then imported into the finite element software ANSYS to establish a 3D finite element model. The internal structure of the 3D proximal tibia of young normal people was simulated using quantitative bone remodeling theory in combination with the finite element method. Then, based on the changing pattern of joint contact force on the tibial plateau in valgus knees, the mechanical loading was changed, and the simulated normal tibia structure was used as the initial structure to simulate the internal structure of the 3D proximal tibia of old people with 6° valgus deformity. Four regions of interest (ROIs) were selected in the proximal tibia to quantitatively analyze BMD and compare with clinical measurements. The simulation results showed that the BMD distribution in the 3D proximal tibia was consistent with clinical measurements in normal knees, and that in valgus knees it was consistent with measurements of patients with osteoarthritis in clinics. This shows that a change in mechanical environment is the main cause of subchondral bone structure change, and that prolonged exposure to an abnormal mechanical environment may lead to osteoarthritis. Moreover, the simulation method adopted in this paper can accurately simulate the internal structure of the 3D proximal tibia under different mechanical environments. It helps to better understand the mechanism of osteoarthritis and provides a theoretical basis and computational method for the prevention and treatment of osteoarthritis. It can also serve as a basis for further study of periprosthetic BMD changes after total knee arthroplasty, and provide a theoretical basis for the optimized design of prostheses.
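    The quantitative bone remodeling idea, density adapting until the mechanical stimulus per unit mass matches a setpoint, can be sketched with a generic Huiskes-type law. The rate constant, setpoint, and density bounds below are assumed illustrative values and not necessarily the paper's exact formulation.

```python
def remodel(density, stimulus, setpoint=1.0, B=1.0, dt=0.1,
            rho_min=0.01, rho_max=1.74):
    """One step of an assumed strain-energy-driven remodeling rule:
    density grows when stimulus per unit density exceeds the setpoint
    and resorbs when it falls below, clamped to physiological bounds."""
    d_rho = B * (stimulus / density - setpoint) * dt
    return min(rho_max, max(rho_min, density + d_rho))

# Under a constant stimulus, density converges to stimulus / setpoint.
rho = 0.5
for _ in range(500):
    rho = remodel(rho, stimulus=1.2)
print(round(rho, 3))  # → 1.2
```

    In the paper's workflow this update runs per finite element, with the stimulus recomputed from the ANSYS stress solution each iteration, so altered valgus loading shifts the equilibrium BMD distribution region by region.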

  3. Simulation on the internal structure of three-dimensional proximal tibia under different mechanical environments

    PubMed Central

    2013-01-01

    Background Bone can adjust its morphological structure to adapt to changes in its mechanical environment, i.e. bone structure change is related to mechanical loading. This implies that osteoarthritis may be closely associated with knee joint deformity. The purposes of this paper were to simulate the internal bone mineral density (BMD) change in the three-dimensional (3D) proximal tibia under different mechanical environments, and to explore the relationship between mechanical environment and bone morphological abnormality. Methods The right proximal tibia was scanned with CT to reconstruct a 3D proximal tibia model in MIMICS, which was then imported into the finite element software ANSYS to establish a 3D finite element model. The internal structure of the 3D proximal tibia of young normal people was simulated using quantitative bone remodeling theory in combination with the finite element method. Then, based on the changing pattern of joint contact force on the tibial plateau in valgus knees, the mechanical loading was changed, and the simulated normal tibia structure was used as the initial structure to simulate the internal structure of the 3D proximal tibia of old people with 6° valgus deformity. Four regions of interest (ROIs) were selected in the proximal tibia to quantitatively analyze BMD and compare with clinical measurements. Results The simulation results showed that the BMD distribution in the 3D proximal tibia was consistent with clinical measurements in normal knees, and that in valgus knees it was consistent with measurements of patients with osteoarthritis in clinics. Conclusions It is shown that a change in mechanical environment is the main cause of subchondral bone structure change, and that prolonged exposure to an abnormal mechanical environment may lead to osteoarthritis. Moreover, the simulation method adopted in this paper can accurately simulate the internal structure of the 3D proximal tibia under different mechanical environments. It helps to better understand the mechanism of osteoarthritis and provides a theoretical basis and computational method for the prevention and treatment of osteoarthritis. It can also serve as a basis for further study of periprosthetic BMD changes after total knee arthroplasty, and provide a theoretical basis for the optimized design of prostheses. PMID:24359345

  4. Simulations of large acoustic scintillations in the straits of Florida.

    PubMed

    Tang, Xin; Tappert, F D; Creamer, Dennis B

    2006-12-01

    Using a full-wave acoustic model, Monte Carlo numerical studies of intensity fluctuations have been performed in a realistic shallow water environment that simulates the Straits of Florida, including internal wave fluctuations and bottom roughness. Results show that the sound intensity at distant receivers scintillates dramatically. The acoustic scintillation index SI increases rapidly with propagation range and is significantly greater than unity at ranges beyond about 10 km. This result supports a theoretical prediction by one of the authors. Statistical analyses show that the intensity distribution of the random wave field first saturates to the expected Rayleigh distribution with SI = 1 at short range, due to multipath interference effects, and then SI continues to increase to large values. This effect, denoted supersaturation, is universal at long ranges in waveguides having lossy boundaries (where there is differential mode attenuation). The intensity distribution approaches a log-normal distribution to an excellent approximation, although it may not be a universal distribution; comparison is also made with a K distribution. The long tails of the log-normal distribution cause "acoustic intermittency," in which very high, but rare, intensities occur.
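    The scintillation index is defined as SI = <I²>/<I>² − 1, and the saturated (Rayleigh-amplitude) case corresponds to exponentially distributed intensity with SI = 1, while a log-normal intensity gives SI = exp(σ²) − 1 > 1. A quick numerical check of these two facts (illustrative sampling, not the paper's full-wave simulation):

```python
import math
import random

random.seed(13)

def scintillation_index(intensities):
    """SI = <I^2> / <I>^2 - 1; equals 1 for exponentially distributed
    (Rayleigh-saturated) intensity."""
    m1 = sum(intensities) / len(intensities)
    m2 = sum(i * i for i in intensities) / len(intensities)
    return m2 / m1 ** 2 - 1.0

n = 200000
# Saturated multipath: exponential intensity -> SI ~ 1.
exp_I = [random.expovariate(1.0) for _ in range(n)]
# Log-normal intensity with sigma = 1 -> SI = e - 1 ~ 1.72, i.e.
# supersaturated (SI > 1), as found at long range in the simulations.
logn_I = [math.exp(random.gauss(0.0, 1.0)) for _ in range(n)]
print(round(scintillation_index(exp_I), 1),
      scintillation_index(logn_I) > 1.0)
```

    The heavy log-normal tail is also what produces the "acoustic intermittency": rare samples far above the mean dominate the second moment.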

  5. Shadow Mode Assessment Using Realistic Technologies for the National Airspace (SMART NAS)

    NASA Technical Reports Server (NTRS)

    Kopardekar, Parimal H.

    2014-01-01

    Develop a simulation and modeling capability that includes: (a) assessment of multiple parallel universes, (b) acceptance of data feeds, (c) a live-virtual-constructive distributed environment, and (d) integrated examination of concepts, algorithms, technologies, and National Airspace System (NAS) architectures.

  6. Comparison of predictive estimates of high-latitude electrodynamics with observations of global-scale Birkeland currents

    NASA Astrophysics Data System (ADS)

    Anderson, Brian J.; Korth, Haje; Welling, Daniel T.; Merkin, Viacheslav G.; Wiltberger, Michael J.; Raeder, Joachim; Barnes, Robin J.; Waters, Colin L.; Pulkkinen, Antti A.; Rastaetter, Lutz

    2017-02-01

    Two of the geomagnetic storms for the Space Weather Prediction Center Geospace Environment Modeling challenge occurred after data were first acquired by the Active Magnetosphere and Planetary Electrodynamics Response Experiment (AMPERE). We compare Birkeland currents from AMPERE with predictions from four models for the 4-5 April 2010 and 5-6 August 2011 storms. The four models are the Weimer (2005b) field-aligned current statistical model, the Lyon-Fedder-Mobarry magnetohydrodynamic (MHD) simulation, the Open Global Geospace Circulation Model MHD simulation, and the Space Weather Modeling Framework MHD simulation. The MHD simulations were run as described in Pulkkinen et al. (2013) and the results obtained from the Community Coordinated Modeling Center. The total radial Birkeland current, ITotal, and the distribution of radial current density, Jr, for all models are compared with AMPERE results. While the total currents are well correlated, the quantitative agreement varies considerably. The Jr distributions reveal discrepancies between the models and observations related to the latitude distribution, morphologies, and lack of nightside current systems in the models. The results motivate enhancing the simulations first by increasing the simulation resolution and then by examining the relative merits of implementing more sophisticated ionospheric conductance models, including ionospheric outflows or other omitted physical processes. Some aspects of the system, including substorm timing and location, may remain challenging to simulate, implying a continuing need for real-time specification.

  7. Prediction of the Aerothermodynamic Environment of the Huygens Probe

    NASA Technical Reports Server (NTRS)

    Hollis, Brian R.; Striepe, Scott A.; Wright, Michael J.; Bose, Deepak; Sutton, Kenneth; Takashima, Naruhisa

    2005-01-01

    An investigation of the aerothermodynamic environment of the Huygens entry probe has been conducted. A Monte Carlo simulation of the trajectory of the probe during entry into Titan's atmosphere was performed to identify a worst-case heating rate trajectory. Flowfield and radiation transport computations were performed at points along this trajectory to obtain convective and radiative heat-transfer distributions on the probe's heat shield. This investigation identified important physical and numerical factors, including atmospheric CH4 concentration, transition to turbulence, numerical diffusion modeling, and radiation modeling, which strongly influenced the aerothermodynamic environment.

  8. A new HLA-based distributed control architecture for agricultural teams of robots in hybrid applications with real and simulated devices or environments.

    PubMed

    Nebot, Patricio; Torres-Sospedra, Joaquín; Martínez, Rafael J

    2011-01-01

    The control architecture is one of the most important parts of agricultural robotics and other robotic systems, and its importance increases when the system involves a group of heterogeneous robots that must cooperate to achieve a global goal. This paper introduces a new control architecture for groups of robots in charge of performing maintenance tasks in agricultural environments. Important features such as scalability, code reuse, hardware abstraction, and data distribution have been considered in the design of the new architecture, and coordination and cooperation among the different elements of the system are supported. These concepts are realized by integrating the network-oriented device server Player, the Java Agent Development Framework (JADE), and the High Level Architecture (HLA). HLA can be considered the most important part because it not only provides data distribution and implicit communication among the parts of the system but also allows simulated and real entities to operate simultaneously, thus enabling the use of hybrid systems in the development of applications.

  9. Effects of dispersal on total biomass in a patchy, heterogeneous system: Analysis and experiment.

    PubMed

    Zhang, Bo; Liu, Xin; DeAngelis, D L; Ni, Wei-Ming; Wang, G Geoff

    2015-06-01

    An intriguing recent result from mathematics is that a population diffusing at an intermediate rate in an environment in which resources vary spatially will reach a higher total equilibrium biomass than the population in an environment in which the same total resources are distributed homogeneously. We extended the current mathematical theory to apply to logistic growth and also showed that the result applies to patchy systems with dispersal among patches, both for continuous and discrete time. This allowed us to make specific predictions, through simulations, concerning the biomass dynamics, which were verified by a laboratory experiment. The experiment was a study of biomass growth of duckweed (Lemna minor Linn.), where the resources (nutrients added to water) were distributed homogeneously among a discrete series of water-filled containers in one treatment and heterogeneously in another. The experimental results showed that total biomass peaked at an intermediate, relatively low diffusion rate, exceeding the total carrying capacity of the system and agreeing with the simulation model. The implications of the experiment for source, sink, and pseudo-sink dynamics are discussed. Copyright © 2015 Elsevier Inc. All rights reserved.
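
    The mathematical result this abstract builds on can be checked with a toy two-patch model. The sketch below integrates logistic growth plus linear dispersal to equilibrium; the resource values, dispersal rate D, and time step are illustrative assumptions, not the duckweed experiment's parameters.

```python
# Two-patch logistic growth with dispersal:
#   du_i/dt = D*(u_j - u_i) + u_i*(m_i - u_i)
# Total resources m1 + m2 = 2 in both treatments; only their spatial
# distribution differs. All numbers are illustrative assumptions.

def equilibrium_total(m1, m2, D, dt=0.01, steps=200_000):
    u1, u2 = 0.1, 0.1   # small initial populations in each patch
    for _ in range(steps):
        du1 = D * (u2 - u1) + u1 * (m1 - u1)
        du2 = D * (u1 - u2) + u2 * (m2 - u2)
        u1 += dt * du1
        u2 += dt * du2
    return u1 + u2

hetero = equilibrium_total(1.5, 0.5, D=0.3)  # heterogeneous resources
homo = equilibrium_total(1.0, 1.0, D=0.3)    # same total, homogeneous
```

    With these settings the heterogeneous treatment settles near a total of 2.1, above the summed carrying capacity of 2.0, while the homogeneous treatment settles at exactly 2.0, matching the qualitative result the abstract reports.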

  10. Synthetic depth data creation for sensor setup planning and evaluation of multi-camera multi-person trackers

    NASA Astrophysics Data System (ADS)

    Pattke, Marco; Martin, Manuel; Voit, Michael

    2017-05-01

    Tracking people with cameras in public areas is common today. However, with an increasing number of cameras it becomes harder and harder to review the data manually. Especially in safety-critical areas, automatic image exploitation could help to solve this problem. Setting up such a system can, however, be difficult because of its complexity: sensor placement is critical to ensure that people are detected and tracked reliably. We address this problem with a simulation framework that can simulate different camera setups in the desired environment, including animated characters. We combine this framework with our self-developed distributed and scalable system for people tracking to test its effectiveness, and can show the results of the tracking system in real time in the simulated environment.

  11. Visualization and Tracking of Parallel CFD Simulations

    NASA Technical Reports Server (NTRS)

    Vaziri, Arsi; Kremenetsky, Mark

    1995-01-01

    We describe a system for interactive visualization and tracking of a 3-D unsteady computational fluid dynamics (CFD) simulation on a parallel computer. CM/AVS, a distributed, parallel implementation of a visualization environment (AVS), runs on the CM-5 parallel supercomputer. A CFD solver is run as a CM/AVS module on the CM-5. Data communication between the solver, other parallel visualization modules, and a graphics workstation, which is running AVS, is handled by CM/AVS. Partitioning of the visualization task between the CM-5 and the workstation can be done interactively in the visual programming environment provided by AVS. Flow solver parameters can also be altered by programmable interactive widgets. This system partially removes the requirement of storing large solution files at frequent time steps, a characteristic of the traditional 'simulate → store → visualize' post-processing approach.

  12. Comparison of Ejecta Distributions from Normal Incident Hypervelocity Impact on Lunar Regolith Simulant

    NASA Technical Reports Server (NTRS)

    Edwards, David L.; Cooke, William; Scruggs, Rob; Moser, Danielle E.

    2008-01-01

    The National Aeronautics and Space Administration (NASA) is progressing toward long-term lunar habitation. Critical to the design of a lunar habitat is an understanding of the lunar surface environment; of specific importance is the primary meteoroid and subsequent ejecta environment. The document NASA SP-8013 was developed for the Apollo program and is the latest definition of the ejecta environment. There is concern that NASA SP-8013 may overestimate the lunar ejecta environment. NASA's Meteoroid Environment Office (MEO) has initiated several tasks to improve the accuracy of our understanding of the lunar surface ejecta environment. This paper reports the results of experiments on projectile impact into powdered pumice and unconsolidated JSC-1A Lunar Mare Regolith simulant (JSC-1A) targets. The Ames Vertical Gun Range (AVGR) was used to accelerate projectiles to velocities in excess of 5 km/s and impact the targets at normal incidence. The ejected particles were detected by thin aluminum foil targets placed around the impact site, and angular distributions were determined for the ejecta. Comparison of the ejecta angular distribution with previous works is presented. A simplistic technique to characterize the ejected particles was formulated, and improvements to this technique are discussed for implementation in future tests.

  13. Pit formation observed in a multilayer dielectric coating as a result of simulated space environmental exposure

    NASA Astrophysics Data System (ADS)

    Fuqua, Peter D.; Presser, Nathan; Barrie, James D.; Meshishnek, Michael J.; Coleman, Dianne J.

    2002-06-01

    Certain spaceborne telescope designs require that dielectric-coated lenses be exposed to the energetic electrons and protons associated with the space environment. Test coupons that were exposed to a simulated space environment showed extensive pitting as a result of dielectric breakdown. A typical pit was 50-100 μm across at the surface and extended to the substrate material, in which a 10-μm-diameter melt region was found. Pitting was not observed on similar samples that had also been overcoated with a transparent conductive thin film. Measurement of the bidirectional reflectance distribution function showed that pitting caused a fivefold to tenfold increase in the scattering of visible light.

  14. Atomistic simulations of TeO₂-based glasses: interatomic potentials and molecular dynamics.

    PubMed

    Gulenko, Anastasia; Masson, Olivier; Berghout, Abid; Hamani, David; Thomas, Philippe

    2014-07-21

    In this work we present for the first time empirical interatomic potentials that are able to reproduce TeO2-based systems. Using these potentials in classical molecular dynamics simulations, we obtained first results for the pure TeO2 glass structure model. The calculated pair distribution function is in good agreement with the experimental one, which indicates a realistic glass structure model. We investigated the short- and medium-range TeO2 glass structures. The local environment of the Te atom strongly varies, so that the glass structure model has a broad Q polyhedral distribution. The glass network is described as weakly connected with a large number of terminal oxygen atoms.

  15. Optimal atomic structure of amorphous silicon obtained from density functional theory calculations

    NASA Astrophysics Data System (ADS)

    Pedersen, Andreas; Pizzagalli, Laurent; Jónsson, Hannes

    2017-06-01

    Atomic structure of amorphous silicon consistent with several reported experimental measurements has been obtained from annealing simulations using electron density functional theory calculations and a systematic removal of weakly bound atoms. The excess energy and density with respect to the crystal are well reproduced in addition to radial distribution function, angular distribution functions, and vibrational density of states. No atom in the optimal configuration is locally in a crystalline environment as deduced by ring analysis and common neighbor analysis, but coordination defects are present at a level of 1%-2%. The simulated samples provide structural models of this archetypal disordered covalent material without preconceived notion of the atomic ordering or fitting to experimental data.
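
    The radial distribution function mentioned above is a standard structural measure for such models. The sketch below is a minimal O(N^2) RDF for a periodic cubic box; the box size, particle count, and bin width are illustrative assumptions, and a random (ideal-gas) configuration is used only to check the normalization, not to model amorphous silicon.

```python
# Minimal radial distribution function g(r) for a periodic cubic box.
# Normalized against the ideal gas, so an uncorrelated configuration
# should give g(r) ~ 1. All sizes here are illustrative assumptions.
import math, random

def rdf(positions, box, r_max, n_bins):
    n = len(positions)
    dr = r_max / n_bins
    hist = [0] * n_bins
    for i in range(n):
        for j in range(i + 1, n):
            d2 = 0.0
            for a in range(3):  # minimum-image distance in a cubic box
                diff = positions[i][a] - positions[j][a]
                diff -= box * round(diff / box)
                d2 += diff * diff
            d = math.sqrt(d2)
            if d < r_max:
                hist[int(d / dr)] += 2   # count the pair for both atoms
    rho = n / box ** 3
    g = []
    for k in range(n_bins):
        shell = 4.0 / 3.0 * math.pi * (((k + 1) * dr) ** 3 - (k * dr) ** 3)
        g.append(hist[k] / (n * rho * shell))  # ideal-gas normalization
    return g

random.seed(1)
box, n = 10.0, 500
pos = [[random.uniform(0, box) for _ in range(3)] for _ in range(n)]
g = rdf(pos, box, r_max=4.0, n_bins=20)
# For an uncorrelated configuration g(r) fluctuates around 1 at all r
```

    Feeding in simulated atomic coordinates instead of random ones would produce the peaked g(r) that is compared against experiment in studies like the one above.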

  16. Testing the ability of a semidistributed hydrological model to simulate contributing area

    NASA Astrophysics Data System (ADS)

    Mengistu, S. G.; Spence, C.

    2016-06-01

    A dry climate, the prevalence of small depressions, and the lack of a well-developed drainage network are characteristics of environments with extremely variable contributing areas to runoff. These types of regions arguably present the greatest challenge to properly understanding catchment streamflow generation processes. Previous studies have shown that contributing area dynamics are important for streamflow response, but the nature of the relationship between the two is not typically understood. Furthermore, it is not often tested how well hydrological models simulate contributing area. In this study, the ability of a semidistributed hydrological model, the PDMROF configuration of Environment Canada's MESH model, was tested to determine if it could simulate contributing area. The study focused on the St. Denis Creek watershed in central Saskatchewan, Canada, which, with its considerable topographic depressions, exhibits wide variation in contributing area, making it ideal for this type of investigation. MESH-PDMROF was able to replicate contributing area derived independently from satellite imagery. Daily model simulations revealed a hysteretic relationship between contributing area and streamflow not apparent from the less frequent remote sensing observations. This exercise revealed that contributing area extent can be simulated by a semidistributed hydrological model with a scheme that assumes the storage capacity distribution can be represented with a probability function. However, further investigation is needed to determine if it can adequately represent the complex relationship between streamflow and contributing area that is such a key signature of catchment behavior.
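
    The probability-function storage scheme the abstract refers to can be sketched with a PDM-style model, in which point storage capacities follow a Pareto distribution and the contributing (saturated) area is that distribution evaluated at the current critical capacity. The c_max and b values below are illustrative assumptions, not MESH-PDMROF calibration values.

```python
# PDM-style contributing area: point storage capacities c follow
# F(c) = 1 - (1 - c/c_max)**b, so the saturated fraction of the basin
# is F(c*) at the current critical capacity c*. Parameters are
# illustrative assumptions.

def contributing_area(storage, c_max=100.0, b=1.5):
    """Saturated basin fraction at basin-average storage S.

    Inverts S(c*) = (c_max/(b+1)) * (1 - (1 - c*/c_max)**(b+1)) for the
    critical capacity c*, then returns F(c*)."""
    s_max = c_max / (b + 1.0)                 # storage at full saturation
    storage = min(max(storage, 0.0), s_max)   # clamp to physical range
    inner = (1.0 - storage / s_max) ** (1.0 / (b + 1.0))  # = 1 - c*/c_max
    return 1.0 - inner ** b                   # F(c*)

# Contributing area grows nonlinearly as the basin wets up
areas = [contributing_area(s) for s in (0.0, 10.0, 20.0, 30.0, 40.0)]
```

    The nonlinear, storage-dependent growth of the saturated fraction is one way a single probability function can reproduce the variable contributing areas described above; the hysteresis the study reports comes from the storage state lagging streamflow.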

  17. Guideline on Scenario Development for (Distributed) Simulation Environments (Guide en vue du developpement de scenario dans le cadre de simulation distribuee)

    DTIC Science & Technology

    2015-01-01

    …the NMSG MORS analysis of M&S gaps determined that the lack of simulation interoperability was the priority capability gap to be filled… an exploratory team (ET-027) was established within the NMSG to study simulation interoperability. ET-027 … higher levels (that is, the pragmatic, dynamic, and conceptual levels), as well as the relative automation of the development of…

  18. Strain distribution of confined Ge/GeO2 core/shell nanoparticles engineered by growth environments

    NASA Astrophysics Data System (ADS)

    Wei, Wenyan; Yuan, Cailei; Luo, Xingfang; Yu, Ting; Wang, Gongping

    2016-02-01

    The strain distributions of Ge/GeO2 core/shell nanoparticles confined in different host matrices grown by surface oxidation are investigated. Simulation results obtained by the finite element method demonstrate that the strains of the Ge core and the GeO2 shell strongly depend on the growth environments of the nanoparticles. Moreover, the strain in the Ge core transforms from tensile to compressive during the growth of the Ge/GeO2 core/shell nanoparticles, and this transformation is closely related to the Young's modulus of the material surrounding the nanoparticles.

  19. Examining System-Wide Impacts of Solar PV Control Systems with a Power Hardware-in-the-Loop Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Tess L.; Fuller, Jason C.; Schneider, Kevin P.

    2014-10-11

    High penetration levels of distributed solar PV power generation can lead to adverse power quality impacts such as excessive voltage rise, voltage flicker, and reactive power values that result in unacceptable voltage levels. Advanced inverter control schemes have been proposed that have the potential to mitigate many power quality concerns. However, closed-loop control may lead to unintended behavior in deployed systems as complex interactions can occur between numerous operating devices. In order to enable the study of the performance of advanced control schemes in a detailed distribution system environment, a Hardware-in-the-Loop (HIL) platform has been developed. In the HIL system, GridLAB-D, a distribution system simulation tool, runs in real-time mode at the Pacific Northwest National Laboratory (PNNL) and supplies power system parameters at a point of common coupling to hardware located at the National Renewable Energy Laboratory (NREL). Hardware inverters interact with grid and PV simulators emulating an operational distribution system and power output from the inverters is measured and sent to PNNL to update the real-time distribution system simulation. The platform is described and initial test cases are presented. The platform is used to study the system-wide impacts and the interactions of controls applied to inverters that are integrated into a simulation of the IEEE 8500-node test feeder, with inverters in either constant power factor control or active volt/VAR control. We demonstrate that this HIL platform is well-suited to the study of advanced inverter controls and their impacts on the power quality of a distribution feeder. Additionally, the results from HIL are used to validate GridLAB-D simulations of advanced inverter controls.
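
    The simulator-hardware exchange described above can be sketched as a software-only fixed-point loop, with the hardware inverter stood in by a simple volt/VAR droop function. The droop curve, feeder voltage sensitivity, and all numbers below are illustrative assumptions, not the GridLAB-D model or the NREL hardware interface.

```python
# Toy HIL-style exchange: the "simulator" sends a per-unit voltage, the
# "hardware" (here a volt/VAR droop stand-in) returns reactive power, and
# the feeder voltage is re-solved with a linear sensitivity dv/dq.
# All parameters are illustrative assumptions.

def volt_var(v_pu, deadband=0.02, slope=10.0, q_max=0.44):
    """Piecewise-linear volt/VAR droop: absorb VARs when voltage is high,
    inject when low, zero inside the deadband (per-unit quantities)."""
    err = v_pu - 1.0
    if abs(err) <= deadband:
        return 0.0
    q = -slope * (err - deadband * (1 if err > 0 else -1))
    return max(-q_max, min(q_max, q))

def run_loop(v_open=1.05, dv_dq=0.05, n_iter=50):
    """Fixed-point exchange: each round trip, the feeder voltage responds
    linearly to the reactive power returned by the 'hardware'."""
    v, q = v_open, 0.0
    for _ in range(n_iter):
        q = volt_var(v)            # measurement out, Q injection back
        v = v_open + dv_dq * q     # feeder re-solved with new injection
    return v, q

v_final, q_final = run_loop()
# High open-circuit voltage -> inverter absorbs VARs -> voltage pulled down
```

    With these numbers the exchange contracts to about 1.04 pu with roughly 0.2 pu of VARs absorbed; unstable oscillation of such closed loops is exactly the kind of unintended interaction the platform is built to expose.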

  20. Assessing the detail needed to capture rainfall-runoff dynamics with physics-based hydrologic response simulation

    USGS Publications Warehouse

    Mirus, B.B.; Ebel, B.A.; Heppner, C.S.; Loague, K.

    2011-01-01

    Concept development simulation with distributed, physics-based models provides a quantitative approach for investigating runoff generation processes across environmental conditions. Disparities within data sets employed to design and parameterize boundary value problems used in heuristic simulation inevitably introduce various levels of bias. The objective was to evaluate the impact of boundary value problem complexity on process representation for different runoff generation mechanisms. The comprehensive physics-based hydrologic response model InHM has been employed to generate base case simulations for four well-characterized catchments. The C3 and CB catchments are located within steep, forested environments dominated by subsurface stormflow; the TW and R5 catchments are located in gently sloping rangeland environments dominated by Dunne and Horton overland flows. Observational details are well captured within all four of the base case simulations, but the characterization of soil depth, permeability, rainfall intensity, and evapotranspiration differs for each. These differences are investigated through the conversion of each base case into a reduced case scenario, all sharing the same level of complexity. Evaluation of how individual boundary value problem characteristics impact simulated runoff generation processes is facilitated by quantitative analysis of integrated and distributed responses at high spatial and temporal resolution. Generally, the base case reduction causes moderate changes in discharge and runoff patterns, with the dominant process remaining unchanged. Moderate differences between the base and reduced cases highlight the importance of detailed field observations for parameterizing and evaluating physics-based models. 
Overall, similarities between the base and reduced cases indicate that the simpler boundary value problems may be useful for concept development simulation to investigate fundamental controls on the spectrum of runoff generation mechanisms. Copyright 2011 by the American Geophysical Union.

  1. Test and Analysis Capabilities of the Space Environment Effects Team at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Finckenor, M. M.; Edwards, D. L.; Vaughn, J. A.; Schneider, T. A.; Hovater, M. A.; Hoppe, D. T.

    2002-01-01

    Marshall Space Flight Center has developed world-class space environmental effects testing facilities to simulate the space environment. The combined environmental effects test system exposes temperature-controlled samples to simultaneous protons, high- and low-energy electrons, vacuum ultraviolet (VUV) radiation, and near-ultraviolet (NUV) radiation. Separate chambers for studying the effects of NUV and VUV at elevated temperatures are also available. The Atomic Oxygen Beam Facility exposes samples to atomic oxygen of 5 eV energy to simulate low-Earth orbit (LEO). The LEO space plasma simulators are used to study current collection to biased spacecraft surfaces, arcing from insulators and electrical conductivity of materials. Plasma propulsion techniques are analyzed using the Marshall magnetic mirror system. The micro light gas gun simulates micrometeoroid and space debris impacts. Candidate materials and hardware for spacecraft can be evaluated for durability in the space environment with a variety of analytical techniques. Mass, solar absorptance, infrared emittance, transmission, reflectance, bidirectional reflectance distribution function, and surface morphology characterization can be performed. The data from the space environmental effects testing facilities, combined with analytical results from flight experiments, enable the Environmental Effects Group to determine optimum materials for use on spacecraft.

  2. Building interactive virtual environments for simulated training in medicine using VRML and Java/JavaScript.

    PubMed

    Korocsec, D; Holobar, A; Divjak, M; Zazula, D

    2005-12-01

    Medicine is a difficult thing to learn. Experimenting with real patients should not be the only option; simulation deserves special attention here. Virtual Reality Modelling Language (VRML) as a tool for building virtual objects and scenes has a good record of educational applications in medicine, especially for static and animated visualisations of body parts and organs. However, to create computer simulations resembling situations in real environments, the required level of interactivity and dynamics is difficult to achieve. In the present paper we describe some approaches and techniques which we used to push the limits of the current VRML technology further toward dynamic 3D representation of virtual environments (VEs). Our demonstration is based on the implementation of a virtual baby model, whose vital signs can be controlled from an external Java application. The main contributions of this work are: (a) outline and evaluation of the three-level VRML/Java implementation of the dynamic virtual environment, (b) proposal for a modified VRML TimeSensor node, which greatly improves the overall control of system performance, and (c) architecture of the prototype distributed virtual environment for training in neonatal resuscitation comprising the interactive virtual newborn, active bedside monitor for vital signs and full 3D representation of the surgery room.

  3. A Bayesian Poisson-lognormal Model for Count Data for Multiple-Trait Multiple-Environment Genomic-Enabled Prediction.

    PubMed

    Montesinos-López, Osval A; Montesinos-López, Abelardo; Crossa, José; Toledo, Fernando H; Montesinos-López, José C; Singh, Pawan; Juliana, Philomin; Salinas-Ruiz, Josafhat

    2017-05-05

    When a plant scientist wishes to make genomic-enabled predictions of multiple traits measured in multiple individuals in multiple environments, the most common strategy for performing the analysis is to use a single trait at a time taking into account genotype × environment interaction (G × E), because there is a lack of comprehensive models that simultaneously take into account the correlated counting traits and G × E. For this reason, in this study we propose a multiple-trait and multiple-environment model for count data. The proposed model was developed under the Bayesian paradigm for which we developed a Markov Chain Monte Carlo (MCMC) with noninformative priors. This allows obtaining all required full conditional distributions of the parameters leading to an exact Gibbs sampler for the posterior distribution. Our model was tested with simulated data and a real data set. Results show that the proposed multi-trait, multi-environment model is an attractive alternative for modeling multiple count traits measured in multiple environments. Copyright © 2017 Montesinos-López et al.
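
    The count-data likelihood underlying the model above can be sketched by simulating from it: counts are Poisson conditional on a log-normally distributed rate whose log combines genotype and environment effects. The dimensions, effect variances, and the simple Knuth sampler below are illustrative assumptions, not the paper's dataset or its Gibbs sampler.

```python
# Sketch of Poisson-lognormal count generation: y ~ Poisson(exp(eta)),
# eta = mu + genotype effect + environment effect + noise. All sizes and
# variances are illustrative assumptions.
import math, random

random.seed(42)

def poisson(lam):
    """Knuth's algorithm; adequate for the moderate rates used here."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

mu = 1.0
n_geno, n_env = 100, 4
g = [random.gauss(0.0, 0.3) for _ in range(n_geno)]  # genotype effects
e = [random.gauss(0.0, 0.2) for _ in range(n_env)]   # environment effects

counts = [[poisson(math.exp(mu + g[i] + e[j] + random.gauss(0, 0.1)))
           for j in range(n_env)] for i in range(n_geno)]

grand_mean = sum(sum(row) for row in counts) / (n_geno * n_env)
# E[y] = exp(mu + (0.3**2 + 0.2**2 + 0.1**2)/2), roughly 2.9 here
```

    A fitting procedure like the paper's MCMC would work in the opposite direction, recovering mu and the effect variances from a counts table of this shape.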

  4. Real-time high speed generator system emulation with hardware-in-the-loop application

    NASA Astrophysics Data System (ADS)

    Stroupe, Nicholas

    The emerging emphasis on, and benefits of, distributed generation in smaller scale networks has prompted much attention and research in this field. The growth of research in distributed generation has also stimulated the development of simulation software and techniques. Testing and verification of these distributed power networks is a complex task, and real hardware testing is often desired. This is where simulation methods such as hardware-in-the-loop become important: an actual hardware unit can be interfaced with a software-simulated environment to verify proper functionality. In this thesis, such a simulation technique is taken one step further by using hardware-in-the-loop to emulate the output voltage of a generator system interfaced to a scaled hardware distributed power system for testing. The purpose of this thesis is to demonstrate a new method of testing a virtually simulated generation system supplying a scaled distributed power system in hardware. This task is performed using the Non-Linear Loads Test Bed developed by the Energy Conversion and Integration Thrust at the Center for Advanced Power Systems. This test bed consists of a series of real hardware converters consistent with the Navy's All-Electric-Ship proposed power system, used to perform various tests on controls and stability under the expected non-linear load environment of Navy weaponry. The test bed can also support other distributed power system research topics and serves as a flexible hardware unit for a variety of tests; here, it is used to perform and validate the newly developed method of generator system emulation. The dynamics of a high-speed permanent magnet generator directly coupled with a microturbine are virtually simulated on an FPGA in real time. The calculated output stator voltage then serves as a reference for a controllable three-phase inverter at the input of the test bed, which emulates and reproduces these voltages on real hardware. The output of the inverter is connected to the rest of the test bed, which can consist of a variety of distributed system topologies for many testing scenarios. The idea is that the distributed power system under test in hardware can integrate real generator system dynamics without physically involving an actual generator system. The benefits of successful generator system emulation are vast, enabling much more detailed system studies without the drawbacks of needing physical generator units; among these advantages are safety, reduced cost, and the ability to scale while still preserving the appropriate system dynamics. This thesis introduces the ideas behind generator emulation, explains the process and steps necessary to achieve it, and demonstrates real results with verification of numerical values in real time. The final goal is to show that this new approach is in fact attainable and can prove to be a highly useful tool in the simulation and verification of distributed power systems.

  5. A simulation-based analytic model of radio galaxies

    NASA Astrophysics Data System (ADS)

    Hardcastle, M. J.

    2018-04-01

    I derive and discuss a simple semi-analytical model of the evolution of powerful radio galaxies which is not based on assumptions of self-similar growth, but rather implements some insights about the dynamics and energetics of these systems derived from numerical simulations, and can be applied to arbitrary pressure/density profiles of the host environment. The model can qualitatively and quantitatively reproduce the source dynamics and synchrotron light curves derived from numerical modelling. Approximate corrections for radiative and adiabatic losses allow it to predict the evolution of radio spectral index and of inverse-Compton emission both for active and 'remnant' sources after the jet has turned off. Code to implement the model is publicly available. Using a standard model with a light relativistic (electron-positron) jet, subequipartition magnetic fields, and a range of realistic group/cluster environments, I simulate populations of sources and show that the model can reproduce the range of properties of powerful radio sources as well as observed trends in the relationship between jet power and radio luminosity, and predicts their dependence on redshift and environment. I show that the distribution of source lifetimes has a significant effect on both the source length distribution and the fraction of remnant sources expected in observations, and so can in principle be constrained by observations. The remnant fraction is expected to be low even at low redshift and low observing frequency due to the rapid luminosity evolution of remnants, and to tend rapidly to zero at high redshift due to inverse-Compton losses.

  6. Updating source term and atmospheric dispersion simulations for the dose reconstruction in Fukushima Daiichi Nuclear Power Station Accident

    NASA Astrophysics Data System (ADS)

    Nagai, Haruyasu; Terada, Hiroaki; Tsuduki, Katsunori; Katata, Genki; Ota, Masakazu; Furuno, Akiko; Akari, Shusaku

    2017-09-01

    In order to assess the radiological dose to the public resulting from the Fukushima Daiichi Nuclear Power Station (FDNPS) accident in Japan, especially for the early phase of the accident when no measured data are available for that purpose, the spatial and temporal distribution of radioactive materials in the environment are reconstructed by computer simulations. In this study, by refining the source term of radioactive materials discharged into the atmosphere and modifying the atmospheric transport, dispersion and deposition model (ATDM), the atmospheric dispersion simulation of radioactive materials is improved. Then, a database of spatiotemporal distribution of radioactive materials in the air and on the ground surface is developed from the output of the simulation. This database is used in other studies for the dose assessment by coupling with the behavioral pattern of evacuees from the FDNPS accident. By the improvement of the ATDM simulation to use a new meteorological model and sophisticated deposition scheme, the ATDM simulations reproduced well the 137Cs and 131I deposition patterns. For the better reproducibility of dispersion processes, further refinement of the source term was carried out by optimizing it to the improved ATDM simulation by using new monitoring data.

  7. Simulation of Earth-Moon-Mars Environments for the Assessment of Organ Doses

    NASA Astrophysics Data System (ADS)

    Kim, M. Y.; Schwadron, N. A.; Townsend, L.; Cucinotta, F. A.

    2010-12-01

    Space radiation environments for historically large solar particle events (SPE) and galactic cosmic rays (GCR) at solar minimum and solar maximum are simulated in order to characterize exposures of radio-sensitive organs for missions to low-Earth orbit (LEO), the Moon, and Mars. Primary and secondary particles for SPE and GCR are transported through the respective atmosphere of Earth or Mars, the space vehicle, and the astronaut's body tissues using the HZETRN/QMSFRG computer code. In LEO, exposures are reduced compared to deep space because particles are deflected by the Earth's magnetic field and absorbed by the solid body of the Earth. The geomagnetic transmission function as a function of altitude was applied to the charged-particle flux, and the shift of the organ exposures to higher velocities or lower stopping powers compared to those in deep space was analyzed. For transport through the Mars atmosphere, a vertical distribution of atmospheric thickness was calculated from the temperature and pressure data of Mars Global Surveyor, and a directional cosine distribution was implemented to describe the spherically distributed atmospheric distance along the slant path at each altitude. The resultant directional shielding by the Mars atmosphere at solar minimum and solar maximum was used for the particle flux simulation at various altitudes on the Martian surface. Finally, atmospheric shielding was coupled with vehicle and body shielding for organ dose estimates. We made predictions of radiation dose equivalents and evaluated acute symptoms at LEO, the Moon, and Mars at solar minimum and solar maximum.
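
    The slant-path geometry can be illustrated with a basic spherical-shell chord calculation. This is only the geometric ingredient, assuming a single effective top-of-atmosphere height; the study integrates actual density profiles along each direction.

```python
import math

def slant_path_length(h_km, zenith_deg, r_mars_km=3389.5, h_top_km=100.0):
    """Geometric slant-path distance through a spherical atmosphere,
    from altitude h_km up to an assumed atmosphere top at h_top_km,
    for a ray at the given zenith angle (simple chord geometry)."""
    r = r_mars_km + h_km          # radius of the observation point
    rt = r_mars_km + h_top_km     # radius of the atmosphere top
    mu = math.cos(math.radians(zenith_deg))
    # solve |r_vec + s*u|^2 = rt^2 for the non-negative root s
    return math.sqrt(rt * rt - r * r * (1.0 - mu * mu)) - r * mu
```

    At zenith (mu = 1) this reduces to the vertical thickness h_top_km - h_km; at large zenith angles the spherical path grows more slowly than the plane-parallel 1/cos(zenith) scaling.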

  8. Air traffic control by distributed management in a MLS environment

    NASA Technical Reports Server (NTRS)

    Kreifeldt, J. G.; Parkin, L.; Hart, S.

    1977-01-01

    The microwave landing system (MLS) is a technically feasible means for increasing runway capacity since it could support curved approaches to a short final. The shorter the final segment of the approach, the wider the variety of speed mixes possible, so that theoretically capacity would ultimately be limited only by runway occupancy time. An experiment contrasted air traffic control in an MLS environment under a centralized form of management and under distributed management, which was supported by a traffic situation display in each of the 3 piloted simulators. Objective flight data, verbal communications and subjective responses were recorded on 18 trial runs lasting about 20 minutes each. The results were in general agreement with previous distributed management research. In particular, distributed management permitted a smaller spread of intercrossing times, and both pilots and controllers perceived distributed management as the more 'ideal' system in this task. It is concluded from this and previous research that distributed management offers a viable alternative to centralized management, with definite potential for dealing with dense traffic in a safe, orderly and expeditious manner.

  9. Central East Pacific Flight Routing

    NASA Technical Reports Server (NTRS)

    Grabbe, Shon; Sridhar, Banavar; Kopardekar, Parimal; Cheng, Nadia

    2006-01-01

    With the introduction of the Federal Aviation Administration's Advanced Technology and Oceanic Procedures system at the Oakland Oceanic Center, a level of automation now exists in the oceanic environment to potentially begin accommodating increased user-preferred routing requests. This paper presents the results of an initial feasibility assessment which examines the potential benefits of transitioning from the fixed Central East Pacific routes to user-preferred routes. As a surrogate for actual user-provided routing requests, a minimum-travel-time, wind-optimal dynamic programming algorithm was developed and utilized in this paper. After first describing the characteristics (e.g., origin airport, destination airport, vertical distribution and temporal distribution) of the westbound flights utilizing the Central East Pacific routes on Dec. 14-16 and 19-20, the results of both a flight-plan-based simulation and a wind-optimal-based simulation are presented. Whereas the lateral and longitudinal distributions of the aircraft trajectories in these two simulations varied dramatically, the number of simulated first-loss-of-separation events remained relatively constant. One area of concern uncovered in this initial analysis was a potential workload issue associated with the redistribution of traffic in the oceanic sectors due to the prevailing wind patterns.
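
    A minimum-travel-time, wind-optimal route can be sketched as a stage-wise dynamic program over a lattice of candidate tracks. This toy ignores great-circle geometry, crossing winds, and the slightly longer diagonal legs; the airspeed, leg length, and grid are assumed for illustration, not taken from the paper.

```python
def min_time_hours(tailwind, airspeed=450.0, leg_nm=60.0):
    """Stage-wise DP for a wind-optimal track. Columns of `tailwind` are
    longitude stages, rows are candidate latitudes; from a row the aircraft
    may step to the same or an adjacent row at the next stage. Leg time =
    distance / (true airspeed + tailwind component), with tailwind[i][k]
    the component (kt) on legs leaving stage i, row k."""
    n_stages, n_rows = len(tailwind), len(tailwind[0])
    inf = float("inf")
    cost = [0.0] * n_rows                  # free choice of entry row
    for i in range(n_stages):
        new = [inf] * n_rows
        for j in range(n_rows):            # row arrived at after stage i
            for k in (j - 1, j, j + 1):    # possible previous rows
                if 0 <= k < n_rows:
                    t = cost[k] + leg_nm / (airspeed + tailwind[i][k])
                    if t < new[j]:
                        new[j] = t
        cost = new
    return min(cost)                       # best total time, hours
```

    Recording the minimizing predecessor at each node would recover the optimal lateral track as well as its travel time.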

  10. Distributed Simulation as a modelling tool for the development of a simulation-based training programme for cardiovascular specialties.

    PubMed

    Kelay, Tanika; Chan, Kah Leong; Ako, Emmanuel; Yasin, Mohammad; Costopoulos, Charis; Gold, Matthew; Kneebone, Roger K; Malik, Iqbal S; Bello, Fernando

    2017-01-01

    Distributed Simulation is the concept of portable, high-fidelity immersive simulation. Here, it is used for the development of a simulation-based training programme for cardiovascular specialities. We present an evidence base for how accessible, portable and self-contained simulated environments can be effectively utilised for the modelling, development and testing of a complex training framework and assessment methodology. Iterative user feedback through mixed-methods evaluation techniques resulted in the implementation of the training programme. Four phases were involved in the development of our immersive simulation-based training programme: (1) initial conceptual stage for mapping structural criteria and parameters of the simulation training framework and scenario development (n = 16); (2) training facility design using Distributed Simulation; (3) test cases with clinicians (n = 8) and collaborative design, where evaluation and user feedback involved a mixed-methods approach featuring (a) quantitative surveys to evaluate the realism and perceived educational relevance of the simulation format and framework for training and (b) qualitative semi-structured interviews to capture detailed feedback, including changes and scope for development. Refinements were made iteratively to the simulation framework based on user feedback, resulting in (4) transition towards implementation of the simulation training framework, involving consistent quantitative evaluation techniques for clinicians (n = 62). For comparative purposes, clinicians' initial quantitative mean evaluation scores for realism of the simulation training framework, realism of the training facility and relevance for training (n = 8) are presented longitudinally, alongside feedback throughout the development stages from concept to delivery, including the implementation stage (n = 62). Initially, mean evaluation scores fluctuated from low to average, rising incrementally. 
This corresponded with the qualitative component, which augmented the quantitative findings; trainees' user feedback was used to perform iterative refinements to the simulation design and components (collaborative design), resulting in higher mean evaluation scores leading up to the implementation phase. Through application of innovative Distributed Simulation techniques, collaborative design, and consistent evaluation techniques from conceptual, development, and implementation stages, fully immersive simulation techniques for cardiovascular specialities are achievable and have the potential to be implemented more broadly.

  11. The Influence of Spatial Configuration of Residential Area and Vector Populations on Dengue Incidence Patterns in an Individual-Level Transmission Model.

    PubMed

    Kang, Jeon-Young; Aldstadt, Jared

    2017-07-15

    Dengue is a mosquito-borne infectious disease that is endemic in tropical and subtropical countries. Many individual-level simulation models have been developed to test hypotheses about dengue virus transmission. Often these efforts assume that human host and mosquito vector populations are randomly or uniformly distributed in the environment. Although the movement of mosquitoes is affected by the spatial configuration of buildings, and mosquito populations are highly clustered in key buildings, little research has focused on the influence of the local built environment in dengue transmission models. We developed an agent-based model of dengue transmission in a village setting to test the importance of using realistic environments in individual-level models of dengue transmission. The results from one-way ANOVA analysis of the simulations indicated that the differences between scenarios in terms of infection rates as well as serotype-specific dominance are statistically significant. Specifically, the infection rates in scenarios with a realistic environment are more variable than those with a synthetic spatial configuration. With respect to dengue serotype-specific cases, we found that a single dengue serotype is more often dominant in realistic environments than in synthetic environments. An agent-based approach allows a fine-scaled analysis of simulated dengue incidence patterns. The results provide a better understanding of the influence of spatial heterogeneity on dengue transmission at a local scale.
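
    The role of vector clustering can be illustrated with a deliberately minimal agent-based sketch: one human per house, mosquitoes either spread uniformly or concentrated in a few "key" houses, and a simple bite-and-transmit step each day. Every parameter here is a toy assumption, not a value from the published model.

```python
import random

def simulate(clustered, n_houses=100, n_mosq=300, n_days=60, seed=0):
    """Toy village-scale transmission step: mosquitoes bite the occupant
    of their current house; infection passes human->mosquito or
    mosquito->human with fixed probabilities. Returns the final number
    of infected humans."""
    rng = random.Random(seed)
    if clustered:
        keys = rng.sample(range(n_houses), 10)          # key buildings
        mosq_house = [rng.choice(keys) for _ in range(n_mosq)]
    else:
        mosq_house = [rng.randrange(n_houses) for _ in range(n_mosq)]
    human_inf = [False] * n_houses
    human_inf[rng.randrange(n_houses)] = True           # index case
    mosq_inf = [False] * n_mosq
    p_bite, p_transmit = 0.3, 0.5                       # toy probabilities
    for _ in range(n_days):
        for m in range(n_mosq):
            h = mosq_house[m]
            if rng.random() < p_bite:
                if human_inf[h] and rng.random() < p_transmit:
                    mosq_inf[m] = True
                elif mosq_inf[m] and rng.random() < p_transmit:
                    human_inf[h] = True
        for m in range(n_mosq):                         # occasional short move
            if rng.random() < 0.1:
                mosq_house[m] = (mosq_house[m] + rng.choice([-1, 1])) % n_houses
    return sum(human_inf)
```

    Running many seeds for clustered versus uniform placement reproduces the qualitative point of the abstract: outcomes depend strongly on where the index case sits relative to vector clusters, so clustered scenarios yield more variable infection counts.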

  12. Structural Heterogeneity and Quantitative FRET Efficiency Distributions of Polyprolines through a Hybrid Atomistic Simulation and Monte Carlo Approach

    PubMed Central

    Hoefling, Martin; Lima, Nicola; Haenni, Dominik; Seidel, Claus A. M.; Schuler, Benjamin; Grubmüller, Helmut

    2011-01-01

    Förster Resonance Energy Transfer (FRET) experiments probe molecular distances via distance dependent energy transfer from an excited donor dye to an acceptor dye. Single molecule experiments not only probe average distances, but also distance distributions or even fluctuations, and thus provide a powerful tool to study biomolecular structure and dynamics. However, the measured energy transfer efficiency depends not only on the distance between the dyes, but also on their mutual orientation, which is typically inaccessible to experiments. Thus, assumptions on the orientation distributions and averages are usually made, limiting the accuracy of the distance distributions extracted from FRET experiments. Here, we demonstrate that by combining single molecule FRET experiments with the mutual dye orientation statistics obtained from Molecular Dynamics (MD) simulations, improved estimates of distances and distributions are obtained. From the simulated time-dependent mutual orientations, FRET efficiencies are calculated and the full statistics of individual photon absorption, energy transfer, and photon emission events is obtained from subsequent Monte Carlo (MC) simulations of the FRET kinetics. All recorded emission events are collected to bursts from which efficiency distributions are calculated in close resemblance to the actual FRET experiment, taking shot noise fully into account. Using polyproline chains with attached Alexa 488 and Alexa 594 dyes as a test system, we demonstrate the feasibility of this approach by direct comparison to experimental data. We identified cis-isomers and different static local environments as sources of the experimentally observed heterogeneity. Reconstructions of distance distributions from experimental data at different levels of theory demonstrate how the respective underlying assumptions and approximations affect the obtained accuracy. 
Our results show that dye fluctuations obtained from MD simulations, combined with MC single photon kinetics, provide a versatile tool to improve the accuracy of distance distributions that can be extracted from measured single molecule FRET efficiencies. PMID:21629703
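
    The shot-noise component of the measured efficiency histograms can be sketched with a short Monte Carlo: given a fixed, orientation-averaged transfer efficiency, each detected photon is an acceptor photon with probability E, so a burst of N photons yields a binomial efficiency estimate. The Förster radius below is a typical literature value for this dye pair assumed for illustration; the paper's full kinetic MC additionally samples dye orientations from MD.

```python
import random

R0 = 5.4  # nm, assumed Forster radius for the Alexa 488/594 pair

def fret_efficiency(r_nm, r0_nm=R0):
    """Ideal orientation-averaged FRET efficiency at donor-acceptor
    distance r: E = 1 / (1 + (r/R0)^6)."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

def burst_efficiencies(r_nm, n_bursts=10000, photons_per_burst=50, seed=1):
    """Shot-noise-limited efficiency distribution: each burst's estimate
    is n_acceptor / N with n_acceptor drawn binomially at the true E."""
    rng = random.Random(seed)
    e_true = fret_efficiency(r_nm)
    out = []
    for _ in range(n_bursts):
        n_a = sum(rng.random() < e_true for _ in range(photons_per_burst))
        out.append(n_a / photons_per_burst)
    return out
```

    The width of the resulting histogram is the pure shot-noise contribution; extra broadening in the measured histograms is what the paper attributes to cis-isomers, local environments, and orientation dynamics.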

  13. Using Immersive Virtual Reality for Electrical Substation Training

    ERIC Educational Resources Information Center

    Tanaka, Eduardo H.; Paludo, Juliana A.; Cordeiro, Carlúcio S.; Domingues, Leonardo R.; Gadbem, Edgar V.; Euflausino, Adriana

    2015-01-01

    Usually, distribution electricians are called upon to solve technical problems found in electrical substations. In this project, we apply problem-based learning to a training program for electricians, with the help of a virtual reality environment that simulates a real substation. Using this virtual substation, users may safely practice maneuvers…

  14. A distributed analysis and visualization system for model and observational data

    NASA Technical Reports Server (NTRS)

    Wilhelmson, Robert B.

    1994-01-01

    Software was developed with NASA support to aid in the analysis and display of the massive amounts of data generated from satellites, observational field programs, and from model simulations. This software was developed in the context of the PATHFINDER (Probing ATmospHeric Flows in an Interactive and Distributed EnviRonment) Project. The overall aim of this project is to create a flexible, modular, and distributed environment for data handling, modeling simulations, data analysis, and visualization of atmospheric and fluid flows. Software completed with NASA support includes GEMPAK analysis, data handling, and display modules for which collaborators at NASA had primary responsibility, and prototype software modules for three-dimensional interactive and distributed control and display as well as data handling, for which NCSA was responsible. Overall process control was handled through a scientific and visualization application builder from Silicon Graphics known as the Iris Explorer. In addition, the GEMPAK-related work (GEMVIS) was also ported to the Advanced Visualization System (AVS) application builder. Many modules were developed to enhance those already available in Iris Explorer, including HDF file support, improved visualization and display, simple lattice math, and the handling of metadata through development of a new grid datatype. Complete source and runtime binaries along with on-line documentation are available via the World Wide Web at: http://redrock.ncsa.uiuc.edu/PATHFINDER/pathre12/top/top.html.

  15. On the interplay between cosmological shock waves and their environment

    NASA Astrophysics Data System (ADS)

    Martin-Alvarez, Sergio; Planelles, Susana; Quilis, Vicent

    2017-05-01

    Cosmological shock waves are tracers of the thermal history of the structures in the Universe. They play a crucial role in redistributing energy within the cosmic structures and are also amongst the main ingredients of galaxy and galaxy cluster formation. Understanding this important function requires a proper description of the interplay between shocks and the different environments where they can be found. In this paper, an Adaptive Mesh Refinement (AMR) Eulerian cosmological simulation is analysed by means of a shock-finding algorithm that generates shock wave maps. Based on the population of dark matter haloes and on the distribution of density contrast in the simulation, we classify the shocks into five different environments, ranging from galaxy clusters to voids. The shock distribution function and the shock power spectrum are studied for each of these environments. We find that shock waves in different environments undergo different formation and evolution processes and show different characteristics. We identify three distinct phases of formation, evolution and dissipation of these shock waves, and an intricate migration between distinct environments and scales. Shock waves initially form in external, low-density regions and are merged and amplified through the collapse of structures. Shock waves and cosmic structures follow a parallel evolution. Later on, shocks start to detach from them and dissipate. We also find that most of the power that shock waves dissipate is found at scales of k ~ 0.5 Mpc^{-1}, with a secondary peak at k ~ 8 Mpc^{-1}. The evolution of the shock power spectrum confirms that the evolution of shock waves is coupled to, and conditioned by, their environment.
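
    The power-spectrum step can be sketched for a uniform grid (AMR data would first be regridded to a fixed-resolution cube): an isotropically binned spectrum of a 3-D dissipated-power field, with the box size and binning assumed for illustration.

```python
import numpy as np

def dissipation_power_spectrum(field, box_mpc=100.0):
    """Isotropically binned power spectrum of a 3-D field on a uniform
    periodic grid. Returns (k bin edges in Mpc^-1, mean |FT|^2 per bin)."""
    n = field.shape[0]
    pk3 = np.abs(np.fft.fftn(field)) ** 2
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=box_mpc / n)   # wavenumbers
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    kmag = np.sqrt(kx ** 2 + ky ** 2 + kz ** 2)
    bins = np.linspace(0.0, kmag.max(), n // 2)
    which = np.digitize(kmag.ravel(), bins)              # shell index per mode
    power = np.bincount(which, weights=pk3.ravel(), minlength=len(bins) + 1)
    counts = np.bincount(which, minlength=len(bins) + 1)
    return bins, power[1:] / np.maximum(counts[1:], 1)   # shell averages
```

    Peaks in such a spectrum at k ~ 0.5 and k ~ 8 Mpc^{-1} would correspond to the characteristic large- and small-scale dissipation features reported above.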

  16. The framework for simulation of bioinspired security mechanisms against network infrastructure attacks.

    PubMed

    Shorov, Andrey; Kotenko, Igor

    2014-01-01

    The paper outlines a bioinspired approach named "network nervous system" and methods of simulation of infrastructure attacks and protection mechanisms based on this approach. The protection mechanisms based on this approach consist of distributed procedures of information collection and processing, which coordinate the activities of the main devices of a computer network, identify attacks, and determine necessary countermeasures. Attacks and protection mechanisms are specified as structural models using a set-theoretic approach. An environment for simulation of protection mechanisms based on the biological metaphor is considered; the experiments demonstrating the effectiveness of the protection mechanisms are described.

  17. Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    NASA Technical Reports Server (NTRS)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    This research is aimed at developing a new and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies with both new and proven simulation methodologies. The basic approach involves three major areas of research: (1) aerospace system and component representation using a hierarchical object-oriented component model which enables the use of multimodels and enforces component interoperability; (2) a collaborative software environment that streamlines the process of developing, sharing and integrating aerospace design and analysis models; and (3) development of a distributed infrastructure which enables Web-based exchange of models to simplify the collaborative design process, and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.

  18. Computational study of the heat transfer of an avian egg in a tray.

    PubMed

    Eren Ozcan, S; Andriessens, S; Berckmans, D

    2010-04-01

    The development of an embryo in an avian egg depends largely on its temperature. The embryo temperature is affected by its environment and the heat produced by the egg. In this paper, eggshell temperature and the heat transfer characteristics from one egg in a tray toward its environment are studied by means of computational fluid dynamics (CFD). Computational fluid dynamics simulations have the advantage of providing extensive 3-dimensional information on velocity and eggshell temperature distribution around an egg that otherwise is not possible to obtain by experiments. However, CFD results need to be validated against experimental data. The objectives were (1) to find out whether CFD can successfully simulate eggshell temperature from one egg in a tray by comparing to previously conducted experiments, (2) to visualize air flow and air temperature distribution around the egg in a detailed way, and (3) to perform sensitivity analysis on several variables affecting heat transfer. To this end, a CFD model was validated using 2 sets of temperature measurements yielding an effective model. From these simulations, it can be concluded that CFD can effectively be used to analyze heat transfer characteristics and eggshell temperature distribution around an egg. In addition, air flow and temperature distribution around the egg are visualized. It has been observed that temperature differences up to 2.6 degrees C are possible at high heat production (285 mW) and horizontal low flow rates (0.5 m/s). Sensitivity analysis indicates that average eggshell temperature is mainly affected by the inlet air velocity and temperature, flow direction, and the metabolic heat of the embryo and less by the thermal conductivity and emissivity of the egg and thermal emissivity of the tray.
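
    The order of magnitude of the reported shell-temperature differences can be checked against a lumped-capacitance hand calculation using the Ranz-Marshall convection correlation for a sphere. The egg diameter, air properties, and incubator temperature below are assumed values for illustration, not inputs of the CFD study.

```python
import math

def egg_shell_temp_c(q_w=0.285, v=0.5, d=0.044, t_air=37.0):
    """Steady-state mean eggshell temperature: metabolic heat q (W) is
    removed by forced convection over a sphere of diameter d (m) in air
    moving at v (m/s). Returns shell temperature in degrees C."""
    nu_air, k_air, pr = 1.6e-5, 0.026, 0.71          # air properties near 37 C
    re = v * d / nu_air                              # Reynolds number
    nu = 2.0 + 0.6 * math.sqrt(re) * pr ** (1.0 / 3.0)  # Ranz-Marshall Nusselt
    h = nu * k_air / d                               # convective coeff, W/m^2/K
    area = math.pi * d * d                           # sphere surface area
    return t_air + q_w / (h * area)
```

    At the paper's high heat production (285 mW) and low air speed (0.5 m/s) this lumped estimate gives a shell a few degrees above the air, consistent in magnitude with the up-to-2.6 degree differences reported by the full CFD.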

  19. Modelling urban rainfall-runoff responses using an experimental, two-tiered physical modelling environment

    NASA Astrophysics Data System (ADS)

    Green, Daniel; Pattison, Ian; Yu, Dapeng

    2016-04-01

    Surface water (pluvial) flooding occurs when rainwater from intense precipitation events is unable to infiltrate into the subsurface or drain via natural or artificial drainage channels. Surface water flooding poses a serious hazard to urban areas across the world, with the UK's perceived risk appearing to have increased in recent years due to surface water flood events seeming more severe and frequent. Surface water flood risk currently accounts for 1/3 of all UK flood risk, with approximately two million people living in urban areas at risk of a 1 in 200-year flood event. Research often focuses upon using numerical modelling techniques to understand the extent, depth and severity of actual or hypothetical flood scenarios. Although much research has been conducted using numerical modelling, field data available for model calibration and validation is limited due to the complexities associated with data collection in surface water flood conditions. Ultimately, the data which numerical models are based upon is often erroneous and inconclusive. Physical models offer a novel, alternative and innovative environment to collect data within, creating a controlled, closed system where independent variables can be altered independently to investigate cause and effect relationships. A physical modelling environment provides a suitable platform to investigate rainfall-runoff processes occurring within an urban catchment. Despite this, physical modelling approaches are seldom used in surface water flooding research. 
Scaled laboratory experiments using a 9 m2, two-tiered 1:100 physical model consisting of (i) a low-cost rainfall simulator component able to simulate consistent, uniformly distributed (>75% CUC) rainfall events of varying intensity, and (ii) a fully interchangeable, modular plot surface have been conducted to investigate and quantify the influence of a number of terrestrial and meteorological factors on overland flow and rainfall-runoff patterns within a modelled urban setting. Terrestrial factors investigated include altering the physical model's catchment slope (0°-20°), as well as simulating a number of spatially varied impermeability and building density/configuration scenarios. Additionally, the influence of different storm dynamics and intensities was investigated. Preliminary results demonstrate that rainfall-runoff responses in the physical modelling environment are highly sensitive to slight increases in catchment gradient and rainfall intensity, and that more densely distributed building layouts significantly increase peak flows recorded at the physical model outflow when compared to sparsely distributed building layouts under comparable simulated rainfall conditions.

  20. A Hybrid Demand Response Simulator Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-05-02

    A hybrid demand response simulator (HDRS) is developed to test different control algorithms for centralized and distributed demand response (DR) programs in a small distribution power grid. The HDRS is designed to model a wide variety of DR services such as peak shaving, load shifting, arbitrage, spinning reserves, load following, regulation, emergency load shedding, etc. The HDRS does not model the dynamic behaviors of the loads; rather, it simulates the load scheduling and dispatch process. The load models include TCAs (thermostatically controlled appliances: water heaters, air conditioners, refrigerators, freezers, etc.) and non-TCAs (lighting, washers, dishwashers, etc.). The ambient temperature changes, thermal resistance, capacitance, and the unit control logic can be modeled for TCA loads. The use patterns of the non-TCAs can be modeled by probability of use and probabilistic durations. Some communication network characteristics, such as delays and errors, can also be modeled. Most importantly, because the simulator is modular and greatly simplifies the thermal models for TCA loads, it can quickly and easily be used to test and validate different control algorithms in a simulated environment.
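
    The kind of simplified TCA thermal model described above can be sketched as a first-order equivalent-thermal-parameter model with a thermostat deadband; all parameter values here are illustrative assumptions, not the simulator's.

```python
import random

def simulate_water_heater(hours=24, dt_s=60, seed=0):
    """Minimal thermostatically controlled appliance (TCA) sketch:
    first-order thermal dynamics C dT/dt = (t_amb - T)/R + P*on,
    with hysteresis control around a setpoint. Returns the tank
    temperature trace (degrees C, one sample per time step)."""
    R = 120.0            # thermal resistance, degC/kW (illustrative)
    C = 0.4              # thermal capacitance, kWh/degC (illustrative)
    P = 4.5              # element rating, kW
    t_amb = 20.0         # ambient temperature, degC
    setpoint, deadband = 50.0, 2.0
    rng = random.Random(seed)
    T = rng.uniform(setpoint - deadband, setpoint + deadband)
    on = False
    temps = []
    dt_h = dt_s / 3600.0
    for _ in range(int(hours * 3600 / dt_s)):
        if T <= setpoint - deadband:       # thermostat with deadband
            on = True
        elif T >= setpoint + deadband:
            on = False
        T += ((t_amb - T) / R + (P if on else 0.0)) * dt_h / C
        temps.append(T)
    return temps
```

    A DR control algorithm would act on this model by shifting the setpoint or locking out the element, then observing the aggregate demand of many such units.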

  1. Modelling chemical abundance distributions for dwarf galaxies in the Local Group: the impact of turbulent metal diffusion

    NASA Astrophysics Data System (ADS)

    Escala, Ivanna; Wetzel, Andrew; Kirby, Evan N.; Hopkins, Philip F.; Ma, Xiangcheng; Wheeler, Coral; Kereš, Dušan; Faucher-Giguère, Claude-André; Quataert, Eliot

    2018-02-01

    We investigate stellar metallicity distribution functions (MDFs), including Fe and α-element abundances, in dwarf galaxies from the Feedback In Realistic Environments (FIRE) project. We examine both isolated dwarf galaxies and those that are satellites of a Milky Way-mass galaxy. In particular, we study the effects of including a sub-grid turbulent model for the diffusion of metals in gas. Simulations that include diffusion have narrower MDFs and abundance ratio distributions, because diffusion drives individual gas and star particles towards the average metallicity. This effect provides significantly better agreement with observed abundance distributions in dwarf galaxies in the Local Group, including small intrinsic scatter in [α/Fe] versus [Fe/H] of ≲0.1 dex. This small intrinsic scatter arises in our simulations because the interstellar medium in dwarf galaxies is well mixed at nearly all cosmic times, such that stars that form at a given time have similar abundances to ≲0.1 dex. Thus, most of the scatter in abundances at z = 0 arises from redshift evolution and not from instantaneous scatter in the ISM. We find similar MDF widths and intrinsic scatter for satellite and isolated dwarf galaxies, which suggests that environmental effects play a minor role compared with internal chemical evolution in our simulations. Overall, with the inclusion of metal diffusion, our simulations reproduce abundance distribution widths of observed low-mass galaxies, enabling detailed studies of chemical evolution in galaxy formation.

  2. Contribution of explosion and future collision fragments to the orbital debris environment

    NASA Technical Reports Server (NTRS)

    Su, S.-Y.; Kessler, D. J.

    1985-01-01

    The time evolution of the near-earth man-made orbital debris environment modeled by numerical simulation is presented in this paper. The model starts with a data base of orbital debris objects which are tracked by the NORAD ground radar system. The current untrackable small objects are assumed to result from explosions and are predicted from data collected from a ground explosion experiment. Future collisions between earth orbiting objects are handled by the Monte Carlo method to simulate the range of collision possibilities that may occur in the real world. The collision fragmentation process between debris objects is calculated using an empirical formula derived from a laboratory spacecraft impact experiment to obtain the number versus size distribution of the newly generated debris population. The evolution of the future space debris environment is compared with the natural meteoroid background for the relative spacecraft penetration hazard.
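
    The "number versus size distribution" step in such Monte Carlo debris models is often a power law in cumulative form, N(>d) ∝ d^-b. A hedged sketch of sampling fragment sizes by inverse transform, with an exponent and size cutoff that are illustrative, not the empirical formula derived in the paper:

```python
import random

def sample_fragment_sizes(n, d_min=0.001, b=2.5, seed=0):
    """Draw n fragment characteristic lengths (m) from a power-law
    cumulative distribution N(>d) proportional to d^-b, truncated
    below at d_min, via inverse-transform sampling."""
    rng = random.Random(seed)
    # u uniform in (0, 1]; inverting P(>d) = (d/d_min)^-b gives
    # d = d_min * u^(-1/b)
    return [d_min * (1.0 - rng.random()) ** (-1.0 / b) for _ in range(n)]
```

    Summing the cross-sections or masses of the sampled fragments then feeds the penetration-hazard comparison against the natural meteoroid background.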

  3. Counts-in-cylinders in the Sloan Digital Sky Survey with Comparisons to N-body Simulations

    NASA Astrophysics Data System (ADS)

    Berrier, Heather D.; Barton, Elizabeth J.; Berrier, Joel C.; Bullock, James S.; Zentner, Andrew R.; Wechsler, Risa H.

    2011-01-01

    Environmental statistics provide a necessary means of comparing the properties of galaxies in different environments, and a vital test of models of galaxy formation within the prevailing hierarchical cosmological model. We explore counts-in-cylinders, a common statistic defined as the number of companions of a particular galaxy found within a given projected radius and redshift interval. Galaxy distributions with the same two-point correlation functions do not necessarily have the same companion count distributions. We use this statistic to examine the environments of galaxies in the Sloan Digital Sky Survey Data Release 4 (SDSS DR4). We also make preliminary comparisons to four models for the spatial distributions of galaxies, based on N-body simulations and data from SDSS DR4, to study the utility of the counts-in-cylinders statistic. There is a very large scatter between the number of companions a galaxy has and the mass of its parent dark matter halo and the halo occupation, limiting the utility of this statistic for certain kinds of environmental studies. We also show that prevalent empirical models of galaxy clustering, which match observed two- and three-point clustering statistics well, fail to reproduce some aspects of the observed distribution of counts-in-cylinders on 1, 3, and 6 h^-1 Mpc scales. All models that we explore underpredict the fraction of galaxies with few or no companions in 3 and 6 h^-1 Mpc cylinders. Roughly 7% of galaxies in the real universe are significantly more isolated within a 6 h^-1 Mpc cylinder than the galaxies in any of the models we use. Simple phenomenological models that map galaxies to dark matter halos fail to reproduce high-order clustering statistics in low-density environments.
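
    The statistic itself is simple to state in code. A brute-force sketch, assuming positions have already been projected to comoving coordinates and the redshift interval is expressed as a line-of-sight velocity cut (the survey analysis additionally handles fibre collisions, edges, and flux limits):

```python
def counts_in_cylinders(xy, v_los, r_proj=1.0, dv_max=1000.0):
    """For each galaxy, count companions within projected separation
    r_proj (same units as xy, e.g. h^-1 Mpc) and line-of-sight velocity
    difference |dv| <= dv_max (km/s). O(n^2); a tree would be used for
    survey-sized catalogues."""
    n = len(xy)
    counts = []
    for i in range(n):
        c = 0
        for j in range(n):
            if i == j:
                continue
            dx = xy[i][0] - xy[j][0]
            dy = xy[i][1] - xy[j][1]
            if (dx * dx + dy * dy <= r_proj * r_proj
                    and abs(v_los[i] - v_los[j]) <= dv_max):
                c += 1
        counts.append(c)
    return counts
```

    The distribution of these per-galaxy counts, rather than any pair-averaged quantity, is what distinguishes models with identical two-point correlation functions.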

  4. Molecular Dynamics Studies of Liposomes as Carriers for Photosensitizing Drugs: Development, Validation, and Simulations with a Coarse-Grained Model.

    PubMed

    Jämbeck, Joakim P M; Eriksson, Emma S E; Laaksonen, Aatto; Lyubartsev, Alexander P; Eriksson, Leif A

    2014-01-14

    Liposomes are proposed as drug delivery systems and can in principle be designed so as to cohere with specific tissue types or local environments. However, little detail is known about the exact mechanisms for drug delivery and the distributions of drug molecules inside the lipid carrier. In the current work, a coarse-grained (CG) liposome model is developed, consisting of over 2500 lipids, with varying degrees of drug loading. For the drug molecule, we chose hypericin, a natural compound proposed for use in photodynamic therapy, for which a CG model was derived and benchmarked against corresponding atomistic membrane bilayer model simulations. Liposomes with 21-84 hypericin molecules were generated and subjected to 10 microsecond simulations. Distribution of the hypericins, their orientations within the lipid bilayer, and the potential of mean force for transferring a hypericin molecule from the interior aqueous "droplet" through the liposome bilayer are reported herein.

  5. Distributed communication and psychosocial performance in simulated space dwelling groups

    NASA Astrophysics Data System (ADS)

    Hienz, R. D.; Brady, J. V.; Hursh, S. R.; Ragusa, L. C.; Rouse, C. O.; Gasior, E. D.

    2005-05-01

    The present report describes the development and application of a distributed interactive multi-person simulation in a computer-generated planetary environment as an experimental test bed for modeling the human performance effects of variations in the types of communication modes available, and in the types of stress and incentive conditions underlying the completion of mission goals. The results demonstrated a high degree of interchangeability between communication modes (audio, text) when one mode was not available. Additionally, the addition of time pressure stress to complete tasks resulted in a reduction in performance effectiveness, and these performance reductions were ameliorated via the introduction of positive incentives contingent upon improved performances. The results obtained confirmed that cooperative and productive psychosocial interactions can be maintained between individually isolated and dispersed members of simulated spaceflight crews communicating and problem-solving effectively over extended time intervals without the benefit of one another's physical presence.

  6. Parallel algorithms for modeling flow in permeable media. Annual report, February 15, 1995 - February 14, 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    G.A. Pope; K. Sephernoori; D.C. McKinney

    1996-03-15

    This report describes the application of distributed-memory parallel programming techniques to a compositional simulator called UTCHEM. The University of Texas Chemical Flooding reservoir simulator (UTCHEM) is a general-purpose vectorized chemical flooding simulator that models the transport of chemical species in three-dimensional, multiphase flow through permeable media. The parallel version of UTCHEM addresses large-scale problems by reducing the time required to obtain a solution, while providing a flexible and portable programming environment. In this work, the original parallel version of UTCHEM was modified and ported to the CRAY T3D and CRAY T3E distributed-memory multiprocessor computers using CRAY-PVM as the interprocessor communication library. The data communication routines were also modified to make the original code portable across different computer architectures.

  7. An application of sedimentation simulation in Tahe oilfield

    NASA Astrophysics Data System (ADS)

    Tingting, He; Lei, Zhao; Xin, Tan; Dongxu, He

    2017-12-01

    The braided river delta develops in the Triassic low oil formation in block 9 of the Tahe oilfield, but its sedimentary evolution is unclear. Using sedimentation simulation technology, the sedimentation process and distribution of the braided river delta are studied based on geological parameters including sequence stratigraphic division, initial sedimentary environment, relative lake-level and accommodation change, source supply, and sediment transport pattern. The simulation results show that the error between simulated and actual strata thickness is small, and the single-well analysis of the simulation is highly consistent with the actual analysis, which demonstrates that the model is reliable. The study area records a braided river delta retrogradation process, which provides a favorable basis for fine reservoir description and prediction.

  8. Simulation of empty container logistic management at depot

    NASA Astrophysics Data System (ADS)

    Sze, San-Nah; Sek, Siaw-Ying Doreen; Chiew, Kang-Leng; Tiong, Wei-King

    2017-07-01

    This study focuses on the empty-container management problem in a deficit regional area, i.e., an area with more export than import activity, which therefore suffers a chronic shortage of empty containers. This environment challenges trading companies' decision making in distributing empty containers. A simulation model fitted to this environment is developed, and a simple heuristic algorithm with hard and soft constraints is proposed to plan the logistics of empty-container supply. A feasible route with minimum cost is then determined by applying the proposed heuristic, which comprises three main phases: data sorting, data assigning, and time-window updating.
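
    The three phases named in the abstract can be sketched as a toy assignment routine. The data layout, depot stocks, unit distance cost, and one-step handling time below are hypothetical stand-ins, not the paper's actual formulation:

```python
# Schematic three-phase heuristic for planning empty-container supply:
# data sorting, data assigning, and time-window updating.

def plan_supply(requests, depots):
    """requests: list of (location, qty, earliest, latest).
    depots: dict location -> available empty containers (mutated)."""
    # Phase 1: data sorting -- serve the tightest deadlines first.
    pending = sorted(requests, key=lambda r: r[3])
    plan, unmet, windows = [], [], {}
    for loc, qty, earliest, latest in pending:
        # Phase 2: data assigning -- pick a depot with enough stock,
        # preferring the same location (cost 0) over any other (cost 1).
        candidates = [d for d, stock in depots.items() if stock >= qty]
        if not candidates:
            unmet.append((loc, qty))
            continue
        depot = min(candidates, key=lambda d: 0 if d == loc else 1)
        # Phase 3: time-window updating -- dispatch no earlier than the
        # depot's next free slot, and only within the request's window.
        t = max(earliest, windows.get(depot, earliest))
        if t > latest:
            unmet.append((loc, qty))
            continue
        depots[depot] -= qty
        windows[depot] = t + 1          # hypothetical 1-step handling time
        plan.append((depot, loc, qty, t))
    return plan, unmet

plan, unmet = plan_supply(
    [("A", 2, 0, 5), ("B", 3, 1, 4), ("A", 4, 0, 2)],
    {"A": 5, "B": 3},
)
```

    Soft constraints (the deadline ordering, depot preference) shape the plan, while hard constraints (stock, time windows) push infeasible requests into `unmet`.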

  9. Telearch - Integrated visual simulation environment for collaborative virtual archaeology.

    NASA Astrophysics Data System (ADS)

    Kurillo, Gregorij; Forte, Maurizio

    Archaeologists collect vast amounts of digital data around the world; however, they lack tools for integrating these data and interacting collaboratively to support the reconstruction and interpretation process. The TeleArch software aims to integrate different data sources and provide real-time interaction tools for remote collaboration of geographically distributed scholars inside a shared virtual environment. The framework also includes audio, 2D, and 3D video streaming technology to facilitate the remote presence of users. In this paper, we present several experimental case studies to demonstrate the integration and interaction with 3D models and geographical information system (GIS) data in this collaborative environment.

  10. Local Chemical Ordering and Negative Thermal Expansion in PtNi Alloy Nanoparticles.

    PubMed

    Li, Qiang; Zhu, He; Zheng, Lirong; Fan, Longlong; Wang, Na; Rong, Yangchun; Ren, Yang; Chen, Jun; Deng, Jinxia; Xing, Xianran

    2017-12-13

    An atomic-level insight into local chemical ordering and lattice strain is particularly interesting for emerging bimetallic nanocatalysts such as PtNi alloys. Here, we report the atomic distribution, chemical environment, and lattice thermal evolution in a full-scale structural description of PtNi alloy nanoparticles (NPs). The differing segregation of the elements in the well-faceted PtNi nanoparticles is confirmed by extended X-ray absorption fine structure (EXAFS). An atomic pair distribution function (PDF) study evidences the coexistence of face-centered cubic and tetragonally ordered parts in the local environment of the PtNi nanoparticles. Further reverse Monte Carlo (RMC) simulation with the PDF data revealed segregation of Ni and Pt to the centers of the {111} and {001} facets, respectively. Layer-by-layer statistical analysis of the local atomic pairs up to 6 nm revealed the distribution of local tetragonal ordering on the surface. This local coordination environment facilitates the distribution of heteroatomic Pt-Ni pairs, which plays an important role in the negative thermal expansion of Pt41Ni59 NPs. The present study of PtNi alloy NPs, from local short-range coordination to long-range average lattice, provides a new perspective on tailoring physical properties in nanomaterials.

  11. Distributed Lag Models: Examining Associations between the Built Environment and Health

    PubMed Central

    Baek, Jonggyu; Sánchez, Brisa N.; Berrocal, Veronica J.; Sanchez-Vaznaugh, Emma V.

    2016-01-01

    Built environment factors constrain individual level behaviors and choices, and thus are receiving increasing attention to assess their influence on health. Traditional regression methods have been widely used to examine associations between built environment measures and health outcomes, where a fixed, pre-specified spatial scale (e.g., a 1-mile buffer) is used to construct environment measures. However, the spatial scale for these associations remains largely unknown, and misspecifying it introduces bias. We propose the use of distributed lag models (DLMs) to describe the association between built environment features and health as a function of distance from the locations of interest, circumventing a priori selection of a spatial scale. Based on simulation studies, we demonstrate that traditional regression models produce associations biased away from the null when there is spatial correlation among the built environment features, whereas inference based on DLMs is robust under a range of built environment scenarios. We use this innovative application of DLMs to examine the association between the availability of convenience stores near California public schools, which may affect children’s dietary choices both through direct access to junk food and exposure to advertisement, and children’s body mass index z-scores (BMIz). PMID:26414942
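
    The core DLM idea, replacing one fixed-buffer count with distance-ring counts whose coefficient varies smoothly with distance, can be sketched as follows. The linear lag basis theta(d) = a + b*d, the ring distances, and the synthetic noise-free data are illustrative assumptions, not the authors' specification:

```python
import random

# Schematic distributed-lag regression: count stores in concentric distance
# rings and let the coefficient vary smoothly with distance.

random.seed(7)
ring_d = [0.25, 0.75, 1.25, 1.75]            # ring midpoint distances (miles)
true_theta = [1.0 - 0.5 * d for d in ring_d]  # true (linear) lag curve

# Synthetic, noise-free data: outcome = 2 + sum_j theta(d_j) * count_ij
X = [[random.randint(0, 5) for _ in ring_d] for _ in range(40)]
y = [2.0 + sum(t * x for t, x in zip(true_theta, row)) for row in X]

# Design matrix for (intercept, a, b): with theta(d) = a + b*d, the DLM
# collapses the ring counts into two summaries, sum(x) and sum(d*x).
A = [[1.0, sum(row), sum(d * x for d, x in zip(ring_d, row))] for row in X]

def ols(A, y):
    """Solve the normal equations A'A beta = A'y by Gauss-Jordan elimination."""
    k = len(A[0])
    M = [[sum(A[i][p] * A[i][q] for i in range(len(A))) for q in range(k)]
         + [sum(A[i][p] * y[i] for i in range(len(A)))] for p in range(k)]
    for c in range(k):
        pivot = max(range(c, k), key=lambda r: abs(M[r][c]))
        M[c], M[pivot] = M[pivot], M[c]
        for r in range(k):
            if r != c and M[c][c] != 0:
                f = M[r][c] / M[c][c]
                M[r] = [mr - f * mc for mr, mc in zip(M[r], M[c])]
    return [M[i][k] / M[i][i] for i in range(k)]

beta0, a, b = ols(A, y)
theta_hat = [a + b * d for d in ring_d]       # estimated lag curve
```

    Because the synthetic lag curve is exactly linear, the fit recovers it; a real analysis would use a richer spline basis and noisy outcomes.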

  12. A Modular Framework for Modeling Hardware Elements in Distributed Engine Control Systems

    NASA Technical Reports Server (NTRS)

    Zinnecker, Alicia M.; Culley, Dennis E.; Aretskin-Hariton, Eliot D.

    2014-01-01

    Progress toward the implementation of distributed engine control in an aerospace application may be accelerated through the development of a hardware-in-the-loop (HIL) system for testing new control architectures and hardware outside of a physical test cell environment. One component required in an HIL simulation system is a high-fidelity model of the control platform: sensors, actuators, and the control law. The control system developed for the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k) provides a verifiable baseline for development of a model for simulating a distributed control architecture. This distributed controller model will contain enhanced hardware models, capturing the dynamics of the transducer and the effects of data processing, and a model of the controller network. A multilevel framework is presented that establishes three sets of interfaces in the control platform: communication with the engine (through sensors and actuators), communication between hardware and controller (over a network), and the physical connections within individual pieces of hardware. This introduces modularity at each level of the model, encouraging collaboration in the development and testing of various control schemes or hardware designs. At the hardware level, this modularity is leveraged through the creation of a Simulink(R) library containing blocks for constructing smart transducer models complying with the IEEE 1451 specification. These hardware models were incorporated in a distributed version of the baseline C-MAPSS40k controller and simulations were run to compare the performance of the two models. The overall tracking ability differed only due to quantization effects in the feedback measurements in the distributed controller. Additionally, it was also found that the added complexity of the smart transducer models did not prevent real-time operation of the distributed controller model, a requirement of an HIL system.
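
    The quantization effect the abstract attributes to the feedback measurements can be illustrated in miniature. The mid-tread quantizer and its resolution below are hypothetical, not the actual C-MAPSS40k sensor models:

```python
# Minimal sketch of feedback quantization in a smart-transducer path:
# a mid-tread uniform quantizer with a hypothetical step size.

def quantize(value, lsb):
    """Round a continuous sensor reading to the nearest quantizer step."""
    return round(value / lsb) * lsb

lsb = 0.05                       # hypothetical least-significant-bit size
readings = [0.123, 0.777, 1.001, 2.499]
quantized = [quantize(v, lsb) for v in readings]
errors = [abs(q - v) for q, v in zip(quantized, readings)]
# The feedback the controller sees differs from truth by at most lsb/2,
# which is why the distributed controller's tracking deviates slightly
# from the centralized baseline.
max_error = max(errors)
```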

  13. A Modular Framework for Modeling Hardware Elements in Distributed Engine Control Systems

    NASA Technical Reports Server (NTRS)

    Zinnecker, Alicia M.; Culley, Dennis E.; Aretskin-Hariton, Eliot D.

    2015-01-01

    Progress toward the implementation of distributed engine control in an aerospace application may be accelerated through the development of a hardware-in-the-loop (HIL) system for testing new control architectures and hardware outside of a physical test cell environment. One component required in an HIL simulation system is a high-fidelity model of the control platform: sensors, actuators, and the control law. The control system developed for the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k) provides a verifiable baseline for development of a model for simulating a distributed control architecture. This distributed controller model will contain enhanced hardware models, capturing the dynamics of the transducer and the effects of data processing, and a model of the controller network. A multilevel framework is presented that establishes three sets of interfaces in the control platform: communication with the engine (through sensors and actuators), communication between hardware and controller (over a network), and the physical connections within individual pieces of hardware. This introduces modularity at each level of the model, encouraging collaboration in the development and testing of various control schemes or hardware designs. At the hardware level, this modularity is leveraged through the creation of a Simulink(R) library containing blocks for constructing smart transducer models complying with the IEEE 1451 specification. These hardware models were incorporated in a distributed version of the baseline C-MAPSS40k controller and simulations were run to compare the performance of the two models. The overall tracking ability differed only due to quantization effects in the feedback measurements in the distributed controller. Additionally, it was also found that the added complexity of the smart transducer models did not prevent real-time operation of the distributed controller model, a requirement of an HIL system.

  14. A Modular Framework for Modeling Hardware Elements in Distributed Engine Control Systems

    NASA Technical Reports Server (NTRS)

    Zinnecker, Alicia Mae; Culley, Dennis E.; Aretskin-Hariton, Eliot D.

    2014-01-01

    Progress toward the implementation of distributed engine control in an aerospace application may be accelerated through the development of a hardware-in-the-loop (HIL) system for testing new control architectures and hardware outside of a physical test cell environment. One component required in an HIL simulation system is a high-fidelity model of the control platform: sensors, actuators, and the control law. The control system developed for the Commercial Modular Aero-Propulsion System Simulation 40k (40,000 pound force thrust) (C-MAPSS40k) provides a verifiable baseline for development of a model for simulating a distributed control architecture. This distributed controller model will contain enhanced hardware models, capturing the dynamics of the transducer and the effects of data processing, and a model of the controller network. A multilevel framework is presented that establishes three sets of interfaces in the control platform: communication with the engine (through sensors and actuators), communication between hardware and controller (over a network), and the physical connections within individual pieces of hardware. This introduces modularity at each level of the model, encouraging collaboration in the development and testing of various control schemes or hardware designs. At the hardware level, this modularity is leveraged through the creation of a Simulink (R) library containing blocks for constructing smart transducer models complying with the IEEE 1451 specification. These hardware models were incorporated in a distributed version of the baseline C-MAPSS40k controller and simulations were run to compare the performance of the two models. The overall tracking ability differed only due to quantization effects in the feedback measurements in the distributed controller. Additionally, it was found that the added complexity of the smart transducer models did not prevent real-time operation of the distributed controller model, a requirement of an HIL system.

  15. Modified chloride diffusion model for concrete under the coupling effect of mechanical load and chloride salt environment

    NASA Astrophysics Data System (ADS)

    Lei, Mingfeng; Lin, Dayong; Liu, Jianwen; Shi, Chenghua; Ma, Jianjun; Yang, Weichao; Yu, Xiaoniu

    2018-03-01

    To investigate lining concrete durability, this study derives a modified chloride diffusion model for concrete based on the odd continuation of boundary conditions and the Fourier transform. The linear stress distribution on a sectional structure is considered, and detailed procedures and methods are presented for model verification and parametric analysis. Simulation results show that the chloride diffusion model reflects the effect of the linear stress distribution of the sectional structure on chloride diffusivity with reliable accuracy. Along with the natural environmental characteristics of practical engineering structures, reference value ranges of the model parameters are provided. Furthermore, the chloride diffusion model is extended to consider the multi-factor coupling of linear stress distribution, chloride concentration, and diffusion time. Comparison between the model simulation and typical current research results shows that the presented model offers better predictions and greater universality.
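
    For orientation, the unmodified baseline that stress-coupled models of this kind extend is the classical error-function solution of Fick's second law; the parameter values below are illustrative, not taken from the paper:

```python
import math

# Baseline (unmodified) chloride-ingress profile from Fick's second law,
# C(x, t) = Cs * erfc(x / (2 * sqrt(D * t))).

def chloride(x_mm, t_years, Cs=0.6, D=25.0):
    """Cs: surface concentration (% binder), D: diffusivity (mm^2/year)."""
    return Cs * math.erfc(x_mm / (2.0 * math.sqrt(D * t_years)))

# Concentration at several cover depths after 10 years of exposure;
# a stress-modified model would replace D with a stress-dependent value.
profile = [chloride(x, 10.0) for x in (0, 10, 20, 40)]
```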

  16. Using special functions to model the propagation of airborne diseases

    NASA Astrophysics Data System (ADS)

    Bolaños, Daniela

    2014-06-01

    Some special functions of mathematical physics are used to obtain a mathematical model of the propagation of airborne diseases. In particular, we study the propagation of tuberculosis in closed rooms and model it using the error function and the Bessel function. In the model, infected individuals emit pathogens into the environment, and these infect other individuals who absorb them. The time evolution of the concentration of pathogens in the environment is computed in terms of error functions. The time evolution of the number of susceptible individuals is expressed by a differential equation that contains the error function and is solved numerically for different parametric simulations. The evolution in time of the number of infected individuals is plotted for each numerical simulation. The spatial distribution of the pathogen around the source of infection is represented by the Bessel function K0. The spatial and temporal distribution of the number of infected individuals is computed and plotted for several numerical simulations. All computations were made using computer algebra software, specifically Maple. The analytical results obtained are expected to inform the design of treatment rooms and ventilation systems that reduce the risk of tuberculosis transmission.
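
    The two special functions the abstract relies on can be evaluated with a short script. The concentration form and rate constants are illustrative guesses rather than the paper's model, and K0 is computed from its integral representation because the Python standard library lacks it:

```python
import math

# Temporal part: an erf-shaped rise of pathogen concentration toward a
# steady level q/k (emission rate q, removal rate k -- illustrative form).
def concentration(t, q=1.0, k=0.5):
    return (q / k) * math.erf(math.sqrt(k * t))

# Spatial part: modified Bessel function K0 via its integral representation
# K0(x) = integral_0^inf exp(-x * cosh t) dt, by the trapezoidal rule.
def k0(x, steps=2000, t_max=12.0):
    h = t_max / steps
    s = 0.5 * (math.exp(-x) + math.exp(-x * math.cosh(t_max)))
    s += sum(math.exp(-x * math.cosh(i * h)) for i in range(1, steps))
    return s * h

rise = [concentration(t) for t in (0.0, 1.0, 4.0, 100.0)]   # toward q/k = 2
spatial = [k0(r) for r in (0.5, 1.0, 2.0)]  # pathogen decay with distance
```

    K0 diverges logarithmically at the source and decays rapidly with distance, which is why it is a natural shape for exposure around an infectious individual.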

  17. Evaluating the performance of distributed approaches for modal identification

    NASA Astrophysics Data System (ADS)

    Krishnan, Sriram S.; Sun, Zhuoxiong; Irfanoglu, Ayhan; Dyke, Shirley J.; Yan, Guirong

    2011-04-01

    In this paper, two modal identification approaches suited to a distributed computing environment are applied to a full-scale, complex structure. The natural excitation technique (NExT), used in conjunction with a condensed eigensystem realization algorithm (ERA), and the frequency domain decomposition with peak-picking (FDD-PP) are both applied to sensor data acquired from a 57.5-ft, 10-bay highway sign truss structure. Monte Carlo simulations are performed on a numerical example to investigate the statistical properties and noise sensitivity of the two distributed algorithms. Experimental results are provided and discussed.
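
    The peak-picking half of FDD-PP can be illustrated on synthetic data: estimate the power spectrum of a noisy response channel and take its peak as the natural frequency. The single channel, the naive DFT, and the omission of the cross-spectral SVD step are simplifications relative to the paper:

```python
import math, random

# One synthetic sensor channel: a 4 Hz structural mode plus measurement noise.
random.seed(1)
fs, n, f_nat = 64.0, 256, 4.0           # sample rate (Hz), length, mode (Hz)
x = [math.sin(2 * math.pi * f_nat * i / fs) + 0.3 * random.gauss(0, 1)
     for i in range(n)]

def psd(x, fs):
    """One-sided periodogram via a direct DFT (O(n^2), fine for n = 256)."""
    n = len(x)
    out = []
    for k in range(n // 2):
        re = sum(x[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
        im = sum(x[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
        out.append((k * fs / n, (re * re + im * im) / (fs * n)))
    return out

spectrum = psd(x, fs)
f_peak = max(spectrum, key=lambda p: p[1])[0]   # the peak-picking step
```

    In the full method, the SVD of the cross-spectral density matrix at `f_peak` would additionally yield the mode shape from the first singular vector.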

  18. SimBOX: a scalable architecture for aggregate distributed command and control of spaceport and service constellation

    NASA Astrophysics Data System (ADS)

    Prasad, Guru; Jayaram, Sanjay; Ward, Jami; Gupta, Pankaj

    2004-08-01

    In this paper, Aximetric proposes a decentralized command and control (C2) architecture, called SimBOX, for distributed control of a cluster of on-board health monitoring and software-enabled control systems; it reuses real-time infrastructure (RTI) functionality from current military real-time simulation architectures. The uniqueness of the approach is to provide a "plug and play" environment for system components that run at various data rates (Hz) and the ability to replicate or transfer C2 operations to various subsystems in a scalable manner. This is made possible by a communication bus called the "Distributed Shared Data Bus" and a distributed computing environment that scales to the control needs by providing a self-contained computing, data logging, and control function module that can be rapidly reconfigured to perform different functions. This kind of software-enabled control is needed to meet the demands of future aerospace command and control functions.

  19. Method for assessing the need for case-specific hemodynamics: application to the distribution of vascular permeability.

    PubMed

    Hazel, A L; Friedman, M H

    2000-01-01

    A common approach to understanding the role of hemodynamics in atherogenesis is to seek relationships between parameters of the hemodynamic environment and the distribution of tissue variables thought to be indicative of early disease. An important question arising in such investigations is whether the distributions of tissue variables are sufficiently similar among cases to permit them to be described by an ensemble average distribution. If they are, the hemodynamic environment need be determined only once, for a nominal representative geometry; if not, the hemodynamic environment must be obtained for each case. A method for classifying distributions from multiple cases to answer this question is proposed and applied to the distributions of the uptake of Evans blue dye-labeled albumin by the external iliac arteries of swine in response to a step increase in flow. It is found that the uptake patterns in the proximal segment of the arteries, between the aortic trifurcation and the ostium of the circumflex iliac artery, show considerable case-to-case variability. In the distal segment, extending to the deep femoral ostium, many cases show very little spatial variation, and the patterns in those that do are similar among the cases. Thus the response of the distal segment may be understood with fewer simulations, but the proximal segment has more information to offer.

  20. Local-order metric for condensed-phase environments

    NASA Astrophysics Data System (ADS)

    Martelli, Fausto; Ko, Hsin-Yu; Oǧuz, Erdal C.; Car, Roberto

    2018-02-01

    We introduce a local order metric (LOM) that measures the degree of order in the neighborhood of an atomic or molecular site in a condensed medium. The LOM maximizes the overlap between the spatial distribution of sites belonging to that neighborhood and the corresponding distribution in a suitable reference system. The LOM takes a value tending to zero for completely disordered environments and tending to one for environments that perfectly match the reference. The site-averaged LOM and its standard deviation define two scalar order parameters, S and δS, that characterize with excellent resolution crystals, liquids, and amorphous materials. We show with molecular dynamics simulations that S, δS, and the LOM provide very insightful information in the study of structural transformations, such as those occurring when ice spontaneously nucleates from supercooled water or when a supercooled water sample becomes amorphous upon progressive cooling.
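
    A heavily simplified 2-D caricature can convey the "one = matches reference, near zero = disordered" behavior of such a metric. The Gaussian kernel width, the greedy pairing, and the omission of the rotational and permutational optimization are departures from the published LOM:

```python
import math, random

# Toy local-order score: overlap between a site's neighbor positions and a
# reference pattern, via a Gaussian kernel and greedy nearest-point pairing.

def toy_lom(neighbors, reference, sigma=0.2):
    score, used = 0.0, set()
    for rx, ry in reference:
        # Greedy nearest unused neighbor (stand-in for the full optimization
        # over rotations and permutations in the published metric).
        best = min((p for p in neighbors if p not in used),
                   key=lambda p: (p[0] - rx) ** 2 + (p[1] - ry) ** 2)
        used.add(best)
        d2 = (best[0] - rx) ** 2 + (best[1] - ry) ** 2
        score += math.exp(-d2 / (2 * sigma * sigma))
    return score / len(reference)

square = [(1, 0), (0, 1), (-1, 0), (0, -1)]       # reference: square shell
random.seed(3)
noisy = [(x + random.gauss(0, 0.05), y + random.gauss(0, 0.05))
         for x, y in square]                       # slightly perturbed shell
disordered = [(random.uniform(-1, 1), random.uniform(-1, 1))
              for _ in range(4)]                   # random neighborhood
ordered_score = toy_lom(noisy, square)
disordered_score = toy_lom(disordered, square)
```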

  1. Evolution of a Fluctuating Population in a Randomly Switching Environment.

    PubMed

    Wienand, Karl; Frey, Erwin; Mobilia, Mauro

    2017-10-13

    Environment plays a fundamental role in the competition for resources, and hence in the evolution of populations. Here, we study a well-mixed, finite population consisting of two strains competing for the limited resources provided by an environment that randomly switches between states of abundance and scarcity. Assuming that one strain grows slightly faster than the other, we consider two scenarios, one of pure resource competition and one in which one strain provides a public good, and investigate how environmental randomness (external noise) coupled to demographic (internal) noise determines the population's fixation properties and size distribution. By analytical means and simulations, we show that these coupled sources of noise can significantly enhance the fixation probability of the slower-growing species. We also show that the population size distribution can be unimodal, bimodal, or multimodal and undergoes noise-induced transitions between these regimes when the rate of switching matches the population's growth rate.
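
    A minimal caricature of such a model can be simulated directly; the switching rate, selection strength, capacities, and the Wright-Fisher-style update are illustrative simplifications, not the paper's exact dynamics. The point is simply that environmental switching plus demographic noise lets the slower strain fix with non-negligible probability:

```python
import random

# Two strains in an environment that randomly switches the carrying
# capacity between abundance (N = 100) and scarcity (N = 20).

def fixation_run(switch_rate=0.1, s=0.05, rng=random, max_gen=2000):
    """One population history; returns the final frequency of the slow strain."""
    N, f = 100, 0.5                      # start abundant, strains at 50:50
    for _ in range(max_gen):
        if rng.random() < switch_rate:
            N = 20 if N == 100 else 100  # environmental (external) noise
        # Wright-Fisher-style resampling: the faster strain's offspring are
        # weighted by (1 + s), so p is the slow strain's expected share.
        p = f / (f + (1 + s) * (1 - f))
        slow = sum(rng.random() < p for _ in range(N))  # demographic noise
        f = slow / N
        if f in (0.0, 1.0):
            break                        # one strain has fixed
    return f

random.seed(42)
runs = [fixation_run() for _ in range(200)]
p_fix_slow = sum(r == 1.0 for r in runs) / len(runs)
```

    Without noise the slower strain would always lose; here its fixation probability is clearly nonzero, though still below one half.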

  2. Evolution of a Fluctuating Population in a Randomly Switching Environment

    NASA Astrophysics Data System (ADS)

    Wienand, Karl; Frey, Erwin; Mobilia, Mauro

    2017-10-01

    Environment plays a fundamental role in the competition for resources, and hence in the evolution of populations. Here, we study a well-mixed, finite population consisting of two strains competing for the limited resources provided by an environment that randomly switches between states of abundance and scarcity. Assuming that one strain grows slightly faster than the other, we consider two scenarios—one of pure resource competition, and one in which one strain provides a public good—and investigate how environmental randomness (external noise) coupled to demographic (internal) noise determines the population's fixation properties and size distribution. By analytical means and simulations, we show that these coupled sources of noise can significantly enhance the fixation probability of the slower-growing species. We also show that the population size distribution can be unimodal, bimodal, or multimodal and undergoes noise-induced transitions between these regimes when the rate of switching matches the population's growth rate.

  3. Collaborative environments for capability-based planning

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2005-05-01

    Distributed collaboration is an emerging technology for the 21st century that will significantly change how business is conducted in the defense and commercial sectors. Collaboration involves two or more geographically dispersed entities working together to create a "product" by sharing and exchanging data, information, and knowledge. A product is defined broadly to include, for example, writing a report, creating software, designing hardware, or implementing robust systems engineering and capability planning processes in an organization. Collaborative environments provide the framework and integrate models, simulations, domain-specific tools, and virtual test beds to facilitate collaboration among the multiple disciplines needed in the enterprise. The Air Force Research Laboratory (AFRL) is conducting a leading-edge program in developing distributed collaborative technologies targeted to the Air Force's implementation of systems engineering for simulation-aided acquisition and capability-based planning. The research focuses on the open systems agent-based framework, product and process modeling, structural architecture, and the integration technologies, the glue that integrates the software components. In the past four years, two live assessment events have been conducted to demonstrate the technology in support of research for the Air Force Agile Acquisition initiatives. The AFRL Collaborative Environment concept will foster a major cultural change in how the acquisition, training, and operational communities conduct business.

  4. Interfacing Space Communications and Navigation Network Simulation with Distributed System Integration Laboratories (DSIL)

    NASA Technical Reports Server (NTRS)

    Jennings, Esther H.; Nguyen, Sam P.; Wang, Shin-Ywan; Woo, Simon S.

    2008-01-01

    NASA's planned Lunar missions will involve multiple NASA centers, each with a specific role and specialization. In this vision, the Constellation Program (CxP) Distributed System Integration Laboratories (DSIL) architecture consists of multiple System Integration Labs (SILs), with simulators, emulators, test labs, and control centers interacting with each other over a broadband network to perform test and verification for mission scenarios. To support the end-to-end simulation and emulation effort of NASA's exploration initiatives, different NASA centers are interconnected to participate in distributed simulations. Currently, DSIL interconnects the following NASA centers: Johnson Space Center (JSC), Kennedy Space Center (KSC), Marshall Space Flight Center (MSFC), and the Jet Propulsion Laboratory (JPL). Through interconnections and interactions among different NASA centers, critical resources and data can be shared, while independent simulations can be performed simultaneously at different NASA locations, effectively utilizing the simulation and emulation capabilities at each center. Furthermore, the development of DSIL can maximally leverage existing project simulation and testing plans. In this work, we describe the specific role and development activities at JPL for the Space Communications and Navigation Network (SCaN) simulator, which uses the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) tool to simulate communications effects among mission assets. Using MACHETE, different space network configurations among spacecraft and ground systems with various parameter sets can be simulated. Data necessary for the tracking, navigation, and guidance of spacecraft such as the Crew Exploration Vehicle (CEV), the Crew Launch Vehicle (CLV), and the Lunar Relay Satellite (LRS), together with orbit calculation data, are disseminated to different NASA centers and updated periodically using the High Level Architecture (HLA). In addition, the performance of DSIL under different traffic loads, with different mixes of data and priorities, is evaluated.

  5. RF Exposure Analysis for Multiple Wi-Fi Devices In Enclosed Environment

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Rhodes, Bryan A.; deSilva, B. Kanishka; Sham, Catherine C.; Keiser, James R.

    2013-01-01

    Wi-Fi devices operated inside a metallic enclosure have been investigated in recent years. A motivation for this study is to investigate wave propagation inside an enclosed environment such as an elevator, car, aircraft, or spacecraft. There are performance and safety concerns when RF transmitters are used in metallic enclosed environments. In this paper, the field distributions inside a confined room were investigated with multiple portable Wi-Fi devices. Computer simulations were performed using rigorous computational electromagnetics (CEM): the method of moments (MoM) was used to model the mutual coupling among antennas, and the geometrical theory of diffraction (GTD) was applied for the multiple reflections off the ground and walls. Predicting the field distribution inside such an environment is useful for the planning and deployment of wireless radio and sensor systems. Factors that affect the field strengths and distributions of radio waves in confined spaces were analyzed, and the results can be used to evaluate RF exposure safety in confined environments. By comparing the field distributions for various scenarios, it was observed that the Wi-Fi device count, spacing, and relative locations in the room are important factors in such environments. The RF Keep Out Zone (KOZ), where the electric field strengths exceed the permissible RF exposure limit, can be used to assess RF human exposure compliance. As shown in this study, it is possible to maximize or minimize the field intensity in a specific area by arranging the Wi-Fi devices as a function of relative location and spacing in a calculated manner.
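
    As a point of comparison for the enclosed-environment results, a free-space superposition estimate (no reflections, hence none of the MoM/GTD machinery) can be sketched; the transmit powers, antenna gains, and exposure limit are illustrative numbers:

```python
import math

# Free-space field from several transmitters, |E| = sqrt(30 * P * G) / r
# per device, combined by summing power densities (incoherent sum).

def field(devices, point):
    """devices: list of ((x, y), power_W, gain). Returns field in V/m."""
    e2 = 0.0
    for (x, y), p, g in devices:
        r = max(math.hypot(point[0] - x, point[1] - y), 0.01)  # clamp at source
        e2 += 30.0 * p * g / (r * r)
    return math.sqrt(e2)

devices = [((0, 0), 0.1, 1.6), ((2, 0), 0.1, 1.6), ((0, 2), 0.1, 1.6)]
limit = 61.0                               # V/m, illustrative exposure limit
# Keep-out-zone check on a coarse 0.1 m grid: where does the field exceed
# the limit? In free space only the immediate vicinity of each device does;
# reflections in a metallic enclosure would enlarge these regions.
hot = [(x / 10, y / 10) for x in range(-30, 31) for y in range(-30, 31)
       if field(devices, (x / 10, y / 10)) > limit]
```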

  6. Flash Floods Simulation Using a Physical based hydrological Model at the Eastern Nile Basin: Case studies; Wadi Assiut, Egypt and Wadi Gumara, Lake Tana, Ethiopia.

    NASA Astrophysics Data System (ADS)

    Saber, M.; Sefelnasr, A.; Yilmaz, K. K.

    2015-12-01

    A flash flood is a natural hydrological phenomenon that affects many regions of the world. Its behavior and effects differ from one region to another depending on several factors, such as the climatological, hydrological, and topographical conditions of the target regions. Wadi Assiut, Egypt (an arid environment) and the Gumara catchment, Lake Tana, Ethiopia (humid conditions) were selected for application. The main target of this work is to simulate flash floods at both catchments, considering the differences in flash flood behavior arising from the variability between them. To simulate the flash floods, remote sensing data and a physically based distributed hydrological model, Hydro-BEAM-WaS (Hydrological River Basin Environmental Assessment Model incorporating Wadi System), were integrated. Based on the simulation results for these regions, it was found that the time to reach the maximum peak is very short, and consequently the warning time is very short as well. The flash floods start from zero flow in the arid environment; in the humid environment, by contrast, they start from a base flow that varies with the simulated events. Distribution maps of flash floods showing the vulnerable parts of the selected areas have been developed, and some mitigation strategies relying on this study have been introduced. The proposed methodology can be applied effectively for flash flood forecasting in different climate regions, although the paucity of observational data remains a limitation.

  7. System-of-Systems Approach for Integrated Energy Systems Modeling and Simulation: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mittal, Saurabh; Ruth, Mark; Pratt, Annabelle

    Today's electricity grid is the most complex system ever built, and the future grid is likely to be even more complex because it will incorporate distributed energy resources (DERs) such as wind, solar, and various other sources of generation and energy storage. The complexity is further augmented by the possible evolution to new retail market structures that provide incentives to owners of DERs to support the grid. To understand and test new retail market structures and technologies such as DERs, demand-response equipment, and energy management systems while providing reliable electricity to all customers, an Integrated Energy System Model (IESM) is being developed at NREL. The IESM is composed of a power flow simulator (GridLAB-D), home energy management systems implemented using GAMS/Pyomo, a market layer, and hardware-in-the-loop simulation (testing appliances such as HVAC, dishwashers, etc.). The IESM is a system-of-systems (SoS) simulator wherein the constituent systems are brought together in a virtual testbed. We describe an SoS approach for developing a distributed simulation environment and elaborate on the methodology and the control mechanisms used in the co-simulation, illustrated by a case study.
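
The co-simulation pattern described (a power-flow layer, home energy management, and a market layer exchanging values each step) can be sketched as a minimal loop between toy federates. The price and load functions below are invented stand-ins, not GridLAB-D, GAMS/Pyomo, or the NREL IESM.

```python
def market_price(total_load_kw):
    """Toy market layer: price rises linearly with aggregate load."""
    return 0.10 + 0.01 * total_load_kw

def home_ems(price, base_load_kw=2.0):
    """Toy home energy management system: sheds load as price rises."""
    return max(0.5, base_load_kw - 5.0 * (price - 0.10))

def cosimulate(n_homes=10, steps=5):
    """Each step, homes react to the last published price and the
    market reprices based on the resulting aggregate load."""
    price, history = 0.10, []
    for _ in range(steps):
        load = n_homes * home_ems(price)
        price = market_price(load)
        history.append((load, price))
    return history

for load, price in cosimulate():
    print(f"load={load:.2f} kW  price=${price:.3f}/kWh")
```

Real SoS testbeds add time synchronization and hardware-in-the-loop endpoints around exactly this kind of exchange loop.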

  8. Parallel task processing of very large datasets

    NASA Astrophysics Data System (ADS)

    Romig, Phillip Richardson, III

    This research concerns the use of distributed computer technologies for the analysis and management of very large datasets. Improvements in sensor technology, an emphasis on global change research, and greater access to data warehouses all increase the number of non-traditional users of remotely sensed data. We present a framework for distributed solutions to the challenges of datasets which exceed the online storage capacity of individual workstations. This framework, called parallel task processing (PTP), incorporates both the task- and data-level parallelism exemplified by many image processing operations. An implementation based on the principles of PTP, called Tricky, is also presented. Additionally, we describe the challenges and practical issues in modeling the performance of parallel task processing with large datasets. We present a mechanism for estimating the running time of each unit of work within a system and an algorithm that uses these estimates to simulate the execution environment and produce estimated runtimes. Finally, we describe and discuss experimental results which validate the design. Specifically, the system (a) is able to perform computation on datasets which exceed the capacity of any one disk, (b) provides reduction of overall computation time as a result of the task distribution even with the additional cost of data transfer and management, and (c) in the simulation mode accurately predicts the performance of the real execution environment.
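
The runtime-estimation idea (estimate each work unit's cost, then simulate the execution environment to predict the total runtime) can be sketched with a greedy earliest-free-worker simulation. The cost model here, transfer time plus compute time per unit, is a hypothetical stand-in for whatever Tricky actually uses.

```python
import heapq

def estimate_unit_time(data_mb, flops, worker_mflops=100.0, bw_mb_s=10.0):
    """Hypothetical per-unit cost model: data transfer time + compute time."""
    return data_mb / bw_mb_s + flops / (worker_mflops * 1e6)

def simulate_makespan(unit_times, n_workers):
    """Greedy simulation: each unit goes to the worker that frees up first."""
    free_at = [0.0] * n_workers
    heapq.heapify(free_at)
    for t in unit_times:
        heapq.heappush(free_at, heapq.heappop(free_at) + t)
    return max(free_at)

units = [estimate_unit_time(50.0, 2e8) for _ in range(8)]  # 8 identical units
print(simulate_makespan(units, n_workers=2))
```

The same simulation run with measured rather than estimated unit times is what lets such a system validate its predictions against the real execution environment.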

  9. Performance of a Heterogeneous Grid Partitioner for N-body Applications

    NASA Technical Reports Server (NTRS)

    Harvey, Daniel J.; Das, Sajal K.; Biswas, Rupak

    2003-01-01

    An important characteristic of distributed grids is that they allow geographically separated multicomputers to be tied together in a transparent virtual environment to solve large-scale computational problems. However, many of these applications require effective runtime load balancing for the resulting solutions to be viable. Recently, we developed a latency tolerant partitioner, called MinEX, specifically for use in distributed grid environments. This paper compares the performance of MinEX to that of METIS, a popular multilevel family of partitioners, using simulated heterogeneous grid configurations. A solver for the classical N-body problem is implemented to provide a framework for the comparisons. Experimental results show that MinEX provides superior quality partitions while being competitive to METIS in speed of execution.

  10. Measuring sense of presence and user characteristics to predict effective training in an online simulated virtual environment.

    PubMed

    De Leo, Gianluca; Diggs, Leigh A; Radici, Elena; Mastaglio, Thomas W

    2014-02-01

    Virtual-reality solutions have successfully been used to train distributed teams. This study aimed to investigate the correlation between user characteristics and sense of presence in an online virtual-reality environment where distributed teams are trained. A greater sense of presence has the potential to make training in the virtual environment more effective, leading to the formation of teams that perform better in a real environment. Being able to identify, before starting online training, those user characteristics that are predictors of a greater sense of presence can lead to the selection of trainees who would benefit most from the online simulated training. This is an observational study with a retrospective postsurvey of participants' user characteristics and degree of sense of presence. Twenty-nine members from 3 Air Force National Guard Medical Service expeditionary medical support teams participated in an online virtual environment training exercise and completed the Independent Television Commission-Sense of Presence Inventory survey, which measures sense of presence and user characteristics. Nonparametric statistics were applied to determine the statistical significance of user characteristics to sense of presence. Comparing user characteristics to the 4 scales of the Independent Television Commission-Sense of Presence Inventory using Kendall τ test gave the following results: the user characteristics "how often you play video games" (τ(26)=-0.458, P<0.01) and "television/film production knowledge" (τ(27)=-0.516, P<0.01) were significantly related to negative effects. Negative effects refer to adverse physiologic reactions owing to the virtual environment experience such as dizziness, nausea, headache, and eyestrain. The user characteristic "knowledge of virtual reality" was significantly related to engagement (τ(26)=0.463, P<0.01) and negative effects (τ(26)=-0.404, P<0.05). 
Individuals who have knowledge about virtual environments and experience with gaming environments report a higher sense of presence, indicating that they will likely benefit more from online virtual training. Future research studies could include a larger population of expeditionary medical support teams, and the results obtained could be used to create a model that predicts the level of presence based on user characteristics. To maximize results and minimize costs, only those individuals who, based on their characteristics, are expected to have a higher sense of presence and fewer negative effects could be selected for online simulated virtual environment training.
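
The Kendall τ statistic used above to relate user characteristics to the presence scales can be computed from first principles. Below is a minimal τ-a implementation (no tie correction, unlike the τ-b typically reported in such studies) applied to invented ratings, not the study's data.

```python
from itertools import combinations

def kendall_tau_a(x, y):
    """Kendall tau-a: (concordant - discordant pairs) / total pairs."""
    concordant = discordant = 0
    for i, j in combinations(range(len(x)), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    n_pairs = len(x) * (len(x) - 1) // 2
    return (concordant - discordant) / n_pairs

# Invented example: gaming frequency vs. reported negative effects.
gaming = [0, 1, 2, 3, 4, 5]
negative_effects = [5, 4, 4, 2, 1, 0]
print(kendall_tau_a(gaming, negative_effects))  # strongly negative
```

A strongly negative τ here mirrors the study's finding that frequent gamers report fewer adverse physiological reactions in the virtual environment.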

  11. Tractor seating for operators with paraplegia.

    PubMed

    Wilhite, C S; Field, W E; Jaramillo, M

    2017-01-01

    This feasibility study explored the utility of using a pressure mapping instrument to examine the pressure under subjects sitting on a commonly used tractor seat and four other cushion interventions. The research model used a single-subject design with repeated measures during simulated tractor operation. In examining the graphical images and pressure mapping data available from the instrument, the contour tractor seat used in this study was not sufficient to redistribute pressure for people with paraplegia operating tractors, putting them at greater risk of acquiring a pressure ulcer. The use of pressure mapping equipment to study seated pressure within dynamic environments is achievable, and further studies need to be performed and replicated in simulated or in vivo environments. The data in this study suggest people with paraplegia operating agricultural equipment may not have acceptable pressure distribution using the manufacturer's installed seat and must rely on adding wheelchair cushions or other materials to the seat surface to create acceptable pressure distribution. However, doing so changes other aspects of the seating micro- or macro-climate that can also be problematic.

  12. The influence of the environment on the propagation of protostellar outflows

    NASA Astrophysics Data System (ADS)

    Moraghan, Anthony; Smith, Michael D.; Rosen, Alexander

    2008-06-01

    The properties of bipolar outflows depend on the structure in the environment as well as the nature of the jet. To help distinguish between the two, we investigate here the properties pertaining to the ambient medium. We execute axisymmetric hydrodynamic simulations, injecting continuous atomic jets into molecular media with density gradients (protostellar cores) and density discontinuities (thick swept-up sheets). We determine the distribution of outflowing mass with radial velocity (the mass spectrum) to quantify our approach and to compare to observationally determined values. We uncover a sequence from clump entrainment in the flanks to bow shock sweeping as the density profile steepens. We also find that the dense, highly supersonic outflows remain collimated but can become turbulent after passing through a shell. The mass spectra vary substantially in time, especially at radial speeds exceeding 15 km s⁻¹. The mass spectra also vary according to the conditions: both envelope-type density distributions and the passage through dense sheets generate considerably steeper mass spectra than a uniform medium. The simulations suggest that observed outflows penetrate highly non-uniform media.
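
The "mass spectrum" above, the distribution of outflowing mass with radial velocity, is commonly summarized by a power-law slope, m(v) ∝ v^(−γ). A sketch that fits γ in log-log space from synthetic (mass, velocity) samples; the data are invented and the fitting method is a plain least-squares line, not the paper's analysis pipeline:

```python
import math

def mass_spectrum_slope(masses, velocities):
    """Least-squares slope of log(mass) vs log(velocity);
    for m proportional to v**-gamma this returns -gamma."""
    lx = [math.log(v) for v in velocities]
    ly = [math.log(m) for m in masses]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    num = sum((x - mx) * (y - my) for x, y in zip(lx, ly))
    den = sum((x - mx) ** 2 for x in lx)
    return num / den

# Synthetic spectrum with gamma = 2 (points lie exactly on the law).
velocities = [1.0, 2.0, 4.0, 8.0, 16.0]
masses = [v ** -2.0 for v in velocities]
print(mass_spectrum_slope(masses, velocities))
```

Steeper (more negative) fitted slopes for envelope-type density profiles and swept-up sheets are exactly the signature the simulations report.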

  13. Graph Partitioning for Parallel Applications in Heterogeneous Grid Environments

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Kumar, Shailendra; Das, Sajal K.; Biegel, Bryan (Technical Monitor)

    2002-01-01

    The problem of partitioning irregular graphs and meshes for parallel computations on homogeneous systems has been extensively studied. However, these partitioning schemes fail when the target system architecture exhibits heterogeneity in resource characteristics. With the emergence of technologies such as the Grid, it is imperative to study the partitioning problem taking into consideration the differing capabilities of such distributed heterogeneous systems. In our model, the heterogeneous system consists of processors with varying processing power and an underlying non-uniform communication network. We present in this paper a novel multilevel partitioning scheme for irregular graphs and meshes, that takes into account issues pertinent to Grid computing environments. Our partitioning algorithm, called MiniMax, generates and maps partitions onto a heterogeneous system with the objective of minimizing the maximum execution time of the parallel distributed application. For experimental performance study, we have considered both a realistic mesh problem from NASA as well as synthetic workloads. Simulation results demonstrate that MiniMax generates high quality partitions for various classes of applications targeted for parallel execution in a distributed heterogeneous environment.
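
MiniMax's objective, mapping partitions onto processors of unequal speed so the maximum execution time is minimized, can be illustrated with a greedy longest-partition-first heuristic. The weights and speeds below are invented, and the real multilevel algorithm (which also accounts for the non-uniform communication network) is far more sophisticated.

```python
def map_partitions(weights, speeds):
    """Greedy heuristic: place each partition (heaviest first) on the
    processor whose finish time after accepting it is smallest."""
    finish = [0.0] * len(speeds)
    assignment = [None] * len(weights)
    for i in sorted(range(len(weights)), key=lambda i: -weights[i]):
        p = min(range(len(speeds)),
                key=lambda p: finish[p] + weights[i] / speeds[p])
        finish[p] += weights[i] / speeds[p]
        assignment[i] = p
    return assignment, max(finish)

# Four partitions mapped onto a fast (speed 2) and a slow (speed 1) processor.
assignment, makespan = map_partitions([8.0, 6.0, 4.0, 2.0], [2.0, 1.0])
print(assignment, makespan)
```

Note how the fast processor absorbs more total weight; minimizing the maximum finish time is what makes a mapping "balanced" on a heterogeneous Grid.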

  14. A Behavior-Based Strategy for Single and Multi-Robot Autonomous Exploration

    PubMed Central

    Cepeda, Jesus S.; Chaimowicz, Luiz; Soto, Rogelio; Gordillo, José L.; Alanís-Reyes, Edén A.; Carrillo-Arce, Luis C.

    2012-01-01

    In this paper, we consider the problem of autonomous exploration of unknown environments with single and multiple robots. This is a challenging task, with several potential applications. We propose a simple yet effective approach that combines a behavior-based navigation with an efficient data structure to store previously visited regions. This allows robots to safely navigate, disperse and efficiently explore the environment. A series of experiments performed using a realistic robotic simulator and a real testbed scenario demonstrate that our technique effectively distributes the robots over the environment and allows them to quickly accomplish their mission in large open spaces, narrow cluttered environments, dead-end corridors, as well as rooms with minimum exits.
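
The combination described above, behavior-based navigation plus an efficient data structure recording visited regions, can be sketched as a single agent on an occupancy grid that prefers unvisited neighbours. The grid, motion model, and tie-breaking are invented, and the paper's multi-robot dispersion behaviors are omitted.

```python
import random

def explore(width, height, steps, seed=0):
    """Greedy exploratory walk: move to a random unvisited 4-neighbour
    when one exists, otherwise to any in-bounds neighbour."""
    rng = random.Random(seed)
    visited = {(0, 0)}        # the data structure of visited regions
    x, y = 0, 0
    for _ in range(steps):
        nbrs = [(x + dx, y + dy)
                for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= x + dx < width and 0 <= y + dy < height]
        fresh = [n for n in nbrs if n not in visited]
        x, y = rng.choice(fresh if fresh else nbrs)
        visited.add((x, y))
    return visited

cells = explore(5, 5, steps=60)
print(f"covered {len(cells)} of 25 cells")
```

Even this minimal preference for unvisited cells illustrates why remembering visited regions lets robots disperse and cover an environment faster than a pure random walk.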

  15. Impact of climate change on runoff pollution in urban environments

    NASA Astrophysics Data System (ADS)

    Coutu, S.; Kramer, S.; Barry, D. A.; Roudier, P.

    2012-12-01

    Runoff from urban environments is generally contaminated. These contaminants mostly originate from road traffic and building envelopes. Facade envelopes generate lead, zinc and even biocides, which are used for facade protection. Road traffic produces particles from tires and brakes. The transport of these pollutants to the environment is controlled by rainfall. The interval, duration and intensity of rainfall events are important as the dynamics of the pollutants are often modeled with non-linear buildup/washoff functions. Buildup occurs during dry weather when pollution accumulates, and is subsequently washed off at the time of the following rainfall, contaminating surface runoff. Climate predictions include modified rainfall distributions, with changes in both number and intensity of events, even if the expected annual rainfall varies little. Consequently, pollutant concentrations in urban runoff driven by buildup/washoff processes will be affected by these changes in rainfall distributions. We investigated to what extent modifications in future rainfall distributions will impact the concentrations of pollutants present in urban surface runoff. The study used the example of Lausanne, Switzerland (temperate climate zone). Three emission scenarios (time horizon 2090), multiple combinations of RCM/GCM and modifications in rain event frequency were used to simulate future rainfall distributions with various characteristics. Simulated rainfall events were used as inputs for four pairs of buildup/washoff models, in order to compare future pollution concentrations in surface runoff. In this way, uncertainty in model structure was also investigated. Future concentrations were estimated to be within ±40% of today's concentrations depending on the season and, importantly, on the choice of the RCM/GCM model. 
Overall, however, the dominant factor was the uncertainty inherent in buildup/washoff models, which dominated over the uncertainty in future rainfall distributions. Consequently, the choice of a proper buildup/washoff model, with calibrated site-specific coefficients, is a major factor in modeling future runoff concentrations from contaminated urban surfaces.
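
The non-linear buildup/washoff dynamics described above can be sketched with one common exponential pair. This is an illustrative choice only; the study compared four model pairs with calibrated, site-specific coefficients, and all values below are invented.

```python
import math

def buildup(days_dry, b_max=100.0, k_b=0.4):
    """Exponential buildup of surface pollutant load toward a maximum
    during dry weather (load units arbitrary)."""
    return b_max * (1.0 - math.exp(-k_b * days_dry))

def washoff(load, intensity_mm_h, duration_h, k_w=0.05):
    """Exponential washoff: the removed fraction grows with rain volume."""
    return load * (1.0 - math.exp(-k_w * intensity_mm_h * duration_h))

# Invented event series: (dry days before event, intensity mm/h, duration h).
events = [(2, 5.0, 1.0), (7, 20.0, 2.0), (1, 2.0, 0.5)]
load = 0.0
for dry, inten, dur in events:
    # Simplification: dry-weather accumulation adds to the residual load.
    load = min(100.0, load + buildup(dry))
    removed = washoff(load, inten, dur)
    load -= removed
    print(f"event: {removed:.1f} washed off, {load:.1f} remains")
```

Because the washed-off mass depends on both the preceding dry interval and the event's rain volume, shifting the rainfall distribution changes runoff concentrations even when annual rainfall is unchanged, which is the mechanism the study exploits.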

  16. A general-purpose development environment for intelligent computer-aided training systems

    NASA Technical Reports Server (NTRS)

    Savely, Robert T.

    1990-01-01

    Space station training will be a major task, requiring the creation of large numbers of simulation-based training systems for crew, flight controllers, and ground-based support personnel. Given the long duration of space station missions and the large number of activities supported by the space station, the extension of space shuttle training methods to space station training may prove to be impractical. The application of artificial intelligence technology to simulation training can provide the ability to deliver individualized training to large numbers of personnel in a distributed workstation environment. The principal objective of this project is the creation of a software development environment which can be used to build intelligent training systems for procedural tasks associated with the operation of the space station. Current NASA Johnson Space Center projects and joint projects with other NASA operational centers will result in specific training systems for existing space shuttle crew, ground support personnel, and flight controller tasks. Concurrently with the creation of these systems, a general-purpose development environment for intelligent computer-aided training systems will be built. Such an environment would permit the rapid production, delivery, and evolution of training systems for space station crew, flight controllers, and other support personnel. The widespread use of such systems will serve to preserve task and training expertise, support the training of many personnel in a distributed manner, and ensure the uniformity and verifiability of training experiences. As a result, significant reductions in training costs can be realized while safety and the probability of mission success can be enhanced.

  17. Simulating and stimulating performance: introducing distributed simulation to enhance musical learning and performance.

    PubMed

    Williamon, Aaron; Aufegger, Lisa; Eiholzer, Hubert

    2014-01-01

    Musicians typically rehearse far away from their audiences and in practice rooms that differ significantly from the concert venues in which they aspire to perform. Due to the high costs and inaccessibility of such venues, much current international music training lacks repeated exposure to realistic performance situations, with students learning all too late (or not at all) how to manage performance stress and the demands of their audiences. Virtual environments have been shown to be an effective training tool in the fields of medicine and sport, offering practitioners access to real-life performance scenarios but with lower risk of negative evaluation and outcomes. The aim of this research was to design and test the efficacy of simulated performance environments in which conditions of "real" performance could be recreated. Advanced violin students (n = 11) were recruited to perform in two simulations: a solo recital with a small virtual audience and an audition situation with three "expert" virtual judges. Each simulation contained back-stage and on-stage areas, life-sized interactive virtual observers, and pre- and post-performance protocols designed to match those found at leading international performance venues. Participants completed a questionnaire on their experiences of using the simulations. Results show that both simulated environments offered realistic experience of performance contexts and were rated particularly useful for developing performance skills. For a subset of 7 violinists, state anxiety and electrocardiographic data were collected during the simulated audition and an actual audition with real judges. Results display comparable levels of reported state anxiety and patterns of heart rate variability in both situations, suggesting that responses to the simulated audition closely approximate those of a real audition. 
The findings are discussed in relation to their implications, both generalizable and individual-specific, for performance training.

  19. Simulation study on discrete charge effects of SiNW biosensors according to bound target position using a 3D TCAD simulator.

    PubMed

    Chung, In-Young; Jang, Hyeri; Lee, Jieun; Moon, Hyunggeun; Seo, Sung Min; Kim, Dae Hwan

    2012-02-17

    We introduce a simulation method for the biosensor environment which treats the semiconductor and the electrolyte region together, using the well-established semiconductor 3D TCAD simulator tool. Using this simulation method, we conduct electrostatic simulations of SiNW biosensors with a more realistic target charge model where the target is described as a charged cube, randomly located across the nanowire surface, and analyze the Coulomb effect on the SiNW FET according to the position and distribution of the target charges. The simulation results show the considerable variation in the SiNW current according to the bound target positions, and also the dependence of conductance modulation on the polarity of target charges. This simulation method and the results can be utilized for analysis of the properties and behavior of the biosensor device, such as the sensing limit or the sensing resolution.

  20. Technology evaluation, assessment, modeling, and simulation: the TEAMS capability

    NASA Astrophysics Data System (ADS)

    Holland, Orgal T.; Stiegler, Robert L.

    1998-08-01

    The United States Marine Corps' Technology Evaluation, Assessment, Modeling and Simulation (TEAMS) capability, located at the Naval Surface Warfare Center in Dahlgren, Virginia, provides an environment for detailed test, evaluation, and assessment of live and simulated sensor and sensor-to-shooter systems for the joint warfare community. Frequent use of modeling and simulation allows for cost-effective testing, benchmarking, and evaluation of various levels of sensors and sensor-to-shooter engagements. Interconnectivity to live, instrumented equipment operating in real battle space environments and to remote modeling and simulation facilities participating in advanced distributed simulation (ADS) exercises is available to support a wide range of situational assessment requirements. TEAMS provides a valuable resource for a variety of users. Engineers, analysts, and other technology developers can use TEAMS to evaluate, assess, and analyze tactically relevant phenomenological data on tactical situations. Expeditionary warfare and USMC concept developers can use the facility to support and execute advanced warfighting experiments (AWE) to better assess operational maneuver from the sea (OMFTS) concepts, doctrines, and technology developments. Developers can use the facility to support sensor system hardware, software, and algorithm development as well as combat development, acquisition, and engineering processes. Test and evaluation specialists can use the facility to plan, assess, and augment their processes. This paper presents an overview of the TEAMS capability and focuses specifically on the technical challenges associated with the integration of live sensor hardware into a synthetic environment and how those challenges are being met. Existing sensors, recent experiments, and facility specifications are featured.

  1. Simulation and assessment of SO2 toxic environment after ignition of uncontrolled sour gas flow of well blowout in hills.

    PubMed

    Zhu, Yuan; Chen, Guo-ming

    2010-06-15

    To study the sulfur dioxide (SO2) toxic environment after the ignition of uncontrolled sour gas flow of well blowout, we propose an integrated model to simulate the accident scenario and assess the consequences of SO2 poisoning. The accident simulation is carried out based on computational fluid dynamics (CFD), which is composed of well blowout dynamics, combustion of sour gas, and products dispersion. Furthermore, detailed complex terrains are built and boundary layer flows are simulated according to Pasquill stability classes. Then, based on the estimated exposure dose derived from the toxic dose-response relationship, quantitative assessment is carried out by using equivalent emergency response planning guideline (ERPG) concentrations. In this case study, the contaminated areas are graded into three levels, and the areas, maximal influence distances, and main trajectories are predicted. We show that wind drives the contamination and its distribution to spread downwind, and terrains change the distribution shape through spatial aggregation and obstacles. As a result, the most dangerous regions are the downwind areas, the foot of the slopes, and depression areas such as valleys. These cause unfavorable influences on emergency response for accident control and public evacuation. In addition, the effectiveness of controlling the number of deaths by employing ignition is verified in theory. Based on the assessment results, we propose some suggestions for risk assessment, emergency response and accident decision making. Copyright 2010 Elsevier B.V. All rights reserved.
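
The three-level grading by equivalent ERPG concentration described above is, at its core, a threshold classification over a concentration field. A sketch with illustrative threshold values (placeholders, not necessarily the values used in the paper):

```python
def grade_area(conc_ppm, thresholds=(0.3, 3.0, 15.0)):
    """Return a severity level 0 (below all limits) to 3 (most severe)
    by comparing a concentration against three ascending thresholds
    (illustrative ERPG-1/2/3-style values)."""
    level = 0
    for t in thresholds:
        if conc_ppm >= t:
            level += 1
    return level

# Invented concentrations at five grid cells of a dispersion result.
field = [0.1, 0.5, 4.0, 20.0, 2.9]
levels = [grade_area(c) for c in field]
print(levels)  # one grade per grid cell
```

Applied to a full CFD concentration field, the outer contour of each level gives the graded areas and maximal influence distances the assessment reports.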

  2. Probability distributions of whisker-surface contact: quantifying elements of the rat vibrissotactile natural scene.

    PubMed

    Hobbs, Jennifer A; Towal, R Blythe; Hartmann, Mitra J Z

    2015-08-01

    Analysis of natural scene statistics has been a powerful approach for understanding neural coding in the auditory and visual systems. In the field of somatosensation, it has been more challenging to quantify the natural tactile scene, in part because somatosensory signals are so tightly linked to the animal's movements. The present work takes a step towards quantifying the natural tactile scene for the rat vibrissal system by simulating rat whisking motions to systematically investigate the probabilities of whisker-object contact in naturalistic environments. The simulations permit an exhaustive search through the complete space of possible contact patterns, thereby allowing for the characterization of the patterns that would most likely occur during long sequences of natural exploratory behavior. We specifically quantified the probabilities of 'concomitant contact', that is, given that a particular whisker makes contact with a surface during a whisk, what is the probability that each of the other whiskers will also make contact with the surface during that whisk? Probabilities of concomitant contact were quantified in simulations that assumed increasingly naturalistic conditions: first, the space of all possible head poses; second, the space of behaviorally preferred head poses as measured experimentally; and third, common head poses in environments such as cages and burrows. As environments became more naturalistic, the probability distributions shifted from exhibiting a 'row-wise' structure to a more diagonal structure. Results also reveal that the rat appears to use motor strategies (e.g. head pitches) that generate contact patterns that are particularly well suited to extract information in the presence of uncertainty. © 2015. Published by The Company of Biologists Ltd.
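
The "concomitant contact" quantity above, P(whisker j contacts | whisker i contacts) within a whisk, is a conditional probability over binary contact patterns. A sketch on an invented contact matrix (rows = whisks, columns = whiskers), not the paper's simulation output:

```python
def concomitant_probabilities(contacts):
    """contacts[w][k] == 1 if whisker k touched the surface on whisk w.
    Returns prob[i][j] = P(whisker j contacts | whisker i contacts)."""
    n = len(contacts[0])
    prob = [[0.0] * n for _ in range(n)]
    for i in range(n):
        whisks_i = [w for w in contacts if w[i]]  # whisks where i touched
        if not whisks_i:
            continue
        for j in range(n):
            prob[i][j] = sum(w[j] for w in whisks_i) / len(whisks_i)
    return prob

# Invented patterns for three whiskers over five whisks.
contacts = [
    [1, 1, 0],
    [1, 0, 0],
    [0, 1, 1],
    [1, 1, 0],
    [0, 0, 0],
]
p = concomitant_probabilities(contacts)
print(p[0][1])  # P(whisker 1 contacts | whisker 0 contacts)
```

Row-wise versus diagonal structure in such a matrix is precisely the shift the study observes as head-pose distributions become more naturalistic.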

  3. Functional and real-time requirements of a multisensor data fusion (MSDF) situation and threat assessment (STA) resource management (RM) system

    NASA Astrophysics Data System (ADS)

    Duquet, Jean Remi; Bergeron, Pierre; Blodgett, Dale E.; Couture, Jean; Macieszczak, Maciej; Mayrand, Michel; Chalmers, Bruce A.; Paradis, Stephane

    1998-03-01

    The Research and Development group at Lockheed Martin Canada, in collaboration with the Defence Research Establishment Valcartier, has undertaken a research project in order to capture and analyze the real-time and functional requirements of a next generation Command and Control System (CCS) for the Canadian Patrol Frigates, integrating Multi-Sensor Data Fusion (MSDF), Situation and Threat Assessment (STA) and Resource Management (RM). One important aspect of the project is to define how the use of Artificial Intelligence may optimize the performance of an integrated, real-time MSDF/STA/RM system. A closed-loop simulation environment is being developed to facilitate the evaluation of MSDF/STA/RM concepts, algorithms and architectures. This environment comprises (1) a scenario generator, (2) complex sensor, hardkill and softkill weapon models, (3) a real-time monitoring tool, and (4) a distributed Knowledge-Based System (KBS) shell. The latter is being completely redesigned and implemented in-house since no commercial KBS shell could adequately satisfy all the project requirements. The closed-loop capability of the simulation environment, together with its 'simulated real-time' capability, allows interaction between the MSDF/STA/RM system and the environment targets during the execution of a scenario. This capability is essential to measure the performance of many STA and RM functionalities. Some benchmark scenarios have been selected to demonstrate quantitatively the capabilities of the selected MSDF/STA/RM algorithms. The paper describes the simulation environment and discusses the MSDF/STA/RM functionalities currently implemented and their performance as an automatic CCS.
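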

  4. A Facility and Architecture for Autonomy Research

    NASA Technical Reports Server (NTRS)

    Pisanich, Greg; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Autonomy is a key enabling factor in the advancement of remote robotic exploration. There is currently a large gap between autonomy software at the research level and software that is ready for insertion into near-term space missions. The Mission Simulation Facility will bridge this gap by providing a simulation framework and suite of simulation tools to support research in autonomy for remote exploration. This system will allow developers of autonomy software to test their models in a high-fidelity simulation and evaluate their system's performance against a set of integrated, standardized simulations. The Mission Simulation ToolKit (MST) uses a distributed architecture with a communication layer that is built on top of the standardized High Level Architecture (HLA). This architecture enables the use of existing high fidelity models, allows mixing simulation components from various computing platforms, and enforces the use of a standardized high-level interface among components. The components needed to achieve a realistic simulation can be grouped into four categories: environment generation (terrain, environmental features), robotic platform behavior (robot dynamics), instrument models (camera/spectrometer/etc.), and data analysis. The MST will provide basic components in these areas but allows users to easily plug in any refined model by means of a communication protocol. Finally, a description file defines the robot and environment parameters for easy configuration and ensures that all the simulation models share the same information.

  5. Simulating Aerosol Size Distribution and Mass Concentration with Simultaneous Nucleation, Condensation/Coagulation, and Deposition with the GRAPES-CUACE

    NASA Astrophysics Data System (ADS)

    Zhou, Chunhong; Shen, Xiaojing; Liu, Zirui; Zhang, Yangmei; Xin, Jinyuan

    2018-04-01

    A coupled aerosol-cloud model is essential for investigating the formation of haze and fog and the interaction of aerosols with clouds and precipitation. One of the key tasks of such a model is to produce correct mass and number size distributions of aerosols. In this paper, a parameterization scheme for aerosol size distribution in initial emission, which took into account the measured mass and number size distributions of aerosols, was developed in the GRAPES-CUACE [Global/Regional Assimilation and PrEdiction System-China Meteorological Administration (CMA) Unified Atmospheric Chemistry Environment model]—an online chemical weather forecast system that contains microphysical processes and emission, transport, and chemical conversion of sectional multi-component aerosols. In addition, the competitive mechanism between nucleation and condensation for secondary aerosol formation was improved, and the dry deposition was modified to be consistent with the real deposition length. Based on these improvements, the GRAPES-CUACE simulations were verified against observational data during 1-31 January 2013, when a series of heavy regional haze-fog events occurred in eastern China. The results show that the aerosol number size distribution from the improved experiment was much closer to the observations, whereas in the old experiment the number concentration was higher in the nucleation mode and lower in the accumulation mode. Meanwhile, the errors in the aerosol number size distribution as diagnosed from its sectional mass size distribution were also reduced. Moreover, simulations of organic carbon, sulfate, and other aerosol components were improved, and both the overestimation and underestimation of PM2.5 concentration in eastern China were significantly reduced, increasing the correlation coefficient between simulated and observed PM2.5 by more than 70%.
    In remote areas where simulation results were previously poor, the correlation coefficient grew from 0.35 to 0.61, and the simulated mean mass concentration rose from 43% to 87.5% of the observed value. Thus, the simulation of particulate matter in these areas has been improved considerably.
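    The diagnostic step mentioned above, deriving a sectional number distribution from a sectional mass distribution, can be illustrated with a short sketch. It assumes spherical particles of a fixed density and uses each bin's geometric-mean diameter; the bin edges, masses, and density below are illustrative values, not the GRAPES-CUACE scheme.

```python
import math

def mass_to_number(bin_edges_um, masses_ugm3, rho_gcm3=1.7):
    """Diagnose a sectional number concentration (particles per m^3) from a
    sectional mass distribution (ug/m^3 per bin), assuming spherical
    particles of density rho and a geometric-mean diameter per bin."""
    numbers = []
    for (d_lo, d_hi), m in zip(zip(bin_edges_um, bin_edges_um[1:]), masses_ugm3):
        d_um = math.sqrt(d_lo * d_hi)                 # geometric-mean diameter
        d_cm = d_um * 1e-4                            # micrometres -> cm
        vol_cm3 = math.pi / 6.0 * d_cm ** 3           # single-particle volume
        mass_per_particle_ug = rho_gcm3 * vol_cm3 * 1e6   # g -> ug
        numbers.append(m / mass_per_particle_ug)
    return numbers

# Equal mass in a fine and a coarse bin implies vastly more fine particles:
n = mass_to_number([0.1, 1.0, 10.0], [1.0, 1.0])
```

A tenfold difference in mean diameter between two bins of equal mass gives a thousandfold difference in particle number, which is why small number-distribution errors are visible even when the mass distribution looks right.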

  6. A Multiagent Modeling Environment for Simulating Work Practice in Organizations

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; vanHoof, Ron

    2004-01-01

    In this paper we position Brahms as a tool for simulating organizational processes. Brahms is a modeling and simulation environment for analyzing human work practice, and for using such models to develop intelligent software agents to support the work practice in organizations. Brahms is the result of more than ten years of research at the Institute for Research on Learning (IRL), NYNEX Science & Technology (the former R&D institute of the Baby Bell telephone company in New York, now Verizon), and for the last six years at NASA Ames Research Center, in the Work Systems Design and Evaluation group, part of the Computational Sciences Division (Code IC). Brahms has been used on more than ten modeling and simulation research projects, and recently has been used as a distributed multiagent development environment for developing work practice support tools for human in-situ science exploration on planetary surfaces, in particular a human mission to Mars. Brahms was originally conceived of as a business process modeling and simulation tool that incorporates the social systems of work, by illuminating how formal process flow descriptions relate to people's actual located activities in the workplace. Our research started in the early nineties as a reaction to experiences with work process modeling and simulation. Although an effective tool for convincing management of the potential cost savings of newly designed work processes, the modeling and simulation environment was only able to describe work as a normative workflow. However, the social systems uncovered in the work practices studied by the design team played a significant role in how work actually got done, the actual lived work. Multi-tasking, informal assistance and circumstantial work interactions could not easily be represented in a tool with a strict workflow modeling paradigm.
In response, we began to develop a tool that would have the benefits of work process modeling and simulation, but be distinctively able to represent the relations of people, locations, systems, artifacts, communication and information content.

  7. Using simulation to interpret experimental data in terms of protein conformational ensembles.

    PubMed

    Allison, Jane R

    2017-04-01

    In their biological environment, proteins are dynamic molecules, necessitating an ensemble structural description. Molecular dynamics simulations and solution-state experiments provide complementary information in the form of atomically detailed coordinates and averaged values or distributions of structural properties or related quantities. Recently, increases in the temporal and spatial scale of conformational sampling, and comparison of the more diverse conformational ensembles thus generated, have revealed the importance of sampling rare events. Excitingly, new methods based on maximum entropy and Bayesian inference promise to provide a statistically sound mechanism for combining experimental data with molecular dynamics simulations. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. The Framework for Simulation of Bioinspired Security Mechanisms against Network Infrastructure Attacks

    PubMed Central

    Kotenko, Igor

    2014-01-01

    The paper outlines a bioinspired approach named “network nervous system” and methods of simulation of infrastructure attacks and protection mechanisms based on this approach. The protection mechanisms consist of distributed procedures of information collection and processing, which coordinate the activities of the main devices of a computer network, identify attacks, and determine necessary countermeasures. Attacks and protection mechanisms are specified as structural models using a set-theoretic approach. An environment for simulation of protection mechanisms based on the biological metaphor is considered; the experiments demonstrating the effectiveness of the protection mechanisms are described. PMID:25254229

  9. Planetary and Space Simulation Facilities PSI at DLR for Astrobiology

    NASA Astrophysics Data System (ADS)

    Rabbow, E.; Rettberg, P.; Panitz, C.; Reitz, G.

    2008-09-01

    Ground based experiments, conducted in the controlled planetary and space environment simulation facilities PSI at DLR, are used to investigate astrobiological questions and to complement the corresponding experiments in LEO, for example on free flying satellites or on space exposure platforms on the ISS. In-orbit exposure facilities can only accommodate a limited number of experiments for exposure to space parameters like high vacuum, intense radiation of galactic and solar origin and microgravity, sometimes also technically adapted to simulate extraterrestrial planetary conditions like those on Mars. Ground based experiments in carefully equipped and monitored simulation facilities allow the investigation of the effects of simulated single environmental parameters and selected combinations on a much wider variety of samples. In PSI at DLR, international science consortia performed astrobiological investigations and space experiment preparations, exposing organic compounds and a wide range of microorganisms, reaching from bacterial spores to complex microbial communities, lichens and even animals like tardigrades, to simulated planetary or space environment parameters in pursuit of exobiological questions on the resistance to extreme environments and the origin and distribution of life. The Planetary and Space Simulation Facilities PSI of the Institute of Aerospace Medicine at DLR in Köln, Germany, provide, individually or in selected combinations across 9 modular facilities of varying sizes, high vacuum of controlled residual composition; ionizing radiation from an X-ray tube; polychromatic UV radiation in the range of 170-400 nm, VIS and IR, or individual monochromatic UV wavelengths; and temperature regulation from -20°C to +80°C at the sample site. These facilities are presented together with selected experiments performed within them.

  10. Response of Human Prostate Cancer Cells to Mitoxantrone Treatment in Simulated Microgravity Environment

    NASA Astrophysics Data System (ADS)

    Zhang, Ye; Wu, Honglu

    2012-07-01

    Ye Zhang (NASA Johnson Space Center, Houston, TX, and Wyle Integrated Science and Engineering Group, Houston, TX), Christopher Edwards (Oregon State University, Corvallis, OR), and Honglu Wu (NASA Johnson Space Center, Houston, TX). This study explores the changes in growth of human prostate cancer cells (LNCaP) and their response to treatment with an antineoplastic agent, mitoxantrone, under simulated microgravity conditions. In comparison to static 1g, microgravity and simulated microgravity have been shown to alter global gene expression patterns and protein levels in various cultured cell models and animals. However, very little is known about the effect of altered gravity on the responses of cells to drug treatment, especially chemotherapy drugs. To test the hypothesis that zero gravity would result in altered regulation of cells in response to antineoplastic agents, we cultured LNCaP cells either in a High Aspect Ratio Vessel (HARV) bioreactor under rotating conditions, to model microgravity in space, or under static conditions as a control, and treated the cells with mitoxantrone. Cell growth, as well as expression of oxidative stress related genes, was analyzed after the drug treatment. Compared to static 1g controls, the cells cultured in the simulated microgravity environment did not show significant differences in cell viability, growth rate, or cell cycle distribution. However, after mitoxantrone treatment, a significant proportion of bioreactor-cultured cells became apoptotic or were arrested in G2. Several oxidative stress related genes also showed a higher expression level after mitoxantrone treatment. Our results indicate that simulated microgravity may alter the response of LNCaP cells to mitoxantrone treatment.
    Understanding the mechanisms by which cells respond to drugs differently in an altered gravity environment will be useful for the improvement of cancer treatment on the ground.

  11. Modeling the human body/seat system in a vibration environment.

    PubMed

    Rosen, Jacob; Arcan, Mircea

    2003-04-01

    The vibration environment is a common man-made artificial surrounding with which humans have a limited tolerance to cope due to their body dynamics. This research studied the dynamic characteristics of a seated human body/seat system in a vibration environment. The main result is a multi-degree-of-freedom lumped-parameter model that synthesizes two basic dynamics: (i) global human dynamics, the apparent mass phenomenon, including a systematic set of the model parameters for simulating various conditions like body posture, backrest, footrest, muscle tension, and vibration directions, and (ii) the local human dynamics, represented by the human pelvis/vibrating seat contact, using a cushioning interface. The model and its selected parameters successfully described the main effects of the apparent mass phenomenon compared to experimental data documented in the literature. The model provided an analytical tool for human body dynamics research. It also enabled a primary tool for seat and cushioning design. The model was further used to develop design guidelines for a composite cushion using the principle of quasi-uniform body/seat contact force distribution. In terms of evenly distributing the contact forces, the best result for the different materials and cushion geometries simulated in the current study was achieved using a two-layer shaped-geometry cushion built from three materials. Combining the geometry and the mechanical characteristics of a structure under large deformation into a lumped parameter model enables successful analysis of the human/seat interface system and provides practical results for body protection in a dynamic environment.

  12. Dependence of Snowmelt Simulations on Scaling of the Forcing Processes (Invited)

    NASA Astrophysics Data System (ADS)

    Winstral, A. H.; Marks, D. G.; Gurney, R. J.

    2009-12-01

    The spatial organization and scaling relationships of snow distribution in mountain environs are ultimately dependent on the controlling processes. These processes include interactions between weather, topography, vegetation, snow state, and seasonally-dependent radiation inputs. In large-scale snow modeling it is vital to know these dependencies to obtain accurate predictions while reducing computational costs. This study examined the scaling characteristics of the forcing processes and the dependence of distributed snowmelt simulations on their scaling. A base model simulation characterized these processes at 10 m resolution over a 14.0 km2 basin with an elevation range of 1474-2244 masl. Each of the major processes affecting snow accumulation and melt - precipitation, wind speed, solar radiation, thermal radiation, temperature, and vapor pressure - was independently degraded to 1 km resolution. Seasonal and event-specific results were analyzed. Results indicated that scale effects on melt vary by process and weather conditions. The dependence of melt simulations on the scaling of solar radiation fluxes also had a seasonal component. These process-based scaling characteristics should remain static through time as they are based on physical considerations. As such, these results not only provide guidance for current modeling efforts, but are also well suited to predicting how potential climate changes will affect the heterogeneity of mountain snow distributions.

  13. Influence of snowpack and melt energy heterogeneity on snow cover depletion and snowmelt runoff simulation in a cold mountain environment

    NASA Astrophysics Data System (ADS)

    DeBeer, Chris M.; Pomeroy, John W.

    2017-10-01

    The spatial heterogeneity of mountain snow cover and ablation is important in controlling patterns of snow cover depletion (SCD), meltwater production, and runoff, yet is not well-represented in most large-scale hydrological models and land surface schemes. Analyses were conducted in this study to examine the influence of various representations of snow cover and melt energy heterogeneity on both simulated SCD and stream discharge from a small alpine basin in the Canadian Rocky Mountains. Simulations were performed using the Cold Regions Hydrological Model (CRHM), where point-scale snowmelt computations were made using a snowpack energy balance formulation and applied to spatial frequency distributions of snow water equivalent (SWE) on individual slope-, aspect-, and landcover-based hydrological response units (HRUs) in the basin. Hydrological routines were added to represent the vertical and lateral transfers of water through the basin and channel system. From previous studies it is understood that the heterogeneity of late winter SWE is a primary control on patterns of SCD. The analyses here showed that spatial variation in applied melt energy, mainly due to differences in net radiation, has an important influence on SCD at multiple scales and on basin discharge, and cannot be neglected without serious error in the prediction of these variables. A single basin SWE distribution using the basin-wide mean SWE and coefficient of variation (CV; standard deviation/mean) was found to represent the fine-scale spatial heterogeneity of SWE sufficiently well. Simulations that accounted for differences in mean SWE among HRUs but neglected the sub-HRU heterogeneity of SWE were found to yield similar discharge results to simulations that included this heterogeneity, while SCD was poorly represented, even at the basin level.
Finally, applying point-scale snowmelt computations based on a single SWE depth for each HRU (thereby neglecting spatial differences in internal snowpack energetics over the distributions) was found to yield similar SCD and discharge results as simulations that resolved internal energy differences. Spatial/internal snowpack melt energy effects are more pronounced at times earlier in spring before the main period of snowmelt and SCD, as shown in previously published work. The paper discusses the importance of these findings as they apply to the warranted complexity of snowmelt process simulation in cold mountain environments, and shows how the end-of-winter SWE distribution represents an effective means of resolving snow cover heterogeneity at multiple scales for modelling, even in steep and complex terrain.
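    The role of the mean SWE and CV in depletion modelling can be illustrated with a closed-form sketch: if SWE over a unit is lognormally distributed with a given mean and CV, the snow-covered area fraction after a spatially uniform cumulative melt M is simply the probability that SWE exceeds M. This is a generic illustration of the idea, with invented numbers, not the CRHM implementation.

```python
import math

def scd_curve(mean_swe, cv, melt_depths):
    """Snow-cover depletion from a lognormal SWE distribution parameterized
    by its mean and coefficient of variation (CV = std/mean): returns the
    snow-covered area fraction SCA = P(SWE > M) for each cumulative melt M."""
    sigma2 = math.log(1.0 + cv * cv)          # lognormal shape parameter
    sigma = math.sqrt(sigma2)
    mu = math.log(mean_swe) - 0.5 * sigma2    # log-scale location parameter

    def sca(m):
        if m <= 0.0:
            return 1.0                        # no melt: fully snow-covered
        z = (math.log(m) - mu) / sigma
        # standard normal CDF via the error function
        return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    return [sca(m) for m in melt_depths]

# Example: mean SWE of 300 mm with CV = 0.4, sampled at a few melt depths.
curve = scd_curve(300.0, 0.4, [0.0, 150.0, 300.0, 600.0])
```

Because the lognormal median lies below the mean, less than half the area remains snow-covered once cumulative melt reaches the mean SWE, which is one reason the mean alone is not enough to predict depletion.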

  14. DEPEND - A design environment for prediction and evaluation of system dependability

    NASA Technical Reports Server (NTRS)

    Goswami, Kumar K.; Iyer, Ravishankar K.

    1990-01-01

    The development of DEPEND, an integrated simulation environment for the design and dependability analysis of fault-tolerant systems, is described. DEPEND models both hardware and software components at a functional level, and allows automatic failure injection to assess system performance and reliability. It relieves the user of the work needed to inject failures, maintain statistics, and output reports. The automatic failure injection scheme is geared toward evaluating a system under high stress (workload) conditions. The failures that are injected can affect both hardware and software components. To illustrate the capability of the simulator, a distributed system which employs a prediction-based, dynamic load-balancing heuristic is evaluated. Experiments were conducted to determine the impact of failures on system performance and to identify the failures to which the system is especially susceptible.

  15. Simulation using computer-piloted point excitations of vibrations induced on a structure by an acoustic environment

    NASA Astrophysics Data System (ADS)

    Monteil, P.

    1981-11-01

    The computation of the overall levels and spectral densities of the responses measured on a launcher skin (the fairing, for instance) immersed in a random acoustic environment during takeoff was studied. The analysis of the transmission of these vibrations to the payload required the simulation of these responses by a shaker control system using a small number of distributed shakers. Results show that this closed-loop computerized digital system allows the acquisition of auto- and cross-spectral densities equal to those of the responses previously computed. However, wider application is sought, e.g., road and runway profiles. The problems of multiple input-output system identification, multiple true random signal generation, and real-time programming are discussed. The system should allow for the control of four shakers.

  16. Fog-computing concept usage as means to enhance information and control system reliability

    NASA Astrophysics Data System (ADS)

    Melnik, E. V.; Klimenko, A. B.; Ivanov, D. Ya

    2018-05-01

    This paper focuses on the reliability issue of information and control systems (ICS). The authors propose using elements of the fog-computing concept to enhance the reliability function. The key idea of fog-computing is to shift computations to the fog layer of the network, and thus to decrease the workload of the communication environment and data processing components. As for ICS, workload can also be distributed among sensors, actuators and network infrastructure facilities near the sources of data. The authors simulated typical workload distribution situations for the “traditional” ICS architecture and for one using elements of the fog-computing concept. The paper contains some models, selected simulation results and conclusions about the prospects of fog-computing as a means to enhance ICS reliability.
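    The reliability argument can be made concrete with a toy model. Assume, purely for illustration and not as the authors' model, that each node in a series system fails at a rate that grows exponentially with its workload share; spreading the same total load across fog-layer nodes then raises the probability that the system survives a mission time:

```python
import math

def system_reliability(workloads, lam0=1e-4, k=2.0, t=1000.0):
    """Toy series-system reliability: node i fails at rate
    lam0 * exp(k * w_i), so the system survives time t with probability
    prod_i exp(-lam0 * exp(k * w_i) * t). All parameters are illustrative."""
    return math.exp(-lam0 * t * sum(math.exp(k * w) for w in workloads))

# Same total workload of 1.0, concentrated on one node vs spread over
# three fog nodes: convexity of exp() favors the even distribution.
centralized = system_reliability([1.0, 0.0, 0.0])
fog = system_reliability([1.0 / 3, 1.0 / 3, 1.0 / 3])
```

The effect depends on the failure rate being convex in workload; with a rate linear in workload, redistribution alone would not change the product.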

  17. Paracousti-UQ: A Stochastic 3-D Acoustic Wave Propagation Algorithm.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Preston, Leiph

    Acoustic full waveform algorithms, such as Paracousti, provide deterministic solutions in complex, 3-D variable environments. In reality, environmental and source characteristics are often only known in a statistical sense. Thus, to fully characterize the expected sound levels within an environment, this uncertainty in environmental and source factors should be incorporated into the acoustic simulations. Performing Monte Carlo (MC) simulations is one method of assessing this uncertainty, but it can quickly become computationally intractable for realistic problems. An alternative method, using the technique of stochastic partial differential equations (SPDE), allows computation of the statistical properties of output signals at a fraction of the computational cost of MC. Paracousti-UQ solves the SPDE system of 3-D acoustic wave propagation equations and provides estimates of the uncertainty of the output simulated wave field (e.g., amplitudes, waveforms) based on estimated probability distributions of the input medium and source parameters. This report describes the derivation of the stochastic partial differential equations, their implementation, and comparison of Paracousti-UQ results with MC simulations using simple models.
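    The Monte Carlo baseline that Paracousti-UQ is compared against can be sketched in miniature: sample the uncertain medium parameter from its assumed distribution, run the forward model for each draw (here a trivial travel-time calculation rather than a full 3-D wave solve), and collect output statistics. All numbers are illustrative.

```python
import random
import statistics

def mc_travel_time(distance_m=1000.0, c_mean=343.0, c_sd=5.0,
                   n=20000, seed=1):
    """Monte Carlo estimate of the arrival-time distribution of a pulse
    over a fixed path when the sound speed is only known statistically
    (Gaussian with mean c_mean and std c_sd). Returns (mean, std) of the
    travel time in seconds."""
    rng = random.Random(seed)
    times = [distance_m / rng.gauss(c_mean, c_sd) for _ in range(n)]
    return statistics.mean(times), statistics.stdev(times)

mean_t, sd_t = mc_travel_time()
```

Each draw here costs almost nothing; in the full-waveform setting each draw is an entire 3-D simulation, which is exactly why an SPDE formulation that delivers the output statistics directly is attractive.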

  18. Land Use Management in the Panama Canal Watershed to Maximize Hydrologic Ecosystem Services Benefits: Explicit Simulation of Preferential Flow Paths in an HPC Environment

    NASA Astrophysics Data System (ADS)

    Regina, J. A.; Ogden, F. L.; Steinke, R. C.; Frazier, N.; Cheng, Y.; Zhu, J.

    2017-12-01

    Preferential flow paths (PFPs) resulting from biotic and abiotic factors contribute significantly to the generation of runoff in moist lowland tropical watersheds. Flow through PFPs represents the dominant mechanism by which land use choices affect hydrological behavior. The relative influence of PFPs varies depending upon land-use management practices. Assessing the possible effects of land-use and landcover change on flows, and other ecosystem services, in the humid tropics depends in part on adequate simulation of PFPs across different land uses. Currently, 5% of global trade passes through the Panama Canal, which is supplied with fresh water from the Panama Canal Watershed. A third set of locks, recently constructed, is expected to double the capacity of the Canal. We incorporated explicit simulation of PFPs into the ADHydro HPC distributed hydrological model to simulate the effects of land-use and landcover change due to land management incentives on water resources availability in the Panama Canal Watershed. These simulations help to test hypotheses related to the effectiveness of various proposed payments-for-ecosystem-services schemes. This presentation will focus on hydrological model formulation and performance in an HPC environment.

  19. Monitoring of bioaerosol inhalation risks in different environments using a six-stage Andersen sampler and the PCR-DGGE method.

    PubMed

    Xu, Zhenqiang; Yao, Maosheng

    2013-05-01

    Increasing evidence shows that inhalation of indoor bioaerosols has caused numerous adverse health effects and diseases. However, the bioaerosol size distribution, composition, and concentration level, representing different inhalation risks, can vary with different living environments. The six-stage Andersen sampler is designed to simulate sampling of the different regions of the human lung. Here, the sampler was used in investigating bioaerosol exposure in six different environments (student dorm, hospital, laboratory, hotel room, dining hall, and outdoor environment) in Beijing. During sampling, the Andersen sampler was operated for 30 min for each sample, and three independent experiments were performed for each of the environments. The air samples collected onto each of the six stages of the sampler were incubated on agar plates directly at 26 °C, and the colony forming units (CFU) were manually counted and statistically corrected. In addition, the developed CFUs were washed off the agar plates and subjected to polymerase chain reaction (PCR)-denaturing gradient gel electrophoresis (DGGE) for diversity analysis. Results revealed that for most environments investigated, the culturable bacterial aerosol concentrations were higher than those of culturable fungal aerosols. The culturable bacterial and fungal aerosol fractions, concentration, size distribution, and diversity were shown to vary significantly with the sampling environments. PCR-DGGE analysis indicated that different environments had different culturable bacterial aerosol compositions, as revealed by distinct gel band patterns. For most environments tested, larger (>3 μm) culturable bacterial aerosols with a skewed size distribution were shown to prevail, accounting for more than 60 %, while for culturable fungal aerosols with a normal size distribution, those 2.1-4.7 μm dominated, accounting for 20-40 %.
    Alternaria, Cladosporium, Chaetomium, and Aspergillus were found to be abundant in most environments studied here. Viable microbial load per unit of particulate matter was also shown to vary significantly with the sampling environments. The results from this study suggested that different environments, even with similar levels of total culturable microbial aerosol concentrations, could present different inhalation risks due to different bioaerosol particle size distributions and compositions. This work fills literature gaps regarding bioaerosol size- and composition-based exposure risks in different human dwellings, in contrast to the vast body of work on total bioaerosol levels.

  20. Assessment of Human Performance in a Simulated Rotorcraft Downwash Environment

    DTIC Science & Technology

    2007-05-01

    Plaga. Biosciences and Protection Division, Biomechanics Branch, Wright-Patterson AFB, OH 45433-7947. Final report for December 2004 to August 2005, May 2007. Approved for public release; distribution is unlimited.

  1. Adaptive Agent Modeling of Distributed Language: Investigations on the Effects of Cultural Variation and Internal Action Representations

    ERIC Educational Resources Information Center

    Cangelosi, Angelo

    2007-01-01

    In this paper we present the "grounded adaptive agent" computational framework for studying the emergence of communication and language. This modeling framework is based on simulations of population of cognitive agents that evolve linguistic capabilities by interacting with their social and physical environment (internal and external symbol…

  2. Learning Unknown Event Models

    DTIC Science & Technology

    2014-07-01

    knowledge engineering, but it is often impractical due to high environment variance, or unknown events... distribution unlimited. In Proceedings of the Twenty-Eighth AAAI Conference on Artificial Intelligence, 27-31 July 2014... autonomy for responding to unexpected events in strategy simulations. Computational Intelligence, 29(2), 187-206. Leake, D. B. (1991), Goal-based

  3. Data-Intensive Scientific Management, Analysis and Visualization

    NASA Astrophysics Data System (ADS)

    Goranova, Mariana; Shishedjiev, Bogdan; Georgieva, Juliana

    2012-11-01

    The proposed integrated system provides a suite of services for data-intensive sciences that enables scientists to describe, manage, analyze and visualize data from experiments and numerical simulations in distributed and heterogeneous environment. This paper describes the advisor and the converter services and presents an example from the monitoring of the slant column content of atmospheric minor gases.

  4. Modeling, Simulation, and Characterization of Distributed Multi-Agent Systems

    DTIC Science & Technology

    2012-01-01

    capabilities (vision, LIDAR, differential global positioning, ultrasonic proximity sensing, etc.), the agents comprising a MAS tend to have somewhat lesser... on the simultaneous localization and mapping (SLAM) problem [19]. SLAM acknowledges that externally-provided localization information is not... continually-updated mapping databases, generates a comprehensive representation of the spatial and spectral environment. Many times though, inherent SLAM

  5. Engineering High Assurance Distributed Cyber Physical Systems

    DTIC Science & Technology

    2015-01-15

    decisions: number of interacting agents and co-dependent decisions made in real-time without causing interference. To engineer a high assurance DART... environment specification, architecture definition, domain-specific languages, design patterns, code-generation, analysis, test-generation, and simulation... include synchronization between the models and source code, debugging at the model level, expression of the design intent, and quality of service

  6. Location Management in Distributed Mobile Environments

    DTIC Science & Technology

    1994-09-01

    carried out to evaluate the performance of different static strategies for various communication and mobility patterns. Simulation results indicate... results of simulations carried out to evaluate the performance of proposed static location management strategies for various call and mobility patterns... Systems, Austin, Sept. 1994.

  7. Multi-filter spectrophotometry of quasar environments

    NASA Technical Reports Server (NTRS)

    Craven, Sally E.; Hickson, Paul; Yee, Howard K. C.

    1993-01-01

    A many-filter photometric technique for determining redshifts and morphological types, by fitting spectral templates to spectral energy distributions, has good potential for application in surveys. Despite success in studies performed on simulated data, the results have not been fully reliable when applied to real, low signal-to-noise data. We are investigating techniques to improve the fitting process.

  8. Strawman Distributed Interactive Simulation Architecture Description Document. Volume 2. Supporting Rationale. Book 2. DIS Architecture Issues

    DTIC Science & Technology

    1992-03-31

    the-loop, interactive training environment. Its primary advantage is that it has a long history of use and a number of experienced users. However...programmer teams. The Object Oriented Behavioral Decomposition Approach: Object oriented behavioral decomposition is

  9. Development of VR simulator for nurse training

    NASA Astrophysics Data System (ADS)

    Nakagawa, Y.; Tsetserukou, D.; Terashima, K.

    2014-02-01

    Our research focuses on the development of the VR simulator NurseSim to train nurses and hospital aides in how to carry an unconscious or injured person. The motivation behind this project is the fact that nurses consider patient lifting, transfer, and turning to be among the most physically demanding tasks. The user experiences a 3D environment in which they hold the subject. The task is to maintain a posture that prevents further injury to the patient and distributes the patient's weight evenly over both hands. Nurses are taught to mitigate and manage fatigue while at work.

  10. Heterogeneity in homogeneous nucleation from billion-atom molecular dynamics simulation of solidification of pure metal.

    PubMed

    Shibuta, Yasushi; Sakane, Shinji; Miyoshi, Eisuke; Okita, Shin; Takaki, Tomohiro; Ohno, Munekazu

    2017-04-05

    Can completely homogeneous nucleation occur? Large-scale molecular dynamics simulations performed on a graphics-processing-unit-rich supercomputer can shed light on this long-standing issue. Here, a billion-atom molecular dynamics simulation of homogeneous nucleation from an undercooled iron melt reveals that some satellite-like small grains surrounding previously formed large grains exist in the middle of the nucleation process and are not distributed uniformly. At the same time, grains with a twin boundary are formed by heterogeneous nucleation from the surface of the previously formed grains. The local heterogeneity in the distribution of grains is caused by the local accumulation of the icosahedral structure in the undercooled melt near the previously formed grains. This insight is mainly attributable to multi-graphics-processing-unit parallel computation combined with the rapid progress in high-performance computational environments. Nucleation is a fundamental physical process; however, it has been a long-standing issue whether completely homogeneous nucleation can occur. Here the authors reveal, via a billion-atom molecular dynamics simulation, that local heterogeneity exists during homogeneous nucleation in an undercooled iron melt.

  11. Incremental learning of concept drift in nonstationary environments.

    PubMed

    Elwell, Ryan; Polikar, Robi

    2011-10-01

    We introduce an ensemble-of-classifiers approach for incremental learning of concept drift, characterized by nonstationary environments (NSEs), where the underlying data distributions change over time. The proposed algorithm, named Learn++.NSE, learns from consecutive batches of data without making any assumptions on the nature or rate of drift; it can learn from environments that experience constant or variable rates of drift, addition or deletion of concept classes, as well as cyclical drift. The algorithm learns incrementally, like other members of the Learn++ family of algorithms, that is, without requiring access to previously seen data. Learn++.NSE trains one new classifier for each batch of data it receives and combines these classifiers using dynamically weighted majority voting. The novelty of the approach lies in determining the voting weights based on each classifier's time-adjusted accuracy on current and past environments. This approach allows the algorithm to recognize, and act accordingly on, changes in the underlying data distributions, as well as a possible recurrence of an earlier distribution. We evaluate the algorithm on several synthetic datasets designed to simulate a variety of nonstationary environments, as well as a real-world weather prediction dataset. Comparisons with several other approaches are also included. Results indicate that Learn++.NSE can track changing environments very closely, regardless of the type of concept drift. To allow future use, comparison, and benchmarking by interested researchers, we also release the data used in this paper. © 2011 IEEE
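    As a rough illustration of the dynamically weighted majority voting described above, the sketch below computes voting weights from each classifier's error history, giving recent batches more influence through a sigmoid. The slope/offset parameters `a` and `b`, the normalization, and the function name are assumptions for illustration, not the published Learn++.NSE constants.

```python
import numpy as np

def nse_weights(errors, a=0.5, b=10.0):
    """Sketch of time-adjusted voting weights in the Learn++.NSE spirit.

    errors[k] is classifier k's error on each batch seen since it was
    created; a sigmoid over batch index weights recent environments more.
    (a, b are assumed shape parameters, not the paper's exact values.)
    """
    weights = []
    for errs in errors:
        t = np.arange(len(errs))
        # sigmoid gives recent environments more influence
        sigma = 1.0 / (1.0 + np.exp(-a * (t - b)))
        sigma = sigma / sigma.sum()
        # low weighted error -> high voting weight (log-odds form)
        beta = np.clip(np.dot(sigma, errs), 1e-6, 1 - 1e-6)
        weights.append(np.log((1 - beta) / beta))
    return np.array(weights)
```

    A classifier that stays accurate across batches thus out-votes one whose recent errors are high, which is what lets the ensemble track drift.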

  12. Cislan-2 extension final document by University of Twente (Netherlands)

    NASA Astrophysics Data System (ADS)

    Niemegeers, Ignas; Baumann, Frank; Beuwer, Wim; Jordense, Marcel; Pras, Aiko; Schutte, Leon; Tracey, Ian

    1992-01-01

    Results of work performed under the so-called Cislan extension contract are presented. The adaptation of the Cislan 2 prototype design to an environment of interconnected Local Area Networks (LANs), instead of a single 802.5 token ring LAN, is considered. In order to extend the network architecture, the Interconnection Function (IF) protocol layer was subdivided into two protocol layers: a new IF layer and, below it, the Medium Enhancement (ME) protocol layer. Some small enhancements to the distributed bandwidth allocation protocol were developed, which in fact are also applicable to the 'normal' Cislan 2 system. The new services and protocols are described together with some scenarios and requirements for the new internetting Cislan 2 system. How to overcome the degradation of speech quality due to packet loss on the LAN subsystem was studied, and experiments were planned to measure this degradation. Simulations were performed of two Cislan subsystems, the distributed bandwidth allocation protocol and the clock synchronization mechanism; results of both simulations, performed on SUN workstations using QNAP as a simulation tool, are given.

  13. Modeling Engineered Nanomaterials (ENMs) Fate and ...

    EPA Pesticide Factsheets

    Under the Toxic Substances Control Act (TSCA), the Environmental Protection Agency (EPA) is required to perform new chemical reviews of engineered nanomaterials (ENMs) identified in pre-manufacture notices. However, environmental fate models developed for traditional contaminants are limited in their ability to simulate the environmental behavior of nanomaterials due to incomplete understanding and representation of the processes governing nanomaterial distribution in the environment and by scarce empirical data quantifying the interaction of nanomaterials with environmental surfaces. We have updated the Water Quality Analysis Simulation Program (WASP), version 8, to incorporate nanomaterials as an explicitly simulated state variable. WASP8 now has the capability to simulate nanomaterial fate and transport in surface waters and sediments using heteroaggregation, the kinetic process governing the attachment of nanomaterials to particles and subsequently ENM distribution in the aqueous and sediment phases. Unlike dissolved chemicals, which use equilibrium partition coefficients, heteroaggregation consists of a particle collision rate and an attachment efficiency (α_het) that generally acts as a one-direction process. To demonstrate, we used a derived α_het value from sediment attachment studies to parameterize WASP for simulation of multi-walled carbon nanotube (MWCNT) transport in Brier Creek, a coastal plain river located in central eastern Georgia, USA and a tr
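    The one-directional heteroaggregation kinetics described above can be sketched as a simple rate law: the loss of free nanomaterial is proportional to the collision rate, the attachment efficiency α_het, and the concentrations of nanomaterial and background particles. The function name and the explicit Euler step below are illustrative assumptions, not WASP's actual numerics.

```python
def heteroaggregation_step(c_free, c_particles, k_coll, alpha_het, dt):
    """One explicit-Euler step of one-directional heteroaggregation.

    rate = alpha_het * k_coll * c_particles * c_free, i.e. attachment of
    free nanomaterial (c_free) to background particles with no detachment.
    Returns the updated free concentration and the mass attached this step.
    (Illustrative sketch; units and scheme are assumptions.)
    """
    rate = alpha_het * k_coll * c_particles * c_free
    attached = rate * dt
    return c_free - attached, attached
```

    Iterating this step drives the free (aqueous) fraction monotonically toward the particle-bound phase, which is the qualitative behavior the abstract describes.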

  14. Real-time hierarchically distributed processing network interaction simulation

    NASA Technical Reports Server (NTRS)

    Zimmerman, W. F.; Wu, C.

    1987-01-01

    The Telerobot Testbed is a hierarchically distributed processing system which is linked together through a standard, commercial Ethernet. Standard Ethernet systems are primarily designed to manage non-real-time information transfer. Therefore, collisions on the net (i.e., two or more sources attempting to send data at the same time) are managed by randomly rescheduling one of the sources to retransmit at a later time interval. Although acceptable for transmitting noncritical data such as mail, this particular feature is unacceptable for real-time hierarchical command and control systems such as the Telerobot. Data transfer and scheduling simulations, such as token ring, offer solutions to collision management, but do not appropriately characterize real-time data transfer/interactions for robotic systems. Therefore, models like these do not provide a viable simulation environment for understanding real-time network loading. A real-time network loading model is being developed which allows processor-to-processor interactions to be simulated, collisions (and respective probabilities) to be logged, collision-prone areas to be identified, and network control variable adjustments to be reentered as a means of examining and reducing collision-prone regimes that occur in the process of simulating a complete task sequence.
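    A minimal slotted-time sketch of the collision behavior being modeled: each processor attempts a transmission with some probability per slot, and two or more simultaneous attempts are logged as a collision (the senders implicitly retransmitting later). This is a toy abstraction for illustration, not the Telerobot network loading model itself; all parameter names are assumptions.

```python
import random

def simulate_collisions(n_nodes, p_tx, n_slots, seed=0):
    """Count collisions and successful transmissions in a slotted model.

    Each of n_nodes attempts to transmit with probability p_tx per slot;
    >= 2 simultaneous attempts is a collision, exactly 1 is a success.
    """
    rng = random.Random(seed)
    collisions = 0
    successes = 0
    for _ in range(n_slots):
        attempts = sum(rng.random() < p_tx for _ in range(n_nodes))
        if attempts > 1:
            collisions += 1
        elif attempts == 1:
            successes += 1
    return collisions, successes
```

    Sweeping `p_tx` (the offered load) against the collision count reproduces the basic effect the abstract is concerned with: collision-prone regimes emerge as load rises.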

  15. Analysis of the interrelationship of energy, economy, and environment: A model of a sustainable energy future for Korea

    NASA Astrophysics Data System (ADS)

    Boo, Kyung-Jin

    The primary purpose of this dissertation is to provide the groundwork for a sustainable energy future in Korea. For this purpose, a conceptual framework of sustainable energy development was developed to provide a deeper understanding of the interrelationships between energy, the economy, and the environment (E3). Based on this theoretical work, an empirical simulation model was developed to investigate the ways in which E3 interact. This dissertation attempts to develop a unified concept of sustainable energy development by surveying multiple efforts to integrate various definitions of sustainability. Sustainable energy development should be built on the basis of three principles: ecological carrying capacity, economic efficiency, and socio-political equity. Ecological carrying capacity delineates the earth's resource constraints as well as its ability to assimilate wastes. Socio-political equity implies an equitable distribution of the benefits and costs of energy consumption and an equitable distribution of environmental burdens. Economic efficiency dictates efficient allocation of scarce resources. The simulation model is composed of three modules: an energy module, an environmental module, and an economic module. Because the model is grounded on economic structural behaviorism, the dynamic nature of the current economy is effectively depicted and simulated through manipulating exogenous policy variables. This macro-economic model is used to simulate six major policy intervention scenarios. Major findings from these policy simulations were: (1) carbon taxes are the most effective means of reducing air-pollutant emissions; (2) sustainable energy development can be achieved through reinvestment of carbon taxes into energy efficiency and renewable energy programs; and (3) carbon taxes would increase a nation's welfare if reinvested in relevant areas.
The policy simulation model, because it is based on neoclassical economics, has limitations such that it cannot fully account for socio-political realities (inter- and intra-generational equity) which are core features of sustainability. Thus, alternative approaches based on qualitative analysis, such as the multi-criteria approach, will be required to complement the current policy simulation model.

  16. Proceedings of the Spacecraft Charging Technology Conference: Executive Summary

    NASA Technical Reports Server (NTRS)

    Pike, C. P.; Whipple, E. C., Jr.; Stevens, N. J.; Minges, M. L.; Lehn, W. L.; Bunn, M. H.

    1977-01-01

    Aerospace environments are reviewed in reference to spacecraft charging. Modelling, a theoretical scheme which can be used to describe the structure of the sheath around the spacecraft and to calculate the charging currents within, is discussed. Materials characterization is considered for experimental determination of the behavior of typical spacecraft materials when exposed to simulated geomagnetic substorm conditions. Materials development is also examined for controlling and minimizing spacecraft charging or at least for distributing the charge in an equipotential manner, using electrical conductive surfaces for materials exposed to space environment.

  17. Range Systems Simulation for the NASA Shuttle: Emphasis on Disaster and Prevention Management During Lift-Off

    NASA Technical Reports Server (NTRS)

    Rabelo, Lisa; Sepulveda, Jose; Moraga, Reinaldo; Compton, Jeppie; Turner, Robert

    2005-01-01

    This article describes a decision-making system composed of a number of safety and environmental models for the launch phase of a NASA Space Shuttle mission. The components of this distributed simulation environment represent the different systems that must collaborate to establish the Expectation of Casualties (E(sub c)) caused by a failed Space Shuttle launch and subsequent explosion (accidental or instructed) of the spacecraft shortly after liftoff. This decision-making tool employs Space Shuttle reliability models, trajectory models, a blast model, weather dissemination systems, population models, amount and type of toxicants, gas dispersion models, human response functions to toxicants, and a geographical information system. Since one of the important features of this proposed simulation environment is to measure blast, toxic, and debris effects, the clear benefit is that it can help safety managers not only estimate the population at risk, but also plan evacuations, make sheltering decisions, establish the resources required to provide aid and comfort, and mitigate damages in case of a disaster.
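    In its simplest reading, the Expectation of Casualties is a population-weighted sum of per-location casualty probabilities. The helper below is a hypothetical sketch of that aggregation step only; in the real tool, the probabilities come from the chained reliability, trajectory, blast, and dispersion models listed above.

```python
def expectation_of_casualties(populations, p_hazard):
    """E_c as a population-weighted sum of hazard probabilities.

    populations[i] is the head count at location i; p_hazard[i] is the
    probability that an individual there becomes a casualty under the
    failure scenario (blast, toxic, or debris effects). Both inputs and
    the function name are illustrative assumptions.
    """
    return sum(n * p for n, p in zip(populations, p_hazard))
```

    Feeding the same structure with per-effect probability maps lets the blast, toxic, and debris contributions be compared separately before being combined.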

  18. Uranus: a rapid prototyping tool for FPGA embedded computer vision

    NASA Astrophysics Data System (ADS)

    Rosales-Hernández, Victor; Castillo-Jimenez, Liz; Viveros-Velez, Gilberto; Zuñiga-Grajeda, Virgilio; Treviño Torres, Abel; Arias-Estrada, M.

    2007-01-01

    The starting point for successful system development is simulation. Performing high-level simulation of a system can help to identify, isolate, and fix design problems. This work presents Uranus, a software tool for simulation and evaluation of image processing algorithms, with support for migrating them to an FPGA environment for algorithm acceleration and embedded processing purposes. The tool includes an integrated library of previously coded operators in software and provides the necessary support to read and display image sequences as well as video files. The user can use the previously compiled soft-operators in a high-level process chain and code his own operators. In addition to the prototyping tool, Uranus offers an FPGA-based hardware architecture with the same organization as the software prototyping part. The hardware architecture contains a library of FPGA IP cores for image processing that are connected to a PowerPC-based system. The Uranus environment is intended for rapid prototyping of machine vision and migration to an FPGA accelerator platform, and it is distributed for academic purposes.

  19. Technology Developments Integrating a Space Network Communications Testbed

    NASA Technical Reports Server (NTRS)

    Kwong, Winston; Jennings, Esther; Clare, Loren; Leang, Dee

    2006-01-01

    As future manned and robotic space explorations missions involve more complex systems, it is essential to verify, validate, and optimize such systems through simulation and emulation in a low cost testbed environment. The goal of such a testbed is to perform detailed testing of advanced space and ground communications networks, technologies, and client applications that are essential for future space exploration missions. We describe the development of new technologies enhancing our Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) that enable its integration in a distributed space communications testbed. MACHETE combines orbital modeling, link analysis, and protocol and service modeling to quantify system performance based on comprehensive considerations of different aspects of space missions. It can simulate entire networks and can interface with external (testbed) systems. The key technology developments enabling the integration of MACHETE into a distributed testbed are the Monitor and Control module and the QualNet IP Network Emulator module. Specifically, the Monitor and Control module establishes a standard interface mechanism to centralize the management of each testbed component. The QualNet IP Network Emulator module allows externally generated network traffic to be passed through MACHETE to experience simulated network behaviors such as propagation delay, data loss, orbital effects and other communications characteristics, including entire network behaviors. We report a successful integration of MACHETE with a space communication testbed modeling a lunar exploration scenario. This document is the viewgraph slides of the presentation.

  20. Generation of Subsurface Voids, Incubation Effect, and Formation of Nanoparticles in Short Pulse Laser Interactions with Bulk Metal Targets in Liquid: Molecular Dynamics Study

    PubMed Central

    2017-01-01

    The ability of short pulse laser ablation in liquids to produce clean colloidal nanoparticles and unusual surface morphology has been employed in a broad range of practical applications. In this paper, we report the results of large-scale molecular dynamics simulations aimed at revealing the key processes that control the surface morphology and nanoparticle size distributions in pulsed laser ablation in liquids. The simulations of bulk Ag targets irradiated in water are performed with an advanced computational model combining a coarse-grained representation of the liquid environment and an atomistic description of laser interaction with metal targets. For irradiation conditions that correspond to the spallation regime in vacuum, the simulations predict that the water environment can prevent the complete separation of the spalled layer from the target, leading to the formation of large subsurface voids stabilized by rapid cooling and solidification. The subsequent irradiation of the laser-modified surface is found to result in more efficient ablation and nanoparticle generation, thus suggesting the possibility of an incubation effect in multipulse laser ablation in liquids. The simulations performed at higher laser fluences that correspond to the phase explosion regime in vacuum reveal the accumulation of the ablation plume at the interface with the water environment and the formation of a hot metal layer. The water in contact with the metal layer is brought to the supercritical state and provides an environment suitable for nucleation and growth of small metal nanoparticles from metal atoms emitted from the hot metal layer. The metal layer itself has limited stability and can readily disintegrate into large (tens of nanometers) nanoparticles. The layer disintegration is facilitated by the Rayleigh-Taylor instability of the interface between the higher-density metal layer and the lighter supercritical water that decelerates it.
The nanoparticles emerging from the layer disintegration are rapidly cooled and solidified through interaction with the water environment, with a cooling rate of ~2 × 10^12 K/s observed in the simulations. The computational prediction of two distinct mechanisms of nanoparticle formation yielding nanoparticles with different characteristic sizes provides a plausible explanation for the experimental observations of bimodal nanoparticle size distributions in laser ablation in liquids. The ultrahigh cooling and solidification rates suggest the possibility of generating nanoparticles featuring metastable phases and highly nonequilibrium structures. PMID:28798858

  1. Advanced Engineering Environments: Implications for Aerospace Manufacturing

    NASA Technical Reports Server (NTRS)

    Thomas, D.

    2001-01-01

    There are significant challenges facing today's aerospace industry. Global competition, more complex products, geographically-distributed design teams, demands for lower cost, higher reliability, and safer vehicles, and the need to incorporate the latest technologies more quickly all confront the developer of aerospace systems. New information technologies offer promising opportunities to develop advanced engineering environments (AEEs) to meet these challenges. Significant advances in the state-of-the-art of aerospace engineering practice are envisioned in the areas of engineering design and analytical tools, cost and risk tools, collaborative engineering, and high-fidelity simulations early in the development cycle. These advances will enable modeling and simulation of manufacturing methods, which will in turn allow manufacturing considerations to be included much earlier in the system development cycle. Significant cost savings, increased quality, and decreased manufacturing cycle time are expected to result. This paper gives an overview of NASA's Intelligent Synthesis Environment, the agency initiative to develop an AEE, with a focus on the anticipated benefits in aerospace manufacturing.

  2. The Future of Drought in the Southeastern U.S.: Projections from downscaled CMIP5 models

    NASA Astrophysics Data System (ADS)

    Keellings, D.; Engstrom, J.

    2017-12-01

    The Southeastern U.S. has been repeatedly impacted by severe droughts that have affected the environment and economy of the region. In this study, the ability of 32 downscaled CMIP5 models, bias-corrected using localized constructed analogs (LOCA), to simulate historical observations of dry spells from 1950-2005 is assessed using Perkins skill scores and significance tests. The models generally simulate the distribution of dry days well, but there are significant differences between the ability of the best and worst performing models, particularly in the upper tail of the distribution. The best and worst performing models are then projected through 2099, using RCP 4.5 and 8.5, and estimates of 20-year return periods are compared. Only the higher-skill models provide a good estimate of extreme dry spell lengths, with simulations of 20-year return values within ±5 days of observed values across the region. Projected return values differ by model grouping, but all models exhibit significant increases.
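    The Perkins skill score used above measures the overlap of two empirical probability distributions: a common formulation sums, over histogram bins, the minimum of the normalized observed and modeled frequencies (1 = identical distributions, 0 = no overlap). A sketch of that formulation, with the bin count as an assumed parameter:

```python
import numpy as np

def perkins_skill_score(obs, model, bins=20):
    """Overlap of two empirical PDFs via shared-edge histograms.

    Both samples are binned on a common range; the score is the summed
    elementwise minimum of the two normalized frequency vectors.
    """
    lo = min(obs.min(), model.min())
    hi = max(obs.max(), model.max())
    edges = np.linspace(lo, hi, bins + 1)
    f_obs, _ = np.histogram(obs, bins=edges)
    f_mod, _ = np.histogram(model, bins=edges)
    f_obs = f_obs / f_obs.sum()
    f_mod = f_mod / f_mod.sum()
    return np.minimum(f_obs, f_mod).sum()
```

    Applied per model to simulated versus observed dry-spell lengths, this yields the single skill number used to rank the 32 models.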

  3. Influence of the spectral distribution of light on the characteristics of photovoltaic panel. Comparison between simulation and experimental

    NASA Astrophysics Data System (ADS)

    Chadel, Meriem; Bouzaki, Mohammed Moustafa; Chadel, Asma; Petit, Pierre; Sawicki, Jean-Paul; Aillerie, Michel; Benyoucef, Boumediene

    2017-02-01

    We present and analyze experimental results obtained with a laboratory setup based on hardware and smart instrumentation for the complete study of the performance of PV panels, using an artificial radiation source (halogen lamps) for illumination. Combined with an accurate analysis, this global experimental procedure allows the determination of effective performance under standard conditions thanks to a simulation process originally developed in the Matlab software environment. The uniformity of the irradiated surface was checked by simulation of the light field. We studied the response of standard commercial photovoltaic panels under illumination measured by a spectrometer with different spectra for two sources, halogen lamps and sunlight. We then give special attention to the influence of the spectral distribution of light on the characteristics of the photovoltaic panel, measured as a function of temperature and for different illuminations, with dedicated measurements and studies of the open-circuit voltage and short-circuit current.

  4. Using multi-disciplinary optimization and numerical simulation on the transiting exoplanet survey satellite

    NASA Astrophysics Data System (ADS)

    Stoeckel, Gerhard P.; Doyle, Keith B.

    2017-08-01

    The Transiting Exoplanet Survey Satellite (TESS) is an instrument consisting of four wide field-of-view CCD cameras dedicated to the discovery of exoplanets around the brightest stars and to understanding the diversity of planets and planetary systems in our galaxy. Each camera utilizes a seven-element lens assembly with low-power and low-noise CCD electronics. Advanced multivariable optimization and numerical simulation capabilities accommodating arbitrarily complex objective functions have been added to the internally developed Lincoln Laboratory Integrated Modeling and Analysis Software (LLIMAS) and used to assess system performance. Various optical phenomena are accounted for in these analyses, including full dn/dT spatial distributions in lenses and charge diffusion in the CCD electronics. These capabilities are utilized to design CCD shims for thermal-vacuum chamber testing and flight, and to verify comparable performance in both environments across a range of wavelengths, field points, and temperature distributions. Additionally, optimizations and simulations are used for model correlation and robustness optimization.

  5. Sensitivity of Offshore Surface Fluxes and Sea Breezes to the Spatial Distribution of Sea-Surface Temperature

    NASA Astrophysics Data System (ADS)

    Lombardo, Kelly; Sinsky, Eric; Edson, James; Whitney, Michael M.; Jia, Yan

    2018-03-01

    A series of numerical sensitivity experiments is performed to quantify the impact of the sea-surface temperature (SST) distribution on offshore surface fluxes and simulated sea-breeze dynamics. Simulations of two mid-latitude sea-breeze events over coastal New England are performed using a spatially-uniform SST as well as spatially-varying SST datasets of 32- and 1-km horizontal resolution. Offshore surface heat and buoyancy fluxes vary in response to the SST distribution. Local sea-breeze circulations are relatively insensitive, with minimal differences in vertical structure and propagation speed among the experiments. The largest thermal perturbations are confined to the lowest 10% of the sea-breeze column because the relatively high stability of the mid-Atlantic marine atmospheric boundary layer (ABL) suppresses vertical mixing, leaving the depth of the marine layer unchanged. The minimal impact on the column-averaged virtual potential temperature and sea-breeze depth translates to small changes in sea-breeze propagation speed. This indicates that the use of datasets with fine-scale SST may not produce more accurate sea-breeze simulations in highly stable marine ABL regimes, though it may prove more beneficial in less stable sub-tropical environments.

  6. Collective odor source estimation and search in time-variant airflow environments using mobile robots.

    PubMed

    Meng, Qing-Hao; Yang, Wei-Xing; Wang, Yang; Zeng, Ming

    2011-01-01

    This paper addresses the collective odor source localization (OSL) problem in a time-varying airflow environment using mobile robots. A novel OSL methodology which combines odor-source probability estimation and multiple robots' search is proposed. The estimation phase consists of two steps: firstly, the separate probability-distribution map of the odor source is estimated via Bayesian rules and fuzzy inference based on a single robot's detection events; secondly, the separate maps estimated by different robots at different times are fused into a combined map by way of distance-based superposition. The multi-robot search behaviors are coordinated via a particle swarm optimization algorithm, where the estimated odor-source probability distribution is used to express the fitness functions. In the process of OSL, the estimation phase provides the prior knowledge for the searching while the searching verifies the estimation results, and both phases are implemented iteratively. The results of simulations for large-scale advection-diffusion plume environments and experiments using real robots in an indoor airflow environment validate the feasibility and robustness of the proposed OSL method.
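    One plausible reading of the distance-based superposition step is sketched below: each robot's probability map contributes to every grid cell with a weight inversely proportional to the robot-to-cell distance, and the fused map is renormalized. The exact weighting in the paper may differ; names and parameters here are assumptions.

```python
import numpy as np

def fuse_maps(maps, robot_positions, grid_coords, eps=1e-6):
    """Distance-based superposition of per-robot probability maps.

    maps[k] is robot k's odor-source probability over grid cells;
    grid_coords is an (n_cells, 2) array of cell coordinates. Each map
    is weighted per cell by inverse distance to its robot, then the
    superposed map is renormalized to sum to 1.
    """
    fused = np.zeros(len(grid_coords))
    for m, pos in zip(maps, robot_positions):
        d = np.linalg.norm(grid_coords - pos, axis=1)
        fused += m / (d + eps)
    return fused / fused.sum()
```

    The fused map can then serve directly as the fitness landscape for the particle swarm search phase the abstract describes.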

  8. Simulation of organic molecule formation in solar system environments-The Miller-Urey Experiment in Space project overview

    NASA Astrophysics Data System (ADS)

    Kotler, J. Michelle; Ehrenfruend, Pascale; Botta, Oliver; Blum, Jurgen; Schrapler, Rainer; van Dongen, Joost; Palmans, Anja; Sephton, Mark A.; Martins, Zita; Cleaves, Henderson J.; Ricco, Antonio

    The Miller-Urey Experiment in Space (MUE) investigates the formation of potential prebiotic organic compounds in the early solar system environment. The MUE experiment will be sent to and retrieved from the International Space Station (ISS), where it will be performed inside the Microgravity Science Glovebox (MSG). The goal of this space experiment is to understand prebiotic reactions in microgravity by simulating environments of the early solar nebula. The dynamic environment of the solar nebula, with the simultaneous presence of gas, particles, and energetic processes, including shock waves, lightning, and radiation, may trigger a rich organic chemistry leading to organic molecules. These environments will be simulated in six fabricated vials containing various gas mixtures as well as solid particles. Two gas mixture compositions will be tested and subjected to continuous spark discharges for 48, 96, and 192 hours. Silicate particles will serve as surfaces on which thin water-ice mantles can accrete. The particles will move repeatedly through a high-voltage spark discharge in microgravity, enabling chemical reactions analogous to the original Miller-Urey experiment. The experiment will be performed at low temperature (-5 °C), slowing hydrolysis and improving the chances of detecting intermediates, initial products, and their distributions. Executing the Miller-Urey experiment in the space environment (microgravity) allows us to simulate conditions that could have prevailed in the energetic early solar nebula and provides insights into the chemical pathways that may occur in forming planetary systems. Analysis will be performed post-flight using chemical analytical methods.
The anticipated results will provide information about chemical reaction pathways that form organic compounds in the space environment, emphasizing abiotic chemical pathways and mechanisms that could have been crucial in the formation of biologically relevant compounds such as amino acids and nucleobases, basic constituents common to life on Earth.

  9. Turbulent unmixing: how marine turbulence drives patchy distributions of motile phytoplankton

    NASA Astrophysics Data System (ADS)

    Durham, William; Climent, Eric; Barry, Michael; de Lillo, Filippo; Boffetta, Guido; Cencini, Massimo; Stocker, Roman

    2013-11-01

    Centimeter-scale patchiness in the distribution of phytoplankton increases the efficacy of many important ecological interactions in the marine food web. We show that turbulent fluid motion, usually synonymous with mixing, instead triggers intense small-scale patchiness in the distribution of motile phytoplankton. We use a suite of experiments, direct numerical simulations of turbulence, and analytical tools to show that turbulent shear and acceleration direct the motility of cells towards well-defined regions of flow, increasing local cell concentrations more than tenfold. This motility-driven `unmixing' offers an explanation for why motile cells are often more patchily distributed than non-motile cells and provides a mechanistic framework to understand how turbulence, whose strength varies profoundly in marine environments, impacts ocean productivity.

  10. A formation control strategy with coupling weights for the multi-robot system

    NASA Astrophysics Data System (ADS)

    Liang, Xudong; Wang, Siming; Li, Weijie

    2017-12-01

    The distributed formation problem for multi-robot systems with general linear dynamics and a directed communication topology is discussed. To prevent the multi-robot system from losing the desired formation in complex communication environments, a distributed cooperative algorithm with coupling weights based on the Zipf distribution is designed. An asymptotic stability condition for the formation of the multi-robot system is given, and graph theory and Lyapunov theory are used to prove that, under this condition, the formation converges to the desired geometry and to the desired motion of the virtual leader. Nontrivial simulations validate the effectiveness of the distributed cooperative algorithm with coupling weights.
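The abstract does not give the algorithm's equations, so the following is a hypothetical sketch of a generic discrete-time coupling-weighted formation consensus protocol: each robot moves toward a weighted average of its neighbors' positions, shifted by desired formation offsets, with the coupling weights drawn from a Zipf-like law. All gains, offsets, and the communication graph are assumptions for illustration.

```python
import random

random.seed(0)

def zipf_weights(n, s=1.5):
    """Normalized Zipf-like coupling weights for n neighbors (assumption)."""
    raw = [1.0 / (k + 1) ** s for k in range(n)]
    total = sum(raw)
    return [w / total for w in raw]

def formation_step(positions, offsets, neighbors, gain=0.5):
    """One synchronous update: x_i += gain * sum_j w_ij ((x_j - d_j) - (x_i - d_i))."""
    new_positions = []
    for i, x in enumerate(positions):
        nbrs = neighbors[i]
        weights = zipf_weights(len(nbrs))
        err = sum(w * ((positions[j] - offsets[j]) - (x - offsets[i]))
                  for w, j in zip(weights, nbrs))
        new_positions.append(x + gain * err)
    return new_positions

# Four robots on a line, desired spacing 1.0; ring communication graph.
offsets = [0.0, 1.0, 2.0, 3.0]
neighbors = [[1, 3], [0, 2], [1, 3], [0, 2]]
positions = [random.uniform(-5.0, 5.0) for _ in range(4)]
for _ in range(200):
    positions = formation_step(positions, offsets, neighbors)

# The relative positions converge to the desired formation offsets.
gaps = [positions[i + 1] - positions[i] for i in range(3)]
```

Because the update is a consensus on the shifted variables y_i = x_i - d_i over a connected graph, the gaps converge to the offset differences regardless of initial positions.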

  11. Development of a smart home simulator for use as a heuristic tool for management of sensor distribution.

    PubMed

    Poland, Michael P; Nugent, Chris D; Wang, Hui; Chen, Liming

    2009-01-01

    Smart Homes offer potential solutions for various forms of independent living for the elderly. The assistive and protective environment afforded by smart homes offers a safe, relatively inexpensive, dependable and viable alternative for vulnerable inhabitants. Nevertheless, the success of a smart home rests upon the quality of information its decision support system receives, and this in turn places great importance on the issue of correct sensor deployment. In this article we present a software tool that has been developed to address the elusive issue of sensor distribution within smart homes. Details of the tool will be presented, and it will be shown how it can be used to emulate any real-world environment whereby virtual sensor distributions can be rapidly implemented and assessed without the requirement for physical deployment for evaluation. As such, this approach offers the potential of tailoring sensor distributions to the specific needs of a patient in a non-invasive manner. The heuristics-based tool presented here has been developed as the first part of a three-stage project.

  12. A Cloud-Based Simulation Architecture for Pandemic Influenza Simulation

    PubMed Central

    Eriksson, Henrik; Raciti, Massimiliano; Basile, Maurizio; Cunsolo, Alessandro; Fröberg, Anders; Leifler, Ola; Ekberg, Joakim; Timpka, Toomas

    2011-01-01

    High-fidelity simulations of pandemic outbreaks are resource consuming. Cluster-based solutions have been suggested for executing such complex computations. We present a cloud-based simulation architecture that utilizes computing resources both locally available and dynamically rented online. The approach uses the Condor framework for job distribution and management of the Amazon Elastic Computing Cloud (EC2) as well as local resources. The architecture has a web-based user interface that allows users to monitor and control simulation execution. In a benchmark test, the best cost-adjusted performance was recorded for the EC2 H-CPU Medium instance, while a field trial showed that the job configuration had significant influence on the execution time and that the network capacity of the master node could become a bottleneck. We conclude that it is possible to develop a scalable simulation environment that uses cloud-based solutions, while providing an easy-to-use graphical user interface. PMID:22195089

  13. What do Simulations Predict for the Galaxy Stellar Mass Function and its Evolution in Different Environments?

    NASA Astrophysics Data System (ADS)

    Vulcani, Benedetta; De Lucia, Gabriella; Poggianti, Bianca M.; Bundy, Kevin; More, Surhud; Calvi, Rosa

    2014-06-01

    We present a comparison between the observed galaxy stellar mass function and the one predicted from the De Lucia & Blaizot semi-analytic model applied to the Millennium Simulation, for cluster satellites and galaxies in the field (meant as a wide portion of the sky, including all environments), in the local universe (z ~ 0.06) and at intermediate redshift (z ~ 0.6), with the aim of shedding light on the processes which regulate the mass distribution in different environments. While the mass functions in the field and in its finer environments (groups, binary, and single systems) are well matched in the local universe down to the completeness limit of the observational sample, the model overpredicts the number of low-mass galaxies in the field at z ~ 0.6 and in clusters at both redshifts. Above M* = 10^10.25 M_⊙, it reproduces the observed similarity of the cluster and field mass functions but not the observed evolution. Our results point out two shortcomings of the model: an incorrect treatment of cluster-specific environmental effects and an overefficient galaxy formation at early times (as already found by, e.g., Weinmann et al.). Next, we consider only simulations. Also using the Guo et al. model, we find that the high-mass end of the mass functions depends on halo mass: only very massive halos host massive galaxies, with the result that their mass function is flatter. Above M* = 10^9.4 M_⊙, simulations show an evolution in the number of the most massive galaxies in all environments. Mass functions obtained from the two prescriptions are different; however, the results are qualitatively similar, indicating that the adopted methods to model the evolution of central and satellite galaxies still have to be better implemented in semi-analytic models.

  14. Simulation of Cardiovascular Response to the Head-Up/Head-Down Tilt at Different Angles

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Lu, Hong-Bing; Jiao, Chun; Zhang, Li-Fan

    2008-06-01

    The disappearance of hydrostatic pressure is the original factor that causes the changes of the cardiovascular system under microgravity. These hydrostatic changes can be simulated by postural changes; in particular, the head-down position can be used to simulate the effects of microgravity. The goal of this investigation was to develop a mathematical model for simulation of the human cardiovascular responses to acute and prolonged exposure to a microgravity environment. We were particularly interested in the redistribution of transmural pressures, flows, and blood volume, and the consequent alterations in local hemodynamics in different cardiovascular compartments during acute exposure and chronic adjustment. As a preliminary study, we first developed a multi-element, distributed hemodynamic model of the human cardiovascular system, and verified the model by simulating cardiovascular changes during head-up/down tilt at various angles.
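The paper's multi-element distributed model is not specified in the abstract; as a minimal sketch of the compartment-equation idea, a two-element Windkessel lumps the arterial tree into one compliance C and one peripheral resistance R, giving dP/dt = (Q_in - P/R)/C. The parameter values and the inflow waveform below are assumptions, not values from the paper.

```python
import math

R = 1.0    # peripheral resistance (mmHg·s/mL), assumed
C = 1.5    # arterial compliance (mL/mmHg), assumed
dt = 0.001
P = 80.0   # initial arterial pressure (mmHg)

def inflow(t, period=0.8):
    """Pulsatile cardiac inflow: half-sine ejection in the first 35% of each beat."""
    phase = (t % period) / period
    return 400.0 * math.sin(math.pi * phase / 0.35) if phase < 0.35 else 0.0

# Forward-Euler integration of dP/dt = (Q_in - P/R) / C over 10 s.
pressures = []
for n in range(int(10.0 / dt)):
    t = n * dt
    P += dt * (inflow(t) - P / R) / C
    pressures.append(P)

mean_p = sum(pressures[-800:]) / 800   # mean pressure over the final beat
```

At steady state the mean pressure settles near mean-flow × R, so this toy model already reproduces the basic pressure-flow coupling that a distributed model resolves compartment by compartment.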

  15. Modeling mobile source emissions during traffic jams in a micro urban environment.

    PubMed

    Kondrashov, Valery V; Reshetin, Vladimir P; Regens, James L; Gunter, James T

    2002-01-01

    Urbanization typically involves a continuous increase in motor vehicle use, resulting in congestion known as traffic jams. Idling emissions due to traffic jams combine with the complex terrain created by buildings to concentrate atmospheric pollutants in localized areas. This research simulates emissions concentrations and distributions for a congested street in Minsk, Belarus. Ground-level (up to 50 meters above the street's surface) pollutant concentrations were calculated using STAR (version 3.10) with emission factors obtained from the U.S. Environmental Protection Agency, wind speed and direction, and building location and size. Relative emissions concentrations and distributions were simulated at 1 meter and 10 meters above street level. The findings demonstrate the importance of wind speed and direction, and of building size and location, on emissions concentrations and distributions, with the leeward sides of buildings retaining up to 99 percent of the emitted pollutants within 1 meter of street level, and up to 77 percent at 10 meters above the street.

  16. Adaptive Detector Arrays for Optical Communications Receivers

    NASA Technical Reports Server (NTRS)

    Vilnrotter, V.; Srinivasan, M.

    2000-01-01

    The structure of an optimal adaptive array receiver for ground-based optical communications is described and its performance investigated. Kolmogorov phase screen simulations are used to model the sample functions of the focal-plane signal distribution due to turbulence and to generate realistic spatial distributions of the received optical field. This novel array detector concept reduces interference from background radiation by effectively assigning higher confidence levels at each instant of time to those detector elements that contain significant signal energy and suppressing those that do not. A simpler suboptimum structure that replaces the continuous weighting function of the optimal receiver by a hard decision on the selection of the signal detector elements also is described and evaluated. Approximations and bounds to the error probability are derived and compared with the exact calculations and receiver simulation results. It is shown that, for photon-counting receivers observing Poisson-distributed signals, performance improvements of approximately 5 dB can be obtained over conventional single-detector photon-counting receivers, when operating in high background environments.
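The weighting-and-selection idea can be illustrated with a toy model (this is not the paper's receiver): a detector array observes Poisson counts that are background everywhere plus signal concentrated in a few turbulence-induced "speckle" pixels, and the suboptimum hard-decision structure keeps only pixels whose counts exceed a background-based threshold. Array size, rates, and the speckle locations are all assumptions.

```python
import math
import random

random.seed(1)

def poisson(lam):
    """Poisson sample via Knuth's product-of-uniforms method (fine for small rates)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

N = 256                            # 16x16 detector array (assumed)
signal_pixels = {10, 11, 26, 27}   # assumed speckle locations in the focal plane
b, s = 2.0, 30.0                   # background and per-pixel signal rates (assumed)

# Observed counts: background everywhere, background + signal on speckle pixels.
counts = [poisson(b + (s if i in signal_pixels else 0.0)) for i in range(N)]

# Hard decision: keep detector elements whose count exceeds mean background
# plus five standard deviations, suppressing background-only elements.
threshold = b + 5.0 * b ** 0.5
selected = {i for i, c in enumerate(counts) if c > threshold}
```

With a strong signal the selected subset recovers the speckle pixels while rejecting nearly all background-only elements, which is the mechanism behind the quoted background-suppression gains.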

  17. CONFIG: Integrated engineering of systems and their operation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Ryan, Dan; Fleming, Land

    1994-01-01

    This article discusses CONFIG 3, a prototype software tool that supports integrated conceptual design evaluation from early in the product life cycle, by supporting isolated or integrated modeling, simulation, and analysis of the function, structure, behavior, failures and operations of system designs. Integration and reuse of models is supported in an object-oriented environment providing capabilities for graph analysis and discrete event simulation. CONFIG supports integration among diverse modeling approaches (component view, configuration or flow path view, and procedure view) and diverse simulation and analysis approaches. CONFIG is designed to support integrated engineering in diverse design domains, including mechanical and electro-mechanical systems, distributed computer systems, and chemical processing and transport systems.

  18. Turbomachinery CFD on parallel computers

    NASA Technical Reports Server (NTRS)

    Blech, Richard A.; Milner, Edward J.; Quealy, Angela; Townsend, Scott E.

    1992-01-01

    The role of multistage turbomachinery simulation in the development of propulsion system models is discussed. Particularly, the need for simulations with higher fidelity and faster turnaround time is highlighted. It is shown how such fast simulations can be used in engineering-oriented environments. The use of parallel processing to achieve the required turnaround times is discussed. Current work by several researchers in this area is summarized. Parallel turbomachinery CFD research at the NASA Lewis Research Center is then highlighted. These efforts are focused on implementing the average-passage turbomachinery model on MIMD, distributed memory parallel computers. Performance results are given for inviscid, single blade row and viscous, multistage applications on several parallel computers, including networked workstations.

  19. Toward Improved Parameterization of a Meso-Scale Hydrologic Model in a Discontinuous Permafrost, Boreal Forest Ecosystem

    NASA Astrophysics Data System (ADS)

    Endalamaw, A. M.; Bolton, W. R.; Young, J. M.; Morton, D.; Hinzman, L. D.

    2013-12-01

    The sub-arctic environment can be characterized as being located in the zone of discontinuous permafrost. Although the distribution of permafrost is site specific, it dominates many of the hydrologic and ecologic responses and functions including vegetation distribution, stream flow, soil moisture, and storage processes. In this region, the boundaries that separate the major ecosystem types (deciduous dominated and coniferous dominated ecosystems) as well as permafrost (permafrost verses non-permafrost) occur over very short spatial scales. One of the goals of this research project is to improve parameterizations of meso-scale hydrologic models in this environment. Using the Caribou-Poker Creeks Research Watershed (CPCRW) as the test area, simulations of the headwater catchments of varying permafrost and vegetation distributions were performed. CPCRW, located approximately 50 km northeast of Fairbanks, Alaska, is located within the zone of discontinuous permafrost and the boreal forest ecosystem. The Variable Infiltration Capacity (VIC) model was selected as the hydrologic model. In CPCRW, permafrost and coniferous vegetation is generally found on north facing slopes and valley bottoms. Permafrost free soils and deciduous vegetation is generally found on south facing slopes. In this study, hydrologic simulations using fine scale vegetation and soil parameterizations - based upon slope and aspect analysis at a 50 meter resolution - were conducted. Simulations were also conducted using downscaled vegetation from the Scenarios Network for Alaska and Arctic Planning (SNAP) (1 km resolution) and soil data sets from the Food and Agriculture Organization (FAO) (approximately 9 km resolution). 
Preliminary simulation results show that soil and vegetation parameterizations based upon fine scale slope/aspect analysis increases the R2 values (0.5 to 0.65 in the high permafrost (53%) basin; 0.43 to 0.56 in the low permafrost (2%) basin) relative to parameterization based on coarse scale data. These results suggest that using fine resolution parameterizations can be used to improve meso-scale hydrological modeling in this region.

  20. Fluoroquinolones in the Wenyu River catchment, China: Occurrence simulation and risk assessment.

    PubMed

    Hao, Xuewen; Cao, Yan; Zhang, Lai; Zhang, Yongyong; Liu, Jianguo

    2015-12-01

    Concern is increasing regarding the environmental impact of the high usage rate and intensive release of antibiotics used for human and animal therapy in major urban areas of China. In the present study, regional environmental distribution simulations and risk assessments for 3 commonly used fluoroquinolones in the Wenyu River catchment were conducted using a typical catchment model widely used in Europe. The fluoroquinolone antibiotics investigated (ofloxacin, norfloxacin, and ciprofloxacin) are consumed at high levels for personal health care in China. These antibiotics were simulated in the aquatic environment of the Wenyu River catchment across the Beijing City area for annual average concentrations, with regional predicted environmental concentrations (PECs) of approximately 711 ng/L, 55.3 ng/L, and 22.2 ng/L and local PECs up to 1.8 µg/L, 116 ng/L, and 43 ng/L, respectively. Apart from hydrological conditions, the concentrations of fluoroquinolones were associated closely with the sewage treatment plants (STPs) and their serving population, as well as hospital distributions. The presence of these fluoroquinolones in the catchment area of the present study showed significant characteristics of the occurrence of pharmaceuticals in the aquatic environment in an urban river, with typical "down-the-drain" chemicals. Significantly high concentrations of specific antibiotics indicated non-negligible risks caused by the intensive use in the local aquatic environment in a metropolitan area, particularly ofloxacin in upstream Shahe Reservoir, middle stream and downstream Qing River, and Liangma River to the Ba River segment. Specific treatment measures for these pharmaceuticals and personal care products in STPs are required for such metropolitan areas. © 2015 SETAC.
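The paper uses a full catchment model; for orientation only, a screening-level "down-the-drain" PEC estimate of the kind used in regulatory guidance combines per-capita use, excretion, treatment removal, and dilution. Every number below is hypothetical and is not taken from the study.

```python
def pec_surface_water(dose_mg_per_cap_day, population, f_excreted,
                      stp_removal, wastewater_l_per_cap_day, dilution):
    """Screening predicted environmental concentration in receiving water (ng/L)."""
    # Daily mass load reaching surface water after excretion and STP removal.
    load_mg_day = dose_mg_per_cap_day * population * f_excreted * (1.0 - stp_removal)
    # Receiving-water volume: wastewater flow scaled by the dilution factor.
    volume_l_day = wastewater_l_per_cap_day * population * dilution
    return load_mg_day / volume_l_day * 1e6   # mg/L -> ng/L

pec = pec_surface_water(
    dose_mg_per_cap_day=0.4,        # hypothetical per-capita fluoroquinolone use
    population=1_000_000,
    f_excreted=0.8,                 # fraction excreted unchanged (assumed)
    stp_removal=0.5,                # removal in sewage treatment (assumed)
    wastewater_l_per_cap_day=200.0,
    dilution=10.0,
)
# pec is 80.0 ng/L for these inputs
```

The same quantities (STP-served population, removal efficiency, dilution by river flow) are what drive the spatial PEC patterns reported in the abstract.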

  1. Real-world hydrologic assessment of a fully-distributed hydrological model in a parallel computing environment

    NASA Astrophysics Data System (ADS)

    Vivoni, Enrique R.; Mascaro, Giuseppe; Mniszewski, Susan; Fasel, Patricia; Springer, Everett P.; Ivanov, Valeriy Y.; Bras, Rafael L.

    2011-10-01

    A major challenge in the use of fully-distributed hydrologic models has been the lack of computational capabilities for high-resolution, long-term simulations in large river basins. In this study, we present the parallel model implementation and real-world hydrologic assessment of the Triangulated Irregular Network (TIN)-based Real-time Integrated Basin Simulator (tRIBS). Our parallelization approach is based on the decomposition of a complex watershed using the channel network as a directed graph. The resulting sub-basin partitioning divides effort among processors and handles hydrologic exchanges across boundaries. Through numerical experiments in a set of nested basins, we quantify parallel performance relative to serial runs for a range of processors, simulation complexities and lengths, and sub-basin partitioning methods, while accounting for inter-run variability on a parallel computing system. In contrast to serial simulations, the parallel model speed-up depends on the variability of hydrologic processes. Load balancing significantly improves parallel speed-up, with proportionally faster runs as simulation complexity (domain resolution and channel network extent) increases. The best strategy for large river basins is to combine a balanced partitioning with an extended channel network, with potential savings through a lower TIN resolution. Based on these advances, a wider range of applications for fully-distributed hydrologic models is now possible. This is illustrated through a set of ensemble forecasts that account for precipitation uncertainty derived from a statistical downscaling model.
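The sub-basin partitioning idea can be sketched as follows (a simplification, not tRIBS's actual decomposition): treat the channel network as a directed tree with edges pointing downstream, compute the "work" (e.g. cell count) carried by each upstream subtree, and greedily assign whole sub-basins to the least-loaded processor. The toy network and work values are hypothetical.

```python
# Hypothetical channel network: node -> node it drains into (outlet drains to None).
downstream = {"A": "C", "B": "C", "C": "E", "D": "E", "E": None}
work = {"A": 4, "B": 3, "C": 2, "D": 5, "E": 1}   # assumed per-node cell counts

# Invert the drainage map to find each node's upstream children.
children = {}
for n, d in downstream.items():
    if d is not None:
        children.setdefault(d, []).append(n)

def subtree_work(node):
    """Total work in the sub-basin draining through `node`."""
    return work[node] + sum(subtree_work(c) for c in children.get(node, []))

# Greedy balancing over two processors: the outlet stays on rank 0; each
# first-order sub-basin (largest first) goes to the least-loaded rank.
outlet = next(n for n, d in downstream.items() if d is None)
loads = [work[outlet], 0]
assignment = {outlet: 0}
for sub in sorted(children[outlet], key=subtree_work, reverse=True):
    rank = loads.index(min(loads))
    loads[rank] += subtree_work(sub)
    for n in [sub] + children.get(sub, []):
        assignment[n] = rank
```

Assigning whole subtrees keeps most hydrologic exchanges processor-local, so only the sub-basin outlets require cross-boundary communication.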

  2. FISHER'S GEOMETRIC MODEL WITH A MOVING OPTIMUM

    PubMed Central

    Matuszewski, Sebastian; Hermisson, Joachim; Kopp, Michael

    2014-01-01

    Fisher's geometric model has been widely used to study the effects of pleiotropy and organismic complexity on phenotypic adaptation. Here, we study a version of Fisher's model in which a population adapts to a gradually moving optimum. Key parameters are the rate of environmental change, the dimensionality of phenotype space, and the patterns of mutational and selectional correlations. We focus on the distribution of adaptive substitutions, that is, the multivariate distribution of the phenotypic effects of fixed beneficial mutations. Our main results are based on an “adaptive-walk approximation,” which is checked against individual-based simulations. We find that (1) the distribution of adaptive substitutions is strongly affected by the ecological dynamics and largely depends on a single composite parameter γ, which scales the rate of environmental change by the “adaptive potential” of the population; (2) the distribution of adaptive substitutions reflects the shape of the fitness landscape if the environment changes slowly, whereas it mirrors the distribution of new mutations if the environment changes fast; (3) in contrast to classical models of adaptation assuming a constant optimum, with a moving optimum, more complex organisms evolve via larger adaptive steps. PMID:24898080
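A minimal caricature of the adaptive walk under a moving optimum can be simulated directly (the paper's analysis is far more careful; the rates, mutation scale, and acceptance rule below are simplifying assumptions): the optimum drifts along one phenotypic axis, mutations are isotropic Gaussian, and any mutation that brings the phenotype closer to the current optimum substitutes.

```python
import math
import random

random.seed(2)

n, v, sigma, steps = 3, 0.01, 0.1, 5000   # dimensions, optimum speed, mutation scale
z = [0.0] * n                              # current population phenotype
substitutions = []                         # sizes of fixed beneficial steps

for t in range(steps):
    opt = [v * t] + [0.0] * (n - 1)        # optimum moves along the first axis
    mutant = [zi + random.gauss(0.0, sigma) for zi in z]
    d_old = math.dist(z, opt)
    d_new = math.dist(mutant, opt)
    if d_new < d_old:                      # beneficial mutation fixes (adaptive walk)
        substitutions.append(d_old - d_new)
        z = mutant

mean_step = sum(substitutions) / len(substitutions)
lag = v * (steps - 1) - z[0]               # how far the phenotype trails the optimum
```

Varying v relative to sigma in this toy reproduces the qualitative regime change the abstract describes: a slow optimum yields many small substitutions shaped by the landscape, a fast one yields substitutions that look like raw mutations.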

  3. Finite element analysis and performance study of switched reluctance generator

    NASA Astrophysics Data System (ADS)

    Zhang, Qianhan; Guo, Yingjun; Xu, Qi; Yu, Xiaoying; Guo, Yajie

    2017-03-01

    A three-phase 12/8 switched reluctance generator (SRG) is analysed on the basis of its structure and operating principle. The initial size data were calculated in MathCAD, and the simulation model was set up in the ANSOFT software environment with maximum efficiency and maximum output power as the main reference parameters. The outer diameter of the stator and the inner diameter of the rotor were parameterized. The static magnetic field distribution, magnetic flux, magnetic energy, torque, inductance characteristics, back electromotive force, and phase current waveform of the SRG are obtained by analyzing the static magnetic field and the steady-state motion of the two-dimensional transient magnetic field in the ANSOFT environment. Finally, the experimental data of the prototype are compared with the simulation results, which provide a reliable basis for the design and research of SRG wind turbine systems.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smallwood, D.O.

    It is recognized that some dynamic and noise environments are characterized by time histories which are not Gaussian. An example is high intensity acoustic noise. Another example is some transportation vibration. A better simulation of these environments can be generated if a zero mean non-Gaussian time history can be reproduced with a specified auto (or power) spectral density (ASD or PSD) and a specified probability density function (pdf). After the required time history is synthesized, the waveform can be used for simulation purposes. For example, modern waveform reproduction techniques can be used to reproduce the waveform on electrodynamic or electrohydraulic shakers. Or the waveforms can be used in digital simulations. A method is presented for the generation of realizations of zero mean non-Gaussian random time histories with a specified ASD and pdf. First a Gaussian time history with the specified ASD is generated. A monotonic nonlinear function relating the Gaussian waveform to the desired realization is then established based on the cumulative distribution function (CDF) of the desired waveform and the known CDF of a Gaussian waveform. The established function is used to transform the Gaussian waveform to a realization of the desired waveform. Since the transformation preserves the zero-crossings and peaks of the original Gaussian waveform, and does not introduce any substantial discontinuities, the ASD is not substantially changed. Several methods are available to generate a realization of a Gaussian distributed waveform with a known ASD; the method of Smallwood and Paez (1993) is an example. However, the generation of random noise with a specified ASD but with a non-Gaussian distribution is less well known.
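The two-step method described above can be sketched as follows. Step 1 is simplified here: a moving-average filter stands in for full ASD-matched synthesis and merely produces a correlated, exactly unit-variance Gaussian series. Step 2 applies the monotone map y = F_target⁻¹(Φ(x)), which preserves zero crossings and peak ordering. A zero-mean Laplace target is an arbitrary choice for illustration, not one from the paper.

```python
import math
import random

random.seed(3)

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def laplace_inv_cdf(u, b=1.0):
    """Inverse CDF of a zero-mean Laplace(b) distribution (clamped near 0 and 1)."""
    q = max(1.0 - 2.0 * abs(u - 0.5), 1e-15)
    return -b * math.copysign(math.log(q), 0.5 - u)

# Step 1 (simplified): correlated Gaussian series via an order-m moving average;
# each sample is a sum of m iid N(0,1) values scaled to unit variance.
n, m = 20000, 8
white = [random.gauss(0.0, 1.0) for _ in range(n + m)]
shaped = [sum(white[i:i + m]) / math.sqrt(m) for i in range(n)]

# Step 2: monotone CDF-to-CDF transform to the desired marginal distribution.
y = [laplace_inv_cdf(phi(x)) for x in shaped]

mean_y = sum(y) / n
var_y = sum(v * v for v in y) / n
kurt = sum(v ** 4 for v in y) / n / var_y ** 2   # Laplace kurtosis is 6, vs 3 for Gaussian
```

Because the map is monotone, the transformed series keeps the shaped Gaussian's zero crossings and peak locations while acquiring the heavy-tailed (high-kurtosis) marginal distribution.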

  5. THE HALO MASS FUNCTION CONDITIONED ON DENSITY FROM THE MILLENNIUM SIMULATION: INSIGHTS INTO MISSING BARYONS AND GALAXY MASS FUNCTIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faltenbacher, A.; Finoguenov, A.; Drory, N.

    2010-03-20

    The baryon content of high-density regions in the universe is relevant to two critical unanswered questions: the workings of nurture effects on galaxies and the whereabouts of the missing baryons. In this paper, we analyze the distribution of dark matter and semianalytical galaxies in the Millennium Simulation to investigate these problems. Applying the same density field reconstruction schemes as used for the overall matter distribution to the matter locked in halos, we study the mass contribution of halos to the total mass budget at various background field densities, i.e., the conditional halo mass function. In this context, we present a simple fitting formula for the cumulative mass function accurate to ≲5% for halo masses between 10^10 and 10^15 h^-1 M_sun. We find that in dense environments the halo mass function becomes top heavy and present corresponding fitting formulae for different redshifts. We demonstrate that the major fraction of matter in high-density fields is associated with galaxy groups. Since current X-ray surveys are able to nearly recover the universal baryon fraction within groups, our results indicate that the major part of the so-far undetected warm-hot intergalactic medium resides in low-density regions. Similarly, we show that the differences in galaxy mass functions with environment seen in observed and simulated data stem predominantly from differences in the mass distribution of halos. In particular, the hump in the galaxy mass function is associated with the central group galaxies, and the bimodality observed in the galaxy mass function is therefore interpreted as that of central galaxies versus satellites.

  6. Effects of Data Replication on Data Exfiltration in Mobile Ad Hoc Networks Utilizing Reactive Protocols

    DTIC Science & Technology

    2015-03-01

    Indexed excerpts: availability schemes and simulation environments are surveyed; a proactive routing scheme can prove problematic, two prominent proactive protocols being Destination-Sequenced Distance-Vector (DSDV) and Optimized Link State Routing; distributed file management systems such as Tahoe-LAFS are used as part of the replication scheme; Altman and De Pellegrini [4] examine the impact of FEC.

  7. The StarLite Project Prototyping Real-Time Software

    DTIC Science & Technology

    1991-10-01

    Indexed excerpts: multiversion data objects are prototyped using the prototyping environment, and Section 5 concludes the paper; Section 2 covers message-based simulation when prototyping distributed systems; comparisons are made between two-phase locking and priority-based synchronization algorithms, and between a multiversion database and its corresponding single-version database; a transaction may miss its deadline, since it is only aborted in the validation phase; Section 4.5 presents a multiversion database system to illustrate the effectiveness of the approach.

  8. Planetary geomorphology research: FY 1990-1991

    NASA Technical Reports Server (NTRS)

    Malin, M. C.

    1991-01-01

    Progress in the following research areas is discussed: (1) volatile ice sublimation in a simulated Martian polar environment; (2) a global synthesis of Venusian tectonics; (3) a summary of nearly a decade of field studies of eolian processes in cold volcanic deserts; and (4) a model for interpretation of Martian sediment distribution using Viking observations. Some conclusions from the research are presented.

  9. Distributed Decision Making in a Dynamic Network Environment

    DTIC Science & Technology

    1990-01-01

    Indexed excerpts: the protocols are compared particularly when traffic arrival statistics are varying or unknown and loads are high; both nonpreemptive and preemptive repeat disciplines are considered; the simulation model allows general value functions, continuous-time operation, and preemptive or nonpreemptive service; for reasons of tractability, the disciplines examined include nonpreemptive LIFO, nonpreemptive LIFO with discarding, nonpreemptive HOL, nonpreemptive HOL with discarding, and preemptive repeat HOL.
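Two of the listed disciplines can be contrasted in a toy single-server simulation (illustrative only; arrival and service rates are assumptions, and value functions and discarding are omitted): nonpreemptive FIFO versus nonpreemptive HOL, where head-of-line priority serves the highest-priority waiting job first but never interrupts a job in service.

```python
import heapq
import random

random.seed(4)

def simulate(hol, n_jobs=4000, lam=0.7, mu=1.0):
    """Single-server queue with two priority classes (0 = high); returns mean waits."""
    t, jobs = 0.0, []
    for _ in range(n_jobs):
        t += random.expovariate(lam)                       # Poisson arrivals
        jobs.append((t, random.randint(0, 1), random.expovariate(mu)))
    waits = {0: [], 1: []}
    queue, i, now = [], 0, 0.0
    while len(waits[0]) + len(waits[1]) < n_jobs:
        while i < n_jobs and jobs[i][0] <= now:            # admit arrivals up to `now`
            arr, prio, svc = jobs[i]
            key = (prio, arr) if hol else (arr, prio)      # HOL orders by class first
            heapq.heappush(queue, (key, arr, prio, svc))
            i += 1
        if not queue:                                      # server idle: jump ahead
            now = jobs[i][0]
            continue
        _, arr, prio, svc = heapq.heappop(queue)
        waits[prio].append(now - arr)
        now += svc                                         # nonpreemptive: run to completion
    return sum(waits[0]) / len(waits[0]), sum(waits[1]) / len(waits[1])

hol_high, hol_low = simulate(hol=True)
fifo_high, fifo_low = simulate(hol=False)
```

Under HOL the high-priority class waits markedly less than the low-priority class, at the low class's expense, while FIFO treats both classes alike.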

  10. Agent-based Decision Support System for the Third Generation Distributed Dynamic Decision-making (DDD-III) Simulator

    DTIC Science & Technology

    2004-06-01

    Indexed excerpts: a suitable form of organizational adaptation requires effective organizational diagnosis and analysis; information related to the mission environment, organizational structure, and strategy is imperative for an effective and efficient organizational diagnosis; the system's displays present information not easily articulated or expressed otherwise, and are crucial to facilitating effective organizational diagnosis and analysis.

  11. An Automatic Instrument to Study the Spatial Scaling Behavior of Emissivity

    PubMed Central

    Tian, Jing; Zhang, Renhua; Su, Hongbo; Sun, Xiaomin; Chen, Shaohui; Xia, Jun

    2008-01-01

    In this paper, the design of an automatic instrument for measuring the spatial distribution of land surface emissivity is presented, which makes direct in situ measurement of the spatial distribution of emissivity possible. The significance of this new instrument lies in two aspects: it helps to investigate the spatial scaling behavior of emissivity and temperature, and its design provides theoretical and practical foundations for implementing the measurement of the spatial distribution of surface emissivity on airborne or spaceborne platforms. To improve the accuracy of the measurements, the emissivity measurement and its uncertainty are examined in a series of carefully designed experiments. The impact of the variation of target temperature and the environmental irradiance on the measurement of emissivity is analyzed as well. In addition, the ideal temperature difference between the hot environment and the cool environment is obtained from numerical simulations. Finally, the scaling behavior of surface emissivity caused by the heterogeneity of the target is discussed. PMID:27879735

  12. Presumed PDF Modeling of Early Flame Propagation in Moderate to Intense Turbulence Environments

    NASA Technical Reports Server (NTRS)

    Carmen, Christina; Feikema, Douglas A.

    2003-01-01

    The present paper describes the results obtained from a one-dimensional, time-dependent numerical technique that simulates early flame propagation in a moderate to intense turbulent environment. Attention is focused on the development of a spark-ignited, premixed, lean methane/air mixture with the unsteady spherical flame propagating in homogeneous and isotropic turbulence. A Monte-Carlo particle tracking method, based upon the method of fractional steps, is utilized to simulate the phenomena represented by a probability density function (PDF) transport equation. Gaussian distributions of fluctuating velocity and fuel concentration are prescribed. Attention is focused on three primary parameters that influence the initial flame kernel growth: the detailed ignition system characteristics, the mixture composition, and the nature of the flow field. The computational results for moderate and intense isotropic turbulence suggest that flames within the distributed reaction zone are not as vulnerable as traditionally believed to the adverse effects of increased turbulence intensity. It is also shown that the magnitude of the flame front thickness significantly impacts the turbulent consumption flame speed. The flame conditions studied have fuel equivalence ratios in the range φ = 0.6 to 0.9 at standard temperature and pressure.
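The core of any PDF-based closure can be illustrated in miniature (this is not the paper's transported-PDF particle method): the mean of a nonlinear rate w(c) over a presumed Gaussian PDF of the fluctuating scalar c is estimated by averaging over Monte-Carlo particles. For w(c) = c², the exact answer against a N(mu, sigma) PDF is mu² + sigma², so the particle estimate can be checked directly; mu and sigma are arbitrary illustrative values.

```python
import random

random.seed(5)

# Presumed-PDF Monte-Carlo sketch: sample particles from the presumed Gaussian
# PDF of the scalar c and average the nonlinear "reaction rate" w(c) = c**2.
mu, sigma, n = 0.6, 0.1, 200_000
mc_mean = sum(random.gauss(mu, sigma) ** 2 for _ in range(n)) / n

exact = mu ** 2 + sigma ** 2   # analytic mean of c**2 under N(mu, sigma): 0.37
```

The point of the particle representation is exactly this: a nonlinear rate averaged over the PDF differs from the rate evaluated at the mean (here 0.37 vs 0.36), and particles capture that difference without moment closures.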

  13. Effects of distribution density and cell dimension of 3D vegetation model on canopy NDVI simulation base on DART

    NASA Astrophysics Data System (ADS)

    Tao, Zhu; Shi, Runhe; Zeng, Yuyan; Gao, Wei

    2017-09-01

    The 3D model is an important part of simulated remote sensing for Earth observation. At the small spatial scales handled by the DART software, both the detail of the model itself and the number of distributed model instances have an important impact on the scene canopy Normalized Difference Vegetation Index (NDVI). Taking Phragmites australis in the Yangtze Estuary as an example, and building on previous studies of model precision, this paper studies the effect of the P. australis model on canopy NDVI, focusing on the DART cell dimension, the distribution density of the P. australis model in the scene, and the choice of model density against the cost of computer running time in practical simulation. The DART cell dimensions and the scene model density were set using the optimal-precision model from existing research results. The simulated NDVI for different model densities under different cell dimensions was examined by error analysis. By studying the relationship between relative error, absolute error, and time cost, a density selection method for the P. australis model in small-scale scene simulation was established. Experiments showed that, because the 3D model differs from the real scene, the number of P. australis plants in the simulated scene need not match that of the real environment: keeping a density of about 40 plants per square meter yields the best simulation results while preserving the visual effect.
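For reference, the canopy NDVI evaluated from simulated reflectance follows the standard definition, NDVI = (NIR - Red) / (NIR + Red); the band reflectance values below are hypothetical, not DART outputs.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

# Hypothetical canopy reflectances: bright in NIR, dark in red, as for dense vegetation.
value = ndvi(nir=0.45, red=0.05)   # (0.45 - 0.05) / (0.45 + 0.05) = 0.8
```

Dense green canopies push NDVI toward 1, which is why the index is sensitive to the modeled plant density and cell dimension discussed above.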

  14. Mission Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Pisaich, Gregory; Flueckiger, Lorenzo; Neukom, Christian; Wagner, Mike; Buchanan, Eric; Plice, Laura

    2007-01-01

    The Mission Simulation Toolkit (MST) is a flexible software system for autonomy research. It was developed as part of the Mission Simulation Facility (MSF) project, started in 2001 to facilitate the development of autonomous planetary robotic missions. Autonomy is a key enabling factor for robotic exploration, but there has been a large gap between autonomy software at the research level and software that is ready for insertion into near-term space missions. The MST bridges this gap by providing a simulation framework and a suite of tools for supporting research on and maturation of autonomy. The MST uses a distributed framework based on the High Level Architecture (HLA) standard. A key feature of the MST framework is the ability to plug in new models to replace existing ones offering the same services, which enables significant simulation flexibility, particularly the mixing and control of fidelity levels. In addition, the MST provides automatic code generation from robot interfaces defined with the Unified Modeling Language (UML), methods for maintaining synchronization across distributed simulation systems, XML-based robot descriptions, and an environment server. Finally, the MSF supports a number of third-party products, including dynamic models and terrain databases. Although the communication objects and some of the simulation components provided with this toolkit are specifically designed for terrestrial surface rovers, the MST can be applied to other domains, such as aerial, aquatic, or space.

  15. Understanding environment-influenced swarm behavior from a social force perspective

    NASA Astrophysics Data System (ADS)

    Jiang, J.; Lu, D.; Jiang, Y.; Lee, Z.; Zhang, Y.; Yu, J.

    2018-02-01

    Research on swarm behavior has largely focused on cases in which globally consistent behaviors emerge from individuals agreeing with other members of the system, so that individual decisions are completely dominated by other members. In fact, when individuals generate their own behavior strategies, they consider not only the influence of other members but also, autonomously, their current environment. For example, in the social foraging of flocks, the behavior strategy of each individual animal is influenced by the food distribution, and individual movement patterns are characterized by a highly efficient search strategy, Lévy walks. To investigate this, this paper proposes an environment-driven social force perspective to explore the Lévy walks of individuals in a group in patchy food environments. The model adopts the concept of social force to quantify the social effects and the interactions between individuals and food; the coordination between forces is key to the formation of individual behavior strategies. Our simulation results show a power-law frequency distribution of agent flight lengths that conforms to Lévy walks and verifies the hypothesized relationship between food density and the Lévy index. In our model, the flock still exhibits collective consistency and cohesion, yielding a high value of the order parameter and population density when moving between food patches. In addition, our model explains the intraspecific cooperation and competition that occur during foraging, as proposed in related work, and the simulation validates the impact of the two inducements for individual behaviors compared with several benchmark models.
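    As an illustrative aside, the power-law flight-length signature reported above can be reproduced and checked in a few lines. The sketch below is not the authors' model: the exponent, cutoff, and sample size are arbitrary choices. It draws flight lengths from a continuous power law by inverse-transform sampling and recovers the Lévy index with a standard maximum-likelihood estimator.

```python
import numpy as np

def sample_flight_lengths(mu, l_min, n, rng):
    """Inverse-transform sampling from a continuous power law
    p(l) ~ l**(-mu) for l >= l_min."""
    u = rng.random(n)                      # u in [0, 1)
    return l_min * (1.0 - u) ** (-1.0 / (mu - 1.0))

def estimate_levy_index(lengths, l_min):
    """Maximum-likelihood estimator for the power-law exponent."""
    lengths = lengths[lengths >= l_min]
    return 1.0 + len(lengths) / np.sum(np.log(lengths / l_min))

rng = np.random.default_rng(0)
flights = sample_flight_lengths(mu=2.0, l_min=1.0, n=50_000, rng=rng)
mu_hat = estimate_levy_index(flights, l_min=1.0)
# Levy walks correspond to an exponent in the open interval (1, 3).
print(f"estimated Levy index: {mu_hat:.2f}")
```

    With 50,000 samples the estimator recovers the input exponent to within a few hundredths, which is the kind of check used to classify simulated trajectories as Lévy walks.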

  16. Efficient Round-Trip Time Optimization for Replica-Exchange Enveloping Distribution Sampling (RE-EDS).

    PubMed

    Sidler, Dominik; Cristòfol-Clough, Michael; Riniker, Sereina

    2017-06-13

    Replica-exchange enveloping distribution sampling (RE-EDS) allows the efficient estimation of free-energy differences between multiple end-states from a single molecular dynamics (MD) simulation. In EDS, a reference state is sampled that can be tuned by two types of parameters, i.e., the smoothness parameter(s) s and the energy offsets, such that all end-states are sufficiently sampled. However, the choice of these parameters is not trivial. Replica exchange (RE), or parallel tempering, is a widely applied technique to enhance sampling, and by combining EDS with RE the parameter-choice problem can be simplified and the challenge shifted toward an optimal distribution of the replicas in the smoothness-parameter space. The choice of a particular replica distribution can alter the sampling efficiency significantly. In this work, global round-trip time optimization (GRTO) algorithms are tested for use in RE-EDS simulations. In addition, a local round-trip time optimization (LRTO) algorithm is proposed for systems with slowly adapting environments, where a reliable estimate of the round-trip time is challenging to obtain. The optimization algorithms were applied to RE-EDS simulations of a system of nine small-molecule inhibitors of phenylethanolamine N-methyltransferase (PNMT), with the energy offsets determined using our recently proposed parallel energy-offset (PEOE) estimation scheme. While the multistate GRTO algorithm yielded the best replica distribution for the ligands in water, the multistate LRTO algorithm was found to be the method of choice for the ligands in complex with PNMT. With this, the 36 alchemical free-energy differences between the nine ligands were calculated successfully from a single RE-EDS simulation of 10 ns in length. Thus, RE-EDS presents an efficient method for the estimation of relative binding free energies.
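    For readers unfamiliar with EDS, the reference state mentioned above is built from the end-state energies, the smoothness parameter, and the energy offsets. The sketch below implements the standard EDS reference-state expression V_R = -(1/(beta*s)) ln sum_i exp(-beta*s*(V_i - E_i)); the energies and offsets used are arbitrary illustrative numbers, not values from the PNMT study.

```python
import math

def eds_reference_energy(end_state_energies, energy_offsets, s, beta=1.0):
    """EDS reference-state potential:
    V_R = -1/(beta*s) * ln( sum_i exp(-beta*s*(V_i - E_i)) ).
    Small s smooths the landscape; large s approaches min_i (V_i - E_i)."""
    terms = (math.exp(-beta * s * (v - e))
             for v, e in zip(end_state_energies, energy_offsets))
    return -math.log(sum(terms)) / (beta * s)

# Two hypothetical end-state energies with zero offsets:
V, E = [1.0, 5.0], [0.0, 0.0]
print(eds_reference_energy(V, E, s=1.0))    # close to min(V) = 1.0
print(eds_reference_energy(V, E, s=0.01))   # much smoother envelope, far below min(V)
```

    Tuning s trades off how faithfully the reference state envelopes each end-state against how easily the simulation can cross between them, which is exactly the replica-distribution problem RE-EDS addresses.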

  17. BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments

    PubMed Central

    Thomas, Brandon R.; Chylek, Lily A.; Colvin, Joshua; Sirimulla, Suman; Clayton, Andrew H.A.; Hlavacek, William S.; Posner, Richard G.

    2016-01-01

    Summary: Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. Here, we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e. optimization of parameter values for consistency with data) when simulations are computationally expensive. Availability and implementation: BioNetFit can be used on stand-alone Mac, Windows/Cygwin, and Linux platforms and on Linux-based clusters running SLURM, Torque/PBS, or SGE. The BioNetFit source code (Perl) is freely available (http://bionetfit.nau.edu). Supplementary information: Supplementary data are available at Bioinformatics online. Contact: bionetgen.help@gmail.com PMID:26556387

  18. MaGate Simulator: A Simulation Environment for a Decentralized Grid Scheduler

    NASA Astrophysics Data System (ADS)

    Huang, Ye; Brocco, Amos; Courant, Michele; Hirsbrunner, Beat; Kuonen, Pierre

    This paper presents a simulator for a decentralized modular grid scheduler named MaGate. MaGate’s design emphasizes scheduler interoperability by providing intelligent scheduling that serves the grid community as a whole. Each MaGate scheduler instance is able to deal with dynamic scheduling conditions and continuously arriving grid jobs; received jobs are either allocated on local resources or delegated to other MaGates for remote execution. The proposed MaGate simulator is based on the GridSim toolkit and the Alea simulator, and abstracts the features and behaviors of the fundamental grid elements, such as grid jobs, grid resources, and grid users. Simulation of scheduling tasks is supported by a grid network overlay simulator that executes distributed ant-based swarm intelligence algorithms to provide services such as group communication and resource discovery. For evaluation, a comparison of the behaviors of different collaborative policies among a community of MaGates is provided. The results support the use of the proposed approach as a functionally ready grid scheduler simulator.

  19. Cellular dosimetry calculations for Strontium-90 using Monte Carlo code PENELOPE.

    PubMed

    Hocine, Nora; Farlay, Delphine; Boivin, Georges; Franck, Didier; Agarande, Michelle

    2014-11-01

    To improve risk assessments associated with chronic exposure to Strontium-90 (Sr-90), for both the environment and human health, it is necessary to know the energy distribution in specific cells or tissues. Monte Carlo (MC) simulation codes are extremely useful tools for calculating deposited energy. The present work focused on the validation of the MC code PENetration and Energy LOss of Positrons and Electrons (PENELOPE) and the assessment of the dose distribution to bone marrow cells from a point Sr-90 source localized within the cortical bone. S-values (absorbed dose per unit cumulated activity) were calculated using both PENELOPE and Monte Carlo N-Particle eXtended (MCNPX). Cytoplasm, nucleus, cell surface, mouse femur bone, and the Sr-90 radiation source were simulated. Cells were assumed to be spherical, with the radii of the cell and cell nucleus ranging from 2 to 10 μm, and the Sr-90 source was assumed to be uniformly distributed in the cell nucleus, cytoplasm, or cell surface. The S-values calculated with PENELOPE agreed very well with the MCNPX results and the Medical Internal Radiation Dose (MIRD) values, with relative deviations of less than 4.5%. The dose distribution to mouse bone marrow cells showed that the cells localized near the cortical part received the maximum dose. The MC code PENELOPE may prove useful for cellular dosimetry involving radiation transport through materials other than water, or for complex distributions of radionuclides and geometries.
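    The code-to-code comparison above rests on the percent relative deviation between S-values. A minimal sketch of that comparison (the S-values below are hypothetical placeholders, not results from the paper):

```python
def relative_deviation(s_ref, s_test):
    """Percent relative deviation of s_test from the reference value s_ref."""
    return abs(s_test - s_ref) / s_ref * 100.0

# Hypothetical S-values in Gy/(Bq*s) for one source-target configuration,
# chosen only to illustrate the calculation.
s_penelope = 4.20e-3
s_mcnpx = 4.35e-3

dev = relative_deviation(s_penelope, s_mcnpx)
print(f"relative deviation: {dev:.2f}%")
```

    A deviation computed this way below the 4.5% threshold quoted in the abstract would count as agreement between the two codes for that configuration.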

  20. Shared virtual environments for aerospace training

    NASA Technical Reports Server (NTRS)

    Loftin, R. Bowen; Voss, Mark

    1994-01-01

    Virtual environments have the potential to significantly enhance the training of NASA astronauts and ground-based personnel for a variety of activities. A critical requirement is the need to share virtual environments, in real or near real time, between remote sites. It has been hypothesized that the training of international astronaut crews could be done more cheaply and effectively by utilizing such shared virtual environments in the early stages of mission preparation. The Software Technology Branch at NASA's Johnson Space Center has developed the capability for multiple users to simultaneously share the same virtual environment. Each user generates the graphics needed to create the virtual environment. All changes of object position and state are communicated to all users so that each virtual environment maintains its 'currency.' Examples of these shared environments will be discussed and plans for the utilization of the Department of Defense's Distributed Interactive Simulation (DIS) protocols for shared virtual environments will be presented. Finally, the impact of this technology on training and education in general will be explored.

  1. Quantifying Low Energy Proton Damage in Multijunction Solar Cells

    NASA Technical Reports Server (NTRS)

    Messenger, Scott R.; Burke, Edward A.; Walters, Robert J.; Warner, Jeffrey H.; Summers, Geoffrey P.; Lorentzen, Justin R.; Morton, Thomas L.; Taylor, Steven J.

    2007-01-01

    An analysis of the effects of low energy proton irradiation on the electrical performance of triple junction (3J) InGaP2/GaAs/Ge solar cells is presented. The Monte Carlo ion transport code (SRIM) is used to simulate the damage profile induced in a 3J solar cell under the conditions of typical ground testing and that of the space environment. The results are used to present a quantitative analysis of the defect, and hence damage, distribution induced in the cell active region by the different radiation conditions. The modelling results show that, in the space environment, the solar cell will experience a uniform damage distribution through the active region of the cell. Through an application of the displacement damage dose analysis methodology, the implications of this result on mission performance predictions are investigated.

  2. Latency Hiding in Dynamic Partitioning and Load Balancing of Grid Computing Applications

    NASA Technical Reports Server (NTRS)

    Das, Sajal K.; Harvey, Daniel J.; Biswas, Rupak

    2001-01-01

    The Information Power Grid (IPG) concept developed by NASA aims to provide a metacomputing platform for large-scale distributed computations, hiding the intricacies of a highly heterogeneous environment while maintaining adequate security. In this paper, we propose a latency-tolerant partitioning scheme that dynamically balances processor workloads on the IPG and minimizes data movement and runtime communication. By simulating an unsteady adaptive mesh application on a wide area network, we study the performance of our load balancer under the Globus environment. The number of IPG nodes, the number of processors per node, and the interconnect speeds are parameterized to derive conditions under which the IPG would be suitable for parallel distributed processing of such applications. Experimental results demonstrate that effective solutions are achieved when the IPG nodes are connected by a high-speed asynchronous interconnection network.

  3. Pinning time statistics for vortex lines in disordered environments.

    PubMed

    Dobramysl, Ulrich; Pleimling, Michel; Täuber, Uwe C

    2014-12-01

    We study the pinning dynamics of magnetic flux (vortex) lines in a disordered type-II superconductor. Using numerical simulations of a directed elastic line model, we extract the pinning time distributions of vortex line segments. We compare different model implementations for the disorder in the surrounding medium: discrete, localized pinning potential wells that are either attractive and repulsive or purely attractive, and whose strengths are drawn from a Gaussian distribution; as well as continuous Gaussian random potential landscapes. We find that both schemes yield power-law distributions in the pinned phase as predicted by extreme-event statistics, yet they differ significantly in their effective scaling exponents and their short-time behavior.

  4. The Strata-l Experiment on Microgravity Regolith Segregation

    NASA Technical Reports Server (NTRS)

    Fries, M.; Abell, P.; Brisset, J.; Britt, D.; Colwell, J.; Durda, D.; Dove, A.; Graham, L.; Hartzell, C.; John, K.; hide

    2016-01-01

    The Strata-1 experiment studies the segregation of small-body regolith through long-duration exposure of simulant materials to the microgravity environment on the International Space Station (ISS). Many asteroids feature low bulk densities, which imply high porosity and a mechanical structure composed of loosely bound particles (i.e., the "rubble pile" model), a prime example of a granular medium. Even the higher-density, mechanically coherent asteroids feature a significant surface layer of loose regolith. These bodies will evolve in response to very small perturbations such as micrometeoroid impacts, planetary flybys, and the YORP effect. A detailed understanding of asteroid mechanical evolution is needed in order to predict the surface characteristics of as-yet-unvisited bodies, to understand the larger context of samples from sample return missions, and to mitigate risks for both manned and unmanned missions to asteroidal bodies. Based on observations of rocky regions on asteroids such as Eros and Itokawa, it has been hypothesized that the grain size distribution with depth on an asteroid may be inhomogeneous: specifically, that large boulders have been mobilized to the surface. In terrestrial environments, this size-dependent sorting toward the surface is called the Brazil Nut Effect. The microgravity and acceleration environment on the ISS is similar to that of a small asteroid; thus, Strata-1 investigates size segregation of regolith in an environment analogous to that of small bodies. Strata-1 consists of four regolith simulants in evacuated tubes, as shown in Figure 1 (top and middle): (1) a crushed and sieved ordinary chondrite meteorite that simulates an asteroidal surface, (2) a carbonaceous chondrite simulant with a mixture of fine and coarse particles, and two simplified silicate glass simulants, (3) one with angular and (4) one with spherical particles. These materials were chosen to span a range of granular complexity. The materials were sorted into three size species pre-launch and maintained during launch and return by a device called the Entrapulator. The hypothesis under test is that the particles constituting a granular medium in a microgravity environment, subjected to a known vibration environment, will segregate in accordance with modeled predictions. Strata-1 is currently operating on the ISS, with cameras capturing images of simulant motion throughout the one-year mission. Vibration data are recorded and downlinked, and the simulants will be analyzed after return to Earth.

  5. Characterizing Feedbacks Between Environmental Forcing and Sediment Characteristics in Fluvial and Coastal Systems

    NASA Astrophysics Data System (ADS)

    Feehan, S.; Ruggiero, P.; Hempel, L. A.; Anderson, D. L.; Cohn, N.

    2016-12-01

    Presented at the American Geophysical Union 2016 Fall Meeting, San Francisco, CA. Linking transport processes and sediment characteristics within different environments along the source-to-sink continuum provides critical insight into the dominant feedbacks between grain size distributions and morphological evolution. This research evaluates differences in sediment size distributions across both fluvial and coastal environments in the U.S. Pacific Northwest. The Cascades' high relief is characterized by diverse flow regimes, with high-peak/flashy flows and sub-threshold flows occurring in relative proximity, and by one of the most energetic wave climates in the world. Combining analyses of fluvial and coastal environments provides a broader understanding of the dominant forces driving differences between each system's grain size distributions, sediment transport processes, and resultant evolution. We consider sediment samples taken during a large-scale flume experiment that simulated floods representative of both high/flashy peak flows, analogous to runoff-dominated rivers, and sub-threshold flows, analogous to spring-fed rivers. High-discharge flows resulted in narrower grain size distributions, while low flows were less skewed. Relative sediment size showed a clear dependence on distance from source and on the environment's dominant fluid motion. Grain size distributions and sediment transport rates were also quantified in both the wave-dominated nearshore and the aeolian-dominated backshore portions of Long Beach Peninsula, Washington, during SEDEX2, the Sandbar-aEolian-Dune EXchange Experiment of summer 2016. The distributions showed spatial patterns in mean grain size, skewness, and kurtosis dependent on the dominant sediment transport process. The feedback between these grain size distributions and the predominant driver of sediment transport controls the potential for geomorphic change on societally relevant time scales in multiple settings.

  6. Extreme Environment Simulation - Current and New Capabilities to Simulate Venus and Other Planetary Bodies

    NASA Technical Reports Server (NTRS)

    Kremic, Tibor; Vento, Dan; Lalli, Nick; Palinski, Timothy

    2014-01-01

    Science, technology, and planetary mission communities have a growing interest in components and systems that are capable of working in extreme (high) temperature and pressure conditions. Terrestrial applications range from scientific research, aerospace, defense, and automotive systems to energy storage and power distribution, deep mining, and others. As the target environments become increasingly extreme, capabilities to develop and test the sensors and systems designed to operate in them will be required. An application of particular importance to the planetary science community is the ability for a robotic lander to survive on the Venus surface, where pressures are nearly 100 times that of Earth and temperatures approach 500 °C. The scientific importance and relevance of Venus missions are stated in the current Planetary Decadal Survey, and several missions to Venus were proposed in the most recent Discovery call. Despite this interest, the ability to accurately simulate Venus conditions at a scale that can test and validate instruments and spacecraft systems, and to accurately reproduce the Venus atmosphere, has been lacking. This paper discusses and compares the capabilities known to exist within and outside the United States to simulate the extreme environmental conditions found at terrestrial and planetary surfaces, including the Venus atmosphere and surface. The paper then focuses on the additional capability recently provided by the NASA Glenn Extreme Environment Rig (GEER). The GEER, located at the NASA Glenn Research Center in Cleveland, Ohio, is designed not only to simulate the temperature and pressure extremes described but also to accurately reproduce the atmospheric compositions of bodies in the solar system, including those with acidic and hazardous constituents. GEER capabilities and characteristics are described along with operational considerations relevant to potential users.
The paper presents initial operating results and concludes with a sampling of investigations or tests that have been requested or expected.

  7. First-Principles Monte Carlo Simulations of Reaction Equilibria in Compressed Vapors

    PubMed Central

    2016-01-01

    Predictive modeling of reaction equilibria presents one of the grand challenges in the field of molecular simulation. Difficulties in the study of such systems arise from the need (i) to accurately model both strong, short-ranged interactions leading to the formation of chemical bonds and weak interactions arising from the environment, and (ii) to sample the range of time scales involving frequent molecular collisions, slow diffusion, and infrequent reactive events. Here we present a novel reactive first-principles Monte Carlo (RxFPMC) approach that allows for investigation of reaction equilibria without the need to prespecify a set of chemical reactions and their ideal-gas equilibrium constants. We apply RxFPMC to investigate a nitrogen/oxygen mixture at T = 3000 K and p = 30 GPa, i.e., conditions that are present in atmospheric lightning strikes and explosions. The RxFPMC simulations show that the solvation environment leads to a significantly enhanced NO concentration that reaches a maximum when oxygen is present in slight excess. In addition, the RxFPMC simulations indicate the formation of NO2 and N2O in mole fractions approaching 1%, whereas N3 and O3 are not observed. The equilibrium distributions obtained from the RxFPMC simulations agree well with those from a thermochemical computer code parametrized to experimental data. PMID:27413785

  8. Effects of simulated microgravity on Streptococcus mutans physiology and biofilm structure.

    PubMed

    Cheng, Xingqun; Xu, Xin; Chen, Jing; Zhou, Xuedong; Cheng, Lei; Li, Mingyun; Li, Jiyao; Wang, Renke; Jia, Wenxiang; Li, Yu-Qing

    2014-10-01

    Long-term spaceflights will eventually become an inevitable occurrence. Previous studies have indicated that oral infectious diseases, including dental caries, were more prevalent in astronauts due to the effect of microgravity. However, the impact of the space environment, especially the microgravity environment, on the virulence factors of Streptococcus mutans, a major caries-associated bacterium, is yet to be explored. In the present study, we investigated the impact of simulated microgravity on the physiology and biofilm structure of S. mutans. We also explored the dual-species interaction between S. mutans and Streptococcus sanguinis under a simulated microgravity condition. Results indicated that the simulated microgravity condition can enhance the acid tolerance ability, modify the biofilm architecture and extracellular polysaccharide distribution of S. mutans, and increase the proportion of S. mutans within a dual-species biofilm, probably through the regulation of various gene expressions. We hypothesize that the enhanced competitiveness of S. mutans under simulated microgravity may cause a multispecies micro-ecological imbalance, which would result in the initiation of dental caries. Our current findings are consistent with previous studies, which revealed a higher astronaut-associated incidence of caries. Further research is required to explore the detailed mechanisms. © 2014 Federation of European Microbiological Societies. Published by John Wiley & Sons Ltd. All rights reserved.

  9. Communication Simulations for Power System Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fuller, Jason C.; Ciraci, Selim; Daily, Jeffrey A.

    2013-05-29

    New smart grid technologies and concepts, such as dynamic pricing, demand response, dynamic state estimation, and wide area monitoring, protection, and control, are expected to require considerable communication resources. As the cost of retrofit can be high, future power grids will require the integration of high-speed, secure connections with legacy communication systems, while still providing adequate system control and security. While considerable work has been performed to create co-simulators for the power domain with load models and market operations, limited work has been performed in integrating communications directly into a power domain solver. The simulation of communication and power systems will become more important as the two systems become more inter-related. This paper will discuss ongoing work at Pacific Northwest National Laboratory to create a flexible, high-speed power and communication system co-simulator for smart grid applications. The framework for the software will be described, including architecture considerations for modular, high performance computing and large-scale scalability (serialization, load balancing, partitioning, cross-platform support, etc.). The current simulator supports the ns-3 (telecommunications) and GridLAB-D (distribution systems) simulators. Ongoing and future work will be described, including planned future expansions for a traditional transmission solver. A test case using the co-simulator, built on a transactive demand response system created for the Olympic Peninsula and AEP gridSMART demonstrations and requiring two-way communication between distributed and centralized market devices, will be used to demonstrate the value and intended purpose of the co-simulation environment.
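    The coupling pattern described above can be sketched, in heavily simplified form, as a time-stepped loop in which a power model and a communication model exchange messages once per synchronization step. Everything in this toy example is illustrative (the dynamics, the fixed one-step latency, the numbers); real co-simulation frameworks coordinate time through a federation layer rather than a single loop.

```python
def power_step(load_setpoint):
    """Toy distribution solver: per-unit voltage sags slightly as load rises."""
    return 1.0 - 0.05 * load_setpoint

def network_deliver(message, latency_steps, queue):
    """Toy network: enqueue a message, return everything whose delay has
    expired, and age the remaining messages by one step."""
    queue.append((latency_steps, message))
    delivered = [msg for ttl, msg in queue if ttl <= 0]
    queue[:] = [(ttl - 1, msg) for ttl, msg in queue if ttl > 0]
    return delivered

queue, setpoint, voltages = [], 0.0, []
for step in range(5):
    # A market device sends a rising load setpoint through the simulated
    # network each step; it arrives one synchronization step later.
    for msg in network_deliver(0.2 * step, latency_steps=1, queue=queue):
        setpoint = msg
    voltages.append(power_step(setpoint))
print(voltages)  # the one-step latency delays the voltage response
```

    Even this toy shows why co-simulation matters: communication latency shifts when control actions take effect in the power model, which a power-only simulation cannot capture.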

  10. Numerical simulation of a passive scalar transport from thermal power plants

    NASA Astrophysics Data System (ADS)

    Issakhov, Alibek; Baitureyeva, Aiymzhan

    2017-06-01

    The active development of industry leads to an increase in the number of factories, plants, thermal power plants, and nuclear power plants, thereby increasing the amount of emissions into the atmosphere. Harmful chemicals are deposited on the soil surface or remain in the atmosphere, which leads to a variety of environmental problems harmful to human health, the environment, flora, and fauna. Given these problems, it is very important to control emissions to keep them at a level acceptable for the environment, and to do so it is necessary to investigate the spread of harmful emissions. The best way to assess this is numerical simulation of the motion of gaseous substances. In the present work, a numerical simulation of the spreading of emissions from a thermal power plant chimney is considered. The model takes into account the physical properties of the emitted substances and allows the distribution of mass fractions to be calculated as a function of wind velocity and emission composition. The simulations were performed using the ANSYS Fluent software package, and the resulting numerical results and graphs are presented.
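    As a much-simplified illustration of the transport problem described above, a passive scalar released from a point source in a steady wind can be modeled with a one-dimensional advection-diffusion equation. The sketch below uses an explicit first-order upwind scheme; all coefficients are arbitrary, and this is not the ANSYS Fluent model used in the study.

```python
import numpy as np

nx, length = 200, 100.0       # grid points, domain length (m)
dx = length / nx
u, D = 2.0, 0.5               # wind speed (m/s), diffusivity (m^2/s)
dt = 0.4 * min(dx / u, dx * dx / (2 * D))   # conservative explicit time step

c = np.zeros(nx)              # scalar concentration (arbitrary units)
source = 5                    # emission cell (chimney location)
for _ in range(300):
    c[source] += dt           # constant emission of 1 unit/s at the source
    adv = -u * (c - np.roll(c, 1)) / dx                       # first-order upwind
    dif = D * (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx**2
    c += dt * (adv + dif)

# The plume is carried downwind: far more mass ends up downstream of the
# source than upstream of it.
downstream, upstream = c[source + 1:].sum(), c[:source].sum()
print(downstream > upstream)
```

    With these numbers the scheme satisfies both the advective CFL condition and the explicit diffusion limit, so the concentration field stays bounded and non-negative.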

  11. Operating Characteristics of Statistical Methods for Detecting Gene-by-Measured Environment Interaction in the Presence of Gene-Environment Correlation under Violations of Distributional Assumptions.

    PubMed

    Van Hulle, Carol A; Rathouz, Paul J

    2015-02-01

    Accurately identifying interactions between genetic vulnerabilities and environmental factors is of critical importance for genetic research on health and behavior. In the previous work of Van Hulle et al. (Behavior Genetics, Vol. 43, 2013, pp. 71-84), we explored the operating characteristics for a set of biometric (e.g., twin) models of Rathouz et al. (Behavior Genetics, Vol. 38, 2008, pp. 301-315), for testing gene-by-measured environment interaction (GxM) in the presence of gene-by-measured environment correlation (rGM) where data followed the assumed distributional structure. Here we explore the effects that violating distributional assumptions have on the operating characteristics of these same models even when structural model assumptions are correct. We simulated N = 2,000 replicates of n = 1,000 twin pairs under a number of conditions. Non-normality was imposed on either the putative moderator or on the ultimate outcome by ordinalizing or censoring the data. We examined the empirical Type I error rates and compared Bayesian information criterion (BIC) values. In general, non-normality in the putative moderator had little impact on the Type I error rates or BIC comparisons. In contrast, non-normality in the outcome was often mistaken for or masked GxM, especially when the outcome data were censored.
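    The two non-normality manipulations named above, ordinalizing and censoring, are straightforward to reproduce. A minimal sketch of both applied to a simulated normal outcome (the sample size and cut points here are illustrative, not those of the study):

```python
import numpy as np

rng = np.random.default_rng(42)
y = rng.normal(size=1000)            # continuous (normal) outcome, n = 1000

# Censoring: right-censor the outcome by piling all values above a
# threshold at the threshold.
threshold = np.quantile(y, 0.75)
y_censored = np.minimum(y, threshold)

# Ordinalizing: cut the continuous outcome into four ordered categories
# at its empirical quartiles.
cuts = np.quantile(y, [0.25, 0.5, 0.75])
y_ordinal = np.digitize(y, cuts)     # integer codes 0..3

print(float(y_censored.max()), sorted(set(y_ordinal.tolist())))
```

    Fitting the biometric models to `y_censored` or `y_ordinal` in place of `y`, as the study does on a much larger scale, is what reveals whether non-normality in the outcome is mistaken for or masks GxM.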

  12. GATE Monte Carlo simulation of dose distribution using MapReduce in a cloud computing environment.

    PubMed

    Liu, Yangchuan; Tang, Yuguo; Gao, Xin

    2017-12-01

    The GATE Monte Carlo simulation platform has good application prospects in treatment planning and quality assurance, but accurate dose calculation using GATE is time consuming. The purpose of this study is to implement a novel cloud computing method for accurate GATE Monte Carlo simulation of dose distributions using MapReduce. An Amazon Machine Image with Hadoop and GATE installed is created to set up Hadoop clusters on Amazon Elastic Compute Cloud (EC2). Macros, the input files for GATE, are split into a number of self-contained sub-macros. Through Hadoop Streaming, the sub-macros are executed by GATE in Map tasks and the sub-results are aggregated into final outputs in Reduce tasks. As an evaluation, GATE simulations were performed in a cubical water phantom for X-ray photons of 6 and 18 MeV. The parallel simulation on the cloud computing platform is as accurate as the single-threaded simulation on a local server, and the test of Hadoop's fault tolerance showed that simulation correctness is not affected by the failure of some worker nodes. The cloud-based simulation time is approximately inversely proportional to the number of worker nodes: for the simulation of 10 million photons on a cluster with 64 worker nodes, time decreases of 41× and 32× were achieved compared to the single-worker-node case and the single-threaded case, respectively. The results verify that the proposed method provides a feasible cloud computing solution for GATE.
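    The split/aggregate pattern described above can be sketched in a few lines: the photon budget of a macro is divided across self-contained sub-jobs (the Map side), and the per-job tallies are summed into the final dose distribution (the Reduce side). The toy "physics" below is a stand-in for a real GATE dose actor, and all names and numbers are illustrative.

```python
import random

def split_macro(total_photons, n_jobs):
    """Divide the photon budget as evenly as possible across sub-macros."""
    base, extra = divmod(total_photons, n_jobs)
    return [base + (1 if i < extra else 0) for i in range(n_jobs)]

def run_submacro(n_photons, seed, n_voxels=10):
    """Map task: deposit each photon's 'dose' in a random voxel (toy physics)."""
    rng = random.Random(seed)
    tally = [0] * n_voxels
    for _ in range(n_photons):
        tally[rng.randrange(n_voxels)] += 1
    return tally

def reduce_tallies(tallies):
    """Reduce task: element-wise sum of the per-job voxel tallies."""
    return [sum(col) for col in zip(*tallies)]

jobs = split_macro(total_photons=10_000, n_jobs=8)
tallies = [run_submacro(n, seed=i) for i, n in enumerate(jobs)]
dose = reduce_tallies(tallies)
print(sum(jobs), sum(dose))  # both 10000: no photons lost in the split/merge
```

    Because each sub-macro is self-contained, a failed job can simply be re-run and its tally merged later, which is the property behind the fault tolerance reported in the abstract.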

  13. Simulation of Triple Oxidation Ditch Wastewater Treatment Process

    NASA Astrophysics Data System (ADS)

    Yang, Yue; Zhang, Jinsong; Liu, Lixiang; Hu, Yongfeng; Xu, Ziming

    2010-11-01

    This paper presents the modeling mechanism and method for a sewage treatment system. A triple oxidation ditch process of a WWTP was simulated with GPS-X software based on the activated sludge model ASM2D. To identify an adequate model structure for implementation in the GPS-X environment, the oxidation ditch was divided into several completely stirred tank reactors according to the distribution of aeration devices and dissolved oxygen concentration. The removal efficiencies of COD, ammonia nitrogen, total nitrogen, total phosphorus and SS were simulated in GPS-X using influent quality data of this WWTP from June to August 2009, and the simulated results were compared with the actual results. The results showed that the simulated values reflected the actual condition of the triple oxidation ditch process well, and that mathematical modeling is an appropriate method for predicting effluent quality and optimizing the process.

  14. [Not Available].

    PubMed

    Pecevski, Dejan; Natschläger, Thomas; Schuch, Klaus

    2009-01-01

    The Parallel Circuit SIMulator (PCSIM) is a software package for simulation of neural circuits. It is primarily designed for distributed simulation of large scale networks of spiking point neurons. Although its computational core is written in C++, PCSIM's primary interface is implemented in the Python programming language, which is a powerful programming environment and allows the user to easily integrate the neural circuit simulator with data analysis and visualization tools to manage the full neural modeling life cycle. The main focus of this paper is to describe PCSIM's full integration into Python and the benefits thereof. In particular we will investigate how the automatically generated bidirectional interface and PCSIM's object-oriented modular framework enable the user to adopt a hybrid modeling approach: using and extending PCSIM's functionality either employing pure Python or C++ and thus combining the advantages of both worlds. Furthermore, we describe several supplementary PCSIM packages written in pure Python and tailored towards setting up and analyzing neural simulations.

  15. Information Foraging and Change Detection for Automated Science Exploration

    NASA Technical Reports Server (NTRS)

    Furlong, P. Michael; Dille, Michael

    2016-01-01

    This paper presents a new algorithm for autonomous on-line exploration in unknown environments. The objective is to free remote scientists from possibly-infeasible extensive preliminary site investigation prior to sending robotic agents. We simulate a common exploration task for an autonomous robot sampling the environment at various locations and compare performance against simpler control strategies. An extension is proposed and evaluated that further permits operation in the presence of environmental variability in which the robot encounters a change in the distribution underlying sampling targets. Experimental results indicate a strong improvement in performance across varied parameter choices for the scenario.

  16. The CSM testbed software system: A development environment for structural analysis methods on the NAS CRAY-2

    NASA Technical Reports Server (NTRS)

    Gillian, Ronnie E.; Lotts, Christine G.

    1988-01-01

    The Computational Structural Mechanics (CSM) Activity at Langley Research Center is developing methods for structural analysis on modern computers. To facilitate that research effort, an applications development environment has been constructed to insulate the researcher from the many computer operating systems of a widely distributed computer network. The CSM Testbed development system was ported to the Numerical Aerodynamic Simulator (NAS) Cray-2, at the Ames Research Center, to provide a high end computational capability. This paper describes the implementation experiences, the resulting capability, and the future directions for the Testbed on supercomputers.

  17. Controlling multiple security robots in a warehouse environment

    NASA Technical Reports Server (NTRS)

    Everett, H. R.; Gilbreath, G. A.; Heath-Pastore, T. A.; Laird, R. T.

    1994-01-01

    The Naval Command Control and Ocean Surveillance Center (NCCOSC) has developed an architecture to provide coordinated control of multiple autonomous vehicles from a single host console. The multiple robot host architecture (MRHA) is a distributed multiprocessing system that can be expanded to accommodate as many as 32 robots. The initial application will employ eight Cybermotion K2A Navmaster robots configured as remote security platforms in support of the Mobile Detection Assessment and Response System (MDARS) Program. This paper discusses developmental testing of the MRHA in an operational warehouse environment, with two actual and four simulated robotic platforms.

  18. Methods for Combining Payload Parameter Variations with Input Environment

    NASA Technical Reports Server (NTRS)

    Merchant, D. H.; Straayer, J. W.

    1975-01-01

    Methods are presented for calculating design limit loads compatible with probabilistic structural design criteria. The approach is based on the concept that the desired limit load, defined as the largest load occurring in a mission, is a random variable having a specific probability distribution which may be determined from extreme-value theory. The design limit load, defined as a particular value of this random limit load, is the value conventionally used in structural design. Methods are presented for determining the limit load probability distributions from both time-domain and frequency-domain dynamic load simulations. Numerical demonstrations of the methods are also presented.
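    As a toy illustration of that extreme-value step, one can assume the mission limit load follows a Gumbel (type I extreme-value) distribution and read the design limit load off as a quantile; the distribution choice and all parameter values here are illustrative, not taken from the report:

```python
import math

# Illustrative extreme-value calculation: model the mission limit load as
# Gumbel-distributed and take the design limit load as the value exceeded
# with a chosen small probability. mu, beta and the 1% level are made up.
def gumbel_quantile(p, mu, beta):
    """Inverse CDF of the Gumbel (type I extreme-value) distribution."""
    return mu - beta * math.log(-math.log(p))

def design_limit_load(mu, beta, exceedance_prob=0.01):
    """Load exceeded in a mission with probability `exceedance_prob`."""
    return gumbel_quantile(1.0 - exceedance_prob, mu, beta)

limit_load = design_limit_load(mu=100.0, beta=8.0)   # e.g. in kN
```

    Tightening the exceedance probability pushes the design value further into the tail, which is exactly the trade-off a probabilistic design criterion makes explicit.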

  19. Distributed automatic control of technological processes in conditions of weightlessness

    NASA Technical Reports Server (NTRS)

    Kukhtenko, A. I.; Merkulov, V. I.; Samoylenko, Y. I.; Ladikov-Royev, Y. P.

    1986-01-01

    Some problems associated with the automatic control of liquid metal and plasma systems under conditions of weightlessness are examined, with particular reference to the problem of stability of liquid equilibrium configurations. The theoretical fundamentals of automatic control of processes in electrically conducting continuous media are outlined, and means of using electromagnetic fields for simulating technological processes in a space environment are discussed.

  20. Situational Awareness Issues in the Implementation of Datalink: Shared Situational Awareness in the Joint Flight Deck-ATC Aviation System

    NASA Technical Reports Server (NTRS)

    Hansman, Robert John, Jr.

    1999-01-01

    MIT has investigated Situational Awareness issues relating to the implementation of Datalink in the Air Traffic Control environment for a number of years under this grant activity. This work has investigated: 1) The Effect of "Party Line" Information. 2) The Effect of Datalink-Enabled Automated Flight Management Systems (FMS) on Flight Crew Situational Awareness. 3) The Effect of Cockpit Display of Traffic Information (CDTI) on Situational Awareness During Close Parallel Approaches. 4) Analysis of Flight Path Management Functions in Current and Future ATM Environments. 5) Human Performance Models in Advanced ATC Automation: Flight Crew and Air Traffic Controllers. 6) CDTI of Datalink-Based Intent Information in Advanced ATC Environments. 7) Shared Situational Awareness between the Flight Deck and ATC in Datalink-Enabled Environments. 8) Analysis of Pilot and Controller Shared SA Requirements & Issues. 9) Development of Robust Scenario Generation and Distributed Simulation Techniques for Flight Deck ATC Simulation. 10) Methods of Testing Situation Awareness Using Testable Response Techniques. The work is detailed in specific technical reports that are listed in the following bibliography, and are attached as an appendix to the master final technical report.

  1. Multi-physics simulations of space weather

    NASA Astrophysics Data System (ADS)

    Gombosi, Tamas; Toth, Gabor; Sokolov, Igor; de Zeeuw, Darren; van der Holst, Bart; Cohen, Ofer; Glocer, Alex; Manchester, Ward, IV; Ridley, Aaron

    Presently magnetohydrodynamic (MHD) models represent the "workhorse" technology for simulating the space environment from the solar corona to the ionosphere. While these models are very successful in describing many important phenomena, they are based on a low-order moment approximation of the phase-space distribution function. In the last decade our group at the Center for Space Environment Modeling (CSEM) has developed the Space Weather Modeling Framework (SWMF) that efficiently couples together different models describing the interacting regions of the space environment. Many of these domain models (such as the global solar corona, the inner heliosphere or the global magnetosphere) are based on MHD and are represented by our multiphysics code, BATS-R-US. BATS-R-US can solve the equations of "standard" ideal MHD, but it can also go beyond this first approximation. It can solve resistive MHD, Hall MHD, semi-relativistic MHD (that keeps the displacement current), multispecies (different ion species have different continuity equations) and multifluid (all ion species have separate continuity, momentum and energy equations) MHD. Recently we added two-fluid Hall MHD (solving the electron and ion energy equations separately) and are working on extended magnetohydrodynamics with anisotropic pressures. This talk will show the effects of added physics and compare space weather simulation results to "standard" ideal MHD.

  2. MATSIM: Development of a Voxel Model of the MATROSHKA Astronaut Dosimetric Phantom

    NASA Astrophysics Data System (ADS)

    Beck, Peter; Zechner, Andrea; Rollet, Sofia; Berger, Thomas; Bergmann, Robert; Hajek, Michael; Hranitzky, Christian; Latocha, Marcin; Reitz, Günther; Stadtmann, Hannes; Vana, Norbert; Wind, Michael

    2011-08-01

    The AIT Austrian Institute of Technology coordinates the project MATSIM (MATROSHKA Simulation), in collaboration with the Vienna University of Technology and the German Aerospace Center, to perform FLUKA Monte Carlo simulations of the MATROSHKA numerical phantom irradiated under reference radiation field conditions as well as in the radiation environment of the International Space Station (ISS). MATSIM is carried out as a co-investigation of the ESA ELIPS projects SORD and RADIS (commonly known as MATROSHKA), an international collaboration of more than 18 research institutes and space agencies from all over the world, under the science and project lead of the German Aerospace Center. In MATSIM, a computed tomography scan of the MATROSHKA phantom has been converted into a high-resolution 3-dimensional voxel model. The energy imparted and the absorbed dose distribution inside the model are determined for various radiation fields. The major goal of the MATSIM project is the validation of the numerical model under reference radiation conditions and further investigations under the radiation environment at ISS. In this report we compare depth dose distributions inside the phantom, measured with thermoluminescence detectors (TLDs) and an ionization chamber, with FLUKA Monte Carlo particle transport simulations for 60Co photon exposure. Further reference irradiations with neutrons, protons and heavy ions are planned. The fully validated numerical model MATSIM will provide a powerful tool to assess the radiation exposure to humans during current and future space missions to the ISS, Moon, Mars and beyond.

  3. Simulated effects of host fish distribution on juvenile unionid mussel dispersal in a large river

    USGS Publications Warehouse

    Daraio, J.A.; Weber, L.J.; Zigler, S.J.; Newton, T.J.; Nestler, J.M.

    2012-01-01

    Larval mussels (Family Unionidae) are obligate parasites on fish, and after excystment from their host, as juveniles, they are transported with flow. We know relatively little about the mechanisms that affect dispersal and subsequent settlement of juvenile mussels in large rivers. We used a three-dimensional hydrodynamic model of a reach of the Upper Mississippi River with stochastic Lagrangian particle tracking to simulate juvenile dispersal. Sensitivity analyses were used to determine the importance of excystment location in two-dimensional space (lateral and longitudinal) and to assess the effects of vertical location (depth in the water column) on dispersal distances and juvenile settling distributions. In our simulations, greater than 50% of juvenile mussels settled on the river bottom within 500 m of their point of excystment, regardless of the vertical location of the fish in the water column. Dispersal distances were most variable in environments with higher velocity and high gradients in velocity, such as along channel margins, near the channel bed, or where effects of river bed morphology caused large changes in hydraulics. Both dispersal distance and its variance were greater when juvenile excystment occurred in areas where vertical velocity (w) was positive (indicating an upward velocity) than when w was negative. Juvenile dispersal distance is likely to be more variable for mussel species whose hosts inhabit areas with steeper velocity gradients (e.g. channel margins) than for those whose hosts generally inhabit low-flow environments (e.g. impounded areas).
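    The Lagrangian tracking idea can be caricatured in one dimension: advect a particle downstream while it settles and diffuses vertically, and record where it first touches the bed. The velocities, settling rate, and diffusion strength below are invented for illustration and are unrelated to the Upper Mississippi model:

```python
import random

# One-dimensional caricature of stochastic Lagrangian particle tracking:
# a juvenile released at mid-depth is advected downstream at velocity u
# while settling at w_settle and diffusing vertically; dispersal distance
# is where it first reaches the bed (z = 0). All parameters are invented.
def dispersal_distance(depth=2.0, u=0.5, w_settle=0.01, sigma=0.005,
                       dt=1.0, seed=0):
    rng = random.Random(seed)
    x, z = 0.0, depth / 2.0
    while z > 0.0:
        x += u * dt                              # downstream advection
        z -= w_settle * dt                       # gravitational settling
        z += rng.gauss(0.0, sigma) * dt ** 0.5   # vertical turbulent mixing
    return x

distances = [dispersal_distance(seed=s) for s in range(50)]
```

    With these numbers a particle needs roughly depth / (2 × w_settle) = 100 steps to settle, so distances cluster near 50 m; increasing sigma relative to w_settle widens the spread, mirroring the sensitivity to velocity gradients reported above.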

  4. Low-cost real-time 3D PC distributed-interactive-simulation (DIS) application for C4I

    NASA Astrophysics Data System (ADS)

    Gonthier, David L.; Veron, Harry

    1998-04-01

    A 3D Distributed Interactive Simulation (DIS) application was developed and demonstrated in a PC environment. The application can run in stealth mode or as a player alongside battlefield simulations such as ModSAF. PCs can be clustered together, though not necessarily collocated, to run a simulation or training exercise on their own. A 3D perspective view of the battlefield is displayed that includes terrain, trees, buildings and other objects supported by the DIS application. Screen update rates of 15 to 20 frames per second have been achieved with fully lit and textured scenes, providing high-quality, fast graphics. A complete PC system can be configured for under $2,500. The software runs under Windows 95 and Windows NT. It is written in C++ and uses a commercial API called RenderWare for 3D rendering, along with the Microsoft Foundation Classes and Microsoft DirectPlay for joystick input. The RenderWare libraries enhance performance through optimization for MMX and the Pentium Pro processor, and RenderWare is paired with the Righteous 3D graphics board from Orchid Technologies, which has an advertised rendering rate of up to 2 million texture-mapped triangles per second. A low-cost PC DIS simulator that can take part in a real-time collaborative simulation with other platforms is thus achieved.

  5. Coordinating teams of autonomous vehicles: an architectural perspective

    NASA Astrophysics Data System (ADS)

    Czichon, Cary; Peterson, Robert W.; Mettala, Erik G.; Vondrak, Ivo

    2005-05-01

    In defense-related robotics research, a mission level integration gap exists between mission tasks (tactical) performed by ground, sea, or air applications and elementary behaviors enacted by processing, communications, sensors, and weaponry resources (platform specific). The gap spans ensemble (heterogeneous team) behaviors, automatic MOE/MOP tracking, and tactical task modeling/simulation for virtual and mixed teams comprised of robotic and human combatants. This study surveys robotic system architectures, compares approaches for navigating problem/state spaces by autonomous systems, describes an architecture for an integrated, repository-based modeling, simulation, and execution environment, and outlines a multi-tiered scheme for robotic behavior components that is agent-based, platform-independent, and extendable via plug-ins. Tools for this integrated environment, along with a distributed agent framework for collaborative task performance are being developed by a U.S. Army funded SBIR project (RDECOM Contract N61339-04-C-0005).

  6. Particle Laden Turbulence in a Radiation Environment Using a Portable High Performance Solver Based on the Legion Runtime System

    NASA Astrophysics Data System (ADS)

    Torres, Hilario; Iaccarino, Gianluca

    2017-11-01

    Soleil-X is a multi-physics solver being developed at Stanford University as a part of the Predictive Science Academic Alliance Program II. Our goal is to conduct high fidelity simulations of particle-laden turbulent flows in a radiation environment for solar energy receiver applications, as well as to demonstrate our readiness to effectively utilize next-generation Exascale machines. The novel aspect of Soleil-X is that it is built upon the Legion runtime system to enable easy portability to different parallel distributed heterogeneous architectures while also being written entirely in high-level/high-productivity languages (Ebb and Regent). An overview of the Soleil-X software architecture will be given. Results from coupled fluid flow, Lagrangian point particle tracking, and thermal radiation simulations will be presented. Performance diagnostic tools and metrics corresponding to the same cases will also be discussed. US Department of Energy, National Nuclear Security Administration.

  7. Thermodynamic properties of water in confined environments: a Monte Carlo study

    NASA Astrophysics Data System (ADS)

    Gladovic, Martin; Bren, Urban; Urbic, Tomaž

    2018-05-01

    Monte Carlo simulations of Mercedes-Benz water in a crowded environment were performed. The simulated systems are representative of both composite, porous or sintered materials and living cells with typical matrix packings. We studied the influence of overall temperature as well as the density and size of matrix particles on water density, particle distributions, hydrogen bond formation and thermodynamic quantities. Interestingly, temperature and space occupancy of matrix exhibit a similar effect on water properties following the competition between the kinetic and the potential energy of the system, whereby temperature increases the kinetic and matrix packing decreases the potential contribution. A novel thermodynamic decomposition approach was applied to gain insight into individual contributions of different types of inter-particle interactions. This decomposition proved to be useful and in good agreement with the total thermodynamic quantities especially at higher temperatures and matrix packings, where higher-order potential-energy mixing terms lose their importance.

  8. Distributed Processing System for Restoration of Electric Power Distribution Network Using Two-Layered Contract Net Protocol

    NASA Astrophysics Data System (ADS)

    Kodama, Yu; Hamagami, Tomoki

    A distributed processing system for restoration of an electric power distribution network using a two-layered Contract Net Protocol (CNP) is proposed. The goal of this study is to develop a restoration system suited to future power networks with distributed generators. The novelty of this study is that the two-layered CNP is applied to a distributed computing environment in practical use. The two-layered CNP has two classes of agents in the network, named field agents and operating agents. In order to avoid conflicts between tasks, the operating agent controls the privilege that allows managers to send task announcement messages in CNP. This technique realizes coordination between agents that work asynchronously, in parallel with others. Moreover, this study implements the distributed processing system using a de facto standard multi-agent framework, JADE (Java Agent DEvelopment framework). This study conducts simulation experiments on power distribution network restoration and compares the proposed system with the previous system. The results confirm the effectiveness of the proposed system.
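    A minimal sketch of one announce-bid-award cycle with the operating agent's privilege control might look as follows; the class, agent and task names are hypothetical, and this is a sketch of the general contract-net idea rather than the authors' JADE implementation:

```python
# Hedged sketch of one two-layer contract-net round: the operating agent
# grants the announce privilege to a single manager, field agents bid on
# the announced restoration task, and the cheapest bid wins the contract.
class OperatingAgent:
    """Upper layer: serializes task announcements to avoid conflicts."""
    def __init__(self):
        self._holder = None
    def request_privilege(self, manager):
        if self._holder is None:
            self._holder = manager
            return True
        return False
    def release_privilege(self, manager):
        if self._holder == manager:
            self._holder = None

def contract_net_round(operating, manager, field_agents, task):
    """Announce `task`, collect bids from field agents, award the contract."""
    if not operating.request_privilege(manager):
        return None                        # another manager is announcing
    bids = {name: cost(task) for name, cost in field_agents.items()}
    winner = min(bids, key=bids.get)       # lowest-cost bid wins
    operating.release_privilege(manager)
    return winner

op = OperatingAgent()
agents = {"FA1": lambda t: 3.0, "FA2": lambda t: 1.0, "FA3": lambda t: 2.0}
winner = contract_net_round(op, "manager-A", agents, "restore-feeder-7")
```

    Serializing the announce privilege is what lets managers work asynchronously without two of them contracting the same field agent for conflicting tasks.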

  9. BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments.

    PubMed

    Thomas, Brandon R; Chylek, Lily A; Colvin, Joshua; Sirimulla, Suman; Clayton, Andrew H A; Hlavacek, William S; Posner, Richard G

    2016-03-01

    Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. Here, we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e. optimization of parameter values for consistency with data) when simulations are computationally expensive. BioNetFit can be used on stand-alone Mac, Windows/Cygwin, and Linux platforms and on Linux-based clusters running SLURM, Torque/PBS, or SGE. The BioNetFit source code (Perl) is freely available (http://bionetfit.nau.edu). Supplementary data are available at Bioinformatics online.

  10. Temperature and composition dependence of short-range order and entropy, and statistics of bond length: the semiconductor alloy (GaN)(1-x)(ZnO)(x).

    PubMed

    Liu, Jian; Pedroza, Luana S; Misch, Carissa; Fernández-Serra, Maria V; Allen, Philip B

    2014-07-09

    We present total energy and force calculations for the (GaN)1-x(ZnO)x alloy. Site-occupancy configurations are generated from Monte Carlo (MC) simulations, on the basis of a cluster expansion model proposed in a previous study. Local atomic coordinate relaxations of surprisingly large magnitude are found via density-functional calculations using a 432-atom periodic supercell, for three representative configurations at x = 0.5. These are used to generate bond-length distributions. The configurationally averaged composition- and temperature-dependent short-range order (SRO) parameters of the alloys are discussed. The entropy is approximated in terms of pair distribution statistics and thus related to SRO parameters. This approximate entropy is compared with accurate numerical values from MC simulations. An empirical model for the dependence of the bond length on the local chemical environments is proposed.

  11. Channel MAC Protocol for Opportunistic Communication in Ad Hoc Wireless Networks

    NASA Astrophysics Data System (ADS)

    Ashraf, Manzur; Jayasuriya, Aruna; Perreau, Sylvie

    2008-12-01

    Despite significant research effort, the performance of distributed medium access control methods has failed to meet theoretical expectations. This paper proposes a protocol named "Channel MAC" performing a fully distributed medium access control based on opportunistic communication principles. In this protocol, nodes access the channel when the channel quality increases beyond a threshold, while neighbouring nodes are deemed to be silent. Once a node starts transmitting, it will keep transmitting until the channel becomes "bad." We derive an analytical throughput limit for Channel MAC in a shared multiple access environment. Furthermore, three performance metrics of Channel MAC—throughput, fairness, and delay—are analysed in single hop and multihop scenarios using NS2 simulations. The simulation results show throughput performance improvement of up to 130% with Channel MAC over IEEE 802.11. We also show that the severe resource starvation problem (unfairness) of IEEE 802.11 in some network scenarios is reduced by the Channel MAC mechanism.
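    The access rule described above can be mimicked with a toy time-correlated fading trace: a node transmits exactly while its channel gain stays above a threshold, starting on an upward crossing and stopping when the channel goes "bad". The AR(1) fading model and threshold value are invented assumptions, not the paper's channel model:

```python
import random

# Toy trace of the Channel MAC rule: transmit exactly during excursions of
# the (time-correlated) channel gain above a quality threshold.
def channel_mac_trace(n_slots=10_000, threshold=1.0, seed=7):
    rng = random.Random(seed)
    h, transmitting, trace = 1.0, False, []
    for _ in range(n_slots):
        h = 0.95 * h + rng.gauss(0.0, 0.3)   # correlated fading amplitude
        gain = h * h                         # instantaneous power gain
        if not transmitting and gain > threshold:
            transmitting = True              # channel turned good: start sending
        elif transmitting and gain <= threshold:
            transmitting = False             # channel turned bad: fall silent
        trace.append((gain, transmitting))
    return trace

trace = channel_mac_trace()
busy_fraction = sum(tx for _, tx in trace) / len(trace)
```

    Because the fading is time-correlated, transmissions come in bursts spanning whole above-threshold excursions, which is the opportunistic behavior the protocol exploits.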

  12. Numerical modelling of temporal and spatial patterns of petroleum hydrocarbons concentration in the Bohai Sea.

    PubMed

    Guo, Weijun; Wu, Guoxiang; Xu, Tiaojian; Li, Xueyan; Ren, Xiaozhong; Hao, Yanni

    2018-02-01

    The discharge of petroleum hydrocarbons (PHs; ~10,000 tons annually) into the Bohai Sea, a shallow inland sea in China, presents a serious threat to the marine environment. To evaluate the effects of PHs pollution and estimate the corresponding environmental capacity, we developed a genetic algorithm-based coupled hydrodynamic/transport model for simulating the evolution and distribution of PHs concentration from July 2006 to October 2007, with the predicted values being in good agreement with monitoring results. Importantly, the mean PHs concentrations and seasonal concentration variations were primarily determined by external loading; currents were shown to drive PHs transport, reconfigure local PHs patterns, and increase PHs concentration in water masses, even at large distances from discharge sources. The developed model can realistically simulate PHs distribution and evolution, and is thus a useful tool for estimating the seasonal environmental capacity of PHs.

  13. Red mud flocculation process in alumina production

    NASA Astrophysics Data System (ADS)

    Fedorova, E. R.; Firsov, A. Yu

    2018-05-01

    The process of thickening and washing red mud is a bottleneck of alumina production. Existing automated control systems for the thickening process involve stabilizing the parameters of the primary technological circuits of the thickener. A current direction of research is the creation and improvement of models, and of model-based control systems, for the thickening process. However, the known models do not fully account for perturbing effects, in particular the particle size distribution in the feed and the size distribution of floccules after aggregation in the feed barrel. This article is devoted to the basic concepts and terms used in writing the population balance algorithm. The population balance model is implemented in the MatLab environment, and the result of the simulation is the particle size distribution after the flocculation process. This model makes it possible to predict the size distribution of floccules after aggregation of red mud in the feed barrel. The mud of Jamaican bauxite served as an industrial sample of red mud, and a Cytec Industries HX-3000 series flocculant at a concentration of 0.5% was used. The simulations used model constants obtained in a tubular tank in the laboratories of CSIRO (Australia).
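    A discrete population balance of the Smoluchowski type captures the aggregation bookkeeping such models perform; the constant collision kernel and grid size below are assumed purely for illustration (real flocculation models use shear- and size-dependent kernels):

```python
# Sketch of one explicit Euler step of a discrete Smoluchowski population
# balance: floccules of size classes i and j collide at a kernel-set rate
# and form the merged class, conserving total particle mass on the grid.
def aggregation_step(n, kernel=1e-3, dt=1.0):
    """n[k] is the number concentration of floccules made of k+1 primaries."""
    size = len(n)
    dn = [0.0] * size
    for i in range(size):
        for j in range(size):
            rate = kernel * n[i] * n[j]
            dn[i] -= rate                    # class i consumed by collisions
            if i + j + 1 < size:
                dn[i + j + 1] += 0.5 * rate  # birth of the merged class
    return [n[k] + dt * dn[k] for k in range(size)]

n0 = [100.0] + [0.0] * 9   # monodisperse primary particles in the feed barrel
n1 = aggregation_step(n0)
```

    One step moves 10% of the primaries into doublets while conserving total particle mass, as long as the merged classes stay on the size grid.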

  14. Flock Foraging Efficiency in Relation to Food Sensing Ability and Distribution: a Simulation Study

    NASA Astrophysics Data System (ADS)

    Lee, Sang-Hee

    2013-08-01

    Flocking may be an advantageous strategy for acquiring food resources. The degree of advantage is related to two factors: the ability of flock members to detect food resources and the pattern of food distribution in the environment. To understand foraging efficiency as a function of these factors, I constructed a two-dimensional (2D) flocking model incorporating both. At the start of the simulation, food particles were heterogeneously distributed; the heterogeneity, H, was characterized as a value ranging from 0.0 to 1.0. For each flock member, food sensing ability was defined by two variables: sensing distance, R, and sensing angle, θ. Foraging efficiency of a flock was defined as the time, τ, required for a flock to consume all the available food resources. Simulation results showed that flock foraging was most efficient when individuals had an intermediate sensing ability (R = 60), with efficiency decreasing for low (R < 60) and high (R > 60) sensing ability. When R > 60, patterns in foraging efficiency with increasing sensing distance and food resource aggregation were less consistent. This inconsistency was due to instability of the flock and a higher rate of individuals failing to capture target food resources. In addition, I briefly discuss the benefits obtained by foraging in flocks from an evolutionary perspective.

  15. Performance analysis of air conditioning system and airflow simulation in an operating theater

    NASA Astrophysics Data System (ADS)

    Alhamid, Muhammad Idrus; Budihardjo, Rahmat

    2018-02-01

    Maintaining the performance of a hospital operating theater requires establishing adequate circulation of clean air within the room. Air distribution in such a space should be specified in terms of Air Changes per Hour (ACH) so as to maintain a positive room pressure. The dispersion of airborne particles in the operating theater was governed by regulating the air distribution so that the room meets clean-room standards, i.e., ISO 14644 and ASHRAE 170. Here, we introduced several input parameters into a simulation environment to observe the pressure distribution in the room: air temperature, air velocity, and the volumetric flow rates entering and leaving the room, for both the existing and the designed condition. In the existing operating theater, several problems were observed: the outlet air velocity at the HEPA filter above the operating table was too high, causing a turbulent airflow pattern; the setting temperature of 19°C was too low; and the air supply into the room was below 20 ACH, which is under the standard requirement. Our simulation with the FloVent 8.2™ program showed that both airflow turbulence and the amount of particle contamination could be reduced.

  16. The Radiation Belt Electron Scattering by Magnetosonic Wave: Dependence on Key Parameters

    NASA Astrophysics Data System (ADS)

    Lei, Mingda; Xie, Lun; Li, Jinxing; Pu, Zuyin; Fu, Suiyan; Ni, Binbin; Hua, Man; Chen, Lunjin; Li, Wen

    2017-12-01

    Magnetosonic (MS) waves have been found capable of creating radiation belt electron butterfly distributions in the inner magnetosphere. To investigate the physical nature of the interactions between radiation belt electrons and MS waves, and to explore a preferential condition for MS waves to scatter electrons efficiently, we performed a comprehensive parametric study of MS wave-electron interactions using test particle simulations. The diffusion coefficients simulated by varying the MS wave frequency show that the scattering effect of MS waves is frequency insensitive at low harmonics (f < 20 fcp), which has great implications for modeling the electron scattering caused by MS waves with harmonic structures. The electron scattering caused by MS waves is very sensitive to wave normal angles, and MS waves with off 90° wave normal angles scatter electrons more efficiently. By simulating the diffusion coefficients and the electron phase space density evolution at different L shells under different plasma environment circumstances, we find that MS waves can readily produce electron butterfly distributions in the inner part of the plasmasphere where the ratio of electron plasma-to-gyrofrequency (fpe/fce) is large, while they may essentially form a two-peak distribution outside the plasmapause and in the inner radiation belt where fpe/fce is small.

  17. UGV navigation in wireless sensor and actuator network environments

    NASA Astrophysics Data System (ADS)

    Zhang, Guyu; Li, Jianfeng; Duncan, Christian A.; Kanno, Jinko; Selmic, Rastko R.

    2012-06-01

    We consider a navigation problem in a distributed, self-organized and coordinate-free Wireless Sensor and Actuator Network (WSAN). We first present navigation algorithms that are verified using simulation results. Considering more than one destination and multiple mobile Unmanned Ground Vehicles (UGVs), we introduce a distributed solution to the Multi-UGV, Multi-Destination navigation problem. The objective of the solution to this problem is to efficiently allocate UGVs to different destinations and carry out navigation in the network environment that minimizes total travel distance. The main contribution of this paper is to develop a solution that does not attempt to localize either the UGVs or the sensor and actuator nodes. Other than some connectivity assumptions about the communication graph, we consider that no prior information about the WSAN is available. The solution presented here is distributed, and the UGV navigation is solely based on feedback from neighboring sensor and actuator nodes. One special case discussed in the paper, the Single-UGV, Multi-Destination navigation problem, is essentially equivalent to the well-known and difficult Traveling Salesman Problem (TSP). Simulation results are presented that illustrate the navigation distance traveled through the network. We also introduce an experimental testbed for the realization of coordinate-free and localization-free UGV navigation. We use the Cricket platform as the sensor and actuator network and a Pioneer 3-DX robot as the UGV. The experiments illustrate the UGV navigation in a coordinate-free WSAN environment where the UGV successfully arrives at the assigned destinations.

  18. Ejecta cloud from the AIDA space project kinetic impact on the secondary of a binary asteroid: I. mechanical environment and dynamical model

    NASA Astrophysics Data System (ADS)

    Yu, Yang; Michel, Patrick; Schwartz, Stephen R.; Naidu, Shantanu P.; Benner, Lance A. M.

    2017-01-01

    An understanding of the post-impact dynamics of ejecta clouds is crucial to the planning of a kinetic impact mission to an asteroid, and also has great implications for the history of planetary formation. The purpose of this article is to track the evolution of the ejecta produced by the AIDA mission, which targets for kinetic impact the secondary of the near-Earth binary asteroid (65803) Didymos in 2022, and to feed essential information back into AIDA's ongoing phase-A study. We present a detailed dynamical model for the simulation of an ejecta cloud from a binary asteroid that synthesizes all relevant forces based on a previous analysis of the mechanical environment. We apply our method to gain insight into the expected response of Didymos to the AIDA impact, including the subsequent evolution of debris and dust. Crater scaling relations from laboratory experiments are employed to approximate the distributions of ejecta mass and launch speed. The size distribution of fragments is modeled with a power law fitted from observations of real asteroid surfaces. A full-scale demonstration is simulated using parameters specified by the mission. We report the results of the simulation, which include the computed spread of the ejecta cloud and the recorded history of ejecta accretion and escape. The violent period of ejecta evolution is found to be short, and is followed by a stage in which the remaining ejecta are gradually cleared. Solar radiation pressure proves efficient in clearing dust-sized ejecta, and the simulation results after two weeks show that large debris on polar orbits (perpendicular to the binary orbital plane) has a survival advantage over smaller ejecta and over ejecta that stay at low latitudes.
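
    The power-law fragment size distribution mentioned above can be sampled by inverse-transform sampling of a truncated power law. A minimal sketch, assuming a cumulative distribution N(>d) ∝ d^(-alpha); the cutoff sizes and exponent below are placeholders, not the values fitted in the study.

```python
import random

def sample_fragment_sizes(n, d_min, d_max, alpha, seed=0):
    # Inverse-transform sampling of a truncated power law
    # N(>d) ~ d**-alpha on [d_min, d_max].  Parameters are
    # illustrative, not the AIDA study's fitted values.
    rng = random.Random(seed)
    a = d_min ** -alpha   # largest value of d**-alpha
    b = d_max ** -alpha   # smallest value of d**-alpha
    # u = 0 maps to d_min; u -> 1 maps to d_max.
    return [(a - rng.random() * (a - b)) ** (-1.0 / alpha)
            for _ in range(n)]
```

By construction every sampled size lies in [d_min, d_max], with most of the samples near the small-size cutoff, as expected for a steep power law.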

  19. Three-dimensional modelling of the hydrodynamics of the Southern Bight of the North Sea: first results

    NASA Astrophysics Data System (ADS)

    Ivanov, Evgeny; Capet, Arthur; Barth, Alexander; Delhez, Eric; Soetaert, Karline; Grégoire, Marilaure

    2017-04-01

    In the frame of the Belgian research project FaCE-It (Functional biodiversity in a Changing sedimentary Environment: Implications for biogeochemistry and food webs in a managerial setting), the impact of dredging activities and offshore wind farm installation on the spatial distribution of sediment grain size, biodiversity and biogeochemistry will be estimated in the Southern Bight of the North Sea (SBNS), with a focus on the Belgian Coastal Zone (BCZ). To reach this goal, the three-dimensional hydrodynamic model ROMS-COAWST is implemented in the SBNS in order to simulate the complex hydrodynamics and sediment transport. Two levels of nesting are used to reach a resolution of 250 m in the BCZ. The model is forced at the air-sea interface by the 6-hourly ECMWF ERA-Interim atmospheric dataset and at the open boundaries by the coarse-resolution model results available from CMEMS (Copernicus Marine Environment Monitoring Service); it also considers tides and the 4 main rivers (Scheldt, Rhine with Maas, Thames and Seine). Two types of simulations have been performed: a 10-year climatological simulation and a simulation over 2003-2013 to investigate the interannual dynamics. The model skill is evaluated by comparing its outputs to historical remote sensing and in-situ data (e.g. salinity, temperature and currents). The sediment transport module will then be implemented and its outputs compared to historical and newly collected (in the frame of FaCE-It) observations of grain size distribution, as well as to satellite Suspended Particulate Matter (SPM) images. This will allow assessing the impact of substrate modification due to offshore human activities at local and regional scales.

  20. [Using sequential indicator simulation method to define risk areas of soil heavy metals in farmland].

    PubMed

    Yang, Hao; Song, Ying Qiang; Hu, Yue Ming; Chen, Fei Xiang; Zhang, Rui

    2018-05-01

    Heavy metals in soil have serious impacts on safety, the ecological environment and human health due to their toxicity and accumulation. It is therefore necessary to efficiently identify risk areas of heavy metals in farmland soil, which is of great significance for environmental protection, pollution warning and farmland risk control. We collected 204 samples and analyzed the contents of seven heavy metals (Cu, Zn, Pb, Cd, Cr, As, Hg) in Zengcheng District of Guangzhou, China. In order to overcome problems with the data, including outliers, skewed distributions and the smoothing effect of traditional kriging methods, we used the sequential indicator simulation method (SISIM) to map the spatial distribution of heavy metals, and combined it with the Hakanson index method to identify potential ecological risk areas of heavy metals in farmland. The results showed that: (1) With similar accuracy of spatial prediction of soil heavy metals, the SISIM reproduced fine-scale detail better than ordinary kriging in a small-scale area. Compared to indicator kriging, the SISIM had a lower error rate (4.9%-17.1%) in the uncertainty evaluation of heavy-metal risk identification. The SISIM had less smoothing effect and was more applicable to the spatial uncertainty assessment of soil heavy metals and to risk identification. (2) There was no pollution in Zengcheng's farmland. Moderate potential ecological risk was found in the southern part of the study area due to enterprise production, human activities, and river sediments. This study combined sequential indicator simulation with the Hakanson risk index method, and effectively overcame the outlier information loss and smoothing effect of traditional kriging methods. It provides a new way to identify soil heavy-metal risk areas in farmland under uneven sampling.
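
    The Hakanson risk index combined with SISIM above is a simple weighted sum: for each metal i, a single-metal risk factor E_i = T_i * (C_i / C_ref_i), and an overall index RI = sum of the E_i. A minimal sketch follows; the toxicity factors are the commonly cited Hakanson (1980) values, and the reference (background) concentrations in the usage example are placeholders, not the backgrounds used in the Zengcheng study.

```python
# Commonly cited Hakanson (1980) toxicity response factors.
TOXICITY = {"Cu": 5, "Zn": 1, "Pb": 5, "Cd": 30, "Cr": 2, "As": 10, "Hg": 40}

def risk_index(measured, background):
    # E_i = T_i * C_i / C_ref_i for each metal; RI = sum(E_i).
    # `measured` and `background` map metal symbol -> concentration
    # in the same units.
    e = {m: TOXICITY[m] * measured[m] / background[m] for m in measured}
    return e, sum(e.values())
```

For example, a sample with Cd at three times background and Hg at twice background yields E_Cd = 90, E_Hg = 80, RI = 170.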

  1. The Impact of Infiltration Losses and Model Resolution on the Simulated Hydrometeorological Response of a Semi-Arid Catchment

    NASA Astrophysics Data System (ADS)

    Mitchell, M. F.; Goodrich, D. C.; Gochis, D. J.; Lahmers, T. M.

    2017-12-01

    In semi-arid environments with complex terrain, redistribution of moisture occurs through runoff, stream infiltration, and regional groundwater flow. In semi-arid regions, stream infiltration has been shown to account for 10-40% of total recharge in high-runoff years. These processes can potentially alter land-atmosphere interactions significantly through changes in sensible and latent heat release. However, their overall impact is still unclear, as historical model simulations generally used a coarse grid resolution at which these smaller-scale processes were either parameterized or not accounted for. To improve our understanding of the importance of stream infiltration and our ability to represent it in a coupled land-atmosphere model, this study focuses on the Walnut Gulch Experimental Watershed (WGEW) and Long-Term Agro-ecosystem Research (LTAR) site, surrounding the city of Tombstone, AZ. High-resolution surface precipitation, meteorological forcing and distributed runoff measurements have been collected in WGEW since the 1960s. These data are used as input for WRF-Hydro, a spatially distributed hydrological model that uses the Noah-MP land surface model. Recently, we have implemented an infiltration loss scheme in WRF-Hydro. We will present the performance of WRF-Hydro in accounting for stream infiltration by comparing model simulations with in-situ observations. More specifically, as model performance has been shown to depend on the grid resolution used, we will present WRF-Hydro simulations obtained at different grid resolutions (10-1000 m).

  2. Intelligent computer-aided training authoring environment

    NASA Technical Reports Server (NTRS)

    Way, Robert D.

    1994-01-01

    Although there has been much research into intelligent tutoring systems (ITS), there are few authoring systems available that support ITS metaphors. Instructional developers are generally obliged to use tools designed for creating on-line books. We are currently developing an authoring environment derived from NASA's research on intelligent computer-aided training (ICAT). The ICAT metaphor, currently in use at NASA, has proven effective in disciplines from satellite deployment to high school physics. This technique provides a personal trainer (PT) who instructs the student in a simulated work environment (SWE). The PT acts as a tutor, providing individualized instruction and assistance to each student. Teaching in an SWE allows the student to learn tasks by doing them, rather than by reading about them. This authoring environment will expedite ICAT development by providing a tool set that guides the trainer modeling process. Additionally, it provides a vehicle for distributing NASA's ICAT technology to the private sector.

  3. Numerical simulation and experimental verification of extended source interferometer

    NASA Astrophysics Data System (ADS)

    Hou, Yinlong; Li, Lin; Wang, Shanshan; Wang, Xiao; Zang, Haijun; Zhu, Qiudong

    2013-12-01

    An extended source interferometer, compared with the classical point source interferometer, can suppress coherent noise from the environment and the system, decrease dust scattering effects and reduce the high-frequency error of the reference surface. Numerical simulation and experimental verification of an extended source interferometer are discussed in this paper. In order to provide guidance for the experiment, the extended source interferometer is modeled in the optical design software Zemax. Matlab scripts automatically adjust the field parameters of the optical system and conveniently collect a series of interferometric data; Dynamic Data Exchange (DDE) is used to connect Zemax and Matlab. The visibility of the interference fringes is then calculated by summing the collected interferometric data. Alongside the simulation, an experimental platform for the extended source interferometer was established, consisting of an extended source, an interference cavity and an image collection system. The decrease of the high-frequency error of the reference surface and of the coherent noise of the environment is verified. The relation between the spatial coherence and the size, shape and intensity distribution of the extended source is also verified through analysis of the fringe visibility. The simulation results agree with those of the real extended source interferometer, showing that the model simulates the actual optical interference quite well. The simulation platform can therefore be used to guide interferometer experiments based on various extended sources.
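
    The fringe visibility analyzed above is the standard Michelson contrast of the intensity pattern, V = (I_max - I_min) / (I_max + I_min). A minimal sketch over a sampled intensity trace (the sampling itself is assumed, not part of the paper):

```python
def fringe_visibility(intensity):
    # Michelson fringe visibility of a sampled intensity pattern:
    # V = (I_max - I_min) / (I_max + I_min).  V = 1 for fully
    # modulated fringes, V = 0 for a flat (washed-out) pattern.
    i_max, i_min = max(intensity), min(intensity)
    return (i_max - i_min) / (i_max + i_min)
```

For instance, a pattern oscillating between 3 and 1 intensity units has visibility 0.5; as spatial coherence decreases with a larger source, this value drops toward 0.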

  4. Neutral atom imaging at Mercury

    NASA Astrophysics Data System (ADS)

    Mura, A.; Orsini, S.; Milillo, A.; Di Lellis, A. M.; De Angelis, E.

    2006-02-01

    The feasibility of neutral atom detection and imaging in the Hermean environment is discussed in this study. In particular, we consider those energetic neutral atoms (ENA) whose emission is directly related to solar wind entry into Mercury's magnetosphere. This environment is characterised by a weak magnetic field; thus, the cusp regions are extremely large compared to those of the Earth, and intense proton fluxes are expected there. Our study includes a model of the H+ distribution in space, energy and pitch angle, simulated by means of a single-particle Monte Carlo simulation. Among the processes that could generate neutral atom emission, we focus our attention on charge exchange and ion sputtering, which, in principle, are able to produce directional ENA fluxes. Simulated neutral atom images are investigated in the frame of the neutral particle analyser-ion spectrometer (NPA-IS) SERENA experiment, proposed to fly on board the ESA mission BepiColombo/MPO. The ELENA (emitted low-energy neutral atoms) unit, which is part of this experiment, will be able to detect such fluxes; instrumental details and predicted count rates are given.

  5. Hemispherical reflectance model for passive images in an outdoor environment.

    PubMed

    Kim, Charles C; Thai, Bea; Yamaoka, Neil; Aboutalib, Omar

    2015-05-01

    We present a hemispherical reflectance model for simulating passive images in an outdoor environment where illumination is provided by natural sources such as the sun and clouds. While the bidirectional reflectance distribution function (BRDF) accurately produces the radiance from any object under this illumination, using the BRDF to calculate radiance requires a double integration. Replacing the BRDF by a hemispherical reflectance under the natural sources transforms the double integration into a multiplication, reducing both storage space and computation time. We present the formalism for the radiance of the scene using hemispherical reflectance instead of the BRDF. This enables us to generate passive images in an outdoor environment while taking advantage of the computational and storage efficiencies. We show some examples for illustration.
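
    The saving described above can be illustrated numerically. A sketch, assuming a Lambertian surface under a uniform diffuse sky (my simplification, not the paper's general case): the BRDF route integrates f_r * L_in * cos(theta) over the illumination hemisphere, while the hemispherical-reflectance route collapses to L_out = rho * E / pi, and the two agree.

```python
import math

def radiance_brdf(brdf, sky_radiance, n=64):
    # Double integration over the illumination hemisphere:
    # L_out = integral over (theta, phi) of
    #         f_r(theta, phi) * L_in * cos(theta) * sin(theta).
    dt = (math.pi / 2) / n
    dp = (2 * math.pi) / n
    total = 0.0
    for i in range(n):
        t = (i + 0.5) * dt          # midpoint rule in theta
        for j in range(n):
            p = (j + 0.5) * dp      # midpoint rule in phi
            total += (brdf(t, p) * sky_radiance
                      * math.cos(t) * math.sin(t) * dt * dp)
    return total

def radiance_hemispherical(rho, irradiance):
    # With hemispherical reflectance rho under diffuse illumination,
    # the double integral collapses to a multiplication:
    # L_out = rho * E / pi.
    return rho * irradiance / math.pi
```

For a Lambertian BRDF f_r = rho/pi under unit sky radiance, the irradiance is E = pi, and both routes return rho itself; the multiplication is obviously far cheaper per pixel.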

  6. Simulating cloud environment for HIS backup using secret sharing.

    PubMed

    Kuroda, Tomohiro; Kimura, Eizen; Matsumura, Yasushi; Yamashita, Yoshinori; Hiramatsu, Haruhiko; Kume, Naoto

    2013-01-01

    In the face of a disaster, hospitals are expected to continue providing efficient and high-quality care to patients. It is therefore crucial for hospitals to develop business continuity plans (BCPs) that identify their vulnerabilities and prepare procedures to overcome them. A key aspect of most hospitals' BCPs is backing up hospital information system (HIS) data at multiple remote sites. However, the need to keep the data confidential dramatically increases the cost of such backups. Secret sharing is a method of splitting an original secret message so that individual pieces are meaningless, while putting a sufficient number of pieces together reveals the original message. It allows the creation of pseudo-redundant arrays of independent disks for privacy-sensitive data over the Internet. We developed a secret sharing environment on StarBED, a large-scale network experiment environment, and evaluated its potential and performance during disaster recovery. Simulation results showed that the entire main HIS database of Kyoto University Hospital could be retrieved within three days even if one of the distributed storage systems crashed during a disaster.
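
    The split/recombine property described above is what a (k, n) threshold scheme such as Shamir's secret sharing provides. A minimal sketch over a prime field (this is the classic Shamir construction as illustration, not necessarily the scheme the StarBED system implements):

```python
import random

PRIME = 2**61 - 1  # Mersenne prime; all arithmetic is mod PRIME

def split(secret, n, k, seed=0):
    # Shamir (k, n) sharing: pick a random polynomial of degree k-1
    # with constant term = secret, and hand out its values at
    # x = 1..n.  Any k shares reconstruct; fewer reveal nothing.
    rng = random.Random(seed)
    coeffs = [secret] + [rng.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):  # Horner evaluation mod PRIME
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def combine(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        # Modular inverse of den via Fermat's little theorem.
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret
```

Splitting a secret into 5 shares with threshold 3 means any 3 storage sites can rebuild the database key, so losing one or two sites in a disaster is survivable.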

  7. Simulating variable source problems via post processing of individual particle tallies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bleuel, D.L.; Donahue, R.J.; Ludewigt, B.A.

    2000-10-20

    Monte Carlo is an extremely powerful method of simulating complex, three-dimensional environments without excessive problem simplification. However, it is often time consuming to simulate models in which the source can be highly varied. Similarly difficult are optimization studies involving sources with many variable input parameters, such as particle energy, angle, and spatial distribution. Such studies are often approached using brute force methods or intelligent guesswork. One field in which these problems are often encountered is accelerator-driven Boron Neutron Capture Therapy (BNCT) for the treatment of cancers. Solving the reverse problem of determining the best neutron source for optimal BNCT treatment can be accomplished by separating the time-consuming particle-tracking process of a full Monte Carlo simulation from the calculation of the source weighting factors, which is typically performed at the beginning of a Monte Carlo simulation. By post-processing these weighting factors on a recorded file of individual particle tally information, the effect of changing source variables can be realized in a matter of seconds, instead of requiring hours or days for additional complete simulations. By intelligent source biasing, any number of different source distributions can be calculated quickly from a single Monte Carlo simulation. The source description can be treated as variable, and the effect of changing multiple interdependent source variables on the problem's solution can be determined. Though the focus of this study is on BNCT applications, this procedure may be applicable to any problem that involves a variable source.
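
    The post-processing idea above amounts to importance reweighting: each recorded tally keeps the source variables its particle was born with, and a new source spectrum is applied by scaling each contribution by the ratio of new to old source probability, with no re-tracking. A minimal sketch (record field names and the energy-only source variable are illustrative assumptions):

```python
def reweight_tally(records, new_source_pdf, old_source_pdf):
    # Each record carries the birth source variable (here just
    # "energy") and the particle's tally contribution.  A new
    # source distribution is evaluated by multiplying each
    # contribution by new_pdf / old_pdf -- seconds of work versus
    # hours for a fresh Monte Carlo run.
    total = 0.0
    for r in records:
        w = new_source_pdf(r["energy"]) / old_source_pdf(r["energy"])
        total += w * r["tally"]
    return total
```

Any number of candidate source spectra can then be scored against the same recorded tally file, which is exactly what makes the reverse (source optimization) problem tractable.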

  8. The Fireball integrated code package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dobranich, D.; Powers, D.A.; Harper, F.T.

    1997-07-01

    Many deep-space satellites contain a plutonium heat source. An explosion, during launch, of a rocket carrying such a satellite offers the potential for the release of some of the plutonium. The fireball following such an explosion exposes any released plutonium to a high-temperature, chemically reactive environment. Vaporization, condensation, and agglomeration processes can alter the distribution of plutonium-bearing particles. The Fireball code package simulates the integrated response of the physical and chemical processes occurring in a fireball and the effect these processes have on the plutonium-bearing particle distribution. This integrated treatment of multiple phenomena represents a significant improvement in the state of the art for fireball simulations. Preliminary simulations of launch-second scenarios indicate: (1) most plutonium vaporization occurs within the first second of the fireball; (2) large non-aerosol-sized particles contribute very little to plutonium vapor production; (3) vaporization and both homogeneous and heterogeneous condensation occur simultaneously; (4) homogeneous condensation transports plutonium down to the smallest particle sizes; (5) heterogeneous condensation precludes homogeneous condensation if sufficient condensation sites are available; and (6) agglomeration produces larger-sized particles but slows rapidly as the fireball grows.

  9. Evaluation of NCMRWF unified model vertical cloud structure with CloudSat over the Indian summer monsoon region

    NASA Astrophysics Data System (ADS)

    Jayakumar, A.; Mamgain, Ashu; Jisesh, A. S.; Mohandas, Saji; Rakhi, R.; Rajagopal, E. N.

    2016-05-01

    The representation of rainfall distribution and monsoon circulation in the high-resolution versions of the NCMRWF Unified Model (NCUM-REG) for short-range forecasting of extreme rainfall events depends strongly on key factors such as the vertical cloud distribution, convection, and the convection/cloud relationship in the model. It is therefore highly relevant to evaluate the vertical structure of cloud and precipitation in the model over the monsoon environment. To this end, we utilized the long observational record of CloudSat by conditioning it on the synoptic situation of the model simulation period. Simulations were run at 4-km grid length with the convective parameterization effectively switched off and on. Since the sample of CloudSat overpasses through the monsoon domain is small, this methodology can only qualitatively evaluate the vertical cloud structure for the model simulation period. It is envisaged that the present study will open up the possibility of further improvements in the high-resolution version of NCUM in the tropics for rainfall events associated with the Indian summer monsoon.

  10. Peer-to-peer model for the area coverage and cooperative control of mobile sensor networks

    NASA Astrophysics Data System (ADS)

    Tan, Jindong; Xi, Ning

    2004-09-01

    This paper presents a novel model and distributed algorithms for the cooperation and redeployment of mobile sensor networks. A mobile sensor network is composed of a collection of wirelessly connected mobile robots equipped with a variety of sensors. In such a sensor network, each mobile node has sensing, computation, communication, and locomotion capabilities. The locomotion ability enhances the autonomous deployment of the system, which can be rapidly deployed in hostile environments, inaccessible terrain or disaster relief operations. The mobile sensor network is essentially a cooperative multiple-robot system. This paper first presents a peer-to-peer model to define the relationship between neighboring communicating robots. Delaunay triangulation and Voronoi diagrams are used to define the geometrical relationship between sensor nodes. This distributed model allows formal analysis of the fusion of spatio-temporal sensory information of the network. Based on the distributed model, this paper discusses a fault-tolerant algorithm for autonomous self-deployment of the mobile robots. The algorithm considers the environment constraints, the presence of obstacles and the nonholonomic constraints of the robots. The distributed algorithm enables the system to reconfigure itself so that the area covered by the system can be enlarged. Simulation results have shown the effectiveness of the distributed model and deployment algorithms.
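
    The coverage-enlarging reconfiguration above can be illustrated with a much simpler stand-in: a virtual-force step in which nearby neighbors push each other apart until a target spacing is reached. This is a common self-deployment heuristic, not the paper's Voronoi/Delaunay-based algorithm; all parameters are illustrative.

```python
import math

def deployment_step(positions, r_comm, d_target, gain=0.1):
    # One synchronous virtual-force step: neighbors closer than
    # d_target (and within communication range r_comm) repel each
    # other, spreading the nodes and enlarging covered area.
    new_pos = []
    for i, (xi, yi) in enumerate(positions):
        fx = fy = 0.0
        for j, (xj, yj) in enumerate(positions):
            if i == j:
                continue
            d = math.hypot(xi - xj, yi - yj)
            if 0 < d < min(r_comm, d_target):
                push = gain * (d_target - d) / d
                fx += push * (xi - xj)   # force away from neighbor j
                fy += push * (yi - yj)
        new_pos.append((xi + fx, yi + fy))
    return new_pos
```

Iterating this step drives inter-node spacing toward d_target; obstacle and nonholonomic constraints, which the paper's algorithm handles, are omitted here.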

  11. Architecture for distributed design and fabrication

    NASA Astrophysics Data System (ADS)

    McIlrath, Michael B.; Boning, Duane S.; Troxel, Donald E.

    1997-01-01

    We describe a flexible, distributed system architecture capable of supporting collaborative design and fabrication of semiconductor devices and integrated circuits. Such capabilities are of particular importance in the development of new technologies, where both equipment and expertise are limited. Distributed fabrication enables direct, remote, physical experimentation in the development of leading-edge technology, where the necessary manufacturing resources are new, expensive, and scarce. Computational resources, software, processing equipment, and people may all be widely distributed; their effective integration is essential in order to achieve the realization of new technologies for specific product requirements. Our architecture leverages current vendor and consortia developments to define software interfaces and infrastructure based on existing and emerging networking, CIM, and CAD standards. Process engineers and product designers access processing and simulation results through a common interface and collaborate across the distributed manufacturing environment.

  12. Coordinated control of micro-grid based on distributed moving horizon control.

    PubMed

    Ma, Miaomiao; Shao, Liyang; Liu, Xiangjie

    2018-05-01

    This paper proposes a distributed moving horizon coordinated control scheme for the power balance and economic dispatch problems of a micro-grid based on distributed generation. We design the power coordinated controller for each subsystem via moving horizon control by minimizing a suitable objective function. The objective function of the distributed moving horizon coordinated controller is chosen based on the principle that the wind power subsystem has priority to generate electricity, the photovoltaic subsystem coordinates with the wind power subsystem, and the battery is only activated to meet the load demand when necessary. The simulation results illustrate that the proposed controller can allocate the output power of the two generation subsystems reasonably under varying environmental conditions, which not only satisfies the load demand but also limits excessive fluctuations of output power to protect the power generation equipment. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
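
    The dispatch priority stated above (wind first, then photovoltaic, battery only when necessary) can be sketched as a static allocation rule. This is only the priority logic, not the moving-horizon optimization itself; all quantities are illustrative power values in consistent units.

```python
def dispatch(load, wind_avail, pv_avail, battery_max):
    # Priority rule from the abstract: wind serves the load first,
    # PV covers the remainder, and the battery discharges only if
    # wind and PV together cannot meet demand.
    wind = min(load, wind_avail)
    pv = min(load - wind, pv_avail)
    battery = min(load - wind - pv, battery_max)
    return {"wind": wind, "pv": pv, "battery": battery,
            "unmet": load - wind - pv - battery}
```

For a load of 10 with 6 units of wind, 3 of PV and 5 of battery headroom, the rule yields wind 6, PV 3 and battery 1; the moving-horizon controller additionally smooths these set points over a prediction window.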

  13. Lunar Regolith Simulant Materials: Recommendations for Standardization, Production, and Usage

    NASA Technical Reports Server (NTRS)

    Sibille, L.; Carpenter, P.; Schlagheck, R.; French, R. A.

    2006-01-01

    Experience gained during the Apollo program demonstrated the need for extensive testing of surface systems in relevant environments, including regolith materials similar to those encountered on the lunar surface. As NASA embarks on a return to the Moon, it is clear that the current lunar sample inventory is not only insufficient to support lunar surface technology and system development, but its scientific value is too great to be consumed by destructive studies. Every effort must be made to utilize standard simulant materials, which will allow developers to reduce the cost, development, and operational risks to surface systems. The Lunar Regolith Simulant Materials Workshop held in Huntsville, AL, on January 24-26, 2005, identified the need for widely accepted standard reference lunar simulant materials to perform research and development of technologies required for lunar operations. The workshop also established a need for a common, traceable, and repeatable process regarding the standardization, characterization, and distribution of lunar simulants. This document presents recommendations for the standardization, production and usage of lunar regolith simulant materials.

  14. Reflectivity retrieval in a networked radar environment

    NASA Astrophysics Data System (ADS)

    Lim, Sanghun

    Monitoring of precipitation using a high-frequency radar system such as X-band is becoming increasingly popular due to its lower cost compared to its counterpart at S-band. Networks of meteorological radar systems at higher frequencies are being pursued for targeted applications such as coverage over a city or a small basin. However, at higher frequencies, the impact of attenuation due to precipitation must be resolved for successful implementation. In this research, new attenuation correction algorithms are introduced to compensate for the attenuation due to the rain medium. In order to design X-band radar systems as well as evaluate algorithm development, it is useful to have simultaneous X-band observations with and without the impact of path attenuation. One way to obtain such a data set is through theoretical models. Methodologies for generating realistic range profiles of radar variables at attenuating frequencies such as X-band for the rain medium are presented here. Fundamental microphysical properties of precipitation, namely size and shape distribution information, are used to generate realistic profiles of X-band starting with S-band observations. Conditioning the simulation on S-band radar measurements maintains the natural distribution of microphysical parameters associated with rainfall. In this research, data taken by the CSU-CHILL radar and the National Center for Atmospheric Research S-POL radar are used to simulate X-band radar variables. Three procedures to simulate the radar variables at X-band and sample applications are presented. A new attenuation correction algorithm based on profiles of reflectivity, differential reflectivity, and differential propagation phase shift is presented. A solution for specific attenuation retrieval in the rain medium is proposed that solves the integral equations for reflectivity and differential reflectivity with a cumulative differential propagation phase shift constraint. 
The conventional rain profiling algorithms that connect reflectivity and specific attenuation can retrieve specific attenuation values along the radar path assuming a constant intercept parameter of the normalized drop size distribution. However, in convective storms, the drop size distribution parameters can have significant variation along the path. In this research, a dual-polarization rain profiling algorithm for horizontal-looking radars incorporating reflectivity as well as differential reflectivity profiles is developed. The dual-polarization rain profiling algorithm has been evaluated with X-band radar observations simulated from drop size distribution derived from high-resolution S-band measurements collected by the CSU-CHILL radar. The analysis shows that the dual-polarization rain profiling algorithm provides significant improvement over the current algorithms. A methodology for reflectivity and attenuation retrieval for rain medium in a networked radar environment is described. Electromagnetic waves backscattered from a common volume in networked radar systems are attenuated differently along the different paths. A solution for the specific attenuation distribution is proposed by solving the integral equation for reflectivity. The set of governing integral equations describing the backscatter and propagation of common resolution volume are solved simultaneously with constraints on total path attenuation. The proposed algorithm is evaluated based on simulated X-band radar observations synthesized from S-band measurements collected by the CSU-CHILL radar. Retrieved reflectivity and specific attenuation using the proposed method show good agreement with simulated reflectivity and specific attenuation.
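
    As background to the correction algorithms discussed above, the simplest gate-by-gate scheme (of the classical Hitschfeld-Bordan type, not this work's constrained dual-polarization retrievals) estimates specific attenuation from a power law k = a * Z^b and adds the accumulated two-way path attenuation back to each range gate. A sketch with illustrative X-band power-law coefficients:

```python
def correct_attenuation(z_measured_dbz, gate_km, a=3.0e-4, b=0.78):
    # Gate-by-gate attenuation correction: at each gate, add the
    # two-way path-integrated attenuation (PIA) accumulated so far,
    # then estimate this gate's specific attenuation (dB/km) from
    # k = a * Z**b with Z in linear units (mm^6/m^3).  The
    # coefficients a, b are illustrative, not the paper's values.
    pia = 0.0           # two-way PIA up to the current gate (dB)
    corrected = []
    for z_m in z_measured_dbz:
        z_c = z_m + pia
        corrected.append(z_c)
        k = a * (10 ** (z_c / 10)) ** b
        pia += 2 * k * gate_km
    return corrected
```

This scheme is known to be unstable for long, heavily attenuated paths, which is precisely why the differential-phase and networked-radar constraints developed in this work are needed.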

  15. Chlorophyll Can Be Reduced in Crop Canopies with Little Penalty to Photosynthesis

    PubMed Central

    Drewry, Darren T.; VanLoocke, Andy; Cho, Young B.

    2018-01-01

    The hypothesis that reducing chlorophyll content (Chl) can increase canopy photosynthesis in soybean was tested using an advanced model of canopy photosynthesis. The relationships among leaf Chl, leaf optical properties, and photosynthetic biochemical capacity were measured in 67 soybean (Glycine max) accessions showing large variation in leaf Chl. These relationships were integrated into a biophysical model of canopy-scale photosynthesis to simulate the intercanopy light environment and carbon assimilation capacity of canopies with the wild type, a Chl-deficient mutant (Y11y11), and 67 other mutants spanning the extremes of Chl, in order to quantify the impact of variation in leaf-level Chl on canopy-scale photosynthetic assimilation and to identify possible opportunities for improving canopy photosynthesis through Chl reduction. These simulations demonstrate that canopy photosynthesis should not increase with Chl reduction due to increases in leaf reflectance and the nonoptimal distribution of canopy nitrogen. However, similar rates of canopy photosynthesis can be maintained with a 9% savings in leaf nitrogen resulting from decreased Chl. Additionally, analysis of these simulations indicates that the inability of Chl reductions to increase photosynthesis arises primarily from the connection between Chl and leaf reflectance and secondarily from the mismatch between the vertical distribution of leaf nitrogen and the light absorption profile. These simulations suggest that future work should explore the possibility of using reduced Chl to improve canopy performance by adapting the distribution of the “saved” nitrogen within the canopy to take greater advantage of the more deeply penetrating light. PMID:29061904

  16. Bottom currents and sediment transport in Long Island Sound: A modeling study

    USGS Publications Warehouse

    Signell, R.P.; List, J.H.; Farris, A.S.

    2000-01-01

    A high resolution (300-400 m grid spacing), process oriented modeling study was undertaken to elucidate the physical processes affecting the characteristics and distribution of sea-floor sedimentary environments in Long Island Sound. Simulations using idealized forcing and high-resolution bathymetry were performed using a three-dimensional circulation model ECOM (Blumberg and Mellor, 1987) and a stationary shallow water wave model HISWA (Holthuijsen et al., 1989). The relative contributions of tide-, density-, wind- and wave-driven bottom currents are assessed and related to observed characteristics of the sea-floor environments, and simple bedload sediment transport simulations are performed. The fine grid spacing allows features with scales of several kilometers to be resolved. The simulations clearly show physical processes that affect the observed sea-floor characteristics at both regional and local scales. Simulations of near-bottom tidal currents reveal a strong gradient in the funnel-shaped eastern part of the Sound, which parallels an observed gradient in sedimentary environments from erosion or nondeposition, through bedload transport and sediment sorting, to fine-grained deposition. A simulation of estuarine flow driven by the along-axis gradient in salinity shows generally westward bottom currents of 2-4 cm/s that are locally enhanced to 6-8 cm/s along the axial depression of the Sound. Bottom wind-driven currents flow downwind along the shallow margins of the basin, but flow against the wind in the deeper regions. These bottom flows (in opposition to the wind) are strongest in the axial depression and add to the estuarine flow when winds are from the west. The combination of enhanced bottom currents due to both estuarine circulation and the prevailing westerly winds provide an explanation for the relatively coarse sediments found along parts of the axial depression. 
Climatological simulations of wave-driven bottom currents show that frequent high-energy events occur along the shallow margins of the Sound, explaining the occurrence of relatively coarse sediments in these regions. Bedload sediment transport calculations show that the estuarine circulation coupled with the oscillatory tidal currents results in a net westward transport of sand in much of the eastern Sound. Local departures from this regional westward trend occur around topographic and shoreline irregularities, and there is strong predicted convergence of bedload transport over most of the large, linear sand ridges in the eastern Sound, providing a mechanism which prevents their decay. The strong correlation between the near-bottom current intensity based on the model results and the sediment response, as indicated by the distribution of sedimentary environments, provides a framework for predicting the long-term effects of anthropogenic activities.
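    The net-transport result above (a steady estuarine drift superposed on a symmetric tidal oscillation produces one-way sand movement) can be illustrated with a simple excess-shear bedload sketch. The drag coefficient, critical Shields parameter, and the Meyer-Peter and Mueller form below are generic textbook choices, not the formulation used in the paper.

```python
import math

# Illustrative constants for a quartz-sand case (assumed, not from the paper).
RHO_W = 1025.0    # seawater density, kg/m^3
RHO_S = 2650.0    # sediment density, kg/m^3
G = 9.81          # gravity, m/s^2
CD = 2.5e-3       # bottom drag coefficient (assumed)
THETA_CR = 0.047  # critical Shields parameter (assumed)

def shields(u, d50):
    """Shields parameter for near-bottom current speed u (m/s), grain size d50 (m)."""
    tau = RHO_W * CD * u * u  # bottom shear stress, Pa
    return tau / ((RHO_S - RHO_W) * G * d50)

def mpm_bedload(u, d50):
    """Meyer-Peter and Mueller-type bedload rate (m^2/s); zero below threshold."""
    excess = shields(u, d50) - THETA_CR
    if excess <= 0.0:
        return 0.0
    return 8.0 * excess ** 1.5 * math.sqrt((RHO_S / RHO_W - 1.0) * G * d50 ** 3)

def net_transport(tide_amp, mean_u, d50, n=200):
    """Time-averaged signed transport for a sinusoidal tide plus a steady drift."""
    total = 0.0
    for k in range(n):
        u = mean_u + tide_amp * math.sin(2.0 * math.pi * k / n)
        total += math.copysign(mpm_bedload(abs(u), d50), u)
    return total / n  # positive = net transport in the drift direction
```

    With a symmetric tide alone the transport averages to zero; adding even a small steady drift gives a net flux in the drift direction, which is the mechanism invoked above for the westward sand transport.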

  17. An Expert System And Simulation Approach For Sensor Management & Control In A Distributed Surveillance Network

    NASA Astrophysics Data System (ADS)

    Leon, Barbara D.; Heller, Paul R.

    1987-05-01

    A surveillance network is a group of multiplatform sensors cooperating to improve network performance. Network control is distributed as a measure to decrease vulnerability to enemy threat. The network may contain diverse sensor types such as radar, ESM (Electronic Support Measures), IRST (Infrared search and track) and E-O (Electro-Optical). Each platform may contain a single sensor or suite of sensors. In a surveillance network it is desirable to control sensors to make the overall system more effective. This problem has come to be known as sensor management and control (SM&C). Two major facets of network performance are surveillance and survivability. In a netted environment, surveillance can be enhanced if information from all sensors is combined and sensor operating conditions are controlled to provide a synergistic effect. In contrast, when survivability is the main concern for the network, the best operating status for all sensors would be passive or off. Of course, improving survivability tends to degrade surveillance. Hence, the objective of SM&C is to optimize surveillance and survivability of the network. The voluminous data in various formats and the required quick response time are two characteristics of this problem that make it an ideal application for Artificial Intelligence. A solution to the SM&C problem, in the form of a computer simulation, is presented in this paper. The simulation is a hybrid program written in LISP and FORTRAN. It combines the latest conventional computer programming methods with Artificial Intelligence techniques to produce a flexible state-of-the-art tool to evaluate network performance. The event-driven simulation contains environment models coupled with an expert system. These environment models include sensor (track-while-scan and agile beam) and target models, local tracking, and system tracking. These models are used to generate the environment for the sensor management and control expert system. 
The expert system, driven by a forward chaining inference engine, makes decisions based on the global database. The global database contains current track and sensor information supplied by the simulation. At present, the rule base emphasizes the surveillance features with rules grouped into three main categories: maintenance and enhancing track on prioritized targets; filling coverage holes and countering jamming; and evaluating sensor status. The paper will describe the architecture used for the expert system and the reasons for selecting the chosen methods. The SM&C simulation produces a graphical representation of sensors and their associated tracks such that the benefits of the sensor management and control expert system are evident. Jammer locations are also part of the display. The paper will describe results from several scenarios that best illustrate the sensor management and control concepts.
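    The forward-chaining control loop described above can be sketched in a few lines: rules fire whenever all of their antecedent facts are present in the global database, and firing adds new facts until nothing more can be derived. The rule and fact names below are invented for illustration and are not taken from the actual SM&C rule base (which was written in LISP).

```python
# Minimal forward-chaining engine: a rule fires when all of its antecedent
# facts are present, adding its consequent, until a fixed point is reached.

def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if consequent not in facts and set(antecedents) <= facts:
                facts.add(consequent)
                changed = True
    return facts

# Toy rules in the spirit of the three categories described above (hypothetical).
RULES = [
    ({"track_on_priority_target", "track_quality_low"}, "request_sensor_dwell"),
    ({"coverage_hole_detected", "radar_available"}, "assign_radar_to_hole"),
    ({"jamming_detected", "esm_available"}, "cross_fix_jammer"),
]

result = forward_chain(
    {"track_on_priority_target", "track_quality_low",
     "jamming_detected", "esm_available"}, RULES)
```

    Only the rules whose antecedents are fully satisfied fire; here the coverage-hole rule stays dormant because no hole was asserted.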

  18. Using cognitive architectures to study issues in team cognition in a complex task environment

    NASA Astrophysics Data System (ADS)

    Smart, Paul R.; Sycara, Katia; Tang, Yuqing

    2014-05-01

    Cognitive social simulation is a computer simulation technique that aims to improve our understanding of the dynamics of socially-situated and socially-distributed cognition. This makes cognitive social simulation techniques particularly appealing as a means to undertake experiments into team cognition. The current paper reports on the results of an ongoing effort to develop a cognitive social simulation capability that can be used to undertake studies into team cognition using the ACT-R cognitive architecture. This capability is intended to support simulation experiments using a team-based problem solving task, which has been used to explore the effect of different organizational environments on collective problem solving performance. The functionality of the ACT-R-based cognitive social simulation capability is presented and a number of areas of future development work are outlined. The paper also describes the motivation for adopting cognitive architectures in the context of social simulation experiments and presents a number of research areas where cognitive social simulation may be useful in developing a better understanding of the dynamics of team cognition. These include the use of cognitive social simulation to study the role of cognitive processes in determining aspects of communicative behavior, as well as the impact of communicative behavior on the shaping of task-relevant cognitive processes (e.g., the social shaping of individual and collective memory as a result of communicative exchanges). We suggest that the ability to perform cognitive social simulation experiments in these areas will help to elucidate some of the complex interactions that exist between cognitive, social, technological and informational factors in the context of team-based problem-solving activities.

  19. Distributed Spectral Monitoring For Emitter Localization

    DTIC Science & Technology

    2018-02-12

    localization techniques in a DSA sensor network. The results of the research are presented through simulation of localization algorithms, emulation of a...network on a wireless RF environment emulator, and field tests. The results of the various tests in both the lab and field are obtained and analyzed to... are two main classes of localization techniques, and the technique to use will depend on the information available with the emitter. The first class

  20. Depth dose distribution study within a phantom torso after irradiation with a simulated Solar Particle Event at NSRL

    NASA Astrophysics Data System (ADS)

    Berger, Thomas; Matthiä, Daniel; Koerner, Christine; George, Kerry; Rhone, Jordan; Cucinotta, Francis A.; Reitz, Guenther

    The adequate knowledge of the radiation environment and the doses incurred during a space mission is essential for estimating an astronaut's health risk. The space radiation environment is complex and variable, and exposures inside the spacecraft and the astronaut's body are compounded by the interactions of the primary particles with the atoms of the structural materials and with the body itself. Astronauts' radiation exposures are measured by means of personal dosimetry, but there remains substantial uncertainty associated with the computational extrapolation of skin dose to organ dose, which can lead to over- or under-estimation of the health risk. Comparisons of models to data showed that the astronaut's Effective dose (E) can be predicted to within about ±10%. In the research experiment "Depth dose distribution study within a phantom torso" at the NASA Space Radiation Laboratory (NSRL) at BNL, Brookhaven, USA, the large 1972 SPE spectrum was simulated using seven different proton energies from 50 up to 450 MeV. A phantom torso constructed of natural bones and realistic distributions of human tissue equivalent materials, which is comparable to the torso of the MATROSHKA phantom currently on the ISS, was equipped with a comprehensive set of thermoluminescence detectors and human cells. The detectors are applied to assess the depth dose distribution, and radiation transport codes (e.g. GEANT4) are used to assess the radiation field and its interactions with the phantom torso. Lymphocyte cells are strategically embedded at selected locations at the skin and internal organs and are processed after irradiation to assess the effects of shielding on the yield of chromosome damage. The first focus of the presented experiment is to correlate biological results with physical dosimetry measurements in the phantom torso. 
Furthermore, the results of passive dosimetry using anthropomorphic phantoms represent the best tool for generating reliable data to benchmark computational radiation transport models in a radiation field of interest. The presentation will give first results of the physical dose distribution, the comparison with GEANT4 computer simulations, based on a Voxel model of the phantom, and a comparison with the data from the chromosome aberration study. The help and support of Adam Russek and Michael Sivertz of the NASA Space Radiation Laboratory (NSRL), Brookhaven, USA during the setup and the irradiation of the phantom are highly appreciated. The Voxel model describing the human phantom used for the GEANT4 simulations was kindly provided by Monika Puchalska (CHALMERS, Gothenburg, Sweden).

  1. A simulation environment for assisting system design of coherent laser doppler wind sensor for active wind turbine pitch control

    NASA Astrophysics Data System (ADS)

    Shinohara, Leilei; Pham Tran, Tuan Anh; Beuth, Thorsten; Umesh Babu, Harsha; Heussner, Nico; Bogatscher, Siegwart; Danilova, Svetlana; Stork, Wilhelm

    2013-05-01

    In order to assist the system design of a coherent laser Doppler wind sensor for active pitch control of wind turbine systems (WTS), we developed a numerical simulation environment for modeling and simulation of the sensor system. In this paper we present this simulation concept. In previous works, we have shown the general idea and the possibility of using a low cost coherent laser Doppler wind sensing system for an active pitch control of WTS in order to reduce mechanical stress, increase the WTS lifetime and therefore reduce the electricity price from wind energy. Such a system is based on a 1.55μm Continuous-Wave (CW) laser plus an erbium-doped fiber amplifier (EDFA) with an output power of 1W. Within this system, an optical coherent detection method is chosen for the Doppler frequency measurement in the megahertz range. A comparatively low cost short coherence length laser with a fiber delay line is used for achieving a multiple range measurement. In this paper, we show the current results on the improvement of our simulation by applying a Monte Carlo random generation method for positioning the random particles in the atmosphere and extend the simulation to the entire beam penetrated space by introducing a cylindrical co-ordinate concept and meshing the entire volume into small elements in order to achieve a faster calculation and gain more realistic simulation results. In addition, by applying different atmospheric parameters, such as particle sizes and distributions, we can simulate different weather and wind situations.
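    Two of the ingredients described above lend themselves to a compact sketch: uniform Monte Carlo placement of scatterers inside a cylindrical beam volume, and the coherent Doppler relation f_d = 2*v/λ for a 1.55 μm source. This is a hedged illustration only; the actual simulation environment meshes the volume into elements and models detection in far more detail.

```python
import math
import random

LAMBDA = 1.55e-6  # laser wavelength, m (1.55 um CW source)

def doppler_shift(v_los):
    """Coherent-detection Doppler frequency (Hz) for line-of-sight speed v_los (m/s)."""
    return 2.0 * v_los / LAMBDA

def sample_particles(n, radius, length, seed=0):
    """Uniform random scatterer positions inside a cylindrical beam volume.
    Drawing r = R*sqrt(u) keeps the density uniform over the cross-section."""
    rng = random.Random(seed)
    pts = []
    for _ in range(n):
        r = radius * math.sqrt(rng.random())
        phi = 2.0 * math.pi * rng.random()
        z = length * rng.random()
        pts.append((r * math.cos(phi), r * math.sin(phi), z))
    return pts
```

    A 10 m/s line-of-sight wind gives doppler_shift(10.0) of about 12.9 MHz, consistent with the megahertz-range measurement mentioned above.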

  2. Physically-Based Modelling and Real-Time Simulation of Fluids.

    NASA Astrophysics Data System (ADS)

    Chen, Jim Xiong

    1995-01-01

    Simulating physically realistic complex fluid behaviors presents an extremely challenging problem for computer graphics researchers. Such behaviors include the effects of driving boats through water, blending differently colored fluids, rain falling and flowing on a terrain, fluids interacting in a Distributed Interactive Simulation (DIS), etc. Such capabilities are useful in computer art, advertising, education, entertainment, and training. We present a new method for physically-based modeling and real-time simulation of fluids in computer graphics and dynamic virtual environments. By solving the 2D Navier-Stokes equations using a CFD method, we map the surface into 3D using the corresponding pressures in the fluid flow field. This achieves realistic real-time fluid surface behaviors by employing the physical governing laws of fluids but avoiding extensive 3D fluid dynamics computations. To complement the surface behaviors, we calculate fluid volume and external boundary changes separately to achieve full 3D general fluid flow. To simulate physical activities in a DIS, we introduce a mechanism which uses a uniform time scale proportional to the clock-time and variable time-slicing to synchronize physical models such as fluids in the networked environment. Our approach can simulate many different fluid behaviors by changing the internal or external boundary conditions. It can model different kinds of fluids by varying the Reynolds number. It can simulate objects moving or floating in fluids. It can also produce synchronized general fluid flows in a DIS. Our model can serve as a testbed to simulate many other fluid phenomena which have never been successfully modeled previously.
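    The key idea above, obtaining surface behavior from a 2D solve instead of a full 3D computation, can be sketched in two steps: solve a pressure Poisson equation on the 2D grid, then rescale the pressure field into surface heights. The Jacobi solver and the linear height mapping below are minimal stand-ins for the CFD method actually used.

```python
def jacobi_pressure(b, iters=200):
    """Solve lap(p) = b on a unit-spaced 2D grid with p = 0 on the boundary,
    using plain Jacobi iteration (a stand-in for a real CFD pressure solve)."""
    ny, nx = len(b), len(b[0])
    p = [[0.0] * nx for _ in range(ny)]
    for _ in range(iters):
        q = [row[:] for row in p]  # previous iterate
        for i in range(1, ny - 1):
            for j in range(1, nx - 1):
                p[i][j] = 0.25 * (q[i + 1][j] + q[i - 1][j]
                                  + q[i][j + 1] + q[i][j - 1] - b[i][j])
    return p

def pressure_to_height(p, scale=0.01):
    """Map the 2D pressure field to surface displacements about the mean level."""
    n = sum(len(row) for row in p)
    mean = sum(sum(row) for row in p) / n
    return [[scale * (pij - mean) for pij in row] for row in p]
```

    A localized negative source in the right-hand side produces a positive pressure hump, which the mapping turns into a surface bulge about the mean water level.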

  3. The Hydrology of Malaria: Model Development and Application to a Sahelian Village

    NASA Astrophysics Data System (ADS)

    Bomblies, A.; Duchemin, J.; Eltahir, E. A.

    2008-12-01

    We present a coupled hydrology and entomology model for the mechanistic simulation of local-scale response of malaria transmission to hydrological and climatological determinants in semi-arid, desert fringe environments. The model is applied to the Sahel village of Banizoumbou, Niger, to predict interannual variability in malaria vector mosquito populations, which leads to variations in malaria transmission. Using a high-resolution, small-scale distributed hydrology model that incorporates remotely-sensed data for land cover and topography, we simulate the formation and persistence of the pools constituting the primary breeding habitat of Anopheles gambiae s.l. mosquitoes, the principal regional malaria vector mosquitoes. An agent-based mosquito population model is coupled to the distributed hydrology model, with aquatic stage and adult stage components. For each individual adult mosquito, the model tracks attributes relevant to population dynamics and malaria transmission, which are updated as mosquitoes interact with their environment, humans, and animals. Weekly field observations were made in 2005 and 2006. The model reproduces mosquito population variability at seasonal and interannual time scales, and highlights individual pool persistence as a dominant control. Future developments to the presented model can be used in the evaluation of impacts of climate change on malaria, as well as the a priori evaluation of environmental management-based interventions.
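    A drastically simplified sketch of the coupled idea: pool water depth is driven by rainfall minus evaporation, and an aquatic cohort only emerges as adults if the pool persists through its development time, so pool persistence controls the population. All rates and durations below are invented; the actual model is a distributed hydrology model coupled to an individual-based entomology model.

```python
# All rates and durations are invented for this sketch (not from the paper).
DEVELOP_DAYS = 10   # aquatic development time, days (assumed)
EVAP = 5.0          # evaporation loss, mm/day (assumed)
EGGS_PER_DAY = 50   # adults produced per emerging cohort (assumed)

def simulate(rain_mm):
    """Daily loop: update pool depth, age aquatic cohorts, emerge or die."""
    depth, adults, cohorts = 0.0, 0, []  # cohorts: days of development left
    for rain in rain_mm:
        depth = max(0.0, depth + rain - EVAP)
        if depth > 0.0:
            cohorts.append(DEVELOP_DAYS)  # eggs laid while the pool is wet
            cohorts = [d - 1 for d in cohorts]
            adults += EGGS_PER_DAY * sum(1 for d in cohorts if d <= 0)
            cohorts = [d for d in cohorts if d > 0]
        else:
            cohorts = []  # pool dried out: aquatic stages die
    return adults
```

    Twenty consecutive wet days let several cohorts complete development, while a five-day rain pulse dries out before the first emergence and produces no adults: pool persistence, not total rainfall alone, controls the outcome.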

  4. [Adsorption characteristics of proteins on membrane surface and effect of protein solution environment on permeation behavior of berberine].

    PubMed

    Li, Yi-Qun; Xu, Li; Zhu, Hua-Xu; Tang, Zhi-Shu; Li, Bo; Pan, Yong-Lan; Yao, Wei-Wei; Fu, Ting-Ming; Guo, Li-Wei

    2017-10-01

    In order to explore the adsorption characteristics of proteins on the membrane surface and the effect of the protein solution environment on the permeation behavior of berberine, berberine and proteins were used to prepare a simulated solution. Low field NMR, static adsorption experiments and membrane separation experiments were used to study the interaction between the proteins and the ceramic membrane and between the proteins and berberine. The static adsorption capacity of proteins, membrane relative flux, rejection rate of proteins, transmittance rate of berberine and the adsorption rate of proteins and berberine were used as the evaluation indexes. Meanwhile, the membrane resistance distribution and the particle size distribution were determined, and scanning electron microscopy (SEM) was performed, to investigate the adsorption characteristics of proteins on the ceramic membrane and the effect on the membrane separation process of berberine. The results showed that the ceramic membrane could adsorb the proteins and the adsorption was consistent with the Langmuir adsorption model. In simulating the membrane separation process, proteins were the main factor causing membrane fouling. However, when the concentration of proteins was 1 g•L⁻¹, the proteins had no significant effect on the membrane separation process of berberine.
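    The Langmuir fit reported above can be reproduced in miniature: evaluate the isotherm q = q_max*K*C/(1 + K*C) and recover its parameters from the standard linearization C/q = C/q_max + 1/(K*q_max). The concentrations and parameter values below are synthetic, not the paper's measured data.

```python
def langmuir(c, q_max, k):
    """Adsorbed amount q at equilibrium concentration c (Langmuir isotherm)."""
    return q_max * k * c / (1.0 + k * c)

def fit_langmuir(cs, qs):
    """Least-squares fit of the linearized form C/q vs C; returns (q_max, K)."""
    xs = list(cs)
    ys = [c / q for c, q in zip(cs, qs)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return 1.0 / slope, slope / intercept  # q_max = 1/slope, K = slope/intercept

cs = [0.1, 0.2, 0.5, 1.0, 2.0]             # g/L (synthetic)
qs = [langmuir(c, 25.0, 3.0) for c in cs]  # generated with q_max=25, K=3
```

    On exact synthetic data the linearized fit recovers q_max and K exactly; with noisy measurements it gives the usual least-squares estimates.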

  5. Mimicking Atmospheric Flow Conditions to Examine Mosquito Orientation Behavior

    NASA Astrophysics Data System (ADS)

    Huang, Yi-Chun; Vickers, Neil; Hultmark, Marcus

    2017-11-01

    Host-seeking female mosquitoes utilize a variety of sensory cues to locate potential hosts. In addition to visual cues, other signals include CO2 , volatile skin emanations, humidity, and thermal cues, each of which can be considered as passive scalars in the environment, primarily distributed by local flow conditions. The behavior of host-seeking female mosquito vectors can be more thoroughly understood by simulating the natural features of the environment through which they navigate, namely the atmospheric boundary layer. Thus, an exploration and understanding of the dynamics of a scalar plume will not only establish the effect of fluid environment on scalar coherence and distribution, but also provide a bioassay platform for approaches directed at disrupting or preventing the cycle of mosquito-vectored disease transmission. In order to bridge between laboratory findings and the natural, ecologically relevant setting, a unique active flow modulation system consisting of a grid of 60 independently operated paddles was developed. Unlike static grids that generate turbulence within a predefined range of scales, an active grid imposes variable and controllable turbulent structures onto the moving air by synchronized rotation of the paddles at specified frequencies.

  6. Reconfiguring practice: the interdependence of experimental procedure and computing infrastructure in distributed earthquake engineering.

    PubMed

    De La Flor, Grace; Ojaghi, Mobin; Martínez, Ignacio Lamata; Jirotka, Marina; Williams, Martin S; Blakeborough, Anthony

    2010-09-13

    When transitioning local laboratory practices into distributed environments, the interdependent relationship between experimental procedure and the technologies used to execute experiments becomes highly visible and a focal point for system requirements. We present an analysis of ways in which this reciprocal relationship is reconfiguring laboratory practices in earthquake engineering as a new computing infrastructure is embedded within three laboratories in order to facilitate the execution of shared experiments across geographically distributed sites. The system has been developed as part of the UK Network for Earthquake Engineering Simulation e-Research project, which links together three earthquake engineering laboratories at the universities of Bristol, Cambridge and Oxford. We consider the ways in which researchers have successfully adapted their local laboratory practices through the modification of experimental procedure so that they may meet the challenges of coordinating distributed earthquake experiments.

  7. Extended computational kernels in a massively parallel implementation of the Trotter-Suzuki approximation

    NASA Astrophysics Data System (ADS)

    Wittek, Peter; Calderaro, Luca

    2015-12-01

    We extended a parallel and distributed implementation of the Trotter-Suzuki algorithm for simulating quantum systems to study a wider range of physical problems and to make the library easier to use. The new release allows periodic boundary conditions, many-body simulations of non-interacting particles, arbitrary stationary potential functions, and imaginary time evolution to approximate the ground state energy. The new release is more resilient to the computational environment: a wider range of compiler chains and more platforms are supported. To ease development, we provide a more extensive command-line interface, an application programming interface, and wrappers from high-level languages.
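    The splitting behind the Trotter-Suzuki algorithm can be shown on the smallest possible quantum system: for H = A + B, one symmetric step approximates exp(-i*H*dt) by exp(-i*A*dt/2) exp(-i*B*dt) exp(-i*A*dt/2). The two-level example below (A = Z, B = X Pauli matrices) illustrates the splitting idea only; it is not the library's actual massively parallel implementation.

```python
import cmath
import math

def expz(theta):
    """exp(-i*theta*Z) as a 2x2 matrix (Z is diagonal)."""
    return [[cmath.exp(-1j * theta), 0.0], [0.0, cmath.exp(1j * theta)]]

def expx(theta):
    """exp(-i*theta*X) = cos(theta)*I - i*sin(theta)*X."""
    c, s = math.cos(theta), -1j * math.sin(theta)
    return [[c, s], [s, c]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def trotter_evolve(psi, t, steps):
    """Evolve psi under H = Z + X with symmetric second-order Trotter steps."""
    dt = t / steps
    step = matmul(expz(dt / 2.0), matmul(expx(dt), expz(dt / 2.0)))
    for _ in range(steps):
        psi = [sum(step[i][j] * psi[j] for j in range(2)) for i in range(2)]
    return psi
```

    Because H = Z + X satisfies H^2 = 2I, the exact first amplitude starting from |0> is cos(t*sqrt(2)) - i*sin(t*sqrt(2))/sqrt(2); the Trotter result converges to it at second order in dt while each step remains exactly unitary.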

  8. RELATIVISTIC CYCLOTRON INSTABILITY IN ANISOTROPIC PLASMAS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    López, Rodrigo A.; Moya, Pablo S.; Muñoz, Víctor

    2016-11-20

    A sufficiently large temperature anisotropy can sometimes drive various types of electromagnetic plasma micro-instabilities, which can play an important role in the dynamics of relativistic pair plasmas in space, astrophysics, and laboratory environments. Here, we provide a detailed description of the cyclotron instability of parallel propagating electromagnetic waves in relativistic pair plasmas on the basis of a relativistic anisotropic distribution function. Using plasma kinetic theory and particle-in-cell simulations, we study the influence of the relativistic temperature and the temperature anisotropy on the collective and noncollective modes of these plasmas. Growth rates and dispersion curves from the linear theory show a good agreement with simulation results.

  9. A multiple-point geostatistical approach to quantifying uncertainty for flow and transport simulation in geologically complex environments

    NASA Astrophysics Data System (ADS)

    Cronkite-Ratcliff, C.; Phelps, G. A.; Boucher, A.

    2011-12-01

    In many geologic settings, the pathways of groundwater flow are controlled by geologic heterogeneities which have complex geometries. Models of these geologic heterogeneities, and consequently, their effects on the simulated pathways of groundwater flow, are characterized by uncertainty. Multiple-point geostatistics, which uses a training image to represent complex geometric descriptions of geologic heterogeneity, provides a stochastic approach to the analysis of geologic uncertainty. Incorporating multiple-point geostatistics into numerical models provides a way to extend this analysis to the effects of geologic uncertainty on the results of flow simulations. We present two case studies to demonstrate the application of multiple-point geostatistics to numerical flow simulation in complex geologic settings with both static and dynamic conditioning data. Both cases involve the development of a training image from a complex geometric description of the geologic environment. Geologic heterogeneity is modeled stochastically by generating multiple equally-probable realizations, all consistent with the training image. Numerical flow simulation for each stochastic realization provides the basis for analyzing the effects of geologic uncertainty on simulated hydraulic response. The first case study is a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. The SNESIM algorithm is used to stochastically model geologic heterogeneity conditioned to the mapped surface geology as well as vertical drill-hole data. Numerical simulation of groundwater flow and contaminant transport through geologic models produces a distribution of hydraulic responses and contaminant concentration results. From this distribution of results, the probability of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary. 
The second case study considers a characteristic lava-flow aquifer system in Pahute Mesa, Nevada. A 3D training image is developed by using object-based simulation of parametric shapes to represent the key morphologic features of rhyolite lava flows embedded within ash-flow tuffs. In addition to vertical drill-hole data, transient pressure head data from aquifer tests can be used to constrain the stochastic model outcomes. The use of both static and dynamic conditioning data allows the identification of potential geologic structures that control hydraulic response. These case studies demonstrate the flexibility of the multiple-point geostatistics approach for considering multiple types of data and for developing sophisticated models of geologic heterogeneities that can be incorporated into numerical flow simulations.
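    The exceedance-probability summary used in the first case study reduces to a simple computation once the realizations exist: for each cell, count the fraction of equally probable realizations whose simulated concentration exceeds the threshold. The random stand-in field below replaces the actual SNESIM realizations and flow simulations.

```python
import random

def exceedance_probability(realizations, threshold):
    """Per-cell fraction of realizations whose value exceeds the threshold."""
    n = len(realizations)
    cells = len(realizations[0])
    return [sum(1 for r in realizations if r[c] > threshold) / n
            for c in range(cells)]

# 100 equally probable stand-in "concentration fields" over 10 cells.
rng = random.Random(42)
reals = [[rng.random() for _ in range(10)] for _ in range(100)]
probs = exceedance_probability(reals, 0.9)
```

    Cells where the probability is near 0 or 1 are confidently outside or inside the plume; intermediate values flag where the plume boundary location is uncertain.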

  10. Cooperative high-performance storage in the accelerated strategic computing initiative

    NASA Technical Reports Server (NTRS)

    Gary, Mark; Howard, Barry; Louis, Steve; Minuzzo, Kim; Seager, Mark

    1996-01-01

    The use and acceptance of new high-performance, parallel computing platforms will be impeded by the absence of an infrastructure capable of supporting orders-of-magnitude improvement in hierarchical storage and high-speed I/O (Input/Output). The distribution of these high-performance platforms and supporting infrastructures across a wide-area network further compounds this problem. We describe an architectural design and phased implementation plan for a distributed, Cooperative Storage Environment (CSE) to achieve the necessary performance, user transparency, site autonomy, communication, and security features needed to support the Accelerated Strategic Computing Initiative (ASCI). ASCI is a Department of Energy (DOE) program attempting to apply terascale platforms and Problem-Solving Environments (PSEs) toward real-world computational modeling and simulation problems. The ASCI mission must be carried out through a unified, multilaboratory effort, and will require highly secure, efficient access to vast amounts of data. The CSE provides a logically simple, geographically distributed, storage infrastructure of semi-autonomous cooperating sites to meet the strategic ASCI PSE goal of high-performance data storage and access at the user desktop.

  11. Modeling the Blast Load Simulator Airblast Environment using First Principles Codes. Report 1, Blast Load Simulator Environment

    DTIC Science & Technology

    2016-11-01

    ERDC/GSL TR-16-31, Modeling the Blast Load Simulator Airblast Environment Using First Principles Codes, Report 1, Blast Load Simulator Environment. Gregory C. Bessette, James L. O'Daniel... evaluate several first principles codes (FPCs) for modeling airblast environments typical of those encountered in the BLS. The FPCs considered were...

  12. First-principles Monte Carlo simulations of reaction equilibria in compressed vapors

    DOE PAGES

    Fetisov, Evgenii O.; Kuo, I-Feng William; Knight, Chris; ...

    2016-06-13

    Predictive modeling of reaction equilibria presents one of the grand challenges in the field of molecular simulation. Difficulties in the study of such systems arise from the need (i) to accurately model both strong, short-ranged interactions leading to the formation of chemical bonds and weak interactions arising from the environment, and (ii) to sample the range of time scales involving frequent molecular collisions, slow diffusion, and infrequent reactive events. Here we present a novel reactive first-principles Monte Carlo (RxFPMC) approach that allows for investigation of reaction equilibria without the need to prespecify a set of chemical reactions and their ideal-gas equilibrium constants. We apply RxFPMC to investigate a nitrogen/oxygen mixture at T = 3000 K and p = 30 GPa, i.e., conditions that are present in atmospheric lightning strikes and explosions. The RxFPMC simulations show that the solvation environment leads to a significantly enhanced NO concentration that reaches a maximum when oxygen is present in slight excess. In addition, the RxFPMC simulations indicate the formation of NO2 and N2O in mole fractions approaching 1%, whereas N3 and O3 are not observed. Lastly, the equilibrium distributions obtained from the RxFPMC simulations agree well with those from a thermochemical computer code parametrized to experimental data.
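    For context, the ideal-gas equilibrium constants that RxFPMC avoids prespecifying follow from the standard relation K = exp(-ΔG°/(R*T)). A minimal sketch; the ΔG° value in the note below is invented, not taken from the paper.

```python
import math

R = 8.314462618  # gas constant, J/(mol*K)

def equilibrium_constant(delta_g, temp):
    """Ideal-gas equilibrium constant K = exp(-dG/(R*T)); delta_g in J/mol, temp in K."""
    return math.exp(-delta_g / (R * temp))
```

    At 3000 K, an illustrative (invented) ΔG° of -50 kJ/mol gives K of roughly 7, i.e. products favored; K = 1 exactly when ΔG° = 0.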

  14. Why Isn't There More High-fidelity Simulation Training in Diagnostic Radiology? Results of a Survey of Academic Radiologists.

    PubMed

    Cook, Tessa S; Hernandez, Jessica; Scanlon, Mary; Langlotz, Curtis; Li, Chun-Der L

    2016-07-01

    Despite its increasing use in training other medical specialties, high-fidelity simulation to prepare diagnostic radiology residents for call remains an underused educational resource. To attempt to characterize the barriers toward adoption of this technology, we conducted a survey of academic radiologists and radiology trainees. An Institutional Review Board-approved survey was distributed to the Association of University Radiologists members via e-mail. Survey results were collected electronically, tabulated, and analyzed. A total of 68 survey responses representing 51 programs were received from program directors, department chairs, chief residents, and program administrators. The most common form of educational activity for resident call preparation was lectures. Faculty supervised "baby call" was also widely reported. Actual simulated call environments were quite rare with only three programs reporting this type of educational activity. Barriers to the use of simulation include lack of faculty time, lack of faculty expertise, and lack of perceived need. High-fidelity simulation can be used to mimic the high-stress, high-stakes independent call environment that the typical radiology resident encounters during the second year of training, and can provide objective data for program directors to assess the Accreditation Council for Graduate Medical Education milestones. We predict that this technology will begin to supplement traditional diagnostic radiology teaching methods and to improve patient care and safety in the next decade.

  15. C3 System Performance Simulation and User Manual. Getting Started: Guidelines for Users

    NASA Technical Reports Server (NTRS)

    2006-01-01

    This document is a User's Manual describing the C3 Simulation capabilities. The subject work was designed to simulate the communications involved in the flight of a Remotely Operated Aircraft (ROA) using the Opnet software. Opnet provides a comprehensive development environment supporting the modeling of communication networks and distributed systems. It has tools for model design, simulation, data collection, and data analysis. Opnet models are hierarchical -- consisting of a project which contains node models which in turn contain process models. Nodes can be fixed, mobile, or satellite. Links between nodes can be physical or wireless. Communications are packet based. The model is very generic in its current form. Attributes such as frequency and bandwidth can easily be modified to better reflect a specific platform. The model is not fully developed at this stage -- there are still more enhancements to be added. Current issues are documented throughout this guide.

  16. A Multiprocessor Operating System Simulator

    NASA Technical Reports Server (NTRS)

    Johnston, Gary M.; Campbell, Roy H.

    1988-01-01

    This paper describes a multiprocessor operating system simulator that was developed by the authors in the Fall semester of 1987. The simulator was built in response to the need to provide students with an environment in which to build and test operating system concepts as part of the coursework of a third-year undergraduate operating systems course. Written in C++, the simulator uses the co-routine style task package that is distributed with the AT&T C++ Translator to provide a hierarchy of classes that represents a broad range of operating system software and hardware components. The class hierarchy closely follows that of the 'Choices' family of operating systems for loosely- and tightly-coupled multiprocessors. During an operating system course, these classes are refined and specialized by students in homework assignments to facilitate experimentation with different aspects of operating system design and policy decisions. The current implementation runs on the IBM RT PC under 4.3bsd UNIX.

  17. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. 
Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to metrological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation systems}; (3) Coupling large-scale computing and data systems to scientific and engineering instruments (e.g., realtime interaction with experiments through real-time data analysis and interpretation presented to the experimentalist in ways that allow direct interaction with the experiment (instead of just with instrument control); (5) Highly interactive, augmented reality and virtual reality remote collaborations (e.g., Ames / Boeing Remote Help Desk providing field maintenance use of coupled video and NDI to a remote, on-line airframe structures expert who uses this data to index into detailed design databases, and returns 3D internal aircraft geometry to the field); (5) Single computational problems too large for any single system (e.g. the rotocraft reference calculation). Grids also have the potential to provide pools of resources that could be called on in extraordinary / rapid response situations (such as disaster response) because they can provide common interfaces and access mechanisms, standardized management, and uniform user authentication and authorization, for large collections of distributed resources (whether or not they normally function in concert). IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focussed primarily on two types of users: the scientist / design engineer whose primary interest is problem solving (e.g. 
determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user is the tool designer: the computational scientists who convert physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. The results of the analysis of the needs of these two types of users provides a broad set of requirements that gives rise to a general set of required capabilities. The IPG project is intended to address all of these requirements. In some cases the required computing technology exists, and in some cases it must be researched and developed. The project is using available technology to provide a prototype set of capabilities in a persistent distributed computing testbed. Beyond this, there are required capabilities that are not immediately available, and whose development spans the range from near-term engineering development (one to two years) to much longer term R&D (three to six years). Additional information is contained in the original.

  18. [Dynamic road vehicle emission inventory simulation study based on real time traffic information].

    PubMed

    Huang, Cheng; Liu, Juan; Chen, Chang-Hong; Zhang, Jian; Liu, Deng-Guo; Zhu, Jing-Yu; Huang, Wei-Ming; Chao, Yuan

    2012-11-01

    The vehicle activity survey, including traffic flow distribution, driving condition, and vehicle technologies, were conducted in Shanghai. The databases of vehicle flow, VSP distribution and vehicle categories were established according to the surveyed data. Based on this, a dynamic vehicle emission inventory simulation method was designed by using the real time traffic information data, such as traffic flow and average speed. Some roads in Shanghai city were selected to conduct the hourly vehicle emission simulation as a case study. The survey results show that light duty passenger car and taxi are major vehicles on the roads of Shanghai city, accounting for 48% - 72% and 15% - 43% of the total flow in each hour, respectively. VSP distribution has a good relationship with the average speed. The peak of VSP distribution tends to move to high load section and become lower with the increase of average speed. Vehicles achieved Euro 2 and Euro 3 standards are majorities of current vehicle population in Shanghai. Based on the calibration of vehicle travel mileage data, the proportions of Euro 2 and Euro 3 standard vehicles take up 11% - 70% and 17% - 51% in the real-world situation, respectively. The emission simulation results indicate that the ratios of emission peak and valley for the pollutants of CO, VOC, NO(x) and PM are 3.7, 4.6, 9.6 and 19.8, respectively. CO and VOC emissions mainly come from light-duty passenger car and taxi, which has a good relationship with the traffic flow. NO(x) and PM emissions are mainly from heavy-duty bus and public buses and mainly concentrate in the morning and evening peak hours. The established dynamic vehicle emission simulation method can reflect the change of actual road emission and output high emission road sectors and hours in real time. The method can provide an important technical means and decision-making basis for transportation environment management.

  19. Identifying the optimal spatially and temporally invariant root distribution for a semiarid environment

    NASA Astrophysics Data System (ADS)

    Sivandran, Gajan; Bras, Rafael L.

    2012-12-01

    In semiarid regions, the rooting strategies employed by vegetation can be critical to its survival. Arid regions are characterized by high variability in the arrival of rainfall, and species found in these areas have adapted mechanisms to ensure the capture of this scarce resource. Vegetation roots have strong control over this partitioning, and assuming a static root profile, predetermine the manner in which this partitioning is undertaken.A coupled, dynamic vegetation and hydrologic model, tRIBS + VEGGIE, was used to explore the role of vertical root distribution on hydrologic fluxes. Point-scale simulations were carried out using two spatially and temporally invariant rooting schemes: uniform: a one-parameter model and logistic: a two-parameter model. The simulations were forced with a stochastic climate generator calibrated to weather stations and rain gauges in the semiarid Walnut Gulch Experimental Watershed (WGEW) in Arizona. A series of simulations were undertaken exploring the parameter space of both rooting schemes and the optimal root distribution for the simulation, which was defined as the root distribution with the maximum mean transpiration over a 100-yr period, and this was identified. This optimal root profile was determined for five generic soil textures and two plant-functional types (PFTs) to illustrate the role of soil texture on the partitioning of moisture at the land surface. The simulation results illustrate the strong control soil texture has on the partitioning of rainfall and consequently the depth of the optimal rooting profile. High-conductivity soils resulted in the deepest optimal rooting profile with land surface moisture fluxes dominated by transpiration. As we move toward the lower conductivity end of the soil spectrum, a shallowing of the optimal rooting profile is observed and evaporation gradually becomes the dominate flux from the land surface. 
This study offers a methodology through which local plant, soil, and climate can be accounted for in the parameterization of rooting profiles in semiarid regions.

  20. Thermodynamic sensitivities in observed and simulated extreme-rain-producing mesoscale convective systems

    NASA Astrophysics Data System (ADS)

    Schumacher, R. S.; Peters, J. M.

    2015-12-01

    Mesoscale convective systems (MCSs) are responsible for a large fraction of warm-season extreme rainfall events over the continental United States, as well as other midlatitude regions globally. The rainfall production in these MCSs is determined by numerous factors, including the large-scale forcing for ascent, the organization of the convection, cloud microphysical processes, and the surrounding thermodynamic and kinematic environment. Furthermore, heavy-rain-producing MCSs are most common at night, which means that well-studied mechanisms for MCS maintenance and organization such as cold pools (gravity currents) are not always at work. In this study, we use numerical model simulations and recent field observations to investigate the sensitivity of low-level MCS structures, and their influences on rainfall, to the details of the thermodynamic environment. In particular, small alterations to the initial conditions in idealized and semi-idealized simulations result in comparatively large precipitation changes, both in terms of the intensity and the spatial distribution. The uncertainties in the thermodynamic enviroments in the model simulations will be compared with high-resolution observations from the Plains Elevated Convection At Night (PECAN) field experiment in 2015. The results have implications for the paradigms of "surface-based" versus "elevated" convection, as well as for the predictability of warm-season convective rainfall.

  1. Adjustment of spatio-temporal precipitation patterns in a high Alpine environment

    NASA Astrophysics Data System (ADS)

    Herrnegger, Mathew; Senoner, Tobias; Nachtnebel, Hans-Peter

    2018-01-01

    This contribution presents a method for correcting the spatial and temporal distribution of precipitation fields in a mountainous environment. The approach is applied within a flood forecasting model in the Upper Enns catchment in the Central Austrian Alps. Precipitation exhibits a large spatio-temporal variability in Alpine areas. Additionally the density of the monitoring network is low and measurements are subjected to major errors. This can lead to significant deficits in water balance estimation and stream flow simulations, e.g. for flood forecasting models. Therefore precipitation correction factors are frequently applied. For the presented study a multiplicative, stepwise linear correction model is implemented in the rainfall-runoff model COSERO to adjust the precipitation pattern as a function of elevation. To account for the local meteorological conditions, the correction model is derived for two elevation zones: (1) Valley floors to 2000 m a.s.l. and (2) above 2000 m a.s.l. to mountain peaks. Measurement errors also depend on the precipitation type, with higher magnitudes in winter months during snow fall. Therefore, additionally, separate correction factors for winter and summer months are estimated. Significant improvements in the runoff simulations could be achieved, not only in the long-term water balance simulation and the overall model performance, but also in the simulation of flood peaks.

  2. The Strata-1 experiment on small body regolith segregation

    NASA Astrophysics Data System (ADS)

    Fries, Marc; Abell, Paul; Brisset, Julie; Britt, Daniel; Colwell, Joshua; Dove, Adrienne; Durda, Dan; Graham, Lee; Hartzell, Christine; Hrovat, Kenneth; John, Kristen; Karrer, Dakotah; Leonard, Matthew; Love, Stanley; Morgan, Joseph; Poppin, Jayme; Rodriguez, Vincent; Sánchez-Lana, Paul; Scheeres, Dan; Whizin, Akbar

    2018-01-01

    The Strata-1 experiment studies the mixing and segregation dynamics of regolith on small bodies by exposing a suite of regolith simulants to the microgravity environment aboard the International Space Station (ISS) for one year. This will improve our understanding of regolith dynamics and properties on small asteroids, and may assist in interpreting analyses of samples from missions to small bodies such as OSIRIS-REx, Hayabusa-1 and -2, and future missions to small bodies. The Strata-1 experiment consists of four evacuated tubes partially filled with regolith simulants. The simulants are chosen to represent models of regolith covering a range of complexity and tailored to inform and improve computational studies. The four tubes are regularly imaged while moving in response to the ambient vibrational environment using dedicated cameras. The imagery is then downlinked to the Strata-1 science team about every two months. Analyses performed on the imagery includes evaluating the extent of the segregation of Strata-1 samples and comparing the observations to computational models. After Strata-1's return to Earth, x-ray tomography and optical microscopy will be used to study the post-flight simulant distribution. Strata-1 is also a pathfinder for the new "1E" ISS payload class, which is intended to simplify and accelerate emplacement of experiments on board ISS.

  3. A Testbed for Evaluating Lunar Habitat Autonomy Architectures

    NASA Technical Reports Server (NTRS)

    Lawler, Dennis G.

    2008-01-01

    A lunar outpost will involve a habitat with an integrated set of hardware and software that will maintain a safe environment for human activities. There is a desire for a paradigm shift whereby crew will be the primary mission operators, not ground controllers. There will also be significant periods when the outpost is uncrewed. This will require that significant automation software be resident in the habitat to maintain all system functions and respond to faults. JSC is developing a testbed to allow for early testing and evaluation of different autonomy architectures. This will allow evaluation of different software configurations in order to: 1) understand different operational concepts; 2) assess the impact of failures and perturbations on the system; and 3) mitigate software and hardware integration risks. The testbed will provide an environment in which habitat hardware simulations can interact with autonomous control software. Faults can be injected into the simulations and different mission scenarios can be scripted. The testbed allows for logging, replaying and re-initializing mission scenarios. An initial testbed configuration has been developed by combining an existing life support simulation and an existing simulation of the space station power distribution system. Results from this initial configuration will be presented along with suggested requirements and designs for the incremental development of a more sophisticated lunar habitat testbed.

  4. Simulating Exposure Concentrations of Engineered Nanomaterials in Surface Water Systems: Release of WASP8

    NASA Astrophysics Data System (ADS)

    Knightes, C. D.; Bouchard, D.; Zepp, R. G.; Henderson, W. M.; Han, Y.; Hsieh, H. S.; Avant, B. K.; Acrey, B.; Spear, J.

    2017-12-01

    The unique properties of engineered nanomaterials led to their increased production and potential release into the environment. Currently available environmental fate models developed for traditional contaminants are limited in their ability to simulate nanomaterials' environmental behavior. This is due to an incomplete understanding and representation of the processes governing nanomaterial distribution in the environment and by scarce empirical data quantifying the interaction of nanomaterials with environmental surfaces. The well-known Water Quality Analysis Simulation Program (WASP) was updated to incorporate nanomaterial-specific processes, specifically hetero-aggregation with particulate matter. In parallel with this effort, laboratory studies were used to quantify parameter values parameters necessary for governing processes in surface waters. This presentation will discuss the recent developments in the new architecture for WASP8 and the newly constructed Advanced Toxicant Module. The module includes advanced algorithms for increased numbers of state variables: chemicals, solids, dissolved organic matter, pathogens, temperature, and salinity. This presentation will focus specifically on the incorporation of nanomaterials, with the applications of the fate and transport of hypothetical releases of Multi-Walled Carbon Nanotubes (MWCNT) and Graphene Oxide (GO) into the headwaters of a southeastern US coastal plains river. While this presentation focuses on nanomaterials, the advanced toxicant module can also simulate metals and organic contaminants.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salko, Robert K; Sung, Yixing; Kucukboyaci, Vefa

    The Virtual Environment for Reactor Applications core simulator (VERA-CS) being developed by the Consortium for the Advanced Simulation of Light Water Reactors (CASL) includes coupled neutronics, thermal-hydraulics, and fuel temperature components with an isotopic depletion capability. The neutronics capability employed is based on MPACT, a three-dimensional (3-D) whole core transport code. The thermal-hydraulics and fuel temperature models are provided by the COBRA-TF (CTF) subchannel code. As part of the CASL development program, the VERA-CS (MPACT/CTF) code system was applied to model and simulate reactor core response with respect to departure from nucleate boiling ratio (DNBR) at the limiting time stepmore » of a postulated pressurized water reactor (PWR) main steamline break (MSLB) event initiated at the hot zero power (HZP), either with offsite power available and the reactor coolant pumps in operation (high-flow case) or without offsite power where the reactor core is cooled through natural circulation (low-flow case). The VERA-CS simulation was based on core boundary conditions from the RETRAN-02 system transient calculations and STAR-CCM+ computational fluid dynamics (CFD) core inlet distribution calculations. The evaluation indicated that the VERA-CS code system is capable of modeling and simulating quasi-steady state reactor core response under the steamline break (SLB) accident condition, the results are insensitive to uncertainties in the inlet flow distributions from the CFD simulations, and the high-flow case is more DNB limiting than the low-flow case.« less

  6. Prediction of Multiple-Trait and Multiple-Environment Genomic Data Using Recommender Systems.

    PubMed

    Montesinos-López, Osval A; Montesinos-López, Abelardo; Crossa, José; Montesinos-López, José C; Mota-Sanchez, David; Estrada-González, Fermín; Gillberg, Jussi; Singh, Ravi; Mondal, Suchismita; Juliana, Philomin

    2018-01-04

    In genomic-enabled prediction, the task of improving the accuracy of the prediction of lines in environments is difficult because the available information is generally sparse and usually has low correlations between traits. In current genomic selection, although researchers have a large amount of information and appropriate statistical models to process it, there is still limited computing efficiency to do so. Although some statistical models are usually mathematically elegant, many of them are also computationally inefficient, and they are impractical for many traits, lines, environments, and years because they need to sample from huge normal multivariate distributions. For these reasons, this study explores two recommender systems: item-based collaborative filtering (IBCF) and the matrix factorization algorithm (MF) in the context of multiple traits and multiple environments. The IBCF and MF methods were compared with two conventional methods on simulated and real data. Results of the simulated and real data sets show that the IBCF technique was slightly better in terms of prediction accuracy than the two conventional methods and the MF method when the correlation was moderately high. The IBCF technique is very attractive because it produces good predictions when there is high correlation between items (environment-trait combinations) and its implementation is computationally feasible, which can be useful for plant breeders who deal with very large data sets. Copyright © 2018 Montesinos-Lopez et al.

  7. Freud: a software suite for high-throughput simulation analysis

    NASA Astrophysics Data System (ADS)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C + + analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.

  8. Revealing Hidden Conformational Space of LOV Protein VIVID Through Rigid Residue Scan Simulations

    NASA Astrophysics Data System (ADS)

    Zhou, Hongyu; Zoltowski, Brian D.; Tao, Peng

    2017-04-01

    VIVID(VVD) protein is a Light-Oxygen-Voltage(LOV) domain in circadian clock system. Upon blue light activation, a covalent bond is formed between VVD residue Cys108 and its cofactor flavin adenine dinucleotide(FAD), and prompts VVD switching from Dark state to Light state with significant conformational deviation. However, the mechanism of this local environment initiated global protein conformational change remains elusive. We employed a recently developed computational approach, rigid residue scan(RRS), to systematically probe the impact of the internal degrees of freedom in each amino acid residue of VVD on its overall dynamics by applying rigid body constraint on each residue in molecular dynamics simulations. Key residues were identified with distinctive impacts on Dark and Light states, respectively. All the simulations display wide range of distribution on a two-dimensional(2D) plot upon structural root-mean-square deviations(RMSD) from either Dark or Light state. Clustering analysis of the 2D RMSD distribution leads to 15 representative structures with drastically different conformation of N-terminus, which is also a key difference between Dark and Light states of VVD. Further principle component analyses(PCA) of RRS simulations agree with the observation of distinctive impact from individual residues on Dark and Light states.

  9. Characterization of Surface Reflectance Variation Effects on Remote Sensing

    NASA Technical Reports Server (NTRS)

    Pearce, W. A.

    1984-01-01

    The use of Monte Carlo radiative transfer codes to simulate the effects on remote sensing in visible and infrared wavelengths of variables which affect classification is examined. These variables include detector viewing angle, atmospheric aerosol size distribution, aerosol vertical and horizontal distribution (e.g., finite clouds), the form of the bidirectional ground reflectance function, and horizontal variability of reflectance type and reflectivity (albedo). These simulations are used to characterize the sensitivity of observables (intensity and polarization) to variations in the underlying physical parameters both to improve algorithms for the removal of atmospheric effects and to identify techniques which can improve classification accuracy. It was necessary to revise and validate the simulation codes (CTRANS, ARTRAN, and the Mie scattering code) to improve efficiency and accommodate a new operational environment, and to build the basic software tools for acquisition and off-line manipulation of simulation results. Initial calculations compare cases in which increasing amounts of aerosol are shifted into the stratosphere, maintaining a constant optical depth. In the case of moderate aerosol optical depth, the effect on the spread function is to scale it linearly as would be expected from a single scattering model. Varying the viewing angle appears to provide the same qualitative effect as modifying the vertical optical depth (for Lambertian ground reflectance).

  10. Revealing Hidden Conformational Space of LOV Protein VIVID Through Rigid Residue Scan Simulations

    PubMed Central

    Zhou, Hongyu; Zoltowski, Brian D.; Tao, Peng

    2017-01-01

    VIVID(VVD) protein is a Light-Oxygen-Voltage(LOV) domain in circadian clock system. Upon blue light activation, a covalent bond is formed between VVD residue Cys108 and its cofactor flavin adenine dinucleotide(FAD), and prompts VVD switching from Dark state to Light state with significant conformational deviation. However, the mechanism of this local environment initiated global protein conformational change remains elusive. We employed a recently developed computational approach, rigid residue scan(RRS), to systematically probe the impact of the internal degrees of freedom in each amino acid residue of VVD on its overall dynamics by applying rigid body constraint on each residue in molecular dynamics simulations. Key residues were identified with distinctive impacts on Dark and Light states, respectively. All the simulations display wide range of distribution on a two-dimensional(2D) plot upon structural root-mean-square deviations(RMSD) from either Dark or Light state. Clustering analysis of the 2D RMSD distribution leads to 15 representative structures with drastically different conformation of N-terminus, which is also a key difference between Dark and Light states of VVD. Further principle component analyses(PCA) of RRS simulations agree with the observation of distinctive impact from individual residues on Dark and Light states. PMID:28425502

  11. Geostatistical conditional simulation for the assessment of contaminated land by abandoned heavy metal mining.

    PubMed

    Ersoy, Adem; Yunsel, Tayfun Yusuf; Atici, Umit

    2008-02-01

    Abandoned mine workings can undoubtedly cause varying degrees of contamination of soil with heavy metals such as lead and zinc has occurred on a global scale. Exposure to these elements may cause to harm human health and environment. In the study, a total of 269 soil samples were collected at 1, 5, and 10 m regular grid intervals of 100 x 100 m area of Carsington Pasture in the UK. Cell declustering technique was applied to the data set due to no statistical representativity. Directional experimental semivariograms of the elements for the transformed data showed that both geometric and zonal anisotropy exists in the data. The most evident spatial dependence structure of the continuity for the directional experimental semivariogram, characterized by spherical and exponential models of Pb and Zn were obtained. This study reports the spatial distribution and uncertainty of Pb and Zn concentrations in soil at the study site using a probabilistic approach. The approach was based on geostatistical sequential Gaussian simulation (SGS), which is used to yield a series of conditional images characterized by equally probable spatial distributions of the heavy elements concentrations across the area. Postprocessing of many simulations allowed the mapping of contaminated and uncontaminated areas, and provided a model for the uncertainty in the spatial distribution of element concentrations. Maps of the simulated Pb and Zn concentrations revealed the extent and severity of contamination. SGS was validated by statistics, histogram, variogram reproduction, and simulation errors. The maps of the elements might be used in the remediation studies, help decision-makers and others involved in the abandoned heavy metal mining site in the world.

  12. CFD simulation of a cabin thermal environment with and without human body - thermal comfort evaluation

    NASA Astrophysics Data System (ADS)

    Danca, Paul; Bode, Florin; Nastase, Ilinca; Meslem, Amina

    2018-02-01

    Nowadays, thermal comfort became one of the criteria in choosing a vehicle. In last decades time spent by people in vehicles had risen substantially. During each trip, thermal comfort must to be ensured for a good psychological and physical state of the passengers. Also, a comfortable environment leads to a higher power concentration of the driver thereby to a safe trip for vehicle occupants and for all traffic participants. The present study numerically investigated the effect of human body sited in the driver's place, over the air velocity distribution and over the thermal comfort in a passenger compartment. CFD simulations were made with different angles of the left inlet grill, in both cases, with and without driver presence. In majority of the actual vehicles environment studies, are made without consideration of human body geometry, in this case, the results precision can be affected. The results show that the presence of human body, lead to global changing of the whole flow pattern inside the vehicular cabin. Also, the locations of the maximum velocities are changing with the angle of the guiding vanes. The thermal comfort PMV/PPD indexes were calculated for each case. The presence of human body leads to a more comfortable environment.

  13. The influence of different training schedules on the learning of psychomotor skills for endoscopic surgery.

    PubMed

    Verdaasdonk, E G G; Stassen, L P S; van Wijk, R P J; Dankelman, J

    2007-02-01

    Psychomotor skills for endoscopic surgery can be trained with virtual reality simulators. Distributed training is more effective than massed training, but it is unclear whether distributed training over several days is more effective than distributed training within 1 day. This study aimed to determine which of these two options is the most effective for training endoscopic psychomotor skills. Students with no endoscopic experience were randomly assigned either to distributed training on 3 consecutive days (group A, n = 10) or distributed training within 1 day (group B, n = 10). For this study the SIMENDO virtual reality simulator for endoscopic skills was used. The training involved 12 repetitions of three different exercises (drop balls, needle manipulation, 30 degree endoscope) in differently distributed training schedules. All the participants performed a posttraining test (posttest) for the trained tasks 7 days after the training. The parameters measured were time, nontarget environment collisions, and instrument path length. There were no significant differences between the groups in the first training session for all the parameters. In the posttest, group A (training over several days) performed 18.7% faster than group B (training on 1 day) (p = 0.013). The collision and path length scores for group A did not differ significantly from the scores for group B. The distributed group trained over several days was faster, with the same number of errors and the same instrument path length used. Psychomotor skill training for endoscopic surgery distributed over several days is superior to training on 1 day.

  14. Saturn's Magnetosphere Interaction with Titan for T9 Encounter: 3D Hybrid Modeling and Comparison with CAPS Observations

    NASA Technical Reports Server (NTRS)

    Lipatov, A. S.; Sittler, E. C., Jr.; Hartle, R. E.; Cooper, J. F.; Simpson, D. G.

    2011-01-01

    Global dynamics of ionized and neutral gases in the environment of Titan plays an important role in the interaction of Saturn's magnetosphere with Titan. Several hybrid simulations of this problem have already been done (Brecht et al., 2000; Kallio et al., 2004; Modolo et al., 2007a; Simon et al., 2007a, 2007b; Modolo and Chanteur, 2008). Observational data from CAPS for the T9 encounter (Sittler et al., 2009) indicate an absence of O(+) heavy ions upstream, which changes the models of interaction discussed in recent publications (Kallio et al., 2004; Modolo et al., 2007a; Simon et al., 2007a, 2007b; Ma et al., 2007; Szego et al., 2007). Further analysis of the CAPS data shows a very low density, or even an absence, of H(+) ions upstream. In this paper we discuss two models of the interaction of Saturn's magnetosphere with Titan: (A) a high density of H(+) ions in the upstream flow (0.1/cu cm), and (B) a low density of H(+) ions in the upstream flow (0.02/cu cm). The hybrid model employs a fluid description for electrons and neutrals, whereas a particle approach is used for ions. We also take into account charge-exchange and photoionization processes and solve self-consistently for electric and magnetic fields. The model atmosphere includes exospheric H(+), H(2+), N(2+) and CH(4+) pickup ion production as well as an immobile background ionosphere and a shell distribution for active ionospheric ions (M(sub i) = 28 amu). The hybrid model allows us to account for the realistic anisotropic ion velocity distribution, which cannot be captured in fluid simulations with isotropic temperatures. Our simulation shows an asymmetry of the ion density distribution and the magnetic field, including the formation of Alfven wing-like structures. The results for the ion dynamics in Titan's environment are compared with Cassini T9 encounter data (CAPS).

  15. Analysis of High Grazing Angle Sea-clutter with the KK-Distribution

    DTIC Science & Technology

    2013-11-01

    work undertaken at the DSTO in characterising the maritime environment from high altitude airborne platforms. The focus of this report is to characterise...multichannel synthetic aperture radar through Adelaide University. He has worked at the DSTO as an RF engineer in the missile simulation centre, as a...with the Cooperative Research Centre for Sensor, Signal and Information Processing where he worked in the Pattern Recognition Group on the application

  16. Program of Basic Research in Distributed Tactical Decision Making.

    DTIC Science & Technology

    1987-08-05

    computer-simulated game representing a "space war" battle context were devised and two experiments were conducted to test some of the underlying...assume that advanced communication and computation of ever increasing capabilities will ensure successful group performance simply by improving the...There was a total of 12 subjects, three in each condition. Apparatus: A computer-controlled DTDM environment was developed using a VAX-11/750. The DTDM

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolf, Steffen; Gerwert, Klaus, E-mail: gerwert@bph.rub.de; Department of Biophysics, Chinese Academy of Sciences, Max-Planck-Gesellschaft Partner Institute for Computational Biology, 320 Yue Yang Road, 200031 Shanghai

    Proton conduction along protein-bound “water wires” is an essential feature in membrane proteins. Here, we analyze in detail a transient water wire, which conducts protons via a hydrophobic barrier within a membrane protein to create a proton gradient. It is formed only for a millisecond out of three water molecules distributed at inactive positions in a polar environment in the ground state. The movement into a hydrophobic environment causes characteristic shifts of the water bands reflecting their different chemical properties. These band shifts are identified by time-resolved Fourier Transform Infrared difference spectroscopy and analyzed by biomolecular Quantum Mechanical/Molecular Mechanical simulations. A non-hydrogen bonded (“dangling”) O–H stretching vibration band and a broad continuum absorbance caused by a combined vibration along the water wire are identified as characteristic marker bands of such water wires in a hydrophobic environment. The results provide a basic understanding of water wires in hydrophobic environments.

  18. Infrared spectral marker bands characterizing a transient water wire inside a hydrophobic membrane protein.

    PubMed

    Wolf, Steffen; Freier, Erik; Cui, Qiang; Gerwert, Klaus

    2014-12-14

    Proton conduction along protein-bound "water wires" is an essential feature in membrane proteins. Here, we analyze in detail a transient water wire, which conducts protons via a hydrophobic barrier within a membrane protein to create a proton gradient. It is formed only for a millisecond out of three water molecules distributed at inactive positions in a polar environment in the ground state. The movement into a hydrophobic environment causes characteristic shifts of the water bands reflecting their different chemical properties. These band shifts are identified by time-resolved Fourier Transform Infrared difference spectroscopy and analyzed by biomolecular Quantum Mechanical/Molecular Mechanical simulations. A non-hydrogen bonded ("dangling") O-H stretching vibration band and a broad continuum absorbance caused by a combined vibration along the water wire are identified as characteristic marker bands of such water wires in a hydrophobic environment. The results provide a basic understanding of water wires in hydrophobic environments.

  19. Difficulties in applying numerical simulations to an evaluation of occupational hazards caused by electromagnetic fields

    PubMed Central

    Zradziński, Patryk

    2015-01-01

    Due to the various physical mechanisms of interaction between a worker's body and the electromagnetic field at various frequencies, the principles of numerical simulations have been discussed for three areas of worker exposure: to low frequency magnetic field, to low and intermediate frequency electric field and to radiofrequency electromagnetic field. This paper presents the identified difficulties in applying numerical simulations to evaluate physical estimators of direct and indirect effects of exposure to electromagnetic fields at various frequencies. The exposure of workers operating a plastic sealer has been taken as an example scenario of electromagnetic field exposure at the workplace for the discussion of these difficulties in applying numerical simulations. The following difficulties in reliable numerical simulations of workers' exposure to the electromagnetic field have been considered: workers' body models (posture, dimensions, shape and grounding conditions), working environment models (objects most influencing electromagnetic field distribution) and an analysis of parameters for which exposure limitations are specified in international guidelines and standards. PMID:26323781

  20. Meteorological and Land Surface Properties Impacting Sea Breeze Extent and Aerosol Distribution in a Dry Environment

    NASA Astrophysics Data System (ADS)

    Igel, Adele L.; van den Heever, Susan C.; Johnson, Jill S.

    2018-01-01

    The properties of sea breeze circulations are influenced by a variety of meteorological and geophysical factors that interact with one another. These circulations can redistribute aerosol particles and pollution and therefore can play an important role in local air quality, as well as impact remote sensing. In this study, we select 11 factors that have the potential to impact either the sea breeze circulation properties and/or the spatial distribution of aerosols. Simulations are run to identify which of the 11 factors have the largest influence on the sea breeze properties and aerosol concentrations and to subsequently understand the mean response of these variables to the selected factors. All simulations are designed to be representative of conditions in coastal subtropical environments and are thus relatively dry; as such, they do not support deep convection associated with the sea breeze front. For this dry sea breeze regime, we find that the background wind speed is the most influential factor for sea breeze propagation, with the soil saturation fraction also being important. For the spatial aerosol distribution, the most important factors are the soil moisture, the sea-air temperature difference, and the initial boundary layer height. The importance of these factors seems to be strongly tied to the development of the surface-based mixed layer both ahead of and behind the sea breeze front. This study highlights potential avenues for further research regarding sea breeze dynamics and the impact of sea breeze circulations on pollution dispersion and remote sensing algorithms.

  1. Chemotactic preferences govern competition and pattern formation in simulated two-strain microbial communities.

    PubMed

    Centler, Florian; Thullner, Martin

    2015-01-01

    Substrate competition is a common mode of microbial interaction in natural environments. While growth properties play an important and well-studied role in competition, we here focus on the influence of motility. In a simulated two-strain community populating a homogeneous two-dimensional environment, strains competed for a common substrate and differed only in their chemotactic preference, responding more sensitively either to a chemoattractant excreted by themselves or to the substrate. Starting from homogeneous distributions, three possible behaviors were observed depending on the competitors' chemotactic preferences: (i) distributions remained homogeneous, (ii) patterns formed but dissolved at a later time point, resulting in a shifted community composition, and (iii) patterns emerged and led to the extinction of one strain. When patterns formed, the more aggregating strain populated the core of microbial aggregates where starving conditions prevailed, while the less aggregating strain populated the more productive zones at the fringe of or outside aggregates, leading to a competitive advantage for the less aggregating strain. The presence of a competitor was found to modulate a strain's behavior, either suppressing or promoting aggregate formation. This observation provides a potential mechanism by which an aggregated lifestyle might evolve even if it is initially disadvantageous: adverse effects can be avoided while a competitor hinders aggregate formation by a strain which has just acquired this ability. The presented results highlight both the importance of microbial motility for competition and pattern formation and the importance of the temporal evolution, or history, of microbial communities when trying to explain an observed distribution.

  2. Applying genetic algorithms for calibrating a hexagonal cellular automata model for the simulation of debris flows characterised by strong inertial effects

    NASA Astrophysics Data System (ADS)

    Iovine, G.; D'Ambrosio, D.; Di Gregorio, S.

    2005-03-01

    In modelling complex a-centric phenomena which evolve through local interactions within a discrete time-space, cellular automata (CA) represent a valid alternative to standard solution methods based on differential equations. Flow-type phenomena (such as lava flows, pyroclastic flows, earth flows, and debris flows) can be viewed as a-centric dynamical systems, and they can therefore be properly investigated in CA terms. SCIDDICA S4a is the latest release of a two-dimensional hexagonal CA model for simulating debris flows characterised by strong inertial effects. S4a has been obtained by progressively enriching an initial simplified model, originally derived for simulating very simple cases of slow-moving flow-type landslides. Using an empirical strategy, in S4a the inertial character of the flowing mass is translated into CA terms by means of local rules. In particular, in the transition function of the model, the distribution of landslide debris among the cells is obtained through a double cycle of computation. In the first phase, the inertial character of the landslide debris is taken into account by considering indicators of momentum. In the second phase, any remaining debris in the central cell is distributed among the adjacent cells, according to the principle of maximum possible equilibrium. The complexity of the model and of the phenomena to be simulated suggested the need for an automated evaluation technique to determine the best set of global parameters. Accordingly, the model is calibrated using a genetic algorithm and by considering the May 1998 Curti-Sarno (Southern Italy) debris flow. The boundaries of the area affected by the debris flow are simulated well with the model. Errors computed by comparing the simulations with the mapped areal extent of the actual landslide are smaller than those previously obtained without genetic algorithms.
As the experiments have been realised in a sequential computing environment, they could be improved by adopting a parallel environment, which allows the performance of a great number of tests in reasonable times.
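The calibration loop described above can be sketched as a generic genetic algorithm that searches the global-parameter space for the smallest simulation/observation mismatch. This is a minimal sketch under stated assumptions: SCIDDICA's actual transition-function parameters and fitness measure are not reproduced here, so the error function below is a toy stand-in for the simulator.

```python
import random

def calibrate(error_fn, bounds, pop_size=20, generations=40,
              mutation_rate=0.2, seed=0):
    """Genetic-algorithm calibration of a model's global parameters.
    error_fn maps a parameter vector to a non-negative mismatch score,
    e.g. between the simulated and the mapped landslide extent.
    bounds is a list of (lo, hi) pairs, one per parameter."""
    rnd = random.Random(seed)
    pop = [[rnd.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=error_fn)
        elite = pop[:pop_size // 4]              # elitism: keep best quarter
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rnd.sample(elite, 2)
            cut = rnd.randrange(1, len(bounds))  # one-point crossover
            child = a[:cut] + b[cut:]
            for i, (lo, hi) in enumerate(bounds):
                if rnd.random() < mutation_rate:  # random-reset mutation
                    child[i] = rnd.uniform(lo, hi)
            children.append(child)
        pop = elite + children
    return min(pop, key=error_fn)

# Toy stand-in for the simulator: the error is the squared distance from a
# known optimum, so the GA should drive parameters towards (3.0, -1.0).
best = calibrate(lambda p: (p[0] - 3.0) ** 2 + (p[1] + 1.0) ** 2,
                 bounds=[(-5.0, 5.0), (-5.0, 5.0)])
```

Because each candidate evaluation is an independent model run, this loop parallelizes naturally, which is the point made above about moving to a parallel computing environment.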

  3. Subsurface conditions in hydrothermal vents inferred from diffuse flow composition, and models of reaction and transport

    NASA Astrophysics Data System (ADS)

    Larson, B. I.; Houghton, J. L.; Lowell, R. P.; Farough, A.; Meile, C. D.

    2015-08-01

    Chemical gradients in the subsurface of mid-ocean ridge hydrothermal systems create an environment where minerals precipitate and dissolve and where chemosynthetic organisms thrive. However, owing to the lack of easy access to the subsurface, robust knowledge of the nature and extent of chemical transformations remains elusive. Here, we combine measurements of vent fluid chemistry with geochemical and transport modeling to give new insights into the under-sampled subsurface. Temperature-composition relationships from a geochemical mixing model are superimposed on the subsurface temperature distribution determined using a heat flow model to estimate the spatial distribution of fluid composition. We then estimate the distribution of Gibbs free energies of reaction beneath mid-ocean ridges and, by combining flow simulations with speciation calculations, estimate anhydrite deposition rates. Applied to vent endmembers observed at the fast-spreading ridge at the East Pacific Rise, our results suggest that sealing times due to anhydrite formation are longer than the typical time between tectonic and magmatic events. The chemical composition of the neighboring low temperature flow indicates relatively uniform, energetically favorable conditions for commonly inferred microbial processes such as methanogenesis, sulfate reduction and numerous oxidation reactions, suggesting that factors other than energy availability may control subsurface microbial biomass distribution. Thus, these model simulations complement fluid-sample datasets from surface venting and help infer the chemical distribution and transformations in subsurface flow.

  4. Thermodynamics of Macromolecular Association in Heterogeneous Crowding Environments: Theoretical and Simulation Studies with a Simplified Model.

    PubMed

    Ando, Tadashi; Yu, Isseki; Feig, Michael; Sugita, Yuji

    2016-11-23

    The cytoplasm of a cell is crowded with many different kinds of macromolecules. The macromolecular crowding affects the thermodynamics and kinetics of biological reactions in a living cell, such as protein folding, association, and diffusion. Theoretical and simulation studies using simplified models focus on the essential features of the crowding effects and provide a basis for analyzing experimental data. In most of the previous studies on the crowding effects, a uniform crowder size is assumed, which is in contrast to the inhomogeneous size distribution of macromolecules in a living cell. Here, we evaluate the free energy changes upon macromolecular association in a cell-like inhomogeneous crowding system via a theory of hard-sphere fluids and free energy calculations using Brownian dynamics trajectories. The inhomogeneous crowding model based on 41 different types of macromolecules represented by spheres with different radii mimics the physiological concentrations of macromolecules in the cytoplasm of Mycoplasma genitalium. The free energy changes of macromolecular association evaluated by the theory and simulations were in good agreement with each other. The crowder size distribution affects both specific and nonspecific molecular associations, suggesting that not only the volume fraction but also the size distribution of macromolecules are important factors for evaluating in vivo crowding effects. This study relates in vitro experiments on macromolecular crowding to in vivo crowding effects by using the theory of hard-sphere fluids with crowder-size heterogeneity.

  5. Simulating neutron star mergers as r-process sources in ultrafaint dwarf galaxies

    NASA Astrophysics Data System (ADS)

    Safarzadeh, Mohammadtaher; Scannapieco, Evan

    2017-10-01

    To explain the high observed abundances of r-process elements in local ultrafaint dwarf (UFD) galaxies, we perform cosmological zoom simulations that include r-process production from neutron star mergers (NSMs). We model star formation stochastically and simulate two different haloes with total masses ≈10⁸ M⊙ at z = 6. We find that the final distribution of [Eu/H] versus [Fe/H] is relatively insensitive to the energy by which the r-process material is ejected into the interstellar medium, but strongly sensitive to the environment in which the NSM event occurs. In one halo, the NSM event takes place at the centre of the stellar distribution, leading to high levels of r-process enrichment such as seen in a local UFD, Reticulum II (Ret II). In a second halo, the NSM event takes place outside of the densest part of the galaxy, leading to a more extended r-process distribution. The subsequent star formation occurs in an interstellar medium with shallow levels of r-process enrichment that results in stars with low levels of [Eu/H] compared to Ret II stars even when the maximum possible r-process mass is assumed to be ejected. This suggests that the natal kicks of neutron stars may also play an important role in determining the r-process abundances in UFD galaxies, a topic that warrants further theoretical investigation.

  6. Glimpsing the imprint of local environment on the galaxy stellar mass function

    NASA Astrophysics Data System (ADS)

    Tomczak, Adam R.; Lemaux, Brian C.; Lubin, Lori M.; Gal, Roy R.; Wu, Po-Feng; Holden, Bradford; Kocevski, Dale D.; Mei, Simona; Pelliccia, Debora; Rumbaugh, Nicholas; Shen, Lu

    2017-12-01

    We investigate the impact of local environment on the galaxy stellar mass function (SMF) spanning a wide range of galaxy densities from the field up to dense cores of massive galaxy clusters. Data are drawn from a sample of eight fields from the Observations of Redshift Evolution in Large-Scale Environments (ORELSE) survey. Deep photometry allows us to select mass-complete samples of galaxies down to 10⁹ M⊙. Taking advantage of >4000 secure spectroscopic redshifts from ORELSE and precise photometric redshifts, we construct three-dimensional density maps between 0.55 < z < 1.3 using a Voronoi tessellation approach. We find that the shape of the SMF depends strongly on local environment, exhibited by a smooth, continual increase in the relative numbers of high- to low-mass galaxies towards denser environments. A straightforward implication is that local environment proportionally increases the efficiency of (a) the destruction of lower mass galaxies and/or (b) the growth of higher mass galaxies. We also find this environmental dependence in the SMFs of star-forming and quiescent galaxies, although not quite as strongly for the quiescent subsample. To characterize the connection between the SMF of field galaxies and that of denser environments, we devise a simple semi-empirical model. The model begins with a sample of ≈10⁶ galaxies at zstart = 5 with stellar masses distributed according to the field. Simulated galaxies then evolve down to zfinal = 0.8 following empirical prescriptions for star formation, quenching and galaxy-galaxy merging. We run the simulation multiple times, testing a variety of scenarios with differing overall amounts of merging. Our model suggests that a large number of mergers are required to reproduce the SMF in dense environments. Additionally, a large majority of these mergers would have to occur in intermediate density environments (e.g. galaxy groups).

  7. An Entropy Approach to Disclosure Risk Assessment: Lessons from Real Applications and Simulated Domains

    PubMed Central

    Airoldi, Edoardo M.; Bai, Xue; Malin, Bradley A.

    2011-01-01

    We live in an increasingly mobile world, which leads to the duplication of information across domains. Though organizations attempt to obscure the identities of their constituents when sharing information for worthwhile purposes, such as basic research, the uncoordinated nature of such environments can lead to privacy vulnerabilities. For instance, disparate healthcare providers can collect information on the same patient. Federal policy requires that such providers share "de-identified" sensitive data, such as biomedical (e.g., clinical and genomic) records. But at the same time, such providers can share identified information, devoid of sensitive biomedical data, for administrative functions. On a provider-by-provider basis, the biomedical and identified records appear unrelated; however, links can be established when multiple providers' databases are studied jointly. The problem, known as trail disclosure, is a generalized phenomenon and occurs because an individual's location access pattern can be matched across the shared databases. Due to technical and legal constraints, it is often difficult to coordinate between providers, and thus it is critical to assess the disclosure risk in distributed environments so that we can develop techniques to mitigate such risks. Research on privacy protection has so far focused on developing technologies to suppress or encrypt identifiers associated with sensitive information. There is a growing body of work on the formal assessment of the disclosure risk of database entries in publicly shared databases, but less attention has been paid to the distributed setting. In this research, we review the trail disclosure problem in several domains with known vulnerabilities and show that disclosure risk is influenced by the distribution of how people visit service providers. Based on empirical evidence, we propose an entropy metric for assessing such risk in shared databases prior to their release.
This metric assesses risk by leveraging the statistical characteristics of a visit distribution, as opposed to person-level data. It is computationally efficient and superior to existing risk assessment methods, which rely on ad hoc assessments that are often computationally expensive and unreliable. We evaluate our approach on a range of location access patterns in simulated environments. Our results demonstrate that the approach is effective at estimating trail disclosure risks and that the amount of self-information contained in a distributed system is one of the main driving factors. PMID:21647242
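The idea of an entropy measure over a visit distribution can be sketched with Shannon entropy; this is an illustrative metric in that spirit, not the paper's exact estimator:

```python
import math

def visit_entropy(visit_counts):
    """Shannon entropy (bits) of a distribution of visits across providers.
    Low entropy means visits concentrate on few providers, so a location
    access pattern ("trail") is more distinctive; high entropy means
    visits are spread out. Illustrative, not the paper's estimator."""
    total = sum(visit_counts)
    probs = [c / total for c in visit_counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

uniform = visit_entropy([25, 25, 25, 25])   # maximal spread: log2(4) = 2 bits
skewed = visit_entropy([97, 1, 1, 1])       # concentrated: close to 0 bits
```

A distribution-level quantity like this can be computed before release, without touching person-level records, which matches the computational-efficiency argument above.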

  8. The development and testing of the fast imaging plasma spectrometer and its application in the plasma environment at Mercury

    NASA Astrophysics Data System (ADS)

    Koehn, Patrick Leo

    The plasma environment at Mercury is a rich laboratory for studying the interaction of the solar wind with a planet. Three primary populations of ions exist at Mercury: solar wind, magnetospheric particles, and pickup ions. Pickup ions are generated through the ionization of Mercury's exosphere or are sputtered particles from the Mercury surface. A comprehensive mission to Mercury should include a sensor that is able to determine the dynamical properties and composition of all three plasma components. The Fast Imaging Plasma Spectrometer (FIPS) is an instrument to measure the composition of these ion populations and their three-dimensional velocity distribution functions. It is lightweight, fast, and has a very large field of view, and these properties made possible its accommodation within the highly mass-constrained payload of the MESSENGER (MErcury Surface, Space ENvironment, GEochemistry, and Ranging) mission, a Mercury orbiter. This work details the development cycle of FIPS, from concept to prototype testing. It begins with science studies of the magnetospheric and pickup ion environments of Mercury, using state-of-the-art computer simulations to produce static and quasi-dynamic magnetospheric systems. Predictions are made of the spatially variable plasma environment at Mercury, and the temporally varying magnetosphere-solar wind interaction is examined. Pickup ion studies provide insights into particle loss mechanisms and the nature of the radar-bright regions at the Hermean poles. These studies produce science requirements for successfully measuring this environment with an orbiting mass spectrometer. With these science requirements in mind, a concept for a new electrostatic analyzer is created. This concept is considered from a theoretical standpoint, and compared with other, similarly performing instruments, both of the past and currently in use.
The development cycle continues with instrument simulation, which allows the design to be adjusted to fit within the science requirements of the mission. Finally, a prototype electrostatic analyzer is constructed and tested in a space-simulating vacuum chamber system. The results of these tests are compared with the simulation results, and ultimately shown to fit within the science requirements for the MESSENGER mission.

  9. Evidence of Long Range Dependence and Self-similarity in Urban Traffic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thakur, Gautam S; Helmy, Ahmed; Hui, Pan

    2015-01-01

    Transportation simulation technologies should accurately model traffic demand, distribution, and assignment parameters for urban environment simulation. These three parameters significantly impact the transportation engineering benchmark process and are also critical in realizing realistic traffic modeling situations. In this paper, we model and characterize the traffic density distribution of thousands of locations around the world. The traffic densities are generated from millions of images collected over several years and processed using computer vision techniques. The resulting traffic density distribution time series are then analyzed. Using goodness-of-fit tests, it is found that the traffic density distributions follow heavy-tail models such as Log-gamma, Log-logistic, and Weibull in over 90% of analyzed locations. Moreover, a heavy tail gives rise to long-range dependence and self-similarity, which we studied by estimating the Hurst exponent (H). Our analysis based on seven different Hurst estimators strongly indicates that the traffic distribution patterns are stochastically self-similar (0.5 < H < 1.0). We believe this is an important finding that will influence the design and development of the next generation of traffic simulation techniques and also aid in accurately modeling the traffic engineering of urban systems. In addition, it shall provide a much-needed input for the development of smart cities.
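Hurst-exponent estimation of the kind described can be sketched with a basic rescaled-range (R/S) estimator, one member of the family of estimators such studies use (the window sizes and synthetic series below are illustrative):

```python
import numpy as np

def hurst_rs(series, min_window=8):
    """Estimate the Hurst exponent H by rescaled-range (R/S) analysis:
    regress log(R/S) against log(window size). H near 0.5 indicates
    uncorrelated noise; 0.5 < H < 1.0 indicates long-range dependence."""
    series = np.asarray(series, dtype=float)
    n = len(series)
    sizes, rs_means = [], []
    w = min_window
    while w <= n // 2:
        rs = []
        for start in range(0, n - w + 1, w):
            chunk = series[start:start + w]
            dev = np.cumsum(chunk - chunk.mean())   # cumulative deviation
            if chunk.std() > 0:
                rs.append((dev.max() - dev.min()) / chunk.std())
        sizes.append(w)
        rs_means.append(np.mean(rs))
        w *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
    return slope

rng = np.random.default_rng(42)
h_noise = hurst_rs(rng.standard_normal(4096))            # roughly 0.5
h_walk = hurst_rs(np.cumsum(rng.standard_normal(4096)))  # strongly persistent
```

In practice several estimators (as the authors used seven) are compared, since each has small-sample biases; the R/S estimator, for instance, tends to overestimate H slightly on short uncorrelated series.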

  10. Generation mechanism of nonlinear ultrasonic Lamb waves in thin plates with randomly distributed micro-cracks.

    PubMed

    Zhao, Youxuan; Li, Feilong; Cao, Peng; Liu, Yaolu; Zhang, Jianyu; Fu, Shaoyun; Zhang, Jun; Hu, Ning

    2017-08-01

    Since the identification of micro-cracks in engineering materials is very valuable in understanding the initial and slight changes in the mechanical properties of materials under complex working environments, numerical simulations of the propagation of the low-frequency S0 Lamb wave in thin plates with randomly distributed micro-cracks were performed to study the behavior of nonlinear Lamb waves. The results showed that while the influence of the randomly distributed micro-cracks on the phase velocity of the low-frequency S0 fundamental waves could be neglected, significant ultrasonic nonlinear effects caused by the randomly distributed micro-cracks were discovered, mainly manifesting as second harmonic generation. By using a Monte Carlo simulation method, we found that the acoustic nonlinearity parameter increased linearly with the micro-crack density and the size of the micro-crack zone, and that it was also related to the excitation frequency and the friction coefficient of the micro-crack surfaces. In addition, the nonlinear effect of waves reflected by the micro-cracks was more noticeable than that of the transmitted waves. This study theoretically reveals that the low-frequency S0 mode of Lamb waves can be used as the fundamental wave to quantitatively identify micro-cracks in thin plates.
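A common way to quantify the second-harmonic effect described above is the relative acoustic nonlinearity parameter β' = A2/A1², computed from the spectrum of the received waveform. This sketch uses a synthetic signal with exact integer periods, not the paper's simulation output:

```python
import numpy as np

def relative_nonlinearity(signal, fs, f0):
    """Relative acoustic nonlinearity parameter beta' = A2 / A1**2,
    where A1 and A2 are the spectral amplitudes at the fundamental
    frequency f0 and at the second harmonic 2*f0."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal)) * 2.0 / n   # single-sided amplitude
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    a1 = spectrum[np.argmin(np.abs(freqs - f0))]
    a2 = spectrum[np.argmin(np.abs(freqs - 2 * f0))]
    return a2 / a1**2

# Synthetic received waveform: fundamental plus a weak second harmonic.
fs, f0, n = 10_000.0, 100.0, 1000          # exact integer periods in window
t = np.arange(n) / fs
wave = 1.0 * np.sin(2 * np.pi * f0 * t) + 0.02 * np.sin(2 * np.pi * 2 * f0 * t)
beta = relative_nonlinearity(wave, fs, f0)
```

Tracking how β' grows with micro-crack density (or with propagation distance) is what makes this parameter usable for quantitative damage identification.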

  11. Path loss analysis in millimeter wave cellular systems for urban mobile communications

    NASA Astrophysics Data System (ADS)

    Rajagopalan, Ramesh; Hoffman, Mitchell

    2016-09-01

    The proliferation in the number of mobile devices and developments in cellular technology have led to an ever increasing demand for mobile data. The global bandwidth shortage facing wireless carriers today has motivated research into fifth generation (5G) cellular systems. In recent years, millimeter wave (mmW) frequencies between 30 and 300 GHz have been considered a promising technology for 5G systems. Such systems can offer a superior user experience by providing data rates that exceed one gigabit per second and latencies lower than a millisecond. However, there is little research on cellular mmW propagation in densely populated urban environments. Understanding the radio channel is a primary requirement for the optimal design of mmW systems. Radio propagation in mmW systems faces significant challenges due to rapidly varying channel conditions and intermittent connectivity. In this paper, we study the propagation of the mmW spectrum in an urban environment. We use a statistical model to simulate an urban environment with diverse building distributions. We perform extensive simulations to analyze the path loss behavior under both line of sight (LOS) and non line of sight (NLOS) conditions at the 28 GHz and 73 GHz mmW frequencies. We observe that the path loss approximates a logarithmic fit in both LOS and NLOS environments. Our simulations show that the omnidirectional free space path loss is approximately 30 dB higher for mmW systems than for current 3GPP cellular systems. To address this challenge, we propose using highly directional horn antennas with beamforming to reduce the path loss.
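The logarithmic fit mentioned above is conventionally expressed with the log-distance model PL(d) = PL(d0) + 10·n·log10(d/d0), and can be sketched as a least-squares regression; the exponent and intercept below are illustrative values, not the paper's measured results:

```python
import numpy as np

def fit_path_loss(d, pl, d0=1.0):
    """Fit the log-distance model PL(d) = PL(d0) + 10 n log10(d / d0),
    returning (path-loss exponent n, intercept PL(d0) in dB)."""
    x = 10.0 * np.log10(np.asarray(d, dtype=float) / d0)
    n, pl0 = np.polyfit(x, np.asarray(pl, dtype=float), 1)
    return n, pl0

# Synthetic NLOS-like data generated with exponent 3.2 and 61.4 dB at 1 m
# (hypothetical numbers), so the fit should recover those values exactly.
d = np.array([10.0, 30.0, 100.0, 200.0, 500.0])
pl = 61.4 + 10 * 3.2 * np.log10(d)
n, pl0 = fit_path_loss(d, pl)
```

Fitting LOS and NLOS measurements separately yields two exponents, with the NLOS exponent typically the larger of the two.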

  12. Pathogen transfer through environment-host contact: an agent-based queueing theoretic framework.

    PubMed

    Chen, Shi; Lenhart, Suzanne; Day, Judy D; Lee, Chihoon; Dulin, Michael; Lanzas, Cristina

    2017-11-02

    Queueing theory studies the properties of waiting queues and has been applied to investigate direct host-to-host transmitted disease dynamics, but its potential in modelling environmentally transmitted pathogens has not been fully explored. In this study, we provide a flexible and customizable queueing theory modelling framework with three major subroutines to study the in-hospital contact processes between environments and hosts and potential nosocomial pathogen transfer, where environments are servers and hosts are customers. Two types of servers with different parameters but the same utilization are investigated. We consider various forms of transfer functions that map contact duration to the amount of pathogen transfer based on existing literature. We propose a case study of simulated in-hospital contact processes and apply stochastic queues to analyse the amount of pathogen transfer under different transfer functions, and assume that pathogen amount decreases during the inter-arrival time. Different host behaviour (feedback and non-feedback) as well as initial pathogen distribution (whether in environment and/or in hosts) are also considered and simulated. We assess pathogen transfer and circulation under these various conditions and highlight the importance of the nonlinear interactions among contact processes, transfer functions and pathogen demography during the contact process. Our modelling framework can be readily extended to more complicated queueing networks to simulate more realistic situations by adjusting parameters such as the number and type of servers and customers, and adding extra subroutines. © The authors 2017. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved.
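    The ingredients of the contact-process model above (exponential inter-arrival and contact times, a transfer function mapping contact duration to pathogen transfer, and pathogen decay between arrivals) can be sketched as a toy single-server simulation. All rates and fractions below are invented for illustration, not parameters from the study.

```python
import math
import random

random.seed(1)

# One environmental "server" contacted by a stream of hosts. Contact
# duration maps to pathogen transfer via a saturating exponential, and
# the environmental load decays during inter-arrival gaps.
arrival_rate, service_rate = 1.0, 2.0   # M/M/1-style rates (per hour)
decay, transfer_frac = 0.1, 0.05        # decay rate; max transferable share
env_load, host_loads = 100.0, []        # initial pathogen in environment

for _ in range(1000):
    gap = random.expovariate(arrival_rate)
    env_load *= math.exp(-decay * gap)            # decay between arrivals
    duration = random.expovariate(service_rate)
    # saturating transfer function of contact duration
    moved = env_load * transfer_frac * (1 - math.exp(-duration))
    env_load -= moved
    host_loads.append(moved)

print(round(sum(host_loads), 2), round(env_load, 4))
```

The framework in the paper layers feedback behaviour, multiple server types and richer transfer functions on top of this basic loop.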

  13. How copying affects the amount, evenness and persistence of cultural knowledge: insights from the social learning strategies tournament

    PubMed Central

    Rendell, L.; Boyd, R.; Enquist, M.; Feldman, M. W.; Fogarty, L.; Laland, K. N.

    2011-01-01

    Darwinian processes should favour those individuals that deploy the most effective strategies for acquiring information about their environment. We organized a computer-based tournament to investigate which learning strategies would perform well in a changing environment. The most successful strategies relied almost exclusively on social learning (here, learning a behaviour performed by another individual) rather than asocial learning, even when environments were changing rapidly; moreover, successful strategies focused learning effort on periods of environmental change. Here, we use data from tournament simulations to examine how these strategies might affect cultural evolution, as reflected in the amount of culture (i.e. number of cultural traits) in the population, the distribution of cultural traits across individuals, and their persistence through time. We found that high levels of social learning are associated with a larger amount of more persistent knowledge, but a smaller amount of less persistent expressed behaviour, as well as more uneven distributions of behaviour, as individuals concentrated on exploiting a smaller subset of behaviour patterns. Increased rates of environmental change generated increases in the amount and evenness of behaviour. These observations suggest that copying confers on cultural populations an adaptive plasticity, allowing them to respond to changing environments rapidly by drawing on a wider knowledge base. PMID:21357234

  14. How copying affects the amount, evenness and persistence of cultural knowledge: insights from the social learning strategies tournament.

    PubMed

    Rendell, L; Boyd, R; Enquist, M; Feldman, M W; Fogarty, L; Laland, K N

    2011-04-12

    Darwinian processes should favour those individuals that deploy the most effective strategies for acquiring information about their environment. We organized a computer-based tournament to investigate which learning strategies would perform well in a changing environment. The most successful strategies relied almost exclusively on social learning (here, learning a behaviour performed by another individual) rather than asocial learning, even when environments were changing rapidly; moreover, successful strategies focused learning effort on periods of environmental change. Here, we use data from tournament simulations to examine how these strategies might affect cultural evolution, as reflected in the amount of culture (i.e. number of cultural traits) in the population, the distribution of cultural traits across individuals, and their persistence through time. We found that high levels of social learning are associated with a larger amount of more persistent knowledge, but a smaller amount of less persistent expressed behaviour, as well as more uneven distributions of behaviour, as individuals concentrated on exploiting a smaller subset of behaviour patterns. Increased rates of environmental change generated increases in the amount and evenness of behaviour. These observations suggest that copying confers on cultural populations an adaptive plasticity, allowing them to respond to changing environments rapidly by drawing on a wider knowledge base.

  15. Chaste: A test-driven approach to software development for biological modelling

    NASA Astrophysics Data System (ADS)

    Pitt-Francis, Joe; Pathmanathan, Pras; Bernabeu, Miguel O.; Bordas, Rafel; Cooper, Jonathan; Fletcher, Alexander G.; Mirams, Gary R.; Murray, Philip; Osborne, James M.; Walter, Alex; Chapman, S. Jon; Garny, Alan; van Leeuwen, Ingeborg M. M.; Maini, Philip K.; Rodríguez, Blanca; Waters, Sarah L.; Whiteley, Jonathan P.; Byrne, Helen M.; Gavaghan, David J.

    2009-12-01

    Chaste ('Cancer, heart and soft-tissue environment') is a software library and a set of test suites for computational simulations in the domain of biology. Current functionality has arisen from modelling in the fields of cancer, cardiac physiology and soft-tissue mechanics. It is released under the LGPL 2.1 licence. Chaste has been developed using agile programming methods. The project began in 2005 when it was reasoned that the modelling of a variety of physiological phenomena required both a generic mathematical modelling framework, and a generic computational/simulation framework. The Chaste project evolved from the Integrative Biology (IB) e-Science Project, an inter-institutional project aimed at developing a suitable IT infrastructure to support physiome-level computational modelling, with a primary focus on cardiac and cancer modelling.
    Program summary
    Program title: Chaste
    Catalogue identifier: AEFD_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFD_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: LGPL 2.1
    No. of lines in distributed program, including test data, etc.: 5 407 321
    No. of bytes in distributed program, including test data, etc.: 42 004 554
    Distribution format: tar.gz
    Programming language: C++
    Operating system: Unix
    Has the code been vectorised or parallelized?: Yes. Parallelized using MPI.
    RAM: <90 Megabytes for two of the scenarios described in Section 6 of the manuscript (Monodomain re-entry on a slab or Cylindrical crypt simulation). Up to 16 Gigabytes (distributed across processors) for full resolution bidomain cardiac simulation.
    Classification: 3.
    External routines: Boost, CodeSynthesis XSD, CxxTest, HDF5, METIS, MPI, PETSc, Triangle, Xerces
    Nature of problem: Chaste may be used for solving coupled ODE and PDE systems arising from modelling biological systems. Use of Chaste in two application areas is described in this paper: cardiac electrophysiology and intestinal crypt dynamics.
    Solution method: Coupled multi-physics with PDE, ODE and discrete mechanics simulation.
    Running time: The largest cardiac simulation described in the manuscript takes about 6 hours to run on a single 3 GHz core. See the results section (Section 6) of the manuscript for a discussion of parallel scaling.

  16. Open-source framework for power system transmission and distribution dynamics co-simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Fan, Rui; Daily, Jeff

    The promise of the smart grid entails more interactions between the transmission and distribution networks, and there is an immediate need for tools to provide the comprehensive modelling and simulation required to integrate operations at both transmission and distribution levels. Existing electromagnetic transient simulators can perform simulations with integration of transmission and distribution systems, but the computational burden is high for large-scale system analysis. For transient stability analysis, there are currently only separate tools for simulating transient dynamics of the transmission and distribution systems. In this paper, we introduce an open-source co-simulation framework, the Framework for Network Co-Simulation (FNCS), together with the decoupled simulation approach that links existing transmission and distribution dynamic simulators through FNCS. FNCS is a middleware interface and framework that manages the interaction and synchronization of the transmission and distribution simulators. Preliminary testing results show the validity and capability of the proposed open-source co-simulation framework and the decoupled co-simulation methodology.
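    A minimal sketch of the decoupled co-simulation idea: the transmission and distribution solvers advance in lock step, exchanging boundary-bus voltage and aggregate load at each step, as the middleware would coordinate. The two models below are toy placeholders with made-up coefficients, not the actual FNCS API or real power-flow solvers.

```python
# Decoupled transmission/distribution co-simulation, toy version.
# Each "simulator" is a one-line model; a middleware such as FNCS would
# broker these exchanges between real, independently running simulators.

def transmission_step(load_mw: float) -> float:
    """Return boundary bus voltage (p.u.) given the distribution load."""
    return 1.0 - 0.0005 * load_mw          # toy voltage droop with load

def distribution_step(voltage_pu: float) -> float:
    """Return aggregate feeder load (MW) given the boundary voltage."""
    return 40.0 * voltage_pu ** 2          # toy voltage-dependent load

voltage, load = 1.0, 40.0
for step in range(10):                     # synchronized time steps
    voltage = transmission_step(load)      # T-side solves with D's load
    load = distribution_step(voltage)      # D-side solves with T's voltage

print(round(voltage, 4), round(load, 3))
```

Because each side only sees the other's boundary values, the two simulators can remain separate codes; the coupled values settle toward a consistent operating point over the synchronized steps.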

  17. Trait-based Modeling of Larval Dispersal in the Gulf of Maine

    NASA Astrophysics Data System (ADS)

    Jones, B.; Richardson, D.; Follows, M. J.; Hill, C. N.; Solow, A.; Ji, R.

    2016-02-01

    Population connectivity of marine species is the inter-generational movement of individuals among geographically separated subpopulations and is a crucial determinant of population dynamics, community structure, and optimal management strategies. For many marine species, population connectivity is largely determined by the dispersal patterns that emerge from a pelagic larval phase. These dispersal patterns are a result of interactions between the physical environment, adult spawning strategy, and larval ecology. Using a generalized trait-based model that represents the adult spawning strategy as a distribution of larval releases in time and space and the larval trait space with the pelagic larval duration, vertical swimming behavior, and settlement habitat preferences, we simulate dispersal patterns in the Gulf of Maine and surrounding regions. We implement this model as an individual-based simulation that tracks Lagrangian particles on a graphics processing unit as they move through hourly archived output from the Finite-Volume Community Ocean Model. The particles are released between the Hudson Canyon and Nova Scotia and the release distributions are determined using a novel method that minimizes the number of simulations required to achieve a predetermined level of precision for the connectivity matrices. The simulated larvae have a variable pelagic larval duration and exhibit multiple forms of dynamic depth-keeping behavior. We describe how these traits influence the dispersal trajectories and connectivity patterns among regions in the northwest Atlantic. Our description includes the probability of successful recruitment, patchiness of larval distributions, and the variability of these properties in time and space under a variety of larval dispersal strategies.
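    A connectivity matrix of the kind estimated above can be built by normalizing settlement counts by the number of particles released per source region; the regions and counts below are invented for illustration, not simulation output.

```python
import numpy as np

# Entry (i, j): probability that larvae released in region i settle in
# region j, estimated from Lagrangian particle counts (invented here).
regions = ["HudsonCanyon", "GulfOfMaine", "NovaScotia"]
counts = np.array([[120,  30,   5],
                   [ 10, 200,  40],
                   [  0,  25, 150]], dtype=float)

released = counts.sum(axis=1, keepdims=True)   # settlers per source region
connectivity = counts / released               # row-stochastic matrix
print(np.round(connectivity, 3))
```

The precision of each row depends on the number of particles behind it, which is why the release distributions in the study are chosen to hit a target precision with as few simulations as possible.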

  18. Planning for distributed workflows: constraint-based coscheduling of computational jobs and data placement in distributed environments

    NASA Astrophysics Data System (ADS)

    Makatun, Dzmitry; Lauret, Jérôme; Rudová, Hana; Šumbera, Michal

    2015-05-01

    When running data-intensive applications on distributed computational resources, long I/O overheads may be observed as access to remotely stored data is performed. Latencies and bandwidth can become the major limiting factor for the overall computation performance and can reduce the CPU/WallTime ratio through excessive I/O wait. Reusing the knowledge of our previous research, we propose a constraint-programming-based planner that schedules computational jobs and data placements (transfers) in a distributed environment in order to optimize resource utilization and reduce the overall processing completion time. The optimization is achieved by ensuring that none of the resources (network links, data storage and CPUs) is oversaturated at any moment of time and either (a) that the data is pre-placed at the site where the job runs or (b) that the jobs are scheduled where the data is already present. Such an approach eliminates the idle CPU cycles that occur while a job waits for I/O from a remote site and would have wide application in the community. Our planner was evaluated and simulated based on data extracted from log files of the batch and data management systems of the STAR experiment. The results of the evaluation and an estimation of the performance improvements are discussed in this paper.
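    The placement trade-off the planner resolves, namely running a job where its data already sits versus paying a transfer cost to use free CPUs elsewhere, can be illustrated with a toy brute-force search. The sites, jobs and costs below are invented, and a real constraint-programming model would replace the exhaustive enumeration.

```python
from itertools import product

# Toy placement problem: assign each job to a site, paying a transfer
# penalty when its input data is remote, and minimize the makespan.
sites = {"A": 2, "B": 2}                  # CPU slots per site
jobs = {"j1": ("A", 4), "j2": ("A", 4),   # job -> (data site, run time)
        "j3": ("A", 4), "j4": ("B", 4)}
transfer_penalty = 3                      # extra time if data is remote

def makespan(assignment):
    finish = {s: 0.0 for s in sites}
    for job, site in assignment.items():
        data_site, run = jobs[job]
        cost = run + (0 if site == data_site else transfer_penalty)
        finish[site] += cost / sites[site]   # slots share the site's work
    return max(finish.values())

best = min(
    (dict(zip(jobs, combo)) for combo in product(sites, repeat=len(jobs))),
    key=makespan,
)
print(best, makespan(best))
```

Even in this tiny instance, the optimum moves one job away from its data to relieve the congested site, which is exactly the kind of joint job/data decision the planner makes subject to link and storage capacity constraints.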

  19. A virtual data language and system for scientific workflow management in data grid environments

    NASA Astrophysics Data System (ADS)

    Zhao, Yong

    With advances in scientific instrumentation and simulation, scientific data is growing fast in both size and analysis complexity. So-called Data Grids aim to provide high performance, distributed data analysis infrastructure for data-intensive sciences, where scientists distributed worldwide need to extract information from large collections of data, and to share both data products and the resources needed to produce and store them. However, the description, composition, and execution of even logically simple scientific workflows are often complicated by the need to deal with "messy" issues like heterogeneous storage formats and ad-hoc file system structures. We show how these difficulties can be overcome via a typed workflow notation called virtual data language, within which issues of physical representation are cleanly separated from logical typing, and by the implementation of this notation within the context of a powerful virtual data system that supports distributed execution. The resulting language and system are capable of expressing complex workflows in a simple compact form, enacting those workflows in distributed environments, monitoring and recording the execution processes, and tracing the derivation history of data products. We describe the motivation, design, implementation, and evaluation of the virtual data language and system, and the application of the virtual data paradigm in various science disciplines, including astronomy, cognitive neuroscience.

  20. Analysis of Intelligent Transportation Systems Using Model-Driven Simulations.

    PubMed

    Fernández-Isabel, Alberto; Fuentes-Fernández, Rubén

    2015-06-15

    Intelligent Transportation Systems (ITSs) integrate information, sensor, control, and communication technologies to provide transport related services. Their users range from everyday commuters to policy makers and urban planners. Given the complexity of these systems and their environment, their study in real settings is frequently unfeasible. Simulations help to address this problem, but present their own issues: there can be unintended mistakes in the transition from models to code; their platforms frequently bias modeling; and it is difficult to compare works that use different models and tools. In order to overcome these problems, this paper proposes a framework for a model-driven development of these simulations. It is based on a specific modeling language that supports the integrated specification of the multiple facets of an ITS: people, their vehicles, and the external environment; and a network of sensors and actuators conveniently arranged and distributed that operates over them. The framework works with a model editor to generate specifications compliant with that language, and a code generator to produce code from them using platform specifications. There are also guidelines to help researchers in the application of this infrastructure. A case study on advanced management of traffic lights with cameras illustrates its use.

  1. Analysis of Intelligent Transportation Systems Using Model-Driven Simulations

    PubMed Central

    Fernández-Isabel, Alberto; Fuentes-Fernández, Rubén

    2015-01-01

    Intelligent Transportation Systems (ITSs) integrate information, sensor, control, and communication technologies to provide transport related services. Their users range from everyday commuters to policy makers and urban planners. Given the complexity of these systems and their environment, their study in real settings is frequently unfeasible. Simulations help to address this problem, but present their own issues: there can be unintended mistakes in the transition from models to code; their platforms frequently bias modeling; and it is difficult to compare works that use different models and tools. In order to overcome these problems, this paper proposes a framework for a model-driven development of these simulations. It is based on a specific modeling language that supports the integrated specification of the multiple facets of an ITS: people, their vehicles, and the external environment; and a network of sensors and actuators conveniently arranged and distributed that operates over them. The framework works with a model editor to generate specifications compliant with that language, and a code generator to produce code from them using platform specifications. There are also guidelines to help researchers in the application of this infrastructure. A case study on advanced management of traffic lights with cameras illustrates its use. PMID:26083232

  2. Contribution of cosmic ray particles to radiation environment at high mountain altitude: Comparison of Monte Carlo simulations with experimental data.

    PubMed

    Mishev, A L

    2016-03-01

    A numerical model for assessment of the effective dose due to secondary cosmic ray particles of galactic origin at a high mountain altitude of about 3000 m above sea level is presented. The model is based on a newly computed effective dose yield function considering realistic propagation of cosmic rays in the Earth's magnetosphere and atmosphere. The yield function is computed using a full Monte Carlo simulation of the atmospheric cascade induced by primary protons and α-particles and subsequent conversion of secondary particle fluence (neutrons, protons, gammas, electrons, positrons, muons and charged pions) to effective dose. A lookup table of the newly computed effective dose yield function is provided. The model is compared with several measurements. The comparison of model simulations with measured spectral energy distributions of secondary cosmic ray neutrons at high mountain altitude shows good consistency. Results from measurements of the radiation environment at a high mountain station, the Basic Environmental Observatory Moussala (42.11 N, 23.35 E, 2925 m a.s.l.), are also shown, specifically the contribution of secondary cosmic ray neutrons. A good agreement with the model is demonstrated. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Probing Massive Black Hole Populations and Their Environments with LISA

    NASA Astrophysics Data System (ADS)

    Katz, Michael; Larson, Shane

    2018-01-01

    With the adoption of the LISA Mission Proposal by the European Space Agency in response to its call for L3 mission concepts, gravitational wave measurements from space are on the horizon. With data from the Illustris large-scale cosmological simulation, we provide analysis of LISA detection rates accompanied by characterization of the merging Massive Black Holes (MBH) and their host galaxies. MBHs of total mass $\sim 10^6$-$10^9\ M_\odot$ are the main focus of this study. Using a precise treatment of the dynamical friction evolutionary process prior to gravitational wave emission, we evolve MBH simulation particle mergers from $\sim$kpc scales until coalescence to achieve a merger distribution. Using the statistical basis of the Illustris output, we Monte Carlo-synthesize many realizations of the merging massive black hole population across space and time. We use those realizations to build mock LISA detection catalogs to understand the impact of LISA mission configurations on our ability to probe massive black hole merger populations and their environments throughout the visible Universe.

  4. Space environment simulation and sensor calibration facility

    NASA Astrophysics Data System (ADS)

    Engelhart, Daniel P.; Patton, James; Plis, Elena; Cooper, Russell; Hoffmann, Ryan; Ferguson, Dale; Hilmer, Robert V.; McGarity, John; Holeman, Ernest

    2018-02-01

    The Mumbo space environment simulation chamber discussed here comprises a set of tools to calibrate a variety of low flux, low energy electron and ion detectors used in satellite-mounted particle sensors. The chamber features electron and ion beam sources, a Lyman-alpha ultraviolet lamp, a gimbal table sensor mounting system, cryogenic sample mount and chamber shroud, and beam characterization hardware and software. The design of the electron and ion sources presented here offers a number of unique capabilities for space weather sensor calibration. Both sources create particle beams with narrow, well-characterized energetic and angular distributions with beam diameters that are larger than most space sensor apertures. The electron and ion sources can produce consistently low fluxes that are representative of quiescent space conditions. The particle beams are characterized by 2D beam mapping with several co-located pinhole aperture electron multipliers to capture relative variation in beam intensity and a large aperture Faraday cup to measure absolute current density.

  5. Space environment simulation and sensor calibration facility.

    PubMed

    Engelhart, Daniel P; Patton, James; Plis, Elena; Cooper, Russell; Hoffmann, Ryan; Ferguson, Dale; Hilmer, Robert V; McGarity, John; Holeman, Ernest

    2018-02-01

    The Mumbo space environment simulation chamber discussed here comprises a set of tools to calibrate a variety of low flux, low energy electron and ion detectors used in satellite-mounted particle sensors. The chamber features electron and ion beam sources, a Lyman-alpha ultraviolet lamp, a gimbal table sensor mounting system, cryogenic sample mount and chamber shroud, and beam characterization hardware and software. The design of the electron and ion sources presented here offers a number of unique capabilities for space weather sensor calibration. Both sources create particle beams with narrow, well-characterized energetic and angular distributions with beam diameters that are larger than most space sensor apertures. The electron and ion sources can produce consistently low fluxes that are representative of quiescent space conditions. The particle beams are characterized by 2D beam mapping with several co-located pinhole aperture electron multipliers to capture relative variation in beam intensity and a large aperture Faraday cup to measure absolute current density.

  6. Study on energy saving of subway station based on orthogonal experimental method

    NASA Astrophysics Data System (ADS)

    Guo, Lei

    2017-05-01

    With its fast, efficient, high-capacity transport, the subway has become an important means of relieving urban traffic congestion. Because the subway environment varies with external factors such as temperature and passenger load, a three-dimensional numerical simulation study of the air distribution on a subway platform was conducted using CFD software. The influence of different loads (the supply air temperature and velocity of the air conditioning, the personnel load, and the heat flux of the wall) on the platform flow field is also analysed. The orthogonal experiment method is applied to the numerical simulation analysis of human comfort under different parameters. Based on those results, a functional relationship between human comfort and the boundary conditions of the platform is produced by multiple-linear-regression fitting, and the order of the major boundary conditions affecting human comfort is obtained. This study provides a theoretical basis for the final energy-saving strategies.
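    The multiple-linear-regression step can be sketched with ordinary least squares on synthetic data standing in for the CFD results; the predictor ranges and coefficients below are assumptions for illustration, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Regress a comfort index on the four platform boundary conditions the
# study varies. Synthetic data; ranges and coefficients are invented.
n = 40
X = np.column_stack([
    rng.uniform(18, 26, n),    # supply air temperature (deg C)
    rng.uniform(1, 4, n),      # supply air velocity (m/s)
    rng.uniform(50, 300, n),   # personnel load (persons)
    rng.uniform(5, 30, n),     # wall heat flux (W/m^2)
])
true_coef = np.array([-0.08, 0.15, -0.002, -0.01])
y = 2.0 + X @ true_coef + rng.normal(0, 0.01, n)   # comfort index + noise

A = np.column_stack([np.ones(n), X])       # intercept + predictors
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(coef, 3))
```

The magnitudes of the fitted (standardized) coefficients are what give the ranking of boundary conditions by their influence on comfort.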

  7. Introducing Computational Fluid Dynamics Simulation into Olfactory Display

    NASA Astrophysics Data System (ADS)

    Ishida, Hiroshi; Yoshida, Hitoshi; Nakamoto, Takamichi

    An olfactory display is a device that delivers various odors to the user's nose. It can be used to add special effects to movies and games by releasing odors relevant to the scenes shown on the screen. In order to provide high-presence olfactory stimuli to the users, the display must be able to generate realistic odors with appropriate concentrations in a timely manner together with visual and audio playbacks. In this paper, we propose to use computational fluid dynamics (CFD) simulations in conjunction with the olfactory display. Odor molecules released from their source are transported mainly by turbulent flow, and their behavior can be extremely complicated even in a simple indoor environment. In the proposed system, a CFD solver is employed to calculate the airflow field and the odor dispersal in the given environment. An odor blender is used to generate the odor with the concentration determined based on the calculated odor distribution. Experimental results on presenting odor stimuli synchronously with movie clips show the effectiveness of the proposed system.

  8. Can virtual reality be used to conduct mass prophylaxis clinic training? A pilot program.

    PubMed

    Yellowlees, Peter; Cook, James N; Marks, Shayna L; Wolfe, Daniel; Mangin, Elanor

    2008-03-01

    To create and evaluate a pilot bioterrorism defense training environment using virtual reality technology. The present pilot project used Second Life, an internet-based virtual world system, to construct a virtual reality environment to mimic an actual setting that might be used as a Strategic National Stockpile (SNS) distribution site for northern California in the event of a bioterrorist attack. Scripted characters were integrated into the system as mock patients to analyze various clinic workflow scenarios. Users tested the virtual environment over two sessions. Thirteen users who toured the environment were asked to complete an evaluation survey. Respondents reported that the virtual reality system was relevant to their practice and had potential as a method of bioterrorism defense training. Computer simulations of bioterrorism defense training scenarios are feasible with existing personal computer technology. The use of internet-connected virtual environments holds promise for bioterrorism defense training. Recommendations are made for public health agencies regarding the implementation and benefits of using virtual reality for mass prophylaxis clinic training.

  9. Fuzzy Energy Management for a Catenary-Battery-Ultracapacitor based Hybrid Tramway

    NASA Astrophysics Data System (ADS)

    Jibin, Yang; Jiye, Zhang; Pengyun, Song

    2017-05-01

    In this paper, an energy management strategy (EMS) based on fuzzy logic control for a catenary-battery-ultracapacitor powered hybrid modern tramway is presented. Fuzzy logic controllers for the catenary zone and the catenary-less zone were designed by analyzing the structure and working modes of the hybrid system, and an energy management strategy based on double fuzzy logic control was then proposed to enhance fuel economy. The hybrid modern tramway simulation model was developed in the MATLAB/Simulink environment. The simulation results show that the proposed EMS can satisfy the tramway's dynamic performance demands and achieve reasonable power distribution among the power sources.

  10. Creating virtual humans for simulation-based training and planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stansfield, S.; Sobel, A.

    1998-05-12

    Sandia National Laboratories has developed a distributed, high-fidelity simulation system for training and planning small-team operations. The system provides an immersive environment populated by virtual objects and humans capable of displaying complex behaviors. The work has focused on developing the behaviors required to carry out complex tasks and decision making under stress. Central to this work are techniques for creating behaviors for virtual humans and for dynamically assigning behaviors to CGF to allow scenarios without fixed outcomes. Two prototype systems have been developed that illustrate these capabilities: MediSim, a trainer for battlefield medics, and VRaptor, a system for planning, rehearsing and training assault operations.

  11. NASA Operational Simulator for Small Satellites: Tools for Software Based Validation and Verification of Small Satellites

    NASA Technical Reports Server (NTRS)

    Grubb, Matt

    2016-01-01

    The NASA Operational Simulator for Small Satellites (NOS3) is a suite of tools to aid in areas such as software development, integration test (IT), mission operations training, verification and validation (V&V), and software systems check-out. NOS3 provides a software development environment, a multi-target build system, an operator interface/ground station, dynamics and environment simulations, and software-based hardware models. NOS3 enables the development of flight software (FSW) early in the project life cycle, when access to hardware is typically not available. For small satellites there are extensive lead times on many of the commercial-off-the-shelf (COTS) components as well as limited funding for engineering test units (ETU). Considering the difficulty of providing a hardware test-bed to each developer/tester, hardware models are built from characteristic data or manufacturers' data sheets for each individual component. The fidelity of each hardware model is such that FSW executes unaware that physical hardware is not present. This allows binaries to be compiled for both the simulation environment and the flight computer without changing the FSW source code. For hardware models that provide data dependent on the environment, such as a GPS receiver or magnetometer, an open-source tool from NASA GSFC (42 Spacecraft Simulation) is used to provide the necessary data. The underlying infrastructure used to transfer messages between FSW and the hardware models can also be used to monitor, intercept, and inject messages, which has proven to be beneficial for V&V of larger missions such as the James Webb Space Telescope (JWST). As hardware is procured, drivers can be added to the environment to enable hardware-in-the-loop (HWIL) testing. When strict time synchronization is not vital, any number of combinations of hardware components and software-based models can be tested. The open-source operator interface used in NOS3 is COSMOS from Ball Aerospace.
For testing, plug-ins are implemented in COSMOS to control the NOS3 simulations, while the command and telemetry tools available in COSMOS are used to communicate with FSW. NOS3 is actively being used for FSW development and component testing of the Simulation-to-Flight 1 (STF-1) CubeSat. As NOS3 has matured, hardware models have been added for common CubeSat components such as Novatel GPS receivers, ClydeSpace electrical power systems and batteries, ISISpace antenna systems, etc. In the future, NASA IV&V plans to distribute NOS3 to other CubeSat developers and release the suite to the open-source community.

  12. Simulations and measurements of hot-electron generation driven by the multibeam two-plasmon-decay instability

    NASA Astrophysics Data System (ADS)

    Follett, R. K.; Myatt, J. F.; Shaw, J. G.; Michel, D. T.; Solodov, A. A.; Edgell, D. H.; Yaakobi, B.; Froula, D. H.

    2017-10-01

    Multibeam experiments relevant to direct-drive inertial confinement fusion show the importance of nonlinear saturation mechanisms in the common-wave two-plasmon-decay (TPD) instability. Planar-target experiments on the OMEGA laser used hard x-ray measurements to study the influence of the linear common-wave growth rate on TPD-driven hot-electron production in two drive-beam configurations and over a range of overlapped laser intensities from 3.6 to 15.2 × 1014 W/cm2. The beam configuration with the larger linear common-wave growth rate had a lower intensity threshold for the onset of hot-electron production, but the linear growth rate had no significant impact on hot-electron production at high intensities. The experiments were modeled in 3-D using the hybrid code LPSE (laser-plasma simulation environment), which combines a wave solver with a particle tracker to self-consistently calculate the electron velocity distribution and evolve electron Landau damping. Good quantitative agreement was obtained between the simulated and measured hot-electron distributions using a novel technique to account for macroscopic spatial and temporal variations that were present in the experiments.

  13. Research on Flow Field Perception Based on Artificial Lateral Line Sensor System.

    PubMed

    Liu, Guijie; Wang, Mengmeng; Wang, Anyi; Wang, Shirui; Yang, Tingting; Malekian, Reza; Li, Zhixiong

    2018-03-11

    In nature, the lateral line of fish is a peculiar and important organ for sensing the surrounding hydrodynamic environment, preying, escaping from predators, and schooling. In this paper, by imitating the mechanism of the fish lateral-canal neuromasts, we developed an artificial lateral line system composed of micro-pressure sensors. Through hydrodynamic simulations, an optimized sensor structure was obtained and pressure distribution models of the lateral surface were established for uniform and turbulent flow. A corresponding underwater experiment verified the validity of the numerical simulation method by comparing the experimental data with the simulation results. In addition, several effective methods are proposed and validated for flow velocity estimation and attitude perception in turbulent flow, and the shape recognition of obstacles is realized with a neural network algorithm.
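
    The link between surface pressure and flow velocity that such a sensor array exploits can be illustrated with the textbook stagnation-pressure relation Δp = ½ρv². This is only the uniform-flow Bernoulli relation, not the paper's fitted turbulent-flow estimator; the function name and the 50 Pa example are invented for illustration.

```python
import math

RHO_WATER = 998.0  # kg/m^3, fresh water near 20 degrees C

def velocity_from_stagnation(delta_p):
    """Estimate flow speed from the stagnation-pressure rise at a head-on
    pressure sensor via Bernoulli: delta_p = 0.5 * rho * v**2.
    Valid for steady uniform flow; turbulent flow needs a fitted model."""
    return math.sqrt(2.0 * delta_p / RHO_WATER)

# A 50 Pa stagnation-pressure rise corresponds to roughly 0.32 m/s.
print(round(velocity_from_stagnation(50.0), 3))
```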

  14. Communication constraints, indexical countermeasures, and crew configuration effects in simulated space-dwelling groups

    NASA Astrophysics Data System (ADS)

    Hienz, Robert D.; Brady, Joseph V.; Hursh, Steven R.; Banner, Michele J.; Gasior, Eric D.; Spence, Kevin R.

    2007-02-01

    Previous research with groups of individually isolated crews communicating and problem-solving in a distributed interactive simulation environment has shown that the functional interchangeability of available communication channels can serve as an effective countermeasure to communication constraints. The present report extends these findings by investigating crew performance effects and psychosocial adaptation following: (1) the loss of all communication channels, and (2) changes in crew configuration. Three-person crews participated in a simulated planetary exploration mission that required identification, collection, and analysis of geologic samples. Results showed that crews developed and employed discrete navigation system operations that served as functionally effective communication signals (i.e., “indexical” or “deictic” cues) in generating appropriate crewmember responses and maintaining performance effectiveness in the absence of normal communication channels. Additionally, changes in crew configuration impacted both performance effectiveness and psychosocial adaptation.

  15. Radiation environment study of near space in China area

    NASA Astrophysics Data System (ADS)

    Fan, Dongdong; Chen, Xingfeng; Li, Zhengqiang; Mei, Xiaodong

    2015-10-01

    Aerospace activity has become a research hotspot for the world's major aviation countries. Solar radiation studies are a prerequisite for aerospace activity, but the lack of observations in the near-space layer remains a barrier. Based on reanalysis data, key input parameters are determined and simulation experiments are carried out separately to model the downward solar radiation and ultraviolet radiation transfer processes in near space over China. Results show that the atmospheric influence on these radiation transfer processes has regional characteristics. Because key factors such as ozone are affected by atmospheric processes in both their density and their horizontal and vertical distribution, stratospheric meteorological data must be considered, and near space over China is divided according to its activity features. The simulations show that solar and ultraviolet radiation vary with time, latitude, and ozone density, with complicated variation characteristics.

  16. Head and neck response of a finite element anthropomorphic test device and human body model during a simulated rotary-wing aircraft impact.

    PubMed

    White, Nicholas A; Danelson, Kerry A; Gayzik, F Scott; Stitzel, Joel D

    2014-11-01

    A finite element (FE) simulation environment has been developed to investigate aviator head and neck response during a simulated rotary-wing aircraft impact using both an FE anthropomorphic test device (ATD) and an FE human body model. The head and neck response of the ATD simulation was successfully validated against an experimental sled test. The majority of the head and neck transducer time histories received a CORrelation and analysis (CORA) rating of 0.7 or higher, indicating good overall correlation. The human body model simulation produced a more biofidelic head and neck response than the ATD experimental test and simulation, including change in neck curvature. While only the upper and lower neck loading can be measured in the ATD, the shear force, axial force, and bending moment were reported for each level of the cervical spine in the human body model using a novel technique involving cross sections. This loading distribution provides further insight into the biomechanical response of the neck during a rotary-wing aircraft impact.

  17. Numerical Propulsion System Simulation: A Common Tool for Aerospace Propulsion Being Developed

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.; Naiman, Cynthia G.

    2001-01-01

    The NASA Glenn Research Center is developing an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). This simulation is initially being used to support aeropropulsion in the analysis and design of aircraft engines. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the Aviation Safety Program and Advanced Space Transportation. NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes using the Common Object Request Broker Architecture (CORBA) in the NPSS Developer's Kit to facilitate collaborative engineering. The NPSS Developer's Kit will provide the tools to develop custom components and to use the CORBA capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities will extend NPSS from a zero-dimensional simulation tool to a multifidelity, multidiscipline system-level simulation tool for the full life cycle of an engine.
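
    The "zooming" idea described above, coupling analyses at different levels of detail behind one component interface, can be sketched as follows. The component names and numeric maps are invented for illustration and do not reflect actual NPSS components or values.

```python
# Illustrative sketch of NPSS-style "zooming": a system-level cycle analysis
# delegates one component to a higher-fidelity model through a shared
# interface.  All names and numbers here are hypothetical.
class Compressor:
    """Zero-dimensional component: pressure ratio from a crude design map."""
    def pressure_ratio(self, speed_frac):
        return 1.0 + 9.0 * speed_frac  # linear map; PR = 10 at design speed

class CompressorHiFi(Compressor):
    """Stand-in for a detailed component code reached by zooming."""
    def pressure_ratio(self, speed_frac):
        # A real high-fidelity code would solve the blade-row aerodynamics;
        # mild nonlinearity here just shows the interface is unchanged.
        return 1.0 + 9.0 * speed_frac**1.1

def cycle_analysis(compressor, speed_frac=1.0):
    # The system-level analysis is agnostic to the component's fidelity.
    return {"pressure_ratio": compressor.pressure_ratio(speed_frac)}

print(cycle_analysis(Compressor()))      # 0-D result
print(cycle_analysis(CompressorHiFi()))  # "zoomed" result, same interface
```

    The design point is that the cycle code never changes when a component is zoomed; in NPSS the equivalent decoupling is provided through CORBA so the higher-fidelity code can even run on a different platform.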

  18. Development of an informatics infrastructure for data exchange of biomolecular simulations: architecture, data models and ontology

    PubMed Central

    Thibault, J. C.; Roe, D. R.; Eilbeck, K.; Cheatham, T. E.; Facelli, J. C.

    2015-01-01

    Biomolecular simulations aim to simulate the structure, dynamics, interactions, and energetics of complex biomolecular systems. With recent advances in hardware, it is now possible to use more complex and accurate models and also to reach time scales that are biologically significant. Molecular simulations have become a standard tool for toxicology and pharmacology research, but organizing and sharing data, both within the same organization and among different ones, remains a substantial challenge. In this paper we review our recent work leading to the development of a comprehensive informatics infrastructure to facilitate the organization and exchange of biomolecular simulation data. Our efforts include the design of data models and dictionary tools that standardize the metadata used to describe biomolecular simulations, the development of a thesaurus and ontology for computational reasoning when searching for biomolecular simulations in distributed environments, and the development of systems based on these models to manage and share the data at large scale (iBIOMES) and within smaller groups of researchers at the laboratory scale (iBIOMES Lite), both of which take advantage of the standardized metadata. PMID:26387907
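
    The dictionary-based metadata standardization described above can be sketched as a controlled vocabulary of required fields validated against each simulation record. The field names below are illustrative only and are not the actual iBIOMES data model.

```python
# Hedged sketch of metadata standardization for simulation records:
# a controlled dictionary of required fields, checked on every record.
# Field names are hypothetical, not the real iBIOMES schema.
REQUIRED_FIELDS = {"software", "force_field", "timestep_fs", "n_atoms"}

def validate(record):
    """Raise if a record is missing any field the dictionary requires."""
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"missing metadata fields: {sorted(missing)}")
    return True

run = {"software": "AMBER", "force_field": "ff14SB",
       "timestep_fs": 2.0, "n_atoms": 25000}
print(validate(run))
```

    Standardizing the fields up front is what makes records searchable across distributed repositories: a query engine can rely on every record exposing the same keys.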

  20. Environmental dependence of the galaxy stellar mass function in the Dark Energy Survey Science Verification Data

    DOE PAGES

    Etherington, J.; Thomas, D.; Maraston, C.; ...

    2016-01-04

    Measurements of the galaxy stellar mass function are crucial to understanding the formation of galaxies in the Universe. In a hierarchical clustering paradigm it is plausible that there is a connection between the properties of galaxies and their environments. Evidence for environmental trends has been established in the local Universe. The Dark Energy Survey (DES) provides large photometric datasets that enable further investigation of the assembly of mass. In this study we use ~3.2 million galaxies from the South Pole Telescope (SPT)-East field in the DES science verification (SV) dataset. From grizY photometry we derive galaxy stellar masses and absolute magnitudes, and determine the errors on these properties using Monte Carlo simulations over the full photometric-redshift probability distributions. We compute galaxy environments using a fixed conical aperture for a range of scales. We construct galaxy-environment probability distribution functions and investigate the dependence of the environment errors on the aperture parameters. We compute the environment components of the galaxy stellar mass function for the redshift range 0.15 < z < 1.05. For z < 0.75 we find that the fraction of massive galaxies is larger in high-density environments than in low-density environments. We show that the low-density and high-density components converge with increasing redshift up to z ~ 1.0, where the shapes of the mass function components are indistinguishable. Our study thus shows how high-density structures build up around massive galaxies through cosmic time.
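
    A fixed-aperture environment measure of the kind used above can be sketched as a neighbour count around each galaxy. The paper uses a conical aperture with full photo-z probability distributions; this simplified sketch approximates it with a cylinder (a projected radius plus a redshift slice) and point redshift estimates, and the radii and catalog values are invented for illustration.

```python
import math

def local_density(target, catalog, r_max_deg=0.05, dz=0.02):
    """Count neighbours within a projected angular radius and a
    line-of-sight redshift slice, returned as a surface density.
    A cylindrical simplification of the paper's conical aperture."""
    ra0, dec0, z0 = target
    count = 0
    for ra, dec, z in catalog:
        if (ra, dec, z) == target:
            continue  # do not count the galaxy itself
        # small-angle projected separation in degrees
        dra = (ra - ra0) * math.cos(math.radians(dec0))
        sep = math.hypot(dra, dec - dec0)
        if sep < r_max_deg and abs(z - z0) < dz:
            count += 1
    return count / (math.pi * r_max_deg**2)  # galaxies per square degree

# Toy catalog: (RA deg, Dec deg, z); the last galaxy is far away on the sky.
gals = [(150.00, 2.00, 0.50), (150.01, 2.01, 0.505),
        (150.02, 2.00, 0.51), (151.00, 2.00, 0.50)]
print(local_density(gals[0], gals))
```

    Varying `r_max_deg` and `dz` corresponds to the study's range of aperture scales, and the sensitivity of the count to those parameters is exactly the environment-error dependence the paper investigates.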
