Sample records for simulation tool fast

  1. FAST Simulation Tool Containing Methods for Predicting the Dynamic Response of Wind Turbines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jonkman, Jason

    2015-08-12

FAST is a simulation tool (computer software) for modeling the dynamic response of horizontal-axis wind turbines. FAST employs a combined modal and multibody structural-dynamics formulation in the time domain.

  2. Simulator evaluation of the final approach spacing tool

    NASA Technical Reports Server (NTRS)

    Davis, Thomas J.; Erzberger, Heinz; Green, Steven M.

    1990-01-01

The design and simulator evaluation of an automation tool for assisting terminal radar approach controllers in sequencing and spacing traffic onto the final approach course is described. The automation tool, referred to as the Final Approach Spacing Tool (FAST), displays speed and heading advisories for arrivals as well as sequencing information on the controller's radar display. The main functional elements of FAST are a scheduler that schedules and sequences the traffic, a 4-D trajectory synthesizer that generates the advisories, and a graphical interface that displays the information to the controller. FAST was implemented on a high-performance workstation. It can be operated as a stand-alone in the Terminal Radar Approach Control (TRACON) Facility or as an element of a system integrated with automation tools in the Air Route Traffic Control Center (ARTCC). FAST was evaluated by experienced TRACON controllers in a real-time air traffic control simulation. Simulation results show that FAST significantly reduced controller workload and demonstrated a potential for an increase in landing rate.

  3. Design and evaluation of an air traffic control Final Approach Spacing Tool

    NASA Technical Reports Server (NTRS)

    Davis, Thomas J.; Erzberger, Heinz; Green, Steven M.; Nedell, William

    1991-01-01

This paper describes the design and simulator evaluation of an automation tool for assisting terminal radar approach controllers in sequencing and spacing traffic onto the final approach course. The automation tool, referred to as the Final Approach Spacing Tool (FAST), displays speed and heading advisories for arriving aircraft as well as sequencing information on the controller's radar display. The main functional elements of FAST are a scheduler that schedules and sequences the traffic, a four-dimensional trajectory synthesizer that generates the advisories, and a graphical interface that displays the information to the controller. FAST has been implemented on a high-performance workstation. It can be operated as a stand-alone in the terminal radar approach control facility or as an element of a system integrated with automation tools in the air route traffic control center. FAST was evaluated by experienced air traffic controllers in a real-time air traffic control simulation. Simulation results summarized in the paper show that the automation tools significantly reduced controller workload and demonstrated a potential for an increase in landing rate.

  4. SCOUT: A Fast Monte-Carlo Modeling Tool of Scintillation Camera Output

    PubMed Central

    Hunter, William C. J.; Barrett, Harrison H.; Lewellen, Thomas K.; Miyaoka, Robert S.; Muzi, John P.; Li, Xiaoli; McDougald, Wendy; MacDonald, Lawrence R.

    2011-01-01

    We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:22072297

  5. SCOUT: a fast Monte-Carlo modeling tool of scintillation camera output†

    PubMed Central

    Hunter, William C J; Barrett, Harrison H.; Muzi, John P.; McDougald, Wendy; MacDonald, Lawrence R.; Miyaoka, Robert S.; Lewellen, Thomas K.

    2013-01-01

    We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout of a scintillation camera. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:23640136

  6. Massively Parallel Processing for Fast and Accurate Stamping Simulations

    NASA Astrophysics Data System (ADS)

    Gress, Jeffrey J.; Xu, Siguang; Joshi, Ramesh; Wang, Chuan-tao; Paul, Sabu

    2005-08-01

The competitive automotive market drives automotive manufacturers to speed up vehicle development cycles and reduce lead-time. Fast tooling development is one of the key areas supporting fast and short vehicle development programs (VDP). In the past ten years, stamping simulation has become the most effective validation tool for predicting and resolving all potential formability and quality problems before the dies are physically made. Stamping simulation and formability analysis has become a critical business segment in the GM math-based die engineering process. As simulation becomes one of the major production tools in the engineering factory, simulation speed and accuracy are two of the most important measures of stamping simulation technology. The speed and time-in-system of forming analysis become even more critical to support fast VDP and tooling readiness. Since 1997, General Motors Die Center has been working jointly with our software vendor to develop and implement a parallel version of simulation software for mass production analysis applications. By 2001, this technology had matured in the form of distributed memory processing (DMP) of draw die simulations in a networked distributed-memory computing environment. In 2004, this technology was refined to massively parallel processing (MPP) and extended to line die forming analysis (draw, trim, flange, and associated spring-back) running on a dedicated computing environment. The evolution of this technology and the insight gained through the implementation of DMP/MPP technology, as well as performance benchmarks, are discussed in this publication.

  7. Calibration and validation of a spar-type floating offshore wind turbine model using the FAST dynamic simulation tool

    DOE PAGES

    Browning, J. R.; Jonkman, J.; Robertson, A.; ...

    2014-12-16

High-quality computer simulations are required when designing floating wind turbines because of the complex dynamic responses that are inherent in systems with a high number of degrees of freedom and variable metocean conditions. In 2007, the FAST wind turbine simulation tool, developed and maintained by the U.S. Department of Energy's (DOE's) National Renewable Energy Laboratory (NREL), was expanded to include capabilities that are suitable for modeling floating offshore wind turbines. In an effort to validate FAST and other offshore wind energy modeling tools, DOE funded the DeepCwind project that tested three prototype floating wind turbines at 1/50th scale in a wave basin, including a semisubmersible, a tension-leg platform, and a spar buoy. This paper describes the use of the results of the spar wave basin tests to calibrate and validate the FAST offshore floating simulation tool, and presents some initial results of simulated dynamic responses of the spar to several combinations of wind and sea states. Wave basin tests with the spar attached to a scale model of the NREL 5-megawatt reference wind turbine were performed at the Maritime Research Institute Netherlands under the DeepCwind project. This project included free-decay tests, tests with steady or turbulent wind and still water, tests with periodic and irregular waves and no wind, and combined wind/wave tests. The resulting data from the 1/50th model were scaled using Froude scaling to full size and used to calibrate and validate a full-size simulated model in FAST. Results of the model calibration and validation include successes, subtleties, and limitations of both wave basin testing and FAST modeling capabilities.
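
    The record above mentions scaling the 1/50th-scale wave-basin data to full size with Froude scaling. As a minimal illustrative sketch (the scale factor matches the paper's 1/50th model, but the helper function and example quantities are invented here, not taken from the paper), the standard Froude similitude exponents can be applied as follows:

    ```python
    # Minimal sketch of Froude scaling from model scale to full scale.
    # Scale factor matches the 1/50th DeepCwind model; values are illustrative.

    LAMBDA = 50.0  # geometric scale factor (model 1:50)

    def froude_scale(quantity, kind, lam=LAMBDA):
        """Scale a model-scale measurement to full scale under Froude similitude."""
        exponents = {
            "length": 1.0,    # L_full = L_model * lam
            "time": 0.5,      # t_full = t_model * sqrt(lam)
            "velocity": 0.5,  # v_full = v_model * sqrt(lam)
            "force": 3.0,     # F_full = F_model * lam^3 (same fluid density)
            "mass": 3.0,
            "power": 3.5,
        }
        return quantity * lam ** exponents[kind]

    # Example: a 2.0 s model-scale wave period corresponds to ~14.1 s full scale,
    # and a unit model-scale force corresponds to a 125,000x larger full-scale force.
    print(froude_scale(2.0, "time"))   # ~14.14
    print(froude_scale(1.0, "force"))  # 125000.0
    ```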

  8. Grid Integration Research | Wind | NREL

    Science.gov Websites

Computer-generated simulation of a wind turbine. Wind Power Plant Modeling and Simulation: Engineers at the National Renewable Energy Laboratory maintain the computer-aided engineering tool, FAST, as well as their wind power plant simulation tool, Wind-Plant ...

  9. Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)

    DTIC Science & Technology

    2005-02-01

method, Model Order Reduction (MOR) tools, system-level, mixed-signal circuit synthesis and optimization tools, and parasitic extraction tools. A unique...Mission Area: Command and Control mixed signal circuit simulation parasitic extraction time-domain simulation IC design flow model order reduction... Extraction 1.2 Overall Program Milestones CHAPTER 2 FAST TIME DOMAIN MIXED-SIGNAL CIRCUIT SIMULATION 2.1 HAARSPICE Algorithms 2.1.1 Mathematical Background

  10. Simulation of the space station information system in Ada

    NASA Technical Reports Server (NTRS)

    Spiegel, James R.

    1986-01-01

The Flexible Ada Simulation Tool (FAST) is a discrete event simulation language written in Ada. FAST has been used to simulate a number of options for ground data distribution of Space Station payload data. The fact that the Ada language is used for implementation has allowed a number of useful interactive features to be built into FAST and has facilitated quick enhancement of its capabilities to support new modeling requirements. General simulation concepts are discussed, along with how these concepts are implemented in FAST. The FAST design is discussed, and it is pointed out how the use of the Ada language enabled the development of some significant advantages over classical FORTRAN-based simulation languages. The advantages discussed are in the areas of efficiency, ease of debugging, and ease of integrating user code. The specific Ada language features which enable these advantages are discussed.

  11. FastBit: Interactively Searching Massive Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Kesheng; Ahern, Sean; Bethel, E. Wes

    2009-06-23

As scientific instruments and computer simulations produce more and more data, the task of locating the essential information to gain insight becomes increasingly difficult. FastBit is an efficient software tool to address this challenge. In this article, we present a summary of the key underlying technologies, namely bitmap compression, encoding, and binning. Together these techniques enable FastBit to answer structured (SQL) queries orders of magnitude faster than popular database systems. To illustrate how FastBit is used in applications, we present three examples involving a high-energy physics experiment, a combustion simulation, and an accelerator simulation. In each case, FastBit significantly reduces the response time and enables interactive exploration on terabytes of data.
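
    FastBit's core idea, per the abstract, is answering queries with bitmap indexes built by binning and encoding. Below is a toy sketch of the uncompressed bitmap-index idea, using Python ints as bitsets; FastBit's WAH compression and encoding schemes are not reproduced, and all names here are hypothetical:

    ```python
    # Toy bitmap index: one bitset per value bin, range queries via bitwise OR.

    def build_bitmap_index(values, bins):
        """One bitmap per bin; bit i is set if values[i] falls in that bin."""
        bitmaps = [0] * (len(bins) - 1)
        for i, v in enumerate(values):
            for b in range(len(bins) - 1):
                if bins[b] <= v < bins[b + 1]:
                    bitmaps[b] |= 1 << i
                    break
        return bitmaps

    def query(bitmaps, bins, lo, hi):
        """Rows with lo <= value < hi (assumes lo/hi align with bin edges):
        OR together the bitmaps of the covered bins."""
        result = 0
        for b in range(len(bins) - 1):
            if bins[b] >= lo and bins[b + 1] <= hi:
                result |= bitmaps[b]
        return [i for i in range(result.bit_length()) if result >> i & 1]

    values = [0.1, 5.2, 3.3, 9.7, 4.8]
    bins = [0, 2, 4, 6, 8, 10]
    idx = build_bitmap_index(values, bins)
    print(query(idx, bins, 4, 10))  # rows with values in [4, 10): [1, 3, 4]
    ```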

  12. A Fast-Time Simulation Tool for Analysis of Airport Arrival Traffic

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz; Meyn, Larry A.; Neuman, Frank

    2004-01-01

    The basic objective of arrival sequencing in air traffic control automation is to match traffic demand and airport capacity while minimizing delays. The performance of an automated arrival scheduling system, such as the Traffic Management Advisor developed by NASA for the FAA, can be studied by a fast-time simulation that does not involve running expensive and time-consuming real-time simulations. The fast-time simulation models runway configurations, the characteristics of arrival traffic, deviations from predicted arrival times, as well as the arrival sequencing and scheduling algorithm. This report reviews the development of the fast-time simulation method used originally by NASA in the design of the sequencing and scheduling algorithm for the Traffic Management Advisor. The utility of this method of simulation is demonstrated by examining the effect on delays of altering arrival schedules at a hub airport.
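
    The abstract describes modeling arrival sequencing and scheduling against runway capacity. A minimal sketch of the kind of first-come-first-served scheduling core such a fast-time simulation exercises (the separation value and traffic data are illustrative, and this is not the Traffic Management Advisor algorithm itself):

    ```python
    # Toy arrival scheduler: first-come-first-served order with a required
    # time separation at the runway threshold; delay = STA - ETA.

    MIN_SEP = 90.0  # seconds between successive arrivals; illustrative value

    def schedule(etas, min_sep=MIN_SEP):
        """Assign scheduled times of arrival (STAs) from estimated ones (ETAs)."""
        stas = []
        last = float("-inf")
        for eta in sorted(etas):
            sta = max(eta, last + min_sep)  # wait for the runway to be free
            stas.append(sta)
            last = sta
        return stas

    etas = [0.0, 30.0, 200.0, 210.0]
    stas = schedule(etas)
    delays = [sta - eta for sta, eta in zip(stas, sorted(etas))]
    print(stas)    # [0.0, 90.0, 200.0, 290.0]
    print(delays)  # [0.0, 60.0, 0.0, 80.0] -- delay absorbed by later arrivals
    ```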

  13. Optimization-Based Calibration of FAST.Farm Parameters Against SOWFA: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreira, Paula D; Annoni, Jennifer; Jonkman, Jason

    2018-01-04

FAST.Farm is a medium-fidelity wind farm modeling tool that can be used to assess power and loads contributions of wind turbines in a wind farm. The objective of this paper is to undertake a calibration procedure to set the user parameters of FAST.Farm to accurately represent results from large-eddy simulations. The results provide an in-depth analysis of the comparison of FAST.Farm and large-eddy simulations before and after calibration. The comparison of FAST.Farm and large-eddy simulation results is presented with respect to streamwise and radial velocity components as well as wake-meandering statistics (mean and standard deviation) in the lateral and vertical directions under different atmospheric and turbine operating conditions.

  14. Optimization-Based Calibration of FAST.Farm Parameters Against SOWFA: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doubrawa Moreira, Paula; Annoni, Jennifer; Jonkman, Jason

FAST.Farm is a medium-fidelity wind farm modeling tool that can be used to assess power and loads contributions of wind turbines in a wind farm. The objective of this paper is to undertake a calibration procedure to set the user parameters of FAST.Farm to accurately represent results from large-eddy simulations. The results provide an in-depth analysis of the comparison of FAST.Farm and large-eddy simulations before and after calibration. The comparison of FAST.Farm and large-eddy simulation results is presented with respect to streamwise and radial velocity components as well as wake-meandering statistics (mean and standard deviation) in the lateral and vertical directions under different atmospheric and turbine operating conditions.

  15. The Processing of Airspace Concept Evaluations Using FASTE-CNS as a Pre- or Post-Simulation CNS Analysis Tool

    NASA Technical Reports Server (NTRS)

    Mainger, Steve

    2004-01-01

As NASA speculates on and explores the future of aviation, the technological and physical aspects of our environment increasingly become hurdles that must be overcome for success. Research into methods for overcoming some of these selected hurdles has been proposed by several NASA research partners as concepts. The task of establishing a common evaluation environment was placed on NASA's Virtual Airspace Simulation Technologies (VAST) project (a sub-project of VAMS), which responded with the development of the Airspace Concept Evaluation System (ACES). As one examines the ACES environment from a communication, navigation, or surveillance (CNS) perspective, the simulation parameters are built with assumed perfection in the transactions associated with CNS. To truly evaluate these concepts in a realistic sense, the contributions and effects of CNS must be part of ACES. NASA Glenn Research Center (GRC) has supported the Virtual Airspace Modeling and Simulation (VAMS) project through the continued development of CNS models and analysis capabilities that support the ACES environment. NASA GRC initiated the development of a communications traffic loading analysis tool, called the Future Aeronautical Sub-network Traffic Emulator for Communications, Navigation and Surveillance (FASTE-CNS), as part of this support. This tool allows forecasting of communications load with the understanding that there is no single, common source for loading models used to evaluate the existing and planned communications channels, and that consensus and accuracy in the traffic load models are very important inputs to the decisions being made on the acceptability of communication techniques used to fulfill the aeronautical requirements. Leveraging the existing capabilities of the FASTE-CNS tool, GRC has called for FASTE-CNS to have the functionality to pre- and post-process the simulation runs of ACES to report on instances when traffic density, frequency congestion, or aircraft spacing/distance violations have occurred. The integration of these functions requires that the CNS models used to characterize these avionics systems be of higher fidelity and better consistency than is present in the FASTE-CNS system. This presentation will explore the capabilities of FASTE-CNS, with renewed emphasis on the enhancements being added to perform these processing functions; the fidelity and reliability of CNS models necessary to make the enhancements work; and the benchmarking of FASTE-CNS results to improve confidence in the results of the new processing capabilities.

  16. FAST Modularization Framework for Wind Turbine Simulation: Full-System Linearization

    DOE PAGES

    Jonkman, Jason M.; Jonkman, Bonnie J.

    2016-10-03

The wind engineering community relies on multiphysics engineering software to run nonlinear time-domain simulations, e.g., for design-standards-based loads analysis. Although most physics involved in wind energy are nonlinear, linearization of the underlying nonlinear system equations is often advantageous to understand the system response and exploit well-established methods and tools for analyzing linear systems. This paper presents the development and verification of the new linearization functionality of the open-source engineering tool FAST v8 for land-based wind turbines, as well as the concepts and mathematical background needed to understand and apply it correctly.
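
    As a conceptual illustration of what full-system linearization produces: FAST's implementation is analytic and modular, but the same object, the state and input matrices of a nonlinear system x' = f(x, u) at an operating point, can be sketched by central differences. Every name below is hypothetical and the pendulum example is invented:

    ```python
    # Numerical linearization sketch: A = df/dx, B = df/du at (x0, u0).
    import numpy as np

    def linearize(f, x0, u0, eps=1e-6):
        """Return A = df/dx and B = df/du at the operating point (x0, u0)."""
        n, m = len(x0), len(u0)
        A = np.zeros((n, n))
        B = np.zeros((n, m))
        for j in range(n):
            dx = np.zeros(n); dx[j] = eps
            A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
        for j in range(m):
            du = np.zeros(m); du[j] = eps
            B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
        return A, B

    # Toy nonlinear system: a damped pendulum with a torque input.
    def f(x, u):
        theta, omega = x
        return np.array([omega, -9.81 * np.sin(theta) - 0.1 * omega + u[0]])

    A, B = linearize(f, np.array([0.0, 0.0]), np.array([0.0]))
    print(A)  # approx [[0, 1], [-9.81, -0.1]]
    ```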

  17. FAST modularization framework for wind turbine simulation: full-system linearization

    NASA Astrophysics Data System (ADS)

    Jonkman, J. M.; Jonkman, B. J.

    2016-09-01

The wind engineering community relies on multiphysics engineering software to run nonlinear time-domain simulations, e.g., for design-standards-based loads analysis. Although most physics involved in wind energy are nonlinear, linearization of the underlying nonlinear system equations is often advantageous to understand the system response and exploit well-established methods and tools for analyzing linear systems. This paper presents the development and verification of the new linearization functionality of the open-source engineering tool FAST v8 for land-based wind turbines, as well as the concepts and mathematical background needed to understand and apply it correctly.

  18. Fast simulation tool for ultraviolet radiation at the earth's surface

    NASA Astrophysics Data System (ADS)

    Engelsen, Ola; Kylling, Arve

    2005-04-01

    FastRT is a fast, yet accurate, UV simulation tool that computes downward surface UV doses, UV indices, and irradiances in the spectral range 290 to 400 nm with a resolution as small as 0.05 nm. It computes a full UV spectrum within a few milliseconds on a standard PC, and enables the user to convolve the spectrum with user-defined and built-in spectral response functions including the International Commission on Illumination (CIE) erythemal response function used for UV index calculations. The program accounts for the main radiative input parameters, i.e., instrumental characteristics, solar zenith angle, ozone column, aerosol loading, clouds, surface albedo, and surface altitude. FastRT is based on look-up tables of carefully selected entries of atmospheric transmittances and spherical albedos, and exploits the smoothness of these quantities with respect to atmospheric, surface, geometrical, and spectral parameters. An interactive site, http://nadir.nilu.no/~olaeng/fastrt/fastrt.html, enables the public to run the FastRT program with most input options. This page also contains updated information about FastRT and links to freely downloadable source codes and binaries.
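
    The abstract notes that FastRT can convolve a computed spectrum with the CIE erythemal response to form UV indices. Below is a minimal sketch of that post-processing step, assuming the standard McKinlay-Diffey piecewise form of the CIE action spectrum and a made-up placeholder spectrum rather than actual FastRT output:

    ```python
    # Erythemal weighting of a surface UV spectrum and UV index
    # (UV index = erythemally weighted irradiance in W/m^2 times 40).
    import numpy as np

    def cie_erythemal(wl_nm):
        """CIE erythemal action spectrum (McKinlay-Diffey parameterization)."""
        if wl_nm <= 298.0:
            return 1.0
        if wl_nm <= 328.0:
            return 10.0 ** (0.094 * (298.0 - wl_nm))
        return 10.0 ** (0.015 * (139.0 - wl_nm))

    wl = np.arange(290.0, 400.0, 0.05)              # nm, FastRT's finest grid
    irr = 1e-3 * np.exp(-((wl - 340.0) / 40.0)**2)  # placeholder spectrum, W/m^2/nm

    weights = np.array([cie_erythemal(w) for w in wl])
    erythemal_irr = np.trapz(irr * weights, wl)     # W/m^2
    print("UV index:", erythemal_irr * 40.0)
    ```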

  19. Terminal - Tactical Separation Assured Flight Environment (T-TSafe)

    NASA Technical Reports Server (NTRS)

    Verma, Savita Arora; Tang, Huabin; Ballinger, Debbi

    2011-01-01

The Tactical Separation Assured Flight Environment (TSAFE) has previously been tested as a conflict detection and resolution tool in the en-route phase of flight. Fast-time simulations of a terminal version of this tool, called Terminal TSAFE (T-TSAFE), have shown promise over the current conflict detection tools. It has been shown to have fewer false alerts (as low as 2 per hour) and better prediction of time to conflict than Conflict Alert. The tool will be tested in the simulated terminal area of Los Angeles International Airport, in a human-in-the-loop experiment to identify controller procedures and information requirements. The simulation will include comparisons of T-TSAFE with NASA's version of Conflict Alert. Also, some other variables, such as altitude entry by the controller, which improve T-TSAFE's predictions for conflict detection, will be tested. T-TSAFE integrates features of current conflict detection tools such as the Automated Terminal Proximity Alert, used to alleviate compression errors in the final approach phase. Based on fast-time simulation analysis, the anticipated benefits of T-TSAFE over Conflict Alert include reduced false/missed alerts and increased time to predicted loss of separation. Other metrics that will be used to evaluate the tool's impact on the controller include controller intervention, workload, and situation awareness.

  20. IslandFAST: A Semi-numerical Tool for Simulating the Late Epoch of Reionization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Yidong; Chen, Xuelei; Yue, Bin

    2017-08-01

We present the algorithm and main results of our semi-numerical simulation, islandFAST, which was developed from 21cmFAST and designed for the late stage of reionization. The islandFAST simulation predicts the evolution and size distribution of the large-scale underdense neutral regions (neutral islands), and we find that the late Epoch of Reionization proceeds very fast, showing a characteristic scale of the neutral islands at each redshift. Using islandFAST, we compare the impact of two types of absorption systems, i.e., the large-scale underdense neutral islands versus small-scale overdense absorbers, in regulating the reionization process. The neutral islands dominate the morphology of the ionization field, while the small-scale absorbers dominate the mean free path of ionizing photons, and also delay and prolong the reionization process. With our semi-numerical simulation, the evolution of the ionizing background can be derived self-consistently given a model for the small absorbers. The hydrogen ionization rate of the ionizing background is reduced by an order of magnitude in the presence of dense absorbers.

  1. Numerical and experimental study on the wave attenuation in bone--FDTD simulation of ultrasound propagation in cancellous bone.

    PubMed

    Nagatani, Yoshiki; Mizuno, Katsunori; Saeki, Takashi; Matsukawa, Mami; Sakaguchi, Takefumi; Hosoi, Hiroshi

    2008-11-01

In cancellous bone, longitudinal waves often separate into fast and slow waves depending on the alignment of bone trabeculae in the propagation path. This interesting phenomenon becomes an effective tool for the diagnosis of osteoporosis because wave propagation behavior depends on the bone structure. Since the fast wave mainly propagates in trabeculae, this wave is considered to reflect the structure of trabeculae. For a new diagnosis method using the information of this fast wave, therefore, it is necessary to understand the generation mechanism and propagation behavior precisely. In this study, the generation process of the fast wave was examined by numerical simulations using the elastic finite-difference time-domain (FDTD) method and by experimental measurements. As simulation models, three-dimensional X-ray computed tomography (CT) data of actual bone samples were used. Simulation and experimental results showed that the attenuation of the fast wave was always higher in the early stage of propagation and gradually decreased as the wave propagated in bone. This phenomenon is supposed to come from the complicated propagation paths of fast waves in cancellous bone.
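
    For intuition on the attenuation result, a wave's attenuation coefficient can be estimated from amplitudes observed at two propagation distances. A minimal sketch with invented numbers (not the paper's data), showing higher attenuation early in propagation:

    ```python
    # Attenuation from amplitude decay: alpha (dB/cm) = (20/d) * log10(a1/a2).
    import math

    def attenuation_db_per_cm(a1, a2, d_cm):
        """Attenuation over a path of length d_cm from amplitude a1 to a2."""
        return (20.0 / d_cm) * math.log10(a1 / a2)

    # Amplitude halves over the first 0.5 cm, then falls only 20% over the next:
    early = attenuation_db_per_cm(1.00, 0.50, 0.5)  # ~12.0 dB/cm
    late = attenuation_db_per_cm(0.50, 0.40, 0.5)   # ~3.9 dB/cm
    print(early, late)  # higher attenuation early in propagation, as reported
    ```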

  2. AeroDyn V15.04: Design tool for wind and MHK turbines

    DOE Data Explorer

    Murray, Robynne; Hayman, Greg; Jonkman, Jason

    2017-04-28

    AeroDyn is a time-domain wind and MHK turbine aerodynamics module that can be coupled into the FAST version 8 multi-physics engineering tool to enable aero-elastic simulation of horizontal-axis wind turbines. AeroDyn V15.04 has been updated to include a cavitation check for MHK turbines, and can be driven as a standalone code to compute wind turbine aerodynamic response uncoupled from FAST. Note that while AeroDyn has been updated to v15.04, FAST v8.16 has not yet been updated and still uses AeroDyn v15.03.

  3. A Fast-Time Simulation Environment for Airborne Merging and Spacing Research

    NASA Technical Reports Server (NTRS)

    Bussink, Frank J. L.; Doble, Nathan A.; Barmore, Bryan E.; Singer, Sharon

    2005-01-01

    As part of NASA's Distributed Air/Ground Traffic Management (DAG-TM) effort, NASA Langley Research Center is developing concepts and algorithms for merging multiple aircraft arrival streams and precisely spacing aircraft over the runway threshold. An airborne tool has been created for this purpose, called Airborne Merging and Spacing for Terminal Arrivals (AMSTAR). To evaluate the performance of AMSTAR and complement human-in-the-loop experiments, a simulation environment has been developed that enables fast-time studies of AMSTAR operations. The environment is based on TMX, a multiple aircraft desktop simulation program created by the Netherlands National Aerospace Laboratory (NLR). This paper reviews the AMSTAR concept, discusses the integration of the AMSTAR algorithm into TMX and the enhancements added to TMX to support fast-time AMSTAR studies, and presents initial simulation results.

  4. Taxi Time Prediction at Charlotte Airport Using Fast-Time Simulation and Machine Learning Techniques

    NASA Technical Reports Server (NTRS)

    Lee, Hanbong

    2016-01-01

Accurate taxi time prediction is required for enabling efficient runway scheduling that can increase runway throughput and reduce taxi times and fuel consumption on the airport surface. Currently, NASA and American Airlines are jointly developing a decision-support tool called Spot and Runway Departure Advisor (SARDA) that assists airport ramp controllers in making gate pushback decisions and improving the overall efficiency of airport surface traffic. In this presentation, we propose to use Linear Optimized Sequencing (LINOS), a discrete-event fast-time simulation tool, to predict taxi times and provide the estimates to the runway scheduler in real-time airport operations. To assess its prediction accuracy, we also introduce a data-driven analytical method using machine learning techniques. These two taxi time prediction methods are evaluated with actual taxi time data obtained from the SARDA human-in-the-loop (HITL) simulation for Charlotte Douglas International Airport (CLT) using various performance measurement metrics. Based on the taxi time prediction results, we also discuss how the prediction accuracy can be affected by the operational complexity at this airport and how we can improve the fast-time simulation model before implementing it with an airport scheduling algorithm in a real-time environment.

  5. Telescience - Concepts and contributions to the Extreme Ultraviolet Explorer mission

    NASA Technical Reports Server (NTRS)

    Marchant, Will; Dobson, Carl; Chakrabarti, Supriya; Malina, Roger F.

    1987-01-01

It is shown how the contradictory goals of low cost and fast data turnaround characterizing the Extreme Ultraviolet Explorer (EUVE) mission can be achieved via the early use of telescience-style transparent tools and simulations. The use of transparent tools reduces the parallel development of capability while ensuring that valuable prelaunch experience is not lost in the operations phase. Efforts made to upgrade the 'EUVE electronics' simulator are described.

  6. Simulating Biomass Fast Pyrolysis at the Single Particle Scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ciesielski, Peter; Wiggins, Gavin; Daw, C Stuart

    2017-07-01

Simulating fast pyrolysis at the scale of single particles allows for the investigation of the impacts of feedstock-specific parameters such as particle size, shape, and species of origin. For this reason particle-scale modeling has emerged as an important tool for understanding how variations in feedstock properties affect the outcomes of pyrolysis processes. The origins of feedstock properties are largely dictated by the composition and hierarchical structure of biomass, from the microstructural porosity to the external morphology of milled particles. These properties may be accounted for in simulations of fast pyrolysis by several different computational approaches depending on the level of structural and chemical complexity included in the model. The predictive utility of particle-scale simulations of fast pyrolysis can still be enhanced substantially by advancements in several areas. Most notably, considerable progress would be facilitated by the development of pyrolysis kinetic schemes that are decoupled from transport phenomena, predict product evolution from whole-biomass with increased chemical speciation, and are still tractable with present-day computational resources.

  7. Simulation of DKIST solar adaptive optics system

    NASA Astrophysics Data System (ADS)

    Marino, Jose; Carlisle, Elizabeth; Schmidt, Dirk

    2016-07-01

Solar adaptive optics (AO) simulations are a valuable tool to guide the design and optimization process of current and future solar AO and multi-conjugate AO (MCAO) systems. Solar AO and MCAO systems rely on extended-object cross-correlating Shack-Hartmann wavefront sensors to measure the wavefront. Accurate solar AO simulations require computationally intensive operations, which have until recently presented a prohibitive computational cost. We present an update on the status of a solar AO and MCAO simulation tool being developed at the National Solar Observatory. The simulation tool is a multi-threaded application written in the C++ language that takes advantage of current large multi-core CPU computer systems and fast ethernet connections to provide accurate full simulation of solar AO and MCAO systems. It interfaces with KAOS, a state-of-the-art solar AO control software package developed by the Kiepenheuer-Institut fuer Sonnenphysik that provides reliable AO control. We report on the latest results produced by the solar AO simulation tool.

  8. Beam-Dynamics Analysis of Long-Range Wakefield Effects on the SCRF Cavities at the Fast Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Young-Min; Bishofberger, Kip; Carlsten, Bruce

Long-range wakefields in superconducting RF (SCRF) cavities create complicated effects on beam dynamics in SCRF-based FEL beamlines. The driving bunch effectively excites an infinite number of structure modes (including HOMs) which oscillate within the SCRF cavity. Couplers with loads are used to damp the HOMs. However, these HOMs can persist for long periods of time in superconducting structures, which leads to long-range wakefields. A clear understanding of long-range wakefield effects is a critical element for risk mitigation of future SCRF accelerators such as the XFEL at DESY, LCLS-II XFEL, and MaRIE XFEL. We are currently developing numerical tools for simulating long-range wakefields in SCRF accelerators and plan to experimentally verify the tools by measuring these wakefields at the Fermilab Accelerator Science and Technology (FAST) facility. This paper previews the experimental conditions at the FAST 50 MeV beamline based on the simulation results.

  9. Validation of the SimSET simulation package for modeling the Siemens Biograph mCT PET scanner

    NASA Astrophysics Data System (ADS)

    Poon, Jonathan K.; Dahlbom, Magnus L.; Casey, Michael E.; Qi, Jinyi; Cherry, Simon R.; Badawi, Ramsey D.

    2015-02-01

    Monte Carlo simulation provides a valuable tool in performance assessment and optimization of system design parameters for PET scanners. SimSET is a popular Monte Carlo simulation toolkit that features fast simulation time, as well as variance reduction tools to further enhance computational efficiency. However, SimSET has lacked the ability to simulate block detectors until its most recent release. Our goal is to validate new features of SimSET by developing a simulation model of the Siemens Biograph mCT PET scanner and comparing the results to a simulation model developed in the GATE simulation suite and to experimental results. We used the NEMA NU-2 2007 scatter fraction, count rates, and spatial resolution protocols to validate the SimSET simulation model and its new features. The SimSET model overestimated the experimental results of the count rate tests by 11-23% and the spatial resolution test by 13-28%, which is comparable to previous validation studies of other PET scanners in the literature. The difference between the SimSET and GATE simulation was approximately 4-8% for the count rate test and approximately 3-11% for the spatial resolution test. In terms of computational time, SimSET performed simulations approximately 11 times faster than GATE simulations. The new block detector model in SimSET offers a fast and reasonably accurate simulation toolkit for PET imaging applications.

  10. Fast scattering simulation tool for multi-energy x-ray imaging

    NASA Astrophysics Data System (ADS)

    Sossin, A.; Tabary, J.; Rebuffel, V.; Létang, J. M.; Freud, N.; Verger, L.

    2015-12-01

    A combination of Monte Carlo (MC) and deterministic approaches was employed as a means of creating a simulation tool capable of providing energy resolved x-ray primary and scatter images within a reasonable time interval. Libraries of Sindbad, a previously developed x-ray simulation software, were used in the development. The scatter simulation capabilities of the tool were validated through simulation with the aid of GATE and through experimentation by using a spectrometric CdTe detector. A simple cylindrical phantom with cavities and an aluminum insert was used. Cross-validation with GATE showed good agreement with a global spatial error of 1.5% and a maximum scatter spectrum error of around 6%. Experimental validation also supported the accuracy of the simulations obtained from the developed software with a global spatial error of 1.8% and a maximum error of around 8.5% in the scatter spectra.

  11. 42: An Open-Source Simulation Tool for Study and Design of Spacecraft Attitude Control Systems

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric

    2018-01-01

    Simulation is an important tool in the analysis and design of spacecraft attitude control systems. The speaker will discuss the simulation tool, called simply 42, that he has developed over the years to support his own work as an engineer in the Attitude Control Systems Engineering Branch at NASA Goddard Space Flight Center. 42 was intended from the outset to be high-fidelity and powerful, but also fast and easy to use. 42 is publicly available as open source since 2014. The speaker will describe some of 42's models and features, and discuss its applicability to studies ranging from early concept studies through the design cycle, integration, and operations. He will outline 42's architecture and share some thoughts on simulation development as a long-term project.

  12. A fast method for optical simulation of flood maps of light-sharing detector modules.

    PubMed

    Shi, Han; Du, Dong; Xu, JianFeng; Moses, William W; Peng, Qiyu

    2015-12-01

Optical simulation at the detector module level is highly desired for Positron Emission Tomography (PET) system design. Commonly used simulation toolkits such as GATE are not efficient in the optical simulation of detector modules with complicated light-sharing configurations, where a vast amount of photons need to be tracked. We present a fast approach based on a simplified specular reflectance model and a structured light-tracking algorithm to speed up the photon tracking in detector modules constructed with polished finish and specular reflector materials. We simulated conventional block detector designs with different slotted light guide patterns using the new approach and compared the outcomes with those from GATE simulations. While the two approaches generated comparable flood maps, the new approach was more than 200-600 times faster. The new approach has also been validated by constructing a prototype detector and comparing the simulated flood map with the experimental flood map. The experimental flood map has nearly uniformly distributed spots similar to those in the simulated flood map. In conclusion, the new approach provides a fast and reliable simulation tool for assisting in the development of light-sharing-based detector modules with a polished surface finish and using specular reflector materials.
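
    The simplified specular reflectance model named in the abstract rests on the standard mirror-reflection rule. A minimal sketch of that single step (the full light-sharing detector model is far more involved; this code is illustrative only):

    ```python
    # Specular reflection of a photon direction d off a surface with unit
    # normal n: r = d - 2 (d . n) n.
    import numpy as np

    def reflect(d, n):
        """Mirror-reflect unit direction d off a polished surface with normal n."""
        return d - 2.0 * np.dot(d, n) * n

    d = np.array([1.0, -1.0, 0.0]) / np.sqrt(2.0)  # incoming at 45 degrees
    n = np.array([0.0, 1.0, 0.0])                  # crystal face normal
    print(reflect(d, n))                           # [0.707..., 0.707..., 0.]
    ```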

  13. 3D liver volume reconstructed for palpation training.

    PubMed

    Tibamoso, Gerardo; Perez-Gutierrez, Byron; Uribe-Quevedo, Alvaro

    2013-01-01

Virtual Reality systems for medical procedures, such as the palpation of different organs, require fast, robust, accurate, and reliable computational methods to provide realism during interaction with 3D biological models. This paper presents the segmentation, reconstruction, and palpation simulation of a healthy liver volume as a tool for training. The chosen method considers the mechanical characteristics and liver properties needed to correctly simulate palpation interactions, which makes it appropriate as a complementary tool for familiarizing medical students with the liver anatomy.

  14. Fast multipurpose Monte Carlo simulation for proton therapy using multi- and many-core CPU architectures.

    PubMed

    Souris, Kevin; Lee, John Aldo; Sterpin, Edmond

    2016-04-01

Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the last generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suitable to MC methods. The class-II condensed history algorithm of MCsquare provides a fast and yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked with the gate/geant4 Monte Carlo application for homogeneous and heterogeneous geometries. Comparisons with gate/geant4 for various geometries show deviations within 2%-1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. Optimized code enables the use of accurate MC calculation within a reasonable computation time, adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.

  15. A fast ultrasonic simulation tool based on massively parallel implementations

    NASA Astrophysics Data System (ADS)

    Lambert, Jason; Rougeron, Gilles; Lacassagne, Lionel; Chatillon, Sylvain

    2014-02-01

This paper presents a CIVA optimized ultrasonic inspection simulation tool, which takes benefit of the power of massively parallel architectures: graphical processing units (GPU) and multi-core general purpose processors (GPP). This tool is based on the classical approach used in CIVA: the interaction model is based on Kirchhoff, and the ultrasonic field around the defect is computed by the pencil method. The model has been adapted and parallelized for both architectures. At this stage, the configurations addressed by the tool are: multi- and mono-element probes, planar specimens made of simple isotropic materials, planar rectangular defects or side drilled holes of small diameter. Validations of the model accuracy and performance measurements are presented.

  16. Application of Fast Dynamic Allan Variance for the Characterization of FOGs-Based Measurement While Drilling.

    PubMed

    Wang, Lu; Zhang, Chunxi; Gao, Shuang; Wang, Tao; Lin, Tie; Li, Xianmu

    2016-12-07

The stability of a fiber optic gyroscope (FOG) in measurement while drilling (MWD) could vary with time because of changing temperature, high vibration, and sudden power failure. The dynamic Allan variance (DAVAR) is a sliding version of the Allan variance. It is a practical tool that could represent the non-stationary behavior of the gyroscope signal. Since the normal DAVAR takes too long to deal with long time series, a fast DAVAR algorithm has been developed to accelerate the computation speed. However, both the normal DAVAR algorithm and the fast algorithm become invalid for discontinuous time series. What is worse, the FOG-based MWD underground often keeps working for several days; the gyro data collected aboveground are not only very long, but also sometimes discontinuous in the timeline. In this article, on the basis of the fast algorithm for DAVAR, we make a further advance in the fast algorithm (improved fast DAVAR) to extend the fast DAVAR to discontinuous time series. The improved fast DAVAR and the normal DAVAR are used to respectively characterize two sets of simulation data. The simulation results show that when the length of the time series is short, the improved fast DAVAR saves 78.93% of calculation time. When the length of the time series is long (6 × 10^5 samples), the improved fast DAVAR reduces calculation time by 97.09%. Another set of simulation data with missing data is characterized by the improved fast DAVAR. Its simulation results prove that the improved fast DAVAR could successfully deal with discontinuous data. In the end, a vibration experiment with FOGs-based MWD has been implemented to validate the good performance of the improved fast DAVAR. The results of the experiment testify that the improved fast DAVAR not only shortens computation time, but could also analyze discontinuous time series.
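
    For readers unfamiliar with the DAVAR, here is a minimal sketch of the underlying idea: an Allan variance computed inside a window that slides along the gyro record, so non-stationarity shows up as a time-varying curve. The paper's fast and discontinuity-tolerant algorithms are not reproduced, and all parameters below are illustrative:

    ```python
    # Sliding-window (dynamic) Allan variance sketch.
    import numpy as np

    def avar(y, m):
        """Overlapping Allan variance of rate samples y at cluster size m."""
        ybar = np.convolve(y, np.ones(m) / m, mode="valid")  # cluster means
        d = ybar[m:] - ybar[:-m]                             # cluster differences
        return 0.5 * np.mean(d**2)

    def davar(y, window, step, m):
        """Allan variance at cluster size m inside each sliding window."""
        return [avar(y[s:s + window], m)
                for s in range(0, len(y) - window + 1, step)]

    rng = np.random.default_rng(0)
    y = rng.normal(0.0, 1.0, 10_000)
    y[5_000:] *= 3.0  # noise level changes mid-record (non-stationary signal)
    print(davar(y, window=2_000, step=1_000, m=10)[:3])   # quiet early windows
    print(davar(y, window=2_000, step=1_000, m=10)[-3:])  # ~9x larger variance
    ```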

  17. Application of Fast Dynamic Allan Variance for the Characterization of FOGs-Based Measurement While Drilling

    PubMed Central

    Wang, Lu; Zhang, Chunxi; Gao, Shuang; Wang, Tao; Lin, Tie; Li, Xianmu

    2016-01-01

The stability of a fiber optic gyroscope (FOG) in measurement while drilling (MWD) could vary with time because of changing temperature, high vibration, and sudden power failure. The dynamic Allan variance (DAVAR) is a sliding version of the Allan variance. It is a practical tool that could represent the non-stationary behavior of the gyroscope signal. Since the normal DAVAR takes too long to deal with long time series, a fast DAVAR algorithm has been developed to accelerate the computation speed. However, both the normal DAVAR algorithm and the fast algorithm become invalid for discontinuous time series. What is worse, the FOG-based MWD underground often keeps working for several days; the gyro data collected aboveground are not only very long, but also sometimes discontinuous in the timeline. In this article, on the basis of the fast algorithm for DAVAR, we make a further advance in the fast algorithm (improved fast DAVAR) to extend the fast DAVAR to discontinuous time series. The improved fast DAVAR and the normal DAVAR are used to respectively characterize two sets of simulation data. The simulation results show that when the length of the time series is short, the improved fast DAVAR saves 78.93% of calculation time. When the length of the time series is long (6 × 10^5 samples), the improved fast DAVAR reduces calculation time by 97.09%. Another set of simulation data with missing data is characterized by the improved fast DAVAR. Its simulation results prove that the improved fast DAVAR could successfully deal with discontinuous data. In the end, a vibration experiment with FOGs-based MWD has been implemented to validate the good performance of the improved fast DAVAR. The results of the experiment testify that the improved fast DAVAR not only shortens computation time, but could also analyze discontinuous time series. PMID:27941600

  18. DKIST Adaptive Optics System: Simulation Results

    NASA Astrophysics Data System (ADS)

    Marino, Jose; Schmidt, Dirk

    2016-05-01

The 4 m class Daniel K. Inouye Solar Telescope (DKIST), currently under construction, will be equipped with an ultra-high-order solar adaptive optics (AO) system. The requirements and capabilities of such a solar AO system are beyond those of any other solar AO system currently in operation. We must rely on solar AO simulations to estimate and quantify its performance. We present performance estimation results of the DKIST AO system obtained with a new solar AO simulation tool. This simulation tool is a flexible and fast end-to-end solar AO simulator which produces accurate solar AO simulations while taking advantage of current multi-core computer technology. It relies on full imaging simulations of the extended-field Shack-Hartmann wavefront sensor (WFS), which directly includes important secondary effects such as field-dependent distortions and varying contrast of the WFS sub-aperture images.

  19. A fast method for optical simulation of flood maps of light-sharing detector modules

    PubMed Central

    Shi, Han; Du, Dong; Xu, JianFeng; Moses, William W.; Peng, Qiyu

    2016-01-01

Optical simulation at the detector module level is highly desired for Positron Emission Tomography (PET) system design. Commonly used simulation toolkits such as GATE are not efficient in the optical simulation of detector modules with complicated light-sharing configurations, where a vast amount of photons need to be tracked. We present a fast approach based on a simplified specular reflectance model and a structured light-tracking algorithm to speed up the photon tracking in detector modules constructed with polished finish and specular reflector materials. We simulated conventional block detector designs with different slotted light guide patterns using the new approach and compared the outcomes with those from GATE simulations. While the two approaches generated comparable flood maps, the new approach was more than 200–600 times faster. The new approach has also been validated by constructing a prototype detector and comparing the simulated flood map with the experimental flood map. The experimental flood map has nearly uniformly distributed spots similar to those in the simulated flood map. In conclusion, the new approach provides a fast and reliable simulation tool for assisting in the development of light-sharing-based detector modules with a polished surface finish and using specular reflector materials. PMID:27660376

  20. A fast method for optical simulation of flood maps of light-sharing detector modules

    DOE PAGES

    Shi, Han; Du, Dong; Xu, JianFeng; ...

    2015-09-03

Optical simulation at the detector module level is highly desired for Positron Emission Tomography (PET) system design. Commonly used simulation toolkits such as GATE are not efficient in the optical simulation of detector modules with complicated light-sharing configurations, where a vast amount of photons need to be tracked. Here, we present a fast approach based on a simplified specular reflectance model and a structured light-tracking algorithm to speed up the photon tracking in detector modules constructed with polished finish and specular reflector materials. We simulated conventional block detector designs with different slotted light guide patterns using the new approach and compared the outcomes with those from GATE simulations. While the two approaches generated comparable flood maps, the new approach was more than 200–600 times faster. The new approach has also been validated by constructing a prototype detector and comparing the simulated flood map with the experimental flood map. The experimental flood map has nearly uniformly distributed spots similar to those in the simulated flood map. In conclusion, the new approach provides a fast and reliable simulation tool for assisting in the development of light-sharing-based detector modules with a polished surface finish and using specular reflector materials.

  1. Advanced surface design for logistics analysis

    NASA Astrophysics Data System (ADS)

    Brown, Tim R.; Hansen, Scott D.

    The development of anthropometric arm/hand and tool models and their manipulation in a large system model for maintenance simulation are discussed. The use of Advanced Surface Design and s-fig technology in anthropometrics, and three-dimensional graphics simulation tools, are found to achieve a good balance between model manipulation speed and model accuracy. The present second generation models are shown to be twice as fast to manipulate as the first generation b-surf models, to be easier to manipulate into various configurations, and to more closely approximate human contours.

  2. Validation of Hydrodynamic Load Models Using CFD for the OC4-DeepCwind Semisubmersible: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benitz, M. A.; Schmidt, D. P.; Lackner, M. A.

Computational fluid dynamics (CFD) simulations were carried out on the OC4-DeepCwind semi-submersible to obtain a better understanding of how to set hydrodynamic coefficients for the structure when using an engineering tool such as FAST to model the system. The focus here was on the drag behavior and the effects of the free surface, free ends, and multi-member arrangement of the semi-submersible structure. These effects are investigated through code-to-code comparisons and flow visualizations. The implications for mean load predictions from engineering tools are addressed. The work presented here suggests that the selection of drag coefficients should take into consideration a variety of geometric factors. Furthermore, CFD simulations demonstrate large time-varying loads due to vortex shedding, which FAST's hydrodynamic module, HydroDyn, does not model. The implications of these oscillatory loads on fatigue life need to be addressed.

  3. XS: a FASTQ read simulator.

    PubMed

    Pratas, Diogo; Pinho, Armando J; Rodrigues, João M O S

    2014-01-16

The emerging next-generation sequencing (NGS) is bringing, besides the natural huge amounts of data, an avalanche of new specialized tools (for analysis, compression, and alignment, among others) and large public and private network infrastructures. Therefore, a direct need for specific simulation tools for testing and benchmarking is arising, such as a flexible and portable FASTQ read simulator, without the need of a reference sequence, yet correctly prepared to produce approximately the same characteristics as real data. We present XS, a skilled FASTQ read simulation tool, flexible, portable (does not need a reference sequence) and tunable in terms of sequence complexity. It has several running modes, depending on the time and memory available, and is aimed at testing computing infrastructures, namely cloud computing of large-scale projects, and testing FASTQ compression algorithms. Moreover, XS offers the possibility of simulating the three main FASTQ components individually (headers, DNA sequences and quality-scores). XS provides an efficient and convenient method for fast simulation of FASTQ files, such as those from Ion Torrent (currently uncovered by other simulators), Roche-454, Illumina and ABI-SOLiD sequencing machines. This tool is publicly available at http://bioinformatics.ua.pt/software/xs/.
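
    A minimal sketch of the core task a reference-free FASTQ simulator like XS performs, emitting header/sequence/separator/quality records. XS's tunable sequence complexity, header formats, and platform-specific profiles are not modeled here, and all names below are hypothetical:

    ```python
    # Toy reference-free FASTQ read generator.
    import random

    def simulate_fastq(n_reads, read_len, seed=42):
        """Yield FASTQ records with random DNA and Phred+33 quality strings."""
        rng = random.Random(seed)
        for i in range(n_reads):
            seq = "".join(rng.choice("ACGT") for _ in range(read_len))
            # Qualities drawn uniformly from Q20..Q40, for illustration only
            qual = "".join(chr(33 + rng.randint(20, 40)) for _ in range(read_len))
            yield f"@sim_read_{i}\n{seq}\n+\n{qual}"

    for record in simulate_fastq(2, 50):
        print(record)
    ```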

  4. Toward high-speed 3D nonlinear soft tissue deformation simulations using Abaqus software.

    PubMed

    Idkaidek, Ashraf; Jasiuk, Iwona

    2015-12-01

    We aim to achieve a fast and accurate three-dimensional (3D) simulation of a porcine liver deformation under a surgical tool pressure using the commercial finite element software Abaqus. The liver geometry is obtained using magnetic resonance imaging, and a nonlinear constitutive law is employed to capture large deformations of the tissue. Effects of implicit versus explicit analysis schemes, element type, and mesh density on computation time are studied. We find that Abaqus explicit and implicit solvers are capable of simulating nonlinear soft tissue deformations accurately using first-order tetrahedral elements in a relatively short time by optimizing the element size. This study provides new insights and guidance on accurate and relatively fast nonlinear soft tissue simulations. Such simulations can provide force feedback during robotic surgery and allow visualization of tissue deformations for surgery planning and training of surgical residents.

  5. A fast simulation method for radiation maps using interpolation in a virtual environment.

    PubMed

    Li, Meng-Kun; Liu, Yong-Kuo; Peng, Min-Jun; Xie, Chun-Li; Yang, Li-Qun

    2018-05-10

In nuclear decommissioning, virtual simulation technology is a useful tool to achieve an effective work process by using virtual environments to represent the physical and logical scheme of a real decommissioning project. This technology is cost-saving and time-saving, with the capacity to develop various decommissioning scenarios and reduce the risk of retrofitting. Such simulations use a radiation map in the virtual environment as the basis for assessing the exposure of a virtual human. In this paper, we propose a fast simulation method using a known radiation source. The method has a unique advantage over point kernel and Monte Carlo methods because it generates the radiation map using interpolation in a virtual environment. The simulation of the radiation map, including the calculation and the visualisation, was realised using UNITY and MATLAB. The feasibility of the proposed method was tested on a hypothetical case and the results obtained are discussed in this paper.
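
    As an illustration of the interpolation idea (the paper's actual scheme may differ; this sketch assumes simple inverse-distance weighting with invented sample points and dose values):

    ```python
    # Inverse-distance-weighted interpolation of a sparse radiation map.
    import numpy as np

    def idw(known_pts, known_dose, query_pts, power=2.0, eps=1e-9):
        """Estimate dose at each query point from sparsely sampled dose rates."""
        out = []
        for q in query_pts:
            d = np.linalg.norm(known_pts - q, axis=1)
            if d.min() < eps:                 # query sits exactly on a sample
                out.append(known_dose[d.argmin()])
                continue
            w = 1.0 / d**power                # nearer samples weigh more
            out.append(np.sum(w * known_dose) / np.sum(w))
        return np.array(out)

    pts = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
    dose = np.array([5.0, 1.0, 1.0])          # mSv/h at sampled points
    grid = np.array([[2.0, 2.0], [8.0, 1.0]])
    print(idw(pts, dose, grid))               # higher dose nearer the source
    ```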

  6. Will Your Battery Survive a World With Fast Chargers?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neubauer, J. S.; Wood, E.

Fast charging is attractive to battery electric vehicle (BEV) drivers for its ability to enable long-distance travel and quickly recharge depleted batteries on short notice. However, such aggressive charging and the sustained vehicle operation that results could lead to excessive battery temperatures and degradation. Properly assessing the consequences of fast charging requires accounting for the disparate cycling, heating, and aging of individual cells in large BEV packs when subjected to realistic travel patterns, usage of fast chargers, and climates over long durations (i.e., years). The U.S. Department of Energy's Vehicle Technologies Office has supported the National Renewable Energy Laboratory's development of BLAST-V, the Battery Lifetime Analysis and Simulation Tool for Vehicles, to create a tool capable of accounting for all of these factors. We present the findings of applying this tool to realistic fast charge scenarios. The effects of different travel patterns, climates, battery sizes, battery thermal management systems, and other factors on battery performance and degradation are presented. We find that the impact of realistic fast charging on battery degradation is minimal for most drivers, due to the low frequency of use. However, in the absence of active battery cooling systems, a driver's desired utilization of a BEV and fast charging infrastructure can result in unsafe peak battery temperatures. We find that active battery cooling systems can control peak battery temperatures to safe limits while allowing the desired use of the vehicle.

  7. Integration of SimSET photon history generator in GATE for efficient Monte Carlo simulations of pinhole SPECT.

    PubMed

    Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J S; Tsui, Benjamin M W

    2008-07-01

    The authors developed and validated an efficient Monte Carlo simulation (MCS) workflow to facilitate small-animal pinhole SPECT imaging research. This workflow seamlessly integrates two existing MCS tools: the simulation system for emission tomography (SimSET) and the GEANT4 application for emission tomography (GATE). Specifically, we retained the strength of GATE in describing complex collimator/detector configurations, to meet the anticipated needs of studying advanced pinhole collimation (e.g., multipinhole) geometry, while inserting the fast SimSET photon history generator (PHG) to circumvent the relatively slow GEANT4 MCS code used by GATE in simulating photon interactions inside voxelized phantoms. For validation, data generated from this new SimSET-GATE workflow were compared with those from GATE-only simulations as well as with experimental measurements obtained using a commercial small-animal pinhole SPECT system. Our results showed excellent agreement (e.g., in system point response functions and energy spectra) between SimSET-GATE and GATE-only simulations and, more importantly, a significant computational speedup (up to approximately 10-fold) provided by the new workflow. Satisfactory agreement between MCS results and experimental data was also observed. In conclusion, the authors have successfully integrated the SimSET photon history generator into GATE for fast and realistic pinhole SPECT simulations, which can facilitate research in, for example, the development and application of quantitative pinhole and multipinhole SPECT for small-animal imaging. This integrated simulation tool can also be adapted for studying other preclinical and clinical SPECT techniques.

  8. Fast multipurpose Monte Carlo simulation for proton therapy using multi- and many-core CPU architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Souris, Kevin, E-mail: kevin.souris@uclouvain.be; Lee, John Aldo; Sterpin, Edmond

    2016-04-15

    Purpose: Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. Methods: A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the last generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suitable for MC methods. The class-II condensed history algorithm of MCsquare provides a fast and yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked against the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Results: Comparisons with GATE/GEANT4 for various geometries show deviations within 2%/1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. Conclusions: MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. The optimized code enables accurate MC calculation within a computation time adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.
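
    A deliberately simplified toy of the class-II split described above (hard ionizations sampled individually above a threshold, soft losses applied continuously); the stopping powers, event rates, and loss spectrum below are invented for illustration and bear no relation to MCsquare's actual physics:

      import numpy as np

      rng = np.random.default_rng(0)
      THRESHOLD = 0.1                      # MeV, user-specified hard-event threshold

      def restricted_stopping_power(e):    # soft (sub-threshold) loss rate, MeV/mm
          return 0.3 + 30.0 / max(e, 10.0)

      def hard_event_rate(e):              # assumed mean hard ionizations per mm
          return 0.05 + 5.0 / max(e, 10.0)

      def transport(e0=200.0, step_mm=1.0):
          """March one proton through water-equivalent slabs until it stops."""
          e, depth = e0, 0.0
          while e > THRESHOLD:
              e -= restricted_stopping_power(e) * step_mm        # continuous soft loss
              n_hard = rng.poisson(hard_event_rate(e) * step_mm) # discrete hard events
              for _ in range(n_hard):
                  e -= THRESHOLD / (1.0 - 0.9 * rng.random())    # assumed 1/T^2 loss spectrum
              depth += step_mm
          return depth

      print(transport())   # toy "range" in mm for a 200 MeV proton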

  9. Effects of ATC automation on precision approaches to closely spaced parallel runways

    NASA Technical Reports Server (NTRS)

    Slattery, R.; Lee, K.; Sanford, B.

    1995-01-01

    Improved navigational technology (such as the Microwave Landing System and the Global Positioning System) installed in modern aircraft will enable air traffic controllers to better utilize available airspace. Consequently, arrival traffic can fly approaches to parallel runways separated by smaller distances than are currently allowed. Previous simulation studies of advanced navigation approaches have found that controller workload is increased when there is a combination of aircraft that are capable of following advanced navigation routes and aircraft that are not. Research into Air Traffic Control automation at Ames Research Center has led to the development of the Center-TRACON Automation System (CTAS). The Final Approach Spacing Tool (FAST) is the component of the CTAS used in the TRACON area. The work in this paper examines, via simulation, the effects of FAST used for aircraft landing on closely spaced parallel runways. The simulation contained various combinations of aircraft, equipped and unequipped with advanced navigation systems. A set of simulations was run both manually and with an augmented set of FAST advisories to sequence aircraft, assign runways, and avoid conflicts. The results of the simulations are analyzed, measuring the airport throughput, aircraft delay, loss of separation, and controller workload.

  10. Disaster response team FAST skills training with a portable ultrasound simulator compared to traditional training: pilot study.

    PubMed

    Paddock, Michael T; Bailitz, John; Horowitz, Russ; Khishfe, Basem; Cosby, Karen; Sergel, Michelle J

    2015-03-01

    Pre-hospital focused assessment with sonography in trauma (FAST) has been used effectively to improve patient care in multiple mass casualty events throughout the world. Although requisite FAST knowledge may now be learned remotely by disaster response team members, traditional live-instructor, hands-on FAST skills training with models remains logistically challenging. The objective of this pilot study was to compare the effectiveness of a novel portable ultrasound (US) simulator with traditional FAST skills training for a deployed mixed-provider disaster response team. We randomized participants into one of three training groups stratified by provider role: Group A, Traditional Skills Training; Group B, US Simulator Skills Training; and Group C, Traditional Skills Training Plus US Simulator Skills Training. After skills training, we measured participants' FAST image acquisition and interpretation skills using a standardized direct observation tool (SDOT) with healthy models and review of FAST patient images. Pre- and post-course US and FAST knowledge were also assessed using a previously validated multiple-choice evaluation. We used the ANOVA procedure to determine the statistical significance of differences between the mean skills scores of each group. Paired-sample t-tests were used to determine the statistical significance of pre- and post-course mean knowledge scores within groups. We enrolled 36 participants, 12 randomized to each training group. Randomization resulted in a similar distribution of participants between training groups with respect to provider role, age, sex, and prior US training. For the FAST SDOT image acquisition and interpretation mean skills scores, there was no statistically significant difference between training groups. For US and FAST mean knowledge scores, there was a statistically significant improvement between pre- and post-course scores within each group, but again no statistically significant difference between training groups. This pilot study of a deployed mixed-provider disaster response team suggests that a novel portable US simulator may provide skills training equivalent to traditional live instructor and model training. Further studies with a larger sample size and other measures of short- and long-term clinical performance are warranted.
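
    A minimal sketch of the statistical comparisons described, using fabricated placeholder scores rather than study data:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      group_a = rng.normal(80, 8, 12)   # Traditional (placeholder scores)
      group_b = rng.normal(82, 8, 12)   # US simulator
      group_c = rng.normal(83, 8, 12)   # Traditional + US simulator

      # One-way ANOVA across the three training groups
      f_stat, p_between = stats.f_oneway(group_a, group_b, group_c)
      print(f"ANOVA across groups: F={f_stat:.2f}, p={p_between:.3f}")

      # Paired pre/post knowledge comparison within one group
      pre = rng.normal(60, 10, 12)
      post = pre + rng.normal(15, 5, 12)
      t_stat, p_within = stats.ttest_rel(pre, post)
      print(f"Paired pre/post t-test: t={t_stat:.2f}, p={p_within:.4f}")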

  11. Advanced computational simulations of water waves interacting with wave energy converters

    NASA Astrophysics Data System (ADS)

    Pathak, Ashish; Freniere, Cole; Raessi, Mehdi

    2017-03-01

    Wave energy converter (WEC) devices harness renewable ocean wave energy and convert it into useful forms of energy, e.g., mechanical or electrical. This paper presents an advanced 3D computational framework to study the interaction between water waves and WEC devices. The computational tool solves the full Navier-Stokes equations and considers all important effects impacting device performance. To enable large-scale simulations with fast turnaround times, the computational solver was developed in an MPI parallel framework. A fast multigrid-preconditioned solver is introduced to solve the computationally expensive pressure Poisson equation. The computational solver was applied to two surface-piercing WEC geometries: a bottom-hinged cylinder and a flap. Their numerically simulated responses were validated against experimental data. Additional simulations were conducted to investigate the applicability of Froude scaling in predicting full-scale WEC response from model experiments.
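
    The pressure Poisson solve is typically the dominant cost in such solvers. As a hedged stand-in for the multigrid-preconditioned solver described above, here is a small preconditioned conjugate-gradient example on an assumed 2D grid (illustrative only):

      import numpy as np
      import scipy.sparse as sp
      from scipy.sparse.linalg import cg, spilu, LinearOperator

      n = 64                                      # grid points per side (assumed)
      h = 1.0 / (n + 1)
      main = 4.0 * np.ones(n * n)
      side = -1.0 * np.ones(n * n - 1)
      side[np.arange(1, n * n) % n == 0] = 0.0    # no coupling across row boundaries
      updown = -1.0 * np.ones(n * n - n)
      A = (sp.diags([main, side, side, updown, updown],
                    [0, 1, -1, n, -n]) / h**2).tocsc()

      b = np.ones(n * n)                          # stand-in velocity-divergence source
      ilu = spilu(A)                              # incomplete-LU preconditioner
      M = LinearOperator(A.shape, ilu.solve)
      p, info = cg(A, b, M=M)
      print("converged" if info == 0 else f"cg returned {info}")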

  12. A Validation and Code-to-Code Verification of FAST for a Megawatt-Scale Wind Turbine with Aeroelastically Tailored Blades

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guntur, Srinivas; Jonkman, Jason; Sievers, Ryan

    This paper presents validation and code-to-code verification of the latest version of the U.S. Department of Energy, National Renewable Energy Laboratory wind turbine aeroelastic engineering simulation tool, FAST v8. A set of 1,141 test cases was identified for which experimental data from a Siemens 2.3 MW machine were available and in accordance with the International Electrotechnical Commission 61400-13 guidelines. These conditions were simulated using FAST as well as the Siemens in-house aeroelastic code, BHawC. A detailed analysis compares results from FAST with those from BHawC as well as with experimental measurements, using statistics including the means and standard deviations along with the power spectral densities of select turbine parameters and loads. Results indicate good agreement among the predictions of FAST and BHawC and the experimental measurements. These agreements are discussed in detail, along with some comments regarding the differences seen in these comparisons relative to the inherent uncertainties in such a model-based analysis.

  13. A Validation and Code-to-Code Verification of FAST for a Megawatt-Scale Wind Turbine with Aeroelastically Tailored Blades

    DOE PAGES

    Guntur, Srinivas; Jonkman, Jason; Sievers, Ryan; ...

    2017-08-29

    This paper presents validation and code-to-code verification of the latest version of the U.S. Department of Energy, National Renewable Energy Laboratory wind turbine aeroelastic engineering simulation tool, FAST v8. A set of 1,141 test cases was identified for which experimental data from a Siemens 2.3 MW machine were available and in accordance with the International Electrotechnical Commission 61400-13 guidelines. These conditions were simulated using FAST as well as the Siemens in-house aeroelastic code, BHawC. A detailed analysis compares results from FAST with those from BHawC as well as with experimental measurements, using statistics including the means and standard deviations along with the power spectral densities of select turbine parameters and loads. Results indicate good agreement among the predictions of FAST and BHawC and the experimental measurements. These agreements are discussed in detail, along with some comments regarding the differences seen in these comparisons relative to the inherent uncertainties in such a model-based analysis.

  14. Workshop on data acquisition and trigger system simulations for high energy physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1992-12-31

    This report discusses the following topics: DAQSIM: A Data Acquisition System Simulation Tool; Front End and DCC Simulations for the SDC Straw Tube System; Simulation of Non-Blocking Data Acquisition Architectures; Simulation Studies of the SDC Data Collection Chip; Correlation Studies of the Data Collection Circuit & The Design of a Queue for this Circuit; Fast Data Compression & Transmission from a Silicon Strip Wafer; Simulation of SCI Protocols in Modsim; Visual Design with vVHDL; Stochastic Simulation of Asynchronous Buffers; SDC Trigger Simulations; Trigger Rates, DAQ & Online Processing at the SSC; Planned Enhancements to MODSIM II & SIMOBJECT -- an Overview; DAGAR -- A Synthesis System; Proposed Silicon Compiler for Physics Applications; Timed-LOTOS in a PROLOG Environment: an Algebraic Language for Simulation; Modeling and Simulation of an Event Builder for High Energy Physics Data Acquisition Systems; A Verilog Simulation for the CDF DAQ; Simulation to Design with Verilog; The DZero Data Acquisition System: Model and Measurements; DZero Trigger Level 1.5 Modeling; Strategies Optimizing Data Load in the DZero Triggers; Simulation of the DZero Level 2 Data Acquisition System; A Fast Method for Calculating DZero Level 1 Jet Trigger Properties and Physics Input to DAQ Studies.

  15. BeamDyn: a high-fidelity wind turbine blade solver in the FAST modular framework

    DOE PAGES

    Wang, Qi; Sprague, Michael A.; Jonkman, Jason; ...

    2017-03-14

    This paper presents a numerical implementation of the geometrically exact beam theory based on the Legendre-spectral-finite-element (LSFE) method. The displacement-based geometrically exact beam theory is presented, and the special treatment of three-dimensional rotation parameters is reviewed. An LSFE is a high-order finite element with nodes located at the Gauss-Legendre-Lobatto points. These elements can be an order of magnitude more computationally efficient than low-order finite elements for a given accuracy level. The new module, BeamDyn, is implemented in the FAST modularization framework for dynamic simulation of highly flexible composite-material wind turbine blades within the FAST aeroelastic engineering model. The framework allows for fully interactive simulations of turbine blades in operating conditions. Numerical examples are provided to validate BeamDyn and examine the LSFE performance as well as the coupling algorithm in the FAST modularization framework. BeamDyn can also be used as a stand-alone high-fidelity beam tool.
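
    For readers unfamiliar with LSFEs, the node set is easy to compute; the following sketch (standard mathematics, not code from BeamDyn) returns the Gauss-Legendre-Lobatto nodes on the reference element [-1, 1], i.e., the endpoints plus the roots of the derivative of the Legendre polynomial P_p:

      import numpy as np
      from numpy.polynomial.legendre import Legendre

      def gll_nodes(p):
          coeffs = np.zeros(p + 1)
          coeffs[p] = 1.0                              # coefficients of P_p
          interior = Legendre(coeffs).deriv().roots()  # roots of P_p'
          return np.concatenate(([-1.0], np.sort(interior.real), [1.0]))

      print(gll_nodes(5))   # the six nodes of a fifth-order LSFE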

  16. BeamDyn: a high-fidelity wind turbine blade solver in the FAST modular framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qi; Sprague, Michael A.; Jonkman, Jason

    This paper presents a numerical implementation of the geometrically exact beam theory based on the Legendre-spectral-finite-element (LSFE) method. The displacement-based geometrically exact beam theory is presented, and the special treatment of three-dimensional rotation parameters is reviewed. An LSFE is a high-order finite element with nodes located at the Gauss-Legendre-Lobatto points. These elements can be an order of magnitude more computationally efficient than low-order finite elements for a given accuracy level. The new module, BeamDyn, is implemented in the FAST modularization framework for dynamic simulation of highly flexible composite-material wind turbine blades within the FAST aeroelastic engineering model. The framework allows for fully interactive simulations of turbine blades in operating conditions. Numerical examples are provided to validate BeamDyn and examine the LSFE performance as well as the coupling algorithm in the FAST modularization framework. BeamDyn can also be used as a stand-alone high-fidelity beam tool.

  17. Analyzing and Visualizing Cosmological Simulations with ParaView

    NASA Astrophysics Data System (ADS)

    Woodring, Jonathan; Heitmann, Katrin; Ahrens, James; Fasel, Patricia; Hsu, Chung-Hsing; Habib, Salman; Pope, Adrian

    2011-07-01

    The advent of large cosmological sky surveys—ushering in the era of precision cosmology—has been accompanied by ever larger cosmological simulations. The analysis of these simulations, which currently encompass tens of billions of particles and up to a trillion particles in the near future, is often as daunting as carrying out the simulations in the first place. Therefore, the development of very efficient analysis tools combining qualitative and quantitative capabilities is a matter of some urgency. In this paper, we introduce new analysis features implemented within ParaView, a fully parallel, open-source visualization toolkit, to analyze large N-body simulations. A major aspect of ParaView is that it can live and operate on the same machines and utilize the same parallel power as the simulation codes themselves. In addition, data movement is a serious bottleneck now and will become even more of an issue in the future; an interactive visualization and analysis tool that can handle data in situ is fast becoming essential. The new features in ParaView include particle readers and a very efficient halo finder that identifies friends-of-friends halos and determines common halo properties, including spherical overdensity properties. In combination with many other functionalities already existing within ParaView, such as histogram routines or interfaces to programming languages like Python, this enhanced version enables fast, interactive, and convenient analyses of large cosmological simulations. In addition, development paths are available for future extensions.
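
    The friends-of-friends definition mentioned above is simple to state: particles closer than a linking length belong to the same halo. A serial toy version using a k-d tree and union-find (purely illustrative; ParaView's halo finder is a scalable parallel implementation) might look like:

      import numpy as np
      from scipy.spatial import cKDTree

      def fof_halos(positions, linking_length):
          """Label particles: two particles share a halo if closer than the linking length."""
          tree = cKDTree(positions)
          parent = np.arange(len(positions))

          def find(i):
              while parent[i] != i:
                  parent[i] = parent[parent[i]]   # path compression
                  i = parent[i]
              return i

          for i, j in tree.query_pairs(linking_length):
              ri, rj = find(i), find(j)
              if ri != rj:
                  parent[rj] = ri                 # union the two groups
          return np.array([find(i) for i in range(len(positions))])

      rng = np.random.default_rng(42)
      pts = rng.random((2000, 3))                 # toy "particles" in a unit box
      labels = fof_halos(pts, linking_length=0.02)
      print("halos found:", len(np.unique(labels)))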

  18. Status and outlook of CFD technology at Mitsubishi Heavy Industries, Nagoya

    NASA Astrophysics Data System (ADS)

    Tanioka, Tadayuki

    1990-09-01

    Computational Fluid Dynamics (CFD) technology has made tremendous progress in the last several years. It has matured to become a practical simulation tool in the aircraft industry. At MHI, CFD has become an indispensable tool for the aerodynamic design of aerospace vehicles. The present status of this advanced technology at MHI is described. Also mentioned are some future advances of this fast-growing technology, as well as associated hardware requirements.

  19. Simulation for Wind Turbine Generators -- With FAST and MATLAB-Simulink Modules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, M.; Muljadi, E.; Jonkman, J.

    This report presents the work done to develop generator and gearbox models in the Matrix Laboratory (MATLAB) environment and couple them to the National Renewable Energy Laboratory's Fatigue, Aerodynamics, Structures, and Turbulence (FAST) program. The goal of this project was to interface the superior aerodynamic and mechanical models of FAST to the excellent electrical generator models found in various Simulink libraries and applications. The scope was limited to Type 1, Type 2, and Type 3 generators and fairly basic gear-train models. Future work will include models of Type 4 generators and more-advanced gear-train models with increased degrees of freedom. As described in this study, implementation of the developed drivetrain model enables the software tool to be used in many ways. Several case studies are presented as examples of the many types of studies that can be performed using this tool.

  20. Metadyn View: Fast web-based viewer of free energy surfaces calculated by metadynamics

    NASA Astrophysics Data System (ADS)

    Hošek, Petr; Spiwok, Vojtěch

    2016-01-01

    Metadynamics is a highly successful enhanced sampling technique for the simulation of molecular processes and the prediction of their free energy surfaces. An in-depth analysis of the data obtained by this method is as important as the simulation itself. Although there are several tools to compute free energy surfaces from metadynamics data, they usually lack user friendliness and a built-in visualization part. Here we introduce Metadyn View as a fast and user-friendly viewer of bias potential/free energy surfaces calculated by metadynamics in the Plumed package. It is based on modern web technologies including HTML5, JavaScript, and Cascading Style Sheets (CSS). It can be used by visiting the web site and uploading a HILLS file. It calculates the bias potential/free energy surface on the client side, so it can run online or offline without the necessity to install additional web engines. Moreover, it includes tools for the measurement of free energies and free energy differences and for data/image export.
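
    Conceptually, the client-side computation is a sum of the deposited Gaussian hills, whose negative estimates the free energy surface. A minimal sketch for one collective variable, with an assumed HILLS column layout (real HILLS files vary with the number of collective variables):

      import numpy as np

      def fes_1d(hills, grid):
          """hills: rows of (time, cv_center, sigma, height); grid: CV values."""
          bias = np.zeros_like(grid)
          for _, center, sigma, height in hills:
              bias += height * np.exp(-((grid - center) ** 2) / (2.0 * sigma ** 2))
          fes = -bias
          return fes - fes.min()           # shift so the global minimum is zero

      grid = np.linspace(-np.pi, np.pi, 200)
      hills = np.array([[t, np.sin(0.1 * t), 0.3, 1.2] for t in range(100)])  # fake hills
      print(fes_1d(hills, grid).max())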

  1. Advanced studies of electromagnetic scattering

    NASA Technical Reports Server (NTRS)

    Ling, Hao

    1994-01-01

    In radar signature applications it is often desirable to generate the range profiles and inverse synthetic aperture radar (ISAR) images of a target. They can be used either as identification tools, to distinguish and classify the target from a collection of possible targets, or as diagnostic/design tools, to pinpoint the key scattering centers on the target. The simulation of synthetic range profiles and ISAR images is usually a time-intensive task, and computation time is of prime importance. Our research has focused on the development of fast simulation algorithms for range profiles and ISAR images using the shooting and bouncing ray (SBR) method, a high-frequency electromagnetic simulation technique for predicting the radar returns from realistic aerospace vehicles and the scattering by complex media.

  2. Computer software tool REALM for sustainable water allocation and management.

    PubMed

    Perera, B J C; James, B; Kularathna, M D U

    2005-12-01

    REALM (REsource ALlocation Model) is a generalised computer simulation package that models harvesting and bulk distribution of water resources within a water supply system. It is a modeling tool, which can be applied to develop specific water allocation models. Like other water resource simulation software tools, REALM uses mass-balance accounting at nodes, while the movement of water within carriers is subject to capacity constraints. It uses a fast network linear programming algorithm to optimise the water allocation within the network during each simulation time step, in accordance with user-defined operating rules. This paper describes the main features of REALM and provides potential users with an appreciation of its capabilities. In particular, it describes two case studies covering major urban and rural water supply systems. These case studies illustrate REALM's capabilities in the use of stochastically generated data in water supply planning and management, modelling of environmental flows, and assessing security of supply issues.

  3. CPAS Preflight Drop Test Analysis Process

    NASA Technical Reports Server (NTRS)

    Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.

    2015-01-01

    Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.

  4. A Fast Monte Carlo Simulation for the International Linear Collider Detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Furse, D.; /Georgia Tech

    2005-12-15

    The following paper contains details concerning the motivation for, implementation of, and performance of a Java-based fast Monte Carlo simulation for a detector designed to be used in the International Linear Collider. This simulation, presently included in the SLAC ILC group's org.lcsim package, reads in standard model or SUSY events in STDHEP file format, stochastically simulates the blurring in physics measurements caused by intrinsic detector error, and writes out an LCIO format file containing a set of final particles statistically similar to those that would have been found by a full Monte Carlo simulation. In addition to the reconstructed particles themselves, descriptions of the calorimeter hit clusters and tracks that these particles would have produced are also included in the LCIO output. These output files can then be put through various analysis codes in order to characterize the effectiveness of a hypothetical detector at extracting relevant physical information about an event. Such a tool is extremely useful in preliminary detector research and development, as full simulations are extremely cumbersome and taxing on processor resources; by sacrificing what is in many cases inappropriate attention to detail for valuable gains in turnaround time, a fast, efficient Monte Carlo can facilitate, and even make possible, detector physics studies that would be very impractical with the full simulation.
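
    The core "blurring" idea, smearing true quantities with a parameterized detector resolution instead of tracking every interaction, can be shown in a few lines (the resolution form and constants below are invented for illustration and are not those of org.lcsim):

      import numpy as np

      rng = np.random.default_rng(7)

      def smear_pt(pt_true_gev):
          # assumed tracker resolution: sigma(pT)/pT = a*pT (+) b, added in quadrature
          a, b = 2e-5, 5e-3
          sigma_rel = np.hypot(a * pt_true_gev, b)
          return pt_true_gev * (1.0 + sigma_rel * rng.standard_normal(pt_true_gev.shape))

      pt_true = rng.exponential(20.0, size=10_000)   # toy sample of track momenta, GeV
      pt_reco = smear_pt(pt_true)
      print("mean relative shift:", pt_reco.mean() / pt_true.mean() - 1.0)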

  5. Development and Verification of the Soil-Pile Interaction Extension for SubDyn

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damiani, Rick R; Wendt, Fabian F

    SubDyn is the substructure structural-dynamics module for the aero-hydro-servo-elastic tool FAST v8. SubDyn uses a finite-element model (FEM) to simulate complex multimember lattice structures connected to conventional turbines and towers, and it can make use of the Craig-Bampton model reduction. Here we describe the newly added capability to handle soil-pile stiffness and compare results for monopile and jacket-based offshore wind turbines as obtained with FAST v8, SACS, and EDP (the latter two are modeling software packages commonly used in the offshore oil and gas industry). The level of agreement in terms of modal properties and loads for all of the offshore wind turbine components is excellent, thus allowing SubDyn and FAST v8 to accurately simulate offshore wind turbines on fixed-bottom structures while accounting for the effect of soil dynamics, thereby reducing risk to the project.

  6. On Designing Multicore-Aware Simulators for Systems Biology Endowed with OnLine Statistics

    PubMed Central

    Calcagno, Cristina; Coppo, Mario

    2014-01-01

    This paper discusses enabling methodologies for the design of a fully parallel, online, interactive tool aiming to support bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are shown on a simulation tool that performs the modeling, tuning, and sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, which turn into big data that should be analysed by statistical and data-mining tools. In the considered approach, the two stages are pipelined in such a way that the simulation stage streams out the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and for the effectiveness of the online analysis in capturing biological systems behavior, on a multicore platform and representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide key features to software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems exhibiting multistable and oscillatory behavior are used as a testbed. PMID:25050327

  7. On designing multicore-aware simulators for systems biology endowed with OnLine statistics.

    PubMed

    Aldinucci, Marco; Calcagno, Cristina; Coppo, Mario; Damiani, Ferruccio; Drocco, Maurizio; Sciacca, Eva; Spinella, Salvatore; Torquati, Massimo; Troina, Angelo

    2014-01-01

    This paper discusses enabling methodologies for the design of a fully parallel, online, interactive tool aiming to support bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are shown on a simulation tool that performs the modeling, tuning, and sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, which turn into big data that should be analysed by statistical and data-mining tools. In the considered approach, the two stages are pipelined in such a way that the simulation stage streams out the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and for the effectiveness of the online analysis in capturing biological systems behavior, on a multicore platform and representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide key features to software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems exhibiting multistable and oscillatory behavior are used as a testbed.
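
    The pipelining idea generalizes beyond FastFlow; here is a minimal, illustrative Python sketch of a simulation stage streaming trajectories into an online statistics stage using Welford's algorithm, so partial results are available while the simulation is still running:

      import numpy as np

      def simulate(n_traj, n_steps, seed=0):
          """Stage 1: stream out one stochastic trajectory at a time."""
          rng = np.random.default_rng(seed)
          for _ in range(n_traj):
              yield np.cumsum(rng.standard_normal(n_steps))

      def online_stats(trajectories):
          """Stage 2: per-timestep running mean/variance, updated as data arrives."""
          count, mean, m2 = 0, None, None
          for traj in trajectories:
              if mean is None:
                  mean, m2 = np.zeros_like(traj), np.zeros_like(traj)
              count += 1
              delta = traj - mean
              mean += delta / count
              m2 += delta * (traj - mean)      # Welford update
              yield mean, m2 / max(count - 1, 1)

      for mean, var in online_stats(simulate(1000, 50)):
          pass                                  # a partial result after every trajectory
      print(mean[-1], var[-1])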

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prowell, I.; Elgamal, A.; Romanowitz, H.

    Demand parameters for turbines, such as tower moment demand, are primarily driven by wind excitation and the dynamics associated with operation. For that purpose, computational simulation platforms have been developed, such as FAST, maintained by the National Renewable Energy Laboratory (NREL). For seismically active regions, building codes also require the consideration of earthquake loading. Historically, it has been common to use simple building-code approaches to estimate the structural demand from earthquake shaking as an independent loading scenario. Currently, International Electrotechnical Commission (IEC) design requirements include the consideration of earthquake shaking while the turbine is operating. Numerical and analytical tools used to consider earthquake loads for buildings and other static civil structures are not well suited for modeling simultaneous wind and earthquake excitation in conjunction with operational dynamics. Through the addition of seismic loading capabilities to FAST, it is possible to simulate earthquake shaking in the time domain, which allows consideration of nonlinear effects such as structural nonlinearities, aerodynamic hysteresis, control system influence, and transients. This paper presents a FAST model of a modern 900-kW wind turbine, which is calibrated based on field vibration measurements. With this calibrated model, both coupled and uncoupled simulations are conducted to examine the structural demand on the turbine tower. Response is compared under the conditions of normal operation and potential emergency shutdown due to the earthquake-induced vibrations. The results highlight the availability of a numerical tool for conducting such studies, and provide insights into the combined wind-earthquake loading mechanism.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dias, M F; Department of Radiation Oncology, Francis H. Burr Proton Therapy Center Massachusetts General Hospital; Seco, J

    Purpose: Research in carbon imaging has been growing over the past years as a way to increase treatment accuracy and patient positioning in carbon therapy. The purpose of this tool is to allow a fast and flexible way to generate CDRR data without the need to use Monte Carlo (MC) simulations. It can also be used to predict future clinically measured data. Methods: A Python interface has been developed, which uses information from CT or 4DCT and the treatment calibration curve to compute the Water Equivalent Path Length (WEPL) of carbon ions. A GPU-based ray tracing algorithm computes the WEPL of each individual carbon traveling through the CT voxels. A multiple peak detection method to estimate high-contrast margin positioning has been implemented (described elsewhere). MC simulations have been used to simulate carbon depth dose curves in order to simulate the response of a range detector. Results: The tool allows the upload of CT or 4DCT images. The user has the possibility to select the phase/slice of interest, as well as position, angle, … The WEPL is represented as a range detector, which can be used to assess range dilution and multiple peak detection effects. The tool also provides knowledge of the minimum energy that should be considered for imaging purposes. The multiple peak detection method has been used in a lung tumor case, showing an accuracy of 1 mm in determining the exact interface position. Conclusion: The tool offers an easy and fast way to simulate carbon imaging data. It can be used for educational and for clinical purposes, allowing the user to test beam energies and angles before real acquisition. An analysis add-on is being developed, where the user will have the opportunity to select different reconstruction methods and detector types (range or energy). Fundacao para a Ciencia e a Tecnologia (FCT), PhD Grant number SFRH/BD/85749/2012.

  10. DUK - A Fast and Efficient Kmer Based Sequence Matching Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Mingkun; Copeland, Alex; Han, James

    2011-03-21

    A new tool, DUK, has been developed to perform the matching task. Matching means finding whether a query sequence partially or totally matches given reference sequences. Matching is similar to alignment; indeed, many traditional analysis tasks like contaminant removal use alignment tools. But for matching there is no need to know which bases of a query sequence match which positions of a reference sequence; it is only necessary to know whether a match exists. This subtle difference can make the matching task much faster than alignment. DUK is accurate, versatile, fast, and has efficient memory usage. It uses a k-mer hashing method to index reference sequences and a Poisson model to calculate p-values. DUK is carefully implemented in C++ in an object-oriented design, and the resulting classes can be used to develop other tools quickly. DUK has been widely used at JGI for a wide range of applications such as contaminant removal, organelle genome separation, and assembly refinement. Many real applications and simulated datasets demonstrate its power.
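
    A toy version of this style of k-mer matching (illustrative only; DUK's C++ implementation and the details of its statistical model differ) can be written as:

      from math import exp

      def kmer_set(seq, k):
          return {seq[i:i + k] for i in range(len(seq) - k + 1)}

      def poisson_sf(hits, lam):
          """P(X >= hits) for X ~ Poisson(lam), by summing the lower tail."""
          term, cdf = exp(-lam), 0.0
          for n in range(hits):
              cdf += term
              term *= lam / (n + 1)
          return max(1.0 - cdf, 0.0)

      def match(query, ref_index, k, genome_size):
          query_kmers = [query[i:i + k] for i in range(len(query) - k + 1)]
          hits = sum(kmer in ref_index for kmer in query_kmers)
          expected = len(query_kmers) * genome_size * 0.25 ** k   # assumed chance-hit model
          return hits, poisson_sf(hits, expected)

      ref = "ACGTTGCA" * 500                    # toy reference sequence
      index = kmer_set(ref, 16)
      print(match(ref[100:180], index, 16, len(ref)))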

  11. Dynamic PET simulator via tomographic emission projection for kinetic modeling and parametric image studies.

    PubMed

    Häggström, Ida; Beattie, Bradley J; Schmidtlein, C Ross

    2016-06-01

    To develop and evaluate a fast and simple tool called dpetstep (Dynamic PET Simulator of Tracers via Emission Projection) for dynamic PET simulations, as an alternative to Monte Carlo (MC), useful for educational purposes and for evaluating the effects of the clinical environment, postprocessing choices, etc., on dynamic and parametric images. The tool was developed in MATLAB using both new and previously reported modules of petstep (PET Simulator of Tracers via Emission Projection). Time activity curves are generated for each voxel of the input parametric image, whereby the effects of imaging system blurring, counting noise, scatters, randoms, and attenuation are simulated for each frame. Each frame is then reconstructed into images according to the user-specified method, settings, and corrections. Reconstructed images were compared to MC data and to simple Gaussian-noised time activity curves (GAUSS). dpetstep was 8000 times faster than MC. Dynamic images from dpetstep had a root mean square error that was on average within 4% of that of MC images, whereas the GAUSS images were within 11%. The average bias in dpetstep and MC images was the same, while GAUSS differed by 3 percentage points. Noise profiles in dpetstep images conformed well to MC images, confirmed visually by scatter plot histograms and statistically by tumor region-of-interest histogram comparisons that showed no significant differences (p < 0.01). Compared to GAUSS, dpetstep images and noise properties agreed better with MC. The authors have developed a fast and easy one-stop solution for simulations of dynamic PET and parametric images, and demonstrated that it generates both images and subsequent parametric images with noise properties very similar to those of MC images, in a fraction of the time. They believe dpetstep to be very useful for generating fast, simple, and realistic results; however, since it uses simple scatter and random models, it may not be suitable for studies investigating these phenomena. dpetstep can be downloaded free of cost from https://github.com/CRossSchmidtlein/dPETSTEP.

  12. Teaching the Concept of Gibbs Energy Minimization through Its Application to Phase-Equilibrium Calculation

    ERIC Educational Resources Information Center

    Privat, Romain; Jaubert, Jean-Noël; Berger, Etienne; Coniglio, Lucie; Lemaitre, Cécile; Meimaroglou, Dimitrios; Warth, Valérie

    2016-01-01

    Robust and fast methods for chemical or multiphase equilibrium calculation are routinely needed by chemical-process engineers working on sizing or simulation aspects. Yet, while industrial applications essentially require calculation tools capable of discriminating between stable and nonstable states and converging to nontrivial solutions,…

  13. Computer simulation results of attitude estimation of earth orbiting satellites

    NASA Technical Reports Server (NTRS)

    Kou, S. R.

    1976-01-01

    Computer simulation results of attitude estimation of Earth-orbiting satellites (including the Space Telescope) subjected to environmental disturbances and noises are presented. A decomposed linear recursive filter and a Kalman filter were used as estimation tools. Six programs were developed for this simulation; all were written in the BASIC language and were run on HP 9830A and HP 9866A computers. Simulation results show that the decomposed linear recursive filter is accurate in estimation and fast in response time. Furthermore, for higher-order systems, this filter has computational advantages (i.e., smaller integration and roundoff errors) over a Kalman filter.
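
    For readers unfamiliar with the recursions being compared, the minimal example is a scalar Kalman filter estimating a constant from noisy measurements (generic textbook form; the paper's decomposed linear recursive filter is not reproduced here):

      import numpy as np

      rng = np.random.default_rng(3)
      z = 1.0 + 0.2 * rng.standard_normal(100)   # noisy measurements of a constant

      x, p = 0.0, 1.0                            # state estimate and its variance
      q, r = 1e-5, 0.2 ** 2                      # process and measurement noise
      for zk in z:
          p += q                                 # predict (static state model)
          k = p / (p + r)                        # Kalman gain
          x += k * (zk - x)                      # update with the innovation
          p *= 1.0 - k
      print(f"estimate {x:.3f}, variance {p:.2e}")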

  14. FESetup: Automating Setup for Alchemical Free Energy Simulations.

    PubMed

    Loeffler, Hannes H; Michel, Julien; Woods, Christopher

    2015-12-28

    FESetup is a new pipeline tool which can be used flexibly within larger workflows. The tool aims to support fast and easy setup of alchemical free energy simulations for molecular simulation packages such as AMBER, GROMACS, Sire, or NAMD. Post-processing methods like MM-PBSA and LIE can be set up as well. Ligands are automatically parametrized with AM1-BCC, and atom mappings for a single topology description are computed with a maximum common substructure search (MCSS) algorithm. An abstract molecular dynamics (MD) engine can be used for equilibration prior to free energy setup or standalone. Currently, all modern AMBER force fields are supported. Ease of use, robustness of the code, and automation where it is feasible are the main development goals. The project follows an open development model, and we welcome contributions.

  15. Model reduction for slow–fast stochastic systems with metastable behaviour

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruna, Maria, E-mail: bruna@maths.ox.ac.uk; Computational Science Laboratory, Microsoft Research, Cambridge CB1 2FB; Chapman, S. Jonathan

    2014-05-07

    The quasi-steady-state approximation (or stochastic averaging principle) is a useful tool in the study of multiscale stochastic systems, giving a practical method by which to reduce the number of degrees of freedom in a model. The method is extended here to slow–fast systems in which the fast variables exhibit metastable behaviour. The key parameter that determines the form of the reduced model is the ratio of the timescale for the switching of the fast variables between metastable states to the timescale for the evolution of the slow variables. The method is illustrated with two examples: one from biochemistry (a fast-species-mediated chemical switch coupled to a more slowly varying species), and one from ecology (a predator–prey system). Numerical simulations of each model reduction are compared with those of the full system.

  16. Simulation of Needle-Type Corona Electrodes by the Finite Element Method

    NASA Astrophysics Data System (ADS)

    Yang, Shiyou; Machado, José Márcio; Abe, Nancy Mieko; Passaro, Angelo

    2007-12-01

    This paper describes a software tool, called LEVSOFT, suitable for electric field simulations of corona electrodes by the Finite Element Method (FEM). Special attention was paid to the user-friendly construction of geometries with corners and sharp points, and to the fast generation of highly refined triangular meshes and field maps. Self-adaptive meshing was also implemented. These customized features make the code attractive for the simulation of needle-type corona electrodes. Some case examples involving needle-type electrodes are presented.

  17. Validation of a FAST model of the Statoil-Hywind Demo floating wind turbine

    DOE PAGES

    Driscoll, Frederick; Jonkman, Jason; Robertson, Amy; ...

    2016-10-13

    To assess the accuracy of the National Renewable Energy Laboratory's (NREL's) FAST simulation tool for modeling the coupled response of floating offshore wind turbines under realistic open-ocean conditions, NREL developed a FAST model of the Statoil Hywind Demo floating offshore wind turbine and validated simulation results against field measurements. Field data were provided by Statoil, which conducted a comprehensive test measurement campaign of its demonstration system, a 2.3-MW Siemens turbine mounted on a spar substructure deployed about 10 km off the island of Karmoy in Norway. A top-down approach was used to develop the FAST model, starting with modeling the blades and working down to the mooring system. Design data provided by Siemens and Statoil were used to specify the structural, aerodynamic, and dynamic properties. Measured wind speeds and wave spectra were used to develop the wind and wave conditions used in the model. The overall system performance and behavior were validated against eight sets of field measurements that span a wide range of operating conditions. The simulated controller response accurately reproduced the measured blade pitch and power, and the structural and blade loads and the spectra of platform motion agree well with the measured data.

  18. XIMPOL: a new x-ray polarimetry observation-simulation and analysis framework

    NASA Astrophysics Data System (ADS)

    Omodei, Nicola; Baldini, Luca; Pesce-Rollins, Melissa; di Lalla, Niccolò

    2017-08-01

    We present a new simulation framework, XIMPOL, based on the Python programming language and the SciPy stack, specifically developed for X-ray polarimetric applications. XIMPOL is not tied to any specific mission or instrument design and is meant to produce fast and yet realistic observation-simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral, and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function, and the modulation factor. The format of the response files is OGIP compliant, and the framework has the capability of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating physical systems but also for developing and testing end-to-end analysis chains.

  19. MDWiZ: a platform for the automated translation of molecular dynamics simulations.

    PubMed

    Rusu, Victor H; Horta, Vitor A C; Horta, Bruno A C; Lins, Roberto D; Baron, Riccardo

    2014-03-01

    A variety of popular molecular dynamics (MD) simulation packages were independently developed in recent decades to reach diverse scientific goals. However, such non-coordinated development of software, force fields, and analysis tools for molecular simulations gave rise to an array of software formats and arbitrary conventions for the routine preparation and analysis of simulation input and output data. Different formats and/or parameter definitions are used at each stage of the modeling process, despite largely containing redundant information across alternative software tools. Such a Babel of languages, which cannot be easily and univocally translated one into another, poses one of the major technical obstacles to the preparation, translation, and comparison of molecular simulation data that users face on a daily basis. Here, we present the MDWiZ platform, a freely accessed online portal designed to aid the fast and reliable preparation and conversion of file formats, allowing researchers to reproduce or generate data from MD simulations using different setups, including force fields and models with different underlying potential forms. The general structure of MDWiZ is presented, the features of version 1.0 are detailed, and an extensive validation based on GROMACS-to-LAMMPS conversion is presented. We believe that MDWiZ will be largely useful to the molecular dynamics community. Such fast format and force field exchange for a given system allows tailoring the chosen system to a given computer platform and/or taking advantage of specific capabilities offered by different software engines. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.

  20. A fast and complete GEANT4 and ROOT Object-Oriented Toolkit: GROOT

    NASA Astrophysics Data System (ADS)

    Lattuada, D.; Balabanski, D. L.; Chesnevskaya, S.; Costa, M.; Crucillà, V.; Guardo, G. L.; La Cognata, M.; Matei, C.; Pizzone, R. G.; Romano, S.; Spitaleri, C.; Tumino, A.; Xu, Y.

    2018-01-01

    Present and future gamma-beam facilities represent a great opportunity to validate and evaluate the cross-sections of many photonuclear reactions at near-threshold energies. Monte Carlo (MC) simulations are very important for evaluating reaction rates and maximizing detection efficiency but, unfortunately, they can be very CPU-time-consuming and in some cases very hard to reproduce, especially when exploring near-threshold cross-sections. We developed a software tool that makes use of the validated GEANT4 tracking libraries and the n-body event generator of ROOT in order to provide a fast, reliable, and complete MC tool to be used for nuclear physics experiments. This tool is intended to be used for photonuclear reactions at γ-beam facilities with ELISSA (ELI Silicon Strip Array), a new detector array under development at the Extreme Light Infrastructure - Nuclear Physics (ELI-NP). We discuss the results of MC simulations performed to evaluate the effects of the electromagnetically induced background, of the straggling due to the target thickness, and of the resolution of the silicon detectors.

  1. MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories.

    PubMed

    McGibbon, Robert T; Beauchamp, Kyle A; Harrigan, Matthew P; Klein, Christoph; Swails, Jason M; Hernández, Carlos X; Schwantes, Christian R; Wang, Lee-Ping; Lane, Thomas J; Pande, Vijay S

    2015-10-20

    As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
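
    A brief usage sketch of the kinds of analyses described (file names are placeholders):

      import mdtraj as md

      traj = md.load("trajectory.dcd", top="topology.pdb")  # many formats supported
      rmsd = md.rmsd(traj, traj, frame=0)                   # minimal RMSD to frame 0
      dssp = md.compute_dssp(traj)                          # per-residue secondary structure
      print(traj, rmsd[:3], dssp[0][:10])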

  2. MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories

    PubMed Central

    McGibbon, Robert T.; Beauchamp, Kyle A.; Harrigan, Matthew P.; Klein, Christoph; Swails, Jason M.; Hernández, Carlos X.; Schwantes, Christian R.; Wang, Lee-Ping; Lane, Thomas J.; Pande, Vijay S.

    2015-01-01

    As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. PMID:26488642

  3. sedFlow - an efficient tool for simulating bedload transport, bed roughness, and longitudinal profile evolution in mountain streams

    NASA Astrophysics Data System (ADS)

    Heimann, F. U. M.; Rickenmann, D.; Turowski, J. M.; Kirchner, J. W.

    2014-07-01

    Especially in mountainous environments, the prediction of sediment dynamics is important for managing natural hazards, assessing in-stream habitats, and understanding geomorphic evolution. We present the new modelling tool sedFlow for simulating fractional bedload transport dynamics in mountain streams. The model can deal with the effects of adverse slopes and uses state-of-the-art approaches for quantifying macro-roughness effects in steep channels. Local grain size distributions are dynamically adjusted according to the transport dynamics of each grain size fraction. The tool sedFlow features fast calculations and straightforward pre- and post-processing of simulation data. The model is provided, together with its complete source code, free of charge under the terms of the GNU General Public License (www.wsl.ch/sedFlow). Examples of the application of sedFlow are given in a companion article by Heimann et al. (2014).

  4. STELLAR: fast and exact local alignments

    PubMed Central

    2011-01-01

    Background Large-scale comparison of genomic sequences requires reliable tools for the search of local alignments. Practical local aligners are in general fast, but heuristic, and hence sometimes miss significant matches. Results We present here the local pairwise aligner STELLAR that has full sensitivity for ε-alignments, i.e. guarantees to report all local alignments of a given minimal length and maximal error rate. The aligner is composed of two steps, filtering and verification. We apply the SWIFT algorithm for lossless filtering, and have developed a new verification strategy that we prove to be exact. Our results on simulated and real genomic data confirm and quantify the conjecture that heuristic tools like BLAST or BLAT miss a large percentage of significant local alignments. Conclusions STELLAR is very practical and fast on very long sequences which makes it a suitable new tool for finding local alignments between genomic sequences under the edit distance model. Binaries are freely available for Linux, Windows, and Mac OS X at http://www.seqan.de/projects/stellar. The source code is freely distributed with the SeqAn C++ library version 1.3 and later at http://www.seqan.de. PMID:22151882

  5. Modeling Off-Nominal Recovery in NextGen Terminal-Area Operations

    NASA Technical Reports Server (NTRS)

    Callantine, Todd J.

    2011-01-01

    Robust schedule-based arrival management requires efficient recovery from off-nominal situations. This paper presents research on modeling off-nominal situations and plans for recovering from them using TRAC, a route/airspace design, fast-time simulation, and analysis tool for studying NextGen trajectory-based operations. The paper provides an overview of a schedule-based arrival-management concept and supporting controller tools, then describes TRAC implementations of methods for constructing off-nominal scenarios, generating trajectory options to meet scheduling constraints, and automatically producing recovery plans.

  6. Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melaina, Marc

    This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.

  7. Environmental corrections of a dual-induction logging while drilling tool in vertical wells

    NASA Astrophysics Data System (ADS)

    Kang, Zhengming; Ke, Shizhen; Jiang, Ming; Yin, Chengfang; Li, Anzong; Li, Junjian

    2018-04-01

    With the development of Logging While Drilling (LWD) technology, dual-induction LWD logging is not only widely applied in deviated and horizontal wells, but is also commonly used in vertical wells. Accordingly, it is necessary to simulate the response of LWD tools in vertical wells for logging interpretation. In this paper, the investigation characteristics, the effects of the tool structure, the skin effect, and the drilling environment of a dual-induction LWD tool are simulated by the three-dimensional (3D) finite element method (FEM). In order to closely simulate the actual situation, the real structure of the tool is taken into account. The results demonstrate that the influence of the background value of the tool structure can be eliminated: after deducting the tool-structure background, the computed values agree quantitatively with the analytical solution in homogeneous formations. The effect of measurement frequency can be effectively eliminated by a skin-effect correction chart. In addition, the measurement environment (borehole size, mud resistivity, shoulder beds, layer thickness, and invasion) has an effect on the true resistivity. To eliminate these effects, borehole correction charts, shoulder-bed correction charts, and tornado charts are computed based on the real tool structure. Based on these correction charts, well logging data can be corrected automatically by a suitable interpolation method, which is convenient and fast. Verified with actual logging data in vertical wells, this method can obtain the true resistivity of the formation.

  8. Tools for 3D scientific visualization in computational aerodynamics at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val

    1989-01-01

    Hardware, software, and techniques used by the Fluid Dynamics Division (NASA) for performing visualization of computational aerodynamics, which can be applied to the visualization of flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. Three visualization techniques, post-processing, tracking, and steering, are described, as well as the post-processing software packages used: PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis Software Toolkit). Using post-processing methods, a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that the high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers.

  9. FAST: A multi-processed environment for visualization of computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon V.; Merritt, Fergus J.; Plessel, Todd C.; Kelaita, Paul G.; Mccabe, R. Kevin

    1991-01-01

    Three-dimensional, unsteady, multi-zoned fluid dynamics simulations over full scale aircraft are typical of the problems being investigated at NASA Ames' Numerical Aerodynamic Simulation (NAS) facility on CRAY2 and CRAY-YMP supercomputers. With multiple processor workstations available in the 10-30 Mflop range, we feel that these new developments in scientific computing warrant a new approach to the design and implementation of analysis tools. These larger, more complex problems create a need for new visualization techniques not possible with the existing software or systems available as of this writing. The visualization techniques will change as the supercomputing environment, and hence the scientific methods employed, evolves even further. The Flow Analysis Software Toolkit (FAST), an implementation of a software system for fluid mechanics analysis, is discussed.

  10. Covariance Analysis Tool (G-CAT) for Computing Ascent, Descent, and Landing Errors

    NASA Technical Reports Server (NTRS)

    Boussalis, Dhemetrios; Bayard, David S.

    2013-01-01

    G-CAT is a covariance analysis tool that enables fast and accurate computation of error ellipses for descent, landing, ascent, and rendezvous scenarios, and quantifies knowledge error contributions needed for error budgeting purposes. Because G-CAT supports hardware/system trade studies in spacecraft and mission design, it is useful in both early and late mission/proposal phases where Monte Carlo simulation capability is not mature, Monte Carlo simulation takes too long to run, and/or there is a need to perform multiple parametric system design trades that would require an unwieldy number of Monte Carlo runs. G-CAT is formulated as a variable-order square-root linearized Kalman filter (LKF), typically using over 120 filter states. An important property of G-CAT is that it is based on a 6-DOF (degrees of freedom) formulation that completely captures the combined effects of both attitude and translation errors on the propagated trajectories. This ensures its accuracy for guidance, navigation, and control (GN&C) analysis. G-CAT provides the fast turnaround analysis needed for error budgeting in support of mission concept formulations, design trade studies, and proposal development efforts. The main usefulness of a covariance analysis tool such as G-CAT is its ability to calculate the performance envelope directly from a single run, in sharp contrast to running thousands of simulations to obtain similar information using Monte Carlo methods. It does this by propagating the statistics of the overall design rather than simulating individual trajectories. G-CAT supports applications to lunar, planetary, and small-body missions. It characterizes onboard knowledge propagation errors associated with inertial measurement unit (IMU) errors (gyro and accelerometer), gravity errors/dispersions (spherical harmonics, mascons), and radar errors (multiple altimeter beams, multiple Doppler velocimeter beams). G-CAT is a standalone MATLAB-based tool intended to run on any engineer's desktop computer.
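
    To make the single-run covariance idea concrete, here is a minimal Python sketch (a toy two-state model, not G-CAT's 120+ state, 6-DOF formulation) that propagates a knowledge covariance through linearized dynamics and reads off a 3-sigma error directly, where a Monte Carlo approach would need thousands of trajectories.

        import numpy as np

        dt = 1.0
        F = np.array([[1.0, dt],
                      [0.0, 1.0]])      # linearized dynamics: position, velocity
        Q = np.diag([1e-4, 1e-6])       # process noise added each step
        P = np.diag([1.0, 0.01])        # initial knowledge covariance

        for _ in range(100):            # propagate the statistics, not trajectories
            P = F @ P @ F.T + Q

        print("3-sigma position error:", 3.0 * np.sqrt(P[0, 0]))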

  11. LOS selective fading and AN/FRC-170(V) radio hybrid computer simulation phase A report

    NASA Astrophysics Data System (ADS)

    Klukis, M. K.; Lyon, T. I.; Walker, R.

    1981-09-01

    This report documents results of the first phase of modeling, simulation, and study of the dual-diversity AN/FRC-170(V) radio and the frequency-selective fading line-of-sight channel. Both hybrid computer and circuit technologies were used to develop a fast, accurate, and flexible simulation tool to investigate changes and proposed improvements to the design of the AN/FRC-170(V) radio. In addition to the simulation study, a remote hybrid computer terminal was provided to DCEC for interactive study of the modeled radio and channel. Simulated performance of the radio for Rayleigh and line-of-sight two-ray channels with additive noise is included in the report.

  12. Analysis of C/E results of fission rate ratio measurements in several fast lead VENUS-F cores

    NASA Astrophysics Data System (ADS)

    Kochetkov, Anatoly; Krása, Antonín; Baeten, Peter; Vittiglio, Guido; Wagemans, Jan; Bécares, Vicente; Bianchini, Giancarlo; Fabrizio, Valentina; Carta, Mario; Firpo, Gabriele; Fridman, Emil; Sarotto, Massimo

    2017-09-01

    During the GUINEVERE FP6 European project (2006-2011), the zero-power VENUS water-moderated reactor was modified into VENUS-F, a mock-up of a lead-cooled fast-spectrum system with solid components that can be operated in both critical and subcritical mode. The Fast Reactor Experiments for hybrid Applications (FREYA) FP7 project was launched in 2011 to support the designs of the MYRRHA Accelerator Driven System (ADS) and the ALFRED Lead Fast Reactor (LFR). Three VENUS-F critical core configurations simulating the complex MYRRHA core design, and one configuration devoted to the LFR ALFRED core conditions, were investigated in 2015. The MYRRHA-related cores simulated, step by step, design peculiarities such as the BeO reflector and in-pile sections. For all of these cores the fuel assemblies were of a simple design consisting of 30% enriched metallic uranium, lead rodlets to simulate the coolant, and Al2O3 rodlets to simulate the oxide fuel. Fission rate ratios of minor actinides such as Np-237 and Am-241, as well as of Pu-239, Pu-240, Pu-242, and U-238, to U-235 were measured in these VENUS-F critical assemblies with small fission chambers in specially designed locations, to determine the spectral indices under the different neutron spectrum conditions. The measurements have been analyzed using advanced computational tools, including deterministic and stochastic codes, and different nuclear data sets such as JEFF-3.1, JEFF-3.2, ENDF/B-VII.1, and JENDL-4.0. The analysis of the C/E discrepancies will help to improve the nuclear data in the specific energy region of fast-neutron reactor spectra.

  13. Coastal Amplification Laws for the French Tsunami Warning Center: Numerical Modeling and Fast Estimate of Tsunami Wave Heights Along the French Riviera

    NASA Astrophysics Data System (ADS)

    Gailler, A.; Hébert, H.; Schindelé, F.; Reymond, D.

    2017-11-01

    Tsunami modeling tools in the French Tsunami Warning Center operational context provide rapidly derived warning levels with a dimensionless variable at basin scale. A new forecast method based on coastal amplification laws has been tested to estimate the tsunami onshore height, with a focus on the French Riviera test site (Nice area). This fast prediction tool provides a coastal tsunami height distribution, calculated from the numerical simulation of the deep-ocean tsunami amplitude and using a transfer function derived from Green's law. Due to a lack of tsunami observations in the western Mediterranean basin, coastal amplification parameters are here calibrated against high-resolution nested-grid simulations. The preliminary results for the Nice test site, on the basis of nine historical and synthetic sources, show a good agreement with the time-consuming high-resolution modeling: the linear approximation is obtained within 1 min in general and provides estimates within a factor of two in amplitude, although the resonance effects in harbors and bays are not reproduced. In Nice harbor especially, variation in tsunami amplitude cannot really be assessed because of the magnitude range and maximum energy azimuth of possible events to account for. However, this method is well suited for a fast first estimate of the coastal tsunami threat.
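
    The transfer-function idea can be sketched in a few lines. The Python snippet below applies a Green's-law depth scaling to a simulated deep-ocean amplitude; the alpha factor stands in for the site-specific coastal amplification parameters calibrated against nested-grid simulations, and its value here is purely illustrative.

        def coastal_height(deep_amplitude_m, deep_depth_m, coastal_depth_m,
                           alpha=1.0):
            """Green's-law estimate of coastal tsunami height from deep water."""
            return alpha * deep_amplitude_m * (deep_depth_m / coastal_depth_m) ** 0.25

        # Example: 0.10 m offshore amplitude at 2500 m depth, carried to 5 m depth.
        print(coastal_height(0.10, 2500.0, 5.0))  # about 0.47 m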

  14. Coastal amplification laws for the French tsunami Warning Center: numerical modeling and fast estimate of tsunami wave heights along the French Riviera

    NASA Astrophysics Data System (ADS)

    Gailler, A.; Schindelé, F.; Hebert, H.; Reymond, D.

    2017-12-01

    Tsunami modeling tools in the French Tsunami Warning Center operational context currently provide warning levels on a dimensionless scale at basin scale. A new forecast method based on coastal amplification laws has been tested to estimate the tsunami onshore height, with a focus on the French Riviera test site (Nice area). This fast prediction tool provides a coastal tsunami height distribution, calculated from the numerical simulation of the deep-ocean tsunami amplitude and using a transfer function derived from Green's law. Due to a lack of tsunami observations in the western Mediterranean basin, coastal amplification parameters are here calibrated against high-resolution nested-grid simulations. The first encouraging results for the Nice test site, on the basis of nine historical and synthetic sources, show a good agreement with the time-consuming high-resolution modeling: the linear approximation is obtained within about 1 minute in general and provides estimates within a factor of 2 in amplitude, although the resonance effects in harbors and bays are not reproduced. In Nice harbor especially, variation in tsunami amplitude cannot really be assessed because of the magnitude range and maximum energy azimuth of possible events to account for. However, this method is well suited for a fast first estimate of the coastal tsunami threat.

  16. Impact of different policies on unhealthy dietary behaviors in an urban adult population: an agent-based simulation model.

    PubMed

    Zhang, Donglan; Giabbanelli, Philippe J; Arah, Onyebuchi A; Zimmerman, Frederick J

    2014-07-01

    Unhealthy eating is a complex-system problem. We used agent-based modeling to examine the effects of different policies on unhealthy eating behaviors. We developed an agent-based simulation model to represent a synthetic population of adults in Pasadena, CA, and how they make dietary decisions. Data from the 2007 Food Attitudes and Behaviors Survey and other empirical studies were used to calibrate the parameters of the model. Simulations were performed to contrast the potential effects of various policies on the evolution of dietary decisions. Our model showed that a 20% increase in taxes on fast foods would lower the probability of fast-food consumption by 3 percentage points, whereas improving the visibility of positive social norms by 10%, either through community-based or mass-media campaigns, could improve the consumption of fruits and vegetables by 7 percentage points and lower fast-food consumption by 6 percentage points. Zoning policies had no significant impact. Interventions emphasizing healthy eating norms may be more effective than directly targeting food prices or regulating local food outlets. Agent-based modeling may be a useful tool for testing the population-level effects of various policies within complex systems.
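
    As a toy illustration of the kind of policy experiment described above, the Python sketch below runs a deliberately simplified agent population whose fast-food choice responds to price and to the visibility of healthy-eating norms; the coefficients are invented for illustration and are not the paper's calibrated parameters from the 2007 survey.

        import random

        random.seed(1)

        def simulate(tax=0.0, norm_visibility=0.0, n_agents=10000,
                     base_p=0.30, price_sens=0.15, norm_sens=0.6):
            fast_food = 0
            for _ in range(n_agents):
                # Taxes raise the effective price; visible healthy norms
                # discourage fast-food choices.
                p = base_p - price_sens * tax - norm_sens * norm_visibility * base_p
                fast_food += random.random() < max(0.0, p)
            return fast_food / n_agents

        print("baseline          :", simulate())
        print("20% fast-food tax :", simulate(tax=0.20))
        print("+10% norm visible :", simulate(norm_visibility=0.10))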

  17. Strategies for global optimization in photonics design.

    PubMed

    Vukovic, Ana; Sewell, Phillip; Benson, Trevor M

    2010-10-01

    This paper reports on two important issues that arise in the context of the global optimization of photonic components where large problem spaces must be investigated. The first is the implementation of a fast simulation method and associated matrix solver for assessing particular designs and the second, the strategies that a designer can adopt to control the size of the problem design space to reduce runtimes without compromising the convergence of the global optimization tool. For this study an analytical simulation method based on Mie scattering and a fast matrix solver exploiting the fast multipole method are combined with genetic algorithms (GAs). The impact of the approximations of the simulation method on the accuracy and runtime of individual design assessments and the consequent effects on the GA are also examined. An investigation of optimization strategies for controlling the design space size is conducted on two illustrative examples, namely, 60° and 90° waveguide bends based on photonic microstructures, and their effectiveness is analyzed in terms of a GA's ability to converge to the best solution within an acceptable timeframe. Finally, the paper describes some particular optimized solutions found in the course of this work.
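
    The pairing of a fast solver with a genetic algorithm can be shown schematically. In the Python sketch below, a cheap placeholder fitness function stands in for the Mie-scattering/fast-multipole field solver, and the GA machinery (selection, crossover, mutation) is generic rather than the paper's specific implementation.

        import random

        random.seed(0)
        N_RODS = 40  # binary genome: rod present/absent in the microstructure

        def fitness(genome):  # placeholder objective, higher is better
            return sum(genome) - 0.1 * sum(abs(genome[i] - genome[i - 1])
                                           for i in range(1, N_RODS))

        def evolve(pop_size=30, generations=50, p_mut=0.02):
            pop = [[random.randint(0, 1) for _ in range(N_RODS)]
                   for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)
                parents = pop[:pop_size // 2]            # truncation selection
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, N_RODS)    # one-point crossover
                    child = a[:cut] + b[cut:]
                    child = [g ^ (random.random() < p_mut) for g in child]  # mutate
                    children.append(child)
                pop = parents + children
            return max(pop, key=fitness)

        print("best fitness:", fitness(evolve()))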

  18. Hydrogen Financial Analysis Scenario Tool (H2FAST) Documentation

    Science.gov Websites

    Documentation is provided for the web and spreadsheet versions of H2FAST: the H2FAST Web Tool User's Manual and the H2FAST Spreadsheet Tool User's Manual (draft). For technical support, send questions or feedback about H2FAST to H2FAST@nrel.gov.

  19. SOAP. A tool for the fast computation of photometry and radial velocity induced by stellar spots

    NASA Astrophysics Data System (ADS)

    Boisse, I.; Bonfils, X.; Santos, N. C.

    2012-09-01

    We present SOAP (Spot Oscillation And Planet), a software tool made available to the community that simulates the effect of stellar spots and plages on radial velocimetry and photometry. This paper describes the tool release and provides instructions for its use. We present detailed tests against previous computations and real data to assess the code's performance and validate its suitability. We characterize the variations of the radial velocity, line bisector, and photometric amplitude as a function of the main variables: projected stellar rotational velocity, filling factor of the spot, resolution of the spectrograph, linear limb-darkening coefficient, latitude of the spot, and inclination of the star. Finally, we model the spot distributions on the active stars HD 166435, TW Hya, and HD 189733, which reproduce the observations. We show that the software is remarkably fast, allowing several extensions of its capabilities for studying the next challenges in the exoplanet field connected with stellar variability. The tool is available at http://www.astro.up.pt/soap

  20. How to Run FAST Simulations.

    PubMed

    Zimmerman, M I; Bowman, G R

    2016-01-01

    Molecular dynamics (MD) simulations are a powerful tool for understanding enzymes' structures and functions with full atomistic detail. These physics-based simulations model the dynamics of a protein in solution and store snapshots of its atomic coordinates at discrete time intervals. Analysis of the snapshots from these trajectories provides thermodynamic and kinetic properties such as conformational free energies, binding free energies, and transition times. Unfortunately, simulating biologically relevant timescales with brute force MD simulations requires enormous computing resources. In this chapter we detail a goal-oriented sampling algorithm, called fluctuation amplification of specific traits, that quickly generates pertinent thermodynamic and kinetic information by using an iterative series of short MD simulations to explore the vast depths of conformational space. © 2016 Elsevier Inc. All rights reserved.
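
    The chapter's sampling loop can be summarized in schematic form. The Python sketch below captures only the control flow, alternating batches of short simulations with a ranking that rewards progress on a target trait while favoring rarely visited states; run_short_md() is a random-walk placeholder for a real MD engine, and the state bookkeeping is greatly simplified.

        import random

        random.seed(2)

        def run_short_md(state):
            """Placeholder for a short MD run: a random walk on a 1-D trait."""
            return [state + random.gauss(0, 0.5) for _ in range(10)]

        def fast_sampling(rounds=10, sims_per_round=5, alpha=1.0):
            counts = {0.0: 1}  # visit counts per discretized state
            for _ in range(rounds):
                # Rank states by trait value plus a bonus for rarely visited ones.
                ranked = sorted(counts, key=lambda s: s + alpha / counts[s],
                                reverse=True)
                for seed in ranked[:sims_per_round]:
                    for frame in run_short_md(seed):
                        key = round(frame, 1)
                        counts[key] = counts.get(key, 0) + 1
            return max(counts)  # furthest trait value discovered

        print("max trait reached:", fast_sampling())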

  1. Modeling of the UAE Wind Turbine for Refinement of FAST{_}AD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jonkman, J. M.

    The Unsteady Aerodynamics Experiment (UAE) research wind turbine was modeled both aerodynamically and structurally in the FAST{_}AD wind turbine design code, and its response to wind inflows was simulated for a sample of test cases. A study was conducted to determine why wind turbine load magnitude discrepancies (inconsistencies in aerodynamic force coefficients, rotor shaft torque, and out-of-plane bending moments at the blade root across a range of operating conditions) exist between load predictions made by FAST{_}AD and other modeling tools and measured loads taken from the actual UAE wind turbine during the NASA-Ames wind tunnel tests. The acquired experimental test data represent the finest, most accurate set of wind turbine aerodynamic and induced flow field data available today. A sample of the FAST{_}AD model input parameters most critical to the aerodynamics computations was also systematically perturbed to determine their effect on load and performance predictions. Attention was focused on the simpler upwind rotor configuration, zero yaw error test cases. Inconsistencies in input file parameters, such as aerodynamic performance characteristics, explain a noteworthy fraction of the load prediction discrepancies of the various modeling tools.

  2. Sub-half-micron contact window design with 3D photolithography simulator

    NASA Astrophysics Data System (ADS)

    Brainerd, Steve K.; Bernard, Douglas A.; Rey, Juan C.; Li, Jiangwei; Granik, Yuri; Boksha, Victor V.

    1997-07-01

    In state-of-the-art IC design and manufacturing, certain lithography layers have unique requirements. Latitudes and tolerances that apply to contacts and polysilicon gates are tight for such critical layers. Industry experts are discussing the most cost-effective ways to use feature-oriented equipment and materials already developed for these layers. Such requirements introduce new dimensions into the traditionally challenging task of the photolithography engineer when considering various combinations of multiple factors to optimize and control the process. In addition, he/she faces a rapidly increasing cost of experiments, limited time, and scarce access to equipment to conduct them. All these reasons support simulation as an ideal method to satisfy these demands. However, lithography engineers may be easily dissatisfied with a simulation tool when discovering disagreement between simulation and experimental data. The problem is that several parameters used in photolithography simulation are very process specific. Calibration, i.e., matching experimental and simulation data using a specific set of procedures, allows one to use the simulation tool effectively. We present results of a simulation-based approach to optimize photolithography processes for sub-0.5-micron contact windows. Our approach consists of (1) 3D simulation to explore different lithographic options, and (2) calibration to a range of process conditions with extensive use of specifically developed optimization techniques. The choice of a 3D simulator is essential because of the 3D nature of the contact window design problem. We use DEPICT 4.1. This program performs fast aerial image simulation as presented before. For 3D exposure the program uses an extension to three dimensions of the high-numerical-aperture model combined with Fast Fourier Transforms for maximum performance and accuracy. We use the Kim (U.C. Berkeley) model and the fast-marching Level Set method, respectively, for the calculation of resist development rates and resist surface movement during the development process. Calibration efforts were aimed at matching experimental results on contact windows obtained after exposure of a binary mask. Additionally, simulation was applied to conduct quantitative analysis of PSM design capabilities, optical proximity correction, and stepper parameter optimization. Extensive experiments covered exposure (ASML 5500/100D stepper), pre- and post-exposure bake, and development (2.38% TMAH, puddle process) of JSR IX725D2G and TOK iP3500 photoresist films on 200 mm test wafers. 'Aquatar' was used as the top antireflective coating. SEM pictures of developed patterns were analyzed and compared with simulation results for different values of defocus, exposure energy, numerical aperture, and partial coherence.

  3. Simulation tools for scattering corrections in spectrally resolved x-ray computed tomography using McXtrace

    NASA Astrophysics Data System (ADS)

    Busi, Matteo; Olsen, Ulrik L.; Knudsen, Erik B.; Frisvad, Jeppe R.; Kehres, Jan; Dreier, Erik S.; Khalil, Mohamad; Haldrup, Kristoffer

    2018-03-01

    Spectral computed tomography is an emerging imaging method based on recently developed energy-discriminating photon-counting detectors (PCDs). This technique enables measurements at isolated high-energy ranges, in which the dominant interaction between the x-rays and the sample is incoherent scattering. The scattered radiation causes a loss of contrast in the results, and its correction has proven to be a complex problem due to its dependence on energy, material composition, and geometry. Monte Carlo simulations can utilize a physical model to estimate the scattering contribution to the signal, at the cost of high computational time. We present a fast Monte Carlo simulation tool, based on McXtrace, to predict the energy-resolved radiation being scattered and absorbed by objects of complex shapes. We validate the tool through measurements using a single CdTe PCD (Multix ME-100) and use it for scattering correction in a simulation of a spectral CT. We found the correction to account for up to 7% relative amplification in the reconstructed linear attenuation. It is a useful tool for x-ray CT to obtain a more accurate material discrimination, especially in the high-energy range, where incoherent scattering interactions become prevalent (>50 keV).

  4. Adaption of G-TAG Software for Validating Touch and Go Asteroid Sample Return Design Methodology

    NASA Technical Reports Server (NTRS)

    Blackmore, Lars James C.; Acikmese, Behcet; Mandic, Milan

    2012-01-01

    A software tool is used to demonstrate the feasibility of Touch and Go (TAG) sampling for Asteroid Sample Return missions. TAG is a concept whereby a spacecraft is in contact with the surface of a small body, such as a comet or asteroid, for a few seconds or less before ascending to a safe location away from the small body. Previous work at JPL developed the G-TAG simulation tool, which provides a software environment for fast, multi-body simulations of the TAG event. G-TAG is described in Multibody Simulation Software Testbed for Small-Body Exploration and Sampling, (NPO-47196) NASA Tech Briefs, Vol. 35, No. 11 (November 2011), p.54. This current innovation adapts this tool to a mission that intends to return a sample from the surface of an asteroid. In order to demonstrate the feasibility of the TAG concept, the new software tool was used to generate extensive simulations that demonstrate the designed spacecraft meets key requirements. These requirements state that contact force and duration must be sufficient to ensure that enough material from the surface is collected in the brushwheel sampler (BWS), and that the spacecraft must survive the contact and must be able to recover and ascend to a safe position, and maintain velocity and orientation after the contact.

  5. Sound field simulation and acoustic animation in urban squares

    NASA Astrophysics Data System (ADS)

    Kang, Jian; Meng, Yan

    2005-04-01

    Urban squares are important components of cities, and the acoustic environment is important for their usability. While models and formulae for predicting the sound field in urban squares are important for their soundscape design and improvement, acoustic animation tools would be of great importance for designers as well as for the public participation process, given that below a certain sound level, the soundscape evaluation depends mainly on the type of sounds rather than the loudness. This paper first briefly introduces acoustic simulation models developed for urban squares, as well as empirical formulae derived from a series of simulations. It then presents an acoustic animation tool currently being developed. In urban squares there are multiple dynamic sound sources, so the computation time becomes a main concern. Nevertheless, the requirements for acoustic animation in urban squares are relatively low compared to auditoria. As a result, it is important to simplify the simulation process and algorithms. Based on a series of subjective tests in a virtual reality environment with various simulation parameters, a fast simulation method with acceptable accuracy has been explored. [Work supported by the European Commission.]

  6. Tri-FAST Hardware-in-the-Loop Simulation. Volume I. Tri-FAST Hardware-in-the-Loop Simulation at the Advanced Simulation Center

    DTIC Science & Technology

    1979-03-28

    TECHNICAL REPORT T-79-43, Tri-FAST Hardware-in-the-Loop Simulation, Volume 1: Tri-FAST Hardware-in-the-Loop Simulation at the Advanced Simulation Center. Keywords: Tri-FAST; hardware-in-the-loop; ACSL; Advanced Simulation Center; RF target models. The purpose of this report is to document the Tri-FAST missile simulation development and the seeker hardware-in-the-loop simulation at the Advanced Simulation Center.

  7. EBR-II Static Neutronic Calculations by PHISICS / MCNP6 codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paolo Balestra; Carlo Parisi; Andrea Alfonsi

    2016-02-01

    The International Atomic Energy Agency (IAEA) launched a Coordinated Research Project (CRP) on the Shutdown Heat Removal Tests (SHRT) performed in the 1980s at the Experimental Breeder Reactor II (EBR-II), USA. The scope of the CRP is to improve and validate the simulation tools for the study and design of liquid-metal-cooled fast reactors. Training the next generation of fast reactor analysts is considered a further goal of the CRP. In this framework, a static neutronic model was developed, using state-of-the-art neutron transport codes like SCALE/PHISICS (deterministic solution) and MCNP6 (stochastic solution). A comparison between both solutions is briefly illustrated in this summary.

  8. The influence of preferential flow on pressure propagation and landslide triggering of the Rocca Pitigliana landslide

    NASA Astrophysics Data System (ADS)

    Shao, Wei; Bogaard, Thom; Bakker, Mark; Berti, Matteo

    2016-12-01

    The fast pore water pressure response to rain events is an important triggering factor for slope instability. The fast pressure response may be caused by preferential flow that bypasses the soil matrix. Currently, most of the hydro-mechanical models simulate pore water pressure using a single-permeability model, which cannot quantify the effects of preferential flow on pressure propagation and landslide triggering. Previous studies showed that a model based on the linear-diffusion equation can simulate the fast pressure propagation in near-saturated landslides such as the Rocca Pitigliana landslide. In such a model, the diffusion coefficient depends on the degree of saturation, which makes it difficult to use the model for predictions. In this study, the influence of preferential flow on pressure propagation and slope stability is investigated with a 1D dual-permeability model coupled with an infinite-slope stability approach. The dual-permeability model uses two modified Darcy-Richards equations to simultaneously simulate the matrix flow and preferential flow in hillslopes. The simulated pressure head is used in an infinite-slope stability analysis to identify the influence of preferential flow on the fast pressure response and landslide triggering. The dual-permeability model simulates the height and arrival of the pressure peak reasonably well. Performance of the dual-permeability model is as good as or better than the linear-diffusion model even though the dual-permeability model is calibrated for two single pulse rain events only, while the linear-diffusion model is calibrated for each rain event separately. In conclusion, the 1D dual-permeability model is a promising tool for landslides under similar conditions.
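
    The final stability step can be written down compactly. The Python sketch below evaluates the standard infinite-slope factor of safety for a given pore pressure, showing how a fast preferential-flow pressure pulse pushes the slope toward failure; the soil parameters are illustrative, not the Rocca Pitigliana calibration.

        import math

        def factor_of_safety(pore_pressure_kpa, slope_deg=25.0, depth_m=2.0,
                             unit_weight=19.0, cohesion_kpa=5.0, phi_deg=30.0):
            beta = math.radians(slope_deg)
            sigma = unit_weight * depth_m * math.cos(beta) ** 2            # normal stress, kPa
            tau = unit_weight * depth_m * math.sin(beta) * math.cos(beta)  # shear stress, kPa
            resist = cohesion_kpa + (sigma - pore_pressure_kpa) * math.tan(
                math.radians(phi_deg))
            return resist / tau

        for u in (0.0, 10.0, 20.0):  # rising pore pressure drives FS below 1
            print(f"u = {u:4.1f} kPa -> FS = {factor_of_safety(u):.2f}")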

  9. Dynamic PET simulator via tomographic emission projection for kinetic modeling and parametric image studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Häggström, Ida, E-mail: haeggsti@mskcc.org; Beattie, Bradley J.; Schmidtlein, C. Ross

    2016-06-15

    Purpose: To develop and evaluate a fast and simple tool called dPETSTEP (Dynamic PET Simulator of Tracers via Emission Projection) for dynamic PET simulations as an alternative to Monte Carlo (MC), useful for educational purposes and for evaluating the effects of the clinical environment, postprocessing choices, etc., on dynamic and parametric images. Methods: The tool was developed in MATLAB using both new and previously reported modules of PETSTEP (PET Simulator of Tracers via Emission Projection). Time-activity curves are generated for each voxel of the input parametric image, whereby effects of imaging system blurring, counting noise, scatters, randoms, and attenuation are simulated for each frame. Each frame is then reconstructed into images according to the user-specified method, settings, and corrections. Reconstructed images were compared to MC data and to simple Gaussian-noised time-activity curves (GAUSS). Results: dPETSTEP was 8000 times faster than MC. Dynamic images from dPETSTEP had a root mean square error that was within 4% on average of that of MC images, whereas the GAUSS images were within 11%. The average bias in dPETSTEP and MC images was the same, while GAUSS differed by 3 percentage points. Noise profiles in dPETSTEP images conformed well to MC images, confirmed visually by scatter plot histograms, and statistically by tumor region-of-interest histogram comparisons that showed no significant differences (p < 0.01). Compared to GAUSS, dPETSTEP images and noise properties agreed better with MC. Conclusions: The authors have developed a fast and easy one-stop solution for simulations of dynamic PET and parametric images, and demonstrated that it generates both images and subsequent parametric images with very similar noise properties to those of MC images, in a fraction of the time. They believe dPETSTEP to be very useful for generating fast, simple, and realistic results; however, since it uses simple scatter and random models, it may not be suitable for studies investigating these phenomena. dPETSTEP can be downloaded free of cost from https://github.com/CRossSchmidtlein/dPETSTEP.
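
    The core speed trick, generating noisy voxel time-activity curves directly instead of tracking photons, can be illustrated with a toy one-tissue kinetic model; the model, rate constants, and count scaling below are invented for illustration and are simpler than dPETSTEP's actual modules.

        import numpy as np

        rng = np.random.default_rng(0)
        frame_mid = np.arange(0.5, 60.5, 1.0)  # 60 one-minute frames

        def one_tissue_tac(k1=0.1, k2=0.05, cp0=100.0, lam=0.1):
            """C(t) for a one-tissue model with plasma input cp0*exp(-lam*t)."""
            return k1 * cp0 / (lam - k2) * (np.exp(-k2 * frame_mid)
                                            - np.exp(-lam * frame_mid))

        tac = one_tissue_tac()
        counts = rng.poisson(np.maximum(tac, 0.0) * 50.0)  # frame-wise counting noise
        noisy_tac = counts / 50.0
        print("peak activity (true, noisy):", tac.max(), noisy_tac.max())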

  10. MAGE (M-file/Mif Automatic GEnerator): A graphical interface tool for automatic generation of Object Oriented Micromagnetic Framework configuration files and Matlab scripts for results analysis

    NASA Astrophysics Data System (ADS)

    Chęciński, Jakub; Frankowski, Marek

    2016-10-01

    We present a tool for fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows for fast, error-proof, and easy creation of Mifs, without any of the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions complementing it with magnetoresistance and spin-transfer-torque calculations, as well as local magnetization data selection for output. Our software allows for the creation of advanced simulation conditions like simultaneous parameter sweeps and synchronous excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI allowing automated creation of Matlab scripts suitable for analysis of such data with Fourier and wavelet transforms as well as user-defined operations.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karr, Dale G.; Yu, Bingbin; Sirnivas, Senu

    To create long-term solutions for offshore wind turbines in a variety of environmental conditions, CAE tools are needed to model the design-driving loads that interact with an offshore wind turbine system during operation. This report describes our efforts in augmenting existing CAE tools used for offshore wind turbine analysis with a new module that can provide simulation capabilities for ice loading on the system. This augmentation was accomplished by creating an ice-loading module coupled to FAST8, the CAE tool maintained by NREL for simulating land-based and offshore wind turbine dynamics. The new module includes both static and dynamic ice loading that can be applied during a dynamic simulation of the response of an offshore wind turbine. The ice forces can be prescribed, or influenced by the structure's compliant response, or by the dynamics of both the structure and the ice floe. The new module covers the ice failure modes of spalling, buckling, crushing, splitting, and bending. The supporting structure of wind turbines can be modeled as a vertical or sloping form at the waterline. The Inward Battered Guide Structure (IBGS) foundation designed by Keystone Engineering for the Great Lakes was used to study the ice models coupled to FAST8. The IBGS foundation ice-loading simulations in FAST8 were compared to the baseline simulation case without ice loading. Ice conditions reflecting those of Lake Huron at Port Huron and Lake Michigan at North Manitou were studied at a near-rated wind speed of 12 m/s for the NREL 5-MW reference turbine. Simulations were performed for ice loading models 1 through 4 and ice model 6 with their respective sub-models. The purpose of ice model 5 is to investigate ice loading on sloping structures such as ice cones on a monopile, so it is not suitable for multi-membered jacketed structures like the IBGS foundation. The key response parameters from the simulations, shear forces and moments at the tower base and IBGS foundation base, were compared. Ice models 1 and 6 do not significantly affect the tower fore-aft shear and moment. However, ice model 2 (dynamic analyses), model 3 (random ice loading), and model 4 (multiple ice failure zone loading) show an increased effect on the tower fore-aft shear and moment, with a significant effect from ice model 3.1. In general, ice loading creates large reaction forces and moments at the base of the IBGS foundation; the largest occurred in model 1.1 (steady creep ice indentation loading) followed by model 3.1 (random creep ice indentation loading). The power production from the ice loading cases had little deviation from the baseline case without ice loading. For the ultimate limit state (ULS), ice models 1.1 and 3.1 appear to be the most critical models to consider at an early stage of design. Ice model 4 is an important tool for assessing structural fatigue.

  12. An enhanced lumped element electrical model of a double barrier memristive device

    NASA Astrophysics Data System (ADS)

    Solan, Enver; Dirkmann, Sven; Hansen, Mirko; Schroeder, Dietmar; Kohlstedt, Hermann; Ziegler, Martin; Mussenbrock, Thomas; Ochs, Karlheinz

    2017-05-01

    The massive parallel approach of neuromorphic circuits leads to effective methods for solving complex problems. It has turned out that resistive switching devices with a continuous resistance range are potential candidates for such applications. These devices are memristive systems—nonlinear resistors with memory. They are fabricated in nanotechnology, and hence parameter spread during fabrication may hamper reproducible analyses. This issue makes simulation models of memristive devices worthwhile. Kinetic Monte-Carlo simulations based on a distributed model of the device can be used to understand the underlying physical and chemical phenomena. However, such simulations are very time-consuming and convenient neither for investigations of whole circuits nor for real-time applications, e.g. emulation purposes. Instead, a concentrated model of the device can be used for both fast simulations and real-time applications. We introduce an enhanced electrical model of a valence change mechanism (VCM) based double barrier memristive device (DBMD) with a continuous resistance range. This device consists of an ultra-thin memristive layer sandwiched between a tunnel barrier and a Schottky-contact. The introduced model leads to very fast simulations by using usual circuit simulation tools while maintaining physically meaningful parameters. Kinetic Monte-Carlo simulations based on a distributed model and experimental data have been utilized as references to verify the concentrated model.
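
    A generic lumped memristor toy (not the published DBMD equations) conveys what such a concentrated model looks like: one internal state variable sets a continuous resistance, and a nonlinear voltage-driven rate equation moves that state. All values in the Python sketch below are illustrative.

        import math

        Ron, Roff = 1e3, 1e6   # limiting resistances, ohm
        tau, v0 = 1e-3, 0.25   # state time constant and voltage scale

        def step(x, v, dt=1e-6):
            """Advance internal state with a sinh-type voltage nonlinearity."""
            dx = dt / tau * math.sinh(v / v0) * x * (1.0 - x)  # window keeps x in [0, 1]
            return min(1.0, max(0.0, x + dx))

        x, t = 0.5, 0.0
        for i in range(20000):  # 20 ms of a slow 50 Hz sine drive
            v = 0.8 * math.sin(2 * math.pi * 50.0 * t)
            x = step(x, v)
            t += 1e-6
            if i % 5000 == 0:
                R = Ron * x + Roff * (1.0 - x)  # continuous resistance range
                print(f"t = {t * 1e3:5.2f} ms, R = {R:9.0f} ohm")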

  13. Quantifying the Effect of Fast Charger Deployments on Electric Vehicle Utility and Travel Patterns via Advanced Simulation: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, E.; Neubauer, J.; Burton, E.

    The disparate characteristics between conventional (CVs) and battery electric vehicles (BEVs) in terms of driving range, refill/recharge time, and availability of refuel/recharge infrastructure inherently limit the relative utility of BEVs when benchmarked against traditional driver travel patterns. However, given a high penetration of high-power public charging combined with driver tolerance for rerouting travel to facilitate charging on long-distance trips, the difference in utility between CVs and BEVs could be marginalized. We quantify the relationships between BEV utility, the deployment of fast chargers, and driver tolerance for rerouting travel and extending travel durations by simulating BEVs operated over real-world travel patterns using the National Renewable Energy Laboratory's Battery Lifetime Analysis and Simulation Tool for Vehicles (BLAST-V). With support from the U.S. Department of Energy's Vehicle Technologies Office, BLAST-V has been developed to include algorithms for estimating the available range of BEVs prior to the start of trips, for rerouting baseline travel to utilize public charging infrastructure when necessary, and for making driver travel decisions for those trips in the presence of available public charging infrastructure, all while conducting advanced vehicle simulations that account for battery electrical, thermal, and degradation response. Results from BLAST-V simulations on vehicle utility, frequency of inserted stops, duration of charging events, and additional time and distance necessary for rerouting travel are presented to illustrate how BEV utility and travel patterns can be affected by various fast charge deployments.

  15. An Alternative Frictional Boundary Condition for Computational Fluid Dynamics Simulation of Friction Stir Welding

    DOE PAGES

    Chen, Gaoqiang; Feng, Zhili; Zhu, Yucan; ...

    2016-07-11

    For better application of numerical simulation in optimization and design of friction stir welding (FSW), this paper presents a new frictional boundary condition at the tool/workpiece interface for computational fluid dynamics (CFD) modeling of FSW. The proposed boundary condition is based on an implementation of the Coulomb friction model. Using the new boundary condition, the CFD simulation yields a non-uniform distribution of contact state over the tool/workpiece interface, as validated by the experimental weld macrostructure. It is found that an interfacial sticking state is present over a large area of the tool/workpiece interface, while significant interfacial sliding occurs at the shoulder periphery, the lower part of the pin side, and the periphery of the pin bottom. Due to the interfacial sticking, a rotating flow zone is found under the shoulder, in which fast circular motion occurs. The diameter of the rotating flow zone is smaller than the shoulder diameter, which is attributed to the presence of interfacial sliding at the shoulder periphery. For the simulated welding condition, the heat generation due to friction and plastic deformation makes up 54.4% and 45.6% of the total heat generation rate, respectively. The simulated temperature field is validated by its good agreement with the experimental measurements.

  16. Naval electronic warfare simulation for effectiveness assessment and softkill programmability facility

    NASA Astrophysics Data System (ADS)

    Lançon, F.

    2011-06-01

    The anti-ship missile (ASM) threat faced by ships will become more diverse and difficult. Intelligence, rules-of-engagement constraints, and the fast reaction time needed for an effective softkill solution require specific tools to design Electronic Warfare (EW) systems and to integrate them onboard ship. SAGEM provides a decoy launcher system [1] and its associated Naval Electronic Warfare Simulation tool (NEWS) to permit softkill effectiveness analysis for anti-ship missile defence. The NEWS tool generates a virtual environment for missile-ship engagement and counter-measure simulation over a wide spectrum: RF, IR, EO. It integrates the EW Command & Control (EWC2) process implemented in the decoy launcher system and performs Monte-Carlo batch processing to evaluate softkill effectiveness in different engagement situations. NEWS is designed to allow immediate EWC2 process integration from simulation to the real decoy launcher system. By design, it allows the final operator to program, test, and integrate their own EWC2 module and EW library onboard, so the intelligence of each user is protected and the evolution of threats can be taken into account through EW library updates. The objectives of the NEWS tool also include defining a methodology for trial definition and trial data reduction. Growth potential would permit the design of new concepts for EWC2 programmability and real-time effectiveness estimation in EW systems. The tool can also be used for operator training purposes. This paper presents the architecture design, the softkill programmability facility concept, and the flexibility for onboard integration on ships. The concept of this operationally focused simulation, which is to use a single tool for design, development, trial validation, and operational use, will be demonstrated.

  17. Aeroelastic-Acoustics Simulation of Flight Systems

    NASA Technical Reports Server (NTRS)

    Gupta, Kajal K.; Choi, S.; Ibrahim, A.

    2009-01-01

    This paper describes the details of a numerical finite element (FE) based analysis procedure, and a resulting code, for the simulation of the acoustics phenomena arising from aeroelastic interactions. Both the CFD and structural simulations are based on FE discretization employing unstructured grids. The sound pressure level (SPL) on structural surfaces is calculated from the root mean square (RMS) of the unsteady pressure, and the acoustic wave frequencies are computed from a fast Fourier transform (FFT) of the unsteady pressure distribution as a function of time. The resulting tool proves to be unique, as it is designed to analyze complex practical problems, involving large-scale computations, in a routine fashion.
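
    Both post-processing steps named above are standard and easy to sketch. The Python snippet below computes the SPL from the RMS of a synthetic unsteady pressure signal (re 20 micropascal) and extracts the dominant frequency with an FFT; the 500 Hz test signal is illustrative only.

        import numpy as np

        fs = 10000.0                              # sampling rate, Hz
        t = np.arange(0.0, 1.0, 1.0 / fs)
        p = 2.0 * np.sin(2 * np.pi * 500.0 * t)   # unsteady surface pressure, Pa

        p_rms = np.sqrt(np.mean((p - p.mean()) ** 2))
        spl_db = 20.0 * np.log10(p_rms / 20e-6)   # re 20 micropascal
        print(f"SPL = {spl_db:.1f} dB")           # about 97 dB for this signal

        spec = np.abs(np.fft.rfft(p))
        freqs = np.fft.rfftfreq(p.size, 1.0 / fs)
        print("dominant frequency:", freqs[np.argmax(spec)], "Hz")  # 500.0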

  18. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melaina, Marc; Bush, Brian; Penev, Michael

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.

  19. Challenges Facing Design and Analysis Tools

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Broduer, Steve (Technical Monitor)

    2001-01-01

    The design and analysis of future aerospace systems will strongly rely on advanced engineering analysis tools used in combination with risk mitigation procedures. The implications of such a trend place increased demands on these tools to assess off-nominal conditions, residual strength, damage propagation, and extreme loading conditions in order to understand and quantify these effects as they affect mission success. Advances in computer hardware such as CPU processing speed, memory, secondary storage, and visualization provide significant resources for the engineer to exploit in engineering design. The challenges facing design and analysis tools fall into three primary areas. The first area involves mechanics needs such as constitutive modeling, contact and penetration simulation, crack growth prediction, damage initiation and progression prediction, transient dynamics and deployment simulations, and solution algorithms. The second area involves computational needs such as fast, robust solvers, adaptivity for model and solution strategies, control processes for concurrent, distributed computing for uncertainty assessments, and immersive technology. Traditional finite element codes still require fast direct solvers which when coupled to current CPU power enables new insight as a result of high-fidelity modeling. The third area involves decision making by the analyst. This area involves the integration and interrogation of vast amounts of information - some global in character while local details are critical and often drive the design. The proposed presentation will describe and illustrate these areas using composite structures, energy-absorbing structures, and inflatable space structures. While certain engineering approximations within the finite element model may be adequate for global response prediction, they generally are inadequate in a design setting or when local response prediction is critical. Pitfalls to be avoided and trends for emerging analysis tools will be described.

  20. The Development of a 3D LADAR Simulator Based on a Fast Target Impulse Response Generation Approach

    NASA Astrophysics Data System (ADS)

    Al-Temeemy, Ali Adnan

    2017-09-01

    A new laser detection and ranging (LADAR) simulator has been developed, using MATLAB and its graphical user interface, to simulate direct-detection time-of-flight LADAR systems and to produce 3D simulated scanning images under a wide variety of conditions. This simulator models each stage from the laser source to data generation and can be considered an efficient simulation tool for developing LADAR systems and their data-processing algorithms. The novel approach proposed for this simulator is to generate the actual target impulse response. This approach is fast and able to meet demanding scanning requirements without trading fidelity for speed. This leads to a more efficient LADAR simulator and opens up the possibility of simulating LADAR beam propagation more accurately by using a large number of laser footprint samples. The approach is to select only the parts of the target that lie within the laser beam's angular field, by mathematically deriving the required equations and calculating the target angular ranges. The performance of the new simulator has been evaluated under different scanning conditions; the results show significant increases in processing speed in comparison with the conventional approaches used as a baseline in this study. The results also show the simulator's ability to reproduce phenomena related to the scanning process, for example, the type of noise, scanning resolution, and laser beam width.

  1. MACHETE: Environment for Space Networking Evaluation

    NASA Technical Reports Server (NTRS)

    Jennings, Esther H.; Segui, John S.; Woo, Simon

    2010-01-01

    Space exploration missions require the design and implementation of space networking that differs from terrestrial networks. In a space networking architecture, interplanetary communication protocols need to be designed, validated, and evaluated carefully to support different mission requirements. As actual systems are expensive to build, it is essential to have a low-cost method to validate and verify mission/system designs and operations. This can be accomplished through simulation. Simulation can aid design decisions where alternative solutions are being considered, support trade studies, and enable fast study of what-if scenarios. It can be used to identify risks, verify system performance against requirements, and serve as an initial test environment as one moves toward emulation and actual hardware implementation of the systems. We describe the development of the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE), its use cases in supporting architecture trade studies and protocol performance evaluation, and its role in hybrid simulation/emulation. The MACHETE environment contains various tools and interfaces such that users may select the set of tools tailored to the specific simulation end goal. The use cases illustrate tool combinations for simulating space networking in different mission scenarios. This simulation environment is useful in supporting space networking design for planned and future missions, as well as for evaluating the performance of existing networks where non-determinism exists in data traffic and/or link conditions.

  2. Fast Dynamic Simulation-Based Small Signal Stability Assessment and Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acharya, Naresh; Baone, Chaitanya; Veda, Santosh

    2014-12-31

    Power grid planning and operation decisions are made based on simulation of the dynamic behavior of the system. Enabling substantial energy savings while increasing the reliability of the aging North American power grid through improved utilization of existing transmission assets hinges on the adoption of wide-area measurement systems (WAMS) for power system stabilization. However, adoption of WAMS alone will not suffice if the power system is to reach its full entitlement in stability and reliability. It is necessary to enhance predictability with "faster than real-time" dynamic simulations that will enable assessment of dynamic stability margins, proactive real-time control, and improved grid resiliency to fast time-scale phenomena such as cascading network failures. Present-day dynamic simulations are performed only during offline planning studies, considering only worst-case conditions such as summer peak and winter peak days. With the widespread deployment of renewable generation, controllable loads, energy storage devices, and plug-in hybrid electric vehicles expected in the near future, and greater integration of cyber infrastructure (communications, computation, and control), monitoring and controlling the dynamic performance of the grid in real time will become increasingly important. State-of-the-art dynamic simulation tools have limited computational speed and are not suitable for real-time applications, given the large set of contingency conditions to be evaluated. These tools are optimized for best performance on single-processor computers, but the simulation is still several times slower than real time due to its computational complexity. With recent significant advances in numerical methods and computational hardware, expectations have been rising for more efficient and faster techniques to be implemented in power system simulators. This is a natural expectation, given that the core solution algorithms of most commercial simulators were developed decades ago, when High Performance Computing (HPC) resources were not commonly available.

  3. PhySortR: a fast, flexible tool for sorting phylogenetic trees in R.

    PubMed

    Stephens, Timothy G; Bhattacharya, Debashish; Ragan, Mark A; Chan, Cheong Xin

    2016-01-01

    A frequent bottleneck in interpreting phylogenomic output is the need to screen often thousands of trees for features of interest, particularly robust clades of specific taxa, as evidence of monophyletic relationship and/or reticulated evolution. Here we present PhySortR, a fast, flexible R package for classifying phylogenetic trees. Unlike existing utilities, PhySortR allows for identification of both exclusive and non-exclusive clades uniting the target taxa based on tip labels (i.e., leaves) on a tree, with customisable options to assess clades within the context of the whole tree. Using simulated and empirical datasets, we demonstrate the potential and scalability of PhySortR in analysis of thousands of phylogenetic trees without a priori assumption of tree-rooting, and in yielding readily interpretable trees that unambiguously satisfy the query. PhySortR is a command-line tool that is freely available and easily automatable.
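
    PhySortR itself is an R package, but the clade-screening idea is easy to mirror in a few lines of Python with Biopython; the exclusive/non-exclusive test below follows the description above, with invented taxon names and a simple proportion threshold.

        from io import StringIO
        from Bio import Phylo

        targets = {"algaA", "algaB", "algaC"}

        def screen(newick, min_prop=0.9):
            """Classify a tree by whether a clade unites the target taxa."""
            tree = Phylo.read(StringIO(newick), "newick")
            for clade in tree.get_nonterminals():
                tips = {leaf.name for leaf in clade.get_terminals()}
                hits = tips & targets
                if len(hits) >= 2:
                    if tips <= targets:            # only target taxa in the clade
                        return "exclusive"
                    if len(hits) / len(tips) >= min_prop:
                        return "non-exclusive"
            return "no match"

        print(screen("((algaA,(algaB,algaC)),(plantX,plantY));"))  # exclusive
        print(screen("((algaA,plantX),(algaB,plantY));"))          # no match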

  4. Impact of Different Policies on Unhealthy Dietary Behaviors in an Urban Adult Population: An Agent-Based Simulation Model

    PubMed Central

    Giabbanelli, Philippe J.; Arah, Onyebuchi A.; Zimmerman, Frederick J.

    2014-01-01

    Objectives. Unhealthy eating is a complex-system problem. We used agent-based modeling to examine the effects of different policies on unhealthy eating behaviors. Methods. We developed an agent-based simulation model to represent a synthetic population of adults in Pasadena, CA, and how they make dietary decisions. Data from the 2007 Food Attitudes and Behaviors Survey and other empirical studies were used to calibrate the parameters of the model. Simulations were performed to contrast the potential effects of various policies on the evolution of dietary decisions. Results. Our model showed that a 20% increase in taxes on fast foods would lower the probability of fast-food consumption by 3 percentage points, whereas improving the visibility of positive social norms by 10%, either through community-based or mass-media campaigns, could improve the consumption of fruits and vegetables by 7 percentage points and lower fast-food consumption by 6 percentage points. Zoning policies had no significant impact. Conclusions. Interventions emphasizing healthy eating norms may be more effective than directly targeting food prices or regulating local food outlets. Agent-based modeling may be a useful tool for testing the population-level effects of various policies within complex systems. PMID:24832414
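    As a rough illustration of the kind of agent-based comparison described above, the following Monte Carlo sketch reproduces the reported effect sizes with coefficients chosen for exactly that purpose; it is not the calibrated Pasadena model, and all parameters are hypothetical.

```python
import random

# Monte Carlo sketch of the policy comparison above. The coefficients are
# chosen to reproduce the reported effect sizes and are otherwise
# hypothetical; this is not the calibrated Pasadena model.

def share_fast_food(n=200_000, tax=0.0, norm_visibility=0.0, seed=1):
    rng = random.Random(seed)
    eat_fast = 0
    for _ in range(n):
        p = 0.30                        # illustrative baseline propensity
        p -= 0.15 * tax                 # 20% tax -> about -3 points
        p -= 0.60 * norm_visibility     # +10% norm visibility -> about -6 points
        eat_fast += rng.random() < p
    return eat_fast / n

print(f"baseline   : {share_fast_food():.3f}")
print(f"20% tax    : {share_fast_food(tax=0.20):.3f}")
print(f"norms +10% : {share_fast_food(norm_visibility=0.10):.3f}")
```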

  5. A fast forward algorithm for real-time geosteering of azimuthal gamma-ray logging.

    PubMed

    Qin, Zhen; Pan, Heping; Wang, Zhonghao; Wang, Bintao; Huang, Ke; Liu, Shaohua; Li, Gang; Amara Konaté, Ahmed; Fang, Sinan

    2017-05-01

    Geosteering is an effective method to increase the reservoir drilling rate in horizontal wells. Based on the features of an azimuthal gamma-ray logging tool and the spatial location of strata, a fast forward calculation method for azimuthal gamma-ray logging is derived using the natural gamma-ray distribution equation in the formation. The response characteristics of azimuthal gamma-ray logging while drilling in layered formation models with different thicknesses and positions are simulated and summarized using this method. The results indicate that the method is fast and that, when the tool nears a boundary, it can identify the boundary and determine the distance from the logging tool to the boundary in time. Additionally, a simple method based on offset-well information is proposed to determine the formation parameters used by the algorithm in the field. Therefore, the forward method can be used for geosteering in the field. A field example validates that the forward method can determine the distance from the azimuthal gamma-ray logging tool to the boundary for geosteering in real time. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. A fast, parallel algorithm for distant-dependent calculation of crystal properties

    NASA Astrophysics Data System (ADS)

    Stein, Matthew

    2017-12-01

    A fast, parallel algorithm for distant-dependent calculation and simulation of crystal properties is presented along with speedup results and methods of application. An illustrative example is used to compute the Lennard-Jones lattice constants up to 32 significant figures for 4 ≤ p ≤ 30 in the simple cubic, face-centered cubic, body-centered cubic, hexagonal-close-pack, and diamond lattices. In most cases, the known precision of these constants is more than doubled, and in some cases, corrected from previously published figures. The tools and strategies to make this computation possible are detailed along with application to other potentials, including those that model defects.
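    The lattice constants in question are lattice sums of the form L_p = Σ' r^(−p) over all nonzero lattice vectors. A naive truncated sum, sketched below for the simple cubic lattice, conveys the definition and also why a fast, parallel, high-precision algorithm is needed: the truncation error decays only like R^(3−p).

```python
from itertools import product

def lj_lattice_sum(p, R=40):
    """Brute-force simple-cubic lattice sum L_p = sum' (i^2+j^2+k^2)^(-p/2)
    over nonzero integer triples in a cube of half-width R. The truncation
    error decays like R^(3-p), so this naive version yields only a few
    digits; the paper's algorithm reaches 32 significant figures."""
    total = 0.0
    for i, j, k in product(range(-R, R + 1), repeat=3):
        r2 = i * i + j * j + k * k
        if r2:
            total += r2 ** (-p / 2)
    return total

print(lj_lattice_sum(6))   # ~8.4019, the classic simple-cubic value for p = 6
```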

  7. Time-Dependent Simulations of Fast-Wave Heated High-Non-Inductive-Fraction H-Mode Plasmas in the National Spherical Torus Experiment Upgrade

    NASA Astrophysics Data System (ADS)

    Taylor, Gary; Bertelli, Nicola; Gerhardt, Stefan P.; Hosea, Joel C.; Mueller, Dennis; Perkins, Rory J.; Poli, Francesca M.; Wilson, James R.; Raman, Roger

    2017-10-01

    30 MHz fast-wave heating may be an effective tool for non-inductively ramping low-current plasmas to a level suitable for initiating up to 12 MW of neutral beam injection on the National Spherical Tokamak Experiment Upgrade (NSTX-U). Previously on NSTX, 30 MHz fast-wave heating was shown to efficiently and rapidly heat electrons; at the NSTX maximum axial toroidal magnetic field (BT(0)) of 0.55 T, 1.4 MW of 30 MHz heating increased the central electron temperature from 0.2 to 2 keV in 30 ms and generated an H-mode plasma with a non-inductive fraction (fNI) of ~0.7 at a plasma current (Ip) of 300 kA. NSTX-U will operate at BT(0) up to 1 T, with up to 4 MW of 30 MHz power (Prf). Predictive TRANSP free-boundary transport simulations, using the TORIC full-wave spectral code to calculate the fast-wave heating and current drive, have been run for NSTX-U Ip = 300 kA H-mode plasmas. Favorable scaling of fNI with 30 MHz heating power is predicted, with fNI ≥ 1 for Prf ≥ 2 MW.

  8. Effects of MHD instabilities on neutral beam current drive

    NASA Astrophysics Data System (ADS)

    Podestà, M.; Gorelenkova, M.; Darrow, D. S.; Fredrickson, E. D.; Gerhardt, S. P.; White, R. B.

    2015-05-01

    Neutral beam injection (NBI) is one of the primary tools foreseen for heating, current drive (CD) and q-profile control in future fusion reactors such as ITER and a Fusion Nuclear Science Facility. However, fast ions from NBI may also provide the drive for energetic particle-driven instabilities (e.g. Alfvénic modes (AEs)), which in turn redistribute fast ions in both space and energy, thus hampering the control capabilities and overall efficiency of NB-driven current. Based on experiments on the NSTX tokamak (M. Ono et al 2000 Nucl. Fusion 40 557), the effects of AEs and other low-frequency magneto-hydrodynamic instabilities on NB-CD efficiency are investigated. A new fast ion transport model, which accounts for particle transport in phase space as required for resonant AE perturbations, is utilized to obtain consistent simulations of NB-CD through the tokamak transport code TRANSP. It is found that instabilities do indeed reduce the NB-driven current density over most of the plasma radius by up to ∼50%. Moreover, the details of the current profile evolution are sensitive to the specific model used to mimic the interaction between NB ions and instabilities. Implications for fast ion transport modeling in integrated tokamak simulations are briefly discussed.

  9. Effects of MHD instabilities on neutral beam current drive

    DOE PAGES

    Podestà, M.; Gorelenkova, M.; Darrow, D. S.; ...

    2015-04-17

    One of the primary tools foreseen for heating, current drive (CD) and q-profile control in future fusion reactors such as ITER and a Fusion Nuclear Science Facility is neutral beam injection (NBI). However, fast ions from NBI may also provide the drive for energetic particle-driven instabilities (e.g. Alfvénic modes (AEs)), which in turn redistribute fast ions in both space and energy, thus hampering the control capabilities and overall efficiency of NB-driven current. Based on experiments on the NSTX tokamak (M. Ono et al 2000 Nucl. Fusion 40 557), the effects of AEs and other low-frequency magneto-hydrodynamic instabilities on NB-CD efficiency are investigated. A new fast ion transport model, which accounts for particle transport in phase space as required for resonant AE perturbations, is utilized to obtain consistent simulations of NB-CD through the tokamak transport code TRANSP. It is found that instabilities do indeed reduce the NB-driven current density over most of the plasma radius by up to ~50%. Moreover, the details of the current profile evolution are sensitive to the specific model used to mimic the interaction between NB ions and instabilities. Finally, implications for fast ion transport modeling in integrated tokamak simulations are briefly discussed.

  10. Study on Ultra-deep Azimuthal Electromagnetic Resistivity LWD Tool by Influence Quantification on Azimuthal Depth of Investigation and Real Signal

    NASA Astrophysics Data System (ADS)

    Li, Kesai; Gao, Jie; Ju, Xiaodong; Zhu, Jun; Xiong, Yanchun; Liu, Shuai

    2018-05-01

    This paper proposes a new tool design of ultra-deep azimuthal electromagnetic (EM) resistivity logging while drilling (LWD) for deeper geosteering and formation evaluation, which can benefit hydrocarbon exploration and development. First, a forward numerical simulation of azimuthal EM resistivity LWD is created based on the fast Hankel transform (FHT) method, and its accuracy is confirmed under classic formation conditions. Then, a reasonable range of tool parameters is designed by analyzing the logging response. However, modern technological limitations pose challenges to selecting appropriate tool parameters for ultra-deep azimuthal detection under detectable signal conditions. Therefore, this paper uses grey relational analysis (GRA) to quantify the influence of tool parameters on voltage and azimuthal investigation depth. After analysis of thousands of simulation data points under different environmental conditions, a random forest is used to fit the data and identify an optimal combination of tool parameters, owing to its high efficiency and accuracy. Finally, the structure of the ultra-deep azimuthal EM resistivity LWD tool is designed with a theoretical azimuthal investigation depth of 27.42–29.89 m in classic isotropic and anisotropic formations. This design serves as a reliable theoretical foundation for efficient geosteering and formation evaluation in high-angle and horizontal (HA/HZ) wells in the future.
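    Grey relational analysis reduces to a simple computation: deviations of each candidate sequence from a reference sequence are mapped to relational coefficients and averaged into a grade. A minimal sketch with illustrative data follows (ρ = 0.5 is the customary distinguishing coefficient; the paper's exact normalisation may differ).

```python
import numpy as np

# Illustrative grey relational analysis (GRA), as used above to rank how
# strongly each tool parameter tracks a response such as received voltage.

def grey_relational_grade(reference, candidates, rho=0.5):
    """reference: (n,) response series; candidates: (m, n) parameter series,
    all normalised to [0, 1] beforehand. Returns one grade per candidate."""
    delta = np.abs(candidates - reference)            # (m, n) deviations
    d_min, d_max = delta.min(), delta.max()
    coeff = (d_min + rho * d_max) / (delta + rho * d_max)
    return coeff.mean(axis=1)                          # grade = mean coefficient

ref = np.linspace(0, 1, 8)
cands = np.vstack([np.linspace(0, 1, 8) ** 2,          # loosely related
                   np.linspace(0, 1, 8)])              # identical -> grade 1.0
print(grey_relational_grade(ref, cands))
```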

  11. The new ATLAS Fast Calorimeter Simulation

    NASA Astrophysics Data System (ADS)

    Schaarschmidt, J.; ATLAS Collaboration

    2017-10-01

    Current and future needs for large-scale simulated samples motivate the development of reliable fast simulation techniques. The new Fast Calorimeter Simulation is an improved parameterized response of single particles in the ATLAS calorimeter that aims to accurately emulate the key features of the detailed calorimeter response as simulated with Geant4, yet approximately ten times faster. Principal component analysis and machine learning techniques are used to improve the performance and decrease the memory requirements compared to the current version of the ATLAS Fast Calorimeter Simulation. A prototype of this new Fast Calorimeter Simulation is in development and its integration into the ATLAS simulation infrastructure is ongoing.

  12. Telemetry-Enhancing Scripts

    NASA Technical Reports Server (NTRS)

    Maimone, Mark W.

    2009-01-01

    Scripts Providing a Cool Kit of Telemetry Enhancing Tools (SPACKLE) is a set of software tools that fill gaps in the capabilities of other software used in processing downlinked data in Mars Exploration Rover (MER) flight and test-bed operations. SPACKLE tools have helped to accelerate the automatic processing and interpretation of MER mission data, enabling non-experts to understand and/or use MER query and data-product command simulation software tools more effectively. SPACKLE has greatly accelerated some operations and provides new capabilities. The tools of SPACKLE are written, variously, in Perl or the C or C++ language. They perform a variety of search and shortcut functions that include the following: generating text-only, Event Report-annotated, and Web-enhanced views of command sequences; labeling integer enumerations with their symbolic meanings in text messages and engineering channels; systematically detecting corruption within data products; generating text-only displays of data-product catalogs including downlink status; validating and labeling commands related to data products; performing convenient searches of detailed engineering data spanning multiple Martian solar days; generating tables of initial conditions pertaining to engineering, health, and accountability data; simplifying the construction and simulation of command sequences; and performing fast time-format conversions and sorting.

  13. Opportunities for Breakthroughs in Large-Scale Computational Simulation and Design

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia; Alter, Stephen J.; Atkins, Harold L.; Bey, Kim S.; Bibb, Karen L.; Biedron, Robert T.; Carpenter, Mark H.; Cheatwood, F. McNeil; Drummond, Philip J.; Gnoffo, Peter A.

    2002-01-01

    Opportunities for breakthroughs in the large-scale computational simulation and design of aerospace vehicles are presented. Computational fluid dynamics tools to be used within multidisciplinary analysis and design methods are emphasized. The opportunities stem from speedups and robustness improvements in the underlying unit operations associated with simulation (geometry modeling, grid generation, physical modeling, analysis, etc.). Further, an improved programming environment can synergistically integrate these unit operations to leverage the gains. The speedups result from reducing the problem setup time through geometry modeling and grid generation operations, and from reducing the solution time by lowering the operation counts associated with solving the discretized equations to sufficient accuracy. The opportunities are addressed only at a general level here, but an extensive list of references containing further details is included. The opportunities discussed are being addressed through the Fast Adaptive Aerospace Tools (FAAST) element of the Advanced Systems Concept to Test (ASCoT) and the Third Generation Reusable Launch Vehicle (RLV) projects at NASA Langley Research Center. The overall goal is to enable greater inroads into the design process with large-scale simulations.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dorier, Matthieu; Sisneros, Roberto; Bautista Gomez, Leonard

    While many parallel visualization tools now provide in situ visualization capabilities, the trend has been to feed such tools with large amounts of unprocessed output data and let them render everything at the highest possible resolution. This leads to an increased run time of simulations that still have to complete within a fixed-length job allocation. In this paper, we tackle the challenge of enabling in situ visualization under performance constraints. Our approach shuffles data across processes according to its content and filters out part of it in order to feed a visualization pipeline with only a reorganized subset of the data produced by the simulation. Our framework leverages fast, generic evaluation procedures to score blocks of data, using information theory, statistics, and linear algebra. It monitors its own performance and adapts dynamically to achieve appropriate visual fidelity within predefined performance constraints. Experiments on the Blue Waters supercomputer with the CM1 simulation show that our approach enables a 5× speedup with respect to the initial visualization pipeline and is able to meet performance constraints.
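    One generic, information-theoretic scoring procedure of the kind the framework describes is to rank blocks of the simulation output by Shannon entropy and forward only the top-scoring fraction. The sketch below illustrates the idea; the block size, bin count, and keep fraction are arbitrary choices, not the paper's settings.

```python
import numpy as np

# Sketch of content-based filtering in the spirit described above: score
# each block of simulation output by Shannon entropy and keep only the
# most informative fraction for the visualization pipeline.

def entropy(block, bins=32):
    hist, _ = np.histogram(block, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def select_blocks(field, block=16, keep=0.25):
    """Split a 2-D field into block x block tiles and return the (row, col)
    offsets of the top `keep` fraction ranked by entropy."""
    ny, nx = field.shape
    scores = [(entropy(field[j:j + block, i:i + block]), j, i)
              for j in range(0, ny, block) for i in range(0, nx, block)]
    scores.sort(reverse=True)
    n_keep = max(1, int(keep * len(scores)))
    return [(j, i) for _, j, i in scores[:n_keep]]
```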

  15. ModFossa: A library for modeling ion channels using Python.

    PubMed

    Ferneyhough, Gareth B; Thibealut, Corey M; Dascalu, Sergiu M; Harris, Frederick C

    2016-06-01

    The creation and simulation of ion channel models using continuous-time Markov processes is a powerful and well-used tool in the field of electrophysiology and ion channel research. While several software packages exist for the purpose of ion channel modeling, most are GUI based, and none are available as a Python library. In an attempt to provide an easy-to-use, yet powerful Markov model-based ion channel simulator, we have developed ModFossa, a Python library supporting easy model creation and stimulus definition, complete with a fast numerical solver, and attractive vector graphics plotting.
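    The underlying computation such a library performs is the solution of a master equation for a continuous-time Markov chain. A minimal two-state (closed/open) example, independent of ModFossa's actual API and with illustrative rates, follows.

```python
import numpy as np
from scipy.linalg import expm

# Minimal continuous-time Markov model of a two-state (closed/open) channel,
# the kind of kinetic scheme a library like ModFossa lets you assemble.
# Rates are illustrative.

k_co, k_oc = 100.0, 50.0          # 1/s transition rates C->O and O->C
Q = np.array([[-k_co,  k_co],     # generator matrix; rows sum to zero
              [ k_oc, -k_oc]])

p0 = np.array([1.0, 0.0])         # start in the closed state
for t in (0.0, 0.005, 0.02, 0.1):
    p = p0 @ expm(Q * t)          # master equation solution p(t) = p0 e^{Qt}
    print(f"t = {t*1e3:5.1f} ms  P(open) = {p[1]:.3f}")

# steady state: P(open) = k_co / (k_co + k_oc) = 2/3
```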

  16. sedFlow - a tool for simulating fractional bedload transport and longitudinal profile evolution in mountain streams

    NASA Astrophysics Data System (ADS)

    Heimann, F. U. M.; Rickenmann, D.; Turowski, J. M.; Kirchner, J. W.

    2015-01-01

    Especially in mountainous environments, the prediction of sediment dynamics is important for managing natural hazards, assessing in-stream habitats and understanding geomorphic evolution. We present the new modelling tool sedFlow for simulating fractional bedload transport dynamics in mountain streams. sedFlow is a one-dimensional model that aims to realistically reproduce the total transport volumes and overall morphodynamic changes resulting from sediment transport events such as major floods. The model is intended for temporal scales from the individual event (several hours to a few days) up to longer-term evolution of stream channels (several years). The envisaged spatial scale covers complete catchments at a spatial discretisation of several tens of metres to a few hundreds of metres. sedFlow can deal with the effects of streambeds that slope uphill in a downstream direction and uses recently proposed and tested approaches for quantifying macro-roughness effects in steep channels. sedFlow offers different options for bedload transport equations, flow-resistance relationships and other elements, which can be selected to fit the current application in a particular catchment. Local grain-size distributions are dynamically adjusted according to the transport dynamics of each grain-size fraction. sedFlow features fast calculations and straightforward pre- and postprocessing of simulation data. The high simulation speed allows for simulations of several years, which can be used, e.g., to assess the long-term impact of river engineering works or climate change effects. In combination with the straightforward pre- and postprocessing, the fast calculations facilitate efficient workflows for the simulation of individual flood events, because the modeller gets immediate results as direct feedback on the selected parameter inputs. The model is provided together with its complete source code free of charge under the terms of the GNU General Public License (GPL) (www.wsl.ch/sedFlow). Examples of the application of sedFlow are given in a companion article by Heimann et al. (2015).

  17. Electron emission from condensed phase material induced by fast protons.

    PubMed

    Shinpaugh, J L; McLawhorn, R A; McLawhorn, S L; Carnes, K D; Dingfelder, M; Travia, A; Toburen, L H

    2011-02-01

    Monte Carlo track simulation has become an important tool in radiobiology. Monte Carlo transport codes commonly rely on elastic and inelastic electron scattering cross sections determined using theoretical methods supplemented with gas-phase data; experimental condensed phase data are often unavailable or infeasible to obtain. The largest uncertainties in the theoretical methods exist for low-energy electrons, which are important for simulating electron track ends. To test the reliability of these codes to deal with low-energy electron transport, yields of low-energy secondary electrons ejected from thin foils have been measured following passage of fast protons. Fast ions, where interaction cross sections are well known, provide the initial spectrum of low-energy electrons that subsequently undergo elastic and inelastic scattering in the material before exiting the foil surface and being detected. These data, measured as a function of the energy and angle of the emerging electrons, can provide tests of the physics of electron transport. Initial measurements from amorphous solid water frozen to a copper substrate indicated substantial disagreement with MC simulation, although questions remained because of target charging. More recent studies, using different freezing techniques, do not exhibit charging, but confirm the disagreement seen earlier between theory and experiment. We now have additional data on the absolute differential electron yields from copper, aluminum and gold, as well as for thin films of frozen hydrocarbons. Representative data are presented.

  18. Electron emission from condensed phase material induced by fast protons†

    PubMed Central

    Shinpaugh, J. L.; McLawhorn, R. A.; McLawhorn, S. L.; Carnes, K. D.; Dingfelder, M.; Travia, A.; Toburen, L. H.

    2011-01-01

    Monte Carlo track simulation has become an important tool in radiobiology. Monte Carlo transport codes commonly rely on elastic and inelastic electron scattering cross sections determined using theoretical methods supplemented with gas-phase data; experimental condensed phase data are often unavailable or infeasible to obtain. The largest uncertainties in the theoretical methods exist for low-energy electrons, which are important for simulating electron track ends. To test the reliability of these codes to deal with low-energy electron transport, yields of low-energy secondary electrons ejected from thin foils have been measured following passage of fast protons. Fast ions, where interaction cross sections are well known, provide the initial spectrum of low-energy electrons that subsequently undergo elastic and inelastic scattering in the material before exiting the foil surface and being detected. These data, measured as a function of the energy and angle of the emerging electrons, can provide tests of the physics of electron transport. Initial measurements from amorphous solid water frozen to a copper substrate indicated substantial disagreement with MC simulation, although questions remained because of target charging. More recent studies, using different freezing techniques, do not exhibit charging, but confirm the disagreement seen earlier between theory and experiment. We now have additional data on the absolute differential electron yields from copper, aluminum and gold, as well as for thin films of frozen hydrocarbons. Representative data are presented. PMID:21183539

  19. Monte Carlo simulations in Nuclear Medicine

    NASA Astrophysics Data System (ADS)

    Loudos, George K.

    2007-11-01

    Molecular imaging technologies provide unique abilities to localise signs of disease before symptoms appear, assist in drug testing, optimize and personalize therapy, and assess the efficacy of treatment regimes for different types of cancer. Monte Carlo simulation packages are used as an important tool for the optimal design of detector systems. In addition, they have demonstrated potential to improve image quality and acquisition protocols. Many general-purpose codes (MCNP, Geant4, etc.) and dedicated codes (SimSET, etc.) have been developed, aiming to provide accurate and fast results. Special emphasis is given to the GATE toolkit. The GATE code, currently under development by the OpenGATE collaboration, is the most accurate and promising code for performing realistic simulations. The purpose of this article is to introduce the non-expert reader to the current status of MC simulations in nuclear medicine, briefly provide examples of currently simulated systems, and present future challenges that include simulation of clinical studies and dosimetry applications.

  20. Integrated tokamak modeling: when physics informs engineering and research planning

    NASA Astrophysics Data System (ADS)

    Poli, Francesca

    2017-10-01

    Simulations that integrate virtually all the relevant engineering and physics aspects of a real tokamak experiment are a powerful tool for experimental interpretation, model validation and planning for both present and future devices. This tutorial guides the reader through the building blocks of an ``integrated'' tokamak simulation, such as magnetic flux diffusion, thermal, momentum and particle transport, external heating and current drive sources, and wall particle sources and sinks. Emphasis is given to the connection and interplay between external actuators and plasma response, between the slow time scales of the current diffusion and the fast time scales of transport, and to how reduced and high-fidelity models can contribute to simulating a whole device. To illustrate the potential and limitations of integrated tokamak modeling for discharge prediction, a helium plasma scenario for the ITER pre-nuclear phase is taken as an example. This scenario presents challenges because it requires core-edge integration and advanced models for the interaction between waves and fast ions, which are subject to a limited experimental database for validation and guidance. Starting from a scenario obtained by re-scaling parameters from the demonstration inductive ``ITER baseline'', it is shown how self-consistent simulations that encompass both core and edge plasma regions, as well as high-fidelity heating and current drive source models, are needed to set constraints on the density, magnetic field and heating scheme. This tutorial aims at demonstrating how integrated modeling, when used with an adequate level of criticism, can not only support the design of operational scenarios, but also help to assess the limitations and gaps in the available models, thus indicating where improved modeling tools are required and how present experiments can help their validation and inform research planning. Work supported by DOE under DE-AC02-09CH1146.

  1. Feasibility assessment of the interactive use of a Monte Carlo algorithm in treatment planning for intraoperative electron radiation therapy

    NASA Astrophysics Data System (ADS)

    Guerra, Pedro; Udías, José M.; Herranz, Elena; Santos-Miranda, Juan Antonio; Herraiz, Joaquín L.; Valdivieso, Manlio F.; Rodríguez, Raúl; Calama, Juan A.; Pascau, Javier; Calvo, Felipe A.; Illana, Carlos; Ledesma-Carbayo, María J.; Santos, Andrés

    2014-12-01

    This work analysed the feasibility of using a fast, customized Monte Carlo (MC) method to perform accurate computation of dose distributions during pre- and intraplanning of intraoperative electron radiation therapy (IOERT) procedures. The MC method that was implemented, which has been integrated into a specific innovative simulation and planning tool, is able to simulate the fate of thousands of particles per second, and it was the aim of this work to determine the level of interactivity that could be achieved. The planning workflow enabled calibration of the imaging and treatment equipment, as well as manipulation of the surgical frame and insertion of the protection shields around the organs at risk and other beam modifiers. In this way, the multidisciplinary team involved in IOERT has all the tools necessary to perform complex MC dosage simulations adapted to their equipment in an efficient and transparent way. To assess the accuracy and reliability of this MC technique, dose distributions for a monoenergetic source were compared with those obtained using a general-purpose software package used widely in medical physics applications. Once accuracy of the underlying simulator was confirmed, a clinical accelerator was modelled and experimental measurements in water were conducted. A comparison was made with the output from the simulator to identify the conditions under which accurate dose estimations could be obtained in less than 3 min, which is the threshold imposed to allow for interactive use of the tool in treatment planning. Finally, a clinically relevant scenario, namely early-stage breast cancer treatment, was simulated with pre- and intraoperative volumes to verify that it was feasible to use the MC tool intraoperatively and to adjust dose delivery based on the simulation output, without compromising accuracy. The workflow provided a satisfactory model of the treatment head and the imaging system, enabling proper configuration of the treatment planning system and providing good accuracy in the dosage simulation.

  2. More Fight-Less Fuel: Reducing Fuel Burn through Ground Process Improvement

    DTIC Science & Technology

    2013-06-01

    These joint government and commercial air operations management suites are fast, accurate, and offer many of the same tools as SPADE. However, the U.S...passing hour of the day. Simulating the operations at an airfield is similar to a host of related operations management problems including restaurant...flight line may yield significant fuel and cost reductions. Focusing on the efficient use of ground resources through air operations management in a

  3. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, B.; Penev, M.; Melaina, M.

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.
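    At the core of any such analysis is discounted cash flow. A minimal net-present-value sketch with hypothetical station numbers (not NREL defaults) illustrates the kind of computation the tool automates.

```python
# Illustrative discounted-cash-flow core of the kind of financial analysis
# H2FAST performs; inputs are hypothetical, not NREL defaults.

def npv(rate, cash_flows):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1.0 + rate) ** yr for yr, cf in enumerate(cash_flows))

station = [-1_200_000] + [180_000] * 15   # capex, then 15 years of net revenue
print(f"NPV at 8%: ${npv(0.08, station):,.0f}")
```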

  4. PeneloPET, a Monte Carlo PET simulation tool based on PENELOPE: features and validation

    NASA Astrophysics Data System (ADS)

    España, S; Herraiz, J L; Vicente, E; Vaquero, J J; Desco, M; Udias, J M

    2009-03-01

    Monte Carlo simulations play an important role in positron emission tomography (PET) imaging, as an essential tool for the research and development of new scanners and for advanced image reconstruction. PeneloPET, a PET-dedicated Monte Carlo tool, is presented and validated in this work. PeneloPET is based on PENELOPE, a Monte Carlo code for the simulation of the transport in matter of electrons, positrons and photons, with energies from a few hundred eV to 1 GeV. PENELOPE is robust, fast and very accurate, but it may be unfriendly to people not acquainted with the FORTRAN programming language. PeneloPET is an easy-to-use application which allows comprehensive simulations of PET systems within PENELOPE. Complex and realistic simulations can be set by modifying a few simple input text files. Different levels of output data are available for analysis, from sinogram and lines-of-response (LORs) histogramming to fully detailed list mode. These data can be further exploited with the preferred programming language, including ROOT. PeneloPET simulates PET systems based on crystal array blocks coupled to photodetectors and allows the user to define radioactive sources, detectors, shielding and other parts of the scanner. The acquisition chain is simulated in high level detail; for instance, the electronic processing can include pile-up rejection mechanisms and time stamping of events, if desired. This paper describes PeneloPET and shows the results of extensive validations and comparisons of simulations against real measurements from commercial acquisition systems. PeneloPET is being extensively employed to improve the image quality of commercial PET systems and for the development of new ones.

  5. NetCoffee: a fast and accurate global alignment approach to identify functionally conserved proteins in multiple networks.

    PubMed

    Hu, Jialu; Kehr, Birte; Reinert, Knut

    2014-02-15

    Owing to recent advancements in high-throughput technologies, protein-protein interaction networks of more and more species become available in public databases. The question of how to identify functionally conserved proteins across species attracts a lot of attention in computational biology. Network alignments provide a systematic way to solve this problem. However, most existing alignment tools encounter limitations in tackling this problem. Therefore, the demand for faster and more efficient alignment tools is growing. We present a fast and accurate algorithm, NetCoffee, which makes it possible to find a global alignment of multiple protein-protein interaction networks. NetCoffee searches for a global alignment by maximizing a target function using simulated annealing on a set of weighted bipartite graphs that are constructed using a triplet approach similar to T-Coffee. To assess its performance, NetCoffee was applied to four real datasets. Our results suggest that NetCoffee remedies several limitations of previous algorithms, outperforms all existing alignment tools in terms of speed and nevertheless identifies biologically meaningful alignments. The source code and data are freely available for download under the GNU GPL v3 license at https://code.google.com/p/netcoffee/.
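    The optimization loop itself is standard simulated annealing: propose a perturbed alignment, always accept improvements, and accept deteriorations with a Boltzmann probability that shrinks as the temperature cools. A generic skeleton follows; `score` and `random_neighbor` stand in for NetCoffee's bipartite-graph machinery and are hypothetical.

```python
import math
import random

# Generic simulated-annealing skeleton of the search NetCoffee performs:
# maximize a target score over candidate alignments, occasionally accepting
# downhill moves to escape local optima.

def anneal(state, score, random_neighbor, t0=1.0, cooling=0.995, steps=20_000):
    best, best_s = state, score(state)
    cur, cur_s, t = state, best_s, t0
    for _ in range(steps):
        cand = random_neighbor(cur)
        cand_s = score(cand)
        # always accept improvements; accept worse moves with Boltzmann prob.
        if cand_s >= cur_s or random.random() < math.exp((cand_s - cur_s) / t):
            cur, cur_s = cand, cand_s
            if cur_s > best_s:
                best, best_s = cur, cur_s
        t *= cooling
    return best, best_s
```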

  6. How to resolve microsecond current fluctuations in single ion channels: The power of beta distributions

    PubMed Central

    Schroeder, Indra

    2015-01-01

    A main ingredient for the understanding of structure/function correlates of ion channels is the quantitative description of single-channel gating and conductance. However, a wealth of information provided by fast current fluctuations beyond the temporal resolution of the recording system is often ignored, even though it is close to the time window accessible to molecular dynamics simulations. This kind of current fluctuation poses a special technical challenge, because individual opening/closing or blocking/unblocking events cannot be resolved, and the resulting averaging over undetected events decreases the apparent single-channel current. Here, I briefly summarize the history of fast-current fluctuation analysis and focus on the so-called "beta distributions." This tool exploits characteristics of the excess noise that current fluctuations induce on amplitude histograms to reconstruct the true single-channel current and kinetic parameters. A guideline for the analysis and recent applications demonstrate that constructing theoretical beta distributions by Markov model simulations offers maximum flexibility compared to analytical solutions. PMID:26368656
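    The effect being exploited can be reproduced in a few lines: a two-state channel gating faster than the recording bandwidth yields filtered amplitudes lying between the closed and open levels, broadening the amplitude histogram. The sketch below simulates this with illustrative rates; it demonstrates the excess noise, not the beta-distribution fit itself.

```python
import numpy as np
from scipy.signal import lfilter

# A two-state channel gating much faster than the recording filter produces
# amplitudes between the closed (0) and open (1) levels -- the "excess
# noise" that beta-distribution analysis exploits. Rates are illustrative.

rng = np.random.default_rng(0)
dt, n = 1e-6, 200_000              # 1 us steps, 0.2 s of record
k_open = k_close = 5e4             # gating ~10x faster than the filter below

state, s = np.empty(n), 0.0
for i in range(n):                 # two-state Markov gating: 0 closed, 1 open
    if rng.random() < (k_open if s == 0.0 else k_close) * dt:
        s = 1.0 - s
    state[i] = s

alpha = dt / 100e-6                # first-order filter, ~100 us time constant
filtered = lfilter([alpha], [1, alpha - 1], state)

hist, _ = np.histogram(filtered, bins=50, range=(0.0, 1.0))
print("fraction of samples between the two levels:",
      hist[5:45].sum() / hist.sum())
```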

  7. Building Airport Surface HITL Simulation Capability

    NASA Technical Reports Server (NTRS)

    Chinn, Fay Cherie

    2016-01-01

    FutureFlight Central (FFC) is a high-fidelity, real-time simulator designed to study surface operations and automation. As an air traffic control tower simulator, FFC allows stakeholders such as the FAA, controllers, pilots, airports, and airlines to develop and test advanced surface and terminal-area concepts and automation, including NextGen and beyond automation concepts and tools. These technologies will improve the safety, capacity and environmental issues facing the National Airspace System. FFC also has extensive video streaming capabilities, which, combined with the 3-D database capability, make the facility ideal for any research needing an immersive virtual and/or video environment. FutureFlight Central allows human-in-the-loop testing, which accommodates human interactions and errors, giving a more complete picture than fast-time simulations. This presentation describes FFC's capabilities and the components necessary to build an airport surface human-in-the-loop simulation capability.

  8. Secure FAST: Security Enhancement in the NATO Time Sensitive Targeting Tool

    DTIC Science & Technology

    2010-11-01

    ...and collaboration tool, designed to aid in the tracking and prosecuting of Time Sensitive Targets. The FAST tool provides user level authentication and authorisation in terms of security. It uses operating system level security but does not provide application level security for...

  9. On Fast Post-Processing of Global Positioning System Simulator Truth Data and Receiver Measurements and Solutions Data

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Day, John H. (Technical Monitor)

    2000-01-01

    Post-processing of data related to a Global Positioning System (GPS) simulation is an important activity in the qualification of a GPS receiver for space flight. Because a GPS simulator is a critical resource, it is desirable to move the pertinent simulation data off the simulator as soon as a test is completed. The simulator data files are usually moved to a Personal Computer (PC), where the post-processing of the receiver-logged measurements and solutions data and the simulated data is performed. Typically, post-processing is accomplished using PC-based commercial software languages and tools. Because of the generality of commercial software systems, their general-purpose functions are notoriously slow and more often than not are the bottleneck, even for short-duration experiments. For example, it may take 8 hours to post-process data from a 6-hour simulation. There is a need to do post-processing faster, especially in order to use previous test results as feedback for the next simulation setup. This paper demonstrates that a fast software linear interpolation algorithm is applicable to a large class of engineering problems, like GPS simulation data post-processing, where computational time is a critical resource and is one of the most important considerations. An approach is developed that speeds up post-processing by an order of magnitude. It is based on improving the bottleneck interpolation algorithm using a priori information that is specific to the GPS simulation application. The presented post-processing scheme was used in support of a few successful space flight missions carrying GPS receivers. A future approach to solving the post-processing performance problem using Field Programmable Gate Array (FPGA) technology is described.
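    The a priori information in this setting is that simulator truth data lies on a fixed time step, so the bracketing samples for any query time can be computed directly instead of searched for. A sketch of that uniform-grid shortcut, checked against NumPy's generic routine, follows (array sizes are illustrative).

```python
import numpy as np

def interp_uniform(t_query, t0, dt, values):
    """Linear interpolation when samples are known to lie at t0 + k*dt,
    so the bracketing index is O(1) arithmetic rather than a search."""
    k = np.clip(((t_query - t0) / dt).astype(int), 0, len(values) - 2)
    frac = (t_query - (t0 + k * dt)) / dt
    return values[k] + frac * (values[k + 1] - values[k])

t0, dt = 0.0, 1.0                                   # 1 Hz truth data
values = np.sin(0.01 * np.arange(21_600))           # 6 hours of samples
queries = np.random.uniform(0.0, 21_599.0, 1_000_000)
fast = interp_uniform(queries, t0, dt, values)
generic = np.interp(queries, t0 + dt * np.arange(values.size), values)
print("max deviation from np.interp:", np.abs(fast - generic).max())
```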

  10. PyFly: A fast, portable aerodynamics simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Daniel; Ghommem, M.; Collier, Nathaniel O.

    Here, we present a fast, user-friendly implementation of a potential flow solver based on the unsteady vortex lattice method (UVLM), namely PyFly. UVLM computes the aerodynamic loads applied on lifting surfaces while capturing unsteady effects such as the added mass forces, the growth of bound circulation, and the wake, while assuming that the flow separation location is known a priori. This method is based on discretizing the body surface into a lattice of vortex rings and relies on the Biot–Savart law to construct the velocity field at every point in the simulated domain. We introduce the pointwise approximation approach to simulate the interactions of the far-field vortices to overcome the computational burden associated with the classical implementation of UVLM. The computational framework uses the Python programming language to provide an easy-to-handle user interface, while the computational kernels are written in Fortran. The mixed-language approach enables high performance regarding solution time and great flexibility concerning ease of code adaptation to different system configurations and applications. The computational tool predicts the unsteady aerodynamic behavior of multiple moving bodies (e.g., flapping wings, rotating blades, suspension bridges) subject to incoming air. The aerodynamic simulator can also deal with enclosure effects, multi-body interactions, and B-spline representation of body shapes. Finally, we simulate different aerodynamic problems to illustrate the usefulness and effectiveness of PyFly.

  11. PyFly: A fast, portable aerodynamics simulator

    DOE PAGES

    Garcia, Daniel; Ghommem, M.; Collier, Nathaniel O.; ...

    2018-03-14

    Here, we present a fast, user-friendly implementation of a potential flow solver based on the unsteady vortex lattice method (UVLM), namely PyFly. UVLM computes the aerodynamic loads applied on lifting surfaces while capturing unsteady effects such as the added mass forces, the growth of bound circulation, and the wake, while assuming that the flow separation location is known a priori. This method is based on discretizing the body surface into a lattice of vortex rings and relies on the Biot–Savart law to construct the velocity field at every point in the simulated domain. We introduce the pointwise approximation approach to simulate the interactions of the far-field vortices to overcome the computational burden associated with the classical implementation of UVLM. The computational framework uses the Python programming language to provide an easy-to-handle user interface, while the computational kernels are written in Fortran. The mixed-language approach enables high performance regarding solution time and great flexibility concerning ease of code adaptation to different system configurations and applications. The computational tool predicts the unsteady aerodynamic behavior of multiple moving bodies (e.g., flapping wings, rotating blades, suspension bridges) subject to incoming air. The aerodynamic simulator can also deal with enclosure effects, multi-body interactions, and B-spline representation of body shapes. Finally, we simulate different aerodynamic problems to illustrate the usefulness and effectiveness of PyFly.
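    The elementary operation behind both records above is the Biot–Savart induced velocity of a straight vortex filament, evaluated for every ring edge and every collocation point. A sketch of the standard formula (e.g. as given in Katz & Plotkin) follows; it is an illustration, not PyFly's Fortran kernel.

```python
import numpy as np

# Biot-Savart induced velocity at point P of a straight vortex filament
# running from A to B with circulation gamma -- the operation UVLM repeats
# for every ring edge in the lattice (standard formula, e.g. Katz & Plotkin).

def segment_velocity(P, A, B, gamma, eps=1e-9):
    r1, r2 = P - A, P - B
    cross = np.cross(r1, r2)
    denom = np.dot(cross, cross)
    if denom < eps:                        # P lies on the filament axis
        return np.zeros(3)
    r0 = B - A
    k = gamma / (4.0 * np.pi * denom) * (
        np.dot(r0, r1) / np.linalg.norm(r1)
        - np.dot(r0, r2) / np.linalg.norm(r2))
    return k * cross

# velocity one unit above the midpoint of a unit-strength filament
print(segment_velocity(np.array([0.5, 0.0, 1.0]),
                       np.array([0.0, 0.0, 0.0]),
                       np.array([1.0, 0.0, 0.0]), gamma=1.0))
```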

  12. Soapy: an adaptive optics simulation written purely in Python for rapid concept development

    NASA Astrophysics Data System (ADS)

    Reeves, Andrew

    2016-07-01

    Soapy is a newly developed Adaptive Optics (AO) simulation which aims to be a flexible and fast-to-use toolkit for many applications in the field of AO. It is written purely in the Python language, adding to and taking advantage of the already rich ecosystem of scientific libraries and programs. The simulation has been designed to be extremely modular, such that each component can be used stand-alone for projects which do not require a full end-to-end simulation. Ease of use, modularity and code clarity have been prioritised at the expense of computational performance. Though this means the code is not yet suitable for large studies of Extremely Large Telescope AO systems, it is well suited to education, exploration of new AO concepts and investigations of current-generation telescopes.

  13. Investigation of neutron interactions with Ge detectors

    NASA Astrophysics Data System (ADS)

    Baginova, Miloslava; Vojtyla, Pavol; Povinec, Pavel P.

    2018-07-01

    Interactions of neutrons with a high-purity germanium detector were studied experimentally and by simulations using the GEANT4 toolkit. Elastic and inelastic scattering of fast neutrons as well as neutron capture on Ge nuclei were observed. Peaks induced by inelastic scattering of neutrons on 70Ge, 72Ge, 73Ge, 74Ge and 76Ge were clearly visible in the γ-ray spectra. In addition, peaks due to inelastic scattering of neutrons on copper and lead nuclei, including the well-known peak of 208Pb at 2614.51 keV, were detected. The GEANT4 simulations showed that the simulated spectrum was in good agreement with the experimental one. Differences between the simulated and measured spectra were due to the high γ-ray intensity of the neutron source used, the physics implemented in GEANT4, and contamination of the neutron source.

  14. Subnanosecond breakdown development in high-voltage pulse discharge: Effect of secondary electron emission

    NASA Astrophysics Data System (ADS)

    Alexandrov, A. L.; Schweigert, I. V.; Zakrevskiy, Dm. E.; Bokhan, P. A.; Gugin, P.; Lavrukhin, M.

    2017-10-01

    A subnanosecond breakdown in high-voltage pulse discharge may be a key tool for superfast commutation of high-power devices. The breakdown in a high-voltage open discharge at mid-to-high pressure in helium was studied in experiment and in kinetic simulations. A kinetic model of electron avalanche development was constructed, based on PIC-MCC simulations, including the dynamics of electrons, ions and fast helium atoms produced by ion scattering. Special attention was paid to electron emission processes from the cathode, such as photoemission by Doppler-shifted resonant photons produced in excitation processes involving fast atoms; electron emission by ion and fast-atom bombardment of the cathode; and secondary electron emission (SEE) by hot electrons from the bulk plasma. The simulations show that the accumulation of fast atoms is the main driver of emission growth at the early stage of breakdown, but at the final stage, when the voltage across the plasma gap diminishes, it is the SEE that is responsible for the subnanosecond rate of current growth. It was shown that the characteristic time of the current growth can be controlled by the SEE yield. The influence of the SEE yield for three types of cathode material (titanium, SiC, and CuAlMg alloy) was tested. By changing the pulse voltage amplitude and gas pressure, the region of existence of subnanosecond breakdown is identified. It is shown that in discharges with SiC and CuAlMg-alloy cathodes (which have enhanced SEE), the current can increase with a characteristic time as small as τs = 0.4 ns for pulse voltage amplitudes of 5–12 kV. An increase of gas pressure from 15 Torr to 30 Torr substantially decreases the current front growth time, whereas variation of the pulse voltage weakly affects the results.

  15. Aeroelastic Modeling of Offshore Turbines and Support Structures in Hurricane-Prone Regions (Poster)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damiani, R.

    US offshore wind turbines (OWTs) will likely have to contend with hurricanes and the associated loading conditions. Current industry standards do not account for these design load cases (DLCs), thus a new approach is required to guarantee that OWTs achieve an appropriate level of reliability. In this study, a sequentially coupled aero-hydro-servo-elastic modeling technique was used to address two design approaches: 1) the ABS (American Bureau of Shipping) approach; and 2) the Hazard Curve or API (American Petroleum Institute) approach. The former employs IEC partial load factors (PSFs) and 100-yr return-period (RP) metocean events. The latter allows setting PSFs and RP to a prescribed level of system reliability. The 500-yr RP robustness check (appearing in [2] and [3] upcoming editions) is a good indicator of the target reliability for L2 structures. CAE tools such as NREL's FAST and Bentley's SACS (offshore analysis and design software) can be efficiently coupled to simulate system loads under hurricane DLCs. For this task, we augmented the latest FAST version (v. 8) to include tower aerodynamic drag, which cannot be ignored in hurricane DLCs. In this project, a 6 MW turbine was simulated on a typical 4-legged jacket for a mid-Atlantic site. FAST-calculated tower base loads were fed to SACS at the interface level (transition piece); SACS added hydrodynamic and wind loads on the exposed substructure, and calculated mudline overturning moments as well as member and joint utilization. Results show that CAE tools can be effectively used to compare design approaches for the design of OWTs in hurricane regions and to achieve a well-balanced design, where reliability levels and costs are optimized.

  16. A finite element model of rigid body structures actuated by dielectric elastomer actuators

    NASA Astrophysics Data System (ADS)

    Simone, F.; Linnebach, P.; Rizzello, G.; Seelecke, S.

    2018-06-01

    This paper reports on finite element (FE) modeling and simulation of dielectric elastomer actuators (DEAs) coupled with articulated structures. DEAs have proven to represent an effective transduction technology for the realization of large-deformation, low-power-consuming, and fast mechatronic actuators. However, the complex dynamic behavior of the material, characterized by nonlinearities and rate-dependent phenomena, makes it difficult to accurately model and design DEA systems. The problem is further complicated when the DEA is used to actuate articulated structures, which increase both system complexity and the implementation effort of numerical simulation models. In this paper, we present a model-based tool which makes it possible to effectively implement and simulate complex articulated systems actuated by DEAs. A first prototype of a compact switch actuated by DEA membranes is chosen as a reference study to introduce the methodology. The commercially available FE software COMSOL is used for implementing and coupling a physics-based dynamic model of the DEA with the external structure, i.e., the switch. The model is then experimentally calibrated and validated in both quasi-static and dynamic loading conditions. Finally, preliminary results on how to use the simulation tool to optimize the design are presented.

  17. An efficient 3-D eddy-current solver using an independent impedance method for transcranial magnetic stimulation.

    PubMed

    De Geeter, Nele; Crevecoeur, Guillaume; Dupre, Luc

    2011-02-01

    In many important bioelectromagnetic problem settings, eddy-current simulations are required. Examples are the reduction of eddy-current artifacts in magnetic resonance imaging and techniques whereby the eddy currents interact with the biological system, like the alteration of neurophysiology due to transcranial magnetic stimulation (TMS). TMS has become an important tool for the diagnosis and treatment of neurological diseases and psychiatric disorders. A widely applied method for simulating the eddy currents is the impedance method (IM). However, this method has to contend with an ill-conditioned problem and consequently a long convergence time. When dealing with optimal design problems and sensitivity control, the convergence rate becomes even more crucial, since the eddy-current solver needs to be evaluated in an iterative loop. Therefore, we introduce an independent IM (IIM), which improves the conditioning and speeds up the numerical convergence. This paper shows how IIM is based on IM and what its advantages are. Moreover, the method is applied to the efficient simulation of TMS. The proposed IIM achieves superior convergence properties with high time efficiency compared to the traditional IM, and is therefore a useful tool for accurate and fast TMS simulations.

  18. Abstract ID: 242 Simulation of a Fast Timing Micro-Pattern Gaseous Detector for TOF-PET.

    PubMed

    Radogna, Raffaella; Verwilligen, Piet

    2018-01-01

    Micro-Pattern Gas Detectors (MPGDs) are a new generation of gaseous detectors that have been developed thanks to advances in micro-structure technology. The main features of MPGDs are: high rate capability (>50 MHz/cm²); excellent spatial resolution (down to 50 μm); good time resolution (down to 3 ns); reduced radiation length; affordable costs; and possible flexible geometries. A new detector layout has recently been proposed that aims at combining both the high spatial resolution and high rate capability (100 MHz/cm²) of the current state-of-the-art MPGDs with a high time resolution. This new type of MPGD is named the Fast Timing MPGD (FTM) detector [1,2]. The FTM developed for detecting charged particles can potentially reach sub-millimeter spatial resolution and 100 ps time resolution. This contribution introduces a Fast Timing MPGD technology optimized to detect photons as an innovative PET imaging detector concept, and emphasizes the importance of full detector simulation to guide the design of the detector geometry. The design and development of a new FTM, combining excellent time and spatial resolution while exploiting the advantages of a reasonable energy resolution, will be a boost for the design of an affordable TOF-PET scanner with improved image contrast. The use of such an affordable gas detector makes it possible to instrument large areas in a cost-effective way and to increase image contrast for shorter scanning times (lowering the risk for the patient) and better diagnosis of the disease. In this report, a dedicated simulation study is performed to optimize the detector design in the context of the INFN project MPGD-Fatima. Results are obtained with the ANSYS, COMSOL, GARFIELD++ and GEANT4 simulation tools. The final detector layout will be a trade-off between fast timing and good energy resolution. Copyright © 2017.

  19. Fast Simulation of Dynamic Ultrasound Images Using the GPU.

    PubMed

    Storve, Sigurd; Torp, Hans

    2017-10-01

    Simulated ultrasound data is a valuable tool for development and validation of quantitative image analysis methods in echocardiography. Unfortunately, simulation time can become prohibitive for phantoms consisting of a large number of point scatterers. The COLE algorithm by Gao et al. is a fast convolution-based simulator that trades simulation accuracy for improved speed. We present highly efficient parallelized CPU and GPU implementations of the COLE algorithm with an emphasis on dynamic simulations involving moving point scatterers. We argue that it is crucial to minimize the amount of data transfers from the CPU to achieve good performance on the GPU. We achieve this by storing the complete trajectories of the dynamic point scatterers as spline curves in the GPU memory. This leads to good efficiency when simulating sequences consisting of a large number of frames, such as B-mode and tissue Doppler data for a full cardiac cycle. In addition, we propose a phase-based subsample delay technique that efficiently eliminates flickering artifacts seen in B-mode sequences when COLE is used without enough temporal oversampling. To assess the performance, we used a laptop computer and a desktop computer, each equipped with a multicore Intel CPU and an NVIDIA GPU. Running the simulator on a high-end TITAN X GPU, we observed two orders of magnitude speedup compared to the parallel CPU version, three orders of magnitude speedup compared to simulation times reported by Gao et al. in their paper on COLE, and a speedup of 27000 times compared to the multithreaded version of Field II, using numbers reported in a paper by Jensen. We hope that by releasing the simulator as an open-source project we will encourage its use and further development.
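    The key data-transfer idea, storing whole trajectories as spline coefficients rather than shipping per-frame scatterer positions, can be sketched host-side with SciPy; the GPU evaluation step and all sizes below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Host-side sketch of the trajectory trick described above: fit each point
# scatterer's path once, upload only spline coefficients, and evaluate
# positions per transmit event on the device instead of streaming per-frame
# scatterer arrays over the bus. Sizes and motion are illustrative.

n_scatterers, n_keyframes = 100_000, 16
t_key = np.linspace(0.0, 1.0, n_keyframes)            # e.g. one cardiac cycle
rng = np.random.default_rng(0)
paths = rng.normal(size=(n_keyframes, n_scatterers, 3)).cumsum(axis=0) * 1e-4

spline = CubicSpline(t_key, paths, axis=0)            # one fit, all scatterers
positions = spline(0.37)                              # (n_scatterers, 3) at t

print(f"spline coefficients: {spline.c.nbytes / 1e6:.0f} MB uploaded once; "
      f"per-event positions: {positions.nbytes / 1e6:.1f} MB, needed at "
      f"every one of thousands of transmit events per cycle")
```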

  20. Grayscale Optical Correlator Workbench

    NASA Technical Reports Server (NTRS)

    Hanan, Jay; Zhou, Hanying; Chao, Tien-Hsin

    2006-01-01

    Grayscale Optical Correlator Workbench (GOCWB) is a computer program for use in automatic target recognition (ATR). GOCWB performs ATR with an accurate simulation of a hardware grayscale optical correlator (GOC). This simulation is performed to test filters that are created in GOCWB. Thus, GOCWB can be used as a stand-alone ATR software tool or in combination with GOC hardware for building (target training), testing, and optimizing filters. The software is divided into three main parts, denoted filter, testing, and training. The training part is used for assembling training images as input to a filter. The filter part is used for combining training images into a filter and optimizing that filter. The testing part is used for testing new filters and for general simulation of GOC output. The current version of GOCWB relies on MATLAB binaries for performing matrix operations and fast Fourier transforms. Optimization of filters is based on an algorithm, known as OT-MACH, in which variables specified by the user are parameterized and the best filter is selected on the basis of an average result for correct identification of targets in multiple test images.
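    For reference, one common frequency-domain formulation of the OT-MACH filter trades off noise tolerance, average correlation energy, and similarity to the class mean through weights α, β, γ. The sketch below follows that textbook form; GOCWB's exact weighting and normalization may differ.

```python
import numpy as np

# One common frequency-domain formulation of the OT-MACH filter (hedged
# sketch; GOCWB's exact variant may differ): trade off noise tolerance (C),
# average correlation energy (D) and similarity to the class mean (S).

def ot_mach(train_images, alpha=0.1, beta=0.6, gamma=0.3):
    X = np.fft.fft2(train_images, axes=(-2, -1))      # (N, H, W) spectra
    m = X.mean(axis=0)                                 # class-mean spectrum
    C = np.ones(m.shape)                               # white-noise PSD
    D = (np.abs(X) ** 2).mean(axis=0)                  # average power spectrum
    S = (np.abs(X - m) ** 2).mean(axis=0)              # spectral variance
    return np.conj(m) / (alpha * C + beta * D + gamma * S)

def correlate(image, H):
    """Cross-correlation plane of an input image with the filter."""
    return np.abs(np.fft.ifft2(np.fft.fft2(image) * H))

train = np.random.rand(5, 64, 64)                      # stand-in training set
H = ot_mach(train)
print("peak response on a training image:", correlate(train[0], H).max())
```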

  1. Application Of Moldex3D For Thin-wall Injection Moulding Simulation

    NASA Astrophysics Data System (ADS)

    Šercer, Mladen; Godec, Damir; Bujanić, Božo

    2007-05-01

    The benefits associated with decreasing wall thicknesses below their current values are still measurable and desired, even if the final wall thickness is nowhere near those of the aggressive portable electronics industry. It is important to note that gains in wall-section reduction do not always occur without investment, in this case in tooling and machinery upgrades. Equally important is the fact that the productivity and performance benefits of reduced material usage, fast cycle times, and lighter weight can often outweigh most of the added costs. In order to eliminate unnecessary mould trials, minimize the product development cycle, reduce overall costs and improve product quality, polymer engineers use new CAE (Computer Aided Engineering) technology. This technology is a simulation tool which combines proven theories, material properties and process conditions to generate realistic simulations and produce valuable recommendations. Based on these recommendations, an optimal combination of product design, material and process conditions can be identified. In this work, Moldex3D software was used for simulation of injection moulding in order to avoid potential moulding problems. The results gained from the simulation were used for the optimization of an existing product design, for mould development and for optimization of processing parameters, e.g. injection pressure, mould cavity temperature, etc.

  2. Laplace Transform Based Radiative Transfer Studies

    NASA Astrophysics Data System (ADS)

    Hu, Y.; Lin, B.; Ng, T.; Yang, P.; Wiscombe, W.; Herath, J.; Duffy, D.

    2006-12-01

    Multiple scattering is the major uncertainty in the analysis of space-based lidar measurements. Until now, accurate quantitative lidar data analysis has been limited to very thin objects dominated by single scattering, where photons from the laser beam scatter only once off particles in the atmosphere before reaching the receiver and a simple linear relationship between physical property and lidar signal exists. In reality, multiple scattering is always a factor in space-based lidar measurement, and it dominates space-based lidar returns from clouds, dust aerosols, vegetation canopy and phytoplankton. While multiply scattered photons are clear signals, the lack of a fast-enough lidar multiple-scattering computation tool forces us to treat them as unwanted "noise" and use simple multiple-scattering correction schemes to remove them. Such treatment wastes the multiple-scattering signals and may cause orders-of-magnitude errors in retrieved physical properties. Thus the lack of fast and accurate time-dependent radiative transfer tools significantly limits lidar remote sensing capabilities. Analyzing lidar multiple-scattering signals requires fast and accurate time-dependent radiative transfer computations. Currently, multiple scattering is computed with Monte Carlo simulations, which take minutes to hours; they are too slow for interactive satellite data analysis and can only be used to support system/algorithm design and error assessment. We present an innovative physics approach to solve the time-dependent radiative transfer problem. The technique utilizes FPGA-based reconfigurable computing hardware. The approach is as follows. 1. Physics solution: perform a Laplace transform on the time and spatial dimensions and a Fourier transform on the viewing-azimuth dimension, converting the solution of the radiative transfer differential equation into a fast matrix-inversion problem; the majority of the radiative transfer computation then goes into matrix inversion, FFTs and inverse Laplace transforms. 2. Hardware solution: perform the well-defined matrix inversions, FFTs and Laplace transforms on highly parallel, reconfigurable computing hardware. This physics-based computational tool leads to accurate quantitative analysis of space-based lidar signals and improves the data quality of current lidar missions such as CALIPSO. This presentation will introduce the basic idea of this approach, preliminary results based on SRC's FPGA-based Mapstation, and how we may apply it to CALIPSO data analysis.
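
    To make the transform step concrete, here is a schematic one-dimensional version of the reduction sketched above, written in LaTeX; the plane-parallel form and the lumped scattering operator A are simplifying assumptions, not the authors' full formulation:

        \[
          \frac{1}{c}\,\partial_t I(t,\tau) + \partial_\tau I(t,\tau) = \mathbf{A}\,I(t,\tau),
          \qquad
          \tilde I(s,\tau) = \int_0^\infty e^{-st}\, I(t,\tau)\, dt,
        \]
        \[
          \left(\frac{s}{c} + \partial_\tau - \mathbf{A}\right)\tilde I(s,\tau) = \frac{1}{c}\, I(0,\tau).
        \]

    After discretizing in optical depth (and Fourier transforming in azimuth), each value of s requires one matrix inversion, and the time-resolved return follows from a numerical inverse Laplace transform, which is the part that maps naturally onto the FPGA hardware.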

  3. Judicious use of simulation technology in continuing medical education.

    PubMed

    Curtis, Michael T; DiazGranados, Deborah; Feldman, Moshe

    2012-01-01

    Use of simulation-based training is fast becoming a vital source of experiential learning in medical education. Although simulation is a common tool for undergraduate and graduate medical education curricula, the utilization of simulation in continuing medical education (CME) is still an area of growth. As more CME programs turn to simulation to address their training needs, it is important to highlight concepts of simulation technology that can help to optimize learning outcomes. This article discusses the role of fidelity in medical simulation. It provides support from a cross section of simulation training domains for determining the appropriate levels of fidelity, and it offers guidelines for creating an optimal balance of skill practice and realism for efficient training outcomes. After defining fidelity, 3 dimensions of fidelity, drawn from the human factors literature, are discussed in terms of their relevance to medical simulation. From this, research-based guidelines are provided to inform CME providers regarding the use of simulation in CME training. Copyright © 2012 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on CME, Association for Hospital Medical Education.

  4. NREL's Water Power Software Makes a Splash; NREL Highlights, Research & Development, NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2015-06-01

    WEC-Sim is a DOE-funded software tool being jointly developed by NREL and SNL. WEC-Sim computationally models wave energy converters (WECs), devices that generate electricity using the movement of water systems such as oceans and rivers. There is great potential for WECs to generate electricity, but the industry has yet to establish a commercially viable concept. Modeling, design, and simulation tools are essential to the successful development of WECs. Commercial WEC modeling software tools can't be modified by the user. In contrast, WEC-Sim is free, open-source, and flexible enough to be modified to meet the rapidly evolving needs of the WEC industry. By modeling the power generation performance and dynamic loads of WEC designs, WEC-Sim can help support the development of new WEC devices by optimizing designs for cost of energy and competitiveness. By being easily accessible, WEC-Sim promises to help level the playing field in the WEC industry. Importantly, WEC-Sim is also excellent at its job! In 2014, WEC-Sim was used in conjunction with NREL's FAST modeling software to win a hydrodynamic modeling competition. WEC-Sim and FAST performed very well at predicting the motion of a test device in comparison to other modeling tools. The most recent version of WEC-Sim (v1.1) was released in April 2015.

  5. Modeling and Analysis of Actinide Diffusion Behavior in Irradiated Metal Fuel

    NASA Astrophysics Data System (ADS)

    Edelmann, Paul G.

    There have been numerous attempts to model fast reactor fuel behavior in the last 40 years. The US currently does not have a fully reliable tool to simulate the behavior of metal fuels in fast reactors. The experimental database necessary to validate the codes is also very limited. The DOE-sponsored Advanced Fuels Campaign (AFC) has performed various experiments that are ready for analysis. Current metal fuel performance codes are either not available to the AFC or have limitations and deficiencies in predicting AFC fuel performance. A modified version of a new fuel performance code, FEAST-Metal, was employed in this investigation with useful results. This work explores the modeling and analysis of AFC metallic fuels using FEAST-Metal, particularly in the area of constituent actinide diffusion behavior. The FEAST-Metal code calculations for this work were conducted at Los Alamos National Laboratory (LANL) in support of ongoing activities related to sensitivity analysis of fuel performance codes. A sensitivity analysis of FEAST-Metal was completed to identify important macroscopic parameters of interest to modeling and simulation of metallic fuel performance. A modification was made to the FEAST-Metal constituent redistribution model to enable accommodation of newer AFC metal fuel compositions with verified results. Applicability of this modified model for sodium fast reactor metal fuel design is demonstrated.

  6. Limits to high-speed simulations of spiking neural networks using general-purpose computers.

    PubMed

    Zenke, Friedemann; Gerstner, Wulfram

    2014-01-01

    To understand how the central nervous system performs computations using recurrent neuronal circuitry, simulations have become an indispensable tool for theoretical neuroscience. To study neuronal circuits and their ability to self-organize, increasing attention has been directed toward synaptic plasticity. In particular spike-timing-dependent plasticity (STDP) creates specific demands for simulations of spiking neural networks. On the one hand a high temporal resolution is required to capture the millisecond timescale of typical STDP windows. On the other hand network simulations have to evolve over hours up to days, to capture the timescale of long-term plasticity. To do this efficiently, fast simulation speed is the crucial ingredient rather than large neuron numbers. Using different medium-sized network models consisting of several thousands of neurons and off-the-shelf hardware, we compare the simulation speed of the simulators: Brian, NEST and Neuron as well as our own simulator Auryn. Our results show that real-time simulations of different plastic network models are possible in parallel simulations in which numerical precision is not a primary concern. Even so, the speed-up margin of parallelism is limited and boosting simulation speeds beyond one tenth of real-time is difficult. By profiling simulation code we show that the run times of typical plastic network simulations encounter a hard boundary. This limit is partly due to latencies in the inter-process communications and thus cannot be overcome by increased parallelism. Overall, these results show that to study plasticity in medium-sized spiking neural networks, adequate simulation tools are readily available which run efficiently on small clusters. However, to run simulations substantially faster than real-time, special hardware is a prerequisite.

  7. FAST: a framework for simulation and analysis of large-scale protein-silicon biosensor circuits.

    PubMed

    Gu, Ming; Chakrabartty, Shantanu

    2013-08-01

    This paper presents a computer aided design (CAD) framework for verification and reliability analysis of protein-silicon hybrid circuits used in biosensors. It is envisioned that similar to integrated circuit (IC) CAD design tools, the proposed framework will be useful for system level optimization of biosensors and for discovery of new sensing modalities without resorting to laborious fabrication and experimental procedures. The framework referred to as FAST analyzes protein-based circuits by solving inverse problems involving stochastic functional elements that admit non-linear relationships between different circuit variables. In this regard, FAST uses a factor-graph netlist as a user interface and solving the inverse problem entails passing messages/signals between the internal nodes of the netlist. Stochastic analysis techniques like density evolution are used to understand the dynamics of the circuit and estimate the reliability of the solution. As an example, we present a complete design flow using FAST for synthesis, analysis and verification of our previously reported conductometric immunoassay that uses antibody-based circuits to implement forward error-correction (FEC).

  8. Exploring JavaScript and ROOT technologies to create Web-based ATLAS analysis and monitoring tools

    NASA Astrophysics Data System (ADS)

    Sánchez Pineda, A.

    2015-12-01

    We explore the potential of current web applications to create online interfaces that allow visualization, interaction and real cut-based physics analysis and monitoring of processes through a web browser. The project consists of the initial development of web-based and cloud computing services that allow students and researchers to perform fast and very useful cut-based analysis in a browser, reading and using real data and official Monte Carlo simulations stored in ATLAS computing facilities. Several tools are considered: ROOT, JavaScript and HTML. Our study case is the current cut-based H → ZZ → llqq analysis of the ATLAS experiment. Preliminary but satisfactory results have been obtained online.

  9. A Multirate Control Strategy to the Slow Sensors Problem: An Interactive Simulation Tool for Controller Assisted Design

    PubMed Central

    Salt, Julián; Cuenca, Ángel; Palau, Francisco; Dormido, Sebastián

    2014-01-01

    In many control applications, the sensor technology used for the measurement of the controlled variable cannot sustain a tight sampling period. In this context, the assumption of a regular and uniform sampling pattern is questionable. Moreover, if the control action must be updated faster than the output measurement frequency in order to fulfill the desired closed-loop behavior, the solution is usually a multirate controller. Some known aspects require care when a multirate (MR) system is designed: the proper multiplicity between input and output sampling periods, the proper controller structure, the existence of ripples, and other issues. A useful way to save time and achieve good results is to have a computer-assisted design tool, and an interactive simulation tool for MR systems seems to be the right solution. In this paper this kind of simulation application is presented. It allows an easy understanding of the performance degradation or improvement when the multirate sampling-pattern parameters are changed. The tool was developed using Sysquake, a Matlab-like language with fast execution and powerful graphic facilities, and it can be delivered as an executable. The paper also includes a detailed explanation of MR treatment and presents the design of four different MR controllers with a flexible structure that can be adapted to different schemes. The Smith predictor in these MR schemes is also explained, justified and used when time delays appear. Finally, some interesting observations obtained using this interactive tool are included. PMID:24583971

  10. Monte Carlo simulation of a photodisintegration of ³H experiment in Geant4

    NASA Astrophysics Data System (ADS)

    Gray, Isaiah

    2013-10-01

    An upcoming experiment involving photodisintegration of ³H at the High Intensity Gamma-Ray Source facility at Duke University has been simulated in the software package Geant4. CAD models of silicon detectors and wire chambers were imported from Autodesk Inventor using the program FastRad and the Geant4 GDML importer. Sensitive detectors were associated with the appropriate logical volumes in the exported GDML file so that changes in detector geometry will be easily manifested in the simulation. Probability distribution functions for the energy and direction of outgoing protons were generated using numerical tables from previous theory, and energies and directions were sampled from these distributions using a rejection sampling algorithm. The simulation will be a useful tool to optimize detector geometry, estimate background rates, and test data analysis algorithms. This work was supported by the Triangle Universities Nuclear Laboratory REU program at Duke University.
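
    For readers unfamiliar with the sampling step, the sketch below shows rejection sampling from a tabulated (here, toy) angular distribution; the sin²θ shape and its bound are invented for illustration and are not the theory tables used in the experiment.

        import numpy as np

        rng = np.random.default_rng(seed=1)

        def sample_rejection(pdf, x_min, x_max, pdf_max, n):
            """Draw n samples from an (un-normalized) 1-D pdf by rejection."""
            out = []
            while len(out) < n:
                x = rng.uniform(x_min, x_max)          # propose uniformly over the support
                if rng.uniform(0.0, pdf_max) < pdf(x):
                    out.append(x)                      # accept with probability pdf(x)/pdf_max
            return np.array(out)

        # Toy anisotropic distribution for outgoing-proton polar angles:
        angles = sample_rejection(lambda t: np.sin(t) ** 2, 0.0, np.pi, 1.0, 10000)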

  11. Numerical simulations of novel high-power high-brightness diode laser structures

    NASA Astrophysics Data System (ADS)

    Boucke, Konstantin; Rogg, Joseph; Kelemen, Marc T.; Poprawe, Reinhart; Weimann, Guenter

    2001-07-01

    One of the key topics in today's semiconductor laser development is increasing the brightness of high-power diode lasers. Although structures showing increased brightness have been developed, specific drawbacks of these structures lead to a continued strong demand for investigation of alternative concepts. Especially for the investigation of fundamentally novel structures, easy-to-use and fast simulation tools are essential to avoid unnecessary, costly and time-consuming experiments. A diode laser simulation tool based on finite-difference representations of the Helmholtz equation in the 'wide-angle' approximation and the carrier diffusion equation has been developed. An optimized numerical algorithm leads to short execution times of a few seconds per resonator round trip on a standard PC. After each round trip, characteristics such as optical output power, beam profile and beam parameters are calculated. A graphical user interface allows online monitoring of the simulation results. The simulation tool is used to investigate a novel high-power, high-brightness diode laser structure, the so-called 'Z-Structure'. In this structure an increased brightness is achieved by reducing the divergence angle of the beam through angular filtering: the round-trip path of the beam is folded twice using total internal reflection at surfaces defined by a small index step in the semiconductor material, forming a stretched 'Z'. The sharp decrease of the reflectivity for angles of incidence above the angle of total reflection leads to a narrowing of the angular spectrum of the beam. The simulations of the 'Z-Structure' indicate an increase of the beam quality by a factor of five to ten compared to standard broad-area lasers.

  12. Phenomena Important in Molten Salt Reactor Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diamond, David J.; Brown, Nicholas R.; Denning, Richard

    The U.S. Nuclear Regulatory Commission (NRC) is preparing for the future licensing of advanced reactors that will be very different from current light water reactors. Part of the NRC preparation strategy is to identify the simulation tools that will be used for confirmatory safety analysis of normal operation and abnormal situations in those reactors. This report advances that strategy for reactors that will use molten salts (MSRs). This includes reactors with the fuel within the salt as well as reactors using solid fuel. Although both types are discussed in this report, the emphasis is on those reactors with liquid fuel because of the perception that solid-fuel MSRs will be significantly easier to simulate. These liquid-fuel reactors include thermal and fast neutron spectrum alternatives. The specific designs discussed in the report are a subset of many designs being considered in the U.S. and elsewhere, but they are considered the most likely to submit information to the NRC in the near future. The objective herein is to understand the design of proposed molten salt reactors, how they will operate under normal or transient/accident conditions, and what the corresponding modeling needs will be for simulation tools that consider neutronics, heat transfer, fluid dynamics, and material composition changes in the molten salt. These tools will enable the NRC to eventually carry out confirmatory analyses that examine the validity and accuracy of an applicant's calculations and help determine the margin of safety in plant design.

  13. Hybrid MPI/OpenMP Implementation of the ORAC Molecular Dynamics Program for Generalized Ensemble and Fast Switching Alchemical Simulations.

    PubMed

    Procacci, Piero

    2016-06-27

    We present a new release (6.0β) of the ORAC program [Marsili et al. J. Comput. Chem. 2010, 31, 1106-1116] with hybrid OpenMP/MPI (open multiprocessing/message passing interface) multilevel parallelism tailored for generalized ensemble (GE) and fast switching double annihilation (FS-DAM) nonequilibrium technology, aimed at evaluating the binding free energy in drug-receptor systems on high performance computing platforms. The production of the GE or FS-DAM trajectories is handled using a weak-scaling parallel approach on the MPI level only, while a strong-scaling force decomposition scheme is implemented for intranode computations with shared memory access at the OpenMP level. The efficiency, simplicity, and inherent parallel nature of the ORAC implementation of the FS-DAM algorithm position the code as a possible effective tool for second-generation high-throughput virtual screening in drug discovery and design. The code, along with documentation, testing, and ancillary tools, is distributed under the provisions of the General Public License and can be freely downloaded at www.chim.unifi.it/orac.

  14. A Large number of fast cosmological simulations

    NASA Astrophysics Data System (ADS)

    Koda, Jun; Kazin, E.; Blake, C.

    2014-01-01

    Mock galaxy catalogs are essential tools to analyze large-scale structure data. Many independent realizations of mock catalogs are necessary to evaluate the uncertainties in the measurements. We perform 3600 cosmological simulations for the WiggleZ Dark Energy Survey to obtain new, improved Baryon Acoustic Oscillation (BAO) cosmic distance measurements using the density-field "reconstruction" technique. We use 1296^3 particles in a periodic box of 600/h Mpc on a side, which is the minimum requirement set by the survey volume and observed galaxies. In order to perform such a large number of simulations, we developed a parallel code using the COmoving Lagrangian Acceleration (COLA) method, which can simulate cosmological large-scale structure reasonably well with only 10 time steps. Our simulation is more than 100 times faster than conventional N-body simulations; one COLA simulation takes only 15 minutes with 216 computing cores. We have completed the 3600 simulations with a reasonable computation time of 200k core-hours. We also present the results of the revised WiggleZ BAO distance measurement, which are significantly improved by the reconstruction technique.

  15. BlochSolver: A GPU-optimized fast 3D MRI simulator for experimentally compatible pulse sequences

    NASA Astrophysics Data System (ADS)

    Kose, Ryoichi; Kose, Katsumi

    2017-08-01

    A magnetic resonance imaging (MRI) simulator, which reproduces MRI experiments using computers, has been developed using two graphics-processing-unit (GPU) boards (GTX 1080). The MRI simulator was developed to run according to the pulse sequences used in experiments. Experiments and simulations were performed to demonstrate the usefulness of the MRI simulator for three types of pulse sequences, namely, three-dimensional (3D) gradient-echo, 3D radio-frequency-spoiled gradient-echo, and gradient-echo multislice, with practical matrix sizes. The results demonstrated that the calculation speed using two GPU boards was typically about 7 TFLOPS and about 14 times faster than the calculation speed using CPUs (two 18-core Xeons). We also found that MR images acquired by experiment could be reproduced using an appropriate number of subvoxels, and that 3D isotropic and two-dimensional multislice imaging experiments with practical matrix sizes could be simulated using the MRI simulator. Therefore, we concluded that such powerful MRI simulators are expected to become an indispensable tool for MRI research and development.
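
    At the core of any such simulator is a Bloch-equation update per spin (or subvoxel): precession about the local field followed by relaxation. The stepper below is a toy, illustrative version of that kernel only; pulse-sequence handling, gradients, and voxel bookkeeping are omitted, and all parameter values are assumptions.

        import numpy as np

        def bloch_step(M, b_field_T, dt, T1=1.0, T2=0.1, gamma=2.675e8, M0=1.0):
            """Advance one magnetization vector M by dt (toy Bloch stepper)."""
            # Precession: rotate M about the field axis by gamma*|B|*dt (Rodrigues formula).
            b = np.linalg.norm(b_field_T)
            if b > 0.0:
                axis = b_field_T / b
                ang = -gamma * b * dt
                c, s = np.cos(ang), np.sin(ang)
                M = c * M + s * np.cross(axis, M) + (1.0 - c) * axis * (axis @ M)
            # Relaxation toward thermal equilibrium (0, 0, M0).
            e1, e2 = np.exp(-dt / T1), np.exp(-dt / T2)
            return np.array([M[0] * e2, M[1] * e2, M0 + (M[2] - M0) * e1])

        # One microsecond step for transverse magnetization in a 3 T field:
        m = bloch_step(np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 3.0]), dt=1e-6)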

  16. Modularized Parallel Neutron Instrument Simulation on the TeraGrid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Meili; Cobb, John W; Hagen, Mark E

    2007-01-01

    In order to build a bridge between the TeraGrid (TG), a national-scale cyberinfrastructure resource, and neutron science, the Neutron Science TeraGrid Gateway (NSTG) is focused on introducing productive HPC usage to the neutron science community, primarily the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory (ORNL). Monte Carlo simulations are used as a powerful tool for instrument design and optimization at SNS. One of the successful efforts of a collaboration team composed of NSTG HPC experts and SNS instrument scientists is the development of a software facility named PSoNI, Parallelizing Simulations of Neutron Instruments. Parallelizing the traditional serial instrument simulation on TeraGrid resources, PSoNI quickly computes full instrument simulations at sufficient statistical levels for instrument design. With SNS successfully commissioned, by the end of 2007 three of the five commissioned instruments in the SNS target station will be available for initial users. Advanced instrument study, proposal feasibility evaluation, and experiment planning are on the immediate schedule of SNS, which poses further requirements, such as flexibility and high runtime efficiency, on fast instrument simulation. PSoNI has been redesigned to meet the new challenges and a preliminary version has been developed on TeraGrid. This paper explores the motivation and goals of the new design and the improved software structure. Further, it describes the realized new features, as seen in MPI-parallelized McStas running high-resolution design simulations of the SEQUOIA and BSS instruments at SNS. A discussion of future work, targeted at fast simulation for automated experiment adjustment and at comparing models to data in analysis, is also presented.

  17. SU-C-204-01: A Fast Analytical Approach for Prompt Gamma and PET Predictions in a TPS for Proton Range Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroniger, K; Herzog, M; Landry, G

    2015-06-15

    Purpose: We describe and demonstrate a fast analytical tool for prompt-gamma emission prediction based on filter functions applied to the depth dose profile. We present the implementation in a treatment planning system (TPS) of the same algorithm for positron emitter distributions. Methods: The prediction of the desired observable is based on the convolution of filter functions with the depth dose profile. For both prompt gammas and positron emitters, the results of Monte Carlo simulations (MC) are compared with those of the analytical tool. For prompt-gamma emission from inelastic proton-induced reactions, homogeneous and inhomogeneous phantoms alongside patient data are used as irradiation targets of mono-energetic proton pencil beams. The accuracy of the tool is assessed in terms of the shape of the analytically calculated depth profiles and their absolute yields, compared to MC. For the positron emitters, the method is implemented in a research RayStation TPS and compared to MC predictions. Digital phantoms and patient data are used and positron emitter spatial density distributions are analyzed. Results: Calculated prompt-gamma profiles agree with MC within 3% in terms of absolute yield and reproduce the correct shape. Based on an arbitrary reference material and by means of 6 filter functions (one per chemical element), profiles in any other material composed of those elements can be predicted. The TPS-implemented algorithm is accurate enough to enable, via the analytically calculated positron emitter profiles, detection of range differences between the TPS and MC with errors of the order of 1–2 mm. Conclusion: The proposed analytical method predicts prompt-gamma and positron emitter profiles which generally agree with the distributions obtained by a full MC. The implementation of the tool in a TPS shows that reliable profiles can be obtained directly from the dose calculated by the TPS, without the need for a full MC simulation.
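
    In its simplest form, the filter-function idea reduces to a per-element convolution over the depth dose, as in the sketch below; the kernels and elemental weights here are dummy placeholders, not the six fitted filter functions of the abstract.

        import numpy as np

        def emission_profile(depth_dose, filters, fractions):
            """Sum per-element convolutions of the depth dose with its filter.

            depth_dose : 1-D dose-vs-depth array
            filters    : dict element -> 1-D filter kernel (assumed known)
            fractions  : dict element -> elemental weight fraction of the target
            """
            profile = np.zeros_like(depth_dose)
            for elem, kernel in filters.items():
                profile += fractions[elem] * np.convolve(depth_dose, kernel, mode="same")
            return profile

        # Dummy example: water-like target described by H and O filters.
        z = np.linspace(0.0, 150.0, 300)                       # depth, mm
        dose = np.exp(-((z - 120.0) / 8.0) ** 2) + 0.3         # toy Bragg-like profile
        filters = {"H": np.hanning(15), "O": np.hanning(25)}   # invented kernels
        pg = emission_profile(dose, filters, {"H": 0.11, "O": 0.89})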

  18. Image simulation for HardWare In the Loop simulation in EO domain

    NASA Astrophysics Data System (ADS)

    Cathala, Thierry; Latger, Jean

    2015-10-01

    An infrared camera used as a weapon subsystem for automatic guidance is a key component of military carriers such as missiles. The associated image processing, which controls the navigation, needs to be intensively assessed. Experimentation in the real world is very expensive, which is the main reason why hybrid simulation, also called HardWare In the Loop (HWIL), is increasingly required nowadays. In that field, IR projectors are able to cast IR fluxes of photons directly onto the IR camera of a given weapon system, typically a missile seeker head. In the laboratory, the missile is thus stimulated exactly as in the real world, provided a realistic simulation tool can generate the synthetic images displayed by the IR projectors. The key technical challenge is to render the synthetic images at the required frequency. This paper focuses on OKTAL-SE's experience in this domain through its product SE-FAST-HWIL. It presents the methodology and lessons learned from OKTAL-SE, with examples given in the frame of the SE-Workbench. The presentation focuses on trials of real, operationally complex 3D cases. In particular, three important topics that are very sensitive with regard to image-generator performance are detailed: first, 3D sea-surface representation; then particle-system rendering, especially to simulate flares; and finally sensor-effects modelling. Beyond the "projection mode", some information is given on new SE-FAST-HWIL capabilities dedicated to the "injection mode".

  19. Simulating the behavior of patients who leave a public hospital emergency department without being seen by a physician: a cellular automaton and agent-based framework.

    PubMed

    Yousefi, Milad; Yousefi, Moslem; Fogliatto, F S; Ferreira, R P M; Kim, J H

    2018-01-11

    The objective of this study was to develop an agent-based modeling (ABM) framework to simulate the behavior of patients who leave a public hospital emergency department (ED) without being seen (LWBS). In doing so, the study combines computer modeling and cellular automata (CA) techniques to simulate the behavior of patients in an ED. After verifying and validating the model by comparing it with data from a real case study, the significance of four preventive policies, including increasing the number of triage nurses, fast-track treatment, increasing the waiting room capacity and reducing treatment time, was investigated using ordinary least squares regression. After applying the preventive policies in the ED, an average 42.14% reduction in the number of patients who leave without being seen and a 6.05% reduction in the average length of stay (LOS) of patients were reported. This study is the first to apply CA in an ED simulation. Comparing the average LOS before and after applying CA with actual times from the emergency department information system showed an 11% improvement. The simulation results indicated that the most effective approach to reduce the rate of LWBS is fast-track treatment. The ABM approach represents a flexible tool that can be constructed to reflect any given environment. It is also a support system for decision-makers to assess the relative impact of control strategies.
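
    A minimal sketch of the LWBS mechanism such a framework has to capture is given below: each patient agent carries a patience threshold and walks out when the predicted wait exceeds it. The acuity rule and the threshold distribution are invented for illustration and are not the calibrated parameters of the study.

        import random

        random.seed(7)

        class Patient:
            def __init__(self, acuity):
                self.acuity = acuity                           # 1 = critical ... 5 = minor
                self.patience_min = random.gauss(120.0, 40.0)  # tolerated wait (assumed)

            def leaves_without_being_seen(self, expected_wait_min):
                # Assumption: only low-acuity patients consider walking out.
                return self.acuity >= 4 and expected_wait_min > self.patience_min

        waiting_room = [Patient(random.randint(1, 5)) for _ in range(50)]
        lwbs = sum(p.leaves_without_being_seen(150.0) for p in waiting_room)
        print(f"{lwbs} of {len(waiting_room)} simulated patients leave without being seen")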

  20. Fast 2D Fluid-Analytical Simulation of IEDs and Plasma Uniformity in Multi-frequency CCPs

    NASA Astrophysics Data System (ADS)

    Kawamura, E.; Lieberman, M. A.; Graves, D. B.

    2014-10-01

    A fast 2D axisymmetric fluid-analytical model using the finite-element tool COMSOL is interfaced with a 1D particle-in-cell (PIC) code to study ion energy distributions (IEDs) in multi-frequency argon capacitively coupled plasmas (CCPs). A bulk fluid plasma model which solves the time-dependent plasma fluid equations is coupled with an analytical sheath model which solves for the sheath parameters. The fluid-analytical results are used as input to a PIC simulation of the sheath region of the discharge to obtain the IEDs at the wafer electrode. Each fluid-analytical-PIC simulation on a moderate 2.2 GHz CPU workstation with 8 GB of memory took about 15-20 minutes. The 2D multi-frequency fluid-analytical model was compared to 1D PIC simulations of a symmetric parallel-plate discharge, showing good agreement. Fluid-analytical simulations of a 2/60/162 MHz argon CCP with a typical asymmetric reactor geometry were also conducted. The low 2 MHz frequency controlled the sheath width and voltage while the higher frequencies controlled the plasma production. A standing wave was observable at the highest frequency of 162 MHz. Adding 2 MHz power to a 60 MHz discharge, or 162 MHz to a dual-frequency 2 MHz/60 MHz discharge, enhanced the plasma uniformity. This work was supported by the Department of Energy Office of Fusion Energy Science Contract DE-SC000193, and in part by gifts from Lam Research Corporation and Micron Corporation.

  1. Simulating the behavior of patients who leave a public hospital emergency department without being seen by a physician: a cellular automaton and agent-based framework

    PubMed Central

    Yousefi, Milad; Yousefi, Moslem; Fogliatto, F.S.; Ferreira, R.P.M.; Kim, J.H.

    2018-01-01

    The objective of this study was to develop an agent-based modeling (ABM) framework to simulate the behavior of patients who leave a public hospital emergency department (ED) without being seen (LWBS). In doing so, the study combines computer modeling and cellular automata (CA) techniques to simulate the behavior of patients in an ED. After verifying and validating the model by comparing it with data from a real case study, the significance of four preventive policies, including increasing the number of triage nurses, fast-track treatment, increasing the waiting room capacity and reducing treatment time, was investigated using ordinary least squares regression. After applying the preventive policies in the ED, an average 42.14% reduction in the number of patients who leave without being seen and a 6.05% reduction in the average length of stay (LOS) of patients were reported. This study is the first to apply CA in an ED simulation. Comparing the average LOS before and after applying CA with actual times from the emergency department information system showed an 11% improvement. The simulation results indicated that the most effective approach to reduce the rate of LWBS is fast-track treatment. The ABM approach represents a flexible tool that can be constructed to reflect any given environment. It is also a support system for decision-makers to assess the relative impact of control strategies. PMID:29340526

  2. Computer Simulation Of An In-Process Surface Finish Sensor.

    NASA Astrophysics Data System (ADS)

    Rakels, Jan H.

    1987-01-01

    It is generally accepted that optical methods are the most promising for the in-process measurement of surface finish. These methods have the advantages of being non-contacting and of fast data acquisition. Furthermore, these optical instruments can easily be retrofitted on existing machine tools. In the Micro-Engineering Centre at the University of Warwick, an optical sensor has been developed which can measure the rms roughness, slope and wavelength of turned and precision-ground surfaces during machining. The operation of this device is based upon the Kirchhoff-Fresnel diffraction integral. Application of this theory to ideal turned and ground surfaces is straightforward, and indeed the calculated diffraction patterns are in close agreement with patterns produced by the actual optical instrument. Since it is mathematically difficult to introduce real machine-tool behaviour into the diffraction integral, a computer program has been devised which simulates the operation of the optical sensor. The program produces a diffraction pattern as graphical output. Comparison between computer-generated and actual diffraction patterns of the same surfaces shows a high correlation. The main aim of this program is to construct an atlas which maps known machine-tool errors to optical diffraction patterns. This atlas can then be used for machine-tool condition diagnostics. It has been found that optical monitoring is very sensitive to minor defects; therefore machine-tool deterioration can be detected before it becomes detrimental.
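
    Under the Kirchhoff (scalar) approximation, the simulation reduces in its simplest 1-D form to Fourier transforming the phase screen that a surface profile imposes on the reflected beam. The sketch below uses an idealized sawtooth turning profile and assumed optical constants; it is a schematic of the principle, not the Warwick program.

        import numpy as np

        lam = 633e-9                          # HeNe wavelength in metres (assumed)
        k = 2.0 * np.pi / lam
        x = np.linspace(0.0, 1e-3, 4096)      # 1 mm scan along the feed direction
        feed = 20e-6                          # tool feed per revolution (assumed)
        h = 0.2e-6 * np.abs((x / feed) % 1.0 - 0.5)   # idealized turning grooves

        # At normal incidence the reflected field acquires a 2*k*h phase screen;
        # the far-field intensity is (proportional to) its Fourier transform.
        field = np.exp(1j * 2.0 * k * h)
        pattern = np.abs(np.fft.fftshift(np.fft.fft(field))) ** 2
        # Diffraction-order spacing encodes the feed; order broadening and the
        # inter-order background respond to slope errors and rms roughness.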

  3. Parallelization and automatic data distribution for nuclear reactor simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liebrock, L.M.

    1997-07-01

    Detailed attempts at realistic nuclear reactor simulations currently take many times real time to execute on high performance workstations. Even the fastest sequential machine cannot run these simulations fast enough to ensure that the best corrective measure is used during a nuclear accident to prevent a minor malfunction from becoming a major catastrophe. Since sequential computers have nearly reached the speed-of-light barrier, these simulations will have to be run in parallel to make significant improvements in speed. In physical reactor plants, parallelism abounds. Fluids flow, controls change, and reactions occur in parallel, with only adjacent components directly affecting each other. These do not occur in the sequentialized manner, with global instantaneous effects, that is often used in simulators. Development of parallel algorithms that more closely approximate the real-world operation of a reactor may, in addition to speeding up the simulations, actually improve the accuracy and reliability of the predictions generated. Three types of parallel architecture (shared memory machines, distributed memory multicomputers, and distributed networks) are briefly reviewed as targets for parallelization of nuclear reactor simulation. Various parallelization models (loop-based model, shared memory model, functional model, data parallel model, and a combined functional and data parallel model) are discussed along with their advantages and disadvantages for nuclear reactor simulation. A variety of tools are introduced for each of the models. Emphasis is placed on the data parallel model as the primary focus for two-phase flow simulation. Tools to support data parallel programming for multiple component applications and special parallelization considerations are also discussed.

  4. A computationally efficient method for full-core conjugate heat transfer modeling of sodium fast reactors

    DOE PAGES

    Hu, Rui; Yu, Yiqi

    2016-09-08

    For efficient and accurate temperature predictions of sodium fast reactor structures, a 3-D full-core conjugate heat transfer modeling capability has been developed for an advanced system analysis tool, SAM. The hexagonal lattice core is modeled with 1-D parallel channels representing the subassembly flow and 2-D duct walls and inter-assembly gaps. The six sides of the hexagonal duct wall and the near-wall coolant region are modeled separately to account for different temperatures and for heat transfer between the coolant flow and each side of the duct wall. The Jacobian-Free Newton Krylov (JFNK) solution method is applied to solve the fluid and solid fields simultaneously in a fully coupled fashion. The 3-D full-core conjugate heat transfer modeling capability in SAM has been demonstrated on a verification test problem with 7 fuel assemblies in a hexagonal lattice layout. In addition, the SAM simulation results are compared with RANS-based CFD simulations. Very good agreement has been achieved between the results of the two approaches.
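
    The appeal of JFNK is that no Jacobian is ever assembled: the Krylov solver only needs Jacobian-vector products, which are approximated by finite differences of the residual. The sketch below demonstrates the idea with SciPy's newton_krylov on an invented 1-D nonlinear conduction problem; it illustrates the solution strategy only and has nothing to do with SAM's actual residual functions.

        import numpy as np
        from scipy.optimize import newton_krylov

        N = 50
        dx = 1.0 / (N - 1)

        def residual(T):
            """Steady 1-D conduction with conductivity k(T) = 1 + T (invented)."""
            R = np.zeros_like(T)
            R[0] = T[0] - 1.0                      # fixed hot boundary
            R[-1] = T[-1] - 0.0                    # fixed cold boundary
            k_face = 1.0 + 0.5 * (T[1:] + T[:-1])  # conductivity at cell faces
            flux = k_face * (T[1:] - T[:-1]) / dx
            R[1:-1] = (flux[1:] - flux[:-1]) / dx  # divergence of flux in the interior
            return R

        # Newton-Krylov solve: Jacobian-vector products are finite-differenced internally.
        T = newton_krylov(residual, np.linspace(1.0, 0.0, N), f_tol=1e-8)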

  5. Extended friction elucidates the breakdown of fast water transport in graphene oxide membranes

    NASA Astrophysics Data System (ADS)

    Montessori, A.; Amadei, C. A.; Falcucci, G.; Sega, M.; Vecitis, C. D.; Succi, S.

    2016-12-01

    The understanding of water transport in graphene oxide (GO) membranes stands out as a major theoretical problem in graphene research. Notwithstanding the intense efforts devoted to the subject in recent years, a consolidated picture of water transport in GO membranes is yet to emerge. By performing mesoscale simulations of water transport in ultrathin GO membranes, we show that even small amounts of oxygen functionalities can lead to a dramatic drop in GO permeability, in line with experimental findings. The coexistence of bulk viscous dissipation and spatially extended molecular friction results in a major decrease of both slip and bulk flow, thereby suppressing the fast water transport regime observed in pristine graphene nanochannels. Inspection of the flow structure reveals an inverted curvature in the near-wall region, which connects smoothly with a parabolic profile in the bulk region. Such inverted curvature is a distinctive signature of the coexistence between single-particle zero-temperature (noiseless) Langevin friction and collective hydrodynamics. The present mesoscopic model with spatially extended friction may offer a computationally efficient tool for future simulations of water transport in nanomaterials.

  6. Measurement and Simulation of First-Orbit Fast-Ion D-Alpha Emission and the Application to Fast-Ion Loss Detection in the DIII-D Tokamak

    NASA Astrophysics Data System (ADS)

    Bolte, Nathan; Heidbrink, W. W.; Pace, D. C.; van Zeeland, M. A.; Chen, X.

    2015-11-01

    A new fast-ion diagnostic method uses passive emission of D-alpha radiation to determine fast-ion losses quantitatively. The passive fast-ion D-alpha simulation (P-FIDAsim) forward models the Doppler-shifted spectra of first-orbit fast ions that charge exchange with edge neutrals. Simulated spectra are up to 80% correlated with experimental spectra. Calibrated spectra are used to estimate the 2D neutral density profile by inverting simulated spectra. The inferred neutral density shows the expected increase toward each x-point and an average value of 8 × 10^9 cm^-3 at the plasma boundary and 1 × 10^11 cm^-3 near the wall. Measuring and simulating first-orbit spectra effectively "calibrates" the system, allowing for the quantification of more general fast-ion losses. Sawtooth crashes are estimated to eject 1.2% of the fast-ion inventory, in good agreement with a 1.7% loss estimate made by TRANSP. Sightlines sensitive to passing ions observe larger sawtooth losses than sightlines sensitive to trapped ions. Supported by US DOE under SC-G903402, DE-FC02-04ER54698.

  7. Benchmark Simulations of the Thermal-Hydraulic Responses during EBR-II Inherent Safety Tests using SAM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Rui; Sumner, Tyler S.

    2016-04-17

    An advanced system analysis tool, SAM, is being developed for fast-running, improved-fidelity, whole-plant transient analyses at Argonne National Laboratory under DOE-NE's Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. As an important part of code development, companion validation activities are being conducted to ensure the performance and validity of the SAM code. This paper presents benchmark simulations of two EBR-II tests, SHRT-45R and BOP-302R, whose data are available through the support of DOE-NE's Advanced Reactor Technology (ART) program. The code predictions of major primary coolant system parameters are compared with the test results. Additionally, the SAS4A/SASSYS-1 code simulation results are included for a code-to-code comparison.

  8. Hybrid deterministic/stochastic simulation of complex biochemical systems.

    PubMed

    Lecca, Paola; Bagagiolo, Fabio; Scarpa, Marina

    2017-11-21

    In a biological cell, cellular functions and the genetic regulatory apparatus are implemented and controlled by complex networks of chemical reactions involving genes, proteins, and enzymes. Accurate computational models are indispensable means for understanding the mechanisms behind the evolution of a complex system, not always explorable with wet-lab experiments. To serve their purpose, computational models should be able to describe and simulate the complexity of a biological system in many of its aspects. Moreover, they should be implemented by efficient algorithms requiring the shortest possible execution time, to avoid excessively lengthening the time between data analysis and any subsequent experiment. Besides the features of their topological structure, the complexity of biological networks also refers to their dynamics, which are often non-linear and stiff. The stiffness is due to the presence of molecular species whose abundances fluctuate by many orders of magnitude. A fully stochastic simulation of a stiff system is computationally expensive. On the other hand, continuous models are less costly, but they fail to capture the stochastic behaviour of small populations of molecular species. We introduce a new efficient hybrid stochastic-deterministic computational model and the software tool MoBioS (MOlecular Biology Simulator) implementing it. The mathematical model of MoBioS uses continuous differential equations to describe the deterministic reactions and a Gillespie-like algorithm to describe the stochastic ones. Unlike the majority of current hybrid methods, the MoBioS algorithm divides the reaction set into fast reactions, moderate reactions, and slow reactions and implements a hysteresis switching between the stochastic model and the deterministic model. Fast reactions are approximated as continuous-deterministic processes and modelled by deterministic rate equations. Moderate reactions are those whose reaction waiting time is greater than the fast-reaction waiting time but smaller than the slow-reaction waiting time. A moderate reaction is approximated as a stochastic (deterministic) process if it was classified as a stochastic (deterministic) process at the time at which it crossed the threshold of low (high) waiting time. A Gillespie First Reaction Method is implemented to select and execute the slow reactions. The performance of MoBioS was tested on a typical example of hybrid dynamics: DNA transcription regulation. The simulated dynamic profile of the reagents' abundances and the estimate of the error introduced by the fully deterministic approach were used to evaluate the consistency of the computational model and of the software tool.
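
    The slow-reaction step mentioned above, Gillespie's First Reaction Method, draws a tentative firing time for every reaction and executes the earliest; the sketch below shows that step on an invented two-reaction system, which is not the MoBioS model.

        import numpy as np

        rng = np.random.default_rng(0)

        def first_reaction_step(x, stoich, propensities):
            """One First Reaction Method step: fire the reaction with the
            smallest exponentially distributed tentative time."""
            a = propensities(x)                               # propensity a_j(x) per reaction
            with np.errstate(divide="ignore"):
                taus = rng.exponential(1.0, size=a.size) / a  # tau_j ~ Exp(a_j); inf if a_j = 0
            j = int(np.argmin(taus))
            return x + stoich[j], taus[j]

        # Toy system: A -> B with rate k1*A, and B -> A with rate k2*B.
        stoich = np.array([[-1, +1],
                           [+1, -1]])
        k1, k2 = 1.0, 0.5
        prop = lambda x: np.array([k1 * x[0], k2 * x[1]], dtype=float)

        x, t = np.array([100, 0]), 0.0
        while t < 1.0:
            x, dt = first_reaction_step(x, stoich, prop)
            t += dt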

  9. The String Stability of a Trajectory-Based Interval Management Algorithm in the Midterm Airspace

    NASA Technical Reports Server (NTRS)

    Swieringa, Kurt A.

    2015-01-01

    NASA's first Air Traffic Management (ATM) Technology Demonstration (ATD-1) was created to facilitate the transition of mature ATM technologies from the laboratory to operational use. The technologies selected for demonstration are the Traffic Management Advisor with Terminal Metering (TMA-TM), which provides precise time-based scheduling in the terminal airspace; Controller Managed Spacing (CMS), which provides terminal controllers with decision support tools enabling precise schedule conformance; and Interval Management (IM), which consists of flight deck automation that enables aircraft to achieve or maintain a precise spacing interval behind a target aircraft. As the percentage of IM equipped aircraft increases, controllers may provide IM clearances to sequences, or strings, of IM-equipped aircraft. It is important for these strings to maintain stable performance. This paper describes an analytic analysis of the string stability of the latest version of NASA's IM algorithm and a fast-time simulation designed to characterize the string performance of the IM algorithm. The analytic analysis showed that the spacing algorithm has stable poles, indicating that a spacing error perturbation will be reduced as a function of string position. The fast-time simulation investigated IM operations at two airports using constraints associated with the midterm airspace, including limited information of the target aircraft's intended speed profile and limited information of the wind forecast on the target aircraft's route. The results of the fast-time simulation demonstrated that the performance of the spacing algorithm is acceptable for strings of moderate length; however, there is some degradation in IM performance as a function of string position.

  10. Simulating polar bear energetics during a seasonal fast using a mechanistic model.

    PubMed

    Mathewson, Paul D; Porter, Warren P

    2013-01-01

    In this study we tested the ability of a mechanistic model (Niche Mapper™) to accurately model adult, non-denning polar bear (Ursus maritimus) energetics while fasting during the ice-free season in the western Hudson Bay. The model uses a steady state heat balance approach, which calculates the metabolic rate that will allow an animal to maintain its core temperature in its particular microclimate conditions. Predicted weight loss for a 120 day fast typical of the 1990s was comparable to empirical studies of the population, and the model was able to reach a heat balance at the target metabolic rate for the entire fast, supporting use of the model to explore the impacts of climate change on polar bears. Niche Mapper predicted that all but the poorest condition bears would survive a 120 day fast under current climate conditions. When the fast extended to 180 days, Niche Mapper predicted mortality of up to 18% for males. Our results illustrate how environmental conditions, variation in animal properties, and thermoregulation processes may impact survival during extended fasts because polar bears were predicted to require additional energetic expenditure for thermoregulation during a 180 day fast. A uniform 3°C temperature increase reduced male mortality during a 180 day fast from 18% to 15%. Niche Mapper explicitly links an animal's energetics to environmental conditions and thus can be a valuable tool to help inform predictions of climate-related population changes. Since Niche Mapper is a generic model, it can make energetic predictions for other species threatened by climate change.
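
    The steady-state idea can be shown in a few lines: find the metabolic rate at which heat production balances conduction through the fur and convection to the air. Every coefficient below (surface area, fur conductivity and depth, convection coefficient) is a crude assumption for illustration, not a Niche Mapper value, and radiation and evaporation are ignored.

        from scipy.optimize import brentq

        T_CORE = 37.0      # core temperature, deg C
        AREA = 2.5         # effective body surface area, m^2 (assumed)
        K_FUR = 0.04       # fur thermal conductivity, W/(m K) (assumed)
        D_FUR = 0.06       # fur depth, m (assumed)
        H_CONV = 10.0      # convection coefficient, W/(m^2 K) (assumed)

        def heat_imbalance(met_watts, t_air):
            # Skin temperature after conducting met_watts through the fur layer,
            # then compare production against convective loss to the air.
            t_skin = T_CORE - met_watts * D_FUR / (K_FUR * AREA)
            return met_watts - H_CONV * AREA * (t_skin - t_air)

        # Metabolic rate needed to hold a 37 C core in -10 C air:
        met = brentq(lambda m: heat_imbalance(m, t_air=-10.0), 1.0, 2000.0)
        print(f"required metabolic rate: {met:.0f} W")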

  11. Simulating Polar Bear Energetics during a Seasonal Fast Using a Mechanistic Model

    PubMed Central

    Mathewson, Paul D.; Porter, Warren P.

    2013-01-01

    In this study we tested the ability of a mechanistic model (Niche Mapper™) to accurately model adult, non-denning polar bear (Ursus maritimus) energetics while fasting during the ice-free season in the western Hudson Bay. The model uses a steady state heat balance approach, which calculates the metabolic rate that will allow an animal to maintain its core temperature in its particular microclimate conditions. Predicted weight loss for a 120 day fast typical of the 1990s was comparable to empirical studies of the population, and the model was able to reach a heat balance at the target metabolic rate for the entire fast, supporting use of the model to explore the impacts of climate change on polar bears. Niche Mapper predicted that all but the poorest condition bears would survive a 120 day fast under current climate conditions. When the fast extended to 180 days, Niche Mapper predicted mortality of up to 18% for males. Our results illustrate how environmental conditions, variation in animal properties, and thermoregulation processes may impact survival during extended fasts because polar bears were predicted to require additional energetic expenditure for thermoregulation during a 180 day fast. A uniform 3°C temperature increase reduced male mortality during a 180 day fast from 18% to 15%. Niche Mapper explicitly links an animal’s energetics to environmental conditions and thus can be a valuable tool to help inform predictions of climate-related population changes. Since Niche Mapper is a generic model, it can make energetic predictions for other species threatened by climate change. PMID:24019883

  12. Freud: a software suite for high-throughput simulation analysis

    NASA Astrophysics Data System (ADS)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
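
    As a flavor of the simplest analysis named above, here is a brute-force radial distribution function for a periodic cubic box; Freud itself does this with parallel C++ kernels and neighbor lists, so this O(N^2) numpy version is purely illustrative.

        import numpy as np

        def rdf(positions, box_length, r_max, n_bins=100):
            """g(r) for N points in a cubic periodic box (brute force, small N only)."""
            N = len(positions)
            d = positions[:, None, :] - positions[None, :, :]
            d -= box_length * np.round(d / box_length)          # minimum-image convention
            r = np.linalg.norm(d, axis=-1)[np.triu_indices(N, k=1)]
            hist, edges = np.histogram(r[r < r_max], bins=n_bins, range=(0.0, r_max))
            rho = N / box_length ** 3                           # number density
            shells = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
            return edges[:-1], hist / (shells * rho * N / 2.0)  # normalize to ideal gas

        # Example: g(r) of 500 uniformly random points should hover around 1.
        pts = np.random.default_rng(3).uniform(0.0, 10.0, size=(500, 3))
        r, g = rdf(pts, box_length=10.0, r_max=5.0)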

  13. SOAP: A Tool for the Fast Computation of Photometry and Radial Velocity Induced by Stellar Spots

    NASA Astrophysics Data System (ADS)

    Boisse, I.; Bonfils, X.; Santos, N. C.; Figueira, P.

    2013-04-01

    Dark spots and bright plages are present on the surface of dwarf stars from spectral types F to M, even in their low-activity phase (like the Sun). Their appearance and disappearance on the stellar photosphere, combined with the stellar rotation, may lead to errors and uncertainties in the characterization of planets in both radial velocity (RV) and photometry. Spot Oscillation and Planet (SOAP) is a tool offered to the community that enables users to simulate spots and plages on rotating stars and to compute their impact on RV and photometric measurements. This tool will help address the challenges that knowledge of stellar activity poses for the next decade: detecting telluric planets in the habitable zones of their stars (from G to M dwarfs), understanding activity at the low-mass end of the M dwarfs (on which future projects, like SPIRou or CARMENES, will focus), overcoming limitations to the characterization of exoplanetary atmospheres (from the ground or with Spitzer, JWST), and searching for planets around young stars. These can be simulated with SOAP in order to search for indices and corrections for the effect of activity.
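
    The core of a spot simulator of this kind can be caricatured in a few lines: tile the visible disk, assign each tile its rotational line-of-sight velocity, darken the tiles inside the spot, and read off the flux-weighted apparent velocity. Limb darkening, plages and line-profile effects are ignored, and every number below is an illustrative assumption, not a SOAP parameter.

        import numpy as np

        n, v_eq = 400, 2000.0                       # grid size; equatorial velocity, m/s
        y, z = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
        on_disk = y ** 2 + z ** 2 <= 1.0            # projected stellar disk
        v_los = v_eq * y                            # solid-body rotation, line-of-sight part
        flux = np.where(on_disk, 1.0, 0.0)

        # Fully dark circular spot at (y, z) = (0.3, 0.2) with radius 0.1 (assumed).
        in_spot = (y - 0.3) ** 2 + (z - 0.2) ** 2 <= 0.1 ** 2
        flux_spotted = np.where(in_spot, 0.0, flux)

        rv = (v_los * flux_spotted).sum() / flux_spotted.sum()
        print(f"apparent RV perturbation: {rv:.2f} m/s")   # negative: the red side is dimmed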

  14. A Cost Framework for the Economic Feasibility of Wide-Scale Biochar Production

    NASA Astrophysics Data System (ADS)

    Pourhashem, G.; Masiello, C. A.; Medlock, K. B., III

    2017-12-01

    Biochar is a product of biomass pyrolysis, one of the main thermal pathways for producing biofuels. In addition to sequestering carbon, biochar's soil application supports sustainable agriculture by enhancing the soil's structure and ecological functions, as well as lowering NO release from fertilized soils. However, wide-scale biochar land amendment has been limited in part by its high cost. To examine biochar's cost dynamics, we develop a comprehensive framework for a representative biochar production facility and identify the system inputs that are the key drivers of cost and profitability. We assess the production cost of fast- and slow-pyrolysis biochar considering a range of parameters, e.g., biomass type, process design and scale. We analyzed techno-economic cost data for producing biochar using simulated data from the academic literature and active-producer data collected under a confidentiality agreement. The combined approach enhanced the depth of the dataset and allowed for a reasonable check on published simulated data. Fast and slow pyrolysis have different biofuel and biochar yields and profits. A slow-pyrolysis facility recovers its expenses mainly through biochar sales, while a fast-pyrolysis facility generates its primary revenue through biofuel sales, largely considering biochar a byproduct. Unlike fast pyrolysis, which has received most attention in techno-economic studies, publicly available techno-economic data on slow pyrolysis are sparse. This limits the ability to run a thorough cost-benefit analysis to inform the feasibility of wider adoption of biochar for capturing its carbon sequestration and broader environmental benefits. Our model allows for consideration of various market-based policy instruments and can be used as an analytical decision-making tool for investors and policy makers to estimate the cost and optimum facility size. This dynamic framework can also be adapted to account for the availability of new data as technology improves and the industry evolves. Our study helps identify pyrolysis pathways that are most economically suitable for scaling up biochar production for ecosystem carbon storage and environmental improvement. Finally, we discuss the market-development and policy strategies that can make biochar an attractive environmental mitigation tool for decision makers.

  15. NONMEMory: a run management tool for NONMEM.

    PubMed

    Wilkins, Justin J

    2005-06-01

    NONMEM is an extremely powerful tool for nonlinear mixed-effect modelling and simulation of pharmacokinetic and pharmacodynamic data. However, it is a console-based application whose output does not lend itself to rapid interpretation or efficient management. NONMEMory has been created to be a comprehensive project manager for NONMEM, providing detailed summary, comparison and overview of the runs comprising a given project, including the display of output data, simple post-run processing, fast diagnostic plots and run output management, complementary to other available modelling aids. Analysis time ought not to be spent on trivial tasks, and NONMEMory's role is to eliminate these as far as possible by increasing the efficiency of the modelling process. NONMEMory is freely available from http://www.uct.ac.za/depts/pha/nonmemory.php.

  16. Design of a final approach spacing tool for TRACON air traffic control

    NASA Technical Reports Server (NTRS)

    Davis, Thomas J.; Erzberger, Heinz; Bergeron, Hugh

    1989-01-01

    This paper describes an automation tool that assists air traffic controllers in the Terminal Radar Approach Control (TRACON) Facilities in providing safe and efficient sequencing and spacing of arrival traffic. The automation tool, referred to as the Final Approach Spacing Tool (FAST), allows the controller to interactively choose various levels of automation and advisory information ranging from predicted time errors to speed and heading advisories for controlling time error. FAST also uses a timeline to display current scheduling and sequencing information for all aircraft in the TRACON airspace. FAST combines accurate predictive algorithms and state-of-the-art mouse and graphical interface technology to present advisory information to the controller. Furthermore, FAST exchanges various types of traffic information and communicates with automation tools being developed for the Air Route Traffic Control Center. Thus it is part of an integrated traffic management system for arrival traffic at major terminal areas.
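
    As a rough illustration of how a predicted time error can be converted into a speed advisory, consider the Python sketch below. It is purely illustrative and not the FAST algorithm: it assumes the aircraft can absorb the whole error over the remaining distance to the merge point, and it rounds to a 10-knot advisory.

      # Illustrative only; not the FAST algorithms.
      def speed_advisory(current_kt, time_error_s, dist_nm):
          """Positive time_error_s = predicted late at the merge point."""
          eta_h = dist_nm / current_kt                  # hours at current speed
          needed = dist_nm / (eta_h - time_error_s / 3600.0)
          return round(needed / 10.0) * 10              # round to a 10-kt advisory

      print(speed_advisory(210.0, time_error_s=45.0, dist_nm=25.0))  # -> 230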

  17. Computation of Alfvén eigenmode stability and saturation through a reduced fast ion transport model in the TRANSP tokamak transport code

    NASA Astrophysics Data System (ADS)

    Podestà, M.; Gorelenkova, M.; Gorelenkov, N. N.; White, R. B.

    2017-09-01

    Alfvénic instabilities (AEs) are well known as a potential cause of enhanced fast ion transport in fusion devices. Given a specific plasma scenario, quantitative predictions of (i) expected unstable AE spectrum and (ii) resulting fast ion transport are required to prevent or mitigate the AE-induced degradation in fusion performance. Reduced models are becoming an attractive tool to analyze existing scenarios as well as for scenario prediction in time-dependent simulations. In this work, a neutral beam heated NSTX discharge is used as reference to illustrate the potential of a reduced fast ion transport model, known as kick model, that has been recently implemented for interpretive and predictive analysis within the framework of the time-dependent tokamak transport code TRANSP. Predictive capabilities for AE stability and saturation amplitude are first assessed, based on given thermal plasma profiles only. Predictions are then compared to experimental results, and the interpretive capabilities of the model further discussed. Overall, the reduced model captures the main properties of the instabilities and associated effects on the fast ion population. Additional information from the actual experiment enables further tuning of the model’s parameters to achieve a close match with measurements.
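
    The essence of a kick-style reduced model can be sketched as a stochastic update of fast-ion markers. The Python fragment below is an idealization, not the TRANSP implementation: the kick probability distributions are replaced by Gaussians whose widths scale with a placeholder mode amplitude.

      import numpy as np

      # Idealized kick step: Gaussian increments in energy E (keV) and normalized
      # canonical toroidal momentum P, with widths scaled by the mode amplitude.
      rng = np.random.default_rng(0)
      n = 10_000
      E = rng.uniform(20.0, 80.0, n)     # illustrative beam-ion energies
      P = rng.uniform(-1.0, 1.0, n)

      def kick_step(E, P, amplitude, dE0=0.5, dP0=0.01):
          dE = rng.normal(0.0, dE0 * amplitude, E.size)
          dP = rng.normal(0.0, dP0 * amplitude, P.size)
          return np.maximum(E + dE, 0.0), P + dP

      for t in range(100):                  # evolve over 100 transport steps
          A = 1.0 + 0.5 * np.sin(0.1 * t)   # placeholder saturated mode amplitude
          E, P = kick_step(E, P, A)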

  18. Computation of Alfvén eigenmode stability and saturation through a reduced fast ion transport model in the TRANSP tokamak transport code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Podestà, M.; Gorelenkova, M.; Gorelenkov, N. N.

    Alfvénic instabilities (AEs) are well known as a potential cause of enhanced fast ion transport in fusion devices. Given a specific plasma scenario, quantitative predictions of (i) expected unstable AE spectrum and (ii) resulting fast ion transport are required to prevent or mitigate the AE-induced degradation in fusion performance. Reduced models are becoming an attractive tool to analyze existing scenarios as well as for scenario prediction in time-dependent simulations. Here, in this work, a neutral beam heated NSTX discharge is used as reference to illustrate the potential of a reduced fast ion transport model, known as kick model, that has been recently implemented for interpretive and predictive analysis within the framework of the time-dependent tokamak transport code TRANSP. Predictive capabilities for AE stability and saturation amplitude are first assessed, based on given thermal plasma profiles only. Predictions are then compared to experimental results, and the interpretive capabilities of the model further discussed. Overall, the reduced model captures the main properties of the instabilities and associated effects on the fast ion population. Finally, additional information from the actual experiment enables further tuning of the model's parameters to achieve a close match with measurements.

  19. Computation of Alfvén eigenmode stability and saturation through a reduced fast ion transport model in the TRANSP tokamak transport code

    DOE PAGES

    Podestà, M.; Gorelenkova, M.; Gorelenkov, N. N.; ...

    2017-07-20

    Alfvénic instabilities (AEs) are well known as a potential cause of enhanced fast ion transport in fusion devices. Given a specific plasma scenario, quantitative predictions of (i) expected unstable AE spectrum and (ii) resulting fast ion transport are required to prevent or mitigate the AE-induced degradation in fusion performance. Reduced models are becoming an attractive tool to analyze existing scenarios as well as for scenario prediction in time-dependent simulations. Here, in this work, a neutral beam heated NSTX discharge is used as reference to illustrate the potential of a reduced fast ion transport model, known as kick model, that has been recently implemented for interpretive and predictive analysis within the framework of the time-dependent tokamak transport code TRANSP. Predictive capabilities for AE stability and saturation amplitude are first assessed, based on given thermal plasma profiles only. Predictions are then compared to experimental results, and the interpretive capabilities of the model further discussed. Overall, the reduced model captures the main properties of the instabilities and associated effects on the fast ion population. Finally, additional information from the actual experiment enables further tuning of the model's parameters to achieve a close match with measurements.

  20. Magnetohydrodynamic modes analysis and control of Fusion Advanced Studies Torus high-current scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Villone, F.; Mastrostefano, S.; Calabrò, G.

    2014-08-15

    One of the main goals of FAST (Fusion Advanced Studies Torus) is to be a flexible experiment capable of testing tools and scenarios for safe and reliable tokamak operation, in order to support ITER and help the final DEMO design. In particular, in this paper we focus on operation close to a possible stability border related to low-q operation. To this purpose, a new FAST scenario has been designed at I_p = 10 MA, B_T = 8.5 T, q_95 ≈ 2.3. Transport simulations, carried out using the JETTO code and the first-principles transport model GLF23, indicate that, under these conditions, FAST could achieve an equivalent Q ≈ 3.5. FAST will be equipped with a set of internal active coils for feedback control, which will produce magnetic perturbations with toroidal mode number n = 1 or n = 2. Magnetohydrodynamic (MHD) mode analysis and feedback control simulations performed with the codes MARS, MARS-F, and CarMa (both assuming the presence of a perfectly conducting wall and using the exact 3D resistive wall structure) show that the FAST conducting structures can stabilize n = 1 ideal modes. This therefore leaves room for active mitigation of the resistive mode (down to a characteristic time of 1 ms) for safety purposes, i.e., to avoid dangerous MHD-driven plasma disruptions when working close to the machine limits, with magnetic and kinetic energy densities not far from reactor values.

  1. The development of the Final Approach Spacing Tool (FAST): A cooperative controller-engineer design approach

    NASA Technical Reports Server (NTRS)

    Lee, Katharine K.; Davis, Thomas J.

    1995-01-01

    Historically, the development of advanced automation for air traffic control in the United States has excluded the input of the air traffic controller until the end of the development process. In contrast, the development of the Final Approach Spacing Tool (FAST), for the terminal area controller, has incorporated the end user in early, iterative testing. This paper describes a cooperative effort between the controller and the developer to create a tool that accommodates the complexity of the air traffic controller's job. This approach to software development has enhanced the usability of FAST and has helped smooth the introduction of FAST into the operational environment.

  2. Electromagnetic Launch Vehicle Fairing and Acoustic Blanket Model of Received Power Using FEKO

    NASA Technical Reports Server (NTRS)

    Trout, Dawn H.; Stanley, James E.; Wahid, Parveen F.

    2011-01-01

    Evaluating the impact of radio frequency transmission in vehicle fairings is important to electromagnetically sensitive spacecraft. This study employs the multilevel fast multipole method (MLFMM) from a commercial electromagnetic tool, FEKO, to model the fairing electromagnetic environment in the presence of an internal transmitter with improved accuracy over industry-applied techniques. This fairing model includes material properties representative of the acoustic blanketing commonly used in vehicles. Equivalent surface material models within FEKO were successfully applied to simulate the test case. Finally, a simplified model is presented using Nicholson-Ross-Weir derived blanket material properties. These properties are implemented with the coated-metal option to reduce the model to one layer within the accuracy of the original three-layer simulation.
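
    For reference, the Nicholson-Ross-Weir extraction itself is compact. The Python sketch below implements the textbook free-space/TEM form (cutoff wavelength taken to infinity, principal log branch only, so it applies to electrically thin samples); the sample numbers are illustrative, not measured blanket data.

      import numpy as np

      def nrw(S11, S21, L, lam0):
          """Nicholson-Ross-Weir extraction, free-space/TEM case.

          S11, S21: complex S-parameters of a slab of thickness L (m) at
          free-space wavelength lam0 (m). Principal log branch only, so the
          sample must be thinner than a wavelength in the material.
          """
          K = (S11**2 - S21**2 + 1.0) / (2.0 * S11)
          G = K + np.sqrt(K**2 - 1.0)
          if abs(G) > 1.0:
              G = K - np.sqrt(K**2 - 1.0)          # physical root, |Gamma| <= 1
          T = (S11 + S21 - G) / (1.0 - (S11 + S21) * G)
          inv_lam = np.log(1.0 / T) / (2.0j * np.pi * L)   # = 1 / Lambda
          mu_r = lam0 * inv_lam * (1.0 + G) / (1.0 - G)
          eps_r = lam0**2 * inv_lam**2 / mu_r
          return eps_r, mu_r

      # illustrative complex S-parameters only, not measured blanket data:
      print(nrw(0.35 + 0.2j, 0.45 - 0.3j, L=0.01, lam0=0.12))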

  3. A Fast Visible-Infrared Imaging Radiometer Suite Simulator for Cloudy Atmospheres

    NASA Technical Reports Server (NTRS)

    Liu, Chao; Yang, Ping; Nasiri, Shaima L.; Platnick, Steven; Meyer, Kerry G.; Wang, Chen Xi; Ding, Shouguo

    2015-01-01

    A fast instrument simulator is developed to simulate observations of cloudy atmospheres made by the Visible Infrared Imaging Radiometer Suite (VIIRS). The correlated k-distribution (CKD) technique is used to compute the transmissivity of absorbing atmospheric gases. The bulk scattering properties of ice clouds used in this study are based on the ice model used for the MODIS Collection 6 ice cloud products. Two fast radiative transfer models based on pre-computed ice cloud look-up tables are used for the VIIRS solar and infrared channels. The accuracy and efficiency of the fast simulator are quantified by comparison with a combination of the rigorous line-by-line (LBLRTM) and discrete ordinate radiative transfer (DISORT) models. Relative errors are less than 2% for simulated TOA reflectances in the solar channels, and the brightness temperature differences for the infrared channels are less than 0.2 K. The simulator is over three orders of magnitude faster than the benchmark LBLRTM+DISORT model. Furthermore, the cloudy-atmosphere reflectances and brightness temperatures from the fast VIIRS simulator compare favorably with those from VIIRS observations.
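
    The speed of such simulators comes largely from replacing full radiative transfer with table interpolation. The Python sketch below shows the look-up-table pattern in miniature; the grid and table values are synthetic placeholders, not the VIIRS simulator's tables.

      import numpy as np
      from scipy.interpolate import RegularGridInterpolator

      # Pre-compute reflectance on a (tau, r_eff) grid, interpolate at run time.
      tau = np.linspace(0.1, 50.0, 20)       # cloud optical thickness grid
      reff = np.linspace(5.0, 60.0, 12)      # effective radius grid (micron)
      T, R = np.meshgrid(tau, reff, indexing="ij")
      table = 1.0 - np.exp(-0.1 * T) * (1.0 - 0.002 * R)   # fake reflectances

      lut = RegularGridInterpolator((tau, reff), table)
      print(lut([[10.0, 20.0], [3.0, 35.0]]))   # fast per-pixel evaluation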

  4. ProbFAST: Probabilistic functional analysis system tool.

    PubMed

    Silva, Israel T; Vêncio, Ricardo Z N; Oliveira, Thiago Y K; Molfetta, Greice A; Silva, Wilson A

    2010-03-30

    The post-genomic era has brought new challenges regarding the understanding of the organization and function of the human genome. Many of these challenges are centered on the meaning of differential gene regulation under distinct biological conditions and can be addressed by analyzing the Multiple Differential Expression (MDE) of genes associated with normal and abnormal biological processes. Currently, MDE analyses are limited to the usual methods of differential expression initially designed for paired analysis. We propose a web platform named ProbFAST for MDE analysis, which uses Bayesian inference to identify key genes that are intuitively prioritized by means of probabilities. A simulation study revealed that our method performs better than other approaches, and when applied to public expression data, we demonstrated its flexibility in obtaining relevant genes biologically associated with normal and abnormal biological processes. ProbFAST is a freely accessible web-based application that enables MDE analysis on a global scale. It offers an efficient methodological approach for MDE analysis of sets of genes that are turned on and off in relation to functional information during the evolution of a tumor or tissue differentiation. The ProbFAST server can be accessed at http://gdm.fmrp.usp.br/probfast.

  5. ProbFAST: Probabilistic Functional Analysis System Tool

    PubMed Central

    2010-01-01

    Background The post-genomic era has brought new challenges regarding the understanding of the organization and function of the human genome. Many of these challenges are centered on the meaning of differential gene regulation under distinct biological conditions and can be addressed by analyzing the Multiple Differential Expression (MDE) of genes associated with normal and abnormal biological processes. Currently, MDE analyses are limited to the usual methods of differential expression initially designed for paired analysis. Results We propose a web platform named ProbFAST for MDE analysis, which uses Bayesian inference to identify key genes that are intuitively prioritized by means of probabilities. A simulation study revealed that our method performs better than other approaches, and when applied to public expression data, we demonstrated its flexibility in obtaining relevant genes biologically associated with normal and abnormal biological processes. Conclusions ProbFAST is a freely accessible web-based application that enables MDE analysis on a global scale. It offers an efficient methodological approach for MDE analysis of sets of genes that are turned on and off in relation to functional information during the evolution of a tumor or tissue differentiation. The ProbFAST server can be accessed at http://gdm.fmrp.usp.br/probfast. PMID:20353576

  6. Designing and Implementing an OVERFLOW Reader for ParaView and Comparing Performance Between Central Processing Units and Graphical Processing Units

    NASA Technical Reports Server (NTRS)

    Chawner, David M.; Gomez, Ray J.

    2010-01-01

    In the Applied Aerosciences and CFD Branch at Johnson Space Center, computational simulations are run that face many challenges, two of which are the ability to customize software for specialized needs and the need to run simulations as fast as possible. Many different tools are used for running these simulations, and each has its own pros and cons. Once the simulations are run, software is needed that can visualize the results in an appealing manner. Some of this software is open source, meaning that anyone can edit the source code, make modifications, and distribute it to other users in a future release. This is very useful, especially in this branch, where many different tools are in use. File readers can be written to load any file format into a program, easing the bridge from one tool to another. Programming such a reader requires knowledge of the file format being read as well as the equations necessary to obtain derived values after loading. These CFD simulations load extremely large files and compute derived values from them, and they usually take a few hours to complete, even on the fastest machines. Graphics processing units (GPUs) are traditionally used to render graphics; however, in recent years GPUs have been applied to more generic computations because of their speed. Applications run on GPUs have been known to run up to forty times faster than they would on central processing units (CPUs). If these CFD programs were extended to run on GPUs, they would require much less time to complete, allowing more simulations to be run in the same amount of time and possibly enabling more complex computations.

  7. GAMETES: a fast, direct algorithm for generating pure, strict, epistatic models with random architectures.

    PubMed

    Urbanowicz, Ryan J; Kiralis, Jeff; Sinnott-Armstrong, Nicholas A; Heberling, Tamra; Fisher, Jonathan M; Moore, Jason H

    2012-10-01

    Geneticists who look beyond single locus disease associations require additional strategies for the detection of complex multi-locus effects. Epistasis, a multi-locus masking effect, presents a particular challenge, and has been the target of bioinformatic development. Thorough evaluation of new algorithms calls for simulation studies in which known disease models are sought. To date, the best methods for generating simulated multi-locus epistatic models rely on genetic algorithms. However, such methods are computationally expensive, difficult to adapt to multiple objectives, and unlikely to yield models with a precise form of epistasis which we refer to as pure and strict. Purely and strictly epistatic models constitute the worst-case in terms of detecting disease associations, since such associations may only be observed if all n-loci are included in the disease model. This makes them an attractive gold standard for simulation studies considering complex multi-locus effects. We introduce GAMETES, a user-friendly software package and algorithm which generates complex biallelic single nucleotide polymorphism (SNP) disease models for simulation studies. GAMETES rapidly and precisely generates random, pure, strict n-locus models with specified genetic constraints. These constraints include heritability, minor allele frequencies of the SNPs, and population prevalence. GAMETES also includes a simple dataset simulation strategy which may be utilized to rapidly generate an archive of simulated datasets for given genetic models. We highlight the utility and limitations of GAMETES with an example simulation study using MDR, an algorithm designed to detect epistasis. GAMETES is a fast, flexible, and precise tool for generating complex n-locus models with random architectures. While GAMETES has a limited ability to generate models with higher heritabilities, it is proficient at generating the lower heritability models typically used in simulation studies evaluating new algorithms. In addition, the GAMETES modeling strategy may be flexibly combined with any dataset simulation strategy. Beyond dataset simulation, GAMETES could be employed to pursue theoretical characterization of genetic models and epistasis.
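
    The heart of such a simulation is a penetrance table, P(disease | genotype pair). The Python sketch below simulates a dataset from a hand-written, XOR-like two-locus table; note that GAMETES additionally tunes table entries so that single-locus marginal effects vanish exactly, which this toy table does not.

      import numpy as np

      # Illustrative 2-locus, XOR-like penetrance model (not GAMETES output).
      rng = np.random.default_rng(1)
      n, maf = 2000, (0.3, 0.3)
      g1 = rng.binomial(2, maf[0], n)    # minor-allele counts, Hardy-Weinberg
      g2 = rng.binomial(2, maf[1], n)

      # pen[g1, g2] = P(disease | genotypes). GAMETES tunes the entries so that
      # single-locus marginal effects vanish exactly; this toy table does not.
      pen = np.array([[0.1, 0.5, 0.1],
                      [0.5, 0.1, 0.5],
                      [0.1, 0.5, 0.1]])
      status = rng.random(n) < pen[g1, g2]   # case/control labels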

  8. Electromagnetic Launch Vehicle Fairing and Acoustic Blanket Model of Received Power Using FEKO

    NASA Technical Reports Server (NTRS)

    Trout, Dawn H.; Stanley, James E.; Wahid, Parveen F.

    2011-01-01

    Evaluating the impact of radio frequency transmission in vehicle fairings is important to sensitive spacecraft. This paper employs the multilevel fast multipole method (MLFMM) feature of a commercial electromagnetic tool to model the fairing electromagnetic environment in the presence of an internal transmitter. This work is an extension of the perfect electric conductor model that was used to represent the bare aluminum internal fairing cavity. This fairing model includes typical acoustic blanketing commonly used in vehicle fairings. Representative material models within FEKO were successfully used to simulate the test case.

  9. A New Improved and Extended Version of the Multicell Bacterial Simulator gro.

    PubMed

    Gutiérrez, Martín; Gregorio-Godoy, Paula; Pérez Del Pulgar, Guillermo; Muñoz, Luis E; Sáez, Sandra; Rodríguez-Patón, Alfonso

    2017-08-18

    gro is a cell programming language developed in the Klavins Lab for simulating colony growth and cell-cell communication. It is used as a synthetic biology prototyping tool for simulating multicellular biocircuits and microbial consortia. In this work, we present several extensions made to gro that improve the performance of the simulator, make it easier to use, and provide new functionalities. The new version of gro is between 1 and 2 orders of magnitude faster than the original version. It is able to grow microbial colonies with up to 10^5 cells in less than 10 min. A new library, CellEngine, accelerates the resolution of spatial physical interactions between growing and dividing cells by implementing a new shoving algorithm. A genetic library, CellPro, based on Probabilistic Timed Automata, simulates gene expression dynamics using simplified and easy-to-compute digital proteins. We also propose a more convenient language specification layer, ProSpec, based on the idea that proteins drive cell behavior. CellNutrient, another library, implements Monod-based growth and nutrient uptake functionalities. The intercellular signaling management was improved and extended in a library called CellSignals. Finally, bacterial conjugation, another local cell-cell communication process, was added to the simulator. To show the versatility and potential outreach of this version of gro, we provide studies and novel examples ranging from synthetic biology to evolutionary microbiology. We believe that the upgrades implemented for gro have made it into a powerful and fast prototyping tool capable of simulating a large variety of systems and synthetic biology designs.

  10. TU-EF-304-03: 4D Monte Carlo Robustness Test for Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Souris, K; Sterpin, E; Lee, J

    Purpose: Breathing motion and approximate dose calculation engines may increase proton range uncertainties. We address these two issues using a comprehensive 4D robustness evaluation tool based on an efficient Monte Carlo (MC) engine, which can simulate breathing with no significant increase in computation time. Methods: To assess the robustness of the treatment plan, multiple scenarios of uncertainties are simulated, taking into account systematic and random setup errors, range uncertainties, and organ motion. Our fast MC dose engine, called MCsquare, implements optimized models on a massively parallel computation architecture and allows us to accurately simulate a scenario in less than one minute. The deviations of the uncertainty scenarios are then reported on a DVH band and compared to the nominal plan. The robustness evaluation tool is illustrated in a lung case by comparing three 60 Gy treatment plans. First, a plan is optimized on a PTV obtained by extending the CTV with an 8 mm margin, in order to take into account systematic geometrical uncertainties, as in our current practice in radiotherapy. No specific strategy is employed to correct for tumor and organ motions. The second plan involves a PTV generated from the ITV, which encompasses the tumor volume in all breathing phases. The last plan results from robust optimization performed on the ITV, with robustness parameters of 3% for tissue density and 8 mm for positioning errors. Results: The robustness test revealed that the first two plans could not properly cover the target in the presence of uncertainties. CTV coverage (D95) in the three plans ranged between 39.4-55.5 Gy, 50.2-57.5 Gy, and 55.1-58.6 Gy, respectively. Conclusion: A realistic robustness verification tool based on a fast MC dose engine has been developed. This test is essential to assess the quality of a proton therapy plan and very useful for studying various planning strategies for mobile tumors. This work is partly funded by IBA (Louvain-la-Neuve, Belgium).
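
    The scenario-sampling logic of such a robustness test can be sketched independently of the dose engine. In the Python fragment below, dose_engine() is a placeholder for one fast MC dose calculation; the error magnitudes and the number of scenarios are illustrative.

      import numpy as np

      rng = np.random.default_rng(42)

      def sample_scenario():
          # one uncertainty scenario: setup shift, range error, breathing phase
          return {"setup_mm": rng.normal(0.0, 2.0, size=3),
                  "range_pct": rng.choice([-3.0, 0.0, 3.0]),
                  "phase": int(rng.integers(0, 10))}

      def dose_engine(scenario):
          # placeholder for one fast MC dose calculation of this scenario
          return rng.normal(57.0, 2.0)      # fake CTV D95 in Gy

      d95 = np.array([dose_engine(sample_scenario()) for _ in range(30)])
      print("CTV D95 band: %.1f-%.1f Gy" % (d95.min(), d95.max()))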

  11. Fast-ion D(alpha) measurements and simulations in DIII-D

    NASA Astrophysics Data System (ADS)

    Luo, Yadong

    The fast-ion D-alpha diagnostic measures the Doppler-shifted D-alpha light emitted by neutralized fast ions. For a favorable viewing geometry, the bright interference from beam neutrals, halo neutrals, and edge neutrals spans a small wavelength range around the D-alpha rest wavelength and is blocked by a vertical bar at the exit focal plane of the spectrometer. Background subtraction and fitting techniques eliminate various contaminants in the spectrum. Fast-ion data are acquired with a time resolution of ~1 ms, spatial resolution of ~5 cm, and energy resolution of ~10 keV. A weighted Monte Carlo simulation code models the fast-ion D-alpha spectra based on the fast-ion distribution function from other sources. In quiet plasmas, the measured spectral shape is in excellent agreement with the simulation, and the absolute magnitude is also in reasonable agreement. The fast-ion D-alpha signal has the expected dependencies on plasma and neutral beam parameters. The neutral particle diagnostic and neutron diagnostic corroborate the fast-ion D-alpha measurements. The relative spatial profile agrees with the simulated profile based on the fast-ion distribution function from the TRANSP analysis code. During ion cyclotron heating, fast ions with high perpendicular energy are accelerated, while those with low perpendicular energy are barely affected. The spatial profile is compared with simulated profiles based on fast-ion distribution functions from the CQL Fokker-Planck code. In discharges with Alfvén instabilities, both the spatial profile and the spectral shape suggest that fast ions are redistributed. The flattened fast-ion D-alpha profile is in agreement with the fast-ion pressure profile.

  12. Virtual reality for dermatologic surgery: virtually a reality in the 21st century.

    PubMed

    Gladstone, H B; Raugi, G J; Berg, D; Berkley, J; Weghorst, S; Ganter, M

    2000-01-01

    In the 20th century, virtual reality has predominantly played a role in training pilots and in the entertainment industry. Despite much publicity, virtual reality did not live up to its perceived potential. During the past decade, it has also been applied for medical uses, particularly as training simulators, for minimally invasive surgery. Because of advances in computer technology, virtual reality is on the cusp of becoming an effective medical educational tool. At the University of Washington, we are developing a virtual reality soft tissue surgery simulator. Based on fast finite element modeling and using a personal computer, this device can simulate three-dimensional human skin deformations with real-time tactile feedback. Although there are many cutaneous biomechanical challenges to solve, it will eventually provide more realistic dermatologic surgery training for medical students and residents than the currently used models.

  13. Analysis of Tire Tractive Performance on Deformable Terrain by Finite Element-Discrete Element Method

    NASA Astrophysics Data System (ADS)

    Nakashima, Hiroshi; Takatsu, Yuzuru

    The goal of this study is to develop a practical and fast simulation tool for soil-tire interaction analysis, in which the finite element method (FEM) and discrete element method (DEM) are coupled, and which can run on a desktop PC. We have extended our previously proposed dynamic FE-DE method (FE-DEM) to include practical soil-tire system interaction, considering not only the vertical sinkage of a tire but also the travel of a driven tire. Numerical simulation by FE-DEM is stable, and it yields the relationships between variables such as load and sinkage, sinkage and travel distance, and the gross tractive effort and running resistance characteristics. Moreover, the simulation result is accurate enough to predict the maximum drawbar pull for a given tire, once appropriate parameter values are provided. Therefore, the developed FE-DEM program can be applied with sufficient accuracy to interaction problems in soil-tire systems.

  14. On Fast Post-Processing of Global Positioning System Simulator Truth Data and Receiver Measurements and Solutions Data

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Day, John H. (Technical Monitor)

    2000-01-01

    Post-processing of data from a GPS receiver test in a GPS simulator and test facility is an important step toward qualifying a receiver for space flight. Although the GPS simulator provides all the parameters needed to analyze a simulation, as well as excellent analysis tools on the simulator workstation, post-processing is not a GPS simulator or receiver function alone, and it must be planned as a separate pre-flight test program requirement. A GPS simulator is a critical resource, and it is desirable to move the pertinent test data off the simulator as soon as a test is completed. The receiver and simulator databases are used to extract the test data files for post-processing. These files are then usually moved from the simulator and receiver systems to a personal computer (PC) platform, where post-processing is typically done using PC-based commercial software languages and tools. Because of the generality of commercial software systems, their functions are notoriously slow and more often than not are the bottleneck, even for short-duration simulator-based tests. There is a need to do post-processing faster, within an hour of test completion, including all required operations on the simulator and receiver to prepare and move off the post-processing files. This is especially significant when using feedback from the previous test for the next simulation setup, or when running near back-to-back simulation scenarios. Solving the post-processing timing problem is critical to the success of a pre-flight test program. Toward this goal, an approach was developed that speeds up post-processing by an order of magnitude. It is based on improving the algorithm of the post-processing bottleneck function using a priori information specific to a GPS simulation application, and on using only the necessary volume of truth data. The presented post-processing scheme was used in support of several successful space flight missions carrying GPS receivers.

  15. Design and Analysis of an Axisymmetric Phased Array Fed Gregorian Reflector System for Limited Scanning

    DTIC Science & Technology

    2016-01-22

    The antenna was analyzed and optimized using numerical electromagnetic simulations based on the multilevel fast multipole method (MLFMM), conducted with FEKO software (www.feko.info).

  16. Adaptive multiple super fast simulated annealing for stochastic microstructure reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryu, Seun; Lin, Guang; Sun, Xin

    2013-01-01

    Fast image reconstruction from statistical information is critical in image fusion from multimodality chemical imaging instrumentation to create high-resolution images over large domains. Stochastic methods have been used widely in image reconstruction from two-point correlation functions; the main challenge is to increase the efficiency of the reconstruction. A novel simulated annealing method is proposed for the fast solution of image reconstruction. Combining the advantages of very fast cooling schedules, dynamic adaptation, and parallelization, the new simulated annealing algorithm increases efficiency by several orders of magnitude, making large-domain image fusion feasible.
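
    A minimal version of annealing-based reconstruction from a two-point correlation function (in the Yeong-Torquato spirit, without the paper's adaptation and parallelization) fits in a few lines of Python; a 1D binary medium is used for brevity.

      import numpy as np

      rng = np.random.default_rng(7)
      n, phi = 256, 0.4                      # pixels, volume fraction
      base = np.r_[np.ones(int(n * phi)), np.zeros(n - int(n * phi))]
      target = rng.permutation(base)         # 'reference' microstructure

      def s2(img, rmax=32):
          # two-point correlation S2(r) via FFT autocorrelation (periodic)
          f = np.fft.rfft(img)
          return (np.fft.irfft(f * np.conj(f), n) / n)[:rmax]

      goal = s2(target)
      img = rng.permutation(base)            # start: same volume fraction
      energy = np.sum((s2(img) - goal) ** 2)
      for k in range(20_000):
          T = 1e-3 / (1.0 + 0.05 * k)        # very fast cooling schedule
          i, j = rng.integers(0, n, 2)
          if img[i] == img[j]:
              continue
          img[i], img[j] = img[j], img[i]    # volume-preserving pixel swap
          e_new = np.sum((s2(img) - goal) ** 2)
          if e_new > energy and rng.random() > np.exp((energy - e_new) / T):
              img[i], img[j] = img[j], img[i]      # reject: swap back
          else:
              energy = e_new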

  17. Rotary fast tool servo system and methods

    DOEpatents

    Montesanti, Richard C.; Trumper, David L.

    2007-10-02

    A high bandwidth rotary fast tool servo provides tool motion in a direction nominally parallel to the surface-normal of a workpiece at the point of contact between the cutting tool and workpiece. Three or more flexure blades having all ends fixed are used to form an axis of rotation for a swing arm that carries a cutting tool at a set radius from the axis of rotation. An actuator rotates a swing arm assembly such that a cutting tool is moved in and away from the lathe-mounted, rotating workpiece in a rapid and controlled manner in order to machine the workpiece. A pair of position sensors provides rotation and position information for a swing arm to a control system. A control system commands and coordinates motion of the fast tool servo with the motion of a spindle, rotating table, cross-feed slide, and in-feed slide of a precision lathe.

  18. Rotary fast tool servo system and methods

    DOEpatents

    Montesanti, Richard C [Cambridge, MA; Trumper, David L [Plaistow, NH; Kirtley, Jr., James L.

    2009-08-18

    A high bandwidth rotary fast tool servo provides tool motion in a direction nominally parallel to the surface-normal of a workpiece at the point of contact between the cutting tool and workpiece. Three or more flexure blades having all ends fixed are used to form an axis of rotation for a swing arm that carries a cutting tool at a set radius from the axis of rotation. An actuator rotates a swing arm assembly such that a cutting tool is moved in and away from the lathe-mounted, rotating workpiece in a rapid and controlled manner in order to machine the workpiece. One or more position sensors provides rotation and position information for a swing arm to a control system. A control system commands and coordinates motion of the fast tool servo with the motion of a spindle, rotating table, cross-feed slide, and in-feed slide of a precision lathe.

  19. Fast 2D fluid-analytical simulation of ion energy distributions and electromagnetic effects in multi-frequency capacitive discharges

    NASA Astrophysics Data System (ADS)

    Kawamura, E.; Lieberman, M. A.; Graves, D. B.

    2014-12-01

    A fast 2D axisymmetric fluid-analytical plasma reactor model using the finite elements simulation tool COMSOL is interfaced with a 1D particle-in-cell (PIC) code to study ion energy distributions (IEDs) in multi-frequency capacitive argon discharges. A bulk fluid plasma model, which solves the time-dependent plasma fluid equations for ion continuity and electron energy balance, is coupled with an analytical sheath model, which solves for the sheath parameters. The time-independent Helmholtz equation is used to solve for the fields, and a gas flow model solves for the steady-state pressure, temperature, and velocity of the neutrals. The results of the fluid-analytical model are used as inputs to a PIC simulation of the sheath region of the discharge to obtain the IEDs at the target electrode. Each 2D fluid-analytical-PIC simulation on a moderate 2.2 GHz CPU workstation with 8 GB of memory took about 15-20 min. The multi-frequency 2D fluid-analytical model was compared to 1D PIC simulations of a symmetric parallel-plate discharge, showing good agreement. We also conducted fluid-analytical simulations of a multi-frequency argon capacitively coupled plasma (CCP) with a typical asymmetric reactor geometry at 2/60/162 MHz. The low-frequency 2 MHz power controlled the sheath width and sheath voltage, while the high frequencies controlled the plasma production. A standing wave was observable at the highest frequency of 162 MHz. We noticed that adding 2 MHz power to a 60 MHz discharge, or 162 MHz power to a dual-frequency 2/60 MHz discharge, can enhance plasma uniformity. We found that multiple frequencies are useful for controlling not only IEDs but also plasma uniformity in CCP reactors.

  20. Computer-aided trauma simulation system with haptic feedback is easy and fast for oral-maxillofacial surgeons to learn and use.

    PubMed

    Schvartzman, Sara C; Silva, Rebeka; Salisbury, Ken; Gaudilliere, Dyani; Girod, Sabine

    2014-10-01

    Computer-assisted surgical (CAS) planning tools have become widely available in craniomaxillofacial surgery, but are time consuming and often require professional technical assistance to simulate a case. An initial oral and maxillofacial (OM) surgical user experience was evaluated with a newly developed CAS system featuring a bimanual sense of touch (haptic). Three volunteer OM surgeons received a 5-minute verbal introduction to the use of a newly developed haptic-enabled planning system. The surgeons were instructed to simulate mandibular fracture reductions of 3 clinical cases, within a 15-minute time limit and without a time limit, and complete a questionnaire to assess their subjective experience with the system. Standard landmarks and linear and angular measurements between the simulated results and the actual surgical outcome were compared. After the 5-minute instruction, all 3 surgeons were able to use the system independently. The analysis of standardized anatomic measurements showed that the simulation results within a 15-minute time limit were not significantly different from those without a time limit. Mean differences between measurements of surgical and simulated fracture reductions were within current resolution limitations in collision detection, segmentation of computed tomographic scans, and haptic devices. All 3 surgeons reported that the system was easy to learn and use and that they would be comfortable integrating it into their daily clinical practice for trauma cases. A CAS system with a haptic interface that capitalizes on touch and force feedback experience similar to operative procedures is fast and easy for OM surgeons to learn and use. Copyright © 2014 American Association of Oral and Maxillofacial Surgeons. All rights reserved.

  1. The plant leaf movement analyzer (PALMA): a simple tool for the analysis of periodic cotyledon and leaf movement in Arabidopsis thaliana.

    PubMed

    Wagner, Lucas; Schmal, Christoph; Staiger, Dorothee; Danisman, Selahattin

    2017-01-01

    The analysis of circadian leaf movement rhythms is a simple yet effective method to study the effects of treatments or gene mutations on the circadian clock of plants. Currently, leaf movements are analysed using time-lapse photography and subsequent bioinformatic analysis of leaf movements. Programs used for this purpose either perform only one function (i.e. leaf tip detection or rhythm analysis) or are limited to specific computational environments. We developed a leaf movement analysis tool, PALMA, that works on the command line and combines image extraction with rhythm analysis using Fast Fourier transformation and non-linear least squares fitting. We validated PALMA on both simulated time series and experiments using the known short-period mutant sensitivity to red light reduced 1 (srr1-1). We compared PALMA with two established leaf movement analysis tools and found it to perform equally well. Finally, we tested the effect of reduced-iron conditions on the leaf movement rhythms of wild-type plants. Here, we found that PALMA successfully detected period lengthening under reduced-iron conditions. PALMA correctly estimated the period in both simulated and real-life leaf movement experiments. As a platform-independent console program that unites both functions needed for the analysis of circadian leaf movements, it is a valid alternative to existing leaf movement analysis tools.
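
    The two analysis steps named above, an FFT period estimate followed by nonlinear least-squares refinement, can be sketched as follows in Python (this is the generic pattern, not PALMA's code); the leaf-position series is synthetic.

      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(3)
      t = np.arange(0.0, 120.0, 0.5)         # hours; images every 30 min
      y = np.sin(2 * np.pi * t / 24.6) + 0.2 * rng.normal(size=t.size)

      # Step 1: FFT gives an initial period estimate from the dominant peak.
      freqs = np.fft.rfftfreq(t.size, d=0.5)
      power = np.abs(np.fft.rfft(y - y.mean())) ** 2
      period0 = 1.0 / freqs[np.argmax(power[1:]) + 1]

      # Step 2: refine by non-linear least squares on a sinusoid.
      def model(t, amp, period, phase, offset):
          return amp * np.sin(2 * np.pi * t / period + phase) + offset

      popt, _ = curve_fit(model, t, y, p0=[1.0, period0, 0.0, 0.0])
      print("fitted period: %.2f h" % popt[1])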

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sotomayor, Marcos

    Hair cell mechanotransduction happens in tens of microseconds, involves forces of a few picoNewtons, and is mediated by nanometer-scale molecular conformational changes. As proteins involved in this process become identified and their high resolution structures become available, multiple tools are being used to explore their “single-molecule responses” to force. Optical tweezers and atomic force microscopy offer exquisite force and extension resolution, but cannot reach the high loading rates expected for high frequency auditory stimuli. Molecular dynamics (MD) simulations can reach these fast time scales, and also provide a unique view of the molecular events underlying protein mechanics, but their predictions must be experimentally verified. Thus a combination of simulations and experiments might be appropriate to study the molecular mechanics of hearing. Here I review the basics of MD simulations and the different methods used to apply force and study protein mechanics in silico. Simulations of tip link proteins are used to illustrate the advantages and limitations of this method.

  3. Benchmarking of dynamic simulation predictions in two software platforms using an upper limb musculoskeletal model

    PubMed Central

    Saul, Katherine R.; Hu, Xiao; Goehler, Craig M.; Vidt, Meghan E.; Daly, Melissa; Velisar, Anca; Murray, Wendy M.

    2014-01-01

    Several opensource or commercially available software platforms are widely used to develop dynamic simulations of movement. While computational approaches are conceptually similar across platforms, technical differences in implementation may influence output. We present a new upper limb dynamic model as a tool to evaluate potential differences in predictive behavior between platforms. We evaluated to what extent differences in technical implementations in popular simulation software environments result in differences in kinematic predictions for single and multijoint movements using EMG- and optimization-based approaches for deriving control signals. We illustrate the benchmarking comparison using SIMM-Dynamics Pipeline-SD/Fast and OpenSim platforms. The most substantial divergence results from differences in muscle model and actuator paths. This model is a valuable resource and is available for download by other researchers. The model, data, and simulation results presented here can be used by future researchers to benchmark other software platforms and software upgrades for these two platforms. PMID:24995410

  4. Software Framework for Advanced Power Plant Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Widmann; Sorin Munteanu; Aseem Jain

    2010-08-01

    This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.
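
    The ROM 'training' pattern described above can be sketched generically: fit a cheap response surface to a handful of CFD samples, then evaluate the surface inside the flowsheet. The Python sketch below uses a quadratic fit with fabricated sample data; it is not the APECS implementation.

      import numpy as np

      rng = np.random.default_rng(5)
      X = rng.uniform([0.5, 300.0], [2.0, 900.0], size=(40, 2))  # flow, temperature
      y = 0.8 * X[:, 0] + 1e-3 * X[:, 1] - 0.1 * X[:, 0] ** 2    # stand-in CFD output

      def features(X):
          f, T = X[:, 0], X[:, 1]
          return np.column_stack([np.ones_like(f), f, T, f * T, f ** 2, T ** 2])

      coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

      def rom(flow, temp):        # evaluates in microseconds instead of CFD hours
          return float(features(np.array([[flow, temp]])) @ coef)

      print(rom(1.2, 600.0))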

  5. Benchmarking of dynamic simulation predictions in two software platforms using an upper limb musculoskeletal model.

    PubMed

    Saul, Katherine R; Hu, Xiao; Goehler, Craig M; Vidt, Meghan E; Daly, Melissa; Velisar, Anca; Murray, Wendy M

    2015-01-01

    Several opensource or commercially available software platforms are widely used to develop dynamic simulations of movement. While computational approaches are conceptually similar across platforms, technical differences in implementation may influence output. We present a new upper limb dynamic model as a tool to evaluate potential differences in predictive behavior between platforms. We evaluated to what extent differences in technical implementations in popular simulation software environments result in differences in kinematic predictions for single and multijoint movements using EMG- and optimization-based approaches for deriving control signals. We illustrate the benchmarking comparison using SIMM-Dynamics Pipeline-SD/Fast and OpenSim platforms. The most substantial divergence results from differences in muscle model and actuator paths. This model is a valuable resource and is available for download by other researchers. The model, data, and simulation results presented here can be used by future researchers to benchmark other software platforms and software upgrades for these two platforms.

  6. GrowYourIC: A Step Toward a Coherent Model of the Earth's Inner Core Seismic Structure

    NASA Astrophysics Data System (ADS)

    Lasbleis, Marine; Waszek, Lauren; Day, Elizabeth A.

    2017-11-01

    A complex inner core structure has been well established from seismic studies, showing radial and lateral heterogeneities at various length scales. Yet no geodynamic model is able to explain all the observed features. One of the main limitations is the lack of tools to compare seismic observations and numerical models successfully. Here we use a new Python tool called GrowYourIC to compare models of inner core structure. We calculate properties of geodynamic models of the inner core along seismic raypaths, for random or user-specified data sets. We test kinematic models which simulate fast lateral translation, superrotation, and differential growth. We first explore the influence on a real inner core data set, which has sparse coverage of the inner core boundary. Such a data set is nevertheless able to constrain the hemispherical boundaries successfully, owing to good sampling of latitudes. Combining translation and rotation could explain some features of the boundaries separating the inner core hemispheres. The depth shift of the boundaries, observed by some authors, seems unlikely to be produced by fast translation, but could be produced by slow translation associated with superrotation.
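
    In the spirit of such a comparison (though not using GrowYourIC's actual API), the Python sketch below evaluates a hemispherical property pattern, produced by lateral translation along one axis, at the bottoming points of a synthetic set of raypaths.

      import numpy as np

      rng = np.random.default_rng(11)
      n = 500
      lon = rng.uniform(-180.0, 180.0, n)    # bottoming-point longitudes (deg)
      lat = rng.uniform(-70.0, 70.0, n)      # latitudes (deg)
      r = rng.uniform(0.3, 0.98, n)          # fractional inner-core radius

      def translation_model(lon, lat, r, shift=0.2):
          # property grows toward the crystallizing side of a translating core;
          # here simply a function of the x coordinate, offset by 'shift'
          x = r * np.cos(np.radians(lat)) * np.cos(np.radians(lon))
          return x + shift

      prop = translation_model(lon, lat, r)
      east = prop[np.abs(lon - 90.0) < 90.0].mean()   # crude hemisphere averages
      west = prop[np.abs(lon - 90.0) >= 90.0].mean()
      print(east, west)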

  7. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... data describing vehicle usage required by the Federal Automotive Statistical Tool (FAST) by October 15 of each year. FAST is accessed through http://fastweb.inel.gov/. (End of clause) [68 FR 43334, July...

  8. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... data describing vehicle usage required by the Federal Automotive Statistical Tool (FAST) by October 15 of each year. FAST is accessed through http://fastweb.inel.gov/. (End of clause) [68 FR 43334, July...

  9. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... data describing vehicle usage required by the Federal Automotive Statistical Tool (FAST) by October 15 of each year. FAST is accessed through http://fastweb.inel.gov/. (End of clause) [68 FR 43334, July...

  10. 48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... data describing vehicle usage required by the Federal Automotive Statistical Tool (FAST) by October 15 of each year. FAST is accessed through http://fastweb.inel.gov/. (End of clause) [68 FR 43334, July...

  11. Fast Simulation of the Impact Parameter Calculation of Electrons through Pair Production

    NASA Astrophysics Data System (ADS)

    Bang, Hyesun; Kweon, MinJung; Huh, Kyoung Bum; Pachmayer, Yvonne

    2018-05-01

    A fast simulation method is introduced that tremendously reduces the time required for the impact parameter calculation, a key observable in physics analyses of high-energy physics experiments and in detector optimisation studies. The impact parameter of electrons produced through pair production was calculated considering the key related processes, using the Bethe-Heitler formula, the Tsai formula, and a simple geometric model. The calculations were performed under various conditions, and the results were compared with those from full GEANT4 simulations. The computation time of this fast simulation method is 10^4 times shorter than that of the full GEANT4 simulation.
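
    The geometric part of such a fast method can be sketched as follows; the formulas and constants below are illustrative simplifications, not the paper's exact treatment: the conversion radius is sampled from an exponential with conversion length 9X0/7, and the electron receives an angular offset of order m_e c^2 / E relative to the photon.

      import numpy as np

      rng = np.random.default_rng(2)
      ME = 0.000511                    # GeV, electron rest mass
      X0 = 0.094                       # m, radiation length (placeholder material)
      n = 100_000
      e_gamma = rng.uniform(1.0, 10.0, n)            # GeV photon energies
      r_conv = rng.exponential(9.0 * X0 / 7.0, n)    # pair-conversion radius
      theta = ME / e_gamma * rng.standard_normal(n)  # small angular offset (rad)
      d0 = np.abs(r_conv * np.sin(theta))            # impact parameter estimate
      print("median |d0| = %.0f micron" % (np.median(d0) * 1e6))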

  12. Fast simulation of yttrium-90 bremsstrahlung photons with GATE.

    PubMed

    Rault, Erwann; Staelens, Steven; Van Holen, Roel; De Beenhouwer, Jan; Vandenberghe, Stefaan

    2010-06-01

    Multiple investigators have recently reported the use of yttrium-90 (90Y) bremsstrahlung single photon emission computed tomography (SPECT) imaging for the dosimetry of targeted radionuclide therapies. Because Monte Carlo (MC) simulations are useful for studying SPECT imaging, this study investigates the MC simulation of 90Y bremsstrahlung photons in SPECT. To overcome the computationally expensive simulation of electrons, the authors propose a fast way to simulate the emission of 90Y bremsstrahlung photons based on prerecorded bremsstrahlung photon probability density functions (PDFs). The accuracy of bremsstrahlung photon simulation is evaluated in two steps. First, the validity of the fast bremsstrahlung photon generator is checked. To that end, fast and analog simulations of photons emitted from a 90Y point source in a water phantom are compared. The same setup is then used to verify the accuracy of the bremsstrahlung photon simulations, comparing the results obtained with PDFs generated from both simulated and measured data to measurements. In both cases, the energy spectra and point spread functions of the photons detected in a scintillation camera are used. Results show that the fast simulation method is responsible for a 5% overestimation of the low-energy fluence (below 75 keV) of the bremsstrahlung photons detected using a scintillation camera. The spatial distribution of the detected photons is, however, accurately reproduced with the fast method and a computational acceleration of approximately 17-fold is achieved. When measured PDFs are used in the simulations, the simulated energy spectrum of photons emitted from a point source of 90Y in a water phantom and detected in a scintillation camera closely approximates the measured spectrum. The PSF of the photons imaged in the 50-300 keV energy window is also accurately estimated with a 12.4% underestimation of the full width at half maximum and 4.5% underestimation of the full width at tenth maximum. Despite its limited accuracy, the fast bremsstrahlung photon generator is well suited for the simulation of bremsstrahlung photons emitted in large homogeneous organs, such as the liver, and detected in a scintillation camera. The computational acceleration makes it very useful for future investigations of 90Y bremsstrahlung SPECT imaging.
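
    Sampling photons from a prerecorded PDF instead of transporting electrons reduces, in essence, to an inverse-CDF lookup. The Python sketch below shows the pattern with a synthetic energy spectrum standing in for the measured or simulated 90Y bremsstrahlung PDF.

      import numpy as np

      # Synthetic stand-in for a prerecorded bremsstrahlung energy PDF.
      energies = np.linspace(10.0, 2280.0, 228)      # keV bin centers
      pdf = np.exp(-energies / 300.0)
      pdf /= pdf.sum()
      cdf = np.cumsum(pdf)

      rng = np.random.default_rng(9)
      u = rng.random(1_000_000)
      idx = np.minimum(np.searchsorted(cdf, u), energies.size - 1)
      sampled = energies[idx]                        # photon energies, keV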

  13. SAM Theory Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Rui

    The System Analysis Module (SAM) is an advanced and modern system analysis tool being developed at Argonne National Laboratory under the U.S. DOE Office of Nuclear Energy’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. SAM development aims for advances in physical modeling, numerical methods, and software engineering to enhance its user experience and usability for reactor transient analyses. To facilitate the code development, SAM utilizes an object-oriented application framework (MOOSE), and its underlying meshing and finite-element library (libMesh) and linear and non-linear solvers (PETSc), to leverage modern advanced software environments and numerical methods. SAM focuses on modeling advanced reactor concepts such as SFRs (sodium fast reactors), LFRs (lead-cooled fast reactors), and FHRs (fluoride-salt-cooled high temperature reactors) or MSRs (molten salt reactors). These advanced concepts are distinguished from light-water reactors in their use of single-phase, low-pressure, high-temperature, and low Prandtl number (sodium and lead) coolants. As a new code development, the initial effort has been focused on modeling and simulation capabilities of heat transfer and single-phase fluid dynamics responses in Sodium-cooled Fast Reactor (SFR) systems. The system-level simulation capabilities of fluid flow and heat transfer in general engineering systems and typical SFRs have been verified and validated. This document provides the theoretical and technical basis of the code to help users understand the underlying physical models (such as governing equations, closure models, and component models), system modeling approaches, numerical discretization and solution methods, and the overall capabilities in SAM. As the code is still under ongoing development, this SAM Theory Manual will be updated periodically to keep it consistent with the state of the development.

  14. Development and application of incrementally complex tools for wind turbine aerodynamics

    NASA Astrophysics Data System (ADS)

    Gundling, Christopher H.

    Advances and availability of computational resources have made wind farm design using simulation tools a reality. Wind farms are battling two issues, affecting the cost of energy, that will make or break many future investments in wind energy. The most significant issue is the power reduction of downstream turbines operating in the wake of upstream turbines. The loss of energy from wind turbine wakes is difficult to predict and the underestimation of energy losses due to wakes has been a common problem throughout the industry. The second issue is a shorter lifetime of blades and past failures of gearboxes due to increased fluctuations in the unsteady loading of waked turbines. The overall goal of this research is to address these problems by developing a platform for a multi-fidelity wind turbine aerodynamic performance and wake prediction tool. Full-scale experiments in the field have dramatically helped researchers understand the unique issues inside a large wind farm, but experimental methods can only be used to a limited extent due to the cost of such field studies and the size of wind farms. The uncertainty of the inflow is another inherent drawback of field experiments. Therefore, computational fluid dynamics (CFD) predictions, strategically validated using carefully performed wind farm field campaigns, are becoming a more standard design practice. The developed CFD models include a blade element model (BEM) code with a free-vortex wake, an actuator disk or line based method with large eddy simulations (LES) and a fully resolved rotor based method with detached eddy simulations (DES) and adaptive mesh refinement (AMR). To create more realistic simulations, performance of a one-way coupling between different mesoscale atmospheric boundary layer (ABL) models and the three microscale CFD solvers is tested. These methods are validated using data from incrementally complex test cases that include the NREL Phase VI wind tunnel test, the Sexbierum wind farm and the Lillgrund offshore wind farm. By cross-comparing the lowest complexity free-vortex method with the higher complexity methods, a fast and accurate simulation tool has been generated that can perform wind farm simulations in a few hours.
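
    At the low-complexity end, the core of any BEM code is a fixed-point iteration for the induction factors at each blade annulus. The Python sketch below shows that iteration for a single annulus with a flat-plate-like placeholder airfoil polar (cl = 2*pi*alpha, cd = 0.01) and no tip-loss correction; all geometry numbers are illustrative.

      import numpy as np

      B, R, r, c = 3, 40.0, 30.0, 2.0        # blades, rotor/local radius, chord (m)
      twist = np.radians(3.0)                # local twist plus pitch
      U, omega = 8.0, 1.6                    # wind speed (m/s), rotor speed (rad/s)
      sigma = B * c / (2 * np.pi * r)        # local solidity

      a, ap = 0.3, 0.0                       # axial/tangential induction guesses
      for _ in range(100):
          phi = np.arctan2(U * (1 - a), omega * r * (1 + ap))   # inflow angle
          alpha = phi - twist
          cl, cd = 2 * np.pi * alpha, 0.01   # placeholder airfoil polar
          cn = cl * np.cos(phi) + cd * np.sin(phi)
          ct = cl * np.sin(phi) - cd * np.cos(phi)
          a_new = 1.0 / (4 * np.sin(phi) ** 2 / (sigma * cn) + 1.0)
          ap_new = 1.0 / (4 * np.sin(phi) * np.cos(phi) / (sigma * ct) - 1.0)
          a, ap = 0.5 * (a + a_new), 0.5 * (ap + ap_new)        # relaxed update

      print("a=%.3f, a'=%.4f, inflow=%.1f deg" % (a, ap, np.degrees(phi)))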

  15. Development of the Next Generation of Biogeochemistry Simulations Using EMSL's NWChem Molecular Modeling Software

    NASA Astrophysics Data System (ADS)

    Bylaska, E. J.; Kowalski, K.; Apra, E.; Govind, N.; Valiev, M.

    2017-12-01

    Methods of directly simulating the behavior of complex strongly interacting atomic systems (molecular dynamics, Monte Carlo) have provided important insight into the behavior of nanoparticles, biogeochemical systems, mineral/fluid systems, actinide systems, and geofluids. The limitation of these methods for even wider application is the difficulty of developing accurate potential interactions in these systems at the molecular level that capture their complex chemistry. The well-developed tools of quantum chemistry and physics have been shown to approach the required accuracy. However, despite the continuous effort being put into improving their accuracy and efficiency, these tools will be of little value to condensed matter problems without continued improvements in techniques to traverse and sample the high-dimensional phase space needed to span the ~10^12 time-scale difference between molecular simulation and chemical events. In recent years, we have made considerable progress in developing electronic structure and AIMD methods tailored to treat biochemical and geochemical problems, including very efficient implementations of many-body methods, fast exact exchange methods, electron-transfer methods, excited state methods, QM/MM, and new parallel algorithms that scale to +100,000 cores. The poster will focus on the fundamentals of these methods and the realities in terms of system size, computational requirements, and simulation times that are required for their application to complex biogeochemical systems.

  16. The Monash University Interactive Simple Climate Model

    NASA Astrophysics Data System (ADS)

    Dommenget, D.

    2013-12-01

    The Monash University interactive simple climate model is a web-based interface that allows students and the general public to explore the physical simulation of the climate system with a real global climate model. It is based on the Globally Resolved Energy Balance (GREB) model, a climate model published by Dommenget and Floeter [2011] in the peer-reviewed journal Climate Dynamics. The model simulates most of the main physical processes in the climate system in a very simplistic way and therefore allows very fast and simple climate model simulations on a normal PC. Despite its simplicity, the model simulates the climate response to external forcings, such as a doubling of the CO2 concentration, very realistically (similar to state-of-the-art climate models). The Monash simple climate model web interface allows you to study the results of more than 2000 different model experiments in an interactive way, and it offers a number of tutorials on the interactions of physical processes in the climate system along with some puzzles to solve. By switching physical processes off and on, you can deconstruct the climate and learn how the different processes interact to generate the observed climate and how they interact to generate the IPCC-predicted climate change for an anthropogenic CO2 increase. The presentation will illustrate how this web-based tool works and what the possibilities for teaching students with it are.
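
    GREB resolves the globe on a grid; as a minimal, hedged illustration of the energy-balance idea it builds on, a zero-dimensional model already captures the response to a changed greenhouse effect. All parameter values below are illustrative textbook numbers, not GREB's.

      # Zero-dimensional energy balance: C dT/dt = (1 - albedo) * S0 / 4 - eps * sigma * T**4.
      # eps is an effective emissivity standing in for the greenhouse effect;
      # lowering it mimics an increased greenhouse forcing.
      SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
      S0 = 1361.0       # solar constant, W m^-2
      ALBEDO = 0.30
      C = 4.0e8         # effective heat capacity, J m^-2 K^-1 (ocean mixed layer)

      def equilibrate(eps, t_init=288.0, years=200):
          """Forward-Euler integration to a near-equilibrium temperature (K)."""
          dt = 86400.0 * 30.0            # one-month time step, s
          temp = t_init
          for _ in range(years * 12):
              flux = (1.0 - ALBEDO) * S0 / 4.0 - eps * SIGMA * temp**4
              temp += dt * flux / C
          return temp

      t_ref = equilibrate(eps=0.62)      # eps tuned so T lands near the observed 288 K
      t_forced = equilibrate(eps=0.60)   # slightly stronger greenhouse effect
      print(f"{t_ref:.1f} K -> {t_forced:.1f} K (warming of {t_forced - t_ref:.2f} K)")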

  17. Performance Evaluation of the Approaches and Algorithms Using Hamburg Airport Operations

    NASA Technical Reports Server (NTRS)

    Zhu, Zhifan; Okuniek, Nikolai; Gerdes, Ingrid; Schier, Sebastian; Lee, Hanbong; Jung, Yoon

    2016-01-01

    The German Aerospace Center (DLR) and the National Aeronautics and Space Administration (NASA) have been independently developing and testing their own concepts and tools for airport surface traffic management. Although these concepts and tools have been tested individually for European and US airports, they have never been compared or analyzed side-by-side. This paper presents the collaborative research devoted to the evaluation and analysis of two different surface management concepts. Hamburg Airport was used as a common test bed airport for the study. First, two independent simulations using the same traffic scenario were conducted: one by the DLR team using the Controller Assistance for Departure Optimization (CADEO) and the Taxi Routing for Aircraft: Creation and Controlling (TRACC) in a real-time simulation environment, and one by the NASA team based on the Spot and Runway Departure Advisor (SARDA) in a fast-time simulation environment. A set of common performance metrics was defined. The simulation results showed that both approaches produced operational benefits in efficiency, such as reducing taxi times, while maintaining runway throughput. Both approaches generated the gate pushback schedule to meet the runway schedule, such that the runway utilization was maximized. The conflict-free taxi guidance by TRACC helped avoid taxi conflicts and reduced taxiing stops, but the taxi benefit needed to be assessed together with runway throughput to analyze the overall performance objective.

  18. Performance Evaluation of the Approaches and Algorithms for Hamburg Airport Operations

    NASA Technical Reports Server (NTRS)

    Zhu, Zhifan; Okuniek, Nikolai; Gerdes, Ingrid; Schier, Sebastian; Lee, Hanbong; Jung, Yoon

    2016-01-01

    The German Aerospace Center (DLR) and the National Aeronautics and Space Administration (NASA) have been independently developing and testing their own concepts and tools for airport surface traffic management. Although these concepts and tools have been tested individually for European and US airports, they have never been compared or analyzed side-by-side. This paper presents the collaborative research devoted to the evaluation and analysis of two different surface management concepts. Hamburg Airport was used as a common test bed airport for the study. First, two independent simulations using the same traffic scenario were conducted: one by the DLR team using the Controller Assistance for Departure Optimization (CADEO) and the Taxi Routing for Aircraft: Creation and Controlling (TRACC) in a real-time simulation environment, and one by the NASA team based on the Spot and Runway Departure Advisor (SARDA) in a fast-time simulation environment. A set of common performance metrics was defined. The simulation results showed that both approaches produced operational benefits in efficiency, such as reducing taxi times, while maintaining runway throughput. Both approaches generated the gate pushback schedule to meet the runway schedule, such that the runway utilization was maximized. The conflict-free taxi guidance by TRACC helped avoid taxi conflicts and reduced taxiing stops, but the taxi benefit needed to be assessed together with runway throughput to analyze the overall performance objective.

  19. Performance Evaluation of the Approaches and Algorithms using Hamburg Airport Operations

    NASA Technical Reports Server (NTRS)

    Zhu, Zhifan; Lee, Hanbong; Jung, Yoon; Okuniek, Nikolai; Gerdes, Ingrid; Schier, Sebastian

    2016-01-01

    The German Aerospace Center (DLR) and the National Aeronautics and Space Administration (NASA) have been independently developing and testing their own concepts and tools for airport surface traffic management. Although these concepts and tools have been tested individually for European and US airports, they have never been compared or analyzed side-by-side. This paper presents the collaborative research devoted to the evaluation and analysis of two different surface management concepts. Hamburg Airport was used as a common test bed airport for the study. First, two independent simulations using the same traffic scenario were conducted: one by the DLR team using the Controller Assistance for Departure Optimization (CADEO) and the Taxi Routing for Aircraft: Creation and Controlling (TRACC) in a real-time simulation environment, and one by the NASA team based on the Spot and Runway Departure Advisor (SARDA) in a fast-time simulation environment. A set of common performance metrics was defined. The simulation results showed that both approaches produced operational benefits in efficiency, such as reducing taxi times, while maintaining runway throughput. Both approaches generated the gate pushback schedule to meet the runway schedule, such that the runway utilization was maximized. The conflict-free taxi guidance by TRACC helped avoid taxi conflicts and reduced taxiing stops, but the taxi benefit needed to be assessed together with runway throughput to analyze the overall performance objective.

  20. Noise in Neuronal and Electronic Circuits: A General Modeling Framework and Non-Monte Carlo Simulation Techniques.

    PubMed

    Kilinc, Deniz; Demir, Alper

    2017-08-01

    The brain is extremely energy efficient and remarkably robust in what it does despite the considerable variability and noise caused by the stochastic mechanisms in neurons and synapses. Computational modeling is a powerful tool that can help us gain insight into this important aspect of brain function. A deep understanding, together with computational design tools, can help in developing robust neuromorphic electronic circuits and hybrid neuroelectronic systems. In this paper, we present a general modeling framework for biological neuronal circuits that systematically captures the nonstationary stochastic behavior of ion channels and synaptic processes. In this framework, fine-grained, discrete-state, continuous-time Markov chain models of both ion channels and synaptic processes are treated in a unified manner. Our modeling framework features a mechanism for the automatic generation of the corresponding coarse-grained, continuous-state, continuous-time stochastic differential equation models for neuronal variability and noise. Furthermore, we repurpose non-Monte Carlo noise analysis techniques, which were previously developed for analog electronic circuits, for the stochastic characterization of neuronal circuits in both the time and frequency domains. We verify that the fast non-Monte Carlo analysis methods produce results with the same accuracy as computationally expensive Monte Carlo simulations. We have implemented the proposed techniques in a prototype simulator, where both biological neuronal and analog electronic circuits can be simulated together in a coupled manner.
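
    As a hedged illustration of the Markov-chain-to-SDE coarse-graining described above (not the paper's automated machinery), consider a population of two-state ion channels: the diffusion approximation of the channel-count Markov chain is a scalar SDE for the open fraction that can be integrated with Euler-Maruyama. The rates and channel count below are arbitrary.

      import numpy as np

      # Diffusion approximation of a population of N two-state channels
      # (closed <-> open with rates alpha, beta):
      #   dx = (alpha*(1 - x) - beta*x) dt + sqrt((alpha*(1 - x) + beta*x) / N) dW
      # where x is the open fraction.
      rng = np.random.default_rng(0)
      alpha, beta, n_ch = 5.0, 2.0, 1000   # opening/closing rates (1/ms), channel count
      dt, steps = 0.01, 5000               # Euler-Maruyama step (ms), number of steps

      x = np.empty(steps)
      x[0] = alpha / (alpha + beta)        # start at the deterministic steady state
      for k in range(steps - 1):
          drift = alpha * (1 - x[k]) - beta * x[k]
          noise = np.sqrt(max(alpha * (1 - x[k]) + beta * x[k], 0.0) / n_ch)
          x[k + 1] = x[k] + drift * dt + noise * np.sqrt(dt) * rng.standard_normal()

      print(f"mean open fraction: {x.mean():.3f} "
            f"(deterministic value: {alpha / (alpha + beta):.3f})")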

  1. An efficient numerical method for solving the Boltzmann equation in multidimensions

    NASA Astrophysics Data System (ADS)

    Dimarco, Giacomo; Loubère, Raphaël; Narski, Jacek; Rey, Thomas

    2018-01-01

    In this paper we deal with the extension of the Fast Kinetic Scheme (FKS) (Dimarco and Loubère, 2013 [26]), originally constructed for solving the BGK equation, to the more challenging case of the Boltzmann equation. The scheme combines a robust and fast method for treating the transport part, based on an innovative Lagrangian technique, with conservative fast spectral schemes that treat the collision operator by means of an operator splitting approach. This approach, along with several implementation features related to the parallelization of the algorithm, permits the construction of an efficient simulation tool which is numerically tested against exact and reference solutions on classical problems arising in rarefied gas dynamics. We present results up to the 3D × 3D case for unsteady flows for the Variable Hard Sphere model, which may serve as a benchmark for future comparisons between different numerical methods for solving the multidimensional Boltzmann equation. For this reason, we also provide for each problem studied details on the computational cost and memory consumption, as well as comparisons with the BGK model and the limit model of the compressible Euler equations.
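
    The FKS transport discretization and the fast spectral collision solver are beyond a short sketch, but the operator-splitting structure itself can be made concrete with a toy 1D discrete-velocity BGK solver: advect each velocity sample, then relax toward the local Maxwellian. All numerical parameters below are illustrative.

      import numpy as np

      # Toy 1D discrete-velocity BGK solver built around the same splitting:
      # (1) transport each velocity sample with first-order upwind advection,
      # (2) relax toward the local Maxwellian (implicitly, for stability).
      nx, nv = 200, 32
      x = np.linspace(0.0, 1.0, nx, endpoint=False)
      v = np.linspace(-6.0, 6.0, nv)
      dx, dv = x[1] - x[0], v[1] - v[0]
      tau, dt, steps = 1e-2, 2e-4, 500

      def maxwellian(rho, u, T):
          return (rho / np.sqrt(2 * np.pi * T))[:, None] * np.exp(
              -(v[None, :] - u[:, None]) ** 2 / (2 * T[:, None]))

      # Density jump at x = 0.5, uniform unit temperature, zero mean velocity.
      rho0 = np.where(x < 0.5, 1.0, 0.125)
      f = maxwellian(rho0, np.zeros(nx), np.ones(nx))

      for _ in range(steps):
          # 1) transport step: upwind advection, periodic in x
          for j, vj in enumerate(v):
              upwind = np.roll(f[:, j], 1) if vj > 0 else np.roll(f[:, j], -1)
              f[:, j] -= abs(vj) * dt / dx * (f[:, j] - upwind)
          # 2) collision step: relaxation toward the local Maxwellian
          rho = f.sum(axis=1) * dv
          u = (f @ v) * dv / rho
          T = (f @ v**2) * dv / rho - u**2
          f = (f + dt / tau * maxwellian(rho, u, T)) / (1 + dt / tau)

      print("density range:", f.sum(axis=1).min() * dv, f.sum(axis=1).max() * dv)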

  2. Suppression of Alfvénic modes with off-axis NBI

    NASA Astrophysics Data System (ADS)

    Fredrickson, Eric; Bell, R.; Diallo, A.; Leblanc, B.; Podesta, M.; Levinton, F.; Yuh, H.; Liu, D.

    2016-10-01

    GAEs are seen on NSTX-U in the frequency range from 1 to 3 MHz with injection of the more perpendicular NSTX neutral beam sources. A new result is that injection of any of the new, more tangential neutral beam sources with tangency radii larger than the magnetic axis suppresses this GAE activity. Simulations of beam deposition and slowing down with the TRANSP code indicate that these new sources deposit fast ions with 0.9

  3. Nine time steps: ultra-fast statistical consistency testing of the Community Earth System Model (pyCECT v3.0)

    NASA Astrophysics Data System (ADS)

    Milroy, Daniel J.; Baker, Allison H.; Hammerling, Dorit M.; Jessup, Elizabeth R.

    2018-02-01

    The Community Earth System Model Ensemble Consistency Test (CESM-ECT) suite was developed as an alternative to requiring bitwise identical output for quality assurance. This objective test provides a statistical measurement of consistency between an accepted ensemble created by small initial temperature perturbations and a test set of CESM simulations. In this work, we extend the CESM-ECT suite with an inexpensive and robust test for ensemble consistency that is applied to Community Atmospheric Model (CAM) output after only nine model time steps. We demonstrate that adequate ensemble variability is achieved with instantaneous variable values at the ninth step, despite rapid perturbation growth and heterogeneous variable spread. We refer to this new test as the Ultra-Fast CAM Ensemble Consistency Test (UF-CAM-ECT) and demonstrate its effectiveness in practice, including its ability to detect small-scale events and its applicability to the Community Land Model (CLM). The new ultra-fast test facilitates CESM development, porting, and optimization efforts, particularly when used to complement information from the original CESM-ECT suite of tools.
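
    The released pyCECT implementation is PCA-based and considerably more careful; the following heavily simplified sketch with synthetic data only conveys the flavor of such an ensemble consistency test: standardize a test run against the accepted ensemble, project onto the ensemble's principal components, and fail the run if too many component scores fall outside the ensemble spread.

      import numpy as np

      # Heavily simplified consistency test on synthetic data (not pyCECT):
      # standardize the test run against the ensemble, project onto the
      # ensemble's principal components, and count out-of-bounds PC scores.
      rng = np.random.default_rng(1)
      n_ens, n_var = 150, 40
      ensemble = rng.normal(size=(n_ens, n_var))   # stand-in for global means
      test_run = rng.normal(size=n_var)            # consistent by construction

      mu = ensemble.mean(axis=0)
      sd = ensemble.std(axis=0, ddof=1)
      Z = (ensemble - mu) / sd
      _, _, Vt = np.linalg.svd(Z, full_matrices=False)   # principal directions

      scores_ens = Z @ Vt.T
      scores_test = ((test_run - mu) / sd) @ Vt.T
      sigma_pc = scores_ens.std(axis=0, ddof=1)

      n_fail = int(np.sum(np.abs(scores_test) > 2.0 * sigma_pc))
      print("PASS" if n_fail <= 3 else "FAIL", f"({n_fail} PC scores out of bounds)")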

  4. Simulating the Performance of Ground-Based Optical Asteroid Surveys

    NASA Astrophysics Data System (ADS)

    Christensen, Eric J.; Shelly, Frank C.; Gibbs, Alex R.; Grauer, Albert D.; Hill, Richard E.; Johnson, Jess A.; Kowalski, Richard A.; Larson, Stephen M.

    2014-11-01

    We are developing a set of asteroid survey simulation tools in order to estimate the capability of existing and planned ground-based optical surveys, and to test a variety of possible survey cadences and strategies. The survey simulator is composed of several layers, including a model population of solar system objects and an orbital integrator, a site-specific atmospheric model (including inputs for seeing, haze and seasonal cloud cover), a model telescope (with a complete optical path to estimate throughput), a model camera (including FOV, pixel scale, and focal plane fill factor) and model source extraction and moving object detection layers with tunable detection requirements. We have also developed a flexible survey cadence planning tool to automatically generate nightly survey plans. Inputs to the cadence planner include camera properties (FOV, readout time), telescope limits (horizon, declination, hour angle, lunar and zenithal avoidance), preferred and restricted survey regions in RA/Dec, ecliptic, and Galactic coordinate systems, and recent coverage by other asteroid surveys. Simulated surveys are created for a subset of current and previous NEO surveys (LINEAR, Pan-STARRS and the three Catalina Sky Survey telescopes), and compared against the actual performance of these surveys in order to validate the model’s performance. The simulator tracks objects within the FOV of any pointing that were not discovered (e.g. too few observations, too trailed, focal plane array gaps, too fast or slow), thus dividing the population into “discoverable” and “discovered” subsets, to inform possible survey design changes. Ongoing and future work includes generating a realistic “known” subset of the model NEO population, running multiple independent simulated surveys in coordinated and uncoordinated modes, and testing various cadences to find optimal strategies for detecting NEO sub-populations. These tools can also assist in quantifying the efficiency of novel yet unverified survey cadences (e.g. the baseline LSST cadence) that sparsely spread the observations required for detection over several days or weeks.

  5. 41 CFR 102-34.335 - How do I submit information to the General Services Administration (GSA) for the Federal Fleet...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... through the Federal Automotive Statistical Tool (FAST), an Internet-based reporting tool. To find out how to submit motor vehicle data to GSA through FAST, consult the instructions from your agency fleet...; and (5) Fuel used. Note to § 102-34.335: The FAST system is also used by agency Fleet Managers to...

  6. 41 CFR 102-34.335 - How do I submit information to the General Services Administration (GSA) for the Federal Fleet...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... through the Federal Automotive Statistical Tool (FAST), an Internet-based reporting tool. To find out how to submit motor vehicle data to GSA through FAST, consult the instructions from your agency fleet...; and (5) Fuel used. Note to § 102-34.335: The FAST system is also used by agency Fleet Managers to...

  7. 41 CFR 102-34.335 - How do I submit information to the General Services Administration (GSA) for the Federal Fleet...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... through the Federal Automotive Statistical Tool (FAST), an Internet-based reporting tool. To find out how to submit motor vehicle data to GSA through FAST, consult the instructions from your agency fleet...; and (5) Fuel used. Note to § 102-34.335: The FAST system is also used by agency Fleet Managers to...

  8. 41 CFR 102-34.335 - How do I submit information to the General Services Administration (GSA) for the Federal Fleet...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... through the Federal Automotive Statistical Tool (FAST), an Internet-based reporting tool. To find out how to submit motor vehicle data to GSA through FAST, consult the instructions from your agency fleet...; and (5) Fuel used. Note to § 102-34.335: The FAST system is also used by agency Fleet Managers to...

  9. 41 CFR 102-34.335 - How do I submit information to the General Services Administration (GSA) for the Federal Fleet...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... through the Federal Automotive Statistical Tool (FAST), an Internet-based reporting tool. To find out how to submit motor vehicle data to GSA through FAST, consult the instructions from your agency fleet...; and (5) Fuel used. Note to § 102-34.335: The FAST system is also used by agency Fleet Managers to...

  10. pypet: A Python Toolkit for Data Management of Parameter Explorations

    PubMed Central

    Meyer, Robert; Obermayer, Klaus

    2016-01-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines. PMID:27610080
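
    A minimal usage sketch, following the toolkit's documented first-steps pattern: define parameters with defaults, explore a Cartesian product of values, and let pypet store every run's result in HDF5. The trajectory name, file path and parameters here are our own examples, and API details may differ between pypet versions.

      from pypet import Environment, cartesian_product

      def multiply(traj):
          """A single run: parameters for this run arrive via the trajectory."""
          z = traj.x * traj.y
          traj.f_add_result('z', z=z, comment='product of the two parameters')

      env = Environment(trajectory='example', filename='./hdf5/example.hdf5')
      traj = env.trajectory

      # Define parameters with defaults, then explore a grid of values.
      traj.f_add_parameter('x', 1.0, comment='first factor')
      traj.f_add_parameter('y', 1.0, comment='second factor')
      traj.f_explore(cartesian_product({'x': [1.0, 2.0, 3.0], 'y': [4.0, 5.0]}))

      env.run(multiply)   # all six combinations run; results land in the HDF5 file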

  11. A Compact Synchronous Cellular Model of Nonlinear Calcium Dynamics: Simulation and FPGA Synthesis Results.

    PubMed

    Soleimani, Hamid; Drakakis, Emmanuel M

    2017-06-01

    Recent studies have demonstrated that calcium is a widespread intracellular ion that controls a wide range of temporal dynamics in the mammalian body. The simulation and validation of such studies using experimental data would benefit from a fast large-scale simulation and modelling tool. This paper presents a compact and fully reconfigurable cellular calcium model capable of mimicking the Hopf bifurcation phenomenon and various nonlinear responses of biological calcium dynamics. The proposed cellular model is synthesized on a digital platform for a single unit and a network model. Hardware synthesis, physical implementation on FPGA, and theoretical analysis confirm that the proposed cellular model can mimic the biological calcium behaviors with remarkably low hardware overhead. The approach has the potential to speed up large-scale simulations of slow intracellular dynamics by sharing hardware among more cellular units in real time. To this end, various networks constructed by pipelining 10k to 40k cellular calcium units are compared with an equivalent simulation run on a standard PC workstation. Results show that the cellular hardware model is, on average, 83 times faster than the CPU version.

  12. pypet: A Python Toolkit for Data Management of Parameter Explorations.

    PubMed

    Meyer, Robert; Obermayer, Klaus

    2016-01-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines.

  13. Using Delft3D to Simulate Current Energy Conversion

    NASA Astrophysics Data System (ADS)

    James, S. C.; Chartrand, C.; Roberts, J.

    2015-12-01

    As public concern with renewable energy increases, current energy conversion (CEC) technology is being developed to optimize energy output and minimize environmental impact. CEC turbines generate energy from tidal and current systems and create wakes that interact with turbines located downstream of a device. The placement of devices can greatly influence power generation and structural reliability. CECs can also alter the ecosystem processes surrounding the turbines, such as flow regimes, sediment dynamics, and water quality. Software is needed to investigate specific CEC sites to simulate power generation and the hydrodynamic response of flow through a CEC turbine array. This work validates Delft3D against several flume experiments by simulating the power generation and hydrodynamic response of flow through a turbine or actuator disc(s). Model parameters are then calibrated against these data sets to reproduce momentum removal and wake recovery data with 3-D flow simulations. Simulated wake profiles and turbulence intensities compare favorably to the experimental data and demonstrate the utility and accuracy of a fast-running tool for future siting and analysis of CEC arrays in complex domains.

  14. Inverse simulation system for manual-controlled rendezvous and docking based on artificial neural network

    NASA Astrophysics Data System (ADS)

    Zhou, Wanmeng; Wang, Hua; Tang, Guojin; Guo, Shuai

    2016-09-01

    The time-consuming experimental method for handling qualities assessment cannot meet the increasingly fast design requirements of manned space flight. As a tool for aircraft handling qualities research, the model-predictive-control structured inverse simulation (MPC-IS) has potential applications in the aerospace field for guiding astronauts' operations and evaluating handling qualities more effectively. Therefore, this paper establishes MPC-IS for manual-controlled rendezvous and docking (RVD) and proposes a novel artificial neural network inverse simulation system (ANN-IS) to further decrease the computational cost. The novel system was obtained by replacing the inverse model of MPC-IS with an artificial neural network. The optimal neural network was trained by the genetic Levenberg-Marquardt algorithm and finally determined by the Levenberg-Marquardt algorithm. In order to validate MPC-IS and ANN-IS, manual-controlled RVD experiments were carried out on the simulator. The comparisons between simulation results and experimental data demonstrated the validity of the two systems and the high computational efficiency of ANN-IS.

  15. AC loss modelling and experiment of two types of low-inductance solenoidal coils

    NASA Astrophysics Data System (ADS)

    Liang, Fei; Yuan, Weijia; Zhang, Min; Zhang, Zhenyu; Li, Jianwei; Venuturumilli, Sriharsha; Patel, Jay

    2016-11-01

    Low-inductance solenoidal coils, a term that usually refers to the nonintersecting type and the braid type, have already been employed to build superconducting fault current limiters (SFCLs) because of their fast recovery and low inductance. However, there is still no systematic simulation work on the AC loss characteristics of such coils wound with 2G high-temperature superconducting tapes, perhaps because of their complicated structure. In this paper, a new method is proposed to simulate both types of coils with 2D axisymmetric models solved by the H formulation. Following the simulation work, the AC losses of both types of low-inductance solenoidal coils are compared numerically and experimentally, which verifies that the model works well in simulating non-inductive coils. Finally, the simulations show that the pitch has a significant impact on the AC loss of both types of coils, and that the inter-layer separation affects the AC loss of the braid-type coil differently depending on the applied current. The model provides an effective tool for the design optimisation of SFCLs built with non-inductive solenoidal coils.

  16. Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics

    NASA Astrophysics Data System (ADS)

    Junghans, Christoph; Mniszewski, Susan; Voter, Arthur; Perez, Danny; Eidenbenz, Stephan

    2014-03-01

    We present an example of a new class of tools that we call application simulators, parameterized fast-running proxies of large-scale scientific applications using parallel discrete event simulation (PDES). We demonstrate our approach with a TADSim application simulator that models the Temperature Accelerated Dynamics (TAD) method, which is an algorithmically complex member of the Accelerated Molecular Dynamics (AMD) family. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We use TADSim to quickly characterize the runtime performance and algorithmic behavior for the otherwise long-running simulation code. We further extend TADSim to model algorithm extensions to standard TAD, such as speculative spawning of the compute-bound stages of the algorithm, and predict performance improvements without having to implement such a method. Focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights into the TAD algorithm behavior and suggested extensions to the TAD method.
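
    TADSim itself is not shown in this record; the sketch below is only the generic serial discrete-event core that such application simulators build on, with two hypothetical event handlers standing in for alternating TAD-like pipeline stages.

      import heapq

      # Minimal serial discrete-event core: events are (time, seq, handler)
      # tuples in a priority queue, processed strictly in time order; handlers
      # may schedule further events.
      class Simulator:
          def __init__(self):
              self.now, self._queue, self._seq = 0.0, [], 0

          def schedule(self, delay, handler):
              self._seq += 1            # tie-breaker keeps heap comparisons stable
              heapq.heappush(self._queue, (self.now + delay, self._seq, handler))

          def run(self, until=float('inf')):
              while self._queue and self._queue[0][0] <= until:
                  self.now, _, handler = heapq.heappop(self._queue)
                  handler(self)

      # Two hypothetical handlers standing in for alternating algorithm stages.
      def explore(sim):
          print(f"{sim.now:8.3f}  explore basin")
          sim.schedule(1.5, check)

      def check(sim):
          print(f"{sim.now:8.3f}  check transition")
          sim.schedule(2.0, explore)

      sim = Simulator()
      sim.schedule(0.0, explore)
      sim.run(until=10.0)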

  17. Progress on the DPASS project

    NASA Astrophysics Data System (ADS)

    Galkin, Sergei A.; Bogatu, I. N.; Svidzinski, V. A.

    2015-11-01

    A novel project to develop the Disruption Prediction And Simulation Suite (DPASS), a set of comprehensive computational tools to predict, model, and analyze disruption events in tokamaks, has recently been started at FAR-TECH Inc. DPASS will eventually address the following aspects of the disruption problem: MHD, plasma edge dynamics, plasma-wall interaction, and the generation and loss of runaway electrons. DPASS uses the 3-D Disruption Simulation Code (DSC-3D) as a core tool and will have a modular structure. DSC is a one-fluid, nonlinear, time-dependent 3D MHD code that simulates the dynamics of a tokamak plasma surrounded by a pure vacuum B-field in the real geometry of a conducting tokamak vessel. DSC utilizes an adaptive meshless technique, with adaptation to the moving plasma boundary, accurate magnetic flux conservation, and resolution of the plasma surface current. DSC also has an option to neglect the plasma inertia in order to eliminate the fast magnetosonic scale; this option can be turned on or off as needed. During Phase I of the project, two modules will be developed: a computational module for modeling massive gas injection and the main plasma response, and a module for nanoparticle plasma jet injection as an innovative disruption mitigation scheme. We will report on the progress of this development. Work is supported by US DOE SBIR grant # DE-SC0013727.

  18. Towards a Food Safety Knowledge Base Applicable in Crisis Situations and Beyond

    PubMed Central

    Falenski, Alexander; Weiser, Armin A.; Thöns, Christian; Appel, Bernd; Käsbohrer, Annemarie; Filter, Matthias

    2015-01-01

    In case of contamination in the food chain, fast action is required in order to reduce the number of affected people. In such situations, being able to predict the fate of agents in foods would help risk assessors and decision makers in assessing the potential effects of a specific contamination event and thus enable them to deduce the appropriate mitigation measures. One efficient strategy supporting this is using model-based simulations. However, application in crisis situations requires ready-to-use and easy-to-adapt models to be available from the so-called food safety knowledge bases. Here, we illustrate this concept and its benefits by applying the modular open source software tools PMM-Lab and FoodProcess-Lab. As a fictitious sample scenario, an intentional ricin contamination at a beef salami production facility was modelled. Predictive models describing the inactivation of ricin were reviewed, relevant models were implemented with PMM-Lab, and simulations on residual toxin amounts in the final product were performed with FoodProcess-Lab. Due to the generic and modular modelling concept implemented in these tools, they can be applied to simulate virtually any food safety contamination scenario. Apart from the application in crisis situations, the food safety knowledge base concept will also be useful in food quality and safety investigations. PMID:26247028
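
    As a hedged sketch of the kind of predictive model such a knowledge base collects, the classical log-linear inactivation model with a D-value and z-value takes only a few lines; the parameter values below are illustrative and are not a validated ricin model.

      # Log-linear inactivation: log10 N(t) = log10 N0 - t / D(T), with the
      # decimal reduction time D depending on temperature T through the z-value.
      # All parameter values are illustrative only.

      def d_value(temp_c, d_ref=5.0, t_ref=72.0, z=10.0):
          """Decimal reduction time (min) at temperature temp_c (deg C)."""
          return d_ref * 10 ** ((t_ref - temp_c) / z)

      def residual_log10(t_min, temp_c, log10_n0=6.0):
          """log10 of the surviving amount after t_min minutes at temp_c."""
          return log10_n0 - t_min / d_value(temp_c)

      for temp_c in (65.0, 72.0, 80.0):
          print(f"{temp_c:5.1f} C: log10 residual after 10 min = "
                f"{residual_log10(10.0, temp_c):.2f}")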

  19. SimPhospho: a software tool enabling confident phosphosite assignment.

    PubMed

    Suni, Veronika; Suomi, Tomi; Tsubosaka, Tomoya; Imanishi, Susumu Y; Elo, Laura L; Corthals, Garry L

    2018-03-27

    Mass spectrometry combined with enrichment strategies for phosphorylated peptides has been successfully employed for two decades to identify sites of phosphorylation. However, unambiguous phosphosite assignment is considered challenging. Given that site-specific phosphorylation events function as different molecular switches, validation of phosphorylation sites is of utmost importance. In our earlier study we developed a method based on simulated phosphopeptide spectral libraries, which enables highly sensitive and accurate phosphosite assignments. To promote more widespread use of this method, we here introduce a software implementation with improved usability and performance. We present SimPhospho, a fast and user-friendly tool for accurate simulation of phosphopeptide tandem mass spectra. Simulated phosphopeptide spectral libraries are used to validate and supplement database search results, with the goal of improving reliable phosphoproteome identification and reporting. The presented program can be easily used together with the Trans-Proteomic Pipeline and integrated into a phosphoproteomics data analysis workflow. SimPhospho is available for Windows, Linux and Mac operating systems at https://sourceforge.net/projects/simphospho/. It is open source and implemented in C++. A user's manual with a detailed description of data analysis using SimPhospho, as well as test data, can be found as supplementary material of this article. Supplementary data are available at https://www.btk.fi/research/computational-biomedicine/software/.

  20. Towards a Food Safety Knowledge Base Applicable in Crisis Situations and Beyond.

    PubMed

    Falenski, Alexander; Weiser, Armin A; Thöns, Christian; Appel, Bernd; Käsbohrer, Annemarie; Filter, Matthias

    2015-01-01

    In case of contamination in the food chain, fast action is required in order to reduce the number of affected people. In such situations, being able to predict the fate of agents in foods would help risk assessors and decision makers in assessing the potential effects of a specific contamination event and thus enable them to deduce the appropriate mitigation measures. One efficient strategy supporting this is using model-based simulations. However, application in crisis situations requires ready-to-use and easy-to-adapt models to be available from the so-called food safety knowledge bases. Here, we illustrate this concept and its benefits by applying the modular open source software tools PMM-Lab and FoodProcess-Lab. As a fictitious sample scenario, an intentional ricin contamination at a beef salami production facility was modelled. Predictive models describing the inactivation of ricin were reviewed, relevant models were implemented with PMM-Lab, and simulations on residual toxin amounts in the final product were performed with FoodProcess-Lab. Due to the generic and modular modelling concept implemented in these tools, they can be applied to simulate virtually any food safety contamination scenario. Apart from the application in crisis situations, the food safety knowledge base concept will also be useful in food quality and safety investigations.

  1. INL Experimental Program Roadmap for Thermal Hydraulic Code Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glenn McCreery; Hugh McIlroy

    2007-09-01

    Advanced computer modeling and simulation tools and protocols will be heavily relied on for a wide variety of system studies, engineering design activities, and other aspects of the Next Generation Nuclear Plant (NGNP) Very High Temperature Reactor (VHTR), the DOE Global Nuclear Energy Partnership (GNEP), and light-water reactors. The goal is for all modeling and simulation tools to be demonstrated accurate and reliable through a formal Verification and Validation (V&V) process, especially where such tools are to be used to establish safety margins and support regulatory compliance, or to design a system in a manner that reduces the role of expensive mockups and prototypes. Recent literature identifies specific experimental principles that must be followed in order to ensure that experimental data meet the standards required for a "benchmark" database. Even for well-conducted experiments, missing experimental details, such as geometrical definition, data reduction procedures, and manufacturing tolerances, have led to poor benchmark calculations. The INL has a long and deep history of research in thermal hydraulics, especially in the 1960s through 1980s when many programs such as LOFT and Semiscale were devoted to light-water reactor safety research, the EBR-II fast reactor was in operation, and a strong geothermal energy program was established. The past can serve as a partial guide for reinvigorating thermal hydraulic research at the laboratory. However, new research programs need to fully incorporate modern experimental methods such as measurement techniques using the latest instrumentation, computerized data reduction, and scaling methodology. The path forward for establishing experimental research for code model validation will require benchmark experiments conducted in suitable facilities located at the INL. This document describes thermal hydraulic facility requirements and candidate buildings and presents examples of suitable validation experiments related to VHTRs, sodium-cooled fast reactors, and light-water reactors. These experiments range from relatively low-cost benchtop experiments for investigating individual phenomena to large electrically heated integral facilities for investigating reactor accidents and transients.

  2. A Global System for Transportation Simulation and Visualization in Emergency Evacuation Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Wei; Liu, Cheng; Thomas, Neil

    2015-01-01

    Simulation-based studies are frequently used for evacuation planning and decision-making processes. Given the complexity of transportation systems and data availability, most evacuation simulation models focus on certain geographic areas. With the routine improvement of OpenStreetMap road networks and LandScan™ global population distribution data, we present WWEE, a uniform system for world-wide emergency evacuation simulations. WWEE uses a unified data structure for simulation inputs. It also integrates a super-node trip distribution model as the default simulation parameter to improve the system's computational performance. Two levels of visualization tools are implemented for evacuation performance analysis: link-based macroscopic visualization and vehicle-based microscopic visualization. For left-hand and right-hand traffic patterns in different countries, the authors propose a mirror technique to experiment with both scenarios without significantly changing the traffic simulation models. Ten cities in the US, Europe, the Middle East, and Asia are modeled for demonstration. With default traffic simulation models for fast and easy-to-use evacuation estimation and visualization, WWEE also retains the capability of interactive operation, allowing users to adopt customized traffic simulation models. For the first time, WWEE provides a unified platform for global evacuation researchers to estimate and visualize the performance of transportation systems under evacuation scenarios.

  3. Surgical model-view-controller simulation software framework for local and collaborative applications

    PubMed Central

    Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2010-01-01

    Purpose Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933

  4. Surgical model-view-controller simulation software framework for local and collaborative applications.

    PubMed

    Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2011-07-01

    Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.

  5. A Study on Fast Gates for Large-Scale Quantum Simulation with Trapped Ions

    PubMed Central

    Taylor, Richard L.; Bentley, Christopher D. B.; Pedernales, Julen S.; Lamata, Lucas; Solano, Enrique; Carvalho, André R. R.; Hope, Joseph J.

    2017-01-01

    Large-scale digital quantum simulations require thousands of fundamental entangling gates to construct the simulated dynamics. Despite success in a variety of small-scale simulations, quantum information processing platforms have hitherto failed to demonstrate the combination of precise control and scalability required to systematically outmatch classical simulators. We analyze how fast gates could enable trapped-ion quantum processors to achieve the requisite scalability to outperform classical computers without error correction. We analyze the performance of a large-scale digital simulator, and find that a fidelity of around 70% is realizable for π-pulse infidelities below 10−5 in traps subject to realistic rates of heating and dephasing. This scalability relies on fast gates: entangling gates faster than the trap period. PMID:28401945

  6. A Study on Fast Gates for Large-Scale Quantum Simulation with Trapped Ions.

    PubMed

    Taylor, Richard L; Bentley, Christopher D B; Pedernales, Julen S; Lamata, Lucas; Solano, Enrique; Carvalho, André R R; Hope, Joseph J

    2017-04-12

    Large-scale digital quantum simulations require thousands of fundamental entangling gates to construct the simulated dynamics. Despite success in a variety of small-scale simulations, quantum information processing platforms have hitherto failed to demonstrate the combination of precise control and scalability required to systematically outmatch classical simulators. We analyze how fast gates could enable trapped-ion quantum processors to achieve the requisite scalability to outperform classical computers without error correction. We analyze the performance of a large-scale digital simulator, and find that a fidelity of around 70% is realizable for π-pulse infidelities below 10−5 in traps subject to realistic rates of heating and dephasing. This scalability relies on fast gates: entangling gates faster than the trap period.

  7. libRoadRunner: a high performance SBML simulation and analysis library

    PubMed Central

    Somogyi, Endre T.; Bouteiller, Jean-Marie; Glazier, James A.; König, Matthias; Medley, J. Kyle; Swat, Maciej H.; Sauro, Herbert M.

    2015-01-01

    Motivation: This article presents libRoadRunner, an extensible, high-performance, cross-platform, open-source software library for the simulation and analysis of models expressed using Systems Biology Markup Language (SBML). SBML is the most widely used standard for representing dynamic networks, especially biochemical networks. libRoadRunner is fast enough to support large-scale problems such as tissue models, studies that require large numbers of repeated runs and interactive simulations. Results: libRoadRunner is a self-contained library, able to run both as a component inside other tools via its C++ and C bindings, and interactively through its Python interface. Its Python Application Programming Interface (API) is similar to the APIs of MATLAB (www.mathworks.com) and SciPy (http://www.scipy.org/), making it fast and easy to learn. libRoadRunner uses a custom Just-In-Time (JIT) compiler built on the widely used LLVM JIT compiler framework. It compiles SBML-specified models directly into native machine code for a variety of processors, making it appropriate for solving extremely large models or repeated runs. libRoadRunner is flexible, supporting the bulk of the SBML specification (except for delay and non-linear algebraic equations) including several SBML extensions (composition and distributions). It offers multiple deterministic and stochastic integrators, as well as tools for steady-state analysis, stability analysis and structural analysis of the stoichiometric matrix. Availability and implementation: libRoadRunner binary distributions are available for Mac OS X, Linux and Windows. The library is licensed under Apache License Version 2.0. libRoadRunner is also available for ARM-based computers such as the Raspberry Pi. http://www.libroadrunner.org provides online documentation, full build instructions, binaries and a git source repository. Contacts: hsauro@u.washington.edu or somogyie@indiana.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26085503
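
    A minimal usage sketch of the documented Python interface follows; 'model.xml' is a placeholder for any valid SBML document.

      import roadrunner

      # Load an SBML model and run a time course; libRoadRunner JIT-compiles
      # the model to native code behind this call.
      rr = roadrunner.RoadRunner('model.xml')

      result = rr.simulate(0, 10, 100)   # start time, end time, number of points
      print(result[:5])                  # structured array: time plus species columns

      rr.reset()                         # restore initial conditions for another run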

  8. libRoadRunner: a high performance SBML simulation and analysis library.

    PubMed

    Somogyi, Endre T; Bouteiller, Jean-Marie; Glazier, James A; König, Matthias; Medley, J Kyle; Swat, Maciej H; Sauro, Herbert M

    2015-10-15

    This article presents libRoadRunner, an extensible, high-performance, cross-platform, open-source software library for the simulation and analysis of models expressed using Systems Biology Markup Language (SBML). SBML is the most widely used standard for representing dynamic networks, especially biochemical networks. libRoadRunner is fast enough to support large-scale problems such as tissue models, studies that require large numbers of repeated runs and interactive simulations. libRoadRunner is a self-contained library, able to run both as a component inside other tools via its C++ and C bindings, and interactively through its Python interface. Its Python Application Programming Interface (API) is similar to the APIs of MATLAB (www.mathworks.com) and SciPy (http://www.scipy.org/), making it fast and easy to learn. libRoadRunner uses a custom Just-In-Time (JIT) compiler built on the widely used LLVM JIT compiler framework. It compiles SBML-specified models directly into native machine code for a variety of processors, making it appropriate for solving extremely large models or repeated runs. libRoadRunner is flexible, supporting the bulk of the SBML specification (except for delay and non-linear algebraic equations) including several SBML extensions (composition and distributions). It offers multiple deterministic and stochastic integrators, as well as tools for steady-state analysis, stability analysis and structural analysis of the stoichiometric matrix. libRoadRunner binary distributions are available for Mac OS X, Linux and Windows. The library is licensed under Apache License Version 2.0. libRoadRunner is also available for ARM-based computers such as the Raspberry Pi. http://www.libroadrunner.org provides online documentation, full build instructions, binaries and a git source repository. Contact: hsauro@u.washington.edu or somogyie@indiana.edu. Supplementary data are available at Bioinformatics online. Published by Oxford University Press 2015. This work is written by US Government employees and is in the public domain in the US.

  9. TU-H-CAMPUS-IeP1-01: Bias and Computational Efficiency of Variance Reduction Methods for the Monte Carlo Simulation of Imaging Detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, D; Badano, A; Sempau, J

    Purpose: Variance reduction techniques (VRTs) are employed in Monte Carlo simulations to obtain estimates with reduced statistical uncertainty for a given simulation time. In this work, we study the bias and efficiency of a VRT for estimating the response of imaging detectors. Methods: We implemented Directed Sampling (DS), preferentially directing a fraction of emitted optical photons directly towards the detector by altering the isotropic model. The weight of each optical photon is appropriately modified to keep simulation estimates unbiased. We use a Monte Carlo tool called fastDETECT2 (part of the hybridMANTIS open-source package) for optical transport, modified for VRT. The weight of each photon is calculated as the ratio of the original probability (no VRT) and the new probability for a particular direction. For our analysis of bias and efficiency, we use pulse height spectra, point response functions, and Swank factors. We obtain results for a variety of cases including analog (no VRT, isotropic distribution), and DS with 0.2 and 0.8 of the optical photons directed towards the sensor plane. We used 10,000 25-keV primaries. Results: The Swank factor for all cases in our simplified model converged fast (within the first 100 primaries) to a stable value of 0.9. The root mean square error per pixel of the point response function between the analog and VRT cases was approximately 5e-4. Conclusion: Our preliminary results suggest that DS VRT does not affect the estimate of the mean for the Swank factor. Our findings indicate that it may be possible to design VRTs for imaging detector simulations that increase computational efficiency without introducing bias.
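
    The weight calculation described in the Methods is easy to illustrate. The sketch below biases emission into a cone toward the sensor with probability f and weights each photon by the ratio of the analog (isotropic) pdf to the biased mixture pdf, which keeps the estimator unbiased; the fraction, cone angle and sample count are illustrative choices, not hybridMANTIS parameters.

      import numpy as np

      # Directed-sampling weights: with probability f emit into the cone of
      # half-angle theta_c toward the sensor, otherwise isotropically; the
      # weight p_analog / p_biased keeps the estimator unbiased.
      rng = np.random.default_rng(42)
      f, theta_c = 0.8, np.deg2rad(30.0)
      omega_cone = 2 * np.pi * (1 - np.cos(theta_c))   # cone solid angle, sr
      p_iso = 1 / (4 * np.pi)                          # analog (isotropic) pdf

      def sample_photon():
          if rng.random() < f:
              # uniform within the cone: cos(theta) uniform on [cos(theta_c), 1]
              mu = 1 - rng.random() * (1 - np.cos(theta_c))
          else:
              mu = 1 - 2 * rng.random()                # isotropic: mu on [-1, 1]
          # mixture pdf at this direction; the cone term applies only inside it
          q = (1 - f) * p_iso + (f / omega_cone if mu >= np.cos(theta_c) else 0.0)
          return mu, p_iso / q                         # direction cosine, weight

      weights = np.array([sample_photon()[1] for _ in range(100_000)])
      print(f"mean weight: {weights.mean():.4f} (unbiasedness requires ~1)")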

  10. Practical scheme for error control using feedback

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarovar, Mohan; Milburn, Gerard J.; Ahn, Charlene

    2004-05-01

    We describe a scheme for quantum-error correction that employs feedback and weak measurement rather than the standard tools of projective measurement and fast controlled unitary gates. The advantage of this scheme over previous protocols [for example, Ahn et al. Phys. Rev. A 65, 042301 (2001)], is that it requires little side processing while remaining robust to measurement inefficiency, and is therefore considerably more practical. We evaluate the performance of our scheme by simulating the correction of bit flips. We also consider implementation in a solid-state quantum-computation architecture and estimate the maximal error rate that could be corrected with current technology.

  11. NREL Software Aids Offshore Wind Turbine Designs (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2013-10-01

    NREL researchers are supporting offshore wind power development with computer models that allow detailed analyses of both fixed and floating offshore wind turbines. While existing computer-aided engineering (CAE) models can simulate the conditions and stresses that a land-based wind turbine experiences over its lifetime, offshore turbines require the additional considerations of variations in water depth, soil type, and wind and wave severity, which also necessitate the use of a variety of support-structure types. NREL's core wind CAE tool, FAST, models the additional effects of incident waves, sea currents, and the foundation dynamics of the support structures.

  12. Calibration of a portable HPGe detector using MCNP code for the determination of 137Cs in soils.

    PubMed

    Gutiérrez-Villanueva, J L; Martín-Martín, A; Peña, V; Iniguez, M P; de Celis, B; de la Fuente, R

    2008-10-01

    In situ gamma spectrometry provides a fast method to determine (137)Cs inventories in soils. To improve the accuracy of the estimates, one can use not only the information on the photopeak count rates but also on the peak to forward-scatter ratios. Before applying this procedure to field measurements, a calibration including several experimental simulations must be carried out in the laboratory. In this paper it is shown that Monte Carlo methods are a valuable tool to minimize the number of experimental measurements needed for the calibration.

  13. 3D printed microfluidic mixer for point-of-care diagnosis of anemia.

    PubMed

    Plevniak, Kimberly; Campbell, Matthew; Mei He

    2016-08-01

    3D printing has become an emerging fabrication tool in prototyping and manufacturing. We demonstrated 3D microfluidic simulation-guided computer design and 3D-printer prototyping for quick-turnaround development of microfluidic 3D mixers, which allow fast self-mixing of reagents with blood through capillary force. Combined with a smartphone, point-of-care diagnosis of anemia from finger-prick blood has been successfully implemented and showed results consistent with clinical measurements. With its 3D fabrication flexibility and smartphone compatibility, this work presents a novel diagnostic strategy for advancing personalized medicine and mobile healthcare.

  14. High Bandwidth Rotary Fast Tool Servos and a Hybrid Rotary/Linear Electromagnetic Actuator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montesanti, Richard Clement

    2005-09-01

    This thesis describes the development of two high-bandwidth short-stroke rotary fast tool servos and the hybrid rotary/linear electromagnetic actuator developed for one of them. Design insights, trade-off methodologies, and analytical tools are developed for precision mechanical systems, power and signal electronic systems, control systems, normal-stress electromagnetic actuators, and the dynamics of the combined systems.

  15. Fire training in a virtual-reality environment

    NASA Astrophysics Data System (ADS)

    Freund, Eckhard; Rossmann, Jurgen; Bucken, Arno

    2005-03-01

    Although fire is very common in our daily environment - as a source of energy at home or as a tool in industry - most people cannot estimate the danger of a conflagration. It is therefore important to train people in fighting fire. Besides training with propane simulators or real fires and real extinguishers, fire training can be performed in virtual reality, a pollution-free and fast way of training. In this paper we describe how to enhance a virtual-reality environment with real-time fire simulation and visualisation in order to establish a realistic emergency-training system. The presented approach supports extinguishing the virtual fire, including recordable performance data as needed in teletraining environments. We will show how to create realistic impressions of fire using advanced particle simulation and how to use the advantages of particles to trigger states in a modified cellular automaton used to simulate fire behaviour. Using particle systems that interact with cellular automata, it is possible to simulate a developing, spreading fire and its reaction to different extinguishing agents such as water, CO2 or oxygen. The methods proposed in this paper have been implemented and successfully tested on Cosimir, a commercial robot and VR simulation system.
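
    The paper couples particle rendering with a modified cellular automaton; the automaton idea alone can be sketched as a probabilistic fire-spread grid. The states, neighbourhood and spread probability below are arbitrary choices, not the paper's model.

      import numpy as np

      # Probabilistic fire-spread cellular automaton on a wrapped grid:
      # 0 = burnt, 1 = fuel, 2 = burning.  Fuel ignites with probability
      # p_spread per burning 4-neighbour; burning cells burn out each step.
      rng = np.random.default_rng(7)
      n, p_spread, steps = 50, 0.45, 60
      grid = np.ones((n, n), dtype=int)
      grid[n // 2, n // 2] = 2                  # ignition point

      for _ in range(steps):
          burning = grid == 2
          neighbours = (np.roll(burning, 1, 0) + np.roll(burning, -1, 0) +
                        np.roll(burning, 1, 1) + np.roll(burning, -1, 1))
          p_ignite = 1 - (1 - p_spread) ** neighbours
          ignite = (grid == 1) & (rng.random((n, n)) < p_ignite)
          grid[burning] = 0                     # burning cells burn out
          grid[ignite] = 2                      # fuel catches fire

      print(f"burnt fraction after {steps} steps: {(grid == 0).mean():.2f}")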

  16. Efficiency optimization of a fast Poisson solver in beam dynamics simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Dawei; Pöplau, Gisela; van Rienen, Ursula

    2016-01-01

    Calculating the solution of Poisson's equation for the space charge force is still the major time sink in beam dynamics simulations and calls for further improvement. In this paper, we summarize a classical fast Poisson solver used in beam dynamics simulations: the integrated Green's function method. We introduce three optimizations of the classical Poisson solver routine: using the reduced integrated Green's function instead of the integrated Green's function; using the discrete cosine transform instead of the discrete Fourier transform for the Green's function; and using a novel fast convolution routine instead of an explicitly zero-padded convolution. The new Poisson solver routine preserves the advantages of fast computation and high accuracy. This provides a fast routine for high-performance calculation of the space charge effect in accelerators.
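
    As a baseline for the three optimizations listed, the classical zero-padded (Hockney-style) Green's-function convolution can be written directly with FFTs. The sketch below solves a 2D free-space problem for a point charge; the grid size and units are chosen purely for illustration.

      import numpy as np

      # Hockney-style free-space Poisson solve: phi = G * rho via FFTs, with
      # zero padding to size 2n so the circular convolution equals the linear
      # one.  G = -ln(r) / (2*pi) is the 2D free-space Green's function.
      n, length = 128, 1.0
      h = length / n
      coords = (np.arange(2 * n) - n) * h            # padded, centered grid
      X, Y = np.meshgrid(coords, coords, indexing='ij')
      R = np.hypot(X, Y)
      G = -np.log(np.where(R > 0, R, h / 2)) / (2 * np.pi)   # regularize r = 0

      rho = np.zeros((n, n))
      rho[n // 2, n // 2] = 1.0 / h**2               # unit point charge

      rho_pad = np.zeros((2 * n, 2 * n))
      rho_pad[:n, :n] = rho

      # ifftshift aligns the kernel's r = 0 sample with index (0, 0)
      phi = np.fft.ifft2(np.fft.fft2(np.fft.ifftshift(G)) *
                         np.fft.fft2(rho_pad)).real[:n, :n] * h**2
      print(f"potential at the charge cell: {phi[n // 2, n // 2]:.3f}")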

  17. Next generation simulation tools: the Systems Biology Workbench and BioSPICE integration.

    PubMed

    Sauro, Herbert M; Hucka, Michael; Finney, Andrew; Wellock, Cameron; Bolouri, Hamid; Doyle, John; Kitano, Hiroaki

    2003-01-01

    Researchers in quantitative systems biology make use of a large number of different software packages for modelling, analysis, visualization, and general data manipulation. In this paper, we describe the Systems Biology Workbench (SBW), a software framework that allows heterogeneous application components--written in diverse programming languages and running on different platforms--to communicate and use each other's capabilities via a fast binary encoded-message system. Our goal was to create a simple, high-performance, open-source software infrastructure which is easy to implement and understand. SBW enables applications (potentially running on separate, distributed computers) to communicate via a simple network protocol. The interfaces to the system are encapsulated in client-side libraries that we provide for different programming languages. We describe in this paper the SBW architecture, a selection of current modules, including Jarnac, JDesigner, and SBWMeta-tool, and the close integration of SBW into BioSPICE, which enables both frameworks to share tools and complement and strengthen each other's capabilities.

  18. Iterative method for in situ measurement of lens aberrations in lithographic tools using CTC-based quadratic aberration model.

    PubMed

    Liu, Shiyuan; Xu, Shuang; Wu, Xiaofei; Liu, Wei

    2012-06-18

    This paper proposes an iterative method for in situ lens aberration measurement in lithographic tools based on a quadratic aberration model (QAM), a natural extension of the linear model formed by taking into account interactions among individual Zernike coefficients. By introducing a generalized operator named cross triple correlation (CTC), the quadratic model can be calculated quickly and accurately with the help of the fast Fourier transform (FFT). The Zernike coefficients up to the 37th order or even higher are determined by solving an inverse problem through an iterative procedure using several through-focus aerial images of a specially designed mask pattern. Simulations validate the theoretical derivation and confirm that the method is simple to implement and yields a high-quality wavefront estimate, particularly when the aberrations are relatively large. It is fully expected that this method will provide a useful practical means for in-line monitoring of the imaging quality of lithographic tools.
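
    The inversion loop described (iteratively fitting Zernike coefficients to through-focus images) has the shape of a Gauss-Newton iteration: linearize the forward image model around the current estimate and update by least squares. Below is a generic sketch with a stand-in quadratic forward model; the paper's actual model is built from CTC terms evaluated via FFT, and all sizes here are arbitrary:

      import numpy as np

      rng = np.random.default_rng(0)
      n_zern, n_pix = 9, 400
      B = rng.normal(size=(n_pix, n_zern))       # stand-in linear response
      C = rng.normal(size=(n_pix, n_zern))       # stand-in quadratic response

      def forward(z):
          # toy forward model with a mild quadratic term
          return B @ z + 0.1 * (C @ z) ** 2

      z_true = rng.normal(scale=0.05, size=n_zern)
      I_meas = forward(z_true)                   # "measured" aerial image

      z = np.zeros(n_zern)
      for it in range(20):
          r = I_meas - forward(z)
          # numerical Jacobian of the forward model at the current estimate
          J = np.empty((n_pix, n_zern))
          eps = 1e-6
          for j in range(n_zern):
              dz = np.zeros(n_zern); dz[j] = eps
              J[:, j] = (forward(z + dz) - forward(z)) / eps
          z += np.linalg.lstsq(J, r, rcond=None)[0]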

  19. Needs challenge software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-07-01

    New hardware and software tools build on existing platforms and add performance and ease-of-use benefits as the struggle to find and produce hydrocarbons at the lowest cost becomes more and more competitive. Software tools now provide geoscientists and petroleum engineers with a better understanding of reservoirs, from the shape and makeup of formations to behavior projections as hydrocarbons are extracted. Petroleum software tools allow scientists to simulate oil flow, predict the life expectancy of a reservoir, and even help determine how to extend the life and economic viability of the reservoir. The requirement of the petroleum industry to find and extract petroleum more efficiently drives the solutions provided by software and service companies. To one extent or another, most of the petroleum software products available today have achieved an acceptable level of competency. Innovative, high-impact products from small, focussed companies were often bought out by larger companies with deeper pockets if their developers couldn't fund their expansion. Other products disappeared from the scene because they were unable to evolve fast enough to compete. There are still enough small companies around producing excellent products to prevent the marketplace from feeling too narrow and lacking in choice. Oil companies requiring specific solutions to their problems have helped fund product development within the commercial sector. As the industry has matured, strategic alliances between vendors, both hardware and software, have provided market advantages, often combining strengths to enter new and undeveloped areas for technology. The pace of technological development has been fast and constant.

  20. Numerical modeling of axi-symmetrical cold forging process by ``Pseudo Inverse Approach''

    NASA Astrophysics Data System (ADS)

    Halouani, A.; Li, Y. M.; Abbes, B.; Guo, Y. Q.

    2011-05-01

    The incremental approach is widely used for forging process modeling; it gives good strain and stress estimation, but it is time consuming. A fast Inverse Approach (IA) has been developed for axi-symmetric cold forging modeling [1-2]. This approach exploits to the maximum the knowledge of the final part's shape, and the assumptions of proportional loading and simplified tool actions make the IA simulation very fast. The IA has proved very useful for tool design and optimization because of its rapidity and good strain estimation. However, the assumptions mentioned above cannot provide good stress estimation because the loading history is neglected. A new approach called "Pseudo Inverse Approach" (PIA) was proposed by Batoz, Guo et al. [3] for sheet forming modeling, which keeps the IA's advantages but gives good stress estimation by taking the loading history into consideration. Our aim in this paper is to adapt the PIA to cold forging modeling. The main developments in PIA are summarized as follows: a few intermediate configurations are generated for the given tool positions to account for the deformation history; the strain increment is calculated by the inverse method between the previous and current configurations; and an incremental algorithm of plastic integration is used in PIA instead of the total constitutive law used in the IA. An example is used to show the effectiveness and limitations of the PIA for cold forging process modeling.
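
    The "incremental algorithm of plastic integration" mentioned above is typically realized as a return-mapping update. A minimal 1D sketch with linear isotropic hardening, using illustrative material constants rather than anything from the paper:

      import numpy as np

      E, H, sigma_y0 = 200e3, 10e3, 250.0   # Young's modulus, hardening modulus, yield stress [MPa]

      def return_map(eps_n1, eps_p, alpha):
          # elastic trial state for the new total strain eps_n1
          sig_tr = E * (eps_n1 - eps_p)
          f_tr = abs(sig_tr) - (sigma_y0 + H * alpha)
          if f_tr <= 0.0:
              return sig_tr, eps_p, alpha            # purely elastic step
          dgamma = f_tr / (E + H)                    # plastic multiplier
          sig = sig_tr - E * dgamma * np.sign(sig_tr)
          return sig, eps_p + dgamma * np.sign(sig_tr), alpha + dgamma

      # drive with a monotonically increasing strain history
      eps_p, alpha = 0.0, 0.0
      for eps in np.linspace(0.0, 0.01, 50):
          sig, eps_p, alpha = return_map(eps, eps_p, alpha)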

  1. Astroinformatics in the Age of LSST: Analyzing the Summer 2012 Data Release

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.; De Lee, N. M.; Stassun, K.; Paegert, M.; Cargile, P.; Burger, D.; Bloom, J. S.; Richards, J.

    2013-01-01

    The Large Synoptic Survey Telescope (LSST) will image the visible southern sky every three nights. This multi-band, multi-epoch survey will produce a torrent of data, which traditional methods of object-by-object data analysis will not be able to accommodate; hence the need for new astroinformatics tools to visualize, simulate, mine, and analyze this quantity of data. The Berkeley Center for Time-Domain Informatics (CTDI) is building the informatics infrastructure for generic light curve classification, including the innovation of new algorithms for feature generation and machine learning. The CTDI portal (http://dotastro.org) contains one of the largest collections of public light curves, with visualization and exploration tools. The group has also published the first calibrated probabilistic classification catalog of 50k variable stars along with a data exploration portal called http://bigmacc.info. Twice a year, the LSST collaboration releases simulated LSST data in order to aid software development. This poster also showcases a suite of new tools from the Vanderbilt Initiative in Data-Intensive Astrophysics (VIDA), designed to take advantage of these large data sets. VIDA's Filtergraph interactive web tool allows one to instantly create an interactive data portal for fast, real-time visualization of large data sets. Filtergraph enables quick selection of interesting objects by easily filtering on many different columns, 2-D and 3-D representations, and on-the-fly arithmetic calculations on the data. It also makes sharing the data and the tool with collaborators very easy. The EB/RRL Factory is a neural-network based variable star classifier, which is designed to quickly identify variable stars in a variety of classes from LSST light curve data (currently tuned to Eclipsing Binaries and RR Lyrae stars), and to provide likelihood-based orbital elements or stellar parameters as appropriate. Finally, the LCsimulator software allows one to create simulated light curves of multiple types of variable stars based on an LSST cadence.

  2. Modeling the Transfer Function for the Dark Energy Survey

    DOE PAGES

    Chang, C.

    2015-03-04

    We present a forward-modeling simulation framework designed to model the data products from the Dark Energy Survey (DES). This forward-model process can be thought of as a transfer function—a mapping from cosmological/astronomical signals to the final data products used by the scientists. Using output from the cosmological simulations (the Blind Cosmology Challenge), we generate simulated images (the Ultra Fast Image Simulator) and catalogs representative of the DES data. In this work we demonstrate the framework by simulating the 244 deg² coadd images and catalogs in five bands for the DES Science Verification data. The simulation output is compared with the corresponding data to show that major characteristics of the images and catalogs can be captured. We also point out several directions of future improvements. Two practical examples—star-galaxy classification and proximity effects on object detection—are then used to illustrate how one can use the simulations to address systematics issues in data analysis. With clear understanding of the simplifications in our model, we show that one can use the simulations side-by-side with data products to interpret the measurements. This forward modeling approach is generally applicable for other upcoming and future surveys. It provides a powerful tool for systematics studies that is sufficiently realistic and highly controllable.

  3. Analyzing asteroid reflectance spectra with numerical tools based on scattering simulations

    NASA Astrophysics Data System (ADS)

    Penttilä, Antti; Väisänen, Timo; Markkanen, Johannes; Martikainen, Julia; Gritsevich, Maria; Muinonen, Karri

    2017-04-01

    We are developing a set of numerical tools that can be used in analyzing the reflectance spectra of granular materials such as the regolith surface of atmosphereless Solar system objects. Our goal is to be able to explain, with realistic numerical scattering models, the spectral features arising when materials are intimately mixed together. We include space-weathering-type effects in our simulations, i.e., locally mixing the host mineral with small inclusions of another material in small proportions. Our motivation for this study comes from the present lack of such tools. The current common practice is to apply a semi-physical approximate model such as some variation of the Hapke models [e.g., 1] or the Shkuratov model [2]. These models are expressed in closed form, so they are relatively fast to apply. They are based on simplifications of the radiative transfer theory. The problem is that the validity of the model is not always guaranteed, and the derived physical properties related to particle scattering can be unrealistic [3]. We base our numerical tool on a chain of scattering simulations. Scattering properties of small inclusions inside an absorbing host matrix can be derived using exact methods solving the Maxwell equations of the system. The next step, scattering by a single regolith grain, is solved using a geometric optics method accounting for surface reflections, internal absorption, and possibly internal diffuse scattering. The third step involves radiative transfer simulations of these regolith grains in a macroscopic planar element. The chain can then be continued with a shadowing simulation over the target surface elements, and finally by integrating the bidirectional reflectance distribution function over the object's shape. Most of the tools in the proposed chain already exist, and one practical task for us is to tie them together into an easy-to-use toolchain that can be publicly distributed. We plan to open the abovementioned toolchain as a web-based open service. Acknowledgments: The research is funded by the ERC Advanced Grant No. 320773 (SAEMPL) References: [1] B. Hapke, Icarus 195, 918-926, 2008. [2] Yu. Shkuratov et al, Icarus 137, 235-246, 1999. [3] Yu. Shkuratov et al, JQSRT 113, 2431-2456, 2012. [4] K. Muinonen et al, JQSRT 110, 1628-1639, 2009.

  4. FastMag: Fast micromagnetic simulator for complex magnetic structures (invited)

    NASA Astrophysics Data System (ADS)

    Chang, R.; Li, S.; Lubarda, M. V.; Livshitz, B.; Lomakin, V.

    2011-04-01

    A fast micromagnetic simulator (FastMag) for general problems is presented. FastMag solves the Landau-Lifshitz-Gilbert equation and can handle multiscale problems with high computational efficiency. The simulator derives its high performance from efficient methods for evaluating the effective field and from implementations on massively parallel graphics processing unit (GPU) architectures. FastMag discretizes the computational domain into tetrahedral elements and therefore is highly flexible for general problems. The magnetostatic field is computed via the superposition principle for both volume and surface parts of the computational domain. This is accomplished by implementing efficient quadrature rules and analytical integration for overlapping elements in which the integral kernel is singular. The discretized superposition integrals are computed using a nonuniform grid interpolation method, which evaluates the field from N sources at N collocated observers in O(N) operations. This approach can handle objects of arbitrary shape, allows easy calculation of the field outside the magnetized domains, does not require solving a linear system of equations, and requires little memory. FastMag is implemented on GPUs, with GPU-CPU speed-ups of two orders of magnitude. Simulations are shown of a large array of magnetic dots and a recording head fully discretized down to the exchange length, with over a hundred million tetrahedral elements, on an inexpensive desktop computer.

  5. ORBIT modelling of fast particle redistribution induced by sawtooth instability

    NASA Astrophysics Data System (ADS)

    Kim, Doohyun; Podestà, Mario; Poli, Francesca; Princeton Plasma Physics Laboratory Team

    2017-10-01

    Initial tests on NSTX-U show that introducing energy selectivity for sawtooth (ST) induced fast ion redistribution improves the agreement between experimental and simulated quantities, e.g. the neutron rate. Thus, it is expected that a proper description of the fast particle redistribution due to ST can improve the modelling of the ST instability and the interpretation of experiments using a transport code. In this work, we use the ORBIT code to characterise the redistribution of fast particles. In order to simulate a ST crash, a spatial and temporal displacement is implemented as $\xi(\rho, t, \theta, \phi) = \sum_{m,n} \xi_{mn}(\rho, t) \cos(m\theta + n\phi)$ to produce the perturbed magnetic field $\delta\vec{B} = \nabla \times (\vec{\xi} \times \vec{B})$ from the equilibrium field $\vec{B}$, which affects the fast particle distribution. From ORBIT simulations, we find suitable amplitudes of $\xi$ for each ST crash to reproduce the experimental results. The comparison of the simulation and the experimental results will be discussed, as well as the dependence of fast ion redistribution on the fast ion phase space variables (i.e. energy, magnetic moment and toroidal angular momentum). Work supported by the U.S. Department of Energy, Office of Science, Office of Fusion Energy Sciences under Contract Number DE-AC02-09CH11466.

  6. Numerical Study of the Plasticity-Induced Stabilization Effect on Martensitic Transformations in Shape Memory Alloys

    NASA Astrophysics Data System (ADS)

    Junker, Philipp; Hempel, Philipp

    2017-12-01

    It is well known that plastic deformations in shape memory alloys stabilize the martensitic phase. Furthermore, knowledge of the plastic state is crucial for a reliable sustainability analysis of construction parts. Numerical simulations serve as a tool for realistic investigation of the complex interactions between phase transformations and plastic deformations. To account for irreversible deformations, we expand an energy-based material model by including a non-linear isotropic hardening plasticity model. Implementation of this material model into commercial finite element programs, e.g., Abaqus, offers the opportunity to analyze entire structural components at low cost and with fast computation times. Along with the theoretical derivation and expansion of the model, several simulation results for various boundary value problems are presented and interpreted for improved structural design.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qin, Hong; Liu, Jian; Xiao, Jianyuan

    Particle-in-cell (PIC) simulation is the most important numerical tool in plasma physics. However, its long-term accuracy has not been established. To overcome this difficulty, we developed a canonical symplectic PIC method for the Vlasov-Maxwell system by discretising its canonical Poisson bracket. A fast local algorithm to solve the symplectic implicit time advance is discovered without root searching or global matrix inversion, enabling applications of the proposed method to very large-scale plasma simulations with many, e.g. 10⁹, degrees of freedom. The long-term accuracy and fidelity of the algorithm enables us to numerically confirm Mouhot and Villani's theory and conjecture on nonlinear Landau damping over several orders of magnitude using the PIC method, and to calculate the nonlinear evolution of the reflectivity during the mode conversion process from extraordinary waves to Bernstein waves.
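
    The long-term fidelity that motivates a symplectic discretization can be seen already in a one-particle toy problem: a symplectic (semi-implicit) Euler step keeps a harmonic oscillator's energy bounded for arbitrarily long runs, while explicit Euler blows up. This illustrates the property only; it is not the paper's PIC scheme:

      dt, steps = 0.1, 10000        # deliberately long run
      x_s, v_s = 1.0, 0.0           # symplectic Euler state
      x_e, v_e = 1.0, 0.0           # explicit Euler state

      for _ in range(steps):
          # symplectic (semi-implicit) Euler: update v first, then x with the new v
          v_s -= dt * x_s
          x_s += dt * v_s
          # explicit Euler: both updates from the old state
          x_e, v_e = x_e + dt * v_e, v_e - dt * x_e

      energy = lambda x, v: 0.5 * (x**2 + v**2)
      print(energy(x_s, v_s))   # stays O(1) for all time
      print(energy(x_e, v_e))   # grows without bound (factor (1 + dt^2) per step)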

  8. Contingency Analysis Post-Processing With Advanced Computing and Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Glaesemann, Kurt; Fitzhenry, Erin

    Contingency analysis is a critical function widely used in energy management systems to assess the impact of power system component failures. Its outputs are important for power system operation for improved situational awareness, power system planning studies, and power market operations. With the increased complexity of power system modeling and simulation caused by increased energy production and demand, the penetration of renewable energy and fast deployment of smart grid devices, and the trend of operating grids closer to their capacity for better efficiency, more and more contingencies must be executed and analyzed quickly in order to ensure grid reliability and accuracy for the power market. Currently, many researchers have proposed different techniques to accelerate the computational speed of contingency analysis, but not much work has been published on how to post-process the large amount of contingency outputs quickly. This paper proposes a parallel post-processing function that can analyze contingency analysis outputs faster and display them in a web-based visualization tool, helping power engineers improve their work efficiency through faster information digestion. Case studies using an ESCA-60 bus system and a WECC planning system are presented to demonstrate the functionality of the parallel post-processing technique and the web-based visualization tool.

  9. Challenges in process marginality for advanced technology nodes and tackling its contributors

    NASA Astrophysics Data System (ADS)

    Narayana Samy, Aravind; Schiwon, Roberto; Seltmann, Rolf; Kahlenberg, Frank; Katakamsetty, Ushasree

    2013-10-01

    Process margin is becoming critical as nodes shrink, because ArF lithography tools are operating at their physical resolution limit (Rayleigh's criterion). The k1 factor is pushed as low as practical to improve resolution while preserving process margin (k1 = 0.31 for 28 nm metal patterning). In this paper, we give an overview of the various contributors that limit process margins at advanced technology nodes and how the challenges have been tackled in a modern foundry model. Advanced OPC algorithms are used to make the design content on the mask optimal for patterning. However, working at the physical limit, critical features (hot-spots) are very susceptible to litho process variations. Furthermore, etch can have a significant impact as well: a pattern that still looks healthy at litho can fail due to etch interactions. As a result, the traditional 2D contour output from ORC tools cannot accurately predict all defects and hence cannot fully correct them in the early mask tapeout phase. Getting this prediction right makes a decisive difference for fast ramp-up and high yield in a competitive foundry market. We explain how the early introduction of 3D resist-model-based simulation of resist profiles (resist top-loss, bottom bridging, top-rounding, etc.) helped in our prediction and correction of hot-spots in the early 28 nm process development phase. The paper also discusses the other contributors to overall process window reduction, such as mask 3D effects and wafer topography (focus shifts/variations), and how these have been addressed with different simulation efforts in a fast and timely manner.
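
    For reference, the resolution limit invoked above is the Rayleigh criterion,

      $$ \mathrm{CD} = k_1 \, \frac{\lambda}{\mathrm{NA}}, $$

    so with ArF illumination ($\lambda = 193$ nm), the quoted $k_1 = 0.31$, and an assumed immersion numerical aperture of $\mathrm{NA} = 1.35$ (the NA is not stated in the abstract), the minimum half-pitch is roughly $0.31 \times 193 / 1.35 \approx 44$ nm, consistent with 28 nm-node metal patterning and with the very small process margin that remains.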

  10. On Parallelizing Single Dynamic Simulation Using HPC Techniques and APIs of Commercial Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diao, Ruisheng; Jin, Shuangshuang; Howell, Frederic

    Time-domain simulations are heavily used in today's planning and operation practices to assess power system transient stability and post-transient voltage/frequency profiles following severe contingencies, in order to comply with industry standards. Because of the increased modeling complexity, state-of-the-art commercial packages are several times slower than real time when completing a dynamic simulation for a large-scale model. With the growing stochastic behavior introduced by emerging technologies, the power industry has seen a growing need for performing security assessment in real time. This paper presents a parallel implementation framework to speed up a single dynamic simulation by leveraging the existing stability model library in commercial tools through their application programming interfaces (APIs). Several high performance computing (HPC) techniques are explored, such as parallelizing the calculation of generator current injection, identifying fast linear solvers for the network solution, and parallelizing data outputs when interacting with the APIs of the commercial package TSAT. The proposed method has been tested on a WECC planning base case with detailed synchronous generator models and exhibits outstanding scalable performance with sufficient accuracy.
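
    The first of those techniques, parallelizing per-generator current injections, maps naturally onto a process pool because the injections are independent within a time step. A sketch of the pattern with a hypothetical stand-in for the per-generator evaluation (this is not TSAT's actual API):

      from multiprocessing import Pool

      def compute_injection(gen_state):
          # hypothetical placeholder for the per-generator model evaluation
          gen_id, voltage, emf = gen_state
          return gen_id, (emf - voltage) * 10.0

      if __name__ == "__main__":
          gen_states = [(i, 1.0, 1.05 + 0.01 * i) for i in range(1000)]
          with Pool(processes=8) as pool:
              currents = dict(pool.map(compute_injection, gen_states))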

  11. Predicting fruit fly's sensing rate with insect flight simulations.

    PubMed

    Chang, Song; Wang, Z Jane

    2014-08-05

    Without sensory feedback, flies cannot fly. Exactly how various feedback controls work in insects is a complex puzzle to solve. What do insects measure to stabilize their flight? How often and how fast must insects adjust their wings to remain stable? To gain insights into algorithms used by insects to control their dynamic instability, we develop a simulation tool to study free flight. To stabilize flight, we construct a control algorithm that modulates wing motion based on discrete measurements of the body-pitch orientation. Our simulations give theoretical bounds on both the sensing rate and the delay time between sensing and actuation. Interpreting our findings together with experimental results on fruit flies' reaction time and sensory motor reflexes, we conjecture that fruit flies sense their kinematic states every wing beat to stabilize their flight. We further propose a candidate for such a control involving the fly's haltere and first basalar motor neuron. Although we focus on fruit flies as a case study, the framework for our simulation and discrete control algorithms is applicable to studies of both natural and man-made fliers.
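
    A toy version of such a discrete-sensing controller: unstable pitch dynamics stabilized by a PD law whose measurement is refreshed only once per "wing beat". The growth rate, gains, and beat interval below are illustrative, not fitted to fruit-fly data:

      lam = 50.0                  # instability growth rate [1/s]
      dt = 1e-4                   # integration step [s]
      T_beat = 4e-3               # sensing interval ~ one wing beat (250 Hz)
      Kp, Kd = 2.0e4, 6.0e2       # feedback gains

      theta, omega, u = 0.02, 0.0, 0.0      # initial pitch offset [rad]
      for i in range(int(0.5 / dt)):
          if i % int(T_beat / dt) == 0:     # sample the state once per wing beat
              u = -Kp * theta - Kd * omega
          omega += (lam**2 * theta + u) * dt   # unstable dynamics + control torque
          theta += omega * dt
      # theta decays for small T_beat but diverges if sensing is too infrequent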

  12. MDAnalysis: a toolkit for the analysis of molecular dynamics simulations.

    PubMed

    Michaud-Agrawal, Naveen; Denning, Elizabeth J; Woolf, Thomas B; Beckstein, Oliver

    2011-07-30

    MDAnalysis is an object-oriented library for structural and temporal analysis of molecular dynamics (MD) simulation trajectories and individual protein structures. It is written in the Python language, with some performance-critical code in C. It uses the powerful NumPy package to expose trajectory data as fast and efficient NumPy arrays. It has been tested on systems of millions of particles. Many common file formats of simulation packages, including CHARMM, Gromacs, Amber, and NAMD, and the Protein Data Bank format can be read and written. Atoms can be selected with a syntax similar to CHARMM's powerful selection commands. MDAnalysis enables both novice and experienced programmers to rapidly write their own analysis tools and to access data stored in trajectories easily, facilitating interactive exploratory analysis. MDAnalysis has been tested on and works for most Unix-based platforms such as Linux and Mac OS X. It is freely available under the GNU General Public License from http://mdanalysis.googlecode.com. Copyright © 2011 Wiley Periodicals, Inc.
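
    A minimal usage sketch based on the library's documented API (the file names are placeholders for a real topology and trajectory):

      import MDAnalysis as mda

      u = mda.Universe("topology.psf", "trajectory.dcd")
      ca = u.select_atoms("protein and name CA")   # CHARMM-style selection

      for ts in u.trajectory:                      # iterate over frames
          print(ts.time, ca.radius_of_gyration())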

  13. Quasi-optical analysis of a far-infrared spatio-spectral space interferometer concept

    NASA Astrophysics Data System (ADS)

    Bracken, C.; O'Sullivan, C.; Murphy, J. A.; Donohoe, A.; Savini, G.; Lightfoot, J.; Juanola-Parramon, R.; Fisica Consortium

    2016-07-01

    FISICA (Far-Infrared Space Interferometer Critical Assessment) was a three-year study of a far-infrared spatio-spectral double-Fourier interferometer concept. One of the aims of the FISICA study was to set out a baseline optical design for such a system, and to use a model of the system to simulate realistic telescope beams for use with an end-to-end instrument simulator. This paper describes a two-telescope (plus hub) baseline optical design that fulfils the requirements of the FISICA science case while minimising the optical mass of the system. A number of different modelling techniques were required for the analysis: fast approximate simulation tools such as ray tracing and Gaussian beam methods were employed for initial analysis, with GRASP physical optics used for higher accuracy in the final analysis. Results are shown for the predicted far-field patterns of the telescope primary mirrors under illumination by smooth-walled rectangular feed horns. Far-field patterns for both on-axis and off-axis detectors are presented and discussed.

  14. Numerical simulation of water hammer in low pressurized pipe: comparison of SimHydraulics and Lax-Wendroff method with experiment

    NASA Astrophysics Data System (ADS)

    Himr, D.

    2013-04-01

    This article describes the simulation of unsteady flow during water hammer with two programs that use different numerical approaches to solve the ordinary one-dimensional differential equations describing the dynamics of hydraulic elements and pipes. The first is Matlab-Simulink-SimHydraulics, commercial software developed to solve the dynamics of general hydraulic systems, which defines them with block elements. The other program, called HYDRA, is based on the Lax-Wendroff numerical method, which serves as a tool to solve the momentum and continuity equations; it was developed in Matlab by Brno University of Technology. Experimental measurements were performed on a simple test rig consisting of an elastic pipe with strong damping connecting two reservoirs. Water hammer is induced by fast closing of the valve. Physical properties of the liquid and pipe elasticity parameters were considered in both simulations; the two simulations are in very good agreement, and differences from the experimental data are minimal.
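
    For illustration, a Lax-Wendroff update for the linearized, frictionless water hammer equations, $\partial H/\partial t + (a^2/(gA))\,\partial Q/\partial x = 0$ and $\partial Q/\partial t + gA\,\partial H/\partial x = 0$, fits in a few lines. All parameters below are illustrative, not those of the test rig or of HYDRA:

      import numpy as np

      a, g = 1200.0, 9.81           # wave speed [m/s], gravity [m/s^2]
      D = 0.05                      # pipe diameter [m]
      A = np.pi * D**2 / 4          # cross-sectional area [m^2]
      L, N = 20.0, 200              # pipe length [m], number of nodes
      dx = L / (N - 1)
      dt = 0.9 * dx / a             # CFL condition for stability
      M = np.array([[0.0, a**2 / (g * A)],
                    [g * A, 0.0]])  # linear system matrix
      M2 = M @ M

      H = np.full(N, 30.0)          # initial head [m]
      Q = np.full(N, 1.0e-3)        # initial steady flow [m^3/s]

      for step in range(2000):
          u = np.vstack((H, Q))                        # state (2, N)
          du = u[:, 2:] - u[:, :-2]                    # centered difference
          d2u = u[:, 2:] - 2 * u[:, 1:-1] + u[:, :-2]  # second difference
          u[:, 1:-1] += (-dt / (2 * dx)) * (M @ du) \
                        + (dt**2 / (2 * dx**2)) * (M2 @ d2u)
          H, Q = u
          H[0] = 30.0               # upstream reservoir: fixed head
          Q[-1] = 0.0               # instantaneous valve closure at the outlet
          Q[0], H[-1] = Q[1], H[-2] # simple extrapolated boundary values

    As a sanity check, the peak head rise at the valve should approach the Joukowsky estimate $\Delta H = a\,\Delta V / g$.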

  15. Simulating pump-probe photoelectron and absorption spectroscopy on the attosecond timescale with time-dependent density functional theory.

    PubMed

    De Giovannini, Umberto; Brunetto, Gustavo; Castro, Alberto; Walkenhorst, Jessica; Rubio, Angel

    2013-05-10

    Molecular absorption and photoelectron spectra can be efficiently predicted with real-time time-dependent density functional theory. We show herein how these techniques can be easily extended to study time-resolved pump-probe experiments, in which a system response (absorption or electron emission) to a probe pulse is measured in an excited state. This simulation tool helps with the interpretation of fast-evolving attosecond time-resolved spectroscopic experiments, in which electronic motion must be followed at its natural timescale. We show how the extra degrees of freedom (pump-pulse duration, intensity, frequency, and time delay), which are absent in a conventional steady-state experiment, provide additional information about electronic structure and dynamics that improve characterization of a system. As an extension of this approach, time-dependent 2D spectroscopy can also be simulated, in principle, for large-scale structures and extended systems. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Fast Quantum Algorithm for Predicting Descriptive Statistics of Stochastic Processes

    NASA Technical Reports Server (NTRS)

    Williams, Colin P.

    1999-01-01

    Stochastic processes are used as a modeling tool in several sub-fields of physics, biology, and finance. Analytic understanding of the long-term behavior of such processes is only tractable for very simple types of stochastic processes such as Markovian processes. However, in real-world applications more complex stochastic processes often arise. In physics, the complicating factor might be nonlinearities; in biology it might be memory effects; and in finance it might be the non-random intentional behavior of participants in a market. In the absence of analytic insight, one is forced to understand these more complex stochastic processes via numerical simulation techniques. In this paper we present a quantum algorithm for performing such simulations. In particular, we show how a quantum algorithm can predict arbitrary descriptive statistics (moments) of N-step stochastic processes in just $O(\sqrt{N})$ time. That is, the quantum complexity is the square root of the classical complexity for performing such simulations. This is a significant speedup in comparison to the current state of the art.

  17. Improved FastICA algorithm in fMRI data analysis using the sparsity property of the sources.

    PubMed

    Ge, Ruiyang; Wang, Yubao; Zhang, Jipeng; Yao, Li; Zhang, Hang; Long, Zhiying

    2016-04-01

    As a blind source separation technique, independent component analysis (ICA) has many applications in functional magnetic resonance imaging (fMRI). Although either temporal or spatial prior information has been introduced into constrained ICA and semi-blind ICA methods to improve the performance of ICA in fMRI data analysis, certain types of additional prior information, such as sparsity, have seldom been added to ICA algorithms as constraints. In this study, we propose SparseFastICA, a method that adds source sparsity as a constraint to the FastICA algorithm to improve the performance of the widely used FastICA. The source sparsity is estimated through a smoothed ℓ0 norm method. We performed experiments on both simulated data and real fMRI data to investigate the feasibility and robustness of SparseFastICA and made a performance comparison between SparseFastICA, FastICA, and Infomax ICA. Results on both simulated and real fMRI data demonstrated the feasibility and robustness of SparseFastICA for source separation in fMRI data, and showed that SparseFastICA has better robustness to noise and better spatial detection power than FastICA. Although the spatial detection power of SparseFastICA and Infomax did not differ significantly, SparseFastICA had faster computation speed than Infomax. More importantly, SparseFastICA outperformed FastICA in robustness and spatial detection power and can be used to identify more accurate brain networks than the FastICA algorithm. Copyright © 2016 Elsevier B.V. All rights reserved.
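
    The smoothed ℓ0 idea referenced above replaces the count of nonzero entries with a differentiable Gaussian surrogate. A sketch of the estimator in the spirit of that method (the exact form used by SparseFastICA may differ):

      import numpy as np

      def smoothed_l0(s, sigma):
          # Smoothed l0 "norm": counts entries that are large relative to sigma.
          # As sigma -> 0 this approaches the exact number of nonzeros.
          return len(s) - np.sum(np.exp(-s**2 / (2 * sigma**2)))

      s = np.array([0.0, 0.001, 2.0, -3.0, 0.0])
      print(smoothed_l0(s, sigma=0.01))   # ~= 2, the number of significant entries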

  18. Laser-heating and Radiance Spectrometry for the Study of Nuclear Materials in Conditions Simulating a Nuclear Power Plant Accident.

    PubMed

    Manara, Dario; Soldi, Luca; Mastromarino, Sara; Boboridis, Kostantinos; Robba, Davide; Vlahovic, Luka; Konings, Rudy

    2017-12-14

    Major and severe accidents have occurred three times in nuclear power plants (NPPs): at Three Mile Island (USA, 1979), Chernobyl (former USSR, 1986), and Fukushima (Japan, 2011). Research on the causes, dynamics, and consequences of these accidents has been performed in a few laboratories worldwide over the last three decades. Common goals of such research activities are: the prevention of these kinds of accidents, both in existing and potential new nuclear power plants; the minimization of their eventual consequences; and ultimately, a full understanding of the real risks connected with NPPs. At the European Commission Joint Research Centre's Institute for Transuranium Elements, a laser-heating and fast radiance spectro-pyrometry facility is used for the laboratory simulation, on a small scale, of NPP core meltdown, the most common type of severe accident (SA) that can occur in a nuclear reactor as a consequence of a failure of the cooling system. This simulation tool permits fast and effective high-temperature measurements on real nuclear materials, such as plutonium and minor actinide-containing fission fuel samples. In this respect, and in its capability to produce large amounts of data concerning materials under extreme conditions, the current experimental approach is certainly unique. For current and future NPP concepts, example results are presented on the melting behavior of several different types of nuclear fuel: uranium-plutonium oxides, carbides, and nitrides. Results on the high-temperature interaction of oxide fuels with containment materials are also briefly shown.

  19. Automatic optimization of well locations in a North Sea fractured chalk reservoir using a front tracking reservoir simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rian, D.T.; Hage, A.

    1994-12-31

    A numerical simulator is often used as a reservoir management tool. One of its main purposes is to aid in the evaluation of the number of wells, well locations, and start times for wells. Traditionally, the optimization of a field development is done by a manual trial-and-error process. In this paper, an example of an automated technique is given. The core of the automation process is the reservoir simulator Frontline. Frontline is based on front-tracking techniques, which makes it fast and accurate compared to traditional finite difference simulators. Due to its CPU efficiency, the simulator has been coupled with an optimization module, which enables automatic optimization of well locations, number of wells, and start-up times. The simulator was used as an alternative method in the evaluation of waterflooding in a North Sea fractured chalk reservoir. Since Frontline is, in principle, 2D, Buckley-Leverett pseudo functions were used to represent the third dimension. The areal full field simulation model was run with up to 25 wells for 20 years in less than one minute of Vax 9000 CPU time. The automatic Frontline evaluation indicated that a peripheral waterflood could double incremental recovery compared to a central pattern drive.

  20. Cognitive learning: a machine learning approach for automatic process characterization from design

    NASA Astrophysics Data System (ADS)

    Foucher, J.; Baderot, J.; Martinez, S.; Dervilllé, A.; Bernard, G.

    2018-03-01

    Cutting-edge innovation requires accurate and fast process control to achieve fast learning rates and industry adoption. The tools currently available for such tasks are mainly manual and user-dependent. In this paper we present cognitive learning, a new machine-learning-based technique that facilitates and speeds up complex characterization by using the design as input, providing fast training and detection times. We focus on the machine learning framework that enables object detection, defect traceability, and automatic measurement tools.

  1. Examining the validity of the ActivPAL monitor in measuring posture and ambulatory movement in children.

    PubMed

    Aminian, Saeideh; Hinckson, Erica A

    2012-10-02

    Decreasing sedentary activities that involve prolonged sitting may be an important strategy to reduce obesity and other physical and psychosocial health problems in children. The first step to understanding the effect of sedentary activities on children's health is to objectively assess these activities with a valid measurement tool. This study examined the validity of the ActivPAL monitor in measuring sitting/lying, standing, and walking time, transition counts, and step counts in children in a laboratory setting. Twenty-five healthy elementary school children (age 9.9 ± 0.3 years; BMI 18.2 ± 1.9; mean ± SD) were randomly recruited across the Auckland region, New Zealand. Children were fitted with ActivPAL monitors and observed during simulated free-living activities involving sitting/lying, standing, and walking, followed by treadmill and over-ground activities at various speeds (slow, normal, fast), against video observation (the criterion measure). The ActivPAL sit-to-stand and stand-to-sit transition counts and steps were also compared with video data, and the accuracy of step counts measured by the ActivPAL was compared against the New Lifestyles NL-2000 and the Yamax Digi-Walker SW-200 pedometers. We observed a perfect correlation between the ActivPAL monitor and direct observation for time spent sitting/lying, standing, and walking in simulated free-living activities. Correlations between the ActivPAL and video observation in total numbers of sit-to-stand and stand-to-sit transitions were high (r = 0.99 ± 0.01). Unlike the pedometers, the ActivPAL did not misclassify fidgeting as steps taken. Strong correlations (r = 0.88-1.00) between ActivPAL step counts and video observation in both treadmill and over-ground slow and normal walking were also observed, but during treadmill and over-ground fast walking and running the correlations were low (r = 0.21-0.46). The ActivPAL monitor is a valid measurement tool for assessing time spent sitting/lying, standing, and walking, sit-to-stand and stand-to-sit transition counts, and step counts in slow and normal walking. The device did not accurately measure steps taken during treadmill and over-ground fast walking and running in children.

  2. Graphical Tests for Power Comparison of Competing Designs.

    PubMed

    Hofmann, H; Follett, L; Majumder, M; Cook, D

    2012-12-01

    Lineups have been established as tools for visual testing similar to standard statistical inference tests, allowing us to evaluate the validity of graphical findings in an objective manner. In simulation studies lineups have been shown to be efficient: the power of visual tests is comparable to classical tests while being much less stringent in terms of the distributional assumptions made. This makes lineups versatile, yet powerful, tools in situations where conditions for regular statistical tests are not or cannot be met. In this paper we introduce lineups as a tool for evaluating the power of competing graphical designs. We highlight some of the theoretical properties and then show results from two studies evaluating competing designs; both studies are designed to push the limits of our perceptual abilities to highlight differences between designs. We use both accuracy and speed of evaluation as measures of a successful design. The first study compares the choice of coordinate system: polar versus Cartesian coordinates. The results show strong support in favor of Cartesian coordinates for finding fast and accurate answers when spotting patterns. The second study is aimed at finding shift differences between distributions. Both studies are motivated by data problems that we have recently encountered and explore using simulated data to evaluate the plot designs under controlled conditions. Amazon Mechanical Turk (MTurk) was used to conduct the studies. The lineups provide an effective mechanism for objectively evaluating plot designs.

  3. Experimental studies on brain hematoma detection and oxygenation monitoring using PRM/NIR sensors

    NASA Astrophysics Data System (ADS)

    Zheng, Liu; Lee, Hyo Sang; Wilson, David A.; Hanley, Daniel F.; Lokos, Sandor; Kim, Jin

    1997-08-01

    Real-time noninvasive head injury detection is needed in critical care facilities and at triage sites with limited resources. One tool missing right now is a small and fast noninvasive sensor that can help urgent care workers to (1) diagnose the location and severity of the injury, (2) perform on-site pre-hospital treatment if necessary, and (3) decide what kind of further medical action is needed. On the other hand, continuous monitoring of cerebral blood oxygenation is also needed in intensive care units and in operating rooms. The pseudo-random modulation/near-infrared sensor (PRM/NIR sensor) has been developed to address these issues. It relies on advanced techniques in diode laser cw modulation and time-resolved spectroscopy to perform fast and noninvasive brain tissue diagnostics. Phantom experiments have been conducted to study the feasibility of the sensor. The brain's optical properties are simulated with solutions of intralipid and ink. Hematomas are simulated with bags of paint and hemoglobin of various sizes, depths, and orientations immersed in the solution. Effects of human skull and hair are studied experimentally. In an animal experiment, the sensor was used to monitor cerebral oxygenation changes due to hypercapnia, hypoxia, and hyperventilation. Good correlations were found between NIR measurement parameters and the physiological changes induced in the animals.

  4. Kinetics of inactivation and dilution effects on the mass balance of fungal phytopathogens in anaerobic digesters.

    PubMed

    Plöchl, Matthias; Heiermann, Monika; Rodemann, Bernd; Bandte, Martina; Büttner, Carmen

    2014-01-15

    Knowledge of the fate and behavior of plant pathogens in the biogas production chain is limited, which hampers the estimation and evaluation of the potential phytosanitary risk if digestate is spread on arable land as a fertilizer. Therefore, simulation is an appropriate tool to demonstrate the effects that influence the steady state of pathogen-infected plant material in both digesters and digestate. Simple approaches to the kinetics of inactivation and the mass balances of infected material were carried out, considering single-step as well as two-step digestion. The simulation revealed a very fast to fast reduction of infected material after a single feeding, reaching less than 1% of input within 4 days even for D90-values of 68 h. Steady-state mass balances below the input rate could be achieved with D90-values of less than 2 h at continuous hourly feeding. At higher D90-values, steady-state mass balances exceed the input rate but are still clearly below the cumulative input mass. Dilution further decreases mass balances to 10⁻⁵ to 10⁻⁶ Mg m⁻³ for first-step digestion and 10⁻⁸ to 10⁻⁹ for the second step. Copyright © 2013 Elsevier Ltd. All rights reserved.
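
    For context, the D90-value is the standard decimal-reduction time. Assuming first-order inactivation kinetics, the surviving infectious fraction follows

      $$ N(t) = N_0 \cdot 10^{-t/D_{90}}, $$

    so each interval of length $D_{90}$ cuts the remaining load tenfold; the continuous feeding and dilution dynamics modelled in the paper then shift the resulting steady state further.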

  5. Gamma oscillations in a nonlinear regime: a minimal model approach using heterogeneous integrate-and-fire networks.

    PubMed

    Bathellier, Brice; Carleton, Alan; Gerstner, Wulfram

    2008-12-01

    Fast oscillations, and in particular gamma-band oscillations (20-80 Hz), are commonly observed during brain function and are at the center of several neural processing theories. In many cases, mathematical analysis of fast oscillations in neural networks has focused on the transition between irregular and oscillatory firing, viewed as an instability of the asynchronous activity. But in fact, brain slice experiments as well as detailed simulations of biological neural networks have produced a large corpus of results concerning the properties of fully developed oscillations that are far from this transition point. We propose here a mathematical approach to deal with nonlinear oscillations in a network of heterogeneous or noisy integrate-and-fire neurons connected by strong inhibition. This approach involves limited mathematical complexity and gives a good sense of the oscillation mechanism, making it an interesting tool to understand fast rhythmic activity in simulated or biological neural networks. A surprising result of our approach is that under some conditions, a change in the strength of inhibition only weakly influences the period of the oscillation. This is in contrast to standard theoretical and experimental models of interneuron network gamma oscillations (ING), where frequency depends tightly on inhibition strength, but it is similar to observations made in some in vitro preparations in the hippocampus and the olfactory bulb and in some detailed network models. This result is explained by the phenomenon of suppression, which is known to occur in strongly coupled oscillating inhibitory networks but had not yet been related to the behavior of oscillation frequency.
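
    A minimal toy of the setting analyzed above: heterogeneous leaky integrate-and-fire neurons coupled by delayed, strong, shared inhibition, which is enough to produce a fast network rhythm. All constants are illustrative; this is not the authors' model:

      import numpy as np

      rng = np.random.default_rng(0)
      N = 200                            # inhibitory interneurons
      dt, T = 1e-4, 0.5                  # time step, duration [s]
      tau, v_th, v_reset = 10e-3, 1.0, 0.0
      I_ext = rng.normal(1.5, 0.1, N)    # heterogeneous suprathreshold drive
      g_inh, delay_s = 0.3, 2e-3         # inhibition strength, synaptic delay
      delay = int(delay_s / dt)          # delay in steps

      v = rng.uniform(0.0, 1.0, N)
      buf = [np.zeros(N, bool) for _ in range(delay)]
      pop = []

      for step in range(int(T / dt)):
          # inhibition from spikes emitted `delay` steps ago, shared network-wide
          inh = g_inh * buf[step % delay].sum() / N
          v += dt / tau * (I_ext - v) - inh
          spikes = v >= v_th
          v[spikes] = v_reset
          buf[step % delay] = spikes
          pop.append(int(spikes.sum()))
      # periodic peaks in `pop` are the fast population rhythm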

  6. Development and validation of a two-dimensional fast-response flood estimation model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judi, David R; Mcpherson, Timothy N; Burian, Steven J

    2009-01-01

    A finite difference formulation of the shallow water equations using an upwind differencing method was developed, maintaining computational efficiency and accuracy such that it can be used as a fast-response flood estimation tool. The model was validated using both laboratory controlled experiments and an actual dam breach. Through the laboratory experiments, the model was shown to give good estimates of depth and velocity when compared to the measured data, as well as when compared to a more complex two-dimensional model. Additionally, the model was compared to high water mark data obtained from the failure of the Taum Sauk dam. The simulated inundation extent agreed well with the observed extent, with the most notable differences resulting from the inability to model sediment transport. The results of these validation studies show that a relatively simple numerical scheme used to solve the complete shallow water equations can be used to accurately estimate flood inundation. Future work will focus on further reducing the computation time needed to provide flood inundation estimates for fast-response analyses. This will be accomplished through the efficient use of multi-core, multi-processor computers coupled with an efficient domain-tracking algorithm, as well as an understanding of the impacts of grid resolution on model results.
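
    As an illustration of the approach (not the paper's scheme, which is a 2D upwind finite-difference formulation), a first-order 1D shallow-water dam-break solver with a Rusanov (local Lax-Friedrichs) flux, a simple upwind-type flux, fits in a few lines; all numbers are illustrative:

      import numpy as np

      g = 9.81
      N, L = 400, 100.0
      dx = L / N
      h = np.where(np.arange(N) * dx < 50.0, 2.0, 0.5)  # dam-break initial depth
      hu = np.zeros(N)                                   # momentum h*u

      def flux(h, hu):
          u = hu / h
          return np.array([hu, hu * u + 0.5 * g * h**2])

      t, t_end = 0.0, 5.0
      while t < t_end:
          u = hu / h
          c = np.abs(u) + np.sqrt(g * h)        # characteristic speed
          dt = 0.4 * dx / c.max()               # CFL time step
          U = np.array([h, hu])
          F = flux(h, hu)
          # Rusanov numerical flux at interfaces i+1/2
          a = np.maximum(c[:-1], c[1:])
          Fi = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * a * (U[:, 1:] - U[:, :-1])
          U[:, 1:-1] -= dt / dx * (Fi[:, 1:] - Fi[:, :-1])
          h, hu = U
          h[0], h[-1] = h[1], h[-2]             # open boundaries
          hu[0], hu[-1] = hu[1], hu[-2]
          t += dt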

  7. Development of a Wind Plant Large-Eddy Simulation with Measurement-Driven Atmospheric Inflow: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quon, Eliot; Churchfield, Matthew; Cheung, Lawrence

    This paper details the development of an aeroelastic wind plant model with large-eddy simulation (LES). The chosen LES solver is the Simulator for Wind Farm Applications (SOWFA) based on the OpenFOAM framework, coupled to NREL's comprehensive aeroelastic analysis tool, FAST. An atmospheric boundary layer (ABL) precursor simulation was constructed based on assessments of meteorological tower, lidar, and radar data over a 3-hour window. This precursor was tuned to the specific atmospheric conditions that occurred both prior to and during the measurement campaign, enabling capture of a night-to-day transition in the turbulent ABL. In the absence of height-varying temperature measurements, spatially averaged radar data were sufficient to characterize the atmospheric stability of the wind plant in terms of the shear profile, and near-ground temperature sensors provided a reasonable estimate of the ground heating rate describing the morning transition. A full aeroelastic simulation was then performed for a subset of turbines within the wind plant, driven by the precursor. Analysis of two turbines within the array, one directly waked by the other, demonstrated good agreement with measured time-averaged loads.

  8. Development of a Wind Plant Large-Eddy Simulation with Measurement-Driven Atmospheric Inflow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quon, Eliot W.; Churchfield, Matthew J.; Cheung, Lawrence

    This paper details the development of an aeroelastic wind plant model with large-eddy simulation (LES). The chosen LES solver is the Simulator for Wind Farm Applications (SOWFA) based on the OpenFOAM framework, coupled to NREL's comprehensive aeroelastic analysis tool, FAST. An atmospheric boundary layer (ABL) precursor simulation was constructed based on assessments of meteorological tower, lidar, and radar data over a 3-hour window. This precursor was tuned to the specific atmospheric conditions that occurred both prior to and during the measurement campaign, enabling capture of a night-to-day transition in the turbulent ABL. In the absence of height-varying temperature measurements, spatially averaged radar data were sufficient to characterize the atmospheric stability of the wind plant in terms of the shear profile, and near-ground temperature sensors provided a reasonable estimate of the ground heating rate describing the morning transition. A full aeroelastic simulation was then performed for a subset of turbines within the wind plant, driven by the precursor. Analysis of two turbines within the array, one directly waked by the other, demonstrated good agreement with measured time-averaged loads.

  9. An integrated system for dissolution studies and magnetic resonance imaging of controlled release, polymer-based dosage forms-a tool for quantitative assessment of hydrogel formation processes.

    PubMed

    Kulinowski, Piotr; Dorozyński, Przemysław; Jachowicz, Renata; Weglarz, Władysław P

    2008-11-04

    Controlled release (CR) dosage forms are often based on polymeric matrices, e.g., sustained-release tablets and capsules. It is crucial to visualise and quantify the processes of hydrogel formation during the standard dissolution study. A method for imaging CR, polymer-based dosage forms during an in vitro dissolution study is presented. Imaging was performed non-invasively by means of magnetic resonance imaging (MRI). The study was designed to simulate in vivo conditions regarding temperature, volume, state, and composition of the dissolution media. Two formulations of hydrodynamically balanced systems (HBS) were chosen as model CR dosage forms. HBS release the active substance in the stomach while floating on the surface of the gastric content. Time evolutions of the diffusion region, hydrogel formation region, and "dry core" region were obtained during a dissolution study of L-dopa as a model drug in two simulated gastric fluids (i.e., in fed and fasted state). This method seems to be a very promising tool for examining properties of new CR, polymer-based formulations or for comparing generic and originator dosage forms before carrying out bioequivalence studies.

  10. A novel methodology for building robust design rules by using design based metrology (DBM)

    NASA Astrophysics Data System (ADS)

    Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan

    2013-03-01

    This paper addresses a methodology for building robust design rules by using design based metrology (DBM). The conventional method for building design rules uses a simulation tool and a simple-pattern spider mask. At the early stage of a device, the accuracy of the simulation tool is poor, and the evaluation of the simple-pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a huge number of pattern situations including various 1D and 2D design structures. To overcome the difficulty of inspecting many types of patterns, we introduced the Design Based Metrology (DBM) of Nano Geometry Research, Inc., with which these mass patterns could be inspected at high speed. We also carried out quantitative analysis on PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules. All of the test patterns were inspected within a few hours. Mass silicon data were handled not by personal judgment but by statistical processing. From the results, robust design rules are successfully verified and extracted. Finally, we found that our methodology is appropriate for building robust design rules.

  11. Simulation of atmospheric and terrestrial background signatures for detection and tracking scenarios

    NASA Astrophysics Data System (ADS)

    Schweitzer, Caroline; Stein, Karin

    2015-10-01

    In the field of early warning, one depends on reliable image exploitation: only if the applied detection and tracking algorithms work efficiently can the threat-approach alert be given fast enough to ensure automatic initiation of the countermeasure. In order to evaluate the performance of those algorithms for a certain electro-optical (EO) sensor system, test sequences need to be created that are as realistic and comprehensive as possible. Since both background and target signature depend on the environmental conditions, detailed knowledge of the meteorology and climatology is necessary. Trials measuring these environmental characteristics serve as a solid basis, but may only represent the conditions during a rather short period of time. To represent the full variation of meteorology and climatology that the future system will be exposed to, the application of comprehensive atmospheric modelling tools is essential. This paper gives an introduction to the atmospheric modelling tools currently used at Fraunhofer IOSB to simulate spectral background signatures in the infrared (IR) range. It is also demonstrated how those signatures are affected by changing atmospheric and climatic conditions. In conclusion, and with a special focus on the modelling of different cloud types, sources of error and limits are discussed.

  12. Measurement and simulation of passive fast-ion D-alpha emission from the DIII-D tokamak

    DOE PAGES

    Bolte, Nathan G.; Heidbrink, William W.; Pace, David; ...

    2016-09-14

    Spectra of passive fast-ion D-alpha (FIDA) light from beam ions that charge exchange with background neutrals are measured and simulated. The fast ions come from three sources: ions that pass through the diagnostic sightlines on their first full orbit, an axisymmetric confined population, and ions that are expelled into the edge region by instabilities. A passive FIDA simulation (P-FIDASIM) is developed as a forward model for the spectra of the first-orbit fast ions and consists of an experimentally-validated beam deposition model, an ion orbit-following code, a collisional-radiative model, and a synthetic spectrometer. Model validation consists of the simulation of 86 experimental spectra that are obtained using 6 different neutral beam fast-ion sources and 13 different lines of sight. Calibrated spectra are used to estimate the neutral density throughout the cross-section of the tokamak. The resulting 2D neutral density shows the expected increase toward each X-point, with average neutral densities of 8 × 10⁹ cm⁻³ at the plasma boundary and 1 × 10¹¹ cm⁻³ near the wall. Fast ions that are on passing orbits are expelled by the sawtooth instability more readily than trapped ions. In a sample discharge, approximately 1% of the fast-ion population is ejected into the high neutral density region per sawtooth crash.

  13. Constraining anomalous Higgs boson couplings to the heavy-flavor fermions using matrix element techniques

    NASA Astrophysics Data System (ADS)

    Gritsan, Andrei V.; Röntsch, Raoul; Schulze, Markus; Xiao, Meng

    2016-09-01

    In this paper, we investigate anomalous interactions of the Higgs boson with heavy fermions, employing shapes of kinematic distributions. We study the processes $pp \to t\bar{t}+H$, $b\bar{b}+H$, $tq+H$, and $pp \to H \to \tau^{+}\tau^{-}$ and present applications of event generation, reweighting techniques for fast simulation of anomalous couplings, as well as matrix element techniques for optimal sensitivity. We extend the matrix element likelihood approach (MELA) technique, which proved to be a powerful matrix element tool for Higgs boson discovery and characterization during Run I of the LHC, and implement all analysis tools in the JHU generator framework. A next-to-leading-order QCD description of the $pp \to t\bar{t}+H$ process allows us to investigate the performance of the MELA in the presence of extra radiation. Finally, projections for LHC measurements through the end of Run III are presented.

  14. Remote Visualization and Remote Collaboration On Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Watson, Val; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    A new technology has been developed for remote visualization that provides remote, 3D, high resolution, dynamic, interactive viewing of scientific data (such as fluid dynamics simulations or measurements). Based on this technology, some World Wide Web sites on the Internet are providing fluid dynamics data for educational or testing purposes. This technology is also being used for remote collaboration in joint university, industry, and NASA projects in computational fluid dynamics and wind tunnel testing. Previously, remote visualization of dynamic data was done using video format (transmitting pixel information) such as video conferencing or MPEG movies on the Internet. The concept for this new technology is to send the raw data (e.g., grids, vectors, and scalars) along with viewing scripts over the Internet and have the pixels generated by a visualization tool running on the viewer's local workstation. The visualization tool that is currently used is FAST (Flow Analysis Software Toolkit).

  15. Fast micromagnetic simulations on GPU: recent advances made with mumax3

    NASA Astrophysics Data System (ADS)

    Leliaert, J.; Dvornik, M.; Mulkers, J.; De Clercq, J.; Milošević, M. V.; Van Waeyenberge, B.

    2018-03-01

    In the last twenty years, numerical modeling has become an indispensable part of magnetism research. It has become a standard tool both for the exploration of new systems and for the interpretation of experimental data. In the last five years, the capabilities of micromagnetic modeling have dramatically increased due to the deployment of graphical processing units (GPUs), which have sped up calculations by up to a factor of 200. This has enabled many studies which were previously unfeasible. In this topical review, we give an overview of this modeling approach and show how it has contributed to the forefront of current magnetism research.
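
    What a micromagnetic code such as mumax3 integrates is the Landau-Lifshitz-Gilbert equation; the single-spin NumPy sketch below shows the Landau-Lifshitz form of the right-hand side with a naive explicit Euler loop. The damping and field values are hypothetical, and mumax3 itself uses adaptive Runge-Kutta schemes on the GPU over full finite-difference grids.

      import numpy as np

      GAMMA = 1.7595e11        # gyromagnetic ratio [rad/(s T)]
      ALPHA = 0.02             # Gilbert damping (hypothetical)

      def llg_rhs(m, b_eff):
          """dm/dt in the Landau-Lifshitz form of the LLG equation."""
          pre = -GAMMA / (1.0 + ALPHA**2)
          mxb = np.cross(m, b_eff)
          return pre * (mxb + ALPHA * np.cross(m, mxb))

      m = np.array([1.0, 0.0, 0.0])          # unit magnetization
      b = np.array([0.0, 0.0, 0.1])          # 0.1 T applied field
      dt = 1e-13                             # [s]; explicit Euler for clarity only
      for _ in range(10_000):
          m = m + dt * llg_rhs(m, b)
          m /= np.linalg.norm(m)             # keep |m| = 1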

  16. Active thermography in qualitative evaluation of protective materials.

    PubMed

    Gralewicz, Grzegorz; Wiecek, Bogusław

    2009-01-01

    This study examines the possibilities of qualitative evaluation of protective materials with active thermography. It presents a simulation of periodic excitation of a multilayer composite material. Tests were conducted with lock-in thermography on a Kevlar composite consisting of 16 layers of Kevlar fabric reinforced with formaldehyde resin, with implanted delamination defects. Lock-in thermography is a versatile tool for nondestructive evaluation: it is a fast, remote and nondestructive procedure. Hence, it was used to detect delaminations in the composite structure of materials used in the production of components designed for personal protection. This method directly contributes to an improvement in safety.
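
    The lock-in principle is per-pixel demodulation of the recorded frame stack at the excitation frequency; a minimal NumPy sketch on synthetic data (all frequencies, image sizes and noise levels here are hypothetical):

      import numpy as np

      f_lockin = 0.5           # excitation frequency [Hz] (hypothetical)
      fs = 25.0                # camera frame rate [Hz] (hypothetical)
      n_frames = 500           # 20 s = exactly 10 excitation periods
      t = np.arange(n_frames) / fs

      # Synthetic frame stack (n_frames, 64, 64): modulated response + noise
      rng = np.random.default_rng(2)
      frames = (1.0 + 0.1*np.sin(2*np.pi*f_lockin*t)[:, None, None]
                + 0.02*rng.standard_normal((n_frames, 64, 64)))

      # Correlate every pixel with sine/cosine references at f_lockin
      ref_s = np.sin(2*np.pi*f_lockin*t)
      ref_c = np.cos(2*np.pi*f_lockin*t)
      s_img = np.tensordot(ref_s, frames, axes=(0, 0)) * 2.0 / n_frames
      c_img = np.tensordot(ref_c, frames, axes=(0, 0)) * 2.0 / n_frames
      amplitude = np.hypot(s_img, c_img)     # amplitude image (~0.1 here)
      phase = np.arctan2(c_img, s_img)       # phase image; defects show as contrast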

  18. Gas Gun Model and Comparison to Experimental Performance of Pipe Guns Operating with Light Propellant Gases and Large Cryogenic Pellets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Combs, S. K.; Reed, J. R.; Lyttle, M. S.

    2016-01-01

    Injection of multiple large (~10 to 30 mm diameter) shattered pellets into ITER plasmas is presently part of the scheme planned to mitigate the deleterious effects of disruptions on the vessel components. To help in the design and optimize performance of the pellet injectors for this application, a model referred to as “the gas gun simulator” has been developed and benchmarked against experimental data. The computer code simulator is a Java program that models the gas-dynamics characteristics of a single-stage gas gun. Following a stepwise approach, the code utilizes a variety of input parameters to incrementally simulate and analyze the dynamics of the gun as the projectile is launched down the barrel. Using input data, the model can calculate gun performance based on physical characteristics, such as propellant-gas and fast-valve properties, barrel geometry, and pellet mass. Although the model is fundamentally generic, the present version is configured to accommodate cryogenic pellets composed of H2, D2, Ne, Ar, and mixtures of them and light propellant gases (H2, D2, and He). The pellets are solidified in situ in pipe guns that consist of stainless steel tubes and fast-acting valves that provide the propellant gas for pellet acceleration (to speeds ~200 to 700 m/s). The pellet speed is the key parameter in determining the response time of a shattered pellet system to a plasma disruption event. The calculated speeds from the code simulations of experiments were typically in excellent agreement with the measured values. With the gas gun simulator validated for many test shots and over a wide range of physical and operating parameters, it is a valuable tool for optimization of the injector design, including the fast valve design (orifice size and volume) for any operating pressure (~40 bar expected for the ITER application) and barrel length for any pellet size (mass, diameter, and length). Key design parameters and proposed values for the pellet injectors for the ITER disruption mitigation systems are discussed.
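
    A stripped-down version of the same stepwise approach (ideal propellant gas expanding adiabatically behind the pellet, integrated increment by increment down the barrel) fits in a few lines of Python. All dimensions and the pellet mass below are hypothetical, and friction and heat transfer are ignored; the actual simulator is a Java code with far more physics.

      import numpy as np

      GAMMA = 5.0/3.0          # ratio of specific heats (helium propellant)
      P0 = 40e5                # initial propellant pressure [Pa] (~40 bar)
      V0 = 5e-4                # fast-valve/reservoir volume [m^3] (hypothetical)
      BORE = 0.02              # barrel bore diameter [m] (hypothetical)
      L_BARREL = 1.0           # barrel length [m] (hypothetical)
      M_PELLET = 4.0e-3        # pellet mass [kg] (hypothetical)
      P_ATM = 1e5              # back pressure ahead of the pellet [Pa]

      A = np.pi * (BORE / 2.0)**2
      x, v, dt = 0.0, 0.0, 1e-6
      while x < L_BARREL:
          p = P0 * (V0 / (V0 + A*x))**GAMMA          # adiabatic expansion behind pellet
          a = max(p - P_ATM, 0.0) * A / M_PELLET     # no friction or heat transfer here
          v += a * dt
          x += v * dt
      print(f"muzzle speed ~ {v:.0f} m/s")           # ~630 m/s for these inputs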

  19. Development of a Robust and Efficient Parallel Solver for Unsteady Turbomachinery Flows

    NASA Technical Reports Server (NTRS)

    West, Jeff; Wright, Jeffrey; Thakur, Siddharth; Luke, Ed; Grinstead, Nathan

    2012-01-01

    The traditional design and analysis practice for advanced propulsion systems relies heavily on expensive full-scale prototype development and testing. Over the past decade, use of high-fidelity analysis and design tools such as CFD early in the product development cycle has been identified as one way to alleviate testing costs and to develop these devices better, faster and cheaper. In the design of advanced propulsion systems, CFD plays a major role in defining the required performance over the entire flight regime, as well as in testing the sensitivity of the design to the different modes of operation. Increased emphasis is being placed on developing and applying CFD models to simulate the flow field environments and performance of advanced propulsion systems. This necessitates the development of next generation computational tools which can be used effectively and reliably in a design environment. The turbomachinery simulation capability presented here is being developed in a computational tool called Loci-STREAM [1]. It integrates proven numerical methods for generalized grids and state-of-the-art physical models in a novel rule-based programming framework called Loci [2] which allows: (a) seamless integration of multidisciplinary physics in a unified manner, and (b) automatic handling of massively parallel computing. The objective is to be able to routinely simulate problems involving complex geometries requiring large unstructured grids and complex multidisciplinary physics. An immediate application of interest is simulation of unsteady flows in rocket turbopumps, particularly in cryogenic liquid rocket engines. The key components of the overall methodology presented in this paper are the following: (a) high fidelity unsteady simulation capability based on Detached Eddy Simulation (DES) in conjunction with second-order temporal discretization, (b) compliance with Geometric Conservation Law (GCL) in order to maintain conservative property on moving meshes for second-order time-stepping scheme, (c) a novel cloud-of-points interpolation method (based on a fast parallel kd-tree search algorithm) for interfaces between turbomachinery components in relative motion which is demonstrated to be highly scalable, and (d) demonstrated accuracy and parallel scalability on large grids (approx 250 million cells) in full turbomachinery geometries.
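
    The cloud-of-points idea, interpolating field values across a sliding turbomachinery interface from the nearest donor points found by a kd-tree search, can be sketched in a few lines; scipy's cKDTree and the inverse-distance weights below stand in for Loci-STREAM's scalable parallel implementation, and all names and values are illustrative.

      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(3)
      donor_xyz = rng.random((10_000, 3))    # donor-cell centroids (e.g., rotor side)
      donor_val = np.sin(donor_xyz @ np.array([4.0, 2.0, 1.0]))  # donor field values
      recv_xyz = rng.random((2_000, 3))      # receiver points (e.g., stator side)

      tree = cKDTree(donor_xyz)              # rebuild as the relative position changes
      dist, idx = tree.query(recv_xyz, k=8)  # 8 nearest donors per receiver
      w = 1.0 / np.maximum(dist, 1e-12)      # inverse-distance weights
      w /= w.sum(axis=1, keepdims=True)
      recv_val = (w * donor_val[idx]).sum(axis=1)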

  20. Kinetic-MHD hybrid simulation of fishbone modes excited by fast ions on the experimental advanced superconducting tokamak (EAST)

    NASA Astrophysics Data System (ADS)

    Pei, Youbin; Xiang, Nong; Hu, Youjun; Todo, Y.; Li, Guoqiang; Shen, Wei; Xu, Liqing

    2017-03-01

    Kinetic-magnetohydrodynamic hybrid simulations are carried out to investigate fishbone modes excited by fast ions on the Experimental Advanced Superconducting Tokamak. The simulations use a realistic equilibrium reconstructed from experimental data with the constraint of the q = 1 surface location (q is the safety factor). An anisotropic slowing-down distribution is used to model the distribution of the fast ions from neutral beam injection. The resonance condition is used to identify the interaction between the fishbone mode and the fast ions, which shows that the fishbone mode is simultaneously in resonance with the bounce motion of the trapped particles and the transit motion of the passing particles. Both the passing and trapped particles are important in destabilizing the fishbone mode. The simulations show that the mode frequency chirps down as the mode reaches the nonlinear stage, during which there is a substantial flattening of the perpendicular pressure of the fast ions compared with that of the parallel pressure. For passing particles, the resonance remains within the q = 1 surface, while for trapped particles the resonant location moves out radially during the nonlinear evolution. In addition, parameter scanning is performed to examine the dependence of the linear frequency and growth rate of the fishbone on the pressure and injection velocity of the fast ions.
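
    In the notation commonly used in such hybrid-simulation studies (a sketch of the standard form, not necessarily the exact convention of this paper), the resonance condition reads

      \omega = n\,\omega_\phi + p\,\omega_\theta, \qquad p \in \mathbb{Z},

    where \omega is the mode frequency, n the toroidal mode number, \omega_\phi the toroidal transit (or precession) frequency, and \omega_\theta the poloidal transit frequency for passing particles or the bounce frequency for trapped particles.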

  1. Simulations of electron transport and ignition for direct-drive fast-ignition targets

    NASA Astrophysics Data System (ADS)

    Solodov, A. A.; Anderson, K. S.; Betti, R.; Gotcheva, V.; Myatt, J.; Delettrez, J. A.; Skupsky, S.; Theobald, W.; Stoeckl, C.

    2008-11-01

    The performance of high-gain, fast-ignition fusion targets is investigated using one-dimensional hydrodynamic simulations of implosion and two-dimensional (2D) hybrid fluid-particle simulations of hot-electron transport, ignition, and burn. The 2D/3D hybrid-particle-in-cell code LSP [D. R. Welch et al., Nucl. Instrum. Methods Phys. Res. A 464, 134 (2001)] and the 2D fluid code DRACO [P. B. Radha et al., Phys. Plasmas 12, 056307 (2005)] are integrated to simulate the hot-electron transport and heating for direct-drive fast-ignition targets. LSP simulates the transport of hot electrons from the place where they are generated to the dense fuel core where their energy is absorbed. DRACO includes the physics required to simulate compression, ignition, and burn of fast-ignition targets. The self-generated resistive magnetic field is found to collimate the hot-electron beam, increase the coupling efficiency of hot electrons with the target, and reduce the minimum energy required for ignition. Resistive filamentation of the hot-electron beam is also observed. The minimum energy required for ignition is found for hot electrons with realistic angular spread and Maxwellian energy-distribution function.

  2. Verification and Validation of the New Dynamic Mooring Modules Available in FAST v8: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendt, Fabian; Robertson, Amy; Jonkman, Jason

    2016-08-01

    The open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, was recently coupled to two newly developed mooring dynamics modules: MoorDyn and FEAMooring. MoorDyn is a lumped-mass-based mooring dynamics module developed by the University of Maine, and FEAMooring is a finite-element-based mooring dynamics module developed by Texas A&M University. This paper summarizes the work performed to verify and validate these modules against other mooring models and measured test data to assess their reliability and accuracy. The quality of the fairlead load predictions by the open-source mooring modules MoorDyn and FEAMooring appears to be largely equivalent to that of the commercial tool OrcaFlex. Both mooring dynamics models agree well with the experimental data, considering the given limitations in the accuracy of the platform hydrodynamic load calculation and the quality of the measurement data.

  3. Verification and Validation of the New Dynamic Mooring Modules Available in FAST v8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendt, Fabian F.; Andersen, Morten T.; Robertson, Amy N.

    2016-07-01

    The open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, was recently coupled to two newly developed mooring dynamics modules: MoorDyn and FEAMooring. MoorDyn is a lumped-mass-based mooring dynamics module developed by the University of Maine, and FEAMooring is a finite-element-based mooring dynamics module developed by Texas A&M University. This paper summarizes the work performed to verify and validate these modules against other mooring models and measured test data to assess their reliability and accuracy. The quality of the fairlead load predictions by the open-source mooring modules MoorDyn and FEAMooring appears to be largely equivalent to that of the commercial tool OrcaFlex. Both mooring dynamics models agree well with the experimental data, considering the given limitations in the accuracy of the platform hydrodynamic load calculation and the quality of the measurement data.

  4. Neural network Hilbert transform based filtered backprojection for fast inline x-ray inspection

    NASA Astrophysics Data System (ADS)

    Janssens, Eline; De Beenhouwer, Jan; Van Dael, Mattias; De Schryver, Thomas; Van Hoorebeke, Luc; Verboven, Pieter; Nicolai, Bart; Sijbers, Jan

    2018-03-01

    X-ray imaging is an important tool for quality control since it allows inspection of the interior of products in a non-destructive way. Conventional x-ray imaging, however, is slow and expensive. Inline x-ray inspection, on the other hand, can pave the way towards fast and individual quality control, provided that a sufficiently high throughput can be achieved at minimal cost. To meet these criteria, an inline inspection acquisition geometry is proposed in which the object moves and rotates on a conveyor belt while it passes a fixed source and detector. Moreover, for this acquisition geometry, a new neural-network-based reconstruction algorithm is introduced: the neural network Hilbert transform based filtered backprojection. The proposed algorithm is evaluated on both simulated and real inline x-ray data and is shown to generate high-quality reconstructions of 400 × 400 pixels within 200 ms, thereby meeting the high-throughput criteria.

  5. In-silico wear prediction for knee replacements--methodology and corroboration.

    PubMed

    Strickland, M A; Taylor, M

    2009-07-22

    The capability to predict in-vivo wear of knee replacements is a valuable pre-clinical analysis tool for implant designers. Traditionally, time-consuming experimental tests provided the principal means of investigating wear. Today, computational models offer an alternative. However, the validity of these models has not been demonstrated across a range of designs and test conditions, and several different formulas are in contention for estimating wear rates, limiting confidence in the predictive power of these in-silico models. This study collates and retrospectively simulates a wide range of experimental wear tests using fast rigid-body computational models with extant wear prediction algorithms, to assess the performance of current in-silico wear prediction tools. The number of tests corroborated gives a broader, more general assessment of the performance of these wear-prediction tools, and provides better estimates of the wear 'constants' used in computational models. High-speed rigid-body modelling allows a range of alternative algorithms to be evaluated. Whilst most cross-shear (CS)-based models perform comparably, the 'A/A+B' wear model appears to offer the best predictive power amongst existing wear algorithms. However, the range and variability of experimental data leaves considerable uncertainty in the results. More experimental data with reduced variability and more detailed reporting of studies will be necessary to corroborate these models with greater confidence. With simulation times reduced to only a few minutes, these models are ideally suited to large-volume 'design of experiment' or probabilistic studies (which are essential if pre-clinical assessment tools are to begin addressing the degree of variation observed clinically and in explanted components).

  6. Characterizing acid diffusion lengths in chemically amplified resists from measurements of deprotection kinetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patil, Abhijit A.; Pandey, Yogendra Narayan; Doxastakis, Manolis

    2014-10-01

    The acid-catalyzed deprotection of glassy poly(4-hydroxystyrene-co-tertbutyl acrylate) films was studied with infrared absorbance spectroscopy and stochastic simulations. Experimental data were interpreted with a simple description of subdiffusive acid transport coupled to second-order acid loss. This model predicts key attributes of observed deprotection rates, such as fast reaction at short times, slow reaction at long times, and a nonlinear dependence on acid loading. Fickian diffusion is approached by increasing the post-exposure bake temperature or adding plasticizing agents to the polymer resin. These findings demonstrate that acid mobility and overall deprotection kinetics are coupled to glassy matrix dynamics. To complement the analysis of bulk kinetics, acid diffusion lengths were calculated from the anomalous transport model and compared with nanopattern line widths. The consistent scaling between experiments and simulations suggests that the anomalous diffusion model could be further developed into a predictive lithography tool.
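
    One minimal way to write such a model (a sketch of the general type; the paper's exact formulation may differ) is a diffusion equation with a time-dependent effective diffusivity coupled to second-order acid loss,

      \frac{\partial a}{\partial t} = \nabla \cdot \left( D_\alpha\, t^{\alpha-1}\, \nabla a \right) - k\, a^{2},

    which yields a subdiffusive mean-square displacement \langle r^{2} \rangle \propto t^{\alpha} with \alpha < 1 and recovers Fickian transport as \alpha \to 1.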

  7. High speed stereovision setup for position and motion estimation of fertilizer particles leaving a centrifugal spreader.

    PubMed

    Hijazi, Bilal; Cool, Simon; Vangeyte, Jürgen; Mertens, Koen C; Cointault, Frédéric; Paindavoine, Michel; Pieters, Jan G

    2014-11-13

    A 3D imaging technique using a high-speed binocular stereovision system was developed, in combination with corresponding image processing algorithms, for accurate determination of the parameters of particles leaving the spinning disks of centrifugal fertilizer spreaders. Validation of the stereo-matching algorithm using a virtual 3D stereovision simulator indicated an error of less than 2 pixels for 90% of the particles. The setup was validated using the cylindrical spread pattern of an experimental spreader. A 2D correlation coefficient of 90% and a relative error of 27% were found between the experimental results and the (simulated) spread pattern obtained with the developed setup. In combination with a ballistic flight model, the developed image acquisition and processing algorithms can enable fast determination and evaluation of the spread pattern, which can be used as a tool for spreader design and precise machine calibration.

  8. Versatile fusion source integrator AFSI for fast ion and neutron studies in fusion devices

    NASA Astrophysics Data System (ADS)

    Sirén, Paula; Varje, Jari; Äkäslompolo, Simppa; Asunta, Otto; Giroud, Carine; Kurki-Suonio, Taina; Weisen, Henri; JET Contributors, The

    2018-01-01

    ASCOT Fusion Source Integrator AFSI, an efficient tool for calculating fusion reaction rates and characterizing the fusion products, based on arbitrary reactant distributions, has been developed and is reported in this paper. Calculation of reactor-relevant D-D, D-T and D-3He fusion reactions has been implemented based on the Bosch-Hale fusion cross sections. The reactions can be calculated between arbitrary particle populations, including Maxwellian thermal particles and minority energetic particles. Reaction rate profiles, energy spectra and full 4D phase space distributions can be calculated for the non-isotropic reaction products. The code is especially suitable for integrated modelling in self-consistent plasma physics simulations as well as in the Serpent neutronics calculation chain. Validation of the model has been performed for neutron measurements at the JET tokamak and the code has been applied to predictive simulations in ITER.
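
    AFSI's core operation, a fusion reaction rate between two arbitrary reactant populations, can be sketched as a Monte Carlo average of sigma(E_rel)*v_rel over sampled velocity pairs. The cross-section function below is a crude placeholder (AFSI uses the Bosch-Hale fits), and all parameters are illustrative.

      import numpy as np

      M_D = 3.344e-27          # deuteron mass [kg]
      KEV = 1.602e-16          # 1 keV in joules

      def sigma(e_rel_kev):
          """Placeholder Gamow-like cross section [m^2]; AFSI uses Bosch-Hale fits."""
          e = np.maximum(e_rel_kev, 0.1)
          return (1e-28 / e) * np.exp(-31.4 / np.sqrt(e))  # approximate D-D Gamow shape

      def maxwellian_sampler(t_kev):
          """Sampler for an isotropic Maxwellian population at temperature t_kev."""
          v_th = np.sqrt(t_kev * KEV / M_D)                # 1D thermal speed [m/s]
          return lambda rng, n: rng.normal(0.0, v_th, (n, 3))

      def reactivity_mc(sample1, sample2, n=200_000, seed=4):
          """Monte Carlo <sigma*v> between two arbitrary velocity distributions."""
          rng = np.random.default_rng(seed)
          v_rel = np.linalg.norm(sample1(rng, n) - sample2(rng, n), axis=1)
          e_rel_kev = 0.5 * (M_D / 2.0) * v_rel**2 / KEV   # reduced-mass energy (D-D)
          return np.mean(sigma(e_rel_kev) * v_rel)         # [m^3/s]

      sv_dd = reactivity_mc(maxwellian_sampler(10.0), maxwellian_sampler(10.0))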

  9. Expanded Capabilities for the Hydrogen Financial Analysis Scenario Tool (H2FAST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, Brian; Melaina, Marc; Penev, Michael

    This presentation describes how NREL expanded the capabilities for the Hydrogen Financial Analysis Scenario Tool (H2FAST) in FY16. It was presented at the U.S. Department of Energy Hydrogen and Fuel Cells Program 2016 Annual Merit Review and Peer Evaluation Meeting on June 8, 2016, in Washington, D.C.

  10. The interrater and test-retest reliability of the Home Falls and Accidents Screening Tool (HOME FAST) in Malaysia: Using raters with a range of professional backgrounds.

    PubMed

    Romli, Muhammad Hibatullah; Mackenzie, Lynette; Lovarini, Meryl; Tan, Maw Pin; Clemson, Lindy

    2017-06-01

    Falls can be a devastating issue for older people living in the community, including those living in Malaysia. Health professionals and community members have a responsibility to ensure that older people have a safe home environment to reduce the risk of falls. Using a standardised screening tool is beneficial to intervene early with this group. The Home Falls and Accidents Screening Tool (HOME FAST) should be considered for this purpose; however, its use in Malaysia has not been studied. Therefore, the aim of this study was to evaluate the interrater and test-retest reliability of the HOME FAST with multiple professionals in the Malaysian context. A cross-sectional design was used to evaluate interrater reliability where the HOME FAST was used simultaneously in the homes of older people by 2 raters and a prospective design was used to evaluate test-retest reliability with a separate group of older people at different times in their homes. Both studies took place in an urban area of Kuala Lumpur. Professionals from 9 professional backgrounds participated as raters in this study, and a group of 51 community older people were recruited for the interrater reliability study and another group of 30 for the test-retest reliability study. The overall agreement was moderate for interrater reliability and good for test-retest reliability. The HOME FAST was consistently rated by different professionals, and no bias was found among the multiple raters. The HOME FAST can be used with confidence by a variety of professionals across different settings. The HOME FAST can become a universal tool to screen for home hazards related to falls. © 2017 John Wiley & Sons, Ltd.

  11. An FMM-FFT Accelerated SIE Simulator for Analyzing EM Wave Propagation in Mine Environments Loaded With Conductors

    PubMed Central

    Sheng, Weitian; Zhou, Chenming; Liu, Yang; Bagci, Hakan; Michielssen, Eric

    2018-01-01

    A fast and memory efficient three-dimensional full-wave simulator for analyzing electromagnetic (EM) wave propagation in electrically large and realistic mine tunnels/galleries loaded with conductors is proposed. The simulator relies on Muller and combined field surface integral equations (SIEs) to account for scattering from mine walls and conductors, respectively. During the iterative solution of the system of SIEs, the simulator uses a fast multipole method-fast Fourier transform (FMM-FFT) scheme to reduce CPU and memory requirements. The memory requirement is further reduced by compressing large data structures via singular value and Tucker decompositions. The efficiency, accuracy, and real-world applicability of the simulator are demonstrated through characterization of EM wave propagation in electrically large mine tunnels/galleries loaded with conducting cables and mine carts. PMID:29726545
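
    The compression step mentioned in the abstract (discarding negligible singular values of large data structures) can be sketched with a plain truncated SVD; the matrix contents and threshold below are hypothetical stand-ins, not the simulator's actual data structures, which also use Tucker decompositions for higher-order tensors.

      import numpy as np

      rng = np.random.default_rng(5)
      # Stand-in for a large, numerically low-rank interaction block
      A = rng.standard_normal((400, 50)) @ rng.standard_normal((50, 400))

      U, s, Vt = np.linalg.svd(A, full_matrices=False)
      keep = s > 1e-8 * s[0]                        # relative singular-value cutoff
      U_k, s_k, Vt_k = U[:, keep], s[keep], Vt[keep]
      print(f"kept rank {keep.sum()} of {len(s)}")  # ~50 here

      x = rng.standard_normal(400)
      y = U_k @ (s_k * (Vt_k @ x))                  # apply A from compressed factors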

  12. CAFNA®, coded aperture fast neutron analysis for contraband detection: Preliminary results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, L.; Lanza, R.C.

    1999-12-01

    The authors have developed a near-field coded aperture imaging system for use with fast neutron techniques as a tool for the detection of contraband and hidden explosives through nuclear elemental analysis. The technique relies on the prompt gamma rays produced by fast neutron interactions with the object being examined. The position of the nuclear elements is determined by the location of the gamma emitters. For existing fast neutron techniques, in Pulsed Fast Neutron Analysis (PFNA), neutrons are used with very low efficiency; in Fast Neutron Analysis (FNA), the sensitivity for detection of the signature gamma rays is very low. For the Coded Aperture Fast Neutron Analysis (CAFNA®) the authors have developed, the efficiency for both using the probing fast neutrons and detecting the prompt gamma rays is high. For a probed volume of n^3 volume elements (voxels) in a cube of n resolution elements on a side, they can compare the sensitivity with other neutron probing techniques. As compared to PFNA, the improvement in neutron utilization is n^2, where the total number of voxels in the object being examined is n^3. Compared to FNA, the improvement in gamma-ray imaging is proportional to the total open area of the coded aperture plane; a typical value is n^2/2, where n^2 is the number of total detector resolution elements or the number of pixels in an object layer. It should be noted that the actual signal-to-noise ratio of a system also depends on the nature and distribution of background events, and this comparison may somewhat reduce the effective sensitivity of CAFNA. They have performed analysis, Monte Carlo simulations, and preliminary experiments using low- and high-energy gamma-ray sources. The results show that a high-sensitivity 3-D contraband imaging and detection system can be realized by using CAFNA.

  13. Particle-in-cell studies of fast-ion slowing-down rates in cool tenuous magnetized plasma

    NASA Astrophysics Data System (ADS)

    Evans, Eugene S.; Cohen, Samuel A.; Welch, Dale R.

    2018-04-01

    We report on 3D-3V particle-in-cell simulations of fast-ion energy-loss rates in a cold, weakly-magnetized, weakly-coupled plasma where the electron gyroradius, ρe, is comparable to or less than the Debye length, λDe, and the fast-ion velocity exceeds the electron thermal velocity, a regime in which the electron response may be impeded. These simulations use explicit algorithms, spatially resolve ρe and λDe, and temporally resolve the electron cyclotron and plasma frequencies. For mono-energetic dilute fast ions with isotropic velocity distributions, these scaling studies of the slowing-down time, τs, versus fast-ion charge are in agreement with unmagnetized slowing-down theory; with an applied magnetic field, no consistent anisotropy between τs in the cross-field and field-parallel directions could be resolved. Scaling the fast-ion charge is confirmed as a viable way to reduce the required computational time for each simulation. The implications of these slowing down processes are described for one magnetic-confinement fusion concept, the small, advanced-fuel, field-reversed configuration device.
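
    For reference, the unmagnetized slowing-down time on electrons that the scaling study compares against has the standard Spitzer form (prefactor conventions vary between texts, so only the scaling is quoted here),

      \tau_s \;\propto\; \frac{m_f\, T_e^{3/2}}{Z_f^{2}\, n_e\, \ln\Lambda},

    and the Z_f^{-2} dependence is what makes fast-ion charge scaling an effective way to shorten the simulated slowing-down time without changing the underlying physics regime.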

  14. Validation Of The Airspace Concept Evaluation System Using Real World Data

    NASA Technical Reports Server (NTRS)

    Zelinski, Shannon

    2005-01-01

    This paper discusses the process of performing a validation of the Airspace Concept Evaluation System (ACES) using real-world historical flight operational data. ACES inputs are generated from select real-world data and processed to create a realistic reproduction of a single day of operations within the National Airspace System (NAS). ACES outputs are then compared to real-world operational metrics and delay statistics for the reproduced day. Preliminary results indicate that ACES produces delays and airport operational metrics similar to the real world, with minor variations of delay by phase of flight. ACES is a nation-wide fast-time simulation tool developed at NASA Ames Research Center. ACES models and simulates the NAS using interacting agents representing center control, terminal flow management, airports, individual flights, and other NAS elements. These agents pass messages between one another similar to real-world communications. This distributed agent-based system is designed to emulate the highly unpredictable nature of the NAS, making it a suitable tool to evaluate current and envisioned airspace concepts. To ensure that ACES produces the most realistic results, the system must be validated. There is no way to validate future concept scenarios using real-world historical data, but current-day scenario validations increase confidence in the validity of future scenario results. Each operational day has unique weather and traffic demand schedules; the more a simulation utilizes the unique characteristics of a specific day, the more realistic the results should be. ACES is able to simulate the full-scale traffic demand necessary to perform a validation using real-world data. Through direct comparison with the real world, models may continue to be improved, and unusual trends and biases may be filtered out of the system or used to normalize the results of future concept simulations.

  15. Design of the VISITOR Tool: A Versatile ImpulSive Interplanetary Trajectory OptimizeR

    NASA Technical Reports Server (NTRS)

    Corpaccioli, Luca; Linskens, Harry; Komar, David R.

    2014-01-01

    The design of trajectories for interplanetary missions represents one of the most complex and important problems to solve during conceptual space mission design. To facilitate conceptual mission sizing activities, it is essential to obtain sufficiently accurate trajectories in a fast and repeatable manner. To this end, the VISITOR tool was developed. This tool modularly augments a patched-conic MGA-1DSM model with a mass model, launch window analysis, and the ability to simulate more realistic arrival and departure operations. This was implemented in MATLAB, exploiting the built-in optimization tools and vector analysis routines. The chosen optimization strategy uses a grid search and pattern search, an iterative variable grid method. A genetic algorithm can be selectively used to improve search space pruning, at the cost of losing the repeatability of the results and increased computation time. The tool was validated against seven flown missions: the average total mission ΔV offset from the nominal trajectory was 9.1%, which was reduced to 7.3% when using the genetic algorithm at the cost of an increase in computation time by a factor of 5.7. It was found that VISITOR was well-suited for the conceptual design of interplanetary trajectories, while also facilitating future improvements due to its modular structure.

  16. Earth as a Tool for Astrobiology—A European Perspective

    NASA Astrophysics Data System (ADS)

    Martins, Zita; Cottin, Hervé; Kotler, Julia Michelle; Carrasco, Nathalie; Cockell, Charles S.; de la Torre Noetzel, Rosa; Demets, René; de Vera, Jean-Pierre; d'Hendecourt, Louis; Ehrenfreund, Pascale; Elsaesser, Andreas; Foing, Bernard; Onofri, Silvano; Quinn, Richard; Rabbow, Elke; Rettberg, Petra; Ricco, Antonio J.; Slenzka, Klaus; Stalport, Fabien; ten Kate, Inge L.; van Loon, Jack J. W. A.; Westall, Frances

    2017-07-01

    Scientists use the Earth as a tool for astrobiology by analyzing planetary field analogues (i.e. terrestrial samples and field sites that resemble planetary bodies in our Solar System). In addition, they expose the selected planetary field analogues in simulation chambers to conditions that mimic the ones of planets, moons and Low Earth Orbit (LEO) space conditions, as well as the chemistry occurring in interstellar and cometary ices. This paper reviews the ways the Earth is used by astrobiologists: (i) by conducting planetary field analogue studies to investigate extant life from extreme environments, its metabolisms, adaptation strategies and modern biosignatures; (ii) by conducting planetary field analogue studies to investigate extinct life from the oldest rocks on our planet and its biosignatures; (iii) by exposing terrestrial samples to simulated space or planetary environments and producing a sample analogue to investigate changes in minerals, biosignatures and microorganisms. The European Space Agency (ESA) created a topical team in 2011 to investigate recent activities using the Earth as a tool for astrobiology and to formulate recommendations and scientific needs to improve ground-based astrobiological research. Space is an important tool for astrobiology (see Horneck et al. in Astrobiology, 16:201-243, 2016; Cottin et al., 2017), but access to space is limited. Complementing research on Earth provides fast access, more replications and higher sample throughput. The major conclusions of the topical team and suggestions for the future include more scientifically qualified calls for field campaigns with planetary analogy, and a centralized point of contact at ESA or the EU for the organization of a survey of such expeditions. An improvement of the coordinated logistics, infrastructures and funding system supporting the combination of field work with planetary simulation investigations, as well as an optimization of the scientific return and data processing, data storage and data distribution is also needed. Finally, a coordinated EU or ESA education and outreach program would improve the participation of the public in the astrobiological activities.

  17. TADSim: Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics

    DOE PAGES

    Mniszewski, Susan M.; Junghans, Christoph; Voter, Arthur F.; ...

    2015-04-16

    Next-generation high-performance computing will require more scalable and flexible performance prediction tools to evaluate software-hardware co-design choices relevant to scientific applications and hardware architectures. Here, we present a new class of tools called application simulators: parameterized fast-running proxies of large-scale scientific applications using parallel discrete event simulation. Parameterized choices for the algorithmic method and hardware options provide a rich space for design exploration and allow us to quickly find well-performing software-hardware combinations. We demonstrate our approach with a TADSim simulator that models the temperature-accelerated dynamics (TAD) method, an algorithmically complex and parameter-rich member of the accelerated molecular dynamics (AMD) family of molecular dynamics methods. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We accomplish this by identifying the time-intensive elements, quantifying algorithm steps in terms of those elements, abstracting them out, and replacing them by the passage of time. We use TADSim to quickly characterize the runtime performance and algorithmic behavior for the otherwise long-running simulation code. We extend TADSim to model algorithm extensions, such as speculative spawning of the compute-bound stages, and predict performance improvements without having to implement such a method. Validation against the actual TAD code shows close agreement for the evolution of an example physical system, a silver surface. Finally, focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights and suggested extensions.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kitanidis, Peter

    As large-scale, commercial storage projects become operational, the problem of utilizing information from diverse sources becomes more critically important. In this project, we developed, tested, and applied an advanced joint data inversion system for CO2 storage modeling with large data sets for use in site characterization and real-time monitoring. Emphasis was on the development of advanced and efficient computational algorithms for joint inversion of hydro-geophysical data, coupled with state-of-the-art forward process simulations. The developed system consists of (1) inversion tools using characterization data, such as 3D seismic survey (amplitude images), borehole log and core data, as well as hydraulic, tracer and thermal tests before CO2 injection, (2) joint inversion tools for updating the geologic model with the distribution of rock properties, thus reducing uncertainty, using hydro-geophysical monitoring data, and (3) highly efficient algorithms for directly solving the dense or sparse linear algebra systems derived from the joint inversion. The system combines methods from stochastic analysis, fast linear algebra, and high performance computing. The developed joint inversion tools have been tested through synthetic CO2 storage examples.

  19. Pilot study to investigate the feasibility of the Home Falls and Accidents Screening Tool (HOME FAST) to identify older Malaysian people at risk of falls

    PubMed Central

    Romli, Muhammad Hibatullah; Mackenzie, Lynette; Lovarini, Meryl; Tan, Maw Pin

    2016-01-01

    Objective: The relationship between home hazards and falls in older Malaysian people is not yet fully understood. No tools to evaluate the Malaysian home environment currently exist. Therefore, this study aimed to pilot the Home Falls and Accidents Screening Tool (HOME FAST) to identify hazards in Malaysian homes, to evaluate the feasibility of using the HOME FAST in the Malaysian Elders Longitudinal Research (MELoR) study and to gather preliminary data about the experience of falls among a small sample of Malaysian older people. Design: A cross-sectional pilot study was conducted. Setting: An urban setting in Kuala Lumpur. Participants: 26 older people aged 60 and over were recruited from the control group of a related research project in Malaysia, in addition to older people known to the researchers. Primary outcome measure: The HOME FAST was applied with the baseline survey for the MELoR study via a face-to-face interview and observation of the home by research staff. Results: The majority of the participants were female, of Malay or Chinese ethnicity and living with others in a double-storeyed house. Falls were reported in the previous year by 19%, and 80% of falls occurred at home. Gender and fear of falling had the strongest associations with home hazards. Most hazards were detected in the bathroom area. A small number of errors were detected in the HOME FAST ratings by researchers. Conclusions: The HOME FAST is feasible as a research and clinical tool for the Malaysian context and is appropriate for use in the MELoR study. Home hazards were prevalent in the homes of older people and further research with the larger MELoR sample is needed to confirm the validity of using the HOME FAST in Malaysia. Training in the use of the HOME FAST is needed to ensure accurate use by researchers. PMID:27531736

  20. Quiet Clean Short-haul Experimental Engine (QCSEE) over-the-wing engine and control simulation results

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A hybrid-computer simulation of the over-the-wing turbofan engine was constructed to develop the dynamic design of the control. This engine and control system includes a full-authority digital electronic control using compressor stator reset to achieve fast thrust response and a modified Kalman filter to correct for sensor failures. It provides fast thrust response for powered-lift operations and accurate, fast-responding, steady-state control of the engine. Simulation results for throttle bursts from 62 to 100 percent takeoff thrust predict that the engine will accelerate from 62 to 95 percent takeoff thrust in one second.

  1. Mesoscale Simulation Data for Initializing Fast-Time Wake Transport and Decay Models

    NASA Technical Reports Server (NTRS)

    Ahmad, Nashat N.; Proctor, Fred H.; Vanvalkenburg, Randal L.; Pruis, Mathew J.; LimonDuparcmeur, Fanny M.

    2012-01-01

    The fast-time wake transport and decay models require vertical profiles of crosswinds, potential temperature and the eddy dissipation rate as initial conditions. These inputs are normally obtained from various field sensors. In data-denied scenarios or for operational use, these initial conditions can instead be provided by mesoscale model simulations. In this study, the vertical profiles of potential temperature from a mesoscale model were used as initial conditions for the fast-time wake models. The mesoscale model simulations were compared against available observations, and the wake model predictions were compared with the Lidar measurements from three wake vortex field experiments.

  2. Modelling and Simulation on Multibody Dynamics for Vehicular Cold Launch Systems Based on Subsystem Synthesis Method

    NASA Astrophysics Data System (ADS)

    Panyun, YAN; Guozhu, LIANG; Yongzhi, LU; Zhihui, QI; Xingdou, GAO

    2017-12-01

    The fast simulation of the vehicular cold launch system (VCLS) launch process is an essential requirement for practical engineering applications. In particular, a general and fast simulation model of the VCLS helps the designer obtain the optimum scheme in the initial design phase. For these purposes, a system-level fast simulation model was established for the VCLS based on the subsystem synthesis method. Moreover, the load of a seven-axis VCLS on rigid ground was compared between theoretical calculations and experiments. The error in the load of the rear left outrigger is less than 7.1%, and the error in the total load of all the outriggers is less than 2.8%. The time taken to complete the simulation is only 9.5 min, which is 5% of that required by conventional algorithms.

  3. Dual Level Statistical Investigation of Equilibrium Solubility in Simulated Fasted and Fed Intestinal Fluid

    PubMed Central

    2017-01-01

    The oral route is the preferred option for drug administration but carries the inherent issue of drug absorption from the gastro-intestinal tract (GIT) in order to elicit systemic activity. A prerequisite for absorption is drug dissolution, which depends on drug solubility in the variable milieu of GIT fluid, with poorly soluble drugs presenting a formulation and biopharmaceutical challenge. Multiple factors within GIT fluid influence solubility, ranging from pH to the concentration and ratio of amphiphilic substances such as phospholipid, bile salt, monoglyceride, and cholesterol. To aid in vitro investigation, simulated intestinal fluids (SIF) covering the fasted and fed states have been developed. SIF media are complex, and statistical design of experiment (DoE) investigations have revealed the range of solubility values possible within each state due to physiological variability, along with the media factors and factor interactions that influence solubility. However, these studies require large numbers of experiments (>60) and are not feasible or sensible within a drug development setting. In the current study, a smaller dual-level DoE with a reduced number of experiments (20), providing three arms covering the fasted and fed states along with a combined analysis, was investigated. The results indicate that this small-scale investigation is feasible and provides solubility ranges that encompass published data in human and simulated fasted and fed fluids. The measured fasted and fed solubility ranges agree with published large-scale DoE results in around half of the cases, with the differences attributable to changes in media composition between studies; this indicates that drug-specific behaviors are being determined and that careful selection of media factors and concentration levels is required to determine a physiologically relevant solubility range. The study also correctly identifies the major single factor or factors that influence solubility, but it is evident that factors of lower significance (for example, bile salt) are not picked up owing to the smaller sample number employed. A similar issue is present with factor interactions: only a limited number are available for study, and they are generally not determined to have a significant solubility impact because of the lower statistical power of the study. Overall, a DoE with a reduced number of experiments is feasible and provides solubility ranges with identification of the major solubility factors, although statistical limitations restrict the analysis. The approach therefore represents a useful initial screening tool that can guide further in-depth analysis of a drug's behavior in gastrointestinal fluids. PMID:29072917

  4. Laser-heating and Radiance Spectrometry for the Study of Nuclear Materials in Conditions Simulating a Nuclear Power Plant Accident

    PubMed Central

    Manara, Dario; Soldi, Luca; Mastromarino, Sara; Boboridis, Kostantinos; Robba, Davide; Vlahovic, Luka; Konings, Rudy

    2017-01-01

    Major and severe accidents have occurred three times in nuclear power plants (NPPs): at Three Mile Island (USA, 1979), Chernobyl (former USSR, 1986) and Fukushima (Japan, 2011). Research on the causes, dynamics, and consequences of these accidents has been performed in a few laboratories worldwide over the last three decades. Common goals of such research activities are: the prevention of these kinds of accidents, both in existing and potential new nuclear power plants; the minimization of their potential consequences; and ultimately, a full understanding of the real risks connected with NPPs. At the European Commission Joint Research Centre's Institute for Transuranium Elements, a laser-heating and fast radiance spectro-pyrometry facility is used for the laboratory simulation, on a small scale, of NPP core meltdown, the most common type of severe accident (SA) that can occur in a nuclear reactor as a consequence of a failure of the cooling system. This simulation tool permits fast and effective high-temperature measurements on real nuclear materials, such as plutonium and minor actinide-containing fission fuel samples. In this respect, and in its capability to produce large amounts of data concerning materials under extreme conditions, the current experimental approach is certainly unique. For current and future NPP concepts, example results are presented on the melting behavior of several different types of nuclear fuel: uranium-plutonium oxides, carbides, and nitrides. Results on the high-temperature interaction of oxide fuels with containment materials are also briefly shown. PMID:29286382

  5. Excavator Design Validation

    NASA Technical Reports Server (NTRS)

    Pholsiri, Chalongrath; English, James; Seberino, Charles; Lim, Yi-Je

    2010-01-01

    The Excavator Design Validation tool verifies excavator designs by automatically generating control systems and modeling their performance in an accurate simulation of their expected environment. The software design also includes interfaces for human operators, so that human-in-the-loop operation can be included in simulation-based studies and validation. This is essential for assessing productivity, versatility, and reliability. This software combines automatic control system generation from CAD (computer-aided design) models, rapid validation of complex mechanism designs, and detailed models of the environment including soil, dust, temperature, remote supervision, and communication latency to create a system of high value. Unique algorithms have been created for controlling and simulating complex robotic mechanisms automatically from just a CAD description. These algorithms are implemented as a commercial cross-platform C++ software toolkit that is configurable using the Extensible Markup Language (XML). The algorithms work with virtually any mobile robotic mechanism using module descriptions that adhere to the XML standard. In addition, high-fidelity, real-time physics-based simulation algorithms have also been developed that include models of internal forces and the forces produced when a mechanism interacts with the outside world. This capability is combined with an innovative organization for simulation algorithms, new regolith simulation methods, and a unique control and study architecture to make powerful tools with the potential to transform the way NASA verifies and compares excavator designs. Energid's Actin software has been leveraged for this design validation. The architecture includes parametric and Monte Carlo studies tailored for validation of excavator designs and their control by remote human operators. It also includes the ability to interface with third-party software and human-input devices. Two types of simulation models have been adapted: high-fidelity discrete element models and fast analytical models. By using the first to establish parameters for the second, a system has been created that can be executed in real time, or faster than real time, on a desktop PC. This allows Monte Carlo simulations to be performed on a computer platform available to all researchers, and it allows human interaction to be included in a real-time simulation process. Metrics on excavator performance are established that work with the simulation architecture. Both static and dynamic metrics are included.

  6. Wave propagation simulation in the upper core of sodium-cooled fast reactors using a spectral-element method for heterogeneous media

    NASA Astrophysics Data System (ADS)

    Nagaso, Masaru; Komatitsch, Dimitri; Moysan, Joseph; Lhuillier, Christian

    2018-01-01

    The ASTRID project, a French fourth-generation sodium-cooled nuclear reactor, is currently under development by the French Alternative Energies and Atomic Energy Commission (CEA). In this project, the development of techniques for monitoring the reactor during operation has been identified as a major issue for improving plant safety. Ultrasonic measurement techniques (e.g., thermometry, visualization of internal objects) are regarded as powerful inspection tools for sodium-cooled fast reactors (SFRs), including ASTRID, because liquid sodium is opaque. Inside the sodium cooling circuit, the medium becomes heterogeneous owing to the complex flow state, especially during operation, and the effect of this heterogeneity on acoustic propagation is not negligible. Verification experiments are therefore needed to develop the component technologies, but such experiments using liquid sodium tend to be relatively large-scale. This is why numerical simulation is essential, both to precede real experiments and to complement the limited number of experimental results. Although various numerical methods have been applied to wave propagation in liquid sodium, none has so far been verified against three-dimensional heterogeneity; moreover, the reactor core is a complex coupled acousto-elastic region that has been difficult to simulate with conventional methods. The objective of this study is to address these two points by applying a three-dimensional spectral-element method. In this paper, our initial results on three-dimensional simulation in a heterogeneous medium (the first point) are shown. To represent the heterogeneity of the liquid sodium, a four-dimensional temperature field (three spatial dimensions and time) calculated by computational fluid dynamics (CFD) with large-eddy simulation was applied instead of the conventional approach (i.e., a Gaussian random field). This three-dimensional numerical experiment verifies the effect of the heterogeneity of the propagation medium on waves in liquid sodium.

  7. Computer-intensive simulation of solid-state NMR experiments using SIMPSON.

    PubMed

    Tošner, Zdeněk; Andersen, Rasmus; Stevensson, Baltzar; Edén, Mattias; Nielsen, Niels Chr; Vosegaard, Thomas

    2014-09-01

    Conducting large-scale solid-state NMR simulations requires fast computer software, potentially in combination with efficient computational resources, to complete within a reasonable time frame. Such simulations may involve large spin systems, multiple-parameter fitting of experimental spectra, or multiple-pulse experiment design using parameter scans, non-linear optimization, or optimal control procedures. To efficiently accommodate such simulations, we here present an improved version of the widely distributed open-source SIMPSON NMR simulation software package adapted to contemporary high-performance hardware setups. The software is optimized for fast performance on standard stand-alone computers, multi-core processors, and large clusters of identical nodes. We describe the novel features for fast computation, including internal matrix manipulations, propagator setups, and acquisition strategies. For efficient calculation of powder averages, we implemented the interpolation method of Alderman, Solum, and Grant, as well as the recently introduced fast Wigner transform interpolation technique. The potential of the optimal control toolbox is greatly enhanced by higher-precision gradients in combination with the efficient optimization algorithm known as limited-memory Broyden-Fletcher-Goldfarb-Shanno. In addition, advanced parallelization can be used in all types of calculations, providing significant time reductions. SIMPSON thus reflects current knowledge in the field of numerical simulation of solid-state NMR experiments. The efficiency and novel features are demonstrated on representative simulations. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Benchmark of multi-phase method for the computation of fast ion distributions in a tokamak plasma in the presence of low-amplitude resonant MHD activity

    NASA Astrophysics Data System (ADS)

    Bierwage, A.; Todo, Y.

    2017-11-01

    The transport of fast ions in a beam-driven JT-60U tokamak plasma subject to resonant magnetohydrodynamic (MHD) mode activity is simulated using the so-called multi-phase method, where 4 ms intervals of classical Monte-Carlo simulations (without MHD) are interlaced with 1 ms intervals of hybrid simulations (with MHD). The multi-phase simulation results are compared to results obtained with continuous hybrid simulations, which were recently validated against experimental data (Bierwage et al., 2017). It is shown that the multi-phase method, in spite of causing significant overshoots in the MHD fluctuation amplitudes, accurately reproduces the frequencies and positions of the dominant resonant modes, as well as the spatial profile and velocity distribution of the fast ions, while consuming only a fraction of the computation time required by the continuous hybrid simulation. The present paper is limited to low-amplitude fluctuations consisting of a few long-wavelength modes that interact only weakly with each other. The success of this benchmark study paves the way for applying the multi-phase method to the simulation of Abrupt Large-amplitude Events (ALE), which were seen in the same JT-60U experiments but at larger time intervals. Possible implications for the construction of reduced models for fast ion transport are discussed.

  9. Mean Line Pump Flow Model in Rocket Engine System Simulation

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.; Lavelle, Thomas M.

    2000-01-01

    A mean line pump flow modeling method has been developed to provide a fast capability for modeling turbopumps of rocket engines. Based on this method, a mean line pump flow code PUMPA has been written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The pump code can model axial flow inducers, mixed-flow and centrifugal pumps. The code can model multistage pumps in series. The code features rapid input setup and computer run time, and is an effective analysis and conceptual design tool. The map generation capability of the code provides the map information needed for interfacing with a rocket engine system modeling code. The off-design and multistage modeling capabilities of the code permit parametric design space exploration of candidate pump configurations and provide pump performance data for engine system evaluation. The PUMPA code has been integrated with the Numerical Propulsion System Simulation (NPSS) code and an expander rocket engine system has been simulated. The mean line pump flow code runs as an integral part of the NPSS rocket engine system simulation and provides key pump performance information directly to the system model at all operating conditions.

  10. Performance Evaluation of the Approaches and Algorithms for Hamburg Airport Operations

    NASA Technical Reports Server (NTRS)

    Zhu, Zhifan; Jung, Yoon; Lee, Hanbong; Schier, Sebastian; Okuniek, Nikolai; Gerdes, Ingrid

    2016-01-01

    In this work, fast-time simulations of Hamburg airport operations were conducted by NASA using the SARDA tools, and real-time simulations by DLR using CADEO and TRACC with the NLR ATM Research Simulator (NARSIM). The outputs are analyzed using a set of common metrics agreed between DLR and NASA. The proposed metrics are derived from the International Civil Aviation Organization (ICAO) Key Performance Areas (KPAs) of capacity, efficiency, predictability, and environment, adapted to simulation studies. The results are examined to explore and compare the merits and shortcomings of the two approaches using the common performance metrics. Particular attention is paid to the concept of closed-loop, trajectory-based taxiing, as well as the application of the US concept to a European airport. Both teams consider the trajectory-based surface operation concept a critical technology advance, not only for addressing current surface traffic management problems, but also for its potential application to unmanned vehicle maneuvering on the airport surface, such as autonomous towing or TaxiBot [6][7], and even Remotely Piloted Aircraft (RPA). Based on this work, a future integration of TRACC and SOSS is described, aiming to bring the conflict-free, trajectory-based operation concept to US airports.

  11. Simulation of earthquake caused building damages for the development of fast reconnaissance techniques

    NASA Astrophysics Data System (ADS)

    Schweier, C.; Markus, M.; Steinle, E.

    2004-04-01

    Catastrophic events like strong earthquakes can cause great losses of life and economic value. More efficient reconnaissance techniques could help reduce the loss of life, as many victims die after, and not during, the event. A basic prerequisite for improving rescue teams' work is better planning of the measures, which is only possible on the basis of reliable and detailed information about the actual situation in the affected regions. Therefore, a bundle of projects at Karlsruhe University aims at the development of a tool for fast information retrieval after strong earthquakes. The focus is on urban areas, where most losses occur. In this paper, an approach for damage analysis of buildings is presented. It consists of an automatic methodology to model buildings in three dimensions, a comparison of pre- and post-event models to detect changes, and a subsequent classification of the changes into damage types. The process is based on information extraction from airborne laser scanning data, i.e. digital surface models (DSMs) acquired by scanning an area with pulsed laser light. To date, no laser-scanning-derived DSMs of areas that suffered earthquake damage were available to the authors. It was therefore necessary to simulate such data for the development of the damage detection methodology. In this paper, two different methodologies for simulating the data are presented. The first is to create CAD models of undamaged buildings based on their construction plans and alter them artificially as if they had suffered serious damage. A laser scanning data set is then simulated from these models, which can be compared with real laser scanning data acquired for the buildings in their intact state. The other approach is to use measurements of actually damaged buildings and simulate their intact state. The geometrical structure of these damaged buildings can be modeled from digital photographs taken after the event, evaluated with photogrammetric methods. The intact state of the buildings is simulated based on on-site investigations, and finally laser scanning data are simulated for both states.

  12. Next Generation CTAS Tools

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    2000-01-01

    The FAA's Free Flight Phase 1 Office is in the process of deploying the current generation of CTAS tools, which are the Traffic Management Advisor (TMA) and the passive Final Approach Spacing Tool (pFAST), at selected centers and airports. Research at NASA is now focused on extending the CTAS software and computer-human interfaces to provide more advanced capabilities. The Multi-center TMA (McTMA) is designed to operate at airports where arrival flows originate from two or more centers whose boundaries are in close proximity to the TRACON boundary. McTMA will also include techniques for routing arrival flows away from congested airspace and around airspace reserved for arrivals into other hub airports. NASA is working with the FAA and MITRE to build a prototype McTMA for the Philadelphia airport. The active Final Approach Spacing Tool (aFAST) provides speed and heading advisories to help controllers achieve accurate spacing between aircraft on final approach. These advisories will be integrated with those in the existing pFAST to provide a comprehensive set of advisories for controlling arrival traffic from the TRACON boundary to touchdown at complex, high-capacity airports. A research prototype of aFAST, designed for the Dallas-Fort Worth airport, is in an advanced stage of development. The Expedite Departure Path (EDP) and Direct-To tools are designed to help controllers guide departing aircraft out of the TRACON airspace and up to cruise altitude along the most efficient routes.

  13. Scoring the home falls and accidents screening tool for health professionals (HOME FAST-HP): Evidence from one epidemiological study.

    PubMed

    Mackenzie, Lynette; Byles, Julie

    2018-03-30

    Falls in older people are a major public health concern. To target falls prevention interventions, screening tools need to be able to identify older people at greater risk of falling. This study aimed to investigate the screening capacity of the Home Falls and Accidents Screening Tool for health professionals (HOME FAST-HP), and to identify the best cut-off score for identifying older people at higher risk of falls using the HOME FAST-HP. The study used cross-sectional data from a random sample of 650 women from the 1921-1926 cohort of the Australian Longitudinal Study on Women's Health (ALSWH). Selected women were sent a postal survey including the HOME FAST-HP, falls history, and other health factors. HOME FAST-HP scores were calculated, and the cut-off point giving optimal sensitivity and specificity in relation to falls was assessed using a receiver operating characteristic (ROC) curve. A total of 567 older women participated (response rate 87%). The mean age of participants was 77.5 years (95% CI 77.31-77.70). A total of 153 participants (27%) reported a fall in the previous six months. The mean number of hazards identified by the HOME FAST-HP was 9.74 (95% CI 9.48-10.01), range 2-22. Non-fallers had a mean of 9.6 hazards (95% CI 9.32-9.91) and fallers a mean of 10.63 hazards (95% CI 10.08-11.19), a significant difference (t = 3.41, P = 0.001). The area under the ROC curve (AUC) was 0.58 (95% CI 0.53-0.64). A HOME FAST-HP cut-off score of 9 gave the optimal sensitivity for falls (73.9%) with a specificity of 37.9%; the positive predictive value was 30.6% and the negative predictive value 79.7%. The HOME FAST-HP can be used as a screening tool to identify fallers, with a cut-off score of nine indicating a higher risk of falling.
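
    For illustration, a minimal sketch of choosing such a cut-off from raw screening data, assuming hypothetical arrays of hazard scores and fall outcomes. Youden's J is used here as the optimality criterion; that choice, and all names, are assumptions, since the abstract does not state which criterion was applied.

```python
import numpy as np

def best_cutoff(scores, fell):
    """Scan candidate cut-offs and return the one maximizing
    Youden's J = sensitivity + specificity - 1."""
    scores = np.asarray(scores)
    fell = np.asarray(fell, dtype=bool)
    best = None
    for c in np.unique(scores):
        flagged = scores >= c                 # screen positive at this cut-off
        sens = (flagged & fell).sum() / fell.sum()
        spec = (~flagged & ~fell).sum() / (~fell).sum()
        j = sens + spec - 1.0
        if best is None or j > best[0]:
            best = (j, c, sens, spec)
    return best  # (J, cut-off, sensitivity, specificity)
```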

  14. Pediatric FAST and elevated liver transaminases: An effective screening tool in blunt abdominal trauma.

    PubMed

    Sola, Juan E; Cheung, Michael C; Yang, Relin; Koslow, Starr; Lanuti, Emma; Seaver, Chris; Neville, Holly L; Schulman, Carl I

    2009-11-01

    The current standard for the evaluation of children with blunt abdominal trauma (BAT) consists of physical examination, screening lab values, and computed tomography (CT) scan. We sought to determine if the focused assessment with sonography for trauma (FAST) combined with elevated liver transaminases (AST/ALT) could be used as a screening tool for intra-abdominal injury (IAI) in pediatric patients with BAT. Registry data at a level 1 trauma center were retrospectively reviewed for 1991-2007. Data collected on BAT patients under the age of 16 years included demographics, injury mechanism, ISS, GCS, imaging studies, serum ALT and AST levels, and disposition. AST and ALT were considered positive if either one was >100 IU/L. Overall, 3171 cases were identified. A total of 1008 (31.8%) patients received CT scan, 1148 (36.2%) had FAST, and 497 (15.7%) received both. Of the 497 patients, 400 (87.1%) also had AST and ALT measured. FAST was 50% sensitive and 91% specific, with a positive predictive value (PPV) of 68%, a negative predictive value (NPV) of 83%, and an accuracy of 80%. Combining FAST with elevated AST or ALT resulted in a statistically significant increase in all measures (sensitivity 88%, specificity 98%, PPV 94%, NPV 96%, accuracy 96%). FAST combined with AST or ALT > 100 IU/L is an effective screening tool for IAI in children following BAT. Pediatric patients with a negative FAST and liver transaminases < 100 IU/L should be observed rather than subjected to the radiation risk of CT.
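
    As a reasoning aid, the sketch below shows the standard independence approximation for a parallel ("positive if either is positive") combination of two screens: misses require both tests to miss, so sensitivity rises, while specificity is the product of the individual specificities. This is an illustrative formula only; the values reported above need not obey it (the study's combined specificity actually rose, since transaminase elevation screens a different signal).

```python
def parallel_combination(sens_a, spec_a, sens_b, spec_b):
    """'Either-positive' rule for two independent tests:
    combined sensitivity 1-(1-Sa)(1-Sb), combined specificity Pa*Pb."""
    return 1.0 - (1.0 - sens_a) * (1.0 - sens_b), spec_a * spec_b

# FAST alone (50% sens, 91% spec) plus a hypothetical second marker:
print(parallel_combination(0.50, 0.91, 0.80, 0.95))  # -> (0.90, 0.8645)
```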

  15. [Simulation in obstetrics and gynecology - a new method to improve the management of acute obstetric emergencies].

    PubMed

    Blum, Ronja; Gairing Bürglin, Anja; Gisin, Stefan

    2008-11-01

    In medical specialties such as anaesthesia, the use of simulation has increased over the past 15 years. Medical simulation attempts to reproduce important clinical situations so that team training or individual skills can be practised in a risk-free environment. For a long time, simulators were used only by the airline industry and the military. Simulation as a training tool for practising critical situations in obstetrics is not yet very common. Experience and routine are crucial to evaluating a medical emergency correctly and taking the appropriate measures. Nowadays the obstetrician requires a combination of manual and communication skills, fast emergency management, and decision-making skills, and simulation may help to attain them. This may not only satisfy the high expectations and demands of patients towards doctors and midwives but also help staff to keep calm in difficult situations and avoid mistakes. The goal is a risk-free delivery for mother and child. We therefore developed a simulation-based curricular unit for hands-on training of four different obstetric emergency scenarios. In this paper we describe the feedback of doctors and midwives on their personal experience of this simulation-based curricular unit. The results indicate that simulation is an accepted method for team training in emergency situations in obstetrics. Whether patient safety increases with the regular use of drill training needs to be investigated in further studies.

  16. Operating manual: Fast response solar array simulator

    NASA Technical Reports Server (NTRS)

    Vonhatten, R.; Weimer, A.; Zerbel, D. W.

    1971-01-01

    The fast response solar array simulator (FRSAS) is a universal solar array simulator which features an AC response identical to that of a real array over a large range of DC operating points. In addition, the short-circuit current (I_sc) and open-circuit voltage (V_oc) are digitally programmable over a wide range, not only to simulate a wide range of array sizes, but also to simulate I_sc and V_oc variations with illumination and temperature. A means of simulating current variations due to spinning is available. Provisions for remote control and monitoring, automatic failure sensing and warning, and a load simulator are also included.
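
    A minimal sketch of the kind of I-V characteristic such a simulator reproduces: an ideal single-diode model whose saturation current is fixed so the curve passes exactly through (V_oc, 0). The model choice and all parameter values are illustrative assumptions, not taken from the FRSAS manual.

```python
import numpy as np

def iv_curve(v, i_sc, v_oc, n_vt=1.5):
    """Ideal single-diode array model: I = I_sc - I0*(exp(V/nVt) - 1),
    with I0 chosen so that I(V_oc) = 0 exactly."""
    i0 = i_sc / (np.exp(v_oc / n_vt) - 1.0)
    return i_sc - i0 * np.expm1(v / n_vt)

v = np.linspace(0.0, 32.0, 200)        # volts (illustrative)
i = iv_curve(v, i_sc=8.0, v_oc=32.0)   # amps  (illustrative)
```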

  17. Front panel engineering with CAD simulation tool

    NASA Astrophysics Data System (ADS)

    Delacour, Jacques; Ungar, Serge; Mathieu, Gilles; Hasna, Guenther; Martinez, Pascal; Roche, Jean-Christophe

    1999-04-01

    The progress made recently in display technology covers many fields of application. Specifications on radiance, colorimetry, and lighting efficiency create new challenges for designers. Photometric design is limited by the ability to correctly predict the result of a lighting system, which would save the cost and time of building multiple prototypes or breadboard benches. The second step of the research carried out by the company OPTIS is to propose an optimization method for lighting systems, developed in the software SPEOS. The main features required of the tool include a CAD interface, to enable fast and efficient transfer between mechanical and light design software; source modeling; a light transfer model; and an optimization tool. The CAD interface is mainly a transfer prototype, which is not the subject here. Photometric simulation is achieved efficiently by using measured source encodings and simulation by the Monte Carlo method. Today, the advantages and limitations of the Monte Carlo method are well known: noise reduction requires a long calculation time, which increases with the complexity of the display panel. A successful optimization is difficult to achieve, because each optimization pass includes a Monte Carlo simulation and therefore a long calculation time. The problem was initially defined as an engineering method of study. Experience shows that a good understanding and mastery of light transfer is limited by the complexity of non-sequential propagation, so the engineer must call on a simulation and optimization tool. The key requirement for efficient optimization is a quick method for simulating light transfer. Much work has been done in this area and some interesting results can be observed. The Monte Carlo method wastes time calculating results and information that are not required for the needs of the simulation, and low-efficiency transfer systems cost a great deal of time. More generally, light transfer can be simulated efficiently when the integrated result is composed of elementary sub-results that involve quick, analytically calculated intersections. Two axes of research thus appear: quick integration, and quick calculation of geometric intersections. The first brings some general solutions that are also valid for multi-reflection systems. The second requires deeper thinking about the intersection calculation. An interesting approach is the subdivision of space into voxels, an adapted method of 3D spatial division according to the objects and their locations. Experimental software has been developed to validate the method. The gain is particularly high in complex systems, and an important reduction in calculation time has been achieved.
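
    A toy illustration of the voxel idea mentioned above: objects are binned into a uniform grid so that a ray only tests objects registered in the voxels it passes through. For brevity this sketch samples points along the ray instead of performing an exact 3D-DDA traversal, and all names are illustrative.

```python
import numpy as np
from collections import defaultdict

def build_voxel_grid(centers, radii, cell):
    """Register each sphere in every voxel its bounding box overlaps."""
    grid = defaultdict(list)
    for idx, (c, r) in enumerate(zip(centers, radii)):
        lo = np.floor((c - r) / cell).astype(int)
        hi = np.floor((c + r) / cell).astype(int)
        for i in range(lo[0], hi[0] + 1):
            for j in range(lo[1], hi[1] + 1):
                for k in range(lo[2], hi[2] + 1):
                    grid[(i, j, k)].append(idx)
    return grid

def candidates_along_ray(grid, origin, direction, cell, length, step=0.25):
    """Collect object ids from voxels near sampled points on the ray."""
    direction = direction / np.linalg.norm(direction)
    ids = set()
    for s in np.arange(0.0, length, step * cell):
        key = tuple(np.floor((origin + s * direction) / cell).astype(int))
        ids.update(grid.get(key, ()))
    return ids   # only these objects need exact intersection tests
```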

  18. Analyzing Spacecraft Telecommunication Systems

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric

    2004-01-01

    Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
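
    As a flavor of what a link analysis computes, below is the standard free-space (Friis) link budget in decibel form. The formula is textbook; the parameter values are illustrative assumptions, not MMTAT defaults.

```python
import math

def free_space_path_loss_db(distance_m, freq_hz):
    """FSPL(dB) = 20*log10(4*pi*d*f/c)."""
    c = 299_792_458.0
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

def received_power_dbm(pt_dbm, gt_dbi, gr_dbi, distance_m, freq_hz):
    """Pr = Pt + Gt + Gr - FSPL, all terms in dB units."""
    return pt_dbm + gt_dbi + gr_dbi - free_space_path_loss_db(distance_m, freq_hz)

# Illustrative X-band deep-space numbers (assumptions, not MMTAT defaults):
print(received_power_dbm(43.0, 48.0, 74.0, 2.0e11, 8.4e9))
```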

  19. Techniques for the insertion of the ProSeal laryngeal mask airway: comparison of the Foley airway stylet tool with the introducer tool in a prospective, randomized study.

    PubMed

    Chen, Mao-Kai; Hsu, Hung-Te; Lu, I-Cheng; Shih, Chih-Kai; Shen, Ya-Chun; Tseng, Kuang-Yi; Cheng, Kuang-I

    2014-01-01

    Many tools have been developed to facilitate insertion of the ProSeal laryngeal mask airway (LMA), which can be impeded by folding of its soft cuff. The aim of this study was to compare the efficiency of ProSeal LMA insertion guided by a soft, direct optical Foley Airway Stylet Tool (FAST) with that of the standard introducer tool (IT). One hundred sixty patients undergoing general anesthesia using the ProSeal LMA as an airway management device were randomly allocated to either the FAST-guided or the IT-assisted group. Following ProSeal LMA insertion, the glottic and esophageal openings were identified using a fiberoptic bronchoscope introduced through the airway and the drain tube. The primary outcomes were the time taken to insert the ProSeal LMA and the success rate at the first attempt. Secondary end points included ease of insertion, hemodynamic response to insertion, and postoperative adverse events recorded in the recovery room and on the first postoperative morning. One hundred forty patients were included in the final analysis: 66 in the FAST-guided group and 74 in the IT-assisted group. The success rate of FAST device-guided ProSeal LMA insertion (95.7%) was broadly comparable with that of IT-assisted insertion (98.7%). However, the time taken to insert the ProSeal LMA was significantly longer when the FAST technique was used (p < 0.001). The incidence of correct alignment of the airway tube and the drain tube did not differ significantly between the groups. There were no significant differences in ease of insertion or hemodynamic responses to insertion, except that the incidence of postoperative sore throat was significantly higher in the FAST group on the first postoperative day (22.2% compared with 6.8% in the IT group; p = 0.035). Both the FAST-guided and IT-assisted techniques achieved correct ProSeal LMA positioning, but the IT technique was significantly quicker and less likely to cause a sore throat. ClinicalTrials.gov Identifier: NCT02048657.

  20. TRACON Aircraft Arrival Planning and Optimization Through Spatial Constraint Satisfaction

    NASA Technical Reports Server (NTRS)

    Bergh, Christopher P.; Krzeczowski, Kenneth J.; Davis, Thomas J.; Denery, Dallas G. (Technical Monitor)

    1995-01-01

    A new aircraft arrival planning and optimization algorithm has been incorporated into the Final Approach Spacing Tool (FAST) in the Center-TRACON Automation System (CTAS) developed at NASA Ames Research Center. FAST simulations have been conducted over three years involving full-proficiency, level five air traffic controllers from around the United States. From these simulations an algorithm, called Spatial Constraint Satisfaction, has been designed, coded, and tested, and will soon begin field evaluation at the Dallas-Fort Worth and Denver International airport facilities. The purpose of this new design is to show that the generation of efficient and conflict-free aircraft arrival plans at the runway does not guarantee an operationally acceptable arrival plan upstream of the runway: information encompassing the entire arrival airspace must be used in order to create an acceptable aircraft arrival plan. The new design retains previously available functions but adds necessary representations of controller preferences and workload and of operationally required amounts of extra separation, and it integrates aircraft conflict resolution. As a result, the Spatial Constraint Satisfaction algorithm produces an optimized aircraft arrival plan that is more acceptable in terms of arrival procedures and air traffic controller workload. This paper discusses current air traffic control arrival planning procedures, previous work in this field, the design of the Spatial Constraint Satisfaction algorithm, and the results of recent evaluations of the algorithm.
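
    As a toy illustration of one constraint the planner must satisfy (minimum inter-arrival spacing at the runway), the sketch below assigns landing times greedily in first-come-first-served order. This is not the Spatial Constraint Satisfaction algorithm itself, which additionally represents controller preferences, workload, extra separation, and upstream conflict resolution; all names and the 90 s separation are illustrative.

```python
def schedule_arrivals(etas, min_sep_s=90.0):
    """Greedy FCFS landing-time assignment: each aircraft lands at its
    ETA or min_sep_s after the previous lander, whichever is later."""
    stas = []
    for eta in sorted(etas):
        sta = eta if not stas else max(eta, stas[-1] + min_sep_s)
        stas.append(sta)
    return stas

print(schedule_arrivals([0.0, 30.0, 200.0, 210.0]))
# -> [0.0, 90.0, 200.0, 290.0]
```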

  1. Particle-in-cell studies of fast-ion slowing-down rates in cool tenuous magnetized plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV); DOE PAGES

    Evans, Eugene S.; Cohen, Samuel A.; Welch, Dale R.

    2018-04-05

    We report on 3D-3V particle-in-cell simulations of fast-ion energy-loss rates in a cold, weakly-magnetized, weakly-coupled plasma where the electron gyroradius, ρ_e, is comparable to or less than the Debye length, λ_De, and the fast-ion velocity exceeds the electron thermal velocity, a regime in which the electron response may be impeded. These simulations use explicit algorithms, spatially resolve ρ_e and λ_De, and temporally resolve the electron cyclotron and plasma frequencies. For mono-energetic dilute fast ions with isotropic velocity distributions, these scaling studies of the slowing-down time, τ_s, versus fast-ion charge are in agreement with unmagnetized slowing-down theory; with an applied magnetic field, no consistent anisotropy between τ_s in the cross-field and field-parallel directions could be resolved. Scaling the fast-ion charge is confirmed as a viable way to reduce the required computational time for each simulation. Finally, the implications of these slowing-down processes are described for one magnetic-confinement fusion concept: the small, advanced-fuel, field-reversed configuration device.
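
    For context on the charge scaling, the classical unmagnetized Coulomb-drag estimate, quoted here schematically from standard plasma-physics references rather than from this record, already predicts the 1/Z_f^2 dependence that makes charge scaling a useful way to shorten simulations:

```latex
% Test-particle slowing-down scaling (schematic, textbook form):
\tau_s \;\propto\; \frac{m_f\, v_f^{3}}{Z_f^{2}\, e^{4}\, n_e \ln\Lambda}
```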

  2. Automatic Parameter Tuning for the Morpheus Vehicle Using Particle Swarm Optimization

    NASA Technical Reports Server (NTRS)

    Birge, B.

    2013-01-01

    A high-fidelity simulation using the PC-based Trick framework has been developed for Johnson Space Center's Morpheus test-bed flight vehicle. There is an iterative development loop of refining and testing the hardware, refining the software, comparing the software simulation to hardware performance, and adjusting either or both to extract the best performance from the hardware as well as the most realistic representation of the hardware from the software. A Particle Swarm Optimization (PSO) based technique has been developed that increases the speed and accuracy of this iterative development cycle. Parameters in software can be automatically tuned to make the simulation match real-world subsystem data from test flights. Special considerations for scale, linearity, and discontinuities can be all but ignored with this technique, allowing fast turnaround both for tuning the simulation to match hardware changes and, during the test and validation phase, for identifying hardware issues. Software models with insufficient control authority to match hardware test data can be identified immediately, and using this technique requires little to no specialized knowledge of optimization, freeing model developers to concentrate on spacecraft engineering. The integration of the PSO into the Morpheus development cycle is discussed, along with a case study highlighting the tool's effectiveness.
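
    For concreteness, a textbook global-best PSO sketch of the kind of optimizer described, assuming a scalar objective that measures the mismatch between simulated and flight data. This is a generic implementation, not NASA's; all constants and names are illustrative.

```python
import numpy as np

def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer (global-best topology).
    objective: maps a parameter vector to a scalar cost, e.g. the
    mismatch between simulated output and flight telemetry.
    bounds: list of (low, high) per parameter."""
    rng = np.random.default_rng(0)
    lo, hi = np.array(bounds).T
    dim = len(bounds)
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pcost = x.copy(), np.apply_along_axis(objective, 1, x)
    g = pbest[pcost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        cost = np.apply_along_axis(objective, 1, x)
        improved = cost < pcost
        pbest[improved], pcost[improved] = x[improved], cost[improved]
        g = pbest[pcost.argmin()].copy()
    return g, pcost.min()

# Toy usage: recover parameters minimizing a quadratic "telemetry mismatch"
best, cost = pso(lambda p: ((p - np.array([0.3, -1.2])) ** 2).sum(),
                 bounds=[(-5, 5), (-5, 5)])
```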

  3. Materials Characterisation and Analysis for Flow Simulation of Liquid Resin Infusion

    NASA Astrophysics Data System (ADS)

    Sirtautas, J.; Pickett, A. K.; George, A.

    2015-06-01

    Liquid Resin Infusion (LRI) processes including VARI and VARTM have received increasing attention in recent years, particularly for infusion of large parts, or for low volume production. This method avoids the need for costly matched metal tooling as used in Resin Transfer Moulding (RTM) and can provide fast infusion if used in combination with flow media. Full material characterisation for LRI analysis requires models for three dimensional fabric permeability as a function of fibre volume content, fabric through-thickness compliance as a function of resin pressure, flow media permeability and resin viscosity. The characterisation of fabric relaxation during infusion is usually determined from cyclic compaction tests on saturated fabrics. This work presents an alternative method to determine the compressibility by using LRI flow simulation and fitting a model to experimental thickness measurements during LRI. The flow media is usually assumed to have isotropic permeability, but this work shows greater simulation accuracy from combining the flow media with separation plies as a combined orthotropic material. The permeability of this combined media can also be determined by fitting the model with simulation to LRI flow measurements. The constitutive models and the finite element solution were validated by simulation of the infusion of a complex aerospace demonstrator part.
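
    For orientation, the one-dimensional constant-pressure Darcy result that underlies most LRI/RTM flow-front analysis is reproduced below; this is a textbook relation, not a formula quoted from the paper. Here K is the permeability, φ the porosity, μ the resin viscosity, and ΔP the driving pressure.

```latex
% 1D flow front under constant injection pressure (textbook result):
L(t) \;=\; \sqrt{\frac{2\,K\,\Delta P\,t}{\phi\,\mu}}
```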

  4. Virtual reality in surgical training.

    PubMed

    Lange, T; Indelicato, D J; Rosen, J M

    2000-01-01

    Virtual reality in surgery and, more specifically, in surgical training faces a number of challenges: building realistic models of the human body, creating interface tools to view, hear, touch, feel, and manipulate these models, and integrating virtual reality systems into medical education and treatment. A final system would encompass simulators specifically for surgery, performance machines, telemedicine, and telesurgery. Each of these areas will need significant improvement if virtual reality is to impact medicine successfully in the next century. This article gives an overview of current systems in the fast-changing field of virtual reality technology and the challenges they face, and provides a set of specific milestones for a truly realistic virtual human body.

  5. Space applications of artificial intelligence; Proceedings of the Annual Goddard Conference, Greenbelt, MD, May 16, 17, 1989

    NASA Technical Reports Server (NTRS)

    Rash, James L. (Editor); Dent, Carolyn P. (Editor)

    1989-01-01

    Theoretical and implementation aspects of AI systems for space applications are discussed in reviews and reports. Sections are devoted to planning and scheduling, fault isolation and diagnosis, data management, modeling and simulation, and development tools and methods. Particular attention is given to a situated reasoning architecture for space repair and replace tasks, parallel plan execution with self-processing networks, the electrical diagnostics expert system for Spacelab life-sciences experiments, diagnostic tolerance for missing sensor data, the integration of perception and reasoning in fast neural modules, a connectionist model for dynamic control, and applications of fuzzy sets to the development of rule-based expert systems.

  6. Airborne Precision Spacing (APS) Dependent Parallel Arrivals (DPA)

    NASA Technical Reports Server (NTRS)

    Smith, Colin L.

    2012-01-01

    The Airborne Precision Spacing (APS) team at the NASA Langley Research Center (LaRC) has been developing a concept of operations to extend the current APS concept to support dependent approaches to parallel or converging runways along with the required pilot and controller procedures and pilot interfaces. A staggered operations capability for the Airborne Spacing for Terminal Arrival Routes (ASTAR) tool was developed and designated as ASTAR10. ASTAR10 has reached a sufficient level of maturity to be validated and tested through a fast-time simulation. The purpose of the experiment was to identify and resolve any remaining issues in the ASTAR10 algorithm, as well as put the concept of operations through a practical test.

  7. Model-based sensorimotor integration for multi-joint control: development of a virtual arm model.

    PubMed

    Song, D; Lan, N; Loeb, G E; Gordon, J

    2008-06-01

    An integrated, sensorimotor virtual arm (VA) model has been developed and validated for simulation studies of the control of human arm movements. Realistic anatomical features of the shoulder, elbow and forearm joints were captured with a graphic modeling environment, SIMM. The model included 15 musculotendon elements acting at the shoulder, elbow and forearm. Muscle actions on joints were evaluated by SIMM-generated moment arms that were matched to experimentally measured profiles. The Virtual Muscle (VM) model contained an appropriate admixture of slow and fast twitch fibers with realistic physiological properties for force production. A realistic spindle model was embedded in each VM, with inputs of fascicle length and gamma static (γ_stat) and dynamic (γ_dyn) controls, and outputs of primary (Ia) and secondary (II) afferents. A piecewise-linear model of the Golgi tendon organ (GTO) represented the ensemble sampling (Ib) of the total muscle force at the tendon. All model components were integrated into a Simulink block using a special software tool. The complete VA model was validated by open-loop simulation at discrete hand positions within the full range of alpha and gamma drives to extrafusal and intrafusal muscle fibers. The model behaviors were consistent with a wide variety of physiological phenomena. Spindle afferents were effectively modulated by fusimotor drives and the hand positions of the arm. These simulations validated the VA model as a computational tool for studying arm movement control. The VA model is available to researchers at http://pt.usc.edu/cel.
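
    As an illustration of the piecewise-linear GTO stage mentioned above, the sketch below maps normalized tendon force to Ib firing rate with a threshold, linear segments, and saturation. The breakpoints and rates are placeholders, not the values used in the published VA model.

```python
import numpy as np

def gto_ib_rate(force):
    """Piecewise-linear Golgi tendon organ model: normalized tendon force
    -> Ib firing rate (Hz). Silent below a small threshold, linear
    segments above it, saturating past the last breakpoint.
    Breakpoints and rates are illustrative placeholders."""
    xp = np.array([0.0, 0.02, 0.5, 1.0])    # normalized force breakpoints
    fp = np.array([0.0, 0.0, 90.0, 140.0])  # Ib rate (Hz) at breakpoints
    return np.interp(force, xp, fp)         # np.interp clamps outside range
```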

  8. Simulating a topological transition in a superconducting phase qubit by fast adiabatic trajectories

    NASA Astrophysics Data System (ADS)

    Wang, Tenghui; Zhang, Zhenxing; Xiang, Liang; Gong, Zhihao; Wu, Jianlan; Yin, Yi

    2018-04-01

    The significance of topological phases has been widely recognized in the condensed matter physics community. Well-controlled quantum systems provide an artificial platform to probe and engineer various topological phases. The adiabatic trajectory of a quantum state describes the change of the bulk Bloch eigenstates with the momentum, but this adiabatic simulation method is practically limited by quantum dissipation. Here we apply the "shortcut to adiabaticity" (STA) protocol to realize fast adiabatic evolutions in a superconducting phase qubit. The resulting fast adiabatic trajectories illustrate the change of the bulk Bloch eigenstates in the Su-Schrieffer-Heeger (SSH) model. A sharp transition is experimentally determined for the topological invariant, a winding number. Our experiment also helps identify the topological Chern number of a two-dimensional toy model, suggesting the applicability of the fast adiabatic simulation method to topological systems.
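
    For context, the winding number in question can be computed directly from the standard textbook SSH Bloch Hamiltonian, whose off-diagonal element is h(k) = v + w e^{ik}; the sketch below accumulates the phase of h(k) around the Brillouin zone. This parametrization is the textbook form, not extracted from the experiment.

```python
import numpy as np

def ssh_winding_number(v, w, nk=2001):
    """Winding number of h(k) = v + w*exp(ik) around the origin:
    0 for |v| > |w| (trivial), 1 for |v| < |w| (topological)."""
    k = np.linspace(-np.pi, np.pi, nk)
    h = v + w * np.exp(1j * k)
    phase = np.unwrap(np.angle(h))
    return int(round((phase[-1] - phase[0]) / (2 * np.pi)))

print(ssh_winding_number(0.5, 1.0))  # -> 1 (topological)
print(ssh_winding_number(1.5, 1.0))  # -> 0 (trivial)
```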

  9. Convergence of highly parallel stray field calculation using the fast multipole method on irregular meshes

    NASA Astrophysics Data System (ADS)

    Palmesi, P.; Abert, C.; Bruckner, F.; Suess, D.

    2018-05-01

    Fast stray field calculation is commonly considered of great importance for micromagnetic simulations, since it is the most time-consuming part of the simulation. The Fast Multipole Method (FMM) has displayed linear O(N) parallelization behavior on many cores. This article investigates the error of a recent FMM approach that approximates sources using linear, instead of constant, finite elements in the singular integral for calculating the stray field and the corresponding potential. Having measured performance in an earlier manuscript, this manuscript investigates the convergence of the relative L2 error for several FMM simulation parameters. Various scenarios, calculating the stray field either directly or via the potential, are discussed.

  10. Experimental and computational study of complex shockwave dynamics in laser ablation plumes in argon atmosphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harilal, S. S.; Miloshevsky, G. V.; Diwakar, P. K.

    2012-08-15

    We investigated the spatio-temporal evolution of ns laser ablation plumes at atmospheric pressure, a favored condition for laser-induced breakdown spectroscopy and laser-ablation inductively coupled plasma mass-spectrometry. The 1064 nm, 6 ns pulses from a Nd:YAG laser were focused onto an Al target and the generated plasma was allowed to expand in 1 atm Ar. The hydrodynamic expansion features were studied using focused shadowgraphy and gated 2 ns self-emission visible imaging. Shadowgram images showed material ejection and generation of shock fronts. A secondary shock is observed behind the primary shock during the time window of 100-500 ns, with instabilities near the laser cone angle. By comparing the self-emission images obtained using fast photography, it is concluded that the secondary shocks observed in the shadowgraphy were generated by fast-moving target material. The plume front estimates using fast photography exhibited reasonable agreement with data obtained from shadowgraphy at early times (≤400 ns). However, at later times, fast photography images showed plume confinement while the shadowgraphic images showed propagation of the plume front even at greater times. The structure and dynamics of the plume obtained from optical diagnostic tools were compared to numerical simulations. We have shown that the main features of plume expansion in ambient Ar observed in the experiments can be reproduced using a continuum hydrodynamics model, which provided valuable insight into the expansion dynamics and shock structure of the plasma plume.

  11. FAST: FAST Analysis of Sequences Toolbox

    PubMed Central

    Lawrence, Travis J.; Kauffman, Kyle T.; Amrine, Katherine C. H.; Carper, Dana L.; Lee, Raymond S.; Becich, Peter J.; Canales, Claudia J.; Ardell, David H.

    2015-01-01

    FAST (FAST Analysis of Sequences Toolbox) provides simple, powerful open source command-line tools to filter, transform, annotate and analyze biological sequence data. Modeled after the GNU (GNU's Not Unix) Textutils such as grep, cut, and tr, FAST tools such as fasgrep, fascut, and fastr make it easy to rapidly prototype expressive bioinformatic workflows in a compact and generic command vocabulary. Compact combinatorial encoding of data workflows with FAST commands can simplify the documentation and reproducibility of bioinformatic protocols, supporting better transparency in biological data science. Interface self-consistency and conformity with conventions of GNU, Matlab, Perl, BioPerl, R, and GenBank help make FAST easy and rewarding to learn. FAST automates numerical, taxonomic, and text-based sorting, selection and transformation of sequence records and alignment sites based on content, index ranges, descriptive tags, annotated features, and in-line calculated analytics, including composition and codon usage. Automated content- and feature-based extraction of sites and support for molecular population genetic statistics make FAST useful for molecular evolutionary analysis. FAST is portable, easy to install and secure thanks to the relative maturity of its Perl and BioPerl foundations, with stable releases posted to CPAN. Development as well as a publicly accessible Cookbook and Wiki are available on the FAST GitHub repository at https://github.com/tlawrence3/FAST. The default data exchange format in FAST is Multi-FastA (specifically, a restriction of BioPerl FastA format). Sanger and Illumina 1.8+ FastQ formatted files are also supported. FAST makes it easier for non-programmer biologists to interactively investigate and control biological data at the speed of thought. PMID:26042145

  12. Fast simulation of electromagnetic and hadronic showers in SpaCal calorimeter at the H1 experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raičević, Nataša, E-mail: raicevic@mail.desy.de; Glazov, Alexandre

    2016-03-25

    The fast simulation of showers induced by electrons (positrons) in the H1 lead/scintillating-fiber calorimeter, SpaCal, based on the shower library technique, has been presented previously. In this paper we show results on the linearity and uniformity of the reconstructed electron/positron cluster energy in the electromagnetic section of SpaCal for simulations based on the shower library and on the GFLASH shower parametrisation. The shapes of the clusters originating from photon and hadron candidates in SpaCal are analysed and the experimental distributions compared with the two simulations.

  13. Simulations of Fuel Assembly and Fast-Electron Transport in Integrated Fast-Ignition Experiments on OMEGA

    NASA Astrophysics Data System (ADS)

    Solodov, A. A.; Theobald, W.; Anderson, K. S.; Shvydky, A.; Epstein, R.; Betti, R.; Myatt, J. F.; Stoeckl, C.; Jarrott, L. C.; McGuffey, C.; Qiao, B.; Beg, F. N.; Wei, M. S.; Stephens, R. B.

    2013-10-01

    Integrated fast-ignition experiments on OMEGA benefit from improved performance of the OMEGA EP laser, including higher contrast, higher energy, and a smaller focus. Recent 8-keV, Cu-Kα flash radiography of cone-in-shell implosions and cone-tip breakout measurements showed good agreement with the 2-D radiation-hydrodynamic simulations using the code DRACO. DRACO simulations show that the fuel assembly can be further improved by optimizing the compression laser pulse, evacuating air from the shell, and by adjusting the material of the cone tip. This is found to delay the cone-tip breakout by ~220 ps and increase the core areal density from ~80 mg/cm2 in the current experiments to ~500 mg/cm2 at the time of the OMEGA EP beam arrival before the cone-tip breakout. Simulations using the code LSP of fast-electron transport in the recent integrated OMEGA experiments with Cu-doped shells will be presented. Cu-doping is added to probe the transport of fast electrons via their induced Cu K-shell fluorescent emission. This material is based upon work supported by the Department of Energy National Nuclear Security Administration DE-NA0001944 and the Office of Science under DE-FC02-04ER54789.

  14. Adaptive Time Stepping for Transient Network Flow Simulation in Rocket Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok K.; Ravindran, S. S.

    2017-01-01

    Fluid and thermal transients found in rocket propulsion systems, such as propellant feedline systems, are complex processes involving fast phases followed by slow phases. Their time-accurate computation therefore requires the use of short time steps initially, followed by much larger time steps; yet there are also instances that involve fast-slow-fast phases. In this paper, we present a feedback-control-based adaptive time stepping algorithm and discuss its use in network flow simulation of fluid and thermal transients. The time step is automatically controlled during the simulation by monitoring changes in certain key variables and by feedback. In order to demonstrate the viability of time adaptivity for engineering problems, we applied it to simulate water hammer and cryogenic chilldown in pipelines. Our comparisons and validation demonstrate the accuracy and efficiency of this adaptive strategy.
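
    A sketch of a feedback step-size rule of the general kind described: the step shrinks when a monitored variable changes too fast and grows when the change is well below target. The gains, limits, and names are illustrative assumptions; the paper's actual controller is not reproduced here.

```python
def adapt_dt(dt, rel_change, target=0.05, grow=1.25, shrink=0.5,
             dt_min=1e-6, dt_max=1.0):
    """Feedback rule: shrink the step when the monitored variable moved
    more than `target` per step, grow it when the motion is well below."""
    if rel_change > target:
        dt *= shrink
    elif rel_change < 0.25 * target:
        dt *= grow
    return min(max(dt, dt_min), dt_max)

# Usage inside a transient loop (schematic):
# dt = adapt_dt(dt, abs(p_new - p_old) / max(abs(p_old), 1e-12))
```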

  15. Ballistic impact response of lipid membranes.

    PubMed

    Zhang, Yao; Meng, Zhaoxu; Qin, Xin; Keten, Sinan

    2018-03-08

    Therapeutic agent loaded micro and nanoscale particles as high-velocity projectiles can penetrate cells and tissues, thereby serving as gene and drug delivery vehicles for direct and rapid internalization. Despite recent progress in developing micro/nanoscale ballistic tools, the underlying biophysics of how fast projectiles deform and penetrate cell membranes is still poorly understood. To understand the rate and size-dependent penetration processes, we present coarse-grained molecular dynamics simulations of the ballistic impact of spherical projectiles on lipid membranes. Our simulations reveal that upon impact, the projectile can pursue one of three distinct pathways. At low velocities below the critical penetration velocity, projectiles rebound off the surface. At intermediate velocities, penetration occurs after the projectile deforms the membrane into a tubular thread. At very high velocities, rapid penetration occurs through localized membrane deformation without tubulation. Membrane tension, projectile velocity and size govern which phenomenon occurs, owing to their positive correlation with the reaction force generated between the projectile and the membrane during impact. Two critical membrane tension values dictate the boundaries among the three pathways for a given system, due to the rate dependence of the stress generated in the membrane. Our findings provide broad physical insights into the ballistic impact response of soft viscous membranes and guide design strategies for drug delivery through lipid membranes using micro/nanoscale ballistic tools.

  16. Visualization in simulation tools: requirements and a tool specification to support the teaching of dynamic biological processes.

    PubMed

    Jørgensen, Katarina M; Haddow, Pauline C

    2011-08-01

    Simulation tools are playing an increasingly important role behind advances in the field of systems biology. However, the current generation of biological science students has little or no experience with such tools. This educational gap limits both the potential use of such tools and the potential for tighter cooperation between their designers and users. Although some simulation tool producers encourage their use in teaching, little attempt has hitherto been made to analyze and discuss their suitability as educational tools for non-computing-science students. In general, today's simulation tools assume that the user has a stronger mathematical and computing background than is found in most biological science curricula, making the introduction of such tools a considerable pedagogical challenge. This paper provides an evaluation of the pedagogical attributes of existing simulation tools for cell signal transduction based on Cognitive Load theory. Further, design recommendations for an improved educational simulation tool are provided. The study is based on simulation tools for cell signal transduction, but the discussion is relevant to a broader set of biological simulation tools.

  17. A fast mass spring model solver for high-resolution elastic objects

    NASA Astrophysics Data System (ADS)

    Zheng, Mianlun; Yuan, Zhiyong; Zhu, Weixu; Zhang, Guian

    2017-03-01

    Real-time simulation of elastic objects is of great importance for computer graphics and virtual reality applications. The fast mass spring model solver can achieve visually realistic simulation in an efficient way. Unfortunately, this method suffers from resolution limitations and a lack of mechanical realism for surface geometry models, which greatly restricts its application. To tackle these problems, in this paper we propose a fast mass spring model solver for high-resolution elastic objects. First, we project the complex surface geometry model into a set of uniform grid cells serving as cages, using the mean value coordinate method, to reflect its internal structure and mechanical properties. Then, we replace the original Cholesky decomposition in the fast mass spring model solver with a conjugate gradient method, which makes the solver more efficient for detailed surface geometry models. Finally, we propose a graphics processing unit accelerated parallel algorithm for the conjugate gradient method. Experimental results show that our method achieves efficient deformation simulation of 3D elastic objects with visual realism and physical fidelity, and has great potential for applications in computer animation.
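
    For reference, the textbook conjugate gradient loop that replaces the Cholesky solve in the global step; assembling the mass-spring system matrix and the GPU parallelization described in the paper are outside this sketch.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-8, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A (dense, sparse, or
    any object supporting @). This is the standard CG loop that can stand
    in for a Cholesky factorization when A is large. b should be float."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```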

  18. Fast Simulation of Electromagnetic Showers in the ATLAS Calorimeter: Frozen Showers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barberio, E.; /Melbourne U.; Boudreau, J.

    2011-11-29

    One of the most time-consuming processes in simulating pp interactions in the ATLAS detector at the LHC is the simulation of electromagnetic showers in the calorimeter. In order to speed up the event simulation, several parametrisation methods are available in ATLAS. In this paper we present a short description of the frozen shower technique, together with some recent benchmarks and a comparison with full simulation. The expected high rate of proton-proton collisions in the ATLAS detector at the LHC requires large samples of simulated events (Monte Carlo) to study various physics processes. A detailed simulation of particle reactions ('full simulation') in the ATLAS detector is based on GEANT4 and is very accurate. However, due to the complexity of the detector, the high particle multiplicity, and GEANT4 itself, the average CPU time spent to simulate a typical QCD event in a pp collision is 20 or more minutes on modern computers. During detector simulation the largest share of time is spent in the calorimeters (up to 70%), most of it for electromagnetic particles in the electromagnetic (EM) part of the calorimeters. This is the motivation for fast simulation approaches which reduce the simulation time without affecting the accuracy. Several of the fast simulation methods available within the ATLAS simulation framework (the standard Athena-based simulation program) are discussed here, with a focus on the novel frozen shower library (FS) technique. The results obtained with FS are presented as well.

  19. Three-dimensional simulation of ultrasound propagation through trabecular bone structures measured by synchrotron microtomography.

    PubMed

    Bossy, Emmanuel; Padilla, Frédéric; Peyrin, Françoise; Laugier, Pascal

    2005-12-07

    Three-dimensional numerical simulations of ultrasound transmission were performed through 31 trabecular bone samples measured by synchrotron microtomography. The synchrotron microtomography provided high resolution 3D mappings of bone structures, which were used as the input geometry in the simulation software developed in our laboratory. While absorption (i.e. the absorption of ultrasound through dissipative mechanisms) was not taken into account in the algorithm, the simulations reproduced major phenomena observed in real through-transmission experiments in trabecular bone. The simulated attenuation (i.e. the decrease of the transmitted ultrasonic energy) varies linearly with frequency in the MHz frequency range. Both the speed of sound (SOS) and the slope of the normalized frequency-dependent attenuation (nBUA) increase with the bone volume fraction. Twenty-five out of the thirty-one samples exhibited negative velocity dispersion. One sample was rotated to align the main orientation of the trabecular structure with the direction of ultrasonic propagation, leading to the observation of a fast and a slow wave. Coupling numerical simulation with real bone architecture therefore provides a powerful tool to investigate the physics of ultrasound propagation in trabecular structures. As an illustration, comparison between results obtained on bone modelled either as a fluid or a solid structure suggested the major role of mode conversion of the incident acoustic wave to shear waves in bone to explain the large contribution of scattering to the overall attenuation.
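
    For reference, the linear frequency dependence noted above is conventionally written as below, with nBUA the attenuation slope normalized by sample thickness; this is the standard quantitative-ultrasound convention, not a formula quoted from the paper.

```latex
% Standard QUS convention (not quoted from this record):
\alpha(f) \;\approx\; \alpha_0 + \mathrm{nBUA}\cdot f,
\qquad [\mathrm{nBUA}] = \mathrm{dB\,cm^{-1}\,MHz^{-1}}
```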

  1. M2Lite: An Open-source, Light-weight, Pluggable and Fast Proteome Discoverer MSF to mzIdentML Tool.

    PubMed

    Aiyetan, Paul; Zhang, Bai; Chen, Lily; Zhang, Zhen; Zhang, Hui

    2014-04-28

    Proteome Discoverer is one of many tools used for protein database search and peptide-to-spectrum assignment in mass spectrometry-based proteomics. However, the inadequacy of conversion tools makes it challenging to compare and integrate its results with those of other analytical tools. Here we present M2Lite, an open-source, light-weight, easily pluggable and fast conversion tool. M2Lite converts Proteome Discoverer-derived MSF files to the proteomics-community-defined standard, the mzIdentML file format. M2Lite's source code is available as open-source at https://bitbucket.org/paiyetan/m2lite/src and its compiled binaries and documentation can be freely downloaded at https://bitbucket.org/paiyetan/m2lite/downloads.

  2. Fast in-situ tool inspection based on inverse fringe projection and compact sensor heads

    NASA Astrophysics Data System (ADS)

    Matthias, Steffen; Kästner, Markus; Reithmeier, Eduard

    2016-11-01

    Inspection of machine elements is an important task in production processes in order to ensure the quality of produced parts and to gather feedback for the continuous improvement process. A new measuring system is presented, which is capable of performing the inspection of critical tool geometries, such as gearing elements, inside the forming machine. To meet the constraints on sensor head size and inspection time imposed by the limited space inside the machine and the cycle time of the process, the measuring device employs a combination of endoscopy techniques with the fringe projection principle. Compact gradient index lenses enable a compact design of the sensor head, which is connected to a CMOS camera and a flexible micro-mirror based projector via flexible fiber bundles. Using common fringe projection patterns, the system achieves measuring times of less than five seconds. To further reduce the time required for inspection, the generation of inverse fringe projection patterns has been implemented for the system. Inverse fringe projection speeds up the inspection process by employing object-adapted patterns, which enable the detection of geometry deviations in a single image. Two different approaches to generate object adapted patterns are presented. The first approach uses a reference measurement of a manufactured tool master to generate the inverse pattern. The second approach is based on a virtual master geometry in the form of a CAD file and a ray-tracing model of the measuring system. Virtual modeling of the measuring device and inspection setup allows for geometric tolerancing for free-form surfaces by the tool designer in the CAD-file. A new approach is presented, which uses virtual tolerance specifications and additional simulation steps to enable fast checking of metric tolerances. Following the description of the pattern generation process, the image processing steps required for inspection are demonstrated on captures of gearing geometries.

  3. Challenges of NDE simulation tool validation, optimization, and utilization for composites

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter

    2016-02-01

    Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and prediction of inspectability for advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large scale, complex geometry composite components/vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described along with examples of simulation validation and optimization challenges that are more broadly applicable to all NDE simulation tools. The paper will also discuss examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and associated challenges in experimentally validating those methods.

  4. Examining the validity of the ActivPAL monitor in measuring posture and ambulatory movement in children

    PubMed Central

    2012-01-01

    Background Decreasing sedentary activities that involve prolonged sitting may be an important strategy to reduce obesity and other physical and psychosocial health problems in children. The first step to understanding the effect of sedentary activities on children's health is to objectively assess these activities with a valid measurement tool. Purpose To examine the validity of the ActivPAL monitor in measuring sitting/lying, standing, and walking time, transition counts, and step counts in children in a laboratory setting. Methods Twenty-five healthy elementary school children (age 9.9 ± 0.3 years; BMI 18.2 ± 1.9; mean ± SD) were randomly recruited across the Auckland region, New Zealand. Children were fitted with ActivPAL monitors and observed during simulated free-living activities involving sitting/lying, standing and walking, followed by treadmill and over-ground activities at various speeds (slow, normal, fast), with video observation as the criterion measure. The ActivPAL sit-to-stand and stand-to-sit transition counts and steps were also compared with video data. The accuracy of step counts measured by the ActivPAL was also compared against the New Lifestyles NL-2000 and the Yamax Digi-Walker SW-200 pedometers. Results ActivPAL-measured time spent sitting/lying, standing, and walking during simulated free-living activities correlated perfectly with direct observation. Correlations between the ActivPAL and video observation in total numbers of sit-to-stand and stand-to-sit transitions were high (r = 0.99 ± 0.01). Unlike the pedometers, the ActivPAL did not misclassify fidgeting as steps taken. Strong correlations (r = 0.88-1.00) between ActivPAL step counts and video observation were also observed in both treadmill and over-ground slow and normal walking. During treadmill and over-ground fast walking and running, the correlations were low (r = 0.21-0.46). Conclusion The ActivPAL monitor is a valid tool for measuring time spent sitting/lying, standing, and walking, sit-to-stand and stand-to-sit transition counts, and step counts in slow and normal walking. The device did not accurately measure steps taken during treadmill and over-ground fast walking and running in children. PMID:23031188

  5. Existing and Required Modeling Capabilities for Evaluating ATM Systems and Concepts

    NASA Technical Reports Server (NTRS)

    Odoni, Amedeo R.; Bowman, Jeremy; Delahaye, Daniel; Deyst, John J.; Feron, Eric; Hansman, R. John; Khan, Kashif; Kuchar, James K.; Pujet, Nicolas; Simpson, Robert W.

    1997-01-01

    ATM systems throughout the world are entering a period of major transition and change. The combination of important technological developments and of the globalization of the air transportation industry has necessitated a reexamination of some of the fundamental premises of existing Air Traffic Management (ATM) concepts. New ATM concepts have to be examined, concepts that may place more emphasis on: strategic traffic management; planning and control; partial decentralization of decision-making; and added reliance on the aircraft to carry out strategic ATM plans, with ground controllers confined primarily to a monitoring and supervisory role. 'Free Flight' is a case in point. In order to study, evaluate and validate such new concepts, the ATM community will have to rely heavily on models and computer-based tools/utilities, covering a wide range of issues and metrics related to safety, capacity and efficiency. The state of the art in such modeling support is adequate in some respects, but clearly deficient in others. It is the objective of this study to assist in: (1) assessing the strengths and weaknesses of existing fast-time models and tools for the study of ATM systems and concepts and (2) identifying and prioritizing the requirements for the development of additional modeling capabilities in the near future. A three-stage process has been followed for this purpose: 1. Through the analysis of two case studies involving future ATM system scenarios, as well as through expert assessment, modeling capabilities and supporting tools needed for testing and validating future ATM systems and concepts were identified and described. 2. Existing fast-time ATM models and support tools were reviewed and assessed with regard to the degree to which they offer the capabilities identified under Step 1. 3. The findings of 1 and 2 were combined to draw conclusions about (1) the best capabilities currently existing, (2) the types of concept testing and validation that can be carried out reliably with such existing capabilities and (3) the currently unavailable modeling capabilities that should receive high priority for near-term research and development. It should be emphasized that the study is concerned only with the class of 'fast time' analytical and simulation models. 'Real time' models, which typically involve humans in the loop, comprise another extensive class that is not addressed in this report. However, the relationship between some of the fast-time models reviewed and a few well-known real-time models is identified in several parts of this report, and the potential benefits from the combined use of these two classes of models, a very important subject, are discussed in chapters 4 and 7.

  6. Temporal Gillespie Algorithm: Fast Simulation of Contagion Processes on Time-Varying Networks

    PubMed Central

    Vestergaard, Christian L.; Génois, Mathieu

    2015-01-01

    Stochastic simulations are one of the cornerstones of the analysis of dynamical processes on complex networks, and are often the only accessible way to explore their behavior. The development of fast algorithms is paramount to allow large-scale simulations. The Gillespie algorithm can be used for fast simulation of stochastic processes, and variants of it have been applied to simulate dynamical processes on static networks. However, its adaptation to temporal networks remains non-trivial. We here present a temporal Gillespie algorithm that solves this problem. Our method is applicable to general Poisson (constant-rate) processes on temporal networks, stochastically exact, and up to multiple orders of magnitude faster than traditional simulation schemes based on rejection sampling. We also show how it can be extended to simulate non-Markovian processes. The algorithm is easily applicable in practice, and as an illustration we detail how to simulate both Poissonian and non-Markovian models of epidemic spreading. Namely, we provide pseudocode and its implementation in C++ for simulating the paradigmatic Susceptible-Infected-Susceptible and Susceptible-Infected-Recovered models and a Susceptible-Infected-Recovered model with non-constant recovery rates. For empirical networks, the temporal Gillespie algorithm is here typically from 10 to 100 times faster than rejection sampling. PMID:26517860
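
    To make the normalized-waiting-time idea above concrete, here is a minimal Python sketch of a temporal Gillespie loop for the SIS model, assuming the temporal network is given as a list of edge-list snapshots of equal duration; the function and variable names are illustrative, and this is a simplified stand-in for the authors' C++ implementation rather than their code.

      import random

      def temporal_gillespie_sis(snapshots, dt, beta, mu, infected):
          """SIS on a temporal network given as a list of per-step edge lists.

          snapshots : list of lists of (u, v) contacts, one list per step of length dt
          infected  : set of initially infected nodes (modified in place)
          """
          tau = random.expovariate(1.0)      # waiting time in normalized, rate-integrated units
          history = []
          for step, edges in enumerate(snapshots):
              t_left = dt                    # real time remaining in this snapshot
              while True:
                  # Enumerate possible transitions and their rates in the current state.
                  events = [(mu, "recover", i) for i in infected]
                  for u, v in edges:
                      if (u in infected) != (v in infected):
                          events.append((beta, "infect", v if u in infected else u))
                  total = sum(rate for rate, _, _ in events)
                  if total * t_left < tau:   # no event fires before the snapshot ends
                      tau -= total * t_left
                      break
                  t_left -= tau / total      # real time consumed until the event fires
                  x = random.uniform(0.0, total)
                  for rate, kind, node in events:
                      x -= rate
                      if x <= 0.0:
                          (infected.discard if kind == "recover" else infected.add)(node)
                          break
                  tau = random.expovariate(1.0)   # redraw for the next event
              history.append((step * dt, len(infected)))
          return history

      random.seed(0)
      snaps = [[(0, 1), (1, 2)], [(0, 2)], [(1, 2), (2, 3)]] * 100   # toy contact sequence
      print(temporal_gillespie_sis(snaps, dt=0.1, beta=0.5, mu=0.1, infected={0})[-1])

    Because the waiting time is drawn in rate-integrated units and simply carried across snapshots, no candidate events are ever rejected, which is the source of the speed-up over rejection sampling.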

  8. CscoreTool: fast Hi-C compartment analysis at high resolution.

    PubMed

    Zheng, Xiaobin; Zheng, Yixian

    2018-05-01

    Genome-wide chromosome conformation capture (Hi-C) has revealed that the eukaryotic genome can be partitioned into A and B compartments that have distinctive chromatin and transcription features. Current Principal Component Analysis (PCA)-based methods for A/B compartment prediction from Hi-C data require substantial CPU time and memory. We report the development of a method, CscoreTool, which enables fast and memory-efficient determination of A/B compartments at high resolution even in datasets with low sequencing depth. Availability and implementation: https://github.com/scoutzxb/CscoreTool. Contact: xzheng@carnegiescience.edu. Supplementary data are available at Bioinformatics online.
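
    For context, here is a minimal Python sketch of the PCA-based compartment call that CscoreTool is benchmarked against (not of CscoreTool itself); the observed/expected normalization and the toy input are illustrative assumptions.

      import numpy as np

      def ab_compartments_pca(contacts):
          """A/B call from a dense intrachromosomal contact matrix via PCA."""
          n = contacts.shape[0]
          # Observed/expected normalization: divide each diagonal by its mean.
          oe = np.zeros_like(contacts, dtype=float)
          for d in range(n):
              diag = np.diagonal(contacts, offset=d).astype(float)
              m = diag.mean()
              if m > 0:
                  idx = np.arange(n - d)
                  oe[idx, idx + d] = diag / m
                  oe[idx + d, idx] = diag / m
          corr = np.nan_to_num(np.corrcoef(oe))       # Pearson correlation matrix
          vals, vecs = np.linalg.eigh(corr)           # symmetric eigendecomposition
          pc1 = vecs[:, np.argmax(vals)]              # leading eigenvector
          # In practice the sign of pc1 is oriented with external data (e.g. GC content).
          return np.where(pc1 >= 0, "A", "B"), pc1

      rng = np.random.default_rng(0)
      M = rng.poisson(5, size=(50, 50)); M = M + M.T  # toy symmetric contact matrix
      labels, pc1 = ab_compartments_pca(M)
      print(labels[:10])

    The dense correlation matrix and full eigendecomposition are exactly the CPU- and memory-heavy steps that CscoreTool is designed to avoid.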

  9. Pilot study to investigate the feasibility of the Home Falls and Accidents Screening Tool (HOME FAST) to identify older Malaysian people at risk of falls.

    PubMed

    Romli, Muhammad Hibatullah; Mackenzie, Lynette; Lovarini, Meryl; Tan, Maw Pin

    2016-08-16

    The relationship between home hazards and falls in older Malaysian people is not yet fully understood. No tools to evaluate the Malaysian home environment currently exist. Therefore, this study aimed to pilot the Home Falls and Accidents Screening Tool (HOME FAST) to identify hazards in Malaysian homes, to evaluate the feasibility of using the HOME FAST in the Malaysian Elders Longitudinal Research (MELoR) study and to gather preliminary data about the experience of falls among a small sample of Malaysian older people. A cross-sectional pilot study was conducted in an urban setting in Kuala Lumpur. Twenty-six older people aged 60 and over were recruited from the control group of a related research project in Malaysia, in addition to older people known to the researchers. The HOME FAST was applied with the baseline survey for the MELoR study via a face-to-face interview and observation of the home by research staff. The majority of the participants were female, of Malay or Chinese ethnicity and living with others in a double-storeyed house. Falls in the previous year were reported by 19%, and 80% of falls occurred at home. Gender and fear of falling had the strongest associations with home hazards. Most hazards were detected in the bathroom area. A small number of errors were detected in the HOME FAST ratings by researchers. The HOME FAST is feasible as a research and clinical tool for the Malaysian context and is appropriate for use in the MELoR study. Home hazards were prevalent in the homes of older people, and further research with the larger MELoR sample is needed to confirm the validity of using the HOME FAST in Malaysia. Training in the use of the HOME FAST is needed to ensure accurate use by researchers.

  10. CTAS: Computer intelligence for air traffic control in the terminal area

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    1992-01-01

    A system for the automated management and control of arrival traffic, referred to as the Center-TRACON Automation System (CTAS), has been designed by the ATC research group at NASA Ames Research Center. In a cooperative program, NASA and the FAA have efforts underway to install and evaluate the system at the Denver and Dallas/Ft. Worth airports. CTAS consists of three types of integrated tools that provide computer-generated intelligence for both Center and TRACON controllers to guide them in managing and controlling arrival traffic efficiently. One tool, the Traffic Management Advisor (TMA), establishes optimized landing sequences and landing times for aircraft arriving in the center airspace several hundred miles from the airport. In the TRACON, TMA resequences missed-approach aircraft and unanticipated arrivals. Another tool, the Descent Advisor (DA), generates clearances for the center controllers that deliver aircraft to the TRACON boundary at the crossing times provided by TMA. In the TRACON, the Final Approach Spacing Tool (FAST) provides heading and speed clearances that produce an accurately spaced flow of aircraft on the final approach course. A database consisting of aircraft performance models, airline-preferred operational procedures and real-time wind measurements contributes to the effective operation of CTAS. Extensive simulator evaluations of CTAS have demonstrated controller acceptance, delay reductions, and fuel savings.

  11. Determining the spatial altitude of the hydraulic fractures.

    NASA Astrophysics Data System (ADS)

    Khamiev, Marsel; Kosarev, Victor; Goncharova, Galina

    2016-04-01

    Mathematical modeling and numerical simulation are the most widely used approaches for solving geological problems. They rely on software tools based on the Monte Carlo method. The results of this project show the possibility of using a pulsed neutron logging (PNL) tool to determine the fracture location. The modeled medium is a homogeneous rock (limestone) cut by a vertical borehole (d = 216 mm) with a metal casing 9 mm thick. The cement sheath is 35 mm thick. The borehole is filled with fresh water. The rock mass is cut by a crack filled with a mixture of doped (gadolinium oxide, Gd2O3) proppant (75%) and water (25%). A PNL tool is used for quality control in hydraulic fracturing operations. It includes a fast neutron source (a so-called "neutron generator") and a set of thermal (or epithermal) neutron-sensing devices forming the so-called near (ND) and far (FD) detectors. To evaluate the neutron properties of various segments (sectors) of the rock mass, the detector must register only neutrons that come from that very formation. This is possible if the detecting block includes several (for example, six) thermal neutron detectors arranged circumferentially inside the tool. As a result we obtain several independent well logs, each corresponding to a defined rock sector. After processing the synthetic logs, we can determine the spatial position of the hydraulic fracture.

  12. An Engineering Tool for the Prediction of Internal Dielectric Charging

    NASA Astrophysics Data System (ADS)

    Rodgers, D. J.; Ryden, K. A.; Wrenn, G. L.; Latham, P. M.; Sorensen, J.; Levy, L.

    1998-11-01

    A practical internal charging tool has been developed. It provides an easy-to-use means for satellite engineers to predict whether on-board dielectrics are vulnerable to electrostatic discharge in the outer radiation belt. The tool is designed to simulate irradiation of single-dielectric planar or cylindrical structures with or without shielding. Analytical equations are used to describe current deposition in the dielectric. This is fast and gives charging currents to sufficient accuracy given the uncertainties in other aspects of the problem, particularly material characteristics. Time-dependent internal electric fields are calculated, taking into account the effect of electric field, dose rate and temperature on conductivity. A worst-case model of electron fluxes in the outer belt has been created specifically for the internal charging problem and is built into the code. For output, the tool gives a YES or NO decision on the susceptibility of the structure to internal electrostatic breakdown and, if necessary, calculates the required changes to bring the system below the breakdown threshold. A complementary programme of laboratory irradiations has been carried out to validate the tool. The results for epoxy-fibreglass samples show that the code models the electric field realistically for a wide variety of shields, dielectric thicknesses and electron spectra. Results for Teflon samples indicate that some further experimentation is required, and the radiation-induced conductivity aspects of the code have not been validated.
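
    The core balance such a tool evaluates can be illustrated with a toy integration of a single-dielectric charging equation, in which deposited current charges the layer while a field-dependent conductivity bleeds charge away; the conductivity model and all numerical values below are placeholders, not the tool's engineering data.

      import math

      EPS0 = 8.854e-12          # vacuum permittivity, F/m

      def charge_dielectric(j_dep, eps_r, sigma0, e_scale, hours, dt=10.0):
          """Integrate eps0*eps_r * dE/dt = j_dep - sigma(E) * E for a planar layer."""
          E, t, t_end = 0.0, 0.0, hours * 3600.0
          while t < t_end:
              sigma = sigma0 * math.exp(E / e_scale)   # placeholder field-enhanced conductivity
              E += (j_dep - sigma * E) / (EPS0 * eps_r) * dt
              t += dt
          return E          # compare the resulting field to a breakdown threshold

      # Example: ~1 pA/cm^2 deposited in a dielectric with eps_r = 3 (invented numbers)
      field = charge_dielectric(j_dep=1e-8, eps_r=3.0, sigma0=1e-16, e_scale=5e7, hours=24)
      print(f"Internal field after 24 h: {field:.3e} V/m")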

  13. Image-based deep learning for classification of noise transients in gravitational wave detectors

    NASA Astrophysics Data System (ADS)

    Razzano, Massimiliano; Cuoco, Elena

    2018-05-01

    The detection of gravitational waves has inaugurated the era of gravitational astronomy and opened new avenues for the multimessenger study of cosmic sources. Thanks to their sensitivity, the Advanced LIGO and Advanced Virgo interferometers will probe a much larger volume of space and expand the capability of discovering new gravitational wave emitters. The characterization of these detectors is a primary task in order to recognize the main sources of noise and optimize the sensitivity of interferometers. Glitches are transient noise events that can impact the data quality of the interferometers and their classification is an important task for detector characterization. Deep learning techniques are a promising tool for the recognition and classification of glitches. We present a classification pipeline that exploits convolutional neural networks to classify glitches starting from their time-frequency evolution represented as images. We evaluated the classification accuracy on simulated glitches, showing that the proposed algorithm can automatically classify glitches on very fast timescales and with high accuracy, thus providing a promising tool for online detector characterization.
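
    A minimal PyTorch sketch of this kind of image-based classifier follows; the input size, network depth and class count are illustrative choices, not the architecture used by the authors.

      import torch
      import torch.nn as nn

      class GlitchCNN(nn.Module):
          """Small CNN over time-frequency (spectrogram) images of glitches."""
          def __init__(self, n_classes=6):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
                  nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
                  nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
              )
              self.classifier = nn.Sequential(
                  nn.Flatten(), nn.Linear(64 * 8 * 8, 128), nn.ReLU(),
                  nn.Linear(128, n_classes),
              )

          def forward(self, x):            # x: (batch, 1, 64, 64) spectrogram images
              return self.classifier(self.features(x))

      model = GlitchCNN()
      logits = model(torch.randn(4, 1, 64, 64))   # four dummy spectrograms
      print(logits.shape)                          # torch.Size([4, 6])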

  14. BDA: A novel method for identifying defects in body-centered cubic crystals.

    PubMed

    Möller, Johannes J; Bitzek, Erik

    2016-01-01

    The accurate and fast identification of crystallographic defects plays a key role in the analysis of atomistic simulation output data. For face-centered cubic (fcc) metals, most existing structure analysis tools allow for the direct distinction of common defects, such as stacking faults or certain low-index surfaces. For body-centered cubic (bcc) metals, on the other hand, a robust way to identify such defects is currently not easily available. We therefore introduce a new method for analyzing atomistic configurations of bcc metals, the BCC Defect Analysis (BDA). It uses existing structure analysis algorithms and combines their results to uniquely distinguish between typical defects in bcc metals. In essence, the BDA method offers the following features:
    • Identification of typical defect structures in bcc metals.
    • Reduction of erroneously identified defects by iterative comparison to the defects in the atom's neighborhood.
    • Availability as a ready-to-use Python script for the widespread visualization tool OVITO [http://ovito.org].

  15. Oak Ridge National Laboratory Support of Non-light Water Reactor Technologies: Capabilities Assessment for NRC Near-term Implementation Action Plans for Non-light Water Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belles, Randy; Jain, Prashant K.; Powers, Jeffrey J.

    The Oak Ridge National Laboratory (ORNL) has a rich history of support for light water reactor (LWR) and non-LWR technologies. That history includes the operation of 13 reactors at ORNL, including the graphite reactor dating back to World War II, two aqueous homogeneous reactors, two molten salt reactors (MSRs), a fast-burst health physics reactor, and seven LWRs. Operation of the High Flux Isotope Reactor (HFIR) has been ongoing since 1965. Expertise exists among the ORNL staff to provide non-LWR training; support evaluation of non-LWR licensing and safety issues; perform modeling and simulation using advanced computational tools; run laboratory experiments using equipment such as the liquid salt component test facility; and perform in-depth fuel performance and thermal-hydraulic technology reviews using a vast suite of computer codes and tools. Summaries of this expertise are included in this paper.

  16. The impact of reduced gastric acid secretion on dissolution of salts of weak bases in the fasted upper gastrointestinal lumen: Data in biorelevant media and in human aspirates.

    PubMed

    Litou, Chara; Vertzoni, Maria; Xu, Wei; Kesisoglou, Filippos; Reppas, Christos

    2017-06-01

    To propose media for simulating the intragastric environment under reduced gastric acid secretion in the fasted state at three levels of simulation of the gastric environment, and to evaluate their usefulness in evaluating the intragastric dissolution of salts of weak bases. Also, to evaluate the importance of bicarbonate buffer in biorelevant in vitro dissolution testing when using Level II biorelevant media simulating the environment in the fasted upper small intestine, regardless of gastric acid secretion. Media for simulating the hypochlorhydric and achlorhydric conditions in the stomach were proposed using phosphate, maleate and bicarbonate buffers. The impact of bicarbonates in Level II biorelevant media simulating the environment in the upper small intestine was evaluated while pH and bulk buffer capacity were maintained. Dissolution data were collected using two model compounds, pioglitazone hydrochloride and a semifumarate cocrystal of Compound B, with the mini-paddle dissolution apparatus, in biorelevant media and in human aspirates. The simulated gastric fluids proposed in this study were in line with the pH, buffer capacity, pepsin content, total bile salt/lecithin content and osmolality of the fasted stomach under partial and under complete inhibition of gastric acid secretion. Fluids simulating the conditions under partial inhibition of acid secretion were useful in simulating concentrations of both model compounds in gastric aspirates. Bicarbonates in Level III biorelevant gastric media and in Level II biorelevant media simulating the composition of the upper intestinal lumen did not improve the simulation of concentrations in human aspirates. Level III biorelevant media for simulating the intragastric environment under hypochlorhydric conditions were proposed, and their usefulness in the evaluation of concentrations of two model salts of weak bases in gastric aspirates was shown. Level II biorelevant media simulating the environment in the upper intestinal lumen led to underestimation of concentrations in aspirates, even when bicarbonate buffer was used.

  17. Fast emulation of track reconstruction in the CMS simulation

    NASA Astrophysics Data System (ADS)

    Komm, Matthias; CMS Collaboration

    2017-10-01

    Simulated samples of various physics processes are a key ingredient of analyses that unlock the physics behind LHC collision data. Samples with ever-larger statistics are required to keep up with the increasing amount of recorded data. During sample generation, significant computing time is spent on the reconstruction of charged-particle tracks from energy deposits, a cost that additionally scales with the pileup conditions. In CMS, the FastSimulation package is developed to provide a fast alternative to the standard simulation and reconstruction workflow. It employs various techniques to emulate track reconstruction effects in particle collision events. Several analysis groups in CMS are utilizing the package, in particular those requiring many samples to scan the parameter space of physics models (e.g. SUSY) or for the purpose of estimating systematic uncertainties. The strategies for and recent developments in this emulation are presented, including a novel, flexible implementation of tracking emulation that retains a sufficient, tuneable accuracy.

  18. Formation Algorithms and Simulation Testbed

    NASA Technical Reports Server (NTRS)

    Wette, Matthew; Sohl, Garett; Scharf, Daniel; Benowitz, Edward

    2004-01-01

    Formation flying for spacecraft is a rapidly developing field that will enable a new era of space science. For one of its missions, the Terrestrial Planet Finder (TPF) project has selected a formation flying interferometer design to detect earth-like planets orbiting distant stars. In order to advance technology needed for the TPF formation flying interferometer, the TPF project has been developing a distributed real-time testbed to demonstrate end-to-end operation of formation flying with TPF-like functionality and precision. This is the Formation Algorithms and Simulation Testbed (FAST). FAST was conceived to bring out issues in timing, data fusion, inter-spacecraft communication, inter-spacecraft sensing and system-wide formation robustness. In this paper we describe the FAST and show results from a two-spacecraft formation scenario. The two-spacecraft simulation is the first time that precision end-to-end formation flying operation has been demonstrated in a distributed real-time simulation environment.

  19. Malingering and PTSD: detecting malingering and war related PTSD by Miller Forensic Assessment of Symptoms Test (M-FAST).

    PubMed

    Ahmadi, Khodabakhsh; Lashani, Zeynab; Afzali, Mohammad Hassan; Tavalaie, S Abbas; Mirzaee, Jafar

    2013-05-29

    Malingering is prevalent in PTSD, especially in delayed-onset PTSD. Despite attempts to detect it, the indicators, tools and methods needed to accurately detect malingering require extensive scientific and clinical research. Therefore, this study was designed to validate a tool that can detect malingering of war-related PTSD, the Miller Forensic Assessment of Symptoms Test (M-FAST). In this blind clinical diagnosis study, one hundred and twenty veterans referred to the War Related PTSD Diagnosis Committee in Iran in 2011 were enrolled. In the first step, the clients received a psychiatric diagnosis and were divided into two groups based on the DSM-IV-TR; in the second step, the participants completed the M-FAST. A t-test on M-FAST scores showed a significant difference between the two groups (t = 14.058, P < 0.0001): 92% of participants malingering war-related PTSD scored more than 6, and 87% of the PTSD group scored less than 6 on the M-FAST scale. The M-FAST showed a significant difference between war-related PTSD and malingering participants. A cutoff score of ≥6 on the M-FAST was suggested to detect malingering of war-related PTSD.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koh, J. H.; Ng, E. Y. K.; Robertson, Amy

    As part of a collaboration between the National Renewable Energy Laboratory (NREL) and SWAY AS, NREL installed scientific wind, wave, and motion measurement equipment on the spar-type 1/6.5th-scale prototype SWAY floating offshore wind system. The equipment enhanced SWAY's data collection and allowed SWAY to verify the concept and NREL to validate a FAST model of the SWAY design in an open-water condition. Nanyang Technological University (NTU), in collaboration with NREL, assisted with the validation. This final report gives an overview of the SWAY prototype and NREL and NTU's efforts to validate a model of the system. The report provides a summary of the different software tools used in the study, the modeling strategies, and the development of a FAST model of the SWAY prototype wind turbine, including justification of the modeling assumptions. Because of uncertainty in system parameters and modeling assumptions due to the complexity of the design, several system properties were tuned to better represent the system and improve the accuracy of the simulations. Calibration was performed using data from a static equilibrium test and free-decay tests.

  1. A Fixed-Wing Aircraft Simulation Tool for Improving the efficiency of DoD Acquisition

    DTIC Science & Technology

    2015-10-05

    simulation tool, CREATE™-AV Helios [12-14], a high-fidelity rotary-wing vehicle simulation tool, and CREATE™-AV DaVinci [15-16], a conceptual through...05/2015 Oct 2008-Sep 2015 A Fixed-Wing Aircraft Simulation Tool for Improving the Efficiency of DoD Acquisition Scott A. Morton and David R...multi-disciplinary fixed-wing virtual aircraft simulation tool incorporating aerodynamics, structural dynamics, kinematics, and kinetics. Kestrel allows

  2. E-FAST-Exposure and Fate Assessment Screening Tool Version 2014

    EPA Pesticide Factsheets

    E-FAST estimates potential exposures of the general population and surface water concentrations based on releases from industrial operations and on the basic physical-chemical properties and fate parameters of the substance.

  3. Fast Multipole Methods for Three-Dimensional N-body Problems

    NASA Technical Reports Server (NTRS)

    Koumoutsakos, P.

    1995-01-01

    We are developing computational tools for the simulations of three-dimensional flows past bodies undergoing arbitrary motions. High-resolution viscous vortex methods have been developed that allow for extended simulations of two-dimensional configurations such as vortex generators. Our objective is to extend this methodology to three dimensions and develop a robust computational scheme for the simulation of such flows. A fundamental issue in the use of vortex methods is the ability to employ efficiently large numbers of computational elements to resolve the large range of scales that exist in complex flows. The traditional cost of the method scales as O(N^2), as the N computational elements/particles induce velocities on each other, making the method unacceptable for simulations involving more than a few tens of thousands of particles. In the last decade fast methods have been developed that have operation counts of O(N log N) or O(N) (referred to as BH and GR respectively), depending on the details of the algorithm. These methods are based on the observation that the effect of a cluster of particles at a certain distance may be approximated by a finite series expansion. In order to exploit this observation we need to decompose the element population spatially into clusters of particles and build a hierarchy of clusters (a tree data structure): smaller neighboring clusters combine to form a cluster of the next size up in the hierarchy, and so on. This hierarchy of clusters allows one to determine efficiently when the approximation is valid. This algorithm is an N-body solver that appears in many fields of engineering and science. Some examples of its diverse use are in astrophysics, molecular dynamics, micro-magnetics, boundary element simulations of electromagnetic problems, and computer animation. More recently these N-body solvers have been implemented and applied in simulations involving vortex methods. Koumoutsakos and Leonard (1995) implemented the GR scheme in two dimensions for vector computer architectures, allowing for simulations of bluff body flows using millions of particles. Winckelmans presented three-dimensional, viscous simulations of interacting vortex rings, using vortons and an implementation of a BH scheme for parallel computer architectures. Bhatt presented a vortex filament method to perform inviscid vortex ring interactions, with an alternative implementation of a BH scheme for a Connection Machine parallel computer architecture.
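
    The clustering idea behind the BH-type schemes can be sketched compactly: build a quadtree over the particles and replace any sufficiently distant cluster by its monopole (center-of-mass) term. The 2-D Python sketch below uses a gravitational-style kernel and a simple opening-angle test for illustration; production vortex codes use the Biot-Savart kernel, higher-order expansions and far more careful tree construction.

      import math

      THETA = 0.5   # opening angle: smaller is more accurate but slower

      class Cell:
          """Quadtree cell storing total mass and center of mass of its particles."""
          def __init__(self, pts, cx, cy, half):
              self.half = half
              self.mass = sum(m for _, _, m in pts)
              self.comx = sum(x * m for x, _, m in pts) / self.mass
              self.comy = sum(y * m for _, y, m in pts) / self.mass
              self.pts, self.kids = pts, []
              if len(pts) > 1:   # subdivide to single-particle leaves (positions must be distinct)
                  for dx in (-0.5, 0.5):
                      for dy in (-0.5, 0.5):
                          sub = [p for p in pts
                                 if (p[0] >= cx) == (dx > 0) and (p[1] >= cy) == (dy > 0)]
                          if sub:
                              self.kids.append(Cell(sub, cx + dx * half, cy + dy * half, half / 2))

      def accel(cell, x, y, eps=1e-12):
          """Field at (x, y): monopole term if the cell is far enough, else recurse."""
          dx, dy = cell.comx - x, cell.comy - y
          r = math.hypot(dx, dy) + eps
          if not cell.kids:                                # leaf: exact term (skip self)
              if cell.pts[0][0] == x and cell.pts[0][1] == y:
                  return 0.0, 0.0
              return cell.mass * dx / r**3, cell.mass * dy / r**3
          if 2 * cell.half / r < THETA:                    # far cluster: use its monopole
              return cell.mass * dx / r**3, cell.mass * dy / r**3
          ax = ay = 0.0
          for kid in cell.kids:
              kx, ky = accel(kid, x, y)
              ax, ay = ax + kx, ay + ky
          return ax, ay

      pts = [(0.1, 0.2, 1.0), (0.8, 0.7, 2.0), (0.4, 0.9, 1.5), (0.6, 0.1, 1.0)]
      root = Cell(pts, cx=0.5, cy=0.5, half=0.5)
      print(accel(root, 0.1, 0.2))   # O(N log N) overall, versus the O(N^2) direct sum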

  4. Phase space effects on fast ion transport modeling in tokamaks

    NASA Astrophysics Data System (ADS)

    Podesta, Mario

    2015-11-01

    Simulations of burning plasmas require a consistent treatment of energetic particles (EP), possibly including the effects of instabilities. Reduced EP transport models are emerging as an effective tool to account for those effects in long time-scale simulations. Available models differ essentially in their main transport drive, which is associated with gradients in real or phase space. It is crucial to assess to what extent those different assumptions affect computed quantities such as the EP profile, neutral beam (NB) driven current and energy/momentum transfer to the thermal populations. These issues are investigated through a kick model, which includes modifications of the EP distribution by instabilities in real and velocity space. TRANSP simulations including the kick model are applied to NB-heated NSTX discharges featuring unstable toroidal Alfvén eigenmodes (TAEs). Results show that TAEs mainly affect fast ions with large parallel velocity, i.e. those most effective for NB current drive. Other portions of the EP distribution are nearly unperturbed. Core NB driven current decreases by 10-30%, with even larger relative changes toward the plasma edge. When TAEs evolve in so-called avalanches, the model reproduces measured drops of ~ 10% in the neutron rate. Consistently with previous results, the drop is caused by both EP energy loss and EP redistribution. These results are compared to those from a simple diffusive model and a ``critical gradient'' model, which postulates the radial EP gradient as the only transport drive. The importance of EP velocity-space modifications is discussed in terms of accuracy of the predictions, with emphasis on neutral beam driven current. Work supported by U.S. DOE Contract DE-AC02-09CH11466.

  5. Consistent radiative transfer modeling of active and passive observations of precipitation

    NASA Astrophysics Data System (ADS)

    Adams, Ian

    2016-04-01

    Spaceborne platforms such as the Tropical Rainfall Measurement Mission (TRMM) and the Global Precipitation Measurement (GPM) mission exploit a combination of active and passive sensors to provide a greater understanding of the three-dimensional structure of precipitation. While "operationalized" retrieval algorithms require fast forward models, the ability to perform higher-fidelity simulations is necessary in order to understand the physics of remote sensing problems by testing assumptions and developing parameterizations for the fast models. To ensure proper synergy between active and passive modeling, forward models must be consistent when modeling the responses of radars and radiometers. This work presents a self-consistent radiative transfer model for simulating radar reflectivities and millimeter-wave brightness temperatures for precipitating scenes. To accomplish this, we extended the Atmospheric Radiative Transfer Simulator (ARTS) version 2.3 to solve the radiative transfer equation for active sensors and multiple scattering conditions. Early versions of ARTS (1.1) included a passive Monte Carlo solver, and ARTS is capable of handling atmospheres of up to three dimensions with ellipsoidal planetary geometries. The modular nature of ARTS facilitates extensibility, and the well-developed ray-tracing tools are suited for implementation of Monte Carlo algorithms. Finally, since ARTS handles the full Stokes vector, co- and cross-polarized reflectivity products are possible for scenarios that include nonspherical particles, with or without preferential alignment. The accuracy of the forward model will be demonstrated with precipitation events observed by TRMM and GPM, and the effects of multiple scattering will be detailed. The three-dimensional nature of the radiative transfer model will be useful for understanding the effects of nonuniform beamfill and multiple scattering for spatially heterogeneous precipitation events. The targets of this forward model are the GPM instruments: the Dual-frequency Precipitation Radar (DPR) and the GPM Microwave Imager (GMI).

  6. Neutronic calculation of fast reactors by the EUCLID/V1 integrated code

    NASA Astrophysics Data System (ADS)

    Koltashev, D. A.; Stakhanova, A. A.

    2017-01-01

    This article considers the neutronic calculation of the fast-neutron lead-cooled reactor BREST-OD-300 with the EUCLID/V1 integrated code. The main goal of the development and application of integrated codes is nuclear power plant safety justification. EUCLID/V1 is an integrated code designed for coupled neutronic, thermomechanical and thermohydraulic fast-reactor calculations under normal and abnormal operating conditions. The EUCLID/V1 code is being developed at the Nuclear Safety Institute of the Russian Academy of Sciences. The integrated code has a modular structure and consists of three main modules: the thermohydraulic module HYDRA-IBRAE/LM/V1, the thermomechanical module BERKUT and the neutronic module DN3D. In addition, the integrated code includes databases with fuel, coolant and structural material properties. The neutronic module DN3D provides full-scale simulation of neutronic processes in fast reactors. Heat source distributions, control rod movement, reactivity changes and other processes can be simulated. The neutron transport equation is solved in the multigroup diffusion approximation. This paper contains calculations implemented as part of EUCLID/V1 code validation. A transient simulation of the fast-neutron lead-cooled reactor BREST-OD-300 (fuel assembly floating, decompression of a passive feedback system channel) and cross-validation against MCU-FR code results are presented. The calculations demonstrate the application of the EUCLID/V1 code to BREST-OD-300 simulation and safety justification.
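
    As a minimal illustration of the diffusion-approximation eigenvalue problem that a neutronics module of this kind solves, the sketch below reduces the multigroup problem to one energy group in a 1-D slab and finds the multiplication factor by power iteration; all cross-section values are invented for illustration and bear no relation to BREST-OD-300 data.

      import numpy as np

      def k_eigenvalue(D, sig_a, nu_sig_f, L=100.0, n=200, iters=200):
          """One-group, 1-D slab diffusion: A*phi = (1/k)*F*phi, phi = 0 at the edges."""
          h = L / n
          A = np.zeros((n, n))               # discretized -D d2/dx2 + sig_a
          for i in range(n):
              A[i, i] = 2.0 * D / h**2 + sig_a
              if i > 0:
                  A[i, i - 1] = -D / h**2
              if i < n - 1:
                  A[i, i + 1] = -D / h**2
          phi, k = np.ones(n), 1.0
          for _ in range(iters):             # power iteration on the fission source
              src = nu_sig_f * phi
              phi = np.linalg.solve(A, src) / k
              k *= nu_sig_f * phi.sum() / src.sum()
          return k, phi

      # Invented one-group constants (cm and 1/cm units)
      k_eff, flux = k_eigenvalue(D=1.2, sig_a=0.010, nu_sig_f=0.011)
      print(f"k_eff = {k_eff:.4f}")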

  8. Automatic online spike sorting with singular value decomposition and fuzzy C-mean clustering.

    PubMed

    Oliynyk, Andriy; Bonifazzi, Claudio; Montani, Fernando; Fadiga, Luciano

    2012-08-08

    Understanding how neurons contribute to perception, motor functions and cognition requires the reliable detection of spiking activity of individual neurons during a number of different experimental conditions. An important problem in computational neuroscience is thus to develop algorithms to automatically detect and sort the spiking activity of individual neurons from extracellular recordings. While many algorithms for spike sorting exist, the problem of accurate and fast online sorting still remains a challenging issue. Here we present a novel software tool, called FSPS (Fuzzy SPike Sorting), which is designed to optimize: (i) fast and accurate detection, (ii) offline sorting and (iii) online classification of neuronal spikes with very limited or no human intervention. The method is based on a combination of singular value decomposition for fast and highly accurate pre-processing of spike shapes, unsupervised fuzzy C-mean clustering, high-resolution alignment of extracted spike waveforms, optimal selection of the number of features to retain, automatic identification of the number of clusters, and quantitative quality assessment of the resulting clusters independent of their size. After being trained on a short testing data stream, the method can reliably perform supervised online classification and monitoring of single neuron activity. The generalized procedure has been implemented in our FSPS spike sorting software (available free for non-commercial academic applications at the address: http://www.spikesorting.com) using LabVIEW (National Instruments, USA). We evaluated the performance of our algorithm both on benchmark simulated datasets with different levels of background noise and on real extracellular recordings from the premotor cortex of Macaque monkeys. The results of these tests showed an excellent accuracy in discriminating low-amplitude and overlapping spikes under strong background noise. The performance of our method is competitive with respect to other robust spike sorting algorithms. This new software provides neuroscience laboratories with a new tool for fast and robust online classification of single neuron activity. This feature could become crucial in situations when online spike detection from multiple electrodes is paramount, such as in human clinical recordings or in brain-computer interfaces.
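
    The front end of the pipeline described above can be sketched in a few lines of Python: SVD compresses aligned waveforms into a handful of features, and a textbook fuzzy C-means loop clusters them. This is a generic illustration under those assumptions, not the FSPS code.

      import numpy as np

      def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
          """Textbook fuzzy C-means: returns cluster centers and membership matrix U (c x n)."""
          rng = np.random.default_rng(seed)
          U = rng.random((c, len(X))); U /= U.sum(axis=0)          # random fuzzy memberships
          for _ in range(iters):
              W = U ** m
              centers = (W @ X) / W.sum(axis=1, keepdims=True)     # weighted means
              d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
              inv = d ** (-2.0 / (m - 1.0))
              U = inv / inv.sum(axis=0)                            # standard FCM update
          return centers, U

      # waveforms: one aligned spike per row (n_spikes x n_samples); stand-in data here
      waveforms = np.random.default_rng(1).normal(size=(500, 48))
      U_, s, Vt = np.linalg.svd(waveforms - waveforms.mean(0), full_matrices=False)
      features = U_[:, :3] * s[:3]                                 # first 3 SVD components
      centers, memberships = fuzzy_cmeans(features, c=3)
      labels = memberships.argmax(axis=0)                          # hard assignment per spike
      print(np.bincount(labels))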

  9. Countermeasure effectiveness against a man-portable air-defense system containing a two-color spinscan infrared seeker

    NASA Astrophysics Data System (ADS)

    Jackman, James; Richardson, Mark; Butters, Brian; Walmsley, Roy

    2011-12-01

    Man-portable air-defense (MANPAD) systems have developed sophisticated counter-countermeasures (CCM) to try to defeat any expendable countermeasure deployed by an aircraft. One of these is a seeker that can detect in two different parts of the electromagnetic spectrum. Termed two-color, the seeker can compare the emissions from the target and a countermeasure in different wavebands and reject the countermeasure. In this paper we describe the modeling of a two-color infrared seeker using COUNTERSIM, a missile engagement and countermeasure software simulation tool. First, the simulations model a MANPAD with a two-color CCM fired against a fast-jet model and a transport-aircraft model releasing reactive countermeasures. This is then compared with the case in which the aircraft releases countermeasures throughout the engagement up to the hit point, to investigate the optimum flare firing time. The results show that the release time of expendable decoys as a countermeasure against a MANPAD with a two-color CCM is critical.
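
    The two-color rejection logic can be caricatured in a few lines: the seeker compares a tracked source's intensity ratio in two wavebands and discards sources whose ratio is flare-like. The bands, ratio limits and numbers below are invented purely for illustration.

      def classify_source(intensity_band1, intensity_band2, ratio_min=0.2, ratio_max=1.5):
          """Return 'target' if the two-band intensity ratio is aircraft-like, else 'flare'."""
          ratio = intensity_band1 / max(intensity_band2, 1e-12)
          return "target" if ratio_min <= ratio <= ratio_max else "flare"

      # A hot flare is far brighter in one band than an airframe, pushing the
      # ratio outside the accept window, so the seeker ignores it.
      print(classify_source(5.0, 4.0))    # aircraft-like ratio -> 'target'
      print(classify_source(50.0, 4.0))   # flare-like ratio    -> 'flare'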

  10. A serious game for learning ultrasound-guided needle placement skills.

    PubMed

    Chan, Wing-Yin; Qin, Jing; Chui, Yim-Pan; Heng, Pheng-Ann

    2012-11-01

    Ultrasound-guided needle placement is a key step in many radiological intervention procedures such as biopsy, local anesthesia and fluid drainage. To help train future interventional radiologists, we developed a serious game to teach the skills involved. We introduce novel techniques for realistic simulation and integrate game elements for active and effective learning. The game is designed in the context of needle placement training based on some essential characteristics of serious games. Training scenarios are interactively generated via a block-based construction scheme. A novel example-based texture synthesis technique is proposed to simulate the corresponding ultrasound images. Game levels are defined based on the difficulty of the generated scenarios. Interactive recommendation of desirable insertion paths is provided during training as an adaptation mechanism. We also develop a fast physics-based approach to reproduce the shadowing effect of needles in ultrasound images. Game elements such as time-attack tasks, hints and performance evaluation tools are also integrated in our system. Extensive experiments were performed to validate its feasibility for training.

  11. MHD Calculation of halo currents and vessel forces in NSTX VDEs

    NASA Astrophysics Data System (ADS)

    Breslau, J. A.; Strauss, H. R.; Paccagnella, R.

    2012-10-01

    Research tokamaks such as ITER must be designed to tolerate a limited number of disruptions without sustaining significant damage. It is therefore vital to have numerical tools that can accurately predict the effects of these events. The 3D nonlinear extended MHD code M3D [1] can be used to simulate disruptions and calculate the associated wall currents and forces. It has now been validated against halo current data from NSTX experiments in which vertical displacement events (VDEs) were deliberately induced by turning off vertical feedback control. The results of high-resolution numerical simulations at realistic Lundquist numbers show reasonable agreement with the data, supporting a model in which the most dangerously asymmetric currents and heat loads, and the largest horizontal forces, arise in situations where a fast-growing ideal 2,1 external kink mode is destabilized by the scraping-off of flux surfaces with safety factor q>2 during the course of the VDE. [1] W. Park, et al., Phys. Plasmas 6 (1999) 1796.

  12. Bifurcated helical core equilibrium states in tokamaks

    NASA Astrophysics Data System (ADS)

    Cooper, W. A.; Chapman, I. T.; Schmitz, O.; Turnbull, A. D.; Tobias, B. J.; Lazarus, E. A.; Turco, F.; Lanctot, M. J.; Evans, T. E.; Graves, J. P.; Brunetti, D.; Pfefferlé, D.; Reimerdes, H.; Sauter, O.; Halpern, F. D.; Tran, T. M.; Coda, S.; Duval, B. P.; Labit, B.; Pochelon, A.; Turnyanskiy, M. R.; Lao, L.; Luce, T. C.; Buttery, R.; Ferron, J. R.; Hollmann, E. M.; Petty, C. C.; van Zeeland, M.; Fenstermacher, M. E.; Hanson, J. M.; Lütjens, H.

    2013-07-01

    Tokamaks with weak to moderate reversed central shear in which the minimum inverse rotational transform (safety factor) qmin is in the neighbourhood of unity can trigger bifurcated magnetohydrodynamic equilibrium states, one of which is similar to a saturated ideal internal kink mode. Peaked prescribed pressure profiles reproduce the ‘snake’ structures observed in many tokamaks, which has led to a novel explanation of the snake as a bifurcated equilibrium state. Snake equilibrium structures are computed in simulations of the tokamak à configuration variable (TCV), DIII-D and mega amp spherical torus (MAST) tokamaks. The internal helical deformations only weakly modulate the plasma-vacuum interface, which is more sensitive to ripple and resonant magnetic perturbations. On the other hand, the external perturbations do not alter the helical core deformation in a significant manner. The confinement of fast particles in MAST simulations deteriorates with the amplitude of the helical core distortion. These three-dimensional bifurcated solutions constitute a paradigm shift that motivates the application of tools developed for stellarator research to tokamak physics investigations.

  13. A water resources simulation gaming model for the Invitational Drought Tournament.

    PubMed

    Wang, K; Davies, E G R

    2015-09-01

    A system dynamics-based simulation gaming model, developed as a component of Agriculture and Agri-Food Canada's Invitational Drought Tournament (IDT; Hill et al., 2014), is introduced in this paper as a decision support tool for drought management at the river-basin scale. This IDT Model provides a comprehensive and integrated overview of drought conditions, and illustrates the broad effects of socio-economic drought and mitigation strategies. It is intended to provide a safe, user-friendly experimental environment with fast run-times for testing management options, and to promote collaborative decision-making and consensus building. Examples of model results from several recent IDT events demonstrate potential effects of drought and the short- to longer-term effectiveness of policies selected by IDT teams; such results have also improved teams' understanding of the complexity of water resources systems and their management trade-offs. The IDT Model structure and framework can also be reconfigured quickly for application to different river basins.
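
    The stock-and-flow structure underlying such a model can be illustrated with a toy reservoir balance, in which storage is updated by inflows, evaporation and releases each time step; the policy rule and all numbers below are invented for illustration.

      def simulate_reservoir(inflows, demand, capacity, evap_frac=0.02, storage=500.0):
          """Track water shortages as a reservoir stock is driven by inflows and demand."""
          shortages = []
          for q_in in inflows:                       # one step per period of inflow data
              storage += q_in - storage * evap_frac  # evaporation proportional to the stock
              release = min(demand, storage)         # simple policy: meet demand if possible
              storage = min(storage - release, capacity)
              shortages.append(demand - release)
          return shortages

      # Dry spell: inflows fall below demand and shortages appear in the later periods
      print(simulate_reservoir(inflows=[80, 60, 30, 10, 5, 20],
                               demand=50.0, capacity=1000.0, storage=60.0))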

  14. Fast Simulators for Satellite Cloud Optical Centroid Pressure Retrievals, 1. Evaluation of OMI Cloud Retrievals

    NASA Technical Reports Server (NTRS)

    Joiner, J.; Vasilkov, A. P.; Gupta, Pawan; Bhartia, P. K.; Veefkind, Pepijn; Sneep, Maarten; deHaan, Johan; Polonsky, Igor; Spurr, Robert

    2011-01-01

    We have developed a relatively simple scheme for simulating retrieved cloud optical centroid pressures (OCPs) from satellite solar backscatter observations. We have compared simulator results with those from more detailed retrieval simulators that more fully account for the complex radiative transfer in a cloudy atmosphere. We used this fast simulator to conduct a comprehensive evaluation of cloud OCPs from the two OMI algorithms using collocated data from CloudSat and Aqua MODIS, a unique situation afforded by the A-train formation of satellites. We find that both OMI algorithms perform reasonably well and that the two algorithms agree better with each other than either does with the collocated CloudSat data. This indicates that patchy snow/ice, 3-D cloud, and aerosol effects not simulated with the CloudSat data affect both algorithms similarly. We note that the collocation with CloudSat occurs mainly on the east side of OMI's swath; therefore, we are not able to address cross-track biases in OMI cloud OCP retrievals. Our fast simulator may also be used to simulate cloud OCPs from output generated by general circulation models (GCMs) with appropriate account of cloud overlap. We have implemented such a scheme and plan to compare OMI data with GCM output in the near future.

  15. Fast analysis of radionuclide decay chain migration

    NASA Astrophysics Data System (ADS)

    Chen, J. S.; Liang, C. P.; Liu, C. W.; Li, L.

    2014-12-01

    A novel tool for rapidly predicting the long-term plume behavior of a radionuclide decay chain of arbitrary length is presented in this study. The tool is based on compact generalized analytical solutions derived for a set of two-dimensional advection-dispersion equations coupled with sequential first-order decay reactions in a groundwater system. The performance of the developed tool is evaluated against a numerical model using a Laplace transform finite difference scheme. The results of the performance evaluation indicate that the developed model is robust and accurate. The developed model is then used to rapidly explore the transport behavior of a four-member radionuclide decay chain. Results show that the plume extents and concentration levels of any target radionuclide are very sensitive to the longitudinal and transverse dispersion, the decay rate constant and the retardation factor. The developed model is a useful tool for rapidly assessing the ecological and environmental impact of accidental radionuclide releases such as the Fukushima nuclear disaster, where multiple radionuclides leaked through the reactor, subsequently contaminating the local groundwater and ocean seawater in the vicinity of the nuclear plant.
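
    The sequential first-order decay that couples the transport equations is the classic Bateman chain; as an illustration of the reaction part alone (the coupled advection-dispersion solution is far longer), the sketch below evaluates the Bateman solution for a four-member chain with illustrative decay constants.

      import numpy as np

      def bateman(lambdas, N0, t):
          """Amount of each chain member at time t (requires distinct decay constants)."""
          n = len(lambdas)
          N = np.zeros(n)
          for i in range(n):
              # N_i(t) = N0 * (prod_{j<i} l_j) * sum_{j<=i} exp(-l_j t) / prod_{k!=j} (l_k - l_j)
              coeff = N0 * np.prod(lambdas[:i])
              s = 0.0
              for j in range(i + 1):
                  denom = np.prod([lambdas[k] - lambdas[j] for k in range(i + 1) if k != j])
                  s += np.exp(-lambdas[j] * t) / denom
              N[i] = coeff * s
          return N

      lams = np.array([1e-2, 5e-3, 2e-3, 1e-3])   # decay constants, 1/day (illustrative)
      print(bateman(lams, N0=1.0, t=200.0))        # parent decays while daughters grow in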

  16. Adaptive time-stepping Monte Carlo integration of Coulomb collisions

    NASA Astrophysics Data System (ADS)

    Särkimäki, K.; Hirvijoki, E.; Terävä, J.

    2018-01-01

    We report an accessible and robust tool for evaluating the effects of Coulomb collisions on a test particle in a plasma that obeys Maxwell-Jüttner statistics. The implementation is based on the Beliaev-Budker collision integral, which allows both the test particle and the background plasma to be relativistic. The integration method supports adaptive time stepping, which is shown to greatly improve the computational efficiency. The Monte Carlo method is implemented for both the three-dimensional particle momentum space and the five-dimensional guiding center phase space. A detailed description is provided of both the physics and the implementation of the operator. The focus is on adaptive integration of stochastic differential equations, which is an overlooked aspect among existing Monte Carlo implementations of Coulomb collision operators. We verify that our operator converges to known analytical results and demonstrate that careless implementation of the adaptive time step can lead to severely erroneous results. The operator is provided as a self-contained Fortran 95 module and can be included into existing orbit-following tools that trace either the full Larmor motion or the guiding center dynamics. The adaptive time-stepping algorithm is expected to be useful in situations where the collision frequencies vary greatly over the course of a simulation. Examples include the slowing-down of fusion products or other fast ions, the Dreicer generation of runaway electrons, and the generation of fast ions or electrons with ion or electron cyclotron resonance heating.
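
    The role of an adaptive step can be illustrated with a far simpler test-particle model than the Beliaev-Budker operator: Euler-Maruyama integration of a drag-diffusion equation in which the time step tracks the local collision time. The coefficient model below is a placeholder; as the abstract warns, a genuinely correct adaptive scheme must also handle the statistics of refined Wiener increments carefully, which this toy omits.

      import math, random

      def slow_down(v0, nu0, v_th, t_end, frac=0.05):
          """dv = -nu(v) v dt + sqrt(2 D(v) dt) xi, with dt = frac / nu(v)."""
          v, t = v0, 0.0
          while t < t_end and v > 0:
              nu = nu0 * (v_th / max(v, v_th)) ** 3     # collision frequency falls off as ~v^-3
              D = nu * v_th ** 2                        # placeholder diffusion coefficient
              dt = min(frac / nu, t_end - t)            # adaptive: resolve the local collision time
              v += -nu * v * dt + math.sqrt(2 * D * dt) * random.gauss(0.0, 1.0)
              v = abs(v)                                # reflect at v = 0 in this toy model
              t += dt
          return v

      random.seed(1)
      print(slow_down(v0=10.0, nu0=1.0, v_th=1.0, t_end=50.0))   # fast ion relaxes toward v_th

    A fixed step small enough for the fast ion at birth would be wastefully small once it thermalizes; tying dt to 1/nu(v) is the efficiency gain the abstract refers to.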

  17. FlashPhotol: Using a Flash Photolysis Apparatus Simulator to Introduce Students to the Kinetics of Transient Species and Fast Reactions

    ERIC Educational Resources Information Center

    Bigger, Stephen W.

    2016-01-01

    FlashPhotol is an educational software package that introduces students to the kinetics of transient species and fast reactions. This is achieved by means of a computer-simulated flash photolysis apparatus that comprises all major functional elements and that students can use to perform various experiments. The experimental interface presents a…
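
    A tiny numerical stand-in for the kind of experiment such a simulator presents: a flash creates a transient species, its absorbance trace decays, and a first-order rate constant is recovered from a log-linear fit. All values are illustrative and unrelated to the package's built-in experiments.

      import numpy as np

      k_true = 2.0e3                                    # 1/s, illustrative transient decay
      t = np.linspace(0, 2e-3, 200)                     # 2 ms observation window
      A = 0.15 * np.exp(-k_true * t)                    # absorbance trace after the flash
      A_noisy = A + np.random.default_rng(0).normal(0, 1e-3, t.size)

      # First-order kinetics: ln A = ln A0 - k t, so fit a line to ln(A).
      mask = A_noisy > 5e-3                             # keep points above the noise floor
      slope, intercept = np.polyfit(t[mask], np.log(A_noisy[mask]), 1)
      print(f"recovered k = {-slope:.3e} 1/s")          # close to k_true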

  18. Hybrid stochastic simulation of reaction-diffusion systems with slow and fast dynamics.

    PubMed

    Strehl, Robert; Ilie, Silvana

    2015-12-21

    In this paper, we present a novel hybrid method to simulate discrete stochastic reaction-diffusion models arising in biochemical signaling pathways. We study moderately stiff systems, for which we can partition each reaction or diffusion channel into either a slow or fast subset, based on its propensity. Numerical approaches missing this distinction are often limited with respect to computational run time or approximation quality. We design an approximate scheme that remedies these pitfalls by using a new blending strategy of the well-established inhomogeneous stochastic simulation algorithm and the tau-leaping simulation method. The advantages of our hybrid simulation algorithm are demonstrated on three benchmarking systems, with special focus on approximation accuracy and efficiency.
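
    A stripped-down sketch of the slow/fast blending idea follows (ignoring the diffusion channels, which the paper treats as additional reaction channels between voxels): channels above a propensity threshold advance with Poisson tau-leap counts, while the slow set gets an exact SSA-style firing time assumed constant over the step. This is a simplified illustration, not the authors' scheme; the threshold, stoichiometries and rates are invented.

      import numpy as np

      rng = np.random.default_rng(0)

      def hybrid_step(x, stoich, prop, thresh=50.0, tau_max=0.01):
          a = prop(x)                                          # propensity of each channel
          fast = a > thresh
          a_slow = a[~fast].sum()
          t_slow = rng.exponential(1.0 / a_slow) if a_slow > 0 else np.inf
          tau = min(tau_max, t_slow)
          counts = np.zeros_like(a)
          counts[fast] = rng.poisson(a[fast] * tau)            # tau-leap the fast channels
          if t_slow <= tau_max:                                # one exact slow firing, if due
              slow_idx = np.flatnonzero(~fast)
              j = rng.choice(slow_idx, p=a[~fast] / a_slow)
              counts[j] += 1
          return np.maximum(x + stoich.T @ counts, 0.0), tau   # clip to avoid negative counts

      # Example: fast birth/death of species 0 (channels 0, 1) plus a slow burst (channel 2)
      stoich = np.array([[+1, 0], [-1, 0], [0, +25]])          # rows: channels, cols: species
      prop = lambda x: np.array([200.0, 2.0 * x[0], 0.05 * x[0]])
      x, t = np.array([100.0, 0.0]), 0.0
      while t < 1.0:
          x, dt = hybrid_step(x, stoich, prop)
          t += dt
      print(x)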

  19. On FAST3D simulations of directly-driven inertial-fusion targets with high-Z layers for reducing laser imprint and surface non-uniformity growth

    NASA Astrophysics Data System (ADS)

    Bates, Jason; Schmitt, Andrew; Klapisch, Marcel; Karasik, Max; Obenschain, Steve

    2013-10-01

    Modifications to the FAST3D code have been made to enhance its ability to simulate the dynamics of plastic ICF targets with high-Z overcoats. This class of problems is challenging computationally due in part to plasma conditions that are not in a state of local thermodynamic equilibrium and to the presence of mixed computational cells containing more than one material. Recently, new opacity tables for gold, palladium and plastic have been generated with an improved version of the STA code. These improved tables provide smoother, higher-fidelity opacity data over a wider range of temperature and density states than before, and contribute to a more accurate treatment of radiative transfer processes in FAST3D simulations. Furthermore, a new, more efficient subroutine known as ``MMEOS'' has been installed in the FAST3D code for determining pressure and temperature equilibrium conditions within cells containing multiple materials. We will discuss these topics, and present new simulation results for high-Z planar-target experiments performed recently on the NIKE Laser Facility. Work supported by DOE/NNSA.

  20. Initial Evaluation of a Conflict Detection Tool in the Terminal Area

    NASA Technical Reports Server (NTRS)

    Verma, Savita Arora; Tang, Huabin; Ballinger, Deborah S.; Kozon, Thomas E.; Farrahi, Amir Hossein

    2012-01-01

    Despite the recent economic recession and its adverse impact on air travel, the Federal Aviation Administration (FAA) continues to forecast an increase in air traffic demand that may see traffic double or triple by the year 2025. Increases in air traffic will burden the air traffic management system, and higher levels of safety and efficiency will be required. The air traffic controller's primary task is to ensure separation between aircraft in their airspace and keep the skies safe. As air traffic is forecast to increase in volume and complexity [1], there is an increased likelihood of conflicts between aircraft, which adds risk and inefficiency to air traffic management and increases controller workload. To attenuate these factors, recent ATM research has shown that air- and ground-based automation tools could reduce controller workload, especially if the automation is focused on conflict detection and resolution. Conflict Alert is a short-time-horizon conflict detection tool deployed in the Terminal Radar Approach Control (TRACON), which has limited utility due to the high number of false alerts generated and its use of dead reckoning to predict loss of separation between aircraft. Terminal Tactical Separation Assurance Flight Environment (T-TSAFE) is a short-time-horizon conflict detection tool that uses both flight intent and dead reckoning to detect conflicts. Results of a fast-time simulation experiment indicated that T-TSAFE provided a more effective alert lead time and generated fewer false alerts than Conflict Alert [2]. T-TSAFE was previously tested in a Human-In-The-Loop (HITL) simulation study that focused on the en route phase of flight [3]. The current study tested the T-TSAFE tool in an HITL simulation, focusing on the terminal environment with current-day operations. The study identified procedures, roles, responsibilities, information requirements and usability, with the help of TRACON controllers who participated in the experiment. Metrics such as lead alert time, alert response time, workload, situation awareness and other measures were statistically analyzed. These metrics were examined from an overall perspective, and comparisons between conditions (altitude resolutions via keyboard entry vs. ADS-B entry) and controller positions (two final approach sectors and two feeder sectors) were also examined. Results of these analyses and controller feedback provided evidence of T-TSAFE's potential promise as a useful air traffic controller tool. Heuristic analysis also provided information on ways in which the T-TSAFE tool can be improved. Details of the analysis results will be presented in the full paper.
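
    The dead-reckoning half of such a conflict probe reduces to a closest-point-of-approach test on straight-line projections, sketched below in 2-D; the 3 NM separation value is typical of terminal airspace, but the geometry and look-ahead window are illustrative.

      import numpy as np

      def dead_reckoning_conflict(p1, v1, p2, v2, sep_nm=3.0, lookahead_s=120.0):
          """Flag a conflict if separation at closest approach drops below sep_nm."""
          dp, dv = np.subtract(p2, p1), np.subtract(v2, v1)   # relative position/velocity
          dv2 = dv @ dv
          t_cpa = 0.0 if dv2 == 0 else np.clip(-(dp @ dv) / dv2, 0.0, lookahead_s)
          miss = np.linalg.norm(dp + dv * t_cpa)              # distance at closest approach
          return miss < sep_nm, t_cpa, miss

      # Two converging tracks (positions in NM, velocities in NM/s)
      alert, t_cpa, miss = dead_reckoning_conflict(
          p1=(0.0, 0.0), v1=(0.12, 0.0),      # ~430 kt eastbound
          p2=(10.0, 8.0), v2=(0.0, -0.10))    # southbound, crossing ahead
      print(alert, round(t_cpa, 1), round(miss, 2))           # True, ~82 s, ~0.26 NM

    An intent-based probe like T-TSAFE additionally replaces the straight-line projection with the aircraft's cleared trajectory, which is what suppresses many of the false alerts that plague pure dead reckoning.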

  1. 10 CFR 434.606 - Simulation tool.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...

  2. 10 CFR 434.606 - Simulation tool.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...

  3. 10 CFR 434.606 - Simulation tool.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...

  4. 10 CFR 434.606 - Simulation tool.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...

  5. Designing a SCADA system simulator for fast breeder reactor

    NASA Astrophysics Data System (ADS)

    Nugraha, E.; Abdullah, A. G.; Hakim, D. L.

    2016-04-01

    A SCADA (Supervisory Control and Data Acquisition) system simulator is Human Machine Interface-based software that visualizes the processes of a plant. This study describes the design of a SCADA system simulator that helps the operator monitor and control the plant, handle alarms, and access historical data and historical trends in a Fast Breeder Reactor (FBR) type Nuclear Power Plant (NPP). The simulator models the Kalpakkam FBR in India. It was developed using Wonderware InTouch 10 software and is equipped with a main menu, plant overview, area graphics, control displays, set-point displays, an alarm system, real-time trending, historical trending and a security system. The simulator properly reproduces the principles of energy flow and the energy conversion process of an FBR-type NPP, and can be used as a training medium for prospective FBR NPP operators.

  6. HAKOU v3: SWIMS Hurricane Inundation Fast Forecasting Tool for Hawaii

    DTIC Science & Technology

    2012-02-01

    HAKOU v3 is the SWIMS fast forecasting tool for hurricane inundation in Hawaii. Coupled SWAN+ADCIRC simulations were driven with wind and pressure fields generated by the TC96 planetary boundary layer model (Thompson and Cardone 1996). References: Thompson, E. F., and V. J. Cardone. 1996. Practical modeling of hurricane surface wind fields. J. Waterw. Port C-ASCE 122(4): 195-205; Zijlema, M. 2010.

  7. 1:50 Scale Testing of Three Floating Wind Turbines at MARIN and Numerical Model Validation Against Test Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dagher, Habib; Viselli, Anthony; Goupee, Andrew

    The primary goal of the basin model test program discussed herein is to properly scale and accurately capture physical data of the rigid body motions, accelerations and loads for different floating wind turbine platform technologies. The intended use for this data is for performing comparisons with predictions from various aero-hydro-servo-elastic floating wind turbine simulators for calibration and validation. Of particular interest is validating the floating offshore wind turbine simulation capabilities of NREL's FAST open-source simulation tool. Once the validation process is complete, coupled simulators such as FAST can be used with a much greater degree of confidence in design processes for commercial development of floating offshore wind turbines. The test program subsequently described in this report was performed at MARIN (Maritime Research Institute Netherlands) in Wageningen, the Netherlands. The models considered consisted of the horizontal axis, NREL 5 MW Reference Wind Turbine (Jonkman et al., 2009) with a flexible tower affixed atop three distinct platforms: a tension leg platform (TLP), a spar-buoy modeled after the OC3 Hywind (Jonkman, 2010) and a semi-submersible. The three generic platform designs were intended to cover the spectrum of currently investigated concepts, each based on proven floating offshore structure technology. The models were tested under Froude scale wind and wave loads. The high-quality wind environments, unique to these tests, were realized in the offshore basin via a novel wind machine which exhibits negligible swirl and low turbulence intensity in the flow field. Recorded data from the floating wind turbine models included rotor torque and position, tower top and base forces and moments, mooring line tensions, six-axis platform motions and accelerations at key locations on the nacelle, tower, and platform. A large number of tests were performed, ranging from simple free-decay tests to complex operating conditions with irregular sea states and dynamic winds.

  8. hybridMANTIS: a CPU-GPU Monte Carlo method for modeling indirect x-ray detectors with columnar scintillators

    NASA Astrophysics Data System (ADS)

    Sharma, Diksha; Badal, Andreu; Badano, Aldo

    2012-04-01

    The computational modeling of medical imaging systems often requires obtaining a large number of simulated images with low statistical uncertainty, which translates into prohibitive computing times. We describe a novel hybrid approach for Monte Carlo simulations that maximizes utilization of CPUs and GPUs in modern workstations. We apply the method to the modeling of indirect x-ray detectors using a new and improved version of the code MANTIS, an open source software tool used for the Monte Carlo simulations of indirect x-ray imagers. We first describe a GPU implementation of the physics and geometry models in fastDETECT2 (the optical transport model) and a serial CPU version of the same code. We discuss its new features, such as on-the-fly column geometry and columnar crosstalk, in relation to the MANTIS code, and point out areas where our model provides more flexibility for the modeling of realistic columnar structures in large-area detectors. Second, we modify PENELOPE (the open source software package that handles the x-ray and electron transport in MANTIS) to allow direct output of the location and energy deposited during x-ray and electron interactions occurring within the scintillator. This information is then handled by the optical transport routines in fastDETECT2. A load balancer dynamically allocates optical transport showers to the GPU and CPU computing cores. Our hybridMANTIS approach achieves a significant speed-up factor of 627 when compared to MANTIS and of 35 when compared to the same code running only in a CPU instead of a GPU. Using hybridMANTIS, we successfully hide hours of optical transport time by running it in parallel with the x-ray and electron transport, thus shifting the computational bottleneck from optical to x-ray transport. The new code requires much less memory than MANTIS and, as a result, allows us to efficiently simulate large-area detectors.
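
    The load-balancing idea is simple to demonstrate: if workers pull work units from a shared queue, each automatically ends up with a share proportional to its throughput. The toy below mimics this with two threads standing in for the GPU and CPU sides; it is our own illustration, not hybridMANTIS code.

        # A fast ("gpu") and a slow ("cpu") worker drain a shared queue of
        # optical-transport showers; no static partitioning is needed.
        import queue, threading, time

        tasks = queue.Queue()
        for shower in range(200):
            tasks.put(shower)

        done = {"gpu": 0, "cpu": 0}

        def worker(name, seconds_per_task):
            while True:
                try:
                    tasks.get_nowait()
                except queue.Empty:
                    return
                time.sleep(seconds_per_task)  # stand-in for transport work
                done[name] += 1

        threads = [threading.Thread(target=worker, args=("gpu", 0.001)),
                   threading.Thread(target=worker, args=("cpu", 0.010))]
        for t in threads: t.start()
        for t in threads: t.join()
        print(done)  # the faster worker ends up with roughly 10x the showers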

  9. Spectrally resolved far-fields of terahertz quantum cascade lasers.

    PubMed

    Brandstetter, Martin; Schönhuber, Sebastian; Krall, Michael; Kainz, Martin A; Detz, Hermann; Zederbauer, Tobias; Andrews, Aaron M; Strasser, Gottfried; Unterrainer, Karl

    2016-10-31

    We demonstrate a convenient and fast method to measure the spectrally resolved far-fields of multimode terahertz quantum cascade lasers by combining a microbolometer focal plane array with an FTIR spectrometer. Far-fields of the fundamental TM0 and higher lateral order TM1 modes of multimode Fabry-Pérot type lasers have been distinguished, and they agree very well with the results obtained by a 3D finite-element simulation. Furthermore, multimode random laser cavities have been investigated, analyzing the contribution of each individual laser mode to the total far-field. The presented method is thus an important tool for gaining in-depth knowledge of the emission properties of multimode laser cavities at terahertz frequencies, which are becoming increasingly important for future sensing applications.

  10. A Virtual World of Visualization

    NASA Technical Reports Server (NTRS)

    1998-01-01

    In 1990, Sterling Software, Inc., developed the Flow Analysis Software Toolkit (FAST) for NASA Ames under contract. FAST is a workstation-based modular analysis and visualization tool. It is used to visualize and animate grids and grid-oriented data, typically generated by finite difference, finite element and other numerical methods. FAST is now available through COSMIC, NASA's software storehouse.

  11. 77 FR 18718 - Petroleum Reduction and Alternative Fuel Consumption Requirements for Federal Fleets

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... Statistical Tool Web-based reporting system (FAST) for FY 2005. Moreover, section 438.102(b) would require... reflected in FY 2005 FAST data, or (2) the lesser of (a) five percent of total Federal fleet vehicle fuel... event that the Federal fleet's alternative fuel use value for FY 2005 submitted through FAST did not...

  12. Impaired Comprehension of Speed Verbs in Parkinson's Disease.

    PubMed

    Speed, Laura J; van Dam, Wessel O; Hirath, Priyantha; Vigliocco, Gabriella; Desai, Rutvik H

    2017-05-01

    A wealth of studies provides evidence for action simulation during language comprehension. Recent research suggests such action simulations might be sensitive to fine-grained information, such as speed. Here, we present a crucial test for the action simulation of speed in language by assessing speed comprehension in patients with Parkinson's disease (PD). Based on the patients' motor deficits, we hypothesized that the speed of motion described in language would modulate their performance in semantic tasks. Specifically, they would have more difficulty processing language about relatively fast speed than language about slow speed. We conducted a semantic similarity judgment task on fast and slow action verbs in patients with PD and age-matched healthy controls. Participants had to decide which of two verbs most closely matched a target word. Compared to controls, PD patients were slower at making judgments about fast action verbs, but not about slow action verbs, suggesting an impairment in processing language about fast action. Moreover, this impairment was specific to verbs describing fast actions performed with the hand. Problems moving quickly lead to difficulties comprehending language about moving quickly. This study provides evidence that speed is an important part of action representations. (JINS, 2017, 23, 412-420).

  13. Flight investigation of a four-dimensional terminal area guidance system for STOL aircraft

    NASA Technical Reports Server (NTRS)

    Neuman, F.; Hardy, G. H.

    1981-01-01

    A series of flight tests and fast-time simulations was conducted using the augmentor wing jet STOL research aircraft and the STOLAND 4D-RNAV system to add to the growing database of 4D-RNAV system performance capabilities. To obtain statistically meaningful results, the limited amount of flight data was supplemented by a statistically significant amount of data obtained from fast-time simulation. The results of these tests are reported, including comparisons of the 4D-RNAV estimated winds with the actual winds encountered in flight, as well as data on along-track navigation and guidance errors and time-of-arrival errors at the final approach waypoint. In addition, a slight improvement to the STOLAND 4D-RNAV system is proposed and demonstrated using fast-time simulation.

  14. FastSim: A Fast Simulation for the SuperB Detector

    NASA Astrophysics Data System (ADS)

    Andreassen, R.; Arnaud, N.; Brown, D. N.; Burmistrov, L.; Carlson, J.; Cheng, C.-h.; Di Simone, A.; Gaponenko, I.; Manoni, E.; Perez, A.; Rama, M.; Roberts, D.; Rotondo, M.; Simi, G.; Sokoloff, M.; Suzuki, A.; Walsh, J.

    2011-12-01

    We have developed a parameterized (fast) simulation for detector optimization and physics reach studies of the proposed SuperB Flavor Factory in Italy. Detector components are modeled as thin sections of planes, cylinders, disks or cones. Particle-material interactions are modeled using simplified cross sections and formulas. Active detectors are modeled using parameterized response functions. Geometry and response parameters are configured using XML files with a custom-designed schema. Reconstruction algorithms adapted from BaBar are used to build tracks and clusters. Multiple sources of background signals can be merged with primary signals. Pattern-recognition errors are modeled statistically by randomly misassigning nearby tracking hits. Standard BaBar analysis tuples are used as the event output. Hadronic B meson pair events can be simulated at roughly 10 Hz.
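
    Two of the fast-simulation ingredients mentioned above are easy to illustrate: smearing true quantities with a parameterized resolution instead of simulating material interactions in detail, and emulating pattern-recognition errors by randomly misassigning hits between nearby tracks. The sketch below is our own illustration with made-up names and parameter values, not FastSim code.

        import random

        def smear_momentum(p_true, sigma_rel=0.005):
            # Parameterized detector response: Gaussian fractional resolution.
            return random.gauss(p_true, sigma_rel * p_true)

        def misassign_hits(track_a, track_b, swap_prob=0.02):
            # Statistical model of pattern-recognition errors: swap
            # corresponding (nearby) hits between two tracks at random.
            a, b = list(track_a), list(track_b)
            for i in range(min(len(a), len(b))):
                if random.random() < swap_prob:
                    a[i], b[i] = b[i], a[i]
            return a, b

        print(smear_momentum(5.0))
        print(misassign_hits([1, 2, 3, 4], [11, 12, 13, 14], swap_prob=0.5))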

  15. Isotope and fast ions turbulence suppression effects: Consequences for high-β ITER plasmas

    NASA Astrophysics Data System (ADS)

    Garcia, J.; Görler, T.; Jenko, F.

    2018-05-01

    The impact of isotope effects and fast ions on microturbulence is analyzed by means of non-linear gyrokinetic simulations for a high-beta ITER hybrid scenario obtained from previous integrated modelling simulations with simplified assumptions. The simulations show that ITER might work very close to threshold, and in these conditions significant turbulence suppression is found from DD to DT plasmas. Electromagnetic effects are shown to play an important role in the onset of this isotope effect. Additionally, even external ExB flow shear, which is expected to be low in ITER, has a stronger impact on DT than on DD. The fast ions generated by fusion reactions can reduce turbulence even further, although the impact in ITER seems weaker than in present-day tokamaks.

  16. New Software for the Fast Estimation of Population Recombination Rates (FastEPRR) in the Genomic Era.

    PubMed

    Gao, Feng; Ming, Chen; Hu, Wangjie; Li, Haipeng

    2016-06-01

    Genetic recombination is a very important evolutionary mechanism that mixes parental haplotypes and produces new raw material for organismal evolution. As a result, information on recombination rates is critical for biological research. In this paper, we introduce a new, extremely fast open-source software package (FastEPRR) that uses machine learning to estimate the recombination rate [Formula: see text] (=[Formula: see text]) from intraspecific DNA polymorphism data. When [Formula: see text] and the number of sampled diploid individuals is large enough ([Formula: see text]), the variance of [Formula: see text] remains slightly smaller than that of [Formula: see text]. The new estimate [Formula: see text] (calculated by averaging [Formula: see text] and [Formula: see text]) has the smallest variance of all cases. When estimating [Formula: see text], the finite-site model was employed to analyze cases with a high rate of recurrent mutations, and an additional method is proposed to consider the effect of variable recombination rates within windows. Simulations encompassing a wide range of parameters demonstrate that different evolutionary factors, such as demography and selection, may not increase the false positive rate of recombination hotspots. Overall, the accuracy of FastEPRR is similar to that of the well-known method LDhat, but it requires far less computation time. Genetic maps for each human population (YRI, CEU, and CHB) extracted from the 1000 Genomes OMNI data set were obtained in less than 3 d using just a single CPU core. The Pearson pairwise correlation coefficient between the [Formula: see text] and [Formula: see text] maps is very high, ranging between 0.929 and 0.987 at a 5-Mb scale. Considering that sample sizes for these kinds of data are increasing dramatically with advances in next-generation sequencing technologies, FastEPRR (freely available at http://www.picb.ac.cn/evolgen/) is expected to become a widely used tool for establishing genetic maps and studying recombination hotspots in the population genomic era. Copyright © 2016 Gao et al.

  17. Identifying Structure-Property Relationships Through DREAM.3D Representative Volume Elements and DAMASK Crystal Plasticity Simulations: An Integrated Computational Materials Engineering Approach

    NASA Astrophysics Data System (ADS)

    Diehl, Martin; Groeber, Michael; Haase, Christian; Molodov, Dmitri A.; Roters, Franz; Raabe, Dierk

    2017-05-01

    Predicting, understanding, and controlling mechanical behavior is the most important task when designing structural materials. Modern alloy systems—in which multiple deformation mechanisms, phases, and defects are introduced to overcome the inverse strength-ductility relationship—give rise to multiple possibilities for modifying the deformation behavior, rendering traditional, exclusively experiment-based alloy development workflows inappropriate. For fast and efficient alloy design, it is therefore desirable to predict the mechanical performance of candidate alloys in simulation studies, replacing time- and resource-consuming mechanical tests. Simulation tools suitable for this task need to correctly predict the mechanical behavior as a function of alloy composition, microstructure, texture, phase fractions, and processing history. Here, an integrated computational materials engineering approach based on the open source software packages DREAM.3D and DAMASK (Düsseldorf Advanced Materials Simulation Kit) that enables such virtual material development is presented. More specifically, our approach consists of the following three steps: (1) acquire statistical quantities that describe a microstructure, (2) build a representative volume element based on these quantities employing DREAM.3D, and (3) evaluate the representative volume element using a predictive crystal plasticity material model provided by DAMASK. As an example, these steps are conducted here for a high-manganese steel.

  18. Velocity space resolved absolute measurement of fast ion losses induced by a tearing mode in the ASDEX Upgrade tokamak

    NASA Astrophysics Data System (ADS)

    Galdon-Quiroga, J.; Garcia-Munoz, M.; Sanchis-Sanchez, L.; Mantsinen, M.; Fietz, S.; Igochine, V.; Maraschek, M.; Rodriguez-Ramos, M.; Sieglin, B.; Snicker, A.; Tardini, G.; Vezinet, D.; Weiland, M.; Eriksson, L. G.; The ASDEX Upgrade Team; The EUROfusion MST1 Team

    2018-03-01

    The absolute flux of fast ion losses induced by tearing modes has been measured by means of fast ion loss detectors (FILD) for the first time in RF-heated plasmas in the ASDEX Upgrade tokamak. Up to 30 MW m-2 of fast ion losses are measured by FILD at 5 cm from the separatrix, consistent with infra-red camera measurements, with energies in the range of 250-500 keV and pitch angles corresponding to large trapped orbits. A resonant interaction between the fast ions in the high-energy tail of the ICRF distribution and an m/n = 5/4 tearing mode leads to enhanced fast ion losses. Around 9.3 ± 0.7% of the fast ion losses are found to be coherent with the mode and scale linearly with its amplitude, indicating the convective nature of the transport mechanism. Simulations have been carried out to estimate the contribution of the prompt losses, and good agreement is found between the simulated and measured velocity space of the losses. The velocity-space resonances that may be responsible for the enhanced fast ion losses are identified.

  19. Operational Characteristics Identification and Simulation Model Verification for Incheon International Airport

    NASA Technical Reports Server (NTRS)

    Eun, Yeonju; Jeon, Daekeun; Lee, Hanbong; Zhu, Zhifan; Jung, Yoon C.; Jeong, Myeongsook; Kim, Hyounkyong; Oh, Eunmi; Hong, Sungkwon; Lee, Junwon

    2016-01-01

    Incheon International Airport (ICN) is one of the hub airports in East Asia. Airport operations at ICN have been growing by more than 5% per year over the past five years. According to the current airport expansion plan, a new passenger terminal will be added and the current cargo ramp will be expanded in 2018. This expansion project will bring 77 new stands without adding a new runway to the airport. Due to such continuous growth in airport operations and future expansion of the ramps, it is highly likely that airport surface traffic will experience more congestion and, therefore, suffer from efficiency degradation. There is a growing awareness in the aviation research community of the need for strategic and tactical surface scheduling capabilities for efficient airport surface operations. Specific to ICN operations, the need for A-CDM (Airport Collaborative Decision Making) or S-CDM (Surface Collaborative Decision Making) and for controller decision support tools for efficient air traffic management arose several years ago. In the United States, there have been independent research efforts by academia, industry, and government research organizations to enhance the efficiency and predictability of surface operations at busy airports. Among these research activities, the Spot and Runway Departure Advisor (SARDA), developed and tested by the National Aeronautics and Space Administration (NASA), is a decision support tool that provides tactical advisories to controllers for efficient surface operations. The effectiveness of the SARDA concept was successfully verified through human-in-the-loop (HITL) simulations of both spot release and runway operations advisories for ATC Tower controllers of Dallas/Fort Worth International Airport (DFW) in 2010 and 2012, and of gate pushback advisories for the ramp controller of Charlotte/Douglas International Airport (CLT) in 2014. The SARDA concept for tactical surface scheduling has been further enhanced and is being integrated into NASA's Airspace Technology Demonstration-2 (ATD-2) project for technology demonstration of Integrated Arrival/Departure/Surface (IADS) operations at CLT. This study is part of an international research collaboration between KAIA (Korea Agency for Infrastructure Technology Advancement), KARI (Korea Aerospace Research Institute) and NASA, which is being conducted to validate the effectiveness of the SARDA concept as a controller decision support tool for departure and surface management at ICN. This paper presents the preliminary results of the collaboration effort, including an investigation of the operational environment of ICN, data analysis for identification of the operational characteristics of the airport, and the construction and verification of an airport simulation model using the Surface Operations Simulator and Scheduler (SOSS), NASA's fast-time simulation tool.

  1. Operational test results of the passive final approach spacing tool

    DOT National Transportation Integrated Search

    1997-06-01

    A prototype decision support tool for terminal area air traffic controllers, : referred to as the Final Approach Spacing Tool (FAST), was recently evaluated in : operation with live air traffic at the Dallas/Fort Worth, Texas Airport, : Controllers u...

  2. Study on beam geometry and image reconstruction algorithm in fast neutron computerized tomography at NECTAR facility

    NASA Astrophysics Data System (ADS)

    Guo, J.; Bücherl, T.; Zou, Y.; Guo, Z.

    2011-09-01

    Investigations of the fast neutron beam geometry for the NECTAR facility are presented. The results of MCNP simulations and experimental measurements of the beam distributions at NECTAR are compared. Boltzmann functions are used to describe the beam profile in the detection plane, assuming the area source to be composed of a large number of single neutron point sources. An iterative algebraic reconstruction algorithm is developed, implemented and verified against both simulated and measured projection data. The feasibility of improved reconstruction in fast neutron computerized tomography at the NECTAR facility is demonstrated.
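
    As an illustration of the family of methods the abstract refers to, the following is a minimal additive algebraic reconstruction (Kaczmarz-type) loop; the actual NECTAR implementation (weighting, ray ordering, constraints) will differ.

        import numpy as np

        def art(A, p, n_iter=20, relax=0.5):
            """Iteratively solve A x ~= p row by row.
            A: (n_rays, n_pixels) matrix of path lengths through pixels,
            p: measured projections, relax: relaxation factor."""
            x = np.zeros(A.shape[1])
            row_norms = (A * A).sum(axis=1)
            for _ in range(n_iter):
                for i in np.nonzero(row_norms)[0]:
                    resid = p[i] - A[i] @ x
                    x += relax * resid / row_norms[i] * A[i]
                np.clip(x, 0.0, None, out=x)  # attenuation is non-negative
            return x

        # Tiny consistent system with solution [1, 2]:
        A = np.array([[1.0, 1.0], [1.0, 0.0]])
        print(art(A, np.array([3.0, 1.0])))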

  3. A Method for Extracting the Free Energy Surface and Conformational Dynamics of Fast-Folding Proteins from Single Molecule Photon Trajectories

    PubMed Central

    2015-01-01

    Single molecule fluorescence spectroscopy holds the promise of providing direct measurements of protein folding free energy landscapes and conformational motions. However, fulfilling this promise has been prevented by technical limitations, most notably, the difficulty in analyzing the small packets of photons per millisecond that are typically recorded from individual biomolecules. This limitation impairs the ability to accurately determine conformational distributions and resolve sub-millisecond processes. Here we develop an analytical procedure for extracting the conformational distribution and dynamics of fast-folding proteins directly from time-stamped photon arrival trajectories produced by single molecule FRET experiments. Our procedure combines the maximum likelihood analysis originally developed by Gopich and Szabo with a statistical mechanical model that describes protein folding as diffusion on a one-dimensional free energy surface. Using stochastic kinetic simulations, we thoroughly tested the performance of the method in identifying diverse fast-folding scenarios, ranging from two-state to one-state downhill folding, as a function of relevant experimental variables such as photon count rate, amount of input data, and background noise. The tests demonstrate that the analysis can accurately retrieve the original one-dimensional free energy surface and microsecond folding dynamics in spite of the sub-megahertz photon count rates and significant background noise levels of current single molecule fluorescence experiments. Therefore, our approach provides a powerful tool for the quantitative analysis of single molecule FRET experiments of fast protein folding that is also potentially extensible to the analysis of any other biomolecular process governed by sub-millisecond conformational dynamics. PMID:25988351

  4. A Method for Extracting the Free Energy Surface and Conformational Dynamics of Fast-Folding Proteins from Single Molecule Photon Trajectories.

    PubMed

    Ramanathan, Ravishankar; Muñoz, Victor

    2015-06-25

    Single molecule fluorescence spectroscopy holds the promise of providing direct measurements of protein folding free energy landscapes and conformational motions. However, fulfilling this promise has been prevented by technical limitations, most notably, the difficulty in analyzing the small packets of photons per millisecond that are typically recorded from individual biomolecules. This limitation impairs the ability to accurately determine conformational distributions and resolve sub-millisecond processes. Here we develop an analytical procedure for extracting the conformational distribution and dynamics of fast-folding proteins directly from time-stamped photon arrival trajectories produced by single molecule FRET experiments. Our procedure combines the maximum likelihood analysis originally developed by Gopich and Szabo with a statistical mechanical model that describes protein folding as diffusion on a one-dimensional free energy surface. Using stochastic kinetic simulations, we thoroughly tested the performance of the method in identifying diverse fast-folding scenarios, ranging from two-state to one-state downhill folding, as a function of relevant experimental variables such as photon count rate, amount of input data, and background noise. The tests demonstrate that the analysis can accurately retrieve the original one-dimensional free energy surface and microsecond folding dynamics in spite of the sub-megahertz photon count rates and significant background noise levels of current single molecule fluorescence experiments. Therefore, our approach provides a powerful tool for the quantitative analysis of single molecule FRET experiments of fast protein folding that is also potentially extensible to the analysis of any other biomolecular process governed by sub-millisecond conformational dynamics.
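
    A toy two-state version of the photon-by-photon likelihood underlying the Gopich-Szabo analysis can convey the computation: conformational states interconvert according to a rate matrix while each photon's color is drawn with a state-dependent acceptor probability. The sketch below assumes equal photon count rates in both states, a simplification of the published method; function and variable names are ours.

        import numpy as np
        from scipy.linalg import expm

        def log_likelihood(colors, dts, k12, k21, eps):
            """colors: 0=donor, 1=acceptor per photon; dts: interphoton times;
            eps: acceptor probability in each of the two states."""
            K = np.array([[-k12, k21],
                          [k12, -k21]])             # rate matrix, dp/dt = K p
            p = np.array([k21, k12]) / (k12 + k21)  # equilibrium occupancies
            logL = 0.0
            for c, dt in zip(colors, dts):
                p = expm(K * dt) @ p                      # evolve between photons
                p = p * np.where(c == 1, eps, 1.0 - eps)  # photon-color factor
                norm = p.sum()                            # rescale: avoids underflow
                logL += np.log(norm)
                p /= norm
            return logL

        print(log_likelihood(np.array([0, 1, 1, 0, 1]),
                             np.array([1e-3, 5e-4, 2e-3, 1e-3, 3e-4]),
                             k12=100.0, k21=150.0, eps=np.array([0.2, 0.8])))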

  5. Development of interactive graphic user interfaces for modeling reaction-based biogeochemical processes in batch systems with BIOGEOCHEM

    NASA Astrophysics Data System (ADS)

    Chang, C.; Li, M.; Yeh, G.

    2010-12-01

    The BIOGEOCHEM numerical model (Yeh and Fang, 2002; Fang et al., 2003) was developed in FORTRAN for simulating reaction-based geochemical and biochemical processes with mixed equilibrium and kinetic reactions in batch systems. A complete suite of reactions, including aqueous complexation, adsorption/desorption, ion exchange, redox, precipitation/dissolution, acid-base reactions, and microbially mediated reactions, is embodied in this unique modeling tool. Any reaction can be treated as fast/equilibrium or slow/kinetic. An equilibrium reaction is modeled with an implicit finite rate governed by a mass-action equilibrium equation or by a user-specified algebraic equation. A kinetic reaction is modeled with an explicit finite rate given by an elementary rate law, microbially mediated enzymatic kinetics, or a user-specified rate equation. No existing model has encompassed this wide array of scopes. To ease the input/output learning curve of BIOGEOCHEM's unique features, an interactive graphical user interface was developed with Microsoft Visual Studio and .NET tools. Several robust, user-friendly features, such as pop-up help windows, typo warning messages, and on-screen input hints, were implemented. All input data can be viewed in real time and are automatically formatted to conform to the BIOGEOCHEM input file format. A post-processor for graphic visualization of simulated results was also embedded for immediate demonstrations. By following the data input windows step by step, error-free BIOGEOCHEM input files can be created even by users with little prior experience in FORTRAN. With this user-friendly interface, the time and effort needed to conduct simulations with BIOGEOCHEM can be greatly reduced.
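
    The equilibrium-versus-kinetic distinction described above can be made concrete with a single reaction A + B <-> C (a generic illustration, not BIOGEOCHEM code): the equilibrium treatment solves a mass-action relation algebraically, while the kinetic treatment integrates an explicit rate law, and the two agree at long times when kf/kr equals the equilibrium constant.

        import numpy as np
        from scipy.integrate import odeint
        from scipy.optimize import brentq

        A0, B0, C0, Keq = 1.0, 0.8, 0.0, 50.0

        # Equilibrium treatment: solve [C]/([A][B]) = Keq for the extent x.
        f = lambda x: (C0 + x) - Keq * (A0 - x) * (B0 - x)
        x_eq = brentq(f, 0.0, min(A0, B0))
        print("equilibrium [C]:", C0 + x_eq)

        # Kinetic treatment: d[C]/dt = kf[A][B] - kr[C], with kf/kr = Keq.
        kf, kr = 5.0, 5.0 / Keq
        def rate(y, t):
            a, b, c = y
            r = kf * a * b - kr * c
            return [-r, -r, r]
        t = np.linspace(0.0, 10.0, 101)
        print("kinetic [C] at t=10:", odeint(rate, [A0, B0, C0], t)[-1, 2])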

  6. PlanetPack: A radial-velocity time-series analysis tool facilitating exoplanets detection, characterization, and dynamical simulations

    NASA Astrophysics Data System (ADS)

    Baluev, Roman V.

    2013-08-01

    We present PlanetPack, a new software tool that we developed to facilitate and standardize the advanced analysis of radial velocity (RV) data for the goals of exoplanet detection, characterization, and basic dynamical N-body simulations. PlanetPack is a command-line interpreter that can run either in an interactive mode or in a batch mode of automatic script interpretation. Its major abilities include: (i) advanced RV curve fitting with the proper maximum-likelihood treatment of unknown RV jitter; (ii) user-friendly multi-Keplerian as well as Newtonian N-body RV fits; (iii) use of more efficient maximum-likelihood periodograms that involve full multi-planet fitting (sometimes called "residual" or "recursive" periodograms); (iv) easily computable parametric 2D likelihood-function level contours reflecting the asymptotic confidence regions; (v) user-friendly fitting under useful functional constraints; (vi) basic tasks of short- and long-term planetary dynamical simulation using a fast Everhart-type integrator based on Gauss-Legendre spacings; (vii) fitting the data with red noise (auto-correlated errors); (viii) various analytical and numerical methods for determining statistical significance. Further functionality may be added to PlanetPack in the future. During the development of this software, considerable effort was made to improve computational speed, especially for CPU-demanding tasks. PlanetPack was written in pure C++ (1998/2003 standard) and is expected to be compilable and usable on a wide range of platforms.
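
    Ability (i), the maximum-likelihood treatment of unknown RV jitter, amounts to adding a fitted white-noise term in quadrature to the quoted measurement errors. A minimal sketch of such a negative log-likelihood for a circular one-planet model follows (our own illustration, not PlanetPack's C++ code); it would typically be minimized over all five parameters with a generic optimizer.

        import numpy as np

        def rv_neg_log_likelihood(params, t, rv, sigma):
            """params = (K, P, phi, gamma, s): semi-amplitude, period,
            phase, velocity offset, and jitter s added in quadrature."""
            K, P, phi, gamma, s = params
            model = K * np.sin(2.0 * np.pi * t / P + phi) + gamma
            var = sigma**2 + s**2  # quoted errors inflated by the jitter term
            return 0.5 * np.sum((rv - model) ** 2 / var
                                + np.log(2.0 * np.pi * var))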

  7. Automatic optimisation of gamma dose rate sensor networks: The DETECT Optimisation Tool

    NASA Astrophysics Data System (ADS)

    Helle, K. B.; Müller, T. O.; Astrup, P.; Dyve, J. E.

    2014-05-01

    Fast delivery of comprehensive information on the radiological situation is essential for decision-making in nuclear emergencies. Most national radiological agencies in Europe employ gamma dose rate sensor networks to monitor radioactive pollution of the atmosphere. Sensor locations were often chosen using regular grids or according to administrative constraints. Nowadays, however, the choice can be based on more realistic risk assessment, as it is possible to simulate potential radioactive plumes. To support sensor planning, we developed the DETECT Optimisation Tool (DOT) within the scope of the EU FP 7 project DETECT. It evaluates the gamma dose rates that a proposed set of sensors might measure in an emergency and uses this information to optimise the sensor locations. The gamma dose rates are taken from a comprehensive library of simulations of atmospheric radioactive plumes from 64 source locations. These simulations cover the whole European Union, so the DOT allows evaluation and optimisation of sensor networks for all EU countries, as well as evaluation of fencing sensors around possible sources. Users can choose from seven cost functions to evaluate the capability of a given monitoring network for early detection of radioactive plumes or for the creation of dose maps. The DOT is implemented as a stand-alone easy-to-use JAVA-based application with a graphical user interface and an R backend. Users can run evaluations and optimisations, and display, store and download the results. The DOT runs on a server and can be accessed via common web browsers; it can also be installed locally.

  8. SPOKES: An end-to-end simulation facility for spectroscopic cosmological surveys

    DOE PAGES

    Nord, B.; Amara, A.; Refregier, A.; ...

    2016-03-03

    The nature of dark matter, dark energy and large-scale gravity poses some of the most pressing questions in cosmology today. These fundamental questions require highly precise measurements, and a number of wide-field spectroscopic survey instruments are being designed to meet this requirement. A key component in these experiments is the development of a simulation tool to forecast science performance, define requirement flow-downs, optimize implementation, demonstrate feasibility, and prepare for exploitation. We present SPOKES (SPectrOscopic KEn Simulation), an end-to-end simulation facility for spectroscopic cosmological surveys designed to address this challenge. SPOKES is based on an integrated infrastructure, modular function organization, coherent data handling and fast data access. These key features allow reproducibility of pipeline runs, enable ease of use and provide flexibility to update functions within the pipeline. The cyclic nature of the pipeline offers the possibility to make the science output an efficient measure for design optimization and feasibility testing. We present the architecture, first science, and computational performance results of the simulation pipeline. The framework is general, but for the benchmark tests we use the Dark Energy Spectrometer (DESpec), one of the early concepts for the upcoming Dark Energy Spectroscopic Instrument (DESI) project. Finally, we discuss how the SPOKES framework enables a rigorous process to optimize and exploit spectroscopic survey experiments in order to derive high-precision cosmological measurements optimally.

  9. An Energy-Dispersive X-Ray Fluorescence Spectrometry and Monte Carlo simulation study of Iron-Age Nuragic small bronzes ("Navicelle") from Sardinia, Italy

    NASA Astrophysics Data System (ADS)

    Schiavon, Nick; de Palmas, Anna; Bulla, Claudio; Piga, Giampaolo; Brunetti, Antonio

    2016-09-01

    A spectrometric protocol combining Energy Dispersive X-Ray Fluorescence Spectrometry with Monte Carlo simulations of experimental spectra using the XRMC code package has been applied for the first time to characterize the elemental composition of a series of famous Iron Age small-scale archaeological bronze replicas of ships (known as the "Navicelle") from the Nuragic civilization in Sardinia, Italy. The proposed protocol is a useful, nondestructive and fast analytical tool for Cultural Heritage samples. In the Monte Carlo simulations, each sample was modeled as a multilayered object composed of two or three layers depending on the sample: when all are present, the three layers are the original bronze substrate, the surface corrosion patina and an outermost protective layer (Paraloid) applied during past restorations. The Monte Carlo simulations were able to account for the presence of the patina/corrosion layer as well as the Paraloid protective layer, and also accounted for the roughness effect commonly found at the surface of corroded metal archaeological artifacts. In this respect, the Monte Carlo simulation approach adopted here is, to the best of our knowledge, unique, and made it possible to determine the bronze alloy composition together with the thickness of the surface layers without first removing the surface patinas, a process that potentially threatens the preservation of precious archaeological/artistic artifacts for future generations.

  10. Desorption of sulphur mustard simulants methyl salicylate and 2-chloroethyl ethyl sulphide from contaminated scalp hair after vapour exposure.

    PubMed

    Spiandore, Marie; Souilah-Edib, Mélanie; Piram, Anne; Lacoste, Alexandre; Josse, Denis; Doumenq, Pierre

    2018-01-01

    Chemical warfare agents have been used to incapacitate, injure or kill people in contexts of war or terrorist attacks. It has previously been shown that hair can trap the sulphur mustard simulants methyl salicylate and 2-chloroethyl ethyl sulphide. In order to investigate the simulants' persistence in hair after intense vapour exposure, their desorption kinetics were studied using two complementary methods: measurement of the residual content in hair and monitoring of the desorbed vapours. Results showed that both simulants were detected in air and could be recovered from hair 2 h after the end of exposure. Longer experiments with methyl salicylate showed that it could still be recovered from hair after 24 h. Our data were fitted with several kinetic models, and the best correlation was obtained with a bimodal first-order equation, suggesting a two-step desorption kinetics model: an initial fast regime followed by slower desorption. 2-Chloroethyl ethyl sulphide was also detected in the immediate environment after hair exposure for 2 h, and the hair simulant content decreased by more than 80%. Our results showed that the ability of hair to release formerly trapped chemical toxics could pose a health hazard. The simulants' persistence, however, confirms the potential of hair analysis as a tool for chemical exposure assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
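
    The bimodal first-order model mentioned above is a sum of two exponentials, one per desorption pool. The sketch below fits that functional form to made-up residual-fraction data purely to demonstrate the shape of the analysis; it does not reproduce the paper's measurements or fitted constants.

        import numpy as np
        from scipy.optimize import curve_fit

        def bimodal(t, f, k1, k2):
            # Residual fraction: fast pool f decays at k1, the rest at k2.
            return f * np.exp(-k1 * t) + (1.0 - f) * np.exp(-k2 * t)

        t = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0, 24.0])          # hours
        frac = np.array([1.0, 0.62, 0.45, 0.32, 0.25, 0.20, 0.11])  # synthetic
        (f, k1, k2), _ = curve_fit(bimodal, t, frac, p0=(0.6, 2.0, 0.05),
                                   bounds=([0, 0, 0], [1, 20, 5]))
        print(f"fast fraction {f:.2f}, k_fast {k1:.2f}/h, k_slow {k2:.3f}/h")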

  11. Multi-physics CFD simulations in engineering

    NASA Astrophysics Data System (ADS)

    Yamamoto, Makoto

    2013-08-01

    Nowadays, Computational Fluid Dynamics (CFD) software is adopted as a design and analysis tool in a great number of engineering fields, and single-physics CFD can be considered sufficiently mature from a practical point of view. The main target of existing CFD software is single-phase flows such as water and air. However, many multi-physics problems exist in engineering. Most of them consist of flow coupled to other physics, and the interactions between the different physics are very important. Obviously, multi-physics phenomena are critical in developing machines and processes. A multi-physics phenomenon is typically complex, and it is difficult to predict by simply adding other physics to a flow simulation. Therefore, multi-physics CFD techniques are still under research and development. This is partly because the processing speed of current computers is not fast enough for conducting multi-physics simulations, and partly because physical models other than flow physics have not been suitably established. Therefore, in the near future, we have to develop various physical models and efficient CFD techniques in order for multi-physics simulations in engineering to succeed. In the present paper, I describe the present state of multi-physics CFD simulations, and then show some numerical results, such as ice accretion and the electro-chemical machining process of a three-dimensional compressor blade, which were obtained in my laboratory. Multi-physics CFD simulation will be a key technology in the near future.

  12. Direct Monte Carlo simulation of chemical reaction systems: Simple bimolecular reactions

    NASA Astrophysics Data System (ADS)

    Piersall, Shannon D.; Anderson, James B.

    1991-07-01

    In applications to several simple reaction systems, we have explored a "direct simulation" method for predicting and understanding the behavior of gas-phase chemical reaction systems. This Monte Carlo method, originated by Bird, has been remarkably successful in treating a number of difficult problems in rarefied gas dynamics. Its extension to chemical reactions offers a powerful tool for treating reaction systems with nonthermal distributions, with coupled gas-dynamic and reaction effects, with emission and absorption of radiation, and with many other effects difficult to treat in any other way. The usual differential equations of chemical kinetics are eliminated. For a bimolecular reaction of the type A+B→C+D with a rate sufficiently low to allow continued thermal equilibrium of the reactants, we find that direct simulation reproduces the expected second-order kinetics. Simulations over a range of temperatures yield the activation energies expected for the specified reaction models. For faster reactions under conditions leading to a depletion of energetic reactant species, the expected slowing of reaction rates and departures from equilibrium distributions are observed. The minimum sample sizes required for adequate simulations are as low as 1000 molecules for these cases. The calculations are simple and straightforward for the homogeneous systems considered. Although computation requirements may be excessively high for very slow reactions, they are reasonably low for fast reactions, for which nonequilibrium effects are most important.
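
    The population-level behavior described above can be reproduced with a few lines of stochastic simulation. The sketch below is a Gillespie-type realization of A + B -> C + D that exhibits the expected second-order kinetics; Bird-style direct simulation additionally tracks molecular velocities and collision mechanics, which is beyond this illustration.

        import math, random

        def simulate(nA=1000, nB=800, k=1e-3, t_end=10.0):
            t, history = 0.0, []
            while nA > 0 and nB > 0 and t < t_end:
                a = k * nA * nB                      # total reaction propensity
                t += -math.log(random.random()) / a  # exponential waiting time
                nA, nB = nA - 1, nB - 1              # one reactive A-B encounter
                history.append((t, nA))
            return history

        print(simulate()[-1])  # time and remaining A when B (or time) runs out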

  13. Impact of pharmacy automation on patient waiting time: an application of computer simulation.

    PubMed

    Tan, Woan Shin; Chua, Siang Li; Yong, Keng Woh; Wu, Tuck Seng

    2009-06-01

    This paper illustrates the use of computer simulation in evaluating the impact of a prototype automated dispensing system on waiting time in an outpatient pharmacy, and its potential as a routine tool in pharmacy management. A discrete event simulation model was developed to investigate the impact of a prototype automated dispensing system on operational efficiency and service standards in an outpatient pharmacy. The simulation results suggest that automating the prescription-filling function using a prototype that picks and packs at 20 seconds per item will not allow the pharmacy to achieve the waiting time target of 30 minutes for all patients. Regardless of the state of automation, meeting the waiting time target requires 2 additional pharmacists to overcome the process bottleneck at the point of medication dispensing. However, if automated dispensing is the preferred option, the system needs to be twice as fast as the current configuration to reduce the 95th-percentile patient waiting time to below 30 minutes. The faster processing speed would concomitantly allow the pharmacy to reduce the number of pharmacy technicians from 11 to 8. Simulation was found to be a useful and low-cost method that allows an otherwise expensive and resource-intensive evaluation of new work processes and technology to be completed within a short time.
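
    The kind of discrete event model the authors describe can be prototyped in a few lines with the simpy package. The toy below (our construction, not the paper's model; the arrival and service parameters are invented) routes patients through an automated filling unit and a small pool of pharmacists, then reports a 95th-percentile wait.

        import random, simpy

        WAITS = []

        def patient(env, fill_unit, pharmacists):
            arrive = env.now
            with fill_unit.request() as req:
                yield req
                yield env.timeout(20.0 / 60.0)  # pick-and-pack: 20 s per item
            with pharmacists.request() as req:
                yield req
                yield env.timeout(random.expovariate(1 / 3.0))  # dispense/counsel
            WAITS.append(env.now - arrive)

        def source(env, fill_unit, pharmacists):
            while True:
                yield env.timeout(random.expovariate(1 / 2.0))  # arrivals, min
                env.process(patient(env, fill_unit, pharmacists))

        env = simpy.Environment()
        env.process(source(env, simpy.Resource(env, 1), simpy.Resource(env, 2)))
        env.run(until=480)  # one 8-hour day, in minutes
        WAITS.sort()
        print("95th-percentile wait (min):", WAITS[int(0.95 * len(WAITS))])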

  14. Impact of marine reserve on maximum sustainable yield in a traditional prey-predator system

    NASA Astrophysics Data System (ADS)

    Paul, Prosenjit; Kar, T. K.; Ghorai, Abhijit

    2018-01-01

    Multispecies fisheries management requires managers to consider the impact of fishing activities on several species, as fishing affects both targeted and non-targeted species, directly or indirectly, in several ways. The intended goal of traditional fisheries management is to achieve maximum sustainable yield (MSY) from the targeted species, which on many occasions affects the targeted species as well as the entire ecosystem. Marine reserves are often acclaimed as a marine ecosystem management tool, yet few attempts have been made to generalize the ecological effects of marine reserves on MSY policy. We examine here how MSY and population levels in a prey-predator system are affected by low, medium and high reserve sizes under different possible scenarios. Our simulation work shows that for a low reserve area, the value of MSY for prey exploitation is maximal when both prey and predator species have fast movement rates. For a medium reserve size, our analysis revealed that the maximum value of MSY for prey exploitation is obtained when the prey population has a fast movement rate and the predator population has a slow movement rate. For a high reserve area, the maximum value of MSY for prey exploitation is very low compared to its value in the low and medium reserve cases. On the other hand, for low and medium reserve areas, MSY for predator exploitation is maximal when both species have fast movement rates.

  15. Leveraging transcript quantification for fast computation of alternative splicing profiles.

    PubMed

    Alamancos, Gael P; Pagès, Amadís; Trincado, Juan L; Bellora, Nicolás; Eyras, Eduardo

    2015-09-01

    Alternative splicing plays an essential role in many cellular processes and bears major relevance to the understanding of multiple diseases, including cancer. High-throughput RNA sequencing allows genome-wide analyses of splicing across multiple conditions. However, the increasing number of available data sets represents a major challenge in terms of computation time and storage requirements. We describe SUPPA, a computational tool that calculates relative inclusion values of alternative splicing events by exploiting fast transcript quantification. SUPPA's accuracy is comparable to, and sometimes better than, that of standard methods on simulated as well as real RNA-sequencing data benchmarked against experimentally validated events. We assess the variability arising from the choice of annotation and provide evidence that using complete transcripts rather than more transcripts per gene provides better estimates. Moreover, SUPPA coupled with de novo transcript reconstruction methods does not achieve accuracies as high as those obtained using quantification of known transcripts, but remains comparable to existing methods. Finally, we show that SUPPA is more than 1000 times faster than standard methods. Coupled with fast transcript quantification, SUPPA provides inclusion values at a much higher speed than existing methods without compromising accuracy, thereby facilitating the systematic splicing analysis of large data sets with limited computational resources. The software is implemented in Python 2.7 and is available under the MIT license at https://bitbucket.org/regulatorygenomicsupf/suppa. © 2015 Alamancos et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
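
    The core quantity SUPPA computes, the inclusion level (PSI) of an event from transcript abundances, is simple to state: the abundance of the transcripts supporting the inclusion form divided by the abundance of all transcripts defining the event. A sketch follows (our own illustration, not SUPPA's code).

        def psi(tpm, inclusion_tx, total_tx):
            """tpm: dict transcript_id -> abundance (e.g., TPM).
            inclusion_tx: transcripts supporting the inclusion form.
            total_tx: all transcripts defining the event."""
            inc = sum(tpm.get(t, 0.0) for t in inclusion_tx)
            tot = sum(tpm.get(t, 0.0) for t in total_tx)
            return inc / tot if tot > 0.0 else float("nan")

        tpm = {"tx1": 12.0, "tx2": 4.0, "tx3": 0.5}
        print(psi(tpm, ["tx1"], ["tx1", "tx2", "tx3"]))  # ~0.73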

  16. Effects of chewing rate and reactive hyperemia on blood flow in denture-supporting mucosa during simulated chewing.

    PubMed

    Ogino, Takamichi; Ueda, Takayuki; Ogami, Koichiro; Koike, Takashi; Sakurai, Kaoru

    2017-01-01

    We examined how the chewing rate and the extent of reactive hyperemia affect blood flow in denture-supporting mucosa during chewing. The left palatal mucosa was loaded under conditions of simulated chewing or simulated clenching for 30 s, and the blood flow during loading was recorded. We compared the relative blood flow during loading under conditions that recreated different chewing rates by combining the duration of the chewing cycle (DCC) and occlusal time (OT): a fast chewing group, a typical chewing group, a slow chewing group and a clenching group. The relationship between the relative blood flow during simulated chewing and the extent of reactive hyperemia was also analyzed. Comparing the different chewing rates, the relative blood flow was highest for the fast chewing rate, followed by the typical and slow chewing rates. Accordingly, we suggest that fast chewing increases the blood flow more than typical or slow chewing. There was a significant correlation between the amount of blood flow during simulated chewing and the extent of reactive hyperemia. Within the limitations of this study, we conclude that slow chewing induces less blood flow than typical or fast chewing in denture-supporting mucosa, and that people with less reactive hyperemia have less blood flow in denture-supporting mucosa during chewing. Copyright © 2016 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.

  17. URDME: a modular framework for stochastic simulation of reaction-transport processes in complex geometries.

    PubMed

    Drawert, Brian; Engblom, Stefan; Hellander, Andreas

    2012-06-22

    Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction and diffusion events compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be computationally efficient, employing sophisticated algorithms, yet at the same time flexible in order to meet present and future needs of increasingly complex biological modeling. We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses Unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the Reaction-Diffusion Master Equation formalism to model the processes under study. An interface to mature external geometry and mesh handling software (Comsol Multiphysics) provides a stable and interactive environment for model construction. The core simulation routines are logically separated from the model-building interface and written in a low-level language for computational efficiency. The connection to the geometry handling software is realized via a Matlab interface, which facilitates script computing, data management, and post-processing. For practitioners, the software therefore behaves much like an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines. Since the overall design effectively hides the complexity of managing the geometry and meshes, newly developed methods may be tested in a realistic setting already at an early stage of development. In this paper we demonstrate, in a series of examples with high relevance to the molecular systems biology community, that the proposed software framework is a useful tool for both practitioners and developers of spatial stochastic simulation algorithms. Through the combined efforts of algorithm development and improved modeling accuracy, increasingly complex biological models become feasible to study through computational methods. URDME is freely available at http://www.urdme.org.
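
    The Reaction-Diffusion Master Equation picture URDME builds on treats space as mesh voxels between which molecules hop with a rate set by the diffusion constant and voxel size. The toy below (our illustration: a 1-D regular mesh with pure diffusion; URDME handles unstructured meshes and reactions) makes the idea concrete.

        import math, random

        def rdme_diffusion(n_vox=20, n_mol=100, D=1.0, h=0.1, t_end=0.05):
            x = [0] * n_vox
            x[n_vox // 2] = n_mol            # start all molecules in the middle
            d = D / h**2                     # per-molecule hop propensity
            t = 0.0
            while t < t_end:
                total = 2.0 * d * n_mol      # every molecule may hop left/right
                t += -math.log(random.random()) / total
                r = random.randrange(n_mol)  # pick a molecule uniformly
                for i, n in enumerate(x):
                    if r < n:
                        break
                    r -= n
                j = i + random.choice((-1, 1))
                if 0 <= j < n_vox:           # reflecting walls via rejection
                    x[i] -= 1
                    x[j] += 1
            return x

        print(rdme_diffusion())  # molecules spread out from the center voxel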

  18. 3D Vectorial Time Domain Computational Integrated Photonics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kallman, J S; Bond, T C; Koning, J M

    2007-02-16

    The design of integrated photonic structures poses considerable challenges. 3D time-domain design tools are fundamental in enabling technologies such as all-optical logic, photonic bandgap sensors, THz imaging, and fast radiation diagnostics. Such technologies are essential to LLNL and WFO sponsors for a broad range of applications: encryption for communications and surveillance sensors (NSA, NAI and IDIV/PAT); high density optical interconnects for high-performance computing (ASCI); high-bandwidth instrumentation for NIF diagnostics; micro-sensor development for weapon miniaturization within the Stockpile Stewardship and DNT programs; and applications within HSO for CBNP detection devices. While there exist a number of photonics simulation tools on the market, they primarily model devices of interest to the communications industry. We saw the need to extend our previous software to match the Laboratory's unique emerging needs. These include modeling novel material effects (such as those of radiation-induced carrier concentrations on refractive index) and device configurations (RadTracker bulk optics with radiation-induced details, Optical Logic edge-emitting lasers with lateral optical inputs). In addition, we foresaw significant advantages to expanding our own internal simulation codes: parallel supercomputing could be incorporated from the start, and the simulation source code would be accessible for modification and extension. This work addressed Engineering's Simulation Technology Focus Area, specifically photonics. Problems addressed from the Engineering roadmap of the time included modeling the Auston switch (an important THz source/receiver), modeling Vertical Cavity Surface Emitting Lasers (VCSELs, which had been envisioned as part of fast radiation sensors), and multi-scale modeling of optical systems (for a variety of applications). We proposed to develop novel techniques to numerically solve the 3D multi-scale propagation problem for both the microchip laser logic devices as well as devices characterized by electromagnetic (EM) propagation in nonlinear materials with time-varying parameters. The deliverables for this project were extended versions of the laser logic device code Quench2D and the EM propagation code EMsolve, with new modules containing the novel solutions incorporated by taking advantage of the existing software interface and structured computational modules. Our approach was multi-faceted, since no single methodology can always satisfy the tradeoff between model runtime and accuracy requirements. We divided the problems to be solved into two main categories: those that required Full Wave Methods and those that could be modeled using Approximate Methods. Full Wave techniques are useful in situations where Maxwell's equations are not separable (or the problem is small in space and time), while approximate techniques can treat many of the remaining cases.

  19. Fast multipole method using Cartesian tensor in beam dynamic simulation

    DOE PAGES

    Zhang, He; Huang, He; Li, Rui; ...

    2017-03-06

    Here, the fast multipole method (FMM) using traceless totally symmetric Cartesian tensors to calculate the Coulomb interaction between charged particles is presented. The Cartesian tensor-based FMM can be generalized to treat other non-oscillating interactions with the help of the differential algebra or the truncated power series algebra. Issues in implementing the FMM in beam dynamics simulations are also discussed.
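
    To make the far-field idea behind such expansions concrete (a full FMM adds an octree, multipole-to-local translations, and local expansions; the sketch below only checks the moment expansion itself, with invented charges), the Cartesian monopole, dipole, and traceless quadrupole moments of a cluster approximate its Coulomb potential at a distant point:

    ```python
    # Hedged sketch: Cartesian multipole moments vs. direct summation.
    import numpy as np

    rng = np.random.default_rng(0)
    q = rng.uniform(-1.0, 1.0, 50)               # charges (illustrative)
    x = rng.uniform(-0.5, 0.5, (50, 3))          # positions clustered near the origin
    p = np.array([10.0, 2.0, 1.0])               # distant evaluation point

    Q0 = q.sum()                                  # monopole moment
    D = (q[:, None] * x).sum(axis=0)              # dipole moment
    r2 = (x * x).sum(axis=1)
    Qij = (q[:, None, None] * (3.0 * x[:, :, None] * x[:, None, :]
           - r2[:, None, None] * np.eye(3))).sum(axis=0)   # traceless quadrupole

    r = np.linalg.norm(p)
    phi_multipole = Q0 / r + D @ p / r**3 + 0.5 * (p @ Qij @ p) / r**5
    phi_direct = (q / np.linalg.norm(p - x, axis=1)).sum()
    print(phi_multipole, phi_direct)   # agree to O((a/r)^3) for cluster radius a
    ```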

  20. Thermal decomposition of solid phase nitromethane under various heating rates and target temperatures based on ab initio molecular dynamics simulations.

    PubMed

    Xu, Kai; Wei, Dong-Qing; Chen, Xiang-Rong; Ji, Guang-Fu

    2014-10-01

    Car-Parrinello molecular dynamics simulations were applied to study the thermal decomposition of solid-phase nitromethane under gradual heating and fast annealing conditions. In the gradual heating simulations, we found that, rather than C-N bond cleavage, intermolecular proton transfer is more likely to be the first reaction in the decomposition process. At high temperature, the first reaction in the fast annealing simulation is intermolecular proton transfer, leading to CH3NOOH and CH2NO2, whereas the initial chemical event at low temperature tends to be unimolecular C-N bond cleavage, producing CH3 and NO2 fragments. To our knowledge, this is the first time that the direct rupture of a C-N bond has been reported as the first reaction in solid-phase nitromethane. In addition, fast annealing simulations on a supercell at different temperatures were conducted to assess the effect of simulation cell size on the initial reaction mechanisms; the results are in qualitative agreement with the simulations on a unit cell. By analyzing the time evolution of selected molecules, we also found that the time of formation of the first water molecule is clearly sensitive to heating rates and target temperatures when the first reaction is an intermolecular proton transfer.

  1. SCALING AN URBAN EMERGENCY EVACUATION FRAMEWORK: CHALLENGES AND PRACTICES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karthik, Rajasekar; Lu, Wei

    2014-01-01

    Critical infrastructure disruption, caused by severe weather events, natural disasters, terrorist attacks, etc., has significant impacts on urban transportation systems. We built a computational framework to simulate urban transportation systems under critical infrastructure disruption in order to aid real-time emergency evacuation. This framework uses large-scale datasets to provide a scalable tool for emergency planning and management. Our framework, World-Wide Emergency Evacuation (WWEE), integrates population distribution and urban infrastructure networks to model travel demand in emergency situations at a global level. A computational model of agent-based traffic simulation is used to provide an optimal evacuation plan for traffic operation purposes [1]. In addition, our framework provides a web-based high-resolution visualization tool for emergency evacuation modelers and practitioners. We have successfully tested our framework with scenarios in both the United States (Alexandria, VA) and Europe (Berlin, Germany) [2]. However, there are still some major drawbacks to scaling this framework to handle big-data workloads in real time. On the back-end, lack of proper infrastructure limits our ability to process large amounts of data, run the simulation efficiently and quickly, and provide fast retrieval and serving of data. On the front-end, the visualization performance for microscopic evacuation results is still not efficient enough due to high-volume data communication between server and client. We are addressing these drawbacks by using cloud computing and next-generation web technologies, namely Node.js, NoSQL, WebGL, OpenLayers 3, and HTML5. We briefly describe each one and how we are using and leveraging these technologies to provide an efficient tool for emergency management organizations. Our early experimentation demonstrates that using the above technologies is a promising approach to building a scalable and high-performance urban emergency evacuation framework that can improve traffic mobility and safety under critical infrastructure disruption in today's socially connected world.

  2. Supervisory control of mobile sensor networks: math formulation, simulation, and implementation.

    PubMed

    Giordano, Vincenzo; Ballal, Prasanna; Lewis, Frank; Turchiano, Biagio; Zhang, Jing Bing

    2006-08-01

    This paper presents a novel discrete-event controller (DEC) for the coordination of cooperating heterogeneous wireless sensor networks (WSNs) containing both unattended ground sensors (UGSs) and mobile sensor robots. The DEC sequences the most suitable tasks for each agent and assigns sensor resources according to the current perception of the environment. A matrix formulation makes this DEC particularly useful for WSNs, where missions change and sensor agents may be added or may fail. WSNs have peculiarities that complicate their supervisory control. This paper therefore introduces several new tools for DEC design and operation, including methods for generating the required supervisory matrices based on mission planning, methods for modifying the matrices in the event of failed nodes or nodes entering the network, and a novel dynamic priority-assignment weighting approach for selecting the most appropriate and useful sensors for a given mission task. The resulting DEC represents a complete dynamical description of the WSN system, which allows fast programming of deployable WSNs, computer simulation analysis, and efficient implementation. The DEC has been implemented on an experimental wireless-sensor-network prototyping system. Both simulation and experimental results are presented to show the effectiveness and versatility of the developed control architecture.
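
    A heavily simplified sketch of a matrix-formulated DEC (the supervisory matrices and two-task mission below are invented, and resources are treated as reusable; the paper's formulation is richer) shows the core firing logic, where boolean matrices map completed tasks and free resources to enabled tasks:

    ```python
    # Toy matrix DEC: a task fires when all its prerequisite tasks are done and
    # all its required resources are available. Matrices are illustrative only.
    import numpy as np

    Fv = np.array([[0, 0],          # task 0 needs no prior task
                   [1, 0]])         # task 1 needs task 0 completed
    Fr = np.array([[1, 0],          # task 0 needs resource 0 (e.g., a UGS)
                   [0, 1]])         # task 1 needs resource 1 (e.g., a mobile robot)

    done = np.array([0, 0])         # completed-task vector
    res = np.array([1, 1])          # available-resource vector (assumed reusable)

    for step in range(3):
        enabled = ((Fv @ done >= Fv.sum(axis=1)) &      # all prerequisites met
                   (Fr @ res >= Fr.sum(axis=1)) &       # all resources present
                   (done == 0))                         # not fired yet
        print(f"step {step}: firing tasks {np.flatnonzero(enabled)}")
        done = done | enabled.astype(int)               # mark fired tasks as done
    ```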

  3. Parametric investigations of plasma characteristics in a remote inductively coupled plasma system

    NASA Astrophysics Data System (ADS)

    Shukla, Prasoon; Roy, Abhra; Jain, Kunal; Bhoj, Ananth

    2016-09-01

    Designing a remote plasma system involves source chamber sizing, selection of coils and/or electrodes to power the plasma, design of the downstream tubes, selection of materials used in the source and downstream regions, locations of inlets and outlets, and finally optimization of the process parameter space of pressure, gas flow rates, and power delivery. Simulations can aid in spatial and temporal plasma characterization at what are often inaccessible locations for experimental probes in the source chamber. In this paper, we report on simulations of a remote inductively coupled argon plasma system using the modeling platform CFD-ACE+. The coupled multiphysics model successfully addresses flow, chemistry, electromagnetics, heat transfer, and plasma transport in the remote plasma system. The SimManager tool enables easy setup of parametric simulations to investigate the effect of varying the pressure, power, frequency, flow rates, and downstream tube lengths. It can also automatically vary these parameters to optimize a user-defined objective function, such as the integrated ion and radical fluxes at the wafer. The fast run time, coupled with the parametric and optimization capabilities, can add significant insight and value in design and optimization.
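
    The kind of parametric sweep and objective-driven search the record attributes to SimManager can be pictured as follows (this is not CFD-ACE+ or SimManager code; run_plasma_model is a hypothetical stand-in for one full coupled simulation, and the objective weighting is invented):

    ```python
    # Hedged sketch of a parametric sweep with a user-defined objective.
    from itertools import product

    def run_plasma_model(pressure_mtorr: float, power_w: float) -> dict:
        # placeholder for launching one coupled plasma simulation and returning
        # integrated fluxes at the wafer (the toy formulas below are invented)
        return {"ion_flux": power_w / pressure_mtorr,
                "radical_flux": 0.01 * power_w}

    def objective(result: dict) -> float:
        # user-defined figure of merit, e.g. weighted integral fluxes at the wafer
        return result["ion_flux"] + 10.0 * result["radical_flux"]

    best = max(((p, w, objective(run_plasma_model(p, w)))
                for p, w in product([5, 10, 20, 50], [300, 600, 900])),
               key=lambda t: t[2])
    print("best (pressure [mTorr], power [W], objective):", best)
    ```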

  4. The fast encryption package

    NASA Technical Reports Server (NTRS)

    Bishop, Matt

    1988-01-01

    The organization of some tools to help improve password security at a UNIX-based site is described, along with how to install and use them. These tools and their associated library enable a site to force users to pick reasonably safe passwords ("safe" being site-configurable) and enable site management to try to crack existing passwords. The library contains various versions of a very fast implementation of the Data Encryption Standard and of the one-way encryption functions used to encrypt the password.
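
    The two roles the package serves (proactive checking of new passwords and dictionary attacks on existing ones) can be sketched as below; hashlib stands in for the package's fast DES-based one-way crypt(), and the users, salts, and word list are invented:

    ```python
    # Hedged sketch of dictionary-based password cracking against one-way hashes.
    import hashlib

    def hash_pw(password: str, salt: str) -> str:
        # stand-in one-way function (the original package used fast DES crypt)
        return hashlib.sha256((salt + password).encode()).hexdigest()

    shadow = {"alice": ("ab", hash_pw("dragon", "ab")),       # weak password
              "bob":   ("cd", hash_pw("x9$Lq!7TzW", "cd"))}   # stronger password

    wordlist = ["password", "dragon", "letmein", "qwerty"]    # site-configurable

    for user, (salt, stored) in shadow.items():
        for word in wordlist:
            if hash_pw(word, salt) == stored:                 # hash and compare
                print(f"{user}: weak password found -> {word!r}")
    ```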

  5. Hybrid stochastic simulation of reaction-diffusion systems with slow and fast dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strehl, Robert; Ilie, Silvana, E-mail: silvana@ryerson.ca

    2015-12-21

    In this paper, we present a novel hybrid method to simulate discrete stochastic reaction-diffusion models arising in biochemical signaling pathways. We study moderately stiff systems, for which we can partition each reaction or diffusion channel into either a slow or fast subset, based on its propensity. Numerical approaches missing this distinction are often limited with respect to computational run time or approximation quality. We design an approximate scheme that remedies these pitfalls by using a new blending strategy of the well-established inhomogeneous stochastic simulation algorithm and the tau-leaping simulation method. The advantages of our hybrid simulation algorithm are demonstrated on three benchmarking systems, with special focus on approximation accuracy and efficiency.
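
    The slow/fast partitioning idea can be illustrated with a two-channel toy system (this is not the authors' algorithm; species, rates, and the fixed leap interval are invented, and real schemes choose the leap adaptively): the high-propensity channel is advanced with Poisson leaps while the low-propensity channel keeps exact SSA timing.

    ```python
    # Hedged hybrid sketch: A -> B is fast (tau-leaped), B -> C is slow (exact SSA).
    import numpy as np

    rng = np.random.default_rng(2)
    A, B, C = 1000, 0, 0
    k_fast, k_slow = 10.0, 0.01
    t, t_end, tau = 0.0, 1.0, 0.01      # fixed leap interval (illustrative)

    while t < t_end:
        a_fast = k_fast * A             # propensity of the fast channel A -> B
        a_slow = k_slow * B             # propensity of the slow channel B -> C
        dt_slow = rng.exponential(1.0 / a_slow) if a_slow > 0 else np.inf
        if dt_slow < tau:               # slow reaction fires first: do it exactly
            B -= 1; C += 1
            n = rng.poisson(a_fast * dt_slow)   # fast firings during the wait
            t += dt_slow
        else:                           # otherwise just leap the fast channel
            n = rng.poisson(a_fast * tau)
            t += tau
        n = min(n, A)                   # guard against negative populations
        A -= n; B += n

    print("final (A, B, C):", A, B, C)
    ```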

  6. A fast sorting algorithm for a hypersonic rarefied flow particle simulation on the connection machine

    NASA Technical Reports Server (NTRS)

    Dagum, Leonardo

    1989-01-01

    The data parallel implementation of a particle simulation for hypersonic rarefied flow described by Dagum associates a single parallel data element with each particle in the simulation. The simulated space is divided into discrete regions called cells containing a variable and constantly changing number of particles. The implementation requires a global sort of the parallel data elements so as to arrange them in an order that allows immediate access to the information associated with cells in the simulation. Described here is a very fast algorithm for performing the necessary ranking of the parallel data elements. The performance of the new algorithm is compared with that of the microcoded instruction for ranking on the Connection Machine.
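
    On a serial machine the core of such a ranking can be sketched with a counting sort over cell indices (the original algorithm is data-parallel and Connection Machine specific; the loop below is a simplified stand-in for its parallel scans):

    ```python
    # Hedged sketch: rank particles so that each cell's particles are contiguous.
    import numpy as np

    rng = np.random.default_rng(3)
    n_cells, n_particles = 8, 20
    cell = rng.integers(0, n_cells, n_particles)    # cell index of each particle

    counts = np.bincount(cell, minlength=n_cells)   # particles per cell
    start = np.concatenate(([0], np.cumsum(counts)[:-1]))  # first rank per cell

    rank = np.empty(n_particles, dtype=int)
    offset = start.copy()
    for p in range(n_particles):        # serial stand-in for a parallel prefix scan
        rank[p] = offset[cell[p]]
        offset[cell[p]] += 1

    order = np.argsort(rank)            # gather particles cell by cell
    print(cell[order])                  # cell indices now appear in sorted blocks
    ```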

  7. A Three-dimensional Simulation of a Magnetized Accretion Disk: Fast Funnel Accretion onto a Weakly Magnetized Star

    NASA Astrophysics Data System (ADS)

    Takasao, Shinsuke; Tomida, Kengo; Iwasaki, Kazunari; Suzuki, Takeru K.

    2018-04-01

    We present the results of a global, three-dimensional magnetohydrodynamics simulation of an accretion disk with a rotating, weakly magnetized central star. The disk is threaded by a weak, large-scale poloidal magnetic field, and the central star has no strong stellar magnetosphere initially. Our simulation investigates the structure of the accretion flows from a turbulent accretion disk onto the star. The simulation reveals that fast accretion onto the star at high latitudes occurs even without a stellar magnetosphere. We find that the failed disk wind becomes the fast, high-latitude accretion as a result of angular momentum exchange mediated by magnetic fields well above the disk, where the Lorentz force that decelerates the rotational motion of gas can be comparable to the centrifugal force. Unlike the classical magnetospheric accretion scenario, fast accretion streams are not guided by magnetic fields of the stellar magnetosphere. Nevertheless, the accretion velocity reaches the free-fall velocity at the stellar surface due to the efficient angular momentum loss at a location distant from the star. This study provides a possible explanation for why Herbig Ae/Be stars, whose magnetic fields are generally not strong enough to form magnetospheres, also show indications of fast accretion. A magnetically driven jet is not formed from the disk in our model. The differential rotation cannot generate sufficiently strong magnetic fields for jet acceleration because the Parker instability interrupts the field amplification.

  8. The Open Source Snowpack modelling ecosystem

    NASA Astrophysics Data System (ADS)

    Bavay, Mathias; Fierz, Charles; Egger, Thomas; Lehning, Michael

    2016-04-01

    As a large number of numerical snow models are available, a few stand out as quite mature and widespread. One such model is SNOWPACK, the Open Source model that is developed at the WSL Institute for Snow and Avalanche Research SLF. Over the years, various tools have been developed around SNOWPACK in order to expand its use or to integrate additional features. Today, the model is part of a whole ecosystem that has evolved to offer both seamless integration and high modularity, so that each tool can easily be used outside the ecosystem. Many of these Open Source tools undergo their own autonomous development and are successfully used in their own right in other models and applications. There is Alpine3D, the spatially distributed version of SNOWPACK, which forces it with terrain-corrected radiation fields and optionally with blowing and drifting snow. This model can be used on parallel systems (either with OpenMP or MPI) and has been used for applications ranging from climate change to reindeer herding. There is the MeteoIO pre-processing library that offers fully integrated data access, data filtering, data correction, data resampling, and spatial interpolations. This library is now used by several other models and applications. There is the SnopViz snow profile visualization library and application that supports both measured and simulated snow profiles (relying on the CAAML standard) as well as time series. This JavaScript application can be used standalone without any internet connection or served on the web together with simulation results. There is the OSPER data platform effort with a data management service (built on the Global Sensor Network (GSN) platform) as well as a data documenting system (metadata management as a wiki). There are several distributed hydrological models for mountainous areas in ongoing development that require very little information about the soil structure, based on the assumption that in steep terrain the most relevant information is contained in the Digital Elevation Model (DEM). There is finally a set of tools making up the operational chain to automatically run, monitor, and publish SNOWPACK simulations for operational avalanche warning purposes. This tool chain has been developed with the aim of offering very low-maintenance operation, very fast deployment, and easy adaptation to other avalanche services.

  9. The Virtual Watershed Observatory: Cyberinfrastructure for Model-Data Integration and Access

    NASA Astrophysics Data System (ADS)

    Duffy, C.; Leonard, L. N.; Giles, L.; Bhatt, G.; Yu, X.

    2011-12-01

    The Virtual Watershed Observatory (VWO) is a concept whereby scientists, water managers, educators, and the general public can create a virtual observatory from integrated hydrologic model results, national databases, and historical or real-time observations via web services. In this paper, we propose a prototype of automated and virtualized web services software using national data products for climate reanalysis, soils, geology, terrain, and land cover. The VWO has the broad purpose of making water resource simulations, real-time data assimilation, calibration, and archival accessible at the scale of HUC 12 (Hydrologic Unit Code) watersheds anywhere in the continental US. Our prototype for model-data integration focuses on creating tools for fast data storage from selected national databases, as well as the computational resources necessary for a dynamic, distributed watershed simulation. The paper describes cyberinfrastructure tools and a workflow that attempt to resolve the problem of model-data accessibility and scalability such that individuals, research teams, managers, and educators can create a VWO in a desired context. Examples are given for the NSF-funded Shale Hills Critical Zone Observatory and the European Critical Zone Observatories within the SoilTrEC project. In the future, implementation of VWO services will benefit from the development of a cloud cyberinfrastructure as the prototype evolves toward data- and model-intensive computation for continental-scale water resource predictions.

  10. Benchmarking short sequence mapping tools

    PubMed Central

    2013-01-01

    Background The development of next-generation sequencing instruments has led to the generation of millions of short sequences in a single run. The process of aligning these reads to a reference genome is time consuming and demands the development of fast and accurate alignment tools. However, currently available tools make different compromises between the accuracy and the speed of mapping. Moreover, many important aspects are overlooked while comparing the performance of a newly developed tool to the state of the art. Therefore, there is a need for an objective evaluation method that covers all these aspects. In this work, we introduce a benchmarking suite to extensively analyze sequencing tools with respect to various aspects and provide an objective comparison. Results We applied our benchmarking tests to 9 well-known mapping tools, namely Bowtie, Bowtie2, BWA, SOAP2, MAQ, RMAP, GSNAP, Novoalign, and mrsFAST (mrFAST), using synthetic data and real RNA-Seq data. MAQ and RMAP are based on building hash tables for the reads, whereas the remaining tools are based on indexing the reference genome. The benchmarking tests reveal the strengths and weaknesses of each tool. The results show that no single tool outperforms all others in all metrics. However, Bowtie maintained the best throughput for most of the tests, while BWA performed better for longer read lengths. The benchmarking tests are not restricted to the mentioned tools and can be further applied to others. Conclusion The mapping process is still a hard problem that is affected by many factors. In this work, we provide a benchmarking suite that reveals and evaluates the different factors affecting the mapping process. Still, no tool outperforms all of the others in all the tests. Therefore, end users should clearly specify their needs in order to choose the tool that provides the best results. PMID:23758764

  11. Fast calibration of electromagnetically tracked oblique-viewing rigid endoscopes.

    PubMed

    Liu, Xinyang; Rice, Christina E; Shekhar, Raj

    2017-10-01

    The oblique-viewing (i.e., angled) rigid endoscope is a commonly used tool in conventional endoscopic surgeries. The relative rotation between its two movable parts, the telescope and the camera head, creates a rotation offset between an object and its projection in the camera image, and a calibration method tailored to compensate for this offset is needed. We developed a fast calibration method for oblique-viewing rigid endoscopes suitable for clinical use. In contrast to prior approaches based on optical tracking, we used electromagnetic (EM) tracking as the external tracking hardware to improve compactness and practicality. Two EM sensors were mounted on the telescope and the camera head, respectively, with considerations to minimize EM tracking errors. Single-image calibration was incorporated into the method, and a sterilizable plate, laser-marked with the calibration pattern, was also developed. Furthermore, we proposed a general algorithm to estimate the rotation center in the camera image. Formulas for updating the camera matrix in terms of clockwise and counterclockwise rotations were also developed. The proposed calibration method was validated using a conventional [Formula: see text], 5-mm laparoscope. Freehand calibrations were performed using the proposed method, and the calibration time averaged 2 min and 8 s. The calibration accuracy was evaluated in a simulated clinical setting with several surgical tools present in the magnetic field of EM tracking. The root-mean-square re-projection error averaged 4.9 pixels (range 2.4-8.5 pixels, with image resolution of [Formula: see text]) for rotation angles ranging from [Formula: see text] to [Formula: see text]. We developed a method for fast and accurate calibration of oblique-viewing rigid endoscopes. The method was also designed to be performed in the operating room and will therefore support clinical translation of many emerging endoscopic computer-assisted surgical systems.
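
    The rotation-offset compensation can be pictured as composing the calibrated projection with an in-image rotation about the estimated rotation center (a hedged sketch, not the authors' exact formulas; the intrinsics, center, and angle below are invented, and the sign of the angle selects clockwise versus counterclockwise rotation):

    ```python
    # Hedged sketch: update a 3x3 camera/projection matrix K for a relative
    # telescope/camera-head rotation of theta radians about image point c.
    import numpy as np

    def rotate_about_center(K: np.ndarray, theta: float, c: np.ndarray) -> np.ndarray:
        ct, st = np.cos(theta), np.sin(theta)
        # 2D homogeneous rotation about c: x' = R x + (c - R c)
        R = np.array([[ct, -st, c[0] - ct * c[0] + st * c[1]],
                      [st,  ct, c[1] - st * c[0] - ct * c[1]],
                      [0.0, 0.0, 1.0]])
        return R @ K

    K = np.array([[800.0, 0.0, 320.0],      # invented intrinsics
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
    K_rot = rotate_about_center(K, np.deg2rad(30.0), np.array([320.0, 240.0]))
    print(K_rot)
    ```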

  12. Measurement of the passive fast-ion D-alpha emission on the NSTX-U tokamak

    DOE PAGES

    Hao, G. Z.; Heidbrink, W. W.; Liu, D.; ...

    2018-01-08

    On the National Spherical Torus Experiment Upgrade, the passive fast-ion D-alpha (passive-FIDA) spectra from charge exchange (CX) between beam ions and background neutrals are measured and simulated. The results indicate that the passive-FIDA signal is measurable and comparable to the active-FIDA on several channels, such as at the major radius R = 117 cm. Here, active-FIDA means the active D-alpha emission from fast ions that charge exchange with the injected neutrals. The shapes of the measured spectra are in agreement with FIDASIM simulations on many fibers. Furthermore, the passive-FIDA spatial profile agrees with the simulation. When making measurements of active-FIDA in the edge region using time-slice subtraction, variations in the passive-FIDA contribution to the signal should be considered.

  13. Measurement of the passive fast-ion D-alpha emission on the NSTX-U tokamak

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hao, G. Z.; Heidbrink, W. W.; Liu, D.

    On the National Spherical Torus Experiment Upgrade, the passive fast-ion D-alpha (passive-FIDA) spectra from charge exchange (CX) between beam ions and background neutrals are measured and simulated. The results indicate that the passive-FIDA signal is measurable and comparable to the active-FIDA on several channels, such as at the major radius R = 117 cm. Here, active-FIDA means the active D-alpha emission from fast ions that charge exchange with the injected neutrals. The shapes of the measured spectra are in agreement with FIDASIM simulations on many fibers. Furthermore, the passive-FIDA spatial profile agrees with the simulation. When making measurements of active-FIDA in the edge region using time-slice subtraction, variations in the passive-FIDA contribution to the signal should be considered.

  14. Integrated HI emission in galaxy groups and clusters

    NASA Astrophysics Data System (ADS)

    Ai, Mei; Zhu, Ming; Fu, Jian

    2017-09-01

    The integrated HI emission from hierarchical structures such as groups and clusters of galaxies can be detected by FAST at intermediate redshifts. Here we propose to use FAST to study the evolution of the global HI content of clusters and groups over cosmic time by measuring their integrated HI emission. We use the Virgo Cluster as an example to estimate the detection limit of FAST, and have estimated the integration time needed to detect a Virgo-type cluster at different redshifts (from z = 0.1 to z = 1.5). We have also employed a semi-analytic model (SAM) to simulate the evolution of HI contents in galaxy clusters. Our simulations suggest that the HI mass of a Virgo-like cluster could be 2-3 times higher and the physical size could be more than 50% smaller when redshift increases from z = 0.3 to z = 1. Thus the integration time could be reduced significantly, and gas-rich clusters at intermediate redshifts can be detected by FAST in less than 2 hours of integration time. For the local Universe, we have also used SAM simulations to create mock catalogs of clusters to predict the outcomes of FAST all-sky surveys. Comparing with the optically selected catalogs derived by cross-matching the galaxy catalogs from the SDSS survey and the ALFALFA survey, we find that the HI mass distribution of the mock catalog with 20 s of integration time agrees well with that of observations. However, the mock catalog with 120 s of integration time predicts many more groups and clusters that contain a population of low-mass HI galaxies not detected by the ALFALFA survey. A future deep blind HI sky survey with FAST would be able to test this prediction and set constraints on numerical simulation models. The observational strategy and sample selection for future FAST observations of galaxy clusters at high redshifts are also discussed.

  15. An approach to value-based simulator selection: The creation and evaluation of the simulator value index tool.

    PubMed

    Rooney, Deborah M; Hananel, David M; Covington, Benjamin J; Dionise, Patrick L; Nykamp, Michael T; Pederson, Melvin; Sahloul, Jamal M; Vasquez, Rachael; Seagull, F Jacob; Pinsky, Harold M; Sweier, Domenica G; Cooke, James M

    2018-04-01

    Currently there is no reliable, standardized mechanism to support health care professionals during the evaluation and procurement of simulators. A tool founded on best practices could facilitate simulator purchase processes. In a 3-phase process, we identified the top factors considered during the simulator purchase process through expert consensus (n = 127), created the Simulator Value Index (SVI) tool, evaluated targeted validity evidence, and evaluated the practical value of the SVI. A web-based survey was sent to simulation professionals. Participants (n = 79) used the SVI and provided feedback. We evaluated the practical value of 4 tool variations by calculating their sensitivity in predicting the preferred simulator. Seventeen top factors were identified and ranked. The top 2 were technical stability/reliability of the simulator and customer service, with no practical differences in rank across institution or stakeholder role. Full SVI variations successfully predicted the preferred simulator with good (87%) sensitivity, whereas sensitivity decreased (≤54%) for variations restricted to cost and customer service or to cost and technical stability. The majority (73%) of participants agreed that the SVI was helpful in guiding simulator purchase decisions, and 88% agreed the SVI tool would help facilitate discussion with peers and leadership. Our findings indicate the SVI supports the simulator purchase process using a standardized framework. Sensitivity of the tool improved when factors extended beyond those traditionally targeted. We propose that the tool will facilitate discussion amongst professionals dealing with simulation, provide essential information for finance and procurement professionals, and improve the long-term value of simulation solutions. Limitations and applications of the tool are discussed.

  16. Electromagnetic variable degrees of freedom actuator systems and methods

    DOEpatents

    Montesanti, Richard C [Pleasanton, CA; Trumper, David L [Plaistow, NH; Kirtley, Jr., James L.

    2009-02-17

    The present invention provides a variable reluctance actuator system and method that can be adapted for simultaneous rotation and translation of a moving element by applying a normal-direction magnetic flux on the moving element. In a beneficial example arrangement, the moving element includes a swing arm that carries a cutting tool at a set radius from an axis of rotation so as to produce a rotary fast tool servo that provides tool motion in a direction substantially parallel to the surface normal of a workpiece at the point of contact between the cutting tool and workpiece. An actuator rotates the swing arm such that the cutting tool moves toward and away from a mounted rotating workpiece in a controlled manner in order to machine the workpiece. Position sensors provide rotation and displacement information for the swing arm to a control system. The control system commands and coordinates motion of the fast tool servo with the motion of a spindle, rotating table, cross-feed slide, and infeed slide of a precision lathe.

  17. Distributed simulation for formation flying applications

    NASA Technical Reports Server (NTRS)

    Sohl, Garett A.; Udomkesmalee, Santi; Kellogg, Jennifer L.

    2005-01-01

    High fidelity engineering simulation plays a key role in the rapidly developing field of space-based formation flying. This paper describes the design and implementation of the Formation Algorithms and Simulation Testbed (FAST).

  18. Development of FAST.Farm: A New Multiphysics Engineering Tool for Wind Farm Design and Analysis: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jonkman, Jason; Annoni, Jennifer; Hayman, Greg

    2017-01-01

    This paper presents the development of FAST.Farm, a new multiphysics tool applicable to engineering problems in research and industry involving wind farm performance and cost optimization, which is needed to address the underperformance, failures, and expenses plaguing the wind industry. Achieving wind cost-of-energy targets - which requires improvements in wind farm performance and reliability, together with reduced uncertainty and expenditures - has proven elusive because of the complicated nature of the wind farm design problem, especially the sophisticated interaction between atmospheric phenomena, wake dynamics, and array effects. FAST.Farm aims to balance the need for accurate modeling of the relevant physics for predicting power performance and loads while maintaining low computational cost to support a highly iterative and probabilistic design process and system-wide optimization. FAST.Farm makes use of FAST to model the aero-hydro-servo-elastics of distinct turbines in the wind farm, and it is based on some of the principles of the Dynamic Wake Meandering (DWM) model, but avoids many of the limitations of existing DWM implementations.

  19. Modeling scintillator and WLS fiber signals for fast Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Sánchez, F. A.; Medina-Tanco, G.

    2010-08-01

    In this work we present a fast, robust, and flexible procedure to simulate the electronic signals of scintillator units: plastic scintillator material embedded with a wavelength-shifting optical fiber coupled to a photomultiplier tube which, in turn, is plugged into a front-end electronics board. The simple rationale behind the simulation chain allows the procedure to be adapted to a broad range of detectors based on such units. We show that, in order to produce realistic results, the simulation parameters can be properly calibrated against laboratory measurements and used thereafter as input to the simulations. Simulated signals of atmospheric background cosmic-ray muons are presented and their main features analyzed and validated using actual measured data. Conversely, for any given practical application, the present simulation scheme can be used to find an adequate combination of photomultiplier tube and optical fiber at the prototyping stage.
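
    The simulation chain lends itself to a compact sketch (the photoelectron yield, decay constant, and pulse width below are invented stand-ins for the kind of calibration constants the paper measures in the laboratory, and the single-photoelectron response is assumed Gaussian):

    ```python
    # Hedged sketch of one simulated muon signal: Poisson photoelectron statistics,
    # exponential scintillation/WLS decay times, Gaussian single-pe PMT response.
    import numpy as np

    rng = np.random.default_rng(4)
    n_pe = rng.poisson(40)                    # photoelectrons per muon (assumed mean)
    t_pe = rng.exponential(7.0, n_pe)         # ns, decay-time constant (assumed)

    t = np.arange(0.0, 120.0, 0.5)            # ns, sampled waveform
    sigma = 2.0                               # ns, single-pe pulse width (assumed)
    signal = np.zeros_like(t)
    for ti in t_pe:                           # superpose single-pe pulses
        signal += np.exp(-0.5 * ((t - ti) / sigma) ** 2)

    print(f"{n_pe} photoelectrons, peak amplitude {signal.max():.1f} (arb. units)")
    ```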

  20. 3D FEM Simulation of Flank Wear in Turning

    NASA Astrophysics Data System (ADS)

    Attanasio, Aldo; Ceretti, Elisabetta; Giardini, Claudio

    2011-05-01

    This work deals with tool wear simulation. Studying the influence of tool wear on tool life, tool substitution policy, final part quality, surface integrity, cutting forces, and power consumption is important for reducing global process costs. Adhesion, abrasion, erosion, diffusion, corrosion, and fracture are some of the phenomena responsible for tool wear, depending on the selected cutting parameters: cutting velocity, feed rate, depth of cut, etc. In some cases these wear mechanisms are described by analytical models as functions of process variables (temperature, pressure, and sliding velocity along the cutting surface). Such analytical models are suitable for implementation in FEM codes and can be utilized to simulate tool wear. In the present paper a commercial 3D FEM software package has been customized to simulate tool wear during turning operations when cutting AISI 1045 carbon steel with an uncoated tungsten carbide tip. The FEM software was extended by means of a subroutine able to modify the tool geometry on the basis of the estimated tool wear as the simulation proceeds. Since, for the considered tool-workpiece material pair, the main wear-generating phenomena are abrasion and diffusion, the tool wear model implemented in the subroutine combines Usui's model with Takeyama and Murata's model. A comparison between experimental and simulated flank tool wear curves is reported, demonstrating that it is possible to simulate tool wear development.
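
    The wear-law combination can be sketched numerically as below (all constants are invented for illustration; the calibrated constants and the coupling to FEM fields of pressure, velocity, and temperature are what the paper's subroutine actually supplies):

    ```python
    # Hedged sketch of a combined abrasive/adhesive (Usui-type) and diffusive
    # (Takeyama-Murata-type) wear-rate law integrated over cutting time.
    import math

    A, B = 1.0e-6, 2.5e3        # Usui-type constants (invented)
    C, Q = 4.0e-6, 8.0e3        # diffusive-term constants (invented)

    def wear_rate(p: float, v: float, T: float) -> float:
        usui = A * p * v * math.exp(-B / T)   # pressure- and velocity-driven term
        diffusive = C * math.exp(-Q / T)      # temperature-driven diffusion term
        return usui + diffusive

    W, dt = 0.0, 0.1            # flank wear [mm], time step [s]
    for step in range(600):     # one minute of cutting
        # steady-state local values a FEM run would supply at the flank (assumed)
        p, v, T = 1.2e3, 200.0 / 60.0, 1100.0     # MPa, m/s (200 m/min), K
        W += wear_rate(p, v, T) * dt

    print(f"simulated flank wear after 60 s: {W:.4f} mm")
    ```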

  1. A Quasiphysics Intelligent Model for a Long Range Fast Tool Servo

    PubMed Central

    Liu, Qiang; Zhou, Xiaoqin; Lin, Jieqiong; Xu, Pengzi; Zhu, Zhiwei

    2013-01-01

    Accurately modeling the dynamic behaviors of a fast tool servo (FTS) is one of the key issues in the ultraprecision positioning of the cutting tool. Herein, a quasiphysics intelligent model (QPIM) integrating a linear physics model (LPM) and a radial basis function (RBF) based neural model (NM) is developed to accurately describe the dynamic behaviors of a voice coil motor (VCM) actuated long-range fast tool servo (LFTS). To identify the parameters of the LPM, a novel Opposition-based Self-adaptive Replacement Differential Evolution (OSaRDE) algorithm is proposed, which has been shown to converge faster without compromising solution quality and to outperform similar evolutionary algorithms considered for comparison. The modeling errors of the LPM and the QPIM are investigated by experiments. The modeling error of the LPM presents an obvious trend component, about ±1.15% of the full span range, verifying the efficiency of the proposed OSaRDE algorithm for system identification. As for the QPIM, the trend component in the residual error of the LPM is well suppressed, and the error of the QPIM remains at the noise level. All the results verify the efficiency and superiority of the proposed modeling and identification approaches. PMID:24163627
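
    The quasiphysics idea (a linear physics model plus an RBF network trained on its residual) can be illustrated on synthetic data (this is not the paper's QPIM or its OSaRDE identification; the data, centers, and widths are invented):

    ```python
    # Hedged sketch: linear model + Gaussian RBF correction of its residual.
    import numpy as np

    rng = np.random.default_rng(5)
    u = np.linspace(-1.0, 1.0, 200)                 # actuator input
    y = 2.0 * u + 0.3 * np.sin(4.0 * u) + 0.01 * rng.standard_normal(u.size)

    g = (u @ y) / (u @ u)                           # least-squares linear gain (LPM)
    residual = y - g * u                            # trend the linear model misses

    centers, s = np.linspace(-1.0, 1.0, 12), 0.2    # RBF centers and width (assumed)
    Phi = np.exp(-((u[:, None] - centers[None, :]) / s) ** 2)
    w, *_ = np.linalg.lstsq(Phi, residual, rcond=None)  # fit RBF weights

    y_model = g * u + Phi @ w                       # combined "quasiphysics" model
    print("linear rms error  :", np.sqrt(np.mean(residual ** 2)))
    print("combined rms error:", np.sqrt(np.mean((y - y_model) ** 2)))
    ```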

  2. Fast Flows in the Magnetotail and Energetic Particle Transport: Multiscale Coupling in the Magnetosphere

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Wang, X.; Fok, M. C. H.; Buzulukova, N.; Perez, J. D.; Chen, L. J.

    2017-12-01

    The interaction between the Earth's inner and outer magnetospheric regions associated with tail fast flows is calculated by coupling the Auburn 3-D global hybrid simulation code (ANGIE3D) to the Comprehensive Inner Magnetosphere/Ionosphere (CIMI) model. The global hybrid code solves fully kinetic equations governing the ions and a fluid model for electrons in the self-consistent electromagnetic field of the dayside and nightside outer magnetosphere. In the integrated computational model, the hybrid simulation provides the CIMI model with field data in the CIMI 3-D domain and particle data at its boundary, and the transport in the inner magnetosphere is calculated by the CIMI model. By joining the two existing codes, effects of the solar wind on particle transport through the outer magnetosphere into the inner magnetosphere are investigated. Our simulation shows that fast flows and flux ropes are localized transients in the magnetotail plasma sheet and that their overall structures have a dawn-dusk asymmetry. Strong perpendicular ion heating is found at the fast-flow braking, which affects the earthward transport of entropy-depleted bubbles. We report on the impacts of the temperature anisotropy and non-Maxwellian ion distributions associated with the fast flows on the ring current and the convection electric field.

  3. Vehicle Technology Simulation and Analysis Tools | Transportation Research

    Science.gov Websites

    NREL develops vehicle technology simulation and analysis tools for evaluating vehicle technologies with the potential to achieve significant fuel savings and emission reductions. Among these is the Automotive Deployment Options Projection Tool (ADOPT), a modeling tool that estimates vehicle technology deployment.

  4. Time-domain separation of interfering waves in cancellous bone using bandlimited deconvolution: simulation and phantom study.

    PubMed

    Wear, Keith A

    2014-04-01

    In through-transmission interrogation of cancellous bone, two longitudinal pulses ("fast" and "slow" waves) may be generated. Fast and slow wave properties convey information about material and micro-architectural characteristics of bone. However, these properties can be difficult to assess when fast and slow wave pulses overlap in time and frequency domains. In this paper, two methods are applied to decompose signals into fast and slow waves: bandlimited deconvolution and modified least-squares Prony's method with curve-fitting (MLSP + CF). The methods were tested in plastic and Zerdine(®) samples that provided fast and slow wave velocities commensurate with velocities for cancellous bone. Phase velocity estimates were accurate to within 6 m/s (0.4%) (slow wave with both methods and fast wave with MLSP + CF) and 26 m/s (1.2%) (fast wave with bandlimited deconvolution). Midband signal loss estimates were accurate to within 0.2 dB (1.7%) (fast wave with both methods), and 1.0 dB (3.7%) (slow wave with both methods). Similar accuracies were found for simulations based on fast and slow wave parameter values published for cancellous bone. These methods provide sufficient accuracy and precision for many applications in cancellous bone such that experimental error is likely to be a greater limiting factor than estimation error.

  5. Comparison of a multimedia simulator to a human model for teaching FAST exam image interpretation and image acquisition.

    PubMed

    Damewood, Sara; Jeanmonod, Donald; Cadigan, Beth

    2011-04-01

    This study compared the effectiveness of a multimedia ultrasound (US) simulator to normal human models during the practical portion of a course designed to teach the skills of both image acquisition and image interpretation for the Focused Assessment with Sonography for Trauma (FAST) exam. This was a prospective, blinded, controlled education study using medical students as an US-naïve population. After a standardized didactic lecture on the FAST exam, trainees were separated into two groups to practice image acquisition on either a multimedia simulator or a normal human model. Four outcome measures were then assessed: image interpretation of prerecorded FAST exams, adequacy of image acquisition on a standardized normal patient, perceived confidence of image adequacy, and time to image acquisition. Ninety-two students were enrolled and separated into two groups, a multimedia simulator group (n = 44), and a human model group (n = 48). Bonferroni adjustment factor determined the level of significance to be p = 0.0125. There was no difference between those trained on the multimedia simulator and those trained on a human model in image interpretation (median 80 of 100 points, interquartile range [IQR] 71-87, vs. median 78, IQR 62-86; p = 0.16), image acquisition (median 18 of 24 points, IQR 12-18 points, vs. median 16, IQR 14-20; p = 0.95), trainee's confidence in obtaining images on a 1-10 visual analog scale (median 5, IQR 4.1-6.5, vs. median 5, IQR 3.7-6.0; p = 0.36), or time to acquire images (median 3.8 minutes, IQR 2.7-5.4 minutes, vs. median 4.5 minutes, IQR 3.4-5.9 minutes; p = 0.044). There was no difference in teaching the skills of image acquisition and interpretation to novice FAST examiners using the multimedia simulator or normal human models. These data suggest that practical image acquisition skills learned during simulated training can be directly applied to human models.

  6. siMacro: A Fast and Easy Data Processing Tool for Cell-Based Genomewide siRNA Screens.

    PubMed

    Singh, Nitin Kumar; Seo, Bo Yeun; Vidyasagar, Mathukumalli; White, Michael A; Kim, Hyun Seok

    2013-03-01

    Growing numbers of studies employ cell line-based systematic short interfering RNA (siRNA) screens to study gene functions and to identify drug targets. As multiple sources of variation unique to siRNA screens exist, there is a growing demand for a computational tool that generates normalized values and standardized scores. However, only a few tools have been available so far, with limited usability. Here, we present siMacro, a fast and easy-to-use Microsoft Office Excel-based tool with a graphic user interface, designed to process single-condition or two-condition synthetic screen datasets. siMacro normalizes position and batch effects, censors outlier samples, and calculates Z-scores and robust Z-scores, with a spreadsheet output of >120,000 samples in under 1 minute.
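
    The two scores the record names have standard definitions, sketched below (siMacro itself is an Excel macro; this illustration only shows the formulas, with a made-up plate-well vector):

    ```python
    # Hedged sketch of Z-scores and robust Z-scores (median/MAD based).
    import numpy as np

    def z_scores(x: np.ndarray) -> np.ndarray:
        return (x - x.mean()) / x.std(ddof=1)

    def robust_z_scores(x: np.ndarray) -> np.ndarray:
        med = np.median(x)
        mad = np.median(np.abs(x - med))       # median absolute deviation
        return (x - med) / (1.4826 * mad)      # 1.4826 makes MAD sigma-consistent

    x = np.array([0.9, 1.1, 1.0, 0.95, 5.0])   # invented readings; one outlier well
    print(z_scores(x).round(2))
    print(robust_z_scores(x).round(2))         # flags the outlier more sharply
    ```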

  7. siMacro: A Fast and Easy Data Processing Tool for Cell-Based Genomewide siRNA Screens

    PubMed Central

    Singh, Nitin Kumar; Seo, Bo Yeun; Vidyasagar, Mathukumalli; White, Michael A.

    2013-01-01

    Growing numbers of studies employ cell line-based systematic short interfering RNA (siRNA) screens to study gene functions and to identify drug targets. As multiple sources of variation unique to siRNA screens exist, there is a growing demand for a computational tool that generates normalized values and standardized scores. However, only a few tools have been available so far, with limited usability. Here, we present siMacro, a fast and easy-to-use Microsoft Office Excel-based tool with a graphic user interface, designed to process single-condition or two-condition synthetic screen datasets. siMacro normalizes position and batch effects, censors outlier samples, and calculates Z-scores and robust Z-scores, with a spreadsheet output of >120,000 samples in under 1 minute. PMID:23613684

  8. A study on the optimum fast neutron flux for boron neutron capture therapy of deep-seated tumors.

    PubMed

    Rasouli, Fatemeh S; Masoudi, S Farhad

    2015-02-01

    High-energy neutrons, known as fast neutrons, have a number of undesirable biological effects on tissue and are a challenging problem in beam design for Boron Neutron Capture Therapy (BNCT). Despite this, there is no widely accepted criterion to guide the beam designer in determining the appropriate contribution of fast neutrons to the spectrum. Although a number of researchers have proposed a target value for the ratio of fast neutron flux to epithermal neutron flux, it can be shown that this criterion may not provide the optimum treatment condition. This simulation study deals with the determination of the optimum contribution of fast neutron flux in the beam for BNCT of deep-seated tumors. Since the dose due to these high-energy neutrons damages shallow tissues, the delivered dose to skin is considered a measure for determining the acceptability of the designed beam. To this end, various beam shaping assemblies that result in different contributions of fast neutron flux are designed. The performance of the neutron beams corresponding to such configurations is assessed in a simulated head phantom. It is shown that the previously used criterion, which suggests a limit value for the contribution of fast neutrons in the beam, does not necessarily provide the optimum condition. Accordingly, it is important to specify other complementary limits considering the energy of fast neutrons. By analyzing various neutron spectra, two limits on fast neutron flux are proposed and their validity is investigated. The results show that considering these limits together with the widely accepted IAEA criteria makes it possible to have a more realistic assessment of the sufficiency of the designed beam. Satisfying these criteria not only reduces the delivered dose to skin, but also increases the advantage depth in tissue and the delivered dose to tumor during the treatment time. The Monte Carlo code MCNP-X is used to perform these simulations.

  9. High-Performance Computing for the Electromagnetic Modeling and Simulation of Interconnects

    NASA Technical Reports Server (NTRS)

    Schutt-Aine, Jose E.

    1996-01-01

    The electromagnetic modeling of packages and interconnects plays a very important role in the design of high-speed digital circuits, and is most efficiently performed by using computer-aided design algorithms. In recent years, packaging has become a critical area in the design of high-speed communication systems and fast computers, and the importance of the software support for their development has increased accordingly. Throughout this project, our efforts have focused on the development of modeling and simulation techniques and algorithms that permit the fast computation of the electrical parameters of interconnects and the efficient simulation of their electrical performance.

  10. Ultrafast High Accuracy PCRTM_SOLAR Model for Cloudy Atmosphere

    NASA Technical Reports Server (NTRS)

    Yang, Qiguang; Liu, Xu; Wu, Wan; Yang, Ping; Wang, Chenxi

    2015-01-01

    An ultrafast, high-accuracy PCRTM_SOLAR model is developed based on PCA compression and the principal component-based radiative transfer model (PCRTM). A fast algorithm for simulating the multiple-scattering properties of clouds and/or aerosols is integrated into the fast infrared PCRTM. We completed radiance simulation and training for instruments such as IASI, AIRS, CrIS, NASTI, and SHIS under diverse conditions. The new model is 5 orders of magnitude faster than 52-stream DISORT, with very high accuracy for cloudy-sky radiative transfer simulation. It is suitable for hyperspectral remote sensing data assimilation and cloudy-sky retrievals.
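
    The PCA-compression step at the heart of a principal-component radiative transfer model can be illustrated on synthetic spectra (this is not PCRTM code; the spectra and component count are invented, and a real model predicts the few PC scores directly instead of all channels):

    ```python
    # Hedged sketch: compress many-channel spectra to a few principal components.
    import numpy as np

    rng = np.random.default_rng(6)
    n_train, n_chan = 300, 2000
    basis = np.stack([np.sin(np.linspace(0.0, (k + 1) * np.pi, n_chan))
                      for k in range(5)])            # smooth synthetic structures
    spectra = rng.standard_normal((n_train, 5)) @ basis

    mean = spectra.mean(axis=0)
    U, S, Vt = np.linalg.svd(spectra - mean, full_matrices=False)
    pcs = Vt[:6]                                     # keep a handful of components

    scores = (spectra - mean) @ pcs.T                # compressed representation
    recon = scores @ pcs + mean                      # channel-space reconstruction
    print("max reconstruction error:", np.abs(recon - spectra).max())
    ```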

  11. Analysis of the OPERA 15-pin experiment with SABRE-2P. [LMFBR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rose, S.D.; Carbajo, J.J.

    The OPERA (Out-of-Pile Expulsion and Reentry Apparatus) experiment simulates the initial phase of a pump coastdown without scram of a liquid-metal fast breeder reactor, specifically the Fast Flux Test Facility. The test section is a 15-pin 60° triangular sector designed to simulate a full-size 61-pin hexagonal bundle. A previous study indicates this to be an adequate simulation. In this paper, experimental results from the OPERA 15-pin experiment performed at ANL in 1982 are compared to analytical calculations obtained with the SABRE-2P code at ORNL.

  12. CABS-flex: server for fast simulation of protein structure fluctuations

    PubMed Central

    Jamroz, Michal; Kolinski, Andrzej; Kmiecik, Sebastian

    2013-01-01

    The CABS-flex server (http://biocomp.chem.uw.edu.pl/CABSflex) implements CABS-model–based protocol for the fast simulations of near-native dynamics of globular proteins. In this application, the CABS model was shown to be a computationally efficient alternative to all-atom molecular dynamics—a classical simulation approach. The simulation method has been validated on a large set of molecular dynamics simulation data. Using a single input (user-provided file in PDB format), the CABS-flex server outputs an ensemble of protein models (in all-atom PDB format) reflecting the flexibility of the input structure, together with the accompanying analysis (residue mean-square-fluctuation profile and others). The ensemble of predicted models can be used in structure-based studies of protein functions and interactions. PMID:23658222

  13. CABS-flex: Server for fast simulation of protein structure fluctuations.

    PubMed

    Jamroz, Michal; Kolinski, Andrzej; Kmiecik, Sebastian

    2013-07-01

    The CABS-flex server (http://biocomp.chem.uw.edu.pl/CABSflex) implements CABS-model-based protocol for the fast simulations of near-native dynamics of globular proteins. In this application, the CABS model was shown to be a computationally efficient alternative to all-atom molecular dynamics--a classical simulation approach. The simulation method has been validated on a large set of molecular dynamics simulation data. Using a single input (user-provided file in PDB format), the CABS-flex server outputs an ensemble of protein models (in all-atom PDB format) reflecting the flexibility of the input structure, together with the accompanying analysis (residue mean-square-fluctuation profile and others). The ensemble of predicted models can be used in structure-based studies of protein functions and interactions.

  14. Simulator platform for fast reactor operation and safety technology demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vilim, R. B.; Park, Y. S.; Grandy, C.

    2012-07-30

    A simulator platform for visualization and demonstration of innovative concepts in fast reactor technology is described. The objective is to make the workings of fast reactor technology innovations more accessible, and to do so in a human factors environment that uses state-of-the-art visualization technologies. In this work, the computer codes in use at Argonne National Laboratory (ANL) for the design of fast reactor systems are being integrated to run on this platform. This includes linking reactor systems codes with mechanical structures codes and using advanced graphics to depict the thermo-hydraulic-structure interactions that give rise to an inherently safe response to upsets. It also includes visualization of mechanical systems operation, including advanced concepts that make use of robotics for operations, in-service inspection, and maintenance.

  15. Physiological changes in fast and slow muscle with simulated weightlessness

    NASA Technical Reports Server (NTRS)

    Dettbarn, W. D.; Misulis, K. E.

    1984-01-01

    A rat hindlimb suspension model of simulated weightlessness was used to examine the physiological characteristics of skeletal muscle. The physiological sequelae of hindlimb suspension were compared to those of spinal cord section, denervation by sciatic nerve crush, and control. Muscles examined were the predominantly slow (Type 1) soleus (SOL) and the predominantly fast (Type 2) extensor digitorum longus (EDL). Two procedures that alter motor unit activity, hindlimb suspension and spinal cord section, produce changes in the characteristics of skeletal muscles that are dependent upon fiber type. The SOL develops characteristics more representative of a fast muscle, including a smaller Type 1 fiber proportion and higher AChE activity. The EDL, which is already predominantly fast, loses most of its few Type 1 fibers, thus also becoming faster. These data are in agreement with studies in which rats experienced actual weightlessness.

  16. Proceedings of the 1987 conference on tools for the simulation profession

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hawkins, R.; Klukis, K.

    1987-01-01

    This book covers the proceedings of the 1987 conference on tools for the simulation profession. Some of the topics are: SIMULACT: a generic tool for simulating distributed systems; ESL language simulation of spacecraft batteries; and Trends in global cadmium levels from increased use of fossil fuels.

  17. Fast multipole methods on a cluster of GPUs for the meshless simulation of turbulence

    NASA Astrophysics Data System (ADS)

    Yokota, R.; Narumi, T.; Sakamaki, R.; Kameoka, S.; Obi, S.; Yasuoka, K.

    2009-11-01

    Recent advances in the parallelizability of fast N-body algorithms, and the programmability of graphics processing units (GPUs) have opened a new path for particle based simulations. For the simulation of turbulence, vortex methods can now be considered as an interesting alternative to finite difference and spectral methods. The present study focuses on the efficient implementation of the fast multipole method and pseudo-particle method on a cluster of NVIDIA GeForce 8800 GT GPUs, and applies this to a vortex method calculation of homogeneous isotropic turbulence. The results of the present vortex method agree quantitatively with that of the reference calculation using a spectral method. We achieved a maximum speed of 7.48 TFlops using 64 GPUs, and the cost performance was near 9.4/GFlops. The calculation of the present vortex method on 64 GPUs took 4120 s, while the spectral method on 32 CPUs took 4910 s.

  18. Love-Wave Sensors Combined with Microfluidics for Fast Detection of Biological Warfare Agents

    PubMed Central

    Matatagui, Daniel; Fontecha, José Luis; Fernández, María Jesús; Gràcia, Isabel; Cané, Carles; Santos, José Pedro; Horrillo, María Carmen

    2014-01-01

    The following paper examines a time-efficient method for detecting biological warfare agents (BWAs). The method is based on a system of a Love-wave immunosensor combined with a microfluidic chip which detects BWA samples in a dynamic mode. In this way a continuous flow-through of the sample is created, promoting the reaction between antigen and antibody and allowing a fast detection of the BWAs. In order to prove this method, static and dynamic modes have been simulated and different concentrations of BWA simulants have been tested with two immunoreactions: phage M13 has been detected using the mouse monoclonal antibody anti-M13 (AM13), and the rabbit immunoglobulin (Rabbit IgG) has been detected using the polyclonal antibody goat anti-rabbit (GAR). Finally, different concentrations of each BWA simulants have been detected with a fast response time and a desirable level of discrimination among them has been achieved. PMID:25029282

  19. IgSimulator: a versatile immunosequencing simulator.

    PubMed

    Safonova, Yana; Lapidus, Alla; Lill, Jennie

    2015-10-01

    The recent introduction of next-generation sequencing technologies to antibody studies has resulted in a growing number of immunoinformatics tools for antibody repertoire analysis. However, benchmarking these newly emerging tools remains problematic, since the gold standard datasets needed to validate them are typically not available. Since simulating antibody repertoires is often the only feasible way to benchmark new immunoinformatics tools, we developed the IgSimulator tool, which addresses various complications in generating realistic antibody repertoires. IgSimulator's code has a modular structure and can easily be adapted to new simulation requirements. IgSimulator is open source and freely available as a C++ and Python program running on all Unix-compatible platforms. The source code is available from yana-safonova.github.io/ig_simulator. Contact: safonova.yana@gmail.com. Supplementary data are available at Bioinformatics online.

  20. RTSPM: real-time Linux control software for scanning probe microscopy.

    PubMed

    Chandrasekhar, V; Mehta, M M

    2013-01-01

    Real time computer control is an essential feature of scanning probe microscopes, which have become important tools for the characterization and investigation of nanometer scale samples. Most commercial (and some open-source) scanning probe data acquisition software uses digital signal processors to handle the real time data processing and control, which adds to the expense and complexity of the control software. We describe here scan control software that uses a single computer and a data acquisition card to acquire scan data. The computer runs an open-source real time Linux kernel, which permits fast acquisition and control while maintaining a responsive graphical user interface. Images from a simulated tuning-fork based microscope as well as a standard topographical sample are also presented, showing some of the capabilities of the software.
