Sample records for scientific simulation codes

  1. Software quality and process improvement in scientific simulation codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ambrosiano, J.; Webster, R.

    1997-11-01

    This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. The study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. The study is descriptive rather than scientific.

  2. Open-source framework for documentation of scientific software written on MATLAB-compatible programming languages

    NASA Astrophysics Data System (ADS)

    Konnik, Mikhail V.; Welsh, James

    2012-09-01

    Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing code base of a numerical simulator makes it difficult to continue supporting the code itself. Inadequate documentation of the astronomical software for adaptive optics simulators can complicate development, since the documentation must contain up-to-date schemes and mathematical descriptions of what is implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation facilities, these are often insufficient for describing a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LaTeX, Mercurial, Doxygen, and Perl. Using a Perl script that translates the MATLAB comments in M-files into C-like comments, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, as well as guidelines for the framework deployment. Examples of the code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
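
    The authors implement the comment translation with a Perl script; purely as a hypothetical illustration of the same filtering idea (not the framework's actual tool), a minimal Doxygen input filter could be written in Python and registered through Doxygen's FILTER_PATTERNS option:

    ```python
    #!/usr/bin/env python3
    """Minimal sketch of a Doxygen input filter for MATLAB M-files.

    Illustrative only (the framework described above uses Perl). Register it in
    the Doxyfile with, e.g., FILTER_PATTERNS = *.m=./m2cpp_filter.py
    """
    import sys

    def filter_mfile(text):
        out = []
        for line in text.splitlines():
            stripped = line.lstrip()
            if stripped.startswith("%"):
                # Turn MATLAB comments into C++-style comments so Doxygen's
                # C/C++ parser can pick up \brief, \param, formulas, etc.
                out.append(line.replace("%", "///", 1))
            elif stripped.startswith("function"):
                # Expose the function signature, terminated like a C prototype.
                out.append(stripped + ";")
            else:
                # Hide the remaining MATLAB statements from Doxygen.
                out.append("// " + line)
        return "\n".join(out)

    if __name__ == "__main__":
        with open(sys.argv[1], encoding="utf-8") as fh:
            print(filter_mfile(fh.read()))
    ```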

  3. A suite of exercises for verifying dynamic earthquake rupture codes

    USGS Publications Warehouse

    Harris, Ruth A.; Barall, Michael; Aagaard, Brad T.; Ma, Shuo; Roten, Daniel; Olsen, Kim B.; Duan, Benchun; Liu, Dunyu; Luo, Bin; Bai, Kangchen; Ampuero, Jean-Paul; Kaneko, Yoshihiro; Gabriel, Alice-Agnes; Duru, Kenneth; Ulrich, Thomas; Wollherr, Stephanie; Shi, Zheqiang; Dunham, Eric; Bydlon, Sam; Zhang, Zhenguo; Chen, Xiaofei; Somala, Surendra N.; Pelties, Christian; Tago, Josue; Cruz-Atienza, Victor Manuel; Kozdon, Jeremy; Daub, Eric; Aslam, Khurram; Kase, Yuko; Withers, Kyle; Dalguer, Luis

    2018-01-01

    We describe a set of benchmark exercises that are designed to test if computer codes that simulate dynamic earthquake rupture are working as intended. These types of computer codes are often used to understand how earthquakes operate, and they produce simulation results that include earthquake size, amounts of fault slip, and the patterns of ground shaking and crustal deformation. The benchmark exercises examine a range of features that scientists incorporate in their dynamic earthquake rupture simulations. These include implementations of simple or complex fault geometry, off‐fault rock response to an earthquake, stress conditions, and a variety of formulations for fault friction. Many of the benchmarks were designed to investigate scientific problems at the forefronts of earthquake physics and strong ground motions research. The exercises are freely available on our website for use by the scientific community.

  4. A Framework for Testing Scientific Software: A Case Study of Testing Amsterdam Discrete Dipole Approximation Software

    NASA Astrophysics Data System (ADS)

    Shao, Hongbing

    Testing of scientific software systems often suffers from the test oracle problem, i.e., the lack of test oracles. The Amsterdam discrete dipole approximation code (ADDA) is a scientific software system that can be used to simulate light scattering by scatterers of various types, and its testing suffers from the test oracle problem. In this thesis work, I established a testing framework for scientific software systems and evaluated it using ADDA as a case study. To test ADDA, I first used the CMMIE code as a pseudo-oracle for simulating light scattering by a homogeneous sphere scatterer. Comparable results were obtained between ADDA and the CMMIE code, validating ADDA for homogeneous sphere scatterers. Then I used an experimental result obtained for light scattering by a homogeneous sphere to further validate the use of ADDA with sphere scatterers; ADDA produced light scattering simulations comparable to the experimentally measured result. Finally, I used metamorphic testing to generate test cases covering scatterers of various geometries, orientations, and degrees of homogeneity. ADDA was tested under each of these test cases and all tests passed. The use of statistical analysis together with metamorphic testing is discussed as a future direction. In short, using ADDA as a case study, I established a testing framework, including the use of pseudo-oracles, experimental results, and metamorphic testing techniques, to test scientific software systems that suffer from the test oracle problem. Each of these techniques is necessary and contributes to the testing of the software under test.
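
    The specific metamorphic relations are not reproduced in this record; as a hedged sketch of the general idea (assuming a hypothetical wrapper run_adda_qext that invokes ADDA and returns the extinction efficiency), one such relation exploits the scale invariance of electromagnetic scattering: for a fixed refractive index, scaling the sphere radius and the wavelength by the same factor leaves the size parameter, and hence the efficiency, unchanged, so no exact oracle is needed:

    ```python
    import math

    def run_adda_qext(radius_um, wavelength_um, refractive_index):
        """Hypothetical helper that runs the ADDA executable (e.g. via subprocess)
        for a homogeneous sphere and returns the extinction efficiency Q_ext."""
        raise NotImplementedError("invoke ADDA here")

    def test_scale_invariance_metamorphic_relation():
        # Metamorphic relation: identical size parameter x = 2*pi*r/lambda and
        # identical refractive index => identical Q_ext, up to numerical tolerance.
        q1 = run_adda_qext(radius_um=0.5, wavelength_um=0.6328, refractive_index=1.5)
        q2 = run_adda_qext(radius_um=1.0, wavelength_um=1.2656, refractive_index=1.5)
        assert math.isclose(q1, q2, rel_tol=1e-3)
    ```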

  5. μπ: A Scalable and Transparent System for Simulating MPI Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perumalla, Kalyan S

    2010-01-01

    μπ is a scalable, transparent system for experimenting with the execution of parallel programs on simulated computing platforms. The level of simulated detail can be varied for application behavior as well as for machine characteristics. Unique features of μπ are repeatability of execution, scalability to millions of simulated (virtual) MPI ranks, scalability to hundreds of thousands of host (real) MPI ranks, portability of the system to a variety of host supercomputing platforms, and the ability to experiment with scientific applications whose source code is available. The set of source-code interfaces supported by μπ is being expanded to support a wider set of applications, and MPI-based scientific computing benchmarks are being ported. In proof-of-concept experiments, μπ has been successfully exercised to spawn and sustain very large-scale executions of an MPI test program given in source-code form. Low slowdowns are observed, due to its use of a purely discrete-event style of execution and the scalability and efficiency of the underlying parallel discrete event simulation engine, μsik. In the largest runs, μπ has been executed on up to 216,000 cores of a Cray XT5 supercomputer, successfully simulating over 27 million virtual MPI ranks, each virtual rank containing its own thread context, and all ranks fully synchronized by virtual time.
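
    The record does not include code; as a purely conceptual sketch of the discrete-event style of execution mentioned above (not the system's actual API), virtual ranks can be driven by a priority queue of timestamped message events so that all ranks stay synchronized in virtual time:

    ```python
    import heapq

    class VirtualRank:
        """Toy virtual MPI rank that reacts to messages at virtual times."""
        def __init__(self, rank, n_ranks):
            self.rank, self.n_ranks = rank, n_ranks

        def on_message(self, now, hops, schedule):
            # Pass a token once around a virtual ring, one virtual time unit per hop.
            if hops < self.n_ranks:
                schedule(now + 1.0, (self.rank + 1) % self.n_ranks, hops + 1)

    def simulate(n_ranks=4):
        ranks = [VirtualRank(r, n_ranks) for r in range(n_ranks)]
        events = [(0.0, 0, 0)]                       # (virtual time, dest rank, hops)

        def schedule(t, dest, hops):
            heapq.heappush(events, (t, dest, hops))

        while events:
            now, dest, hops = heapq.heappop(events)  # always lowest virtual time first
            print(f"t={now:4.1f}  rank {dest} received token (hop {hops})")
            ranks[dest].on_message(now, hops, schedule)

    if __name__ == "__main__":
        simulate()
    ```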

  6. Monte Carlo Particle Lists: MCPL

    NASA Astrophysics Data System (ADS)

    Kittelmann, T.; Klinkby, E.; Knudsen, E. B.; Willendrup, P.; Cai, X. X.; Kanaki, K.

    2017-09-01

    A binary format with lists of particle state information, for interchanging particles between various Monte Carlo simulation applications, is presented. Portable C code for file manipulation is made available to the scientific community, along with converters and plugins for several popular simulation packages.
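
    The MCPL binary layout itself is defined by the referenced C code and paper; purely as an illustration of the general "list of particle states" idea (a hypothetical, simplified layout, not the MCPL specification), a flat record format can be handled with a numpy structured dtype:

    ```python
    import numpy as np

    # Hypothetical, simplified particle-record layout for illustration only;
    # the real MCPL format defines its own header and field encoding.
    particle_dtype = np.dtype([
        ("pdg_code", np.int32),            # particle type in PDG numbering
        ("position_cm", np.float64, 3),
        ("direction", np.float64, 3),      # unit vector
        ("ekin_mev", np.float64),
        ("time_ms", np.float64),
        ("weight", np.float64),
    ])

    def write_particles(path, particles):
        np.asarray(particles, dtype=particle_dtype).tofile(path)

    def read_particles(path):
        return np.fromfile(path, dtype=particle_dtype)

    if __name__ == "__main__":
        neutrons = np.zeros(2, dtype=particle_dtype)
        neutrons["pdg_code"] = 2112
        neutrons["direction"] = [0.0, 0.0, 1.0]
        neutrons["ekin_mev"] = [2.5e-8, 1.0]       # thermal and 1 MeV neutrons
        neutrons["weight"] = 1.0
        write_particles("particles.bin", neutrons)
        print(read_particles("particles.bin"))
    ```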

  7. COCOA code for creating mock observations of star cluster models

    NASA Astrophysics Data System (ADS)

    Askar, Abbas; Giersz, Mirek; Pych, Wojciech; Dalessandro, Emanuele

    2018-04-01

    We introduce and present results from the COCOA (Cluster simulatiOn Comparison with ObservAtions) code that has been developed to create idealized mock photometric observations using results from numerical simulations of star cluster evolution. COCOA is able to present the output of realistic numerical simulations of star clusters carried out using Monte Carlo or N-body codes in a way that is useful for direct comparison with photometric observations. In this paper, we describe the COCOA code and demonstrate its different applications by utilizing globular cluster (GC) models simulated with the MOCCA (MOnte Carlo Cluster simulAtor) code. COCOA is used to synthetically observe these different GC models with optical telescopes, perform point spread function photometry, and subsequently produce observed colour-magnitude diagrams. We also use COCOA to compare the results from synthetic observations of a cluster model that has the same age and metallicity as the Galactic GC NGC 2808 with observations of the same cluster carried out with a 2.2 m optical telescope. We find that COCOA can effectively simulate realistic observations and recover photometric data. COCOA has numerous scientific applications that may be helpful for both theoreticians and observers who work on star clusters. Plans for further improving and developing the code are also discussed in this paper.

  8. Component Technology for High-Performance Scientific Simulation Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Epperly, T; Kohn, S; Kumfert, G

    2000-11-09

    We are developing scientific software component technology to manage the complexity of modern, parallel simulation software and increase the interoperability and re-use of scientific software packages. In this paper, we describe a language interoperability tool named Babel that enables the creation and distribution of language-independent software libraries using interface definition language (IDL) techniques. We have created a scientific IDL that focuses on the unique interface description needs of scientific codes, such as complex numbers, dense multidimensional arrays, complicated data types, and parallelism. Preliminary results indicate that in addition to language interoperability, this approach provides useful tools for thinking about the design of modern object-oriented scientific software libraries. Finally, we also describe a web-based component repository called Alexandria that facilitates the distribution, documentation, and re-use of scientific components and libraries.

  9. Toward Scientific Numerical Modeling

    NASA Technical Reports Server (NTRS)

    Kleb, Bil

    2007-01-01

    Ultimately, scientific numerical models need quantified output uncertainties so that modeling can evolve to better match reality. Documenting model input uncertainties and verifying that numerical models are translated into code correctly, however, are necessary first steps toward that goal. Without known input parameter uncertainties, model sensitivities are all one can determine, and without code verification, output uncertainties are simply not reliable. To address these two shortcomings, two proposals are offered: (1) an unobtrusive mechanism to document input parameter uncertainties in situ and (2) an adaptation of the Scientific Method to numerical model development and deployment. Because these two steps require changes in the computational simulation community to bear fruit, they are presented in terms of the Beckhard-Harris-Gleicher change model.
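
    The report proposes the mechanism without mandating an implementation; one minimal sketch of "unobtrusive, in-situ" documentation of input uncertainties (an illustration, not the paper's mechanism) is to carry the nominal value, its uncertainty, and its provenance in the object the model actually reads:

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class UncertainParam:
        """Input parameter that documents its own uncertainty in situ
        (illustrative sketch only)."""
        name: str
        value: float
        std_dev: float
        source: str = "unspecified"

        def __float__(self):              # models may use it like a plain number
            return self.value

    # Example: a rate coefficient with a documented 10% (assumed Gaussian) uncertainty.
    rate_coeff = UncertainParam("k_forward", value=3.2e-4, std_dev=3.2e-5,
                                source="bench-scale fit, assumed Gaussian")
    print(f"{rate_coeff.name} = {float(rate_coeff):.2e} +/- {rate_coeff.std_dev:.1e}")
    ```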

  10. MADNESS: A Multiresolution, Adaptive Numerical Environment for Scientific Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, Robert J.; Beylkin, Gregory; Bischoff, Florian A.

    2016-01-01

    MADNESS (multiresolution adaptive numerical environment for scientific simulation) is a high-level software environment for solving integral and differential equations in many dimensions that uses adaptive and fast harmonic analysis methods with guaranteed precision based on multiresolution analysis and separated representations. Underpinning the numerical capabilities is a powerful petascale parallel programming environment that aims to increase both programmer productivity and code scalability. This paper describes the features and capabilities of MADNESS and briefly discusses some current applications in chemistry and several areas of physics.

  11. Lawrence Livermore National Laboratories Perspective on Code Development and High Performance Computing Resources in Support of the National HED/ICF Effort

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clouse, C. J.; Edwards, M. J.; McCoy, M. G.

    2015-07-07

    Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world-leading numerical simulation capability for the National HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition, the ASC effort provides the high performance computing platform capabilities upon which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help ensure that numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.

  12. SIGNUM: A Matlab, TIN-based landscape evolution model

    NASA Astrophysics Data System (ADS)

    Refice, A.; Giachetta, E.; Capolongo, D.

    2012-08-01

    Several numerical landscape evolution models (LEMs) have been developed to date, and many are available as open source codes. Most are written in efficient programming languages such as Fortran or C, but often require additional coding effort to plug into more user-friendly data analysis and/or visualization tools to ease interpretation and scientific insight. In this paper, we present an effort to port a common core of accepted physical principles governing landscape evolution directly into a high-level language and data analysis environment such as Matlab. SIGNUM (an acronym for Simple Integrated Geomorphological Numerical Model) is an independent and self-contained Matlab, TIN-based landscape evolution model, built to simulate topography development at various space and time scales. SIGNUM is presently capable of simulating hillslope processes such as linear and nonlinear diffusion, fluvial incision into bedrock, spatially varying surface uplift (which can be used to simulate changes in base level, thrusting and faulting), as well as effects of climate changes. Although based on accepted and well-known processes and algorithms in its present version, it is built with a modular structure that allows users to easily modify and upgrade the simulated physical processes to suit virtually any need. The code is conceived as an open-source project, and is thus an ideal tool for both research and didactic purposes, thanks to the high-level nature of the Matlab environment and its popularity in the scientific community. In this paper the simulation code is presented together with some simple examples of surface evolution, and guidelines for development of new modules and algorithms are proposed.
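
    SIGNUM itself is a TIN-based Matlab code; as a grid-based, hedged illustration of the two governing processes named above (linear hillslope diffusion and detachment-limited fluvial incision, dz/dt = U + D*laplacian(z) - K*A^m*S^n), an explicit update step might look like:

    ```python
    import numpy as np

    def evolve_step(z, area, dt, dx, uplift=1e-3, diff=1e-2, k=1e-5, m=0.5, n=1.0):
        """One explicit Euler step of a toy landscape evolution model:
        dz/dt = uplift + diff * laplacian(z) - k * A**m * S**n
        (grid-based illustration only; SIGNUM works on a TIN in Matlab)."""
        lap = (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
               np.roll(z, 1, 1) + np.roll(z, -1, 1) - 4.0 * z) / dx**2
        gy, gx = np.gradient(z, dx)
        slope = np.hypot(gx, gy)
        return z + dt * (uplift + diff * lap - k * area**m * slope**n)

    # Tiny demo on a 50 x 50 grid with a uniform (placeholder) drainage area.
    z = np.random.default_rng(0).random((50, 50)) * 10.0      # initial elevation, m
    area = np.full_like(z, 1.0e4)           # m^2; a real model routes flow to get A
    for _ in range(100):
        z = evolve_step(z, area, dt=100.0, dx=50.0)
    print("relief after 10 kyr:", z.max() - z.min())
    ```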

  13. Metadata Management on the SCEC PetaSHA Project: Helping Users Describe, Discover, Understand, and Use Simulation Data in a Large-Scale Scientific Collaboration

    NASA Astrophysics Data System (ADS)

    Okaya, D.; Deelman, E.; Maechling, P.; Wong-Barnum, M.; Jordan, T. H.; Meyers, D.

    2007-12-01

    Large scientific collaborations, such as the SCEC Petascale Cyberfacility for Physics-based Seismic Hazard Analysis (PetaSHA) Project, involve interactions between many scientists who exchange ideas and research results. These groups must organize, manage, and make accessible their community materials of observational data, derivative (research) results, computational products, and community software. The integration of scientific workflows as a paradigm to solve complex computations provides advantages of efficiency, reliability, repeatability, choices, and ease of use. The underlying resource needed for a scientific workflow to function and create discoverable and exchangeable products is the construction, tracking, and preservation of metadata. In the scientific workflow environment there is a two-tier structure of metadata. Workflow-level metadata and provenance describe operational steps, identity of resources, execution status, and product locations and names. Domain-level metadata essentially define the scientific meaning of data, codes and products. To a large degree the metadata at these two levels are separate. However, between these two levels is a subset of metadata produced at one level but needed by the other. This crossover metadata suggests that some commonality in metadata handling is needed. SCEC researchers are collaborating with computer scientists at SDSC, the USC Information Sciences Institute, and Carnegie Mellon Univ. in order to perform earthquake science using high-performance computational resources. A primary objective of the "PetaSHA" collaboration is to perform physics-based estimations of strong ground motion associated with real and hypothetical earthquakes located within Southern California. Construction of 3D earth models, earthquake representations, and numerical simulation of seismic waves are key components of these estimations. Scientific workflows are used to orchestrate the sequences of scientific tasks and to access distributed computational facilities such as the NSF TeraGrid. Different types of metadata are produced and captured within the scientific workflows. One workflow within PetaSHA ("Earthworks") performs a linear sequence of tasks with workflow and seismological metadata preserved. Downstream scientific codes ingest these metadata produced by upstream codes. The seismological metadata use attribute-value pairing in plain text; an identified need is to use more advanced handling methods. Another workflow system within PetaSHA ("Cybershake") involves several complex workflows in order to perform statistical analysis of ground shaking due to thousands of hypothetical but plausible earthquakes. Metadata management has been challenging due to its construction around a number of legacy scientific codes. We describe difficulties arising in the scientific workflow due to the lack of this metadata and suggest corrective steps, which in some cases include the cultural shift of domain science programmers coding for metadata.

  14. A Measurement and Simulation Based Methodology for Cache Performance Modeling and Tuning

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    We present a cache performance modeling methodology that facilitates the tuning of uniprocessor cache performance for applications executing on shared memory multiprocessors by accurately predicting the effects of source code level modifications. Measurements on a single processor are initially used to identify parts of code where cache utilization improvements may significantly impact the overall performance. Cache simulation based on trace-driven techniques can be carried out without gathering detailed address traces. Minimal runtime information for modeling the cache performance of a selected code block includes: base virtual addresses of arrays, virtual addresses of variables, and loop bounds for that code block. The rest of the information is obtained from the source code. We show that the cache performance predictions are as reliable as those obtained through trace-driven simulations. This technique is particularly helpful for exploring various "what-if" scenarios regarding the cache performance impact of alternative code structures. We explain and validate this methodology using a simple matrix-matrix multiplication program. We then apply this methodology to predict and tune the cache performance of two realistic scientific applications taken from the Computational Fluid Dynamics (CFD) domain.
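
    As a hedged sketch of the kind of trace-driven estimate the methodology builds on (not the authors' tool), the address stream of the naive matrix-matrix multiplication case study can be generated from just the base addresses, element size, and loop bounds, and fed to a toy direct-mapped cache model:

    ```python
    def matmul_addresses(n, base_a=0, elem=8):
        """Data addresses touched by a naive, row-major C[i][j] += A[i][k] * B[k][j];
        only base addresses, element size, and loop bounds are required."""
        base_b = base_a + n * n * elem
        base_c = base_b + n * n * elem
        for i in range(n):
            for j in range(n):
                for k in range(n):
                    yield base_a + (i * n + k) * elem      # A[i][k]
                    yield base_b + (k * n + j) * elem      # B[k][j]
                    yield base_c + (i * n + j) * elem      # C[i][j]

    def miss_rate(addresses, cache_bytes=32 * 1024, line_bytes=64):
        """Miss rate of a toy direct-mapped cache over an address trace."""
        n_sets = cache_bytes // line_bytes
        tags = [None] * n_sets
        hits = misses = 0
        for addr in addresses:
            block = addr // line_bytes
            s, tag = block % n_sets, block // n_sets
            if tags[s] == tag:
                hits += 1
            else:
                tags[s] = tag
                misses += 1
        return misses / (hits + misses)

    print(f"toy miss rate, 64x64 matmul: {miss_rate(matmul_addresses(64)):.3f}")
    ```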

  15. MADNESS: A Multiresolution, Adaptive Numerical Environment for Scientific Simulation

    DOE PAGES

    Harrison, Robert J.; Beylkin, Gregory; Bischoff, Florian A.; ...

    2016-01-01

    We present MADNESS (multiresolution adaptive numerical environment for scientific simulation), a high-level software environment for solving integral and differential equations in many dimensions that uses adaptive and fast harmonic analysis methods with guaranteed precision, based on multiresolution analysis and separated representations. Underpinning the numerical capabilities is a powerful petascale parallel programming environment that aims to increase both programmer productivity and code scalability. This paper describes the features and capabilities of MADNESS and briefly discusses some current applications in chemistry and several areas of physics.

  16. NAS (Numerical Aerodynamic Simulation Program) technical summaries, March 1989 - February 1990

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Given here are selected scientific results from the Numerical Aerodynamic Simulation (NAS) Program's third year of operation. During this year, the scientific community was given access to a Cray-2 and a Cray Y-MP supercomputer. Topics covered include flow field analysis of fighter wing configurations, large-scale ocean modeling, the Space Shuttle flow field, advanced computational fluid dynamics (CFD) codes for rotary-wing airloads and performance prediction, turbulence modeling of separated flows, airloads and acoustics of rotorcraft, vortex-induced nonlinearities on submarines, and standing oblique detonation waves.

  17. The adaption and use of research codes for performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liebetrau, A.M.

    1987-05-01

    Models of real-world phenomena are developed for many reasons. The models are usually, if not always, implemented in the form of a computer code. The characteristics of a code are determined largely by its intended use. Realizations or implementations of detailed mathematical models of complex physical and/or chemical processes are often referred to as research or scientific (RS) codes. Research codes typically require large amounts of computing time. One example of an RS code is a finite-element code for solving complex systems of differential equations that describe mass transfer through some geologic medium. Considerable computing time is required because computations are done at many points in time and/or space. Codes used to evaluate the overall performance of real-world physical systems are called performance assessment (PA) codes. Performance assessment codes are used to conduct simulated experiments involving systems that cannot be directly observed. Thus, PA codes usually involve repeated simulations of system performance in situations that preclude the use of conventional experimental and statistical methods. 3 figs.

  18. A methodology for the rigorous verification of plasma simulation codes

    NASA Astrophysics Data System (ADS)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, a mathematical exercise aimed at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, aimed at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
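
    The solution-verification half of the procedure can be summarized in a few lines: given a quantity computed on three systematically refined grids, Richardson extrapolation yields both the observed order of accuracy and an estimate of the numerical error (a generic sketch of the standard formulas, not GBS code):

    ```python
    import math

    def observed_order(f_fine, f_mid, f_coarse, r=2.0):
        """Observed order of accuracy from solutions on grids refined by ratio r."""
        return math.log((f_coarse - f_mid) / (f_mid - f_fine)) / math.log(r)

    def richardson(f_fine, f_mid, p, r=2.0):
        """Extrapolated 'exact' value and estimated numerical error on the fine grid."""
        f_exact = f_fine + (f_fine - f_mid) / (r**p - 1.0)
        return f_exact, f_fine - f_exact

    # Example: a second-order scheme evaluated on grids of spacing h, h/2, h/4.
    f_h, f_h2, f_h4 = 1.0016, 1.0004, 1.0001
    p = observed_order(f_h4, f_h2, f_h)
    print(f"observed order ~ {p:.2f}")                 # close to 2 for these values
    print("extrapolated value, fine-grid error:", richardson(f_h4, f_h2, p))
    ```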

  19. Fourier-Bessel Particle-In-Cell (FBPIC) v0.1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehe, Remi; Kirchen, Manuel; Jalas, Soeren

    The Fourier-Bessel Particle-In-Cell code is a scientific simulation software package for relativistic plasma physics. It is a Particle-In-Cell code whose distinctive feature is to use a spectral decomposition in cylindrical geometry. This decomposition allows one to combine the advantages of spectral 3D Cartesian PIC codes (high accuracy and stability) and those of finite-difference cylindrical PIC codes with azimuthal decomposition (orders-of-magnitude speedup when compared to 3D simulations). The code is built on Python and can run both on CPU and GPU (the GPU runs being typically 1 or 2 orders of magnitude faster than the corresponding CPU runs). The code has the exact same output format as the open-source PIC codes Warp and PIConGPU (openPMD format: openpmd.org) and has a very similar input format as Warp (a Python script with many similarities). There is therefore tight interoperability between Warp and FBPIC, and this interoperability will increase even more in the future.

  20. A high performance scientific cloud computing environment for materials simulations

    NASA Astrophysics Data System (ADS)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offer functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.

  1. PerSEUS: Ultra-Low-Power High Performance Computing for Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Doxas, I.; Andreou, A.; Lyon, J.; Angelopoulos, V.; Lu, S.; Pritchett, P. L.

    2017-12-01

    The Peta-op SupErcomputing Unconventional System (PerSEUS) project aims to explore the use of ultra-low-power, mixed-signal, unconventional computational elements developed by Johns Hopkins University (JHU) for High Performance Scientific Computing (HPC), and to demonstrate that capability on both fluid and particle plasma codes. We will describe the JHU Mixed-signal Unconventional Supercomputing Elements (MUSE), and report initial results for the Lyon-Fedder-Mobarry (LFM) global magnetospheric MHD code and a UCLA general-purpose relativistic Particle-In-Cell (PIC) code.

  2. LANL LDRD-funded project: Test particle simulations of energetic ions in natural and artificial radiation belts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cowee, Misa; Liu, Kaijun; Friedel, Reinhard H.

    2012-07-17

    We summarize the scientific problem and work plan for the LANL LDRD-funded project to use a test particle code to study the sudden de-trapping of inner belt protons and possible cross-L transport of debris ions after a high altitude nuclear explosion (HANE). We also discuss future application of the code for other HANE-related problems.

  3. Simulating Scenes In Outer Space

    NASA Technical Reports Server (NTRS)

    Callahan, John D.

    1989-01-01

    Multimission Interactive Picture Planner, MIP, is a computer program for scientifically accurate and fast three-dimensional animation of scenes in deep space. Versatile, reasonably comprehensive, and portable, it runs on microcomputers. New techniques were developed to perform rapidly the calculations and transformations necessary to animate scenes in scientifically accurate three-dimensional space. Written in FORTRAN 77. Primarily designed to handle Voyager, Galileo, and Space Telescope; adapted to handle other missions.

  4. TTCI's Scientific Software Suite and NUCARS Overview

    DOT National Transportation Integrated Search

    2015-06-30

    On June 30-July 1 of 2015 the FRA held the Best Practices Workshop on VTI Simulation at the Volpe Center in Cambridge, Massachusetts. The two day workshop was attended by representatives from the government, code developers, researchers, academia, an...

  5. Subsurface Transport Over Multiple Phases Demonstration Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-01-05

    The STOMP simulator is a suite of numerical simulators developed by Pacific Northwest National Laboratory for addressing problems involving coupled multifluid hydrologic, thermal, geochemical, and geomechanical processes in the subsurface. The simulator has been applied to problems concerning environmental remediation, environmental stewardship, carbon sequestration, conventional petroleum production, and the production of unconventional hydrocarbon fuels. The simulator is copyrighted by Battelle Memorial Institute and is available outside of PNNL via use agreements. To promote the open exchange of scientific ideas, the simulator is provided as source code. A demonstration version of the simulator has been developed, which will provide potential new users with an executable (not source code) implementation of the software royalty free. Demonstration versions will be offered via the STOMP website for all currently available operational modes of the simulator. The demonstration versions of the simulator will be configured with the direct banded linear system solver and have a limit of 1,000 active grid cells. This will provide potential new users with an opportunity to apply the code to simple problems, including many of the STOMP short course problems, without having to pay a license fee. Users will be required to register on the STOMP website prior to receiving an executable.

  6. Code Modernization of VPIC

    NASA Astrophysics Data System (ADS)

    Bird, Robert; Nystrom, David; Albright, Brian

    2017-10-01

    The ability of scientific simulations to effectively deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code for modern hardware generations, but despite the breakthroughs in the areas of mini-app development, portable performance, and cache-oblivious algorithms, the problem still remains largely unsolved. In this work we demonstrate how a focus on platform-agnostic modern code development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsics-based vectorisation with compiler-generated auto-vectorisation to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU-capable OpenMP variant of VPIC. Finally, we include lessons learnt. Work performed under the auspices of the U.S. Dept. of Energy by the Los Alamos National Security, LLC Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.

  7. Py-SPHViewer: Cosmological simulations using Smoothed Particle Hydrodynamics

    NASA Astrophysics Data System (ADS)

    Benítez-Llambay, Alejandro

    2017-12-01

    Py-SPHViewer visualizes and explores N-body + Hydrodynamics simulations. The code interpolates the underlying density field (or any other property) traced by a set of particles, using the Smoothed Particle Hydrodynamics (SPH) interpolation scheme, thus producing not only beautiful but also useful scientific images. Py-SPHViewer enables the user to explore simulated volumes using different projections. Py-SPHViewer also provides a natural way to visualize (in a self-consistent fashion) gas dynamical simulations, which use the same technique to compute the interactions between particles.
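
    The interpolation scheme itself is compact; as an illustration of what the SPH interpolation above means (a sketch of the standard formula, not Py-SPHViewer's implementation), a smoothed density estimate at a point is the kernel-weighted sum over neighbouring particles, rho(x) = sum_j m_j W(|x - x_j|, h):

    ```python
    import numpy as np

    def cubic_spline_w(q, h):
        """Standard cubic-spline SPH kernel in 3D (support radius 2h), q = r/h."""
        sigma = 1.0 / (np.pi * h**3)
        return sigma * np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                                np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))

    def sph_density(x_eval, positions, masses, h):
        """rho(x) = sum_j m_j W(|x - x_j|, h) -- the interpolation underlying
        SPH visualization (illustrative sketch, not Py-SPHViewer code)."""
        q = np.linalg.norm(positions - x_eval, axis=1) / h
        return float(np.sum(masses * cubic_spline_w(q, h)))

    rng = np.random.default_rng(1)
    pos = rng.normal(size=(1000, 3))             # toy particle cloud
    m = np.full(1000, 1.0 / 1000)                # equal-mass particles
    print("density at origin:", sph_density(np.zeros(3), pos, m, h=0.3))
    ```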

  8. The SURFEXv7.2 land and ocean surface platform for coupled or offline simulation of earth surface variables and fluxes

    NASA Astrophysics Data System (ADS)

    Masson, V.; Le Moigne, P.; Martin, E.; Faroux, S.; Alias, A.; Alkama, R.; Belamari, S.; Barbu, A.; Boone, A.; Bouyssel, F.; Brousseau, P.; Brun, E.; Calvet, J.-C.; Carrer, D.; Decharme, B.; Delire, C.; Donier, S.; Essaouini, K.; Gibelin, A.-L.; Giordani, H.; Habets, F.; Jidane, M.; Kerdraon, G.; Kourzeneva, E.; Lafaysse, M.; Lafont, S.; Lebeaupin Brossier, C.; Lemonsu, A.; Mahfouf, J.-F.; Marguinaud, P.; Mokhtari, M.; Morin, S.; Pigeon, G.; Salgado, R.; Seity, Y.; Taillefer, F.; Tanguy, G.; Tulet, P.; Vincendon, B.; Vionnet, V.; Voldoire, A.

    2013-07-01

    SURFEX is a new externalized land and ocean surface platform that describes the surface fluxes and the evolution of four types of surfaces: nature, town, inland water and ocean. It is mostly based on pre-existing, well-validated scientific models that are continuously improved. The motivation for building SURFEX is to use strictly identical scientific models across a wide range of applications in order to mutualise the research and development efforts. SURFEX can be run in offline mode (0-D or 2-D runs) or in coupled mode (from mesoscale models to numerical weather prediction and climate models). An assimilation mode is included for numerical weather prediction and monitoring. In addition to momentum, heat and water fluxes, SURFEX is able to simulate fluxes of carbon dioxide, chemical species, continental aerosols, sea salt and snow particles. The main principles of the organisation of the surface are described first. Then, a survey is made of the scientific module (including the coupling strategy). Finally, the main applications of the code are summarised. The validation work undertaken shows that replacing the pre-existing surface models by SURFEX in these applications is usually associated with improved skill, as the numerous scientific developments contained in this community code are used to good advantage.

  9. Proposed standards for peer-reviewed publication of computer code

    USDA-ARS?s Scientific Manuscript database

    Computer simulation models are mathematical abstractions of physical systems. In the area of natural resources and agriculture, these physical systems encompass selected interacting processes in plants, soils, animals, or watersheds. These models are scientific products and have become important i...

  10. Modeling the source of GW150914 with targeted numerical-relativity simulations

    NASA Astrophysics Data System (ADS)

    Lovelace, Geoffrey; Lousto, Carlos O.; Healy, James; Scheel, Mark A.; Garcia, Alyssa; O'Shaughnessy, Richard; Boyle, Michael; Campanelli, Manuela; Hemberger, Daniel A.; Kidder, Lawrence E.; Pfeiffer, Harald P.; Szilágyi, Béla; Teukolsky, Saul A.; Zlochower, Yosef

    2016-12-01

    In fall of 2015, the two LIGO detectors measured the gravitational wave signal GW150914, which originated from a pair of merging black holes (Abbott et al Virgo, LIGO Scientific 2016 Phys. Rev. Lett. 116 061102). In the final 0.2 s (about 8 gravitational-wave cycles) before the amplitude reached its maximum, the observed signal swept up in amplitude and frequency, from 35 Hz to 150 Hz. The theoretical gravitational-wave signal for merging black holes, as predicted by general relativity, can be computed only by full numerical relativity, because analytic approximations fail near the time of merger. Moreover, the nearly-equal masses, moderate spins, and small number of orbits of GW150914 are especially straightforward and efficient to simulate with modern numerical-relativity codes. In this paper, we report the modeling of GW150914 with numerical-relativity simulations, using black-hole masses and spins consistent with those inferred from LIGO’s measurement (Abbott et al LIGO Scientific Collaboration, Virgo Collaboration 2016 Phys. Rev. Lett. 116 241102). In particular, we employ two independent numerical-relativity codes that use completely different analytical and numerical methods to model the same merging black holes and to compute the emitted gravitational waveform; we find excellent agreement between the waveforms produced by the two independent codes. These results demonstrate the validity, impact, and potential of current and future studies using rapid-response, targeted numerical-relativity simulations for better understanding gravitational-wave observations.

  11. Sustaining Open Source Communities through Hackathons - An Example from the ASPECT Community

    NASA Astrophysics Data System (ADS)

    Heister, T.; Hwang, L.; Bangerth, W.; Kellogg, L. H.

    2016-12-01

    The ecosystem surrounding a successful scientific open source software package combines both social and technical aspects. Much thought has been given to the technology side of writing sustainable software for large infrastructure projects and software libraries, but less to building the human capacity to perpetuate scientific software used in computational modeling. One effective format for building capacity is regular multi-day hackathons. Scientific hackathons bring together a group of science domain users and scientific software contributors to make progress on a specific software package. Innovation comes through the chance to work with established and new collaborations. Especially in domain sciences with small communities, hackathons give geographically distributed scientists an opportunity to connect face-to-face. They foster lively discussions amongst scientists with different expertise, promote new collaborations, and increase transparency in both the technical and scientific aspects of code development. ASPECT is an open source, parallel, extensible finite element code to simulate thermal convection, which began development in 2011 under the Computational Infrastructure for Geodynamics. ASPECT hackathons for the past 3 years have grown the number of authors to >50, training new code maintainers in the process. Hackathons begin with leaders establishing project-specific conventions for development, demonstrating the workflow for code contributions, and reviewing relevant technical skills. Each hackathon expands the developer community. Over 20 scientists add >6,000 lines of code during the >1 week event. Participants grow comfortable contributing to the repository and over half continue to contribute afterwards. A high return rate of participants ensures continuity and stability of the group as well as mentoring for novice members. We hope to build other software communities on this model, but anticipate that each will bring its own unique challenges.

  12. Software Engineering for Scientific Computer Simulations

    NASA Astrophysics Data System (ADS)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  13. Fast Acceleration of 2D Wave Propagation Simulations Using Modern Computational Accelerators

    PubMed Central

    Wang, Wei; Xu, Lifan; Cavazos, John; Huang, Howie H.; Kay, Matthew

    2014-01-01

    Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case-study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved more than speedup above the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation provided speedups on GPUs of at least faster than the sequential implementation and faster than a parallelized OpenMP implementation. An implementation of OpenMP on Intel MIC coprocessor provided speedups of with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other computational models of wave propagation in multi-dimensional media. PMID:24497950

  14. Radiative transfer codes for atmospheric correction and aerosol retrieval: intercomparison study.

    PubMed

    Kotchenova, Svetlana Y; Vermote, Eric F; Levy, Robert; Lyapustin, Alexei

    2008-05-01

    Results are summarized for a scientific project devoted to the comparison of four atmospheric radiative transfer codes incorporated into different satellite data processing algorithms, namely, 6SV1.1 (second simulation of a satellite signal in the solar spectrum, vector, version 1.1), RT3 (radiative transfer), MODTRAN (moderate resolution atmospheric transmittance and radiance code), and SHARM (spherical harmonics). The performance of the codes is tested against well-known benchmarks, such as Coulson's tabulated values and a Monte Carlo code. The influence of revealed differences on aerosol optical thickness and surface reflectance retrieval is estimated theoretically by using a simple mathematical approach. All information about the project can be found at http://rtcodes.ltdri.org.

  15. Radiative transfer codes for atmospheric correction and aerosol retrieval: intercomparison study

    NASA Astrophysics Data System (ADS)

    Kotchenova, Svetlana Y.; Vermote, Eric F.; Levy, Robert; Lyapustin, Alexei

    2008-05-01

    Results are summarized for a scientific project devoted to the comparison of four atmospheric radiative transfer codes incorporated into different satellite data processing algorithms, namely, 6SV1.1 (second simulation of a satellite signal in the solar spectrum, vector, version 1.1), RT3 (radiative transfer), MODTRAN (moderate resolution atmospheric transmittance and radiance code), and SHARM (spherical harmonics). The performance of the codes is tested against well-known benchmarks, such as Coulson's tabulated values and a Monte Carlo code. The influence of revealed differences on aerosol optical thickness and surface reflectance retrieval is estimated theoretically by using a simple mathematical approach. All information about the project can be found at http://rtcodes.ltdri.org.

  16. An Array Library for Microsoft SQL Server with Astrophysical Applications

    NASA Astrophysics Data System (ADS)

    Dobos, L.; Szalay, A. S.; Blakeley, J.; Falck, B.; Budavári, T.; Csabai, I.

    2012-09-01

    Today's scientific simulations produce output on the 10-100 TB scale. This unprecedented amount of data requires data handling techniques that are beyond what is used for ordinary files. Relational database systems have been successfully used to store and process scientific data, but the new requirements constantly generate new challenges. Moving terabytes of data among servers on a timely basis is a tough problem, even with the newest high-throughput networks. Thus, moving the computations as close to the data as possible and minimizing the client-server overhead are absolutely necessary. At least data subsetting and preprocessing have to be done inside the server process. Out-of-the-box commercial database systems perform very well in scientific applications from the perspective of data storage optimization, data retrieval, and memory management, but lack basic functionality like handling scientific data structures or enabling advanced math inside the database server. The most important gap in Microsoft SQL Server is the lack of a native array data type. Fortunately, the technology exists to extend the database server with custom-written code that enables us to address these problems. We present the prototype of a custom-built extension to Microsoft SQL Server that adds array handling functionality to the database system. With our Array Library, fixed-size arrays of all basic numeric data types can be created and manipulated efficiently. Also, the library is designed to integrate seamlessly with the most common math libraries, such as BLAS, LAPACK, FFTW, etc. With the help of these libraries, complex operations, such as matrix inversions or Fourier transformations, can be done on-the-fly, from SQL code, inside the database server process. We are currently testing the prototype with two different scientific data sets: the Indra cosmological simulation will use it to store particle and density data from N-body simulations, and the Milky Way Laboratory project will use it to store galaxy simulation data.

  17. Scientific Discovery through Advanced Computing in Plasma Science

    NASA Astrophysics Data System (ADS)

    Tang, William

    2005-03-01

    Advanced computing is generally recognized to be an increasingly vital tool for accelerating progress in scientific research during the 21st Century. For example, the Department of Energy's ``Scientific Discovery through Advanced Computing'' (SciDAC) Program was motivated in large measure by the fact that formidable scientific challenges in its research portfolio could best be addressed by utilizing the combination of the rapid advances in super-computing technology together with the emergence of effective new algorithms and computational methodologies. The imperative is to translate such progress into corresponding increases in the performance of the scientific codes used to model complex physical systems such as those encountered in high temperature plasma research. If properly validated against experimental measurements and analytic benchmarks, these codes can provide reliable predictive capability for the behavior of a broad range of complex natural and engineered systems. This talk reviews recent progress and future directions for advanced simulations with some illustrative examples taken from the plasma science applications area. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by the combination of access to powerful new computational resources together with innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning a huge range in time and space scales. In particular, the plasma science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of plasma turbulence in magnetically-confined high temperature plasmas. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to the computational science area.

  18. Soapy: an adaptive optics simulation written purely in Python for rapid concept development

    NASA Astrophysics Data System (ADS)

    Reeves, Andrew

    2016-07-01

    Soapy is a newly developed Adaptive Optics (AO) simulation which aims to be a flexible and fast-to-use tool-kit for many applications in the field of AO. It is written purely in the Python language, adding to and taking advantage of the already rich ecosystem of scientific libraries and programs. The simulation has been designed to be extremely modular, such that each component can be used stand-alone for projects which do not require a full end-to-end simulation. Ease of use, modularity and code clarity have been prioritised at the expense of computational performance. Though this means the code is not yet suitable for large studies of Extremely Large Telescope AO systems, it is well suited to education, exploration of new AO concepts and investigations of current generation telescopes.

  19. Effects of virtualization on a scientific application - Running a hyperspectral radiative transfer code on virtual machines.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tikotekar, Anand A; Vallee, Geoffroy R; Naughton III, Thomas J

    2008-01-01

    The topic of system-level virtualization has recently begun to receive interest for high performance computing (HPC). This is in part due to the isolation and encapsulation offered by the virtual machine. These traits enable applications to customize their environments and maintain consistent software configurations in their virtual domains. Additionally, there are mechanisms that can be used for fault tolerance, like live virtual machine migration. Given these attractive benefits of virtualization, a fundamental question arises: how does this affect my scientific application? We use this as the premise for our paper and observe a real-world scientific code running on a Xen virtual machine. We studied the effects of running a radiative transfer simulation, Hydrolight, on a virtual machine. We discuss our methodology and report observations regarding the usage of virtualization with this application.

  20. WFIRST: Data/Instrument Simulation Support at IPAC

    NASA Astrophysics Data System (ADS)

    Laine, Seppo; Akeson, Rachel; Armus, Lee; Bennett, Lee; Colbert, James; Helou, George; Kirkpatrick, J. Davy; Meshkat, Tiffany; Paladini, Roberta; Ramirez, Solange; Wang, Yun; Xie, Joan; Yan, Lin

    2018-01-01

    As part of WFIRST Science Center preparations, the IPAC Science Operations Center (ISOC) maintains a repository of 1) WFIRST data and instrument simulations, 2) tools to facilitate scientific performance and feasibility studies using WFIRST, and 3) parameters summarizing the current design and predicted performance of the WFIRST telescope and instruments. The simulation repository provides access for the science community to simulation code, tools, and resulting analyses. Examples of simulation code with ISOC-built web-based interfaces include EXOSIMS (for estimating exoplanet yields in CGI surveys) and the Galaxy Survey Exposure Time Calculator. In the future the repository will provide an interface for users to run custom simulations of a wide range of coronagraph instrument (CGI) observations, as well as sophisticated tools for designing microlensing experiments. We encourage those who are generating simulations or writing tools for exoplanet observations with WFIRST to contact the ISOC team so we can work with you to bring these to the attention of the broader astronomical community as we prepare for the exciting science that will be enabled by WFIRST.

  1. Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models

    NASA Astrophysics Data System (ADS)

    Pallant, Amy; Lee, Hee-Sun

    2015-04-01

    Modeling and argumentation are two important scientific practices students need to develop throughout their school years. In this paper, we investigated how middle and high school students (N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation tasks with three increasingly complex dynamic climate models. Each scientific argumentation task consisted of four parts: a multiple-choice claim, an open-ended explanation, a five-point Likert-scale uncertainty rating, and an open-ended uncertainty rationale. We coded 1,294 scientific arguments in terms of each claim's consistency with current scientific consensus and whether explanations were model-based or knowledge-based, and we categorized the sources of uncertainty (personal vs. scientific). We used chi-square and ANOVA tests to identify significant patterns. Results indicate that (1) a majority of students incorporated models as evidence to support their claims, (2) most students used model output results shown on graphs to confirm their claim rather than to explain simulated molecular processes, (3) students' dependence on model results and their uncertainty rating diminished as the dynamic climate models became more and more complex, (4) some students' misconceptions interfered with observing and interpreting model results or simulated processes, and (5) students' uncertainty sources reflected more frequently their assessment of personal knowledge or abilities related to the tasks than their critical examination of scientific evidence resulting from models. These findings have implications for teaching and research related to the integration of scientific argumentation and modeling practices to address complex Earth systems.

  2. A domain specific language for performance portable molecular dynamics algorithms

    NASA Astrophysics Data System (ADS)

    Saunders, William Robert; Grant, James; Müller, Eike Hermann

    2018-03-01

    Developers of Molecular Dynamics (MD) codes face significant challenges when adapting existing simulation packages to new hardware. In a continuously diversifying hardware landscape it becomes increasingly difficult for scientists to be experts both in their own domain (physics/chemistry/biology) and specialists in the low level parallelisation and optimisation of their codes. To address this challenge, we describe a "Separation of Concerns" approach for the development of parallel and optimised MD codes: the science specialist writes code at a high abstraction level in a domain specific language (DSL), which is then translated into efficient computer code by a scientific programmer. In a related context, an abstraction for the solution of partial differential equations with grid based methods has recently been implemented in the (Py)OP2 library. Inspired by this approach, we develop a Python code generation system for molecular dynamics simulations on different parallel architectures, including massively parallel distributed memory systems and GPUs. We demonstrate the efficiency of the auto-generated code by studying its performance and scalability on different hardware and compare it to other state-of-the-art simulation packages. With growing data volumes the extraction of physically meaningful information from the simulation becomes increasingly challenging and requires equally efficient implementations. A particular advantage of our approach is the easy expression of such analysis algorithms. We consider two popular methods for deducing the crystalline structure of a material from the local environment of each atom, show how they can be expressed in our abstraction and implement them in the code generation framework.
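
    The following Python sketch (ours, not code from the paper or its DSL) illustrates the separation-of-concerns idea described above: the scientist supplies only a per-pair kernel, and a separate executor decides how the loop over particle pairs is carried out; in the real system that executor would be generated, optimised code targeting MPI or GPUs. The function and parameter names are illustrative assumptions.

```python
# A minimal sketch (not the authors' actual DSL) of "separation of concerns":
# the physics lives in a per-pair kernel; the executor owns the looping
# strategy and could be swapped for an MPI- or GPU-backed backend.
import numpy as np

def lennard_jones_kernel(r_ij, params):
    """Per-pair energy: no parallelisation details appear here."""
    sr6 = (params["sigma"] / r_ij) ** 6
    return 4.0 * params["epsilon"] * (sr6 ** 2 - sr6)

def execute_pairwise(positions, kernel, params):
    """Naive O(N^2) executor standing in for the generated parallel code."""
    n = len(positions)
    energy = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            r_ij = np.linalg.norm(positions[i] - positions[j])
            energy += kernel(r_ij, params)
    return energy

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pos = rng.random((16, 3)) * 5.0          # 16 particles in a 5x5x5 box
    print(execute_pairwise(pos, lennard_jones_kernel,
                           {"sigma": 1.0, "epsilon": 1.0}))
```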

  3. Toward Transparent Data Management in Multi-layer Storage Hierarchy for HPC Systems

    DOE PAGES

    Wadhwa, Bharti; Byna, Suren; Butt, Ali R.

    2018-04-17

    Upcoming exascale high performance computing (HPC) systems are expected to comprise a multi-tier storage hierarchy, and thus will necessitate innovative storage and I/O mechanisms. Traditional disk- and block-based interfaces and file systems face severe challenges in utilizing the capabilities of storage hierarchies due to the lack of hierarchy support and semantic interfaces. Object-based and semantically rich data abstractions for scientific data management on large-scale systems offer a sustainable solution to these challenges. Such data abstractions can also simplify users' involvement in data movement. Here, we take the first steps of realizing such an object abstraction and explore storage mechanisms for these objects to enhance I/O performance, especially for scientific applications. We explore how an object-based interface can facilitate next-generation scalable computing systems by presenting the mapping of data I/O from two real-world HPC scientific use cases: a plasma physics simulation code (VPIC) and a cosmology simulation code (HACC). Our storage model stores data objects in different physical organizations to support data movement across layers of the memory/storage hierarchy. Our implementation scales well to 16K parallel processes, and compared to the state of the art, such as MPI-IO and HDF5, our object-based data abstractions and data placement strategy in a multi-level storage hierarchy achieve up to 7x I/O performance improvement for scientific data.
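
    As a rough illustration of the object abstraction idea, and not the authors' implementation or API, the sketch below hides the storage tier behind a small Python object: a crude size-based policy decides whether an array lives in memory or is spilled to a file standing in for a burst buffer or parallel file system. All names and the placement threshold are invented for the example.

```python
# Toy object abstraction: callers read named data objects without knowing
# which storage tier actually holds the bytes.
import numpy as np
import tempfile, os

class DataObject:
    MEMORY_LIMIT = 1 << 20  # 1 MiB: crude placement policy for illustration

    def __init__(self, name, array):
        self.name = name
        if array.nbytes <= self.MEMORY_LIMIT:
            self.tier, self._mem = "memory", array
        else:
            self.tier, self._mem = "file", None
            self._path = os.path.join(tempfile.gettempdir(), name + ".npy")
            np.save(self._path, array)   # stands in for a lower storage tier

    def read(self):
        # The caller never needs to know where the object actually lives.
        return self._mem if self.tier == "memory" else np.load(self._path)

if __name__ == "__main__":
    small = DataObject("field_small", np.zeros(100))
    large = DataObject("field_large", np.zeros(1_000_000))
    print(small.tier, large.tier, large.read().shape)
```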

  4. Toward Transparent Data Management in Multi-layer Storage Hierarchy for HPC Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wadhwa, Bharti; Byna, Suren; Butt, Ali R.

    Upcoming exascale high performance computing (HPC) systems are expected to comprise a multi-tier storage hierarchy, and thus will necessitate innovative storage and I/O mechanisms. Traditional disk- and block-based interfaces and file systems face severe challenges in utilizing the capabilities of storage hierarchies due to the lack of hierarchy support and semantic interfaces. Object-based and semantically rich data abstractions for scientific data management on large-scale systems offer a sustainable solution to these challenges. Such data abstractions can also simplify users' involvement in data movement. Here, we take the first steps of realizing such an object abstraction and explore storage mechanisms for these objects to enhance I/O performance, especially for scientific applications. We explore how an object-based interface can facilitate next-generation scalable computing systems by presenting the mapping of data I/O from two real-world HPC scientific use cases: a plasma physics simulation code (VPIC) and a cosmology simulation code (HACC). Our storage model stores data objects in different physical organizations to support data movement across layers of the memory/storage hierarchy. Our implementation scales well to 16K parallel processes, and compared to the state of the art, such as MPI-IO and HDF5, our object-based data abstractions and data placement strategy in a multi-level storage hierarchy achieve up to 7x I/O performance improvement for scientific data.

  5. TADSim: Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics

    DOE PAGES

    Mniszewski, Susan M.; Junghans, Christoph; Voter, Arthur F.; ...

    2015-04-16

    Next-generation high-performance computing will require more scalable and flexible performance prediction tools to evaluate software-hardware co-design choices relevant to scientific applications and hardware architectures. Here, we present a new class of tools called application simulators—parameterized fast-running proxies of large-scale scientific applications using parallel discrete event simulation. Parameterized choices for the algorithmic method and hardware options provide a rich space for design exploration and allow us to quickly find well-performing software-hardware combinations. We demonstrate our approach with a TADSim simulator that models the temperature-accelerated dynamics (TAD) method, an algorithmically complex and parameter-rich member of the accelerated molecular dynamics (AMD) family of molecular dynamics methods. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We accomplish this by identifying the time-intensive elements, quantifying algorithm steps in terms of those elements, abstracting them out, and replacing them by the passage of time. We use TADSim to quickly characterize the runtime performance and algorithmic behavior for the otherwise long-running simulation code. We extend TADSim to model algorithm extensions, such as speculative spawning of the compute-bound stages, and predict performance improvements without having to implement such a method. Validation against the actual TAD code shows close agreement for the evolution of an example physical system, a silver surface. Finally, focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights and suggested extensions.
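
    A minimal sketch of the application-simulator idea, assuming nothing about TADSim's internals: compute-bound stages are never executed, only their parameterized durations advance a virtual clock in a small discrete-event loop. The stage names and mean costs below are placeholders.

```python
# Discrete-event proxy: the "work" of each stage is replaced by the passage
# of (virtual) time drawn from a parameterised distribution.
import heapq, random

def simulate(stage_means, n_cycles, seed=1):
    """Advance a virtual clock through the stages of each cycle."""
    random.seed(seed)
    clock = 0.0
    event_queue = []
    # Schedule the stages of every cycle back to back; each duration is drawn
    # from an exponential distribution with the stage's mean cost.
    for cycle in range(n_cycles):
        for stage, mean in stage_means.items():
            duration = random.expovariate(1.0 / mean)
            heapq.heappush(event_queue, (clock, cycle, stage, duration))
            clock += duration
    # Replay events in time order; this queue is where speculative or
    # overlapping stages would be interleaved in a richer model.
    end_time = 0.0
    while event_queue:
        start, cycle, stage, duration = heapq.heappop(event_queue)
        end_time = max(end_time, start + duration)
    return end_time

if __name__ == "__main__":
    stages = {"basin MD": 5.0, "saddle search": 20.0, "accept/reject": 0.1}
    print("predicted virtual runtime:", simulate(stages, n_cycles=10))
```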

  6. Master of Puppets: Cooperative Multitasking for In Situ Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Lukic, Zarija

    2016-01-01

    Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. Here, we present a novel design for running multiple codes in situ: using coroutines and position-independent executables we enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. We present Henson, an implementation of our design, and illustrate its versatility by tackling analysis tasks with different computational requirements. This design differs significantly from the existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The techniques we present can also be integrated into other in situ frameworks.
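
    The control flow described above can be mimicked in a few lines of Python with generators, although Henson itself couples compiled executables through coroutines rather than Python code; this is only a sketch of the simulation/analysis hand-off, with hypothetical function names.

```python
# Cooperative multitasking sketch: the simulation yields control (and its
# current state) to the analysis at every step, so no data touches disk.
import numpy as np

def simulation(n_steps, n_particles=1000, seed=0):
    """Yield the current state to the consumer after each step."""
    rng = np.random.default_rng(seed)
    positions = rng.random((n_particles, 3))
    for step in range(n_steps):
        positions += 0.01 * rng.standard_normal(positions.shape)
        yield step, positions          # in situ hand-off

def analysis(stream):
    for step, positions in stream:
        print(f"step {step}: mean |x| = {np.abs(positions).mean():.3f}")

if __name__ == "__main__":
    analysis(simulation(n_steps=5))
```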

  7. Henson v1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Lukic, Zarija

    2016-04-01

    Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. The developers present a novel design for running multiple codes in situ: using coroutines and position-independent executables they enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. They present Henson, an implementation of their design, and illustrate its versatility by tackling analysis tasks with different computational requirements. This design differs significantly from the existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The presented techniques can also be integrated into other in situ frameworks.

  8. Modeling Cometary Coma with a Three Dimensional, Anisotropic Multiple Scattering Distributed Processing Code

    NASA Technical Reports Server (NTRS)

    Luchini, Chris B.

    1997-01-01

    Development of camera and instrument simulations for space exploration requires the development of scientifically accurate models of the objects to be studied. Several planned cometary missions have prompted the development of a three dimensional, multi-spectral, anisotropic multiple scattering model of cometary coma.

  9. Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sussman, Alan

    2014-10-21

    This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.

  10. Development of an object-oriented finite element program: application to metal-forming and impact simulations

    NASA Astrophysics Data System (ADS)

    Pantale, O.; Caperaa, S.; Rakotomalala, R.

    2004-07-01

    During the last 50 years, the development of better numerical methods and more powerful computers has been a major enterprise for the scientific community. At the same time, the finite element method has become a widely used tool for researchers and engineers. Recent advances in computational software have made it possible to solve more physical and complex problems such as coupled problems, nonlinearities, and high-strain and high-strain-rate problems. In this field, an accurate analysis of large-deformation inelastic problems occurring in metal-forming or impact simulations is extremely important as a consequence of the large amount of plastic flow. In this paper, we present the object-oriented implementation, using the C++ language, of an explicit finite element code called DynELA. Object-oriented programming (OOP) leads to better-structured codes for the finite element method and facilitates the development, maintainability, and expandability of such codes. The most significant advantage of OOP is in the modeling of complex physical systems such as deformation processing, where the overall complex problem is partitioned into individual sub-problems based on physical, mathematical, or geometric reasoning. We first focus on the advantages of OOP for the development of scientific programs. Specific aspects of OOP, such as the inheritance mechanism, operator overloading, and the use of template classes, are detailed. Then we present the approach used for the development of our finite element code through the presentation of the kinematics, conservative and constitutive laws, and their respective implementation in C++. Finally, the efficiency and accuracy of our finite element program are investigated using a number of benchmark tests relative to metal forming and impact simulations.
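
    DynELA is a C++ code; the short Python sketch below only illustrates the inheritance pattern the abstract highlights, in which a base element class fixes the interface and each concrete element type overrides the parts that differ (here a hypothetical two-node bar element).

```python
# Inheritance/polymorphism sketch for a finite element class hierarchy.
import numpy as np

class Element:
    def __init__(self, node_coords):
        self.nodes = np.asarray(node_coords, dtype=float)

    def stiffness(self):
        raise NotImplementedError("each element type supplies its own matrix")

class Bar1D(Element):
    """Two-node bar element: the simplest concrete specialisation."""
    def __init__(self, node_coords, young_modulus, area):
        super().__init__(node_coords)
        self.E, self.A = young_modulus, area

    def stiffness(self):
        L = abs(self.nodes[1] - self.nodes[0])
        k = self.E * self.A / L
        return np.array([[k, -k], [-k, k]])

if __name__ == "__main__":
    elements = [Bar1D([0.0, 1.0], 210e9, 1e-4), Bar1D([1.0, 2.5], 210e9, 1e-4)]
    for e in elements:                      # polymorphic call
        print(e.stiffness())
```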

  11. Enhanced Verification Test Suite for Physics Simulation Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, J R; Brock, J S; Brandon, S T

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and the technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) hydrodynamics; (b) transport processes; and (c) dynamic strength of materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary, but not sufficient, step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of greater sophistication or other physics regimes (e.g., energetic material response, magneto-hydrodynamics), would represent a scientifically desirable complement to the fundamental test cases discussed in this report. The authors believe that this document can be used to enhance the verification analyses undertaken at the DOE WP Laboratories and, thus, to improve the quality, credibility, and usefulness of the simulation codes that are analyzed with these problems.
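
    To make the notion of an acceptance criterion concrete, here is a hedged toy example, not one of the report's benchmark problems: the observed order of convergence of a simple discretization is measured against an exact solution and checked against its design order.

```python
# Verification-style check: measure the error of a first-order forward
# difference against the exact derivative on refined grids and confirm the
# observed order of accuracy is close to 1.
import numpy as np

def forward_difference_error(n):
    """Error of a first-order derivative approximation of sin(x) at x = 1."""
    h = 1.0 / n
    approx = (np.sin(1.0 + h) - np.sin(1.0)) / h
    return abs(approx - np.cos(1.0))

if __name__ == "__main__":
    errors = [forward_difference_error(n) for n in (10, 20, 40, 80)]
    orders = [np.log2(errors[i] / errors[i + 1]) for i in range(len(errors) - 1)]
    print("observed orders:", [round(p, 2) for p in orders])
    assert all(abs(p - 1.0) < 0.1 for p in orders), "acceptance criterion failed"
```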

  12. RELAP-7 Software Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on INL's modern scientific software development framework, MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's capability and extends the analysis capability for all reactor system simulation scenarios.

  13. RELAP-7 Closure Correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou, Ling; Berry, R. A.; Martineau, R. C.

    The RELAP-7 code is the next generation nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). The code is based on INL's modern scientific software development framework, MOOSE (Multi-Physics Object Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's and TRACE's capabilities and extends their analysis capabilities for all reactor system simulation scenarios. The RELAP-7 code utilizes the well-posed 7-equation two-phase flow model for compressible two-phase flow. Closure models used in the TRACE code have been reviewed and selected to reflect the progress made during the past decades and to provide a basis for the closure correlations implemented in the RELAP-7 code. This document provides a summary of the closure correlations that are currently implemented in the RELAP-7 code. The closure correlations include sub-grid models that describe interactions between the fluids and the flow channel, and interactions between the two phases.

  14. Center for Technology for Advanced Scientific Component Software (TASCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damevski, Kostadin

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit the unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  15. Statistical Validation of a New Python-based Military Workforce Simulation Model

    DTIC Science & Technology

    2014-12-30

    also having a straightforward syntax that is accessible to non-programmers. Furthermore, it is supported by an impressive variety of scientific... accessed by a given element of model logic or line of code. For example, in Arena, data arrays, queues and the simulation clock are part of the...global scope and are therefore accessible anywhere in the model. The disadvantage of scopes is that all names in a scope must be unique. If more than

  16. Simulating a transmon implementation of the surface code, Part II

    NASA Astrophysics Data System (ADS)

    O'Brien, Thomas; Tarasinski, Brian; Rol, Adriaan; Bultink, Niels; Fu, Xiang; Criger, Ben; Dicarlo, Leonardo

    The majority of quantum error correcting circuit simulations use Pauli error channels, as they can be efficiently calculated. This raises two questions: what is the effect of more complicated physical errors on the logical qubit error rate, and how much more efficient can decoders become when accounting for realistic noise? To answer these questions, we design a minimal weight perfect matching decoder parametrized by a physically motivated noise model and test it on the full density matrix simulation of Surface-17, a distance-3 surface code. We compare performance against other decoders for a range of physical parameters. Particular attention is paid to realistic sources of error for transmon qubits in a circuit QED architecture, and to the requirements for real-time decoding via an FPGA. Research funded by the Foundation for Fundamental Research on Matter (FOM), the Netherlands Organization for Scientific Research (NWO/OCW), IARPA, an ERC Synergy Grant, the China Scholarship Council, and Intel Corporation.
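
    As a rough sketch of the matching step such a decoder performs, and not the authors' decoder or noise model, the snippet below pairs up syndrome defects with minimum total distance by feeding negated weights to networkx's maximum-weight matching; the defect coordinates are invented.

```python
# Minimum-weight perfect matching of syndrome defects on a toy lattice.
import itertools
import networkx as nx

def match_defects(defects):
    """defects: list of (row, col) positions of flagged stabilisers."""
    g = nx.Graph()
    for (i, a), (j, b) in itertools.combinations(enumerate(defects), 2):
        dist = abs(a[0] - b[0]) + abs(a[1] - b[1])   # Manhattan distance
        g.add_edge(i, j, weight=-dist)               # negate: min-weight match
    return nx.max_weight_matching(g, maxcardinality=True)

if __name__ == "__main__":
    defects = [(0, 1), (0, 2), (3, 3), (2, 3)]
    # Expected pairing: defects 0-1 and 2-3 (up to ordering of the tuples).
    print(match_defects(defects))
```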

  17. GPU Particle Tracking and MHD Simulations with Greatly Enhanced Computational Speed

    NASA Astrophysics Data System (ADS)

    Ziemba, T.; O'Donnell, D.; Carscadden, J.; Cash, M.; Winglee, R.; Harnett, E.

    2008-12-01

    GPUs are intrinsically highly parallelized systems that provide more than an order of magnitude greater computing speed than CPU-based systems, for less cost than a high-end workstation. Recent advancements in GPU technologies allow for full IEEE float specifications with performance up to several hundred GFLOPs per GPU, and new software architectures have recently become available to ease the transition from graphics-based to scientific applications. This allows for a cheap alternative to standard supercomputing methods and should decrease the time to discovery. 3-D particle tracking and MHD codes have been developed using NVIDIA's CUDA and have demonstrated a speed-up of nearly a factor of 20 over equivalent CPU versions of the codes. Such a speed-up enables new applications, including real-time running of radiation belt simulations and real-time running of global magnetospheric simulations, both of which could provide important space weather prediction tools.
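
    The codes described above were written with NVIDIA's CUDA; the numpy sketch below only conveys the data-parallel structure of a particle push, one update applied to every particle at once, which is what maps naturally onto a GPU. The field, charge-to-mass ratio, and time step are arbitrary.

```python
# Data-parallel particle push: every particle is updated by the same rule.
import numpy as np

def push(positions, velocities, b_field, q_over_m, dt):
    """Simple (non-relativistic) Lorentz-force update for all particles."""
    accel = q_over_m * np.cross(velocities, b_field)
    velocities = velocities + accel * dt
    positions = positions + velocities * dt
    return positions, velocities

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pos = rng.random((100000, 3))
    vel = rng.standard_normal((100000, 3))
    B = np.array([0.0, 0.0, 1.0])          # uniform field along z
    for _ in range(10):
        pos, vel = push(pos, vel, B, q_over_m=1.0, dt=1e-3)
    print(pos.mean(axis=0))
```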

  18. Research Prototype: Automated Analysis of Scientific and Engineering Semantics

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.; Follen, Greg (Technical Monitor)

    2001-01-01

    Physical and mathematical formulae and concepts are fundamental elements of scientific and engineering software. These classical equations and methods are time tested, universally accepted, and relatively unambiguous. The existence of this classical ontology suggests an ideal problem for automated comprehension. This problem is further motivated by the pervasive use of scientific code and high code development costs. To investigate code comprehension in this classical knowledge domain, a research prototype has been developed. The prototype incorporates scientific domain knowledge to recognize code properties (including units and physical and mathematical quantities). Also, the procedure implements programming language semantics to propagate these properties through the code. This prototype's ability to elucidate code and detect errors will be demonstrated with state-of-the-art scientific codes.

  19. An Open Simulation System Model for Scientific Applications

    NASA Technical Reports Server (NTRS)

    Williams, Anthony D.

    1995-01-01

    A model for a generic and open environment for running multi-code or multi-application simulations, called the Open Simulation System Model (OSSM), is proposed and defined. This model attempts to meet the requirements of complex systems like the Numerical Propulsion System Simulation (NPSS). OSSM places no restrictions on the types of applications that can be integrated at any stage of its evolution. This includes applications of different disciplines, fidelities, etc. An implementation strategy is proposed that starts with a basic prototype and evolves over time to accommodate an increasing number of applications. Potential (standard) software is also identified which may aid in the design and implementation of the system.

  20. The brian simulator.

    PubMed

    Goodman, Dan F M; Brette, Romain

    2009-09-01

    "Brian" is a simulator for spiking neural networks (http://www.briansimulator.org). The focus is on making the writing of simulation code as quick and easy as possible for the user, and on flexibility: new and non-standard models are no more difficult to define than standard ones. This allows scientists to spend more time on the details of their models, and less on their implementation. Neuron models are defined by writing differential equations in standard mathematical notation, facilitating scientific communication. Brian is written in the Python programming language, and uses vector-based computation to allow for efficient simulations. It is particularly useful for neuroscientific modelling at the systems level, and for teaching computational neuroscience.

  1. Idea Paper: The Lifecycle of Software for Scientific Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubey, Anshu; McInnes, Lois C.

    The software lifecycle is a well-researched topic that has produced many models to meet the needs of different types of software projects. However, one class of projects, software development for scientific computing, has received relatively little attention from lifecycle researchers. In particular, software for end-to-end computations for obtaining scientific results has received few lifecycle proposals and no formalization of a development model. An examination of development approaches employed by the teams implementing large multicomponent codes reveals a great deal of similarity in their strategies. This idea paper formalizes these related approaches into a lifecycle model for end-to-end scientific application software, featuring loose coupling between submodels for development of infrastructure and scientific capability. We also invite input from stakeholders to converge on a model that captures the complexity of this development process and provides needed lifecycle guidance to the scientific software community.

  2. Maestro and Castro: Simulation Codes for Astrophysical Flows

    NASA Astrophysics Data System (ADS)

    Zingale, Michael; Almgren, Ann; Beckner, Vince; Bell, John; Friesen, Brian; Jacobs, Adam; Katz, Maximilian P.; Malone, Christopher; Nonaka, Andrew; Zhang, Weiqun

    2017-01-01

    Stellar explosions are multiphysics problems—modeling them requires the coordinated input of gravity solvers, reaction networks, radiation transport, and hydrodynamics together with microphysics recipes to describe the physics of matter under extreme conditions. Furthermore, these models involve following a wide range of spatial and temporal scales, which puts tough demands on simulation codes. We developed the codes Maestro and Castro to meet the computational challenges of these problems. Maestro uses a low Mach number formulation of the hydrodynamics to efficiently model convection. Castro solves the fully compressible radiation hydrodynamics equations to capture the explosive phases of stellar phenomena. Both codes are built upon the BoxLib adaptive mesh refinement library, which prepares them for next-generation exascale computers. Common microphysics shared between the codes allows us to transfer a problem from the low Mach number regime in Maestro to the explosive regime in Castro. Importantly, both codes are freely available (https://github.com/BoxLib-Codes). We will describe the design of the codes and some of their science applications, as well as future development directions. Support for development was provided by NSF award AST-1211563 and DOE/Office of Nuclear Physics grant DE-FG02-87ER40317 to Stony Brook and by the Applied Mathematics Program of the DOE Office of Advanced Scientific Computing Research under US DOE contract DE-AC02-05CH11231 to LBNL.

  3. Computational Infrastructure for Geodynamics (CIG)

    NASA Astrophysics Data System (ADS)

    Gurnis, M.; Kellogg, L. H.; Bloxham, J.; Hager, B. H.; Spiegelman, M.; Willett, S.; Wysession, M. E.; Aivazis, M.

    2004-12-01

    Solid earth geophysicists have a long tradition of writing scientific software to address a wide range of problems. In particular, computer simulations came into wide use in geophysics during the decade after the plate tectonic revolution. Solution schemes and numerical algorithms that developed in other areas of science, most notably engineering, fluid mechanics, and physics, were adapted with considerable success to geophysics. This software has largely been the product of individual efforts and although this approach has proven successful, its strength for solving problems of interest is now starting to show its limitations as we try to share codes and algorithms or when we want to recombine codes in novel ways to produce new science. With funding from the NSF, the US community has embarked on a Computational Infrastructure for Geodynamics (CIG) that will develop, support, and disseminate community-accessible software for the greater geodynamics community from model developers to end-users. The software is being developed for problems involving mantle and core dynamics, crustal and earthquake dynamics, magma migration, seismology, and other related topics. With a high level of community participation, CIG is leveraging state-of-the-art scientific computing into a suite of open-source tools and codes. The infrastructure that we are now starting to develop will consist of: (a) a coordinated effort to develop reusable, well-documented and open-source geodynamics software; (b) the basic building blocks - an infrastructure layer - of software by which state-of-the-art modeling codes can be quickly assembled; (c) extension of existing software frameworks to interlink multiple codes and data through a superstructure layer; (d) strategic partnerships with the larger world of computational science and geoinformatics; and (e) specialized training and workshops for both the geodynamics and broader Earth science communities. The CIG initiative has already started to leverage and develop long-term strategic partnerships with open source development efforts within the larger thrusts of scientific computing and geoinformatics. These strategic partnerships are essential as the frontier has moved into multi-scale and multi-physics problems in which many investigators now want to use simulation software for data interpretation, data assimilation, and hypothesis testing.

  4. An Experiment in Scientific Program Understanding

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.; Owen, Karl (Technical Monitor)

    2000-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. Results are shown for three intensively studied codes and seven blind test cases; all test cases are state of the art scientific codes. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.
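
    A toy version of one such semantic property, far simpler than the prototype described above: units declared for a few primitive variables are propagated through assignments by combining exponent dictionaries. The variable names and the two propagation rules are illustrative only; real semantic analysis would parse the user's source code rather than being written inline like this.

```python
# Toy propagation of physical units (dicts of exponents) through assignments.
def mul(u, v):
    return {d: u.get(d, 0) + v.get(d, 0) for d in set(u) | set(v)}

def div(u, v):
    return {d: u.get(d, 0) - v.get(d, 0) for d in set(u) | set(v)}

# Semantic declarations for primitive variables (exponents of metre, second).
units = {"distance": {"m": 1}, "time": {"s": 1}}

# Propagation through:  velocity = distance / time
#                       spec_energy = velocity * velocity
units["velocity"] = div(units["distance"], units["time"])
units["spec_energy"] = mul(units["velocity"], units["velocity"])

print("velocity:", units["velocity"])        # metre^1 second^-1
print("spec_energy:", units["spec_energy"])  # metre^2 second^-2
```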

  5. PlasmaPy: initial development of a Python package for plasma physics

    NASA Astrophysics Data System (ADS)

    Murphy, Nicholas; Leonard, Andrew J.; Stańczak, Dominik; Haggerty, Colby C.; Parashar, Tulasi N.; Huang, Yu-Min; PlasmaPy Community

    2017-10-01

    We report on initial development of PlasmaPy: an open source community-driven Python package for plasma physics. PlasmaPy seeks to provide core functionality that is needed for the formation of a fully open source Python ecosystem for plasma physics. PlasmaPy prioritizes code readability, consistency, and maintainability while using best practices for scientific computing such as version control, continuous integration testing, embedding documentation in code, and code review. We discuss our current and planned capabilities, including features presently under development. The development roadmap includes features such as fluid and particle simulation capabilities, a Grad-Shafranov solver, a dispersion relation solver, atomic data retrieval methods, and tools to analyze simulations and experiments. We describe several ways to contribute to PlasmaPy. PlasmaPy has a code of conduct and is being developed under a BSD license, with a version 0.1 release planned for 2018. The success of PlasmaPy depends on active community involvement, so anyone interested in contributing to this project should contact the authors. This work was partially supported by the U.S. Department of Energy.

  6. A practical guide to replica-exchange Wang-Landau simulations

    NASA Astrophysics Data System (ADS)

    Vogel, Thomas; Li, Ying Wai; Landau, David P.

    2018-04-01

    This paper is based on a series of tutorial lectures about the replica-exchange Wang-Landau (REWL) method given at the IX Brazilian Meeting on Simulational Physics (BMSP 2017). It provides a practical guide for the implementation of the method. A complete example code for a model system is available online. In this paper, we discuss the main parallel features of this code after a brief introduction to the REWL algorithm. The tutorial section is mainly directed at users who have written a single-walker Wang–Landau program already but might have just taken their first steps in parallel programming using the Message Passing Interface (MPI). In the last section, we answer “frequently asked questions” from users about the implementation of REWL for different scientific problems.
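
    For readers who have not yet written the single-walker version the tutorial assumes, here is a compact plain-Python Wang-Landau sketch for a small 1D Ising chain (no MPI); the replica-exchange method in the paper runs many such walkers in overlapping energy windows and exchanges configurations between them. The flatness check is omitted for brevity.

```python
# Single-walker Wang-Landau estimate of the density of states of a 1D Ising chain.
import math, random

L = 8                                          # chain length, periodic bounds
def energy(s):
    return -sum(s[i] * s[(i + 1) % L] for i in range(L))

random.seed(2)
spins = [random.choice([-1, 1]) for _ in range(L)]
log_g, hist = {}, {}                           # log density of states, visit counts
f = 1.0                                        # modification factor (log scale)

while f > 1e-4:
    for _ in range(10000):
        i = random.randrange(L)
        e_old = energy(spins)
        spins[i] *= -1                         # propose a single spin flip
        e_new = energy(spins)
        # Accept with probability min(1, g(E_old)/g(E_new)).
        delta = log_g.get(e_old, 0.0) - log_g.get(e_new, 0.0)
        if random.random() >= math.exp(min(0.0, delta)):
            spins[i] *= -1                     # reject: undo the flip
            e_new = e_old
        log_g[e_new] = log_g.get(e_new, 0.0) + f
        hist[e_new] = hist.get(e_new, 0) + 1
    f, hist = f / 2.0, {}                      # halve f (flatness check omitted)

print(sorted((E, round(lg, 1)) for E, lg in log_g.items()))
```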

  7. Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics

    NASA Astrophysics Data System (ADS)

    Junghans, Christoph; Mniszewski, Susan; Voter, Arthur; Perez, Danny; Eidenbenz, Stephan

    2014-03-01

    We present an example of a new class of tools that we call application simulators, parameterized fast-running proxies of large-scale scientific applications using parallel discrete event simulation (PDES). We demonstrate our approach with a TADSim application simulator that models the Temperature Accelerated Dynamics (TAD) method, which is an algorithmically complex member of the Accelerated Molecular Dynamics (AMD) family. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We use TADSim to quickly characterize the runtime performance and algorithmic behavior for the otherwise long-running simulation code. We further extend TADSim to model algorithm extensions to standard TAD, such as speculative spawning of the compute-bound stages of the algorithm, and predict performance improvements without having to implement such a method. Focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights into the TAD algorithm behavior and suggested extensions to the TAD method.

  8. Software and the Scientist: Coding and Citation Practices in Geodynamics

    NASA Astrophysics Data System (ADS)

    Hwang, Lorraine; Fish, Allison; Soito, Laura; Smith, MacKenzie; Kellogg, Louise H.

    2017-11-01

    In geodynamics as in other scientific areas, computation has become a core component of research, complementing field observation, laboratory analysis, experiment, and theory. Computational tools for data analysis, mapping, visualization, modeling, and simulation are essential for all aspects of the scientific workflow. Specialized scientific software is often developed by geodynamicists for their own use, and this effort represents a distinctive intellectual contribution. Drawing on a geodynamics community that focuses on developing and disseminating scientific software, we assess the current practices of software development and attribution, as well as attitudes about the need and best practices for software citation. We analyzed publications by participants in the Computational Infrastructure for Geodynamics and conducted mixed-method surveys of the solid earth geophysics community. From this we learned that coding skills are typically learned informally. Participants considered good code to be trusted, reusable, readable, and not overly complex, and considered a good coder to be one who participates in the community in an open and reasonable manner, contributing to both long- and short-term community projects. Participants strongly supported citing software, as reflected by the high rate at which software packages were named in the literature and the high rate of citations in the references. However, clear instructions from developers on how to cite, and education of users on what to cite, are lacking. In addition, citations did not always lead to discoverability of the resource. A unique identifier for the software package itself, community education, and citation tools would contribute to better attribution practices.

  9. Warp-X: A new exascale computing platform for beam–plasma simulations

    DOE PAGES

    Vay, J. -L.; Almgren, A.; Bell, J.; ...

    2018-01-31

    Turning the current experimental plasma accelerator state-of-the-art from a promising technology into mainstream scientific tools depends critically on high-performance, high-fidelity modeling of complex processes that develop over a wide range of space and time scales. As part of the U.S. Department of Energy's Exascale Computing Project, a team from Lawrence Berkeley National Laboratory, in collaboration with teams from SLAC National Accelerator Laboratory and Lawrence Livermore National Laboratory, is developing a new plasma accelerator simulation tool that will harness the power of future exascale supercomputers for high-performance modeling of plasma accelerators. We present the various components of the codes, such as the new Particle-In-Cell Scalable Application Resource (PICSAR) and the redesigned adaptive mesh refinement library AMReX, which are combined with redesigned elements of the Warp code in the new WarpX software. Lastly, the code structure, status, early examples of applications, and plans are discussed.

  10. Warp-X: A new exascale computing platform for beam–plasma simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vay, J. -L.; Almgren, A.; Bell, J.

    Turning the current experimental plasma accelerator state-of-the-art from a promising technology into mainstream scientific tools depends critically on high-performance, high-fidelity modeling of complex processes that develop over a wide range of space and time scales. As part of the U.S. Department of Energy's Exascale Computing Project, a team from Lawrence Berkeley National Laboratory, in collaboration with teams from SLAC National Accelerator Laboratory and Lawrence Livermore National Laboratory, is developing a new plasma accelerator simulation tool that will harness the power of future exascale supercomputers for high-performance modeling of plasma accelerators. We present the various components of the codes, such as the new Particle-In-Cell Scalable Application Resource (PICSAR) and the redesigned adaptive mesh refinement library AMReX, which are combined with redesigned elements of the Warp code in the new WarpX software. Lastly, the code structure, status, early examples of applications, and plans are discussed.

  11. Accuracy of the lattice-Boltzmann method using the Cell processor

    NASA Astrophysics Data System (ADS)

    Harvey, M. J.; de Fabritiis, G.; Giupponi, G.

    2008-11-01

    Accelerator processors like the new Cell processor are extending the traditional platforms for scientific computation, allowing orders of magnitude more floating-point operations per second (flops) compared to standard central processing units. However, they currently lack double-precision support and support for some IEEE 754 capabilities. In this work, we develop a lattice-Boltzmann (LB) code to run on the Cell processor and test the accuracy of this lattice method on this platform. We run tests for different flow topologies, boundary conditions, and Reynolds numbers in the range Re = 6-350. In one case, simulation results show reduced mass and momentum conservation compared to an equivalent double-precision LB implementation. All other cases demonstrate the utility of the Cell processor for fluid dynamics simulations. Benchmarks on two Cell-based platforms are performed, the Sony Playstation3 and the QS20/QS21 IBM blade, obtaining speed-up factors of 7 and 21, respectively, compared to the original PC version of the code, and a conservative sustained performance of 28 gigaflops per single Cell processor. Our results suggest that the choice of IEEE 754 rounding mode is possibly as important as double-precision support for this specific scientific application.
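
    A tiny illustration, unrelated to the LB code itself, of why the precision question matters: under repeated updates, single- and double-precision arithmetic can drift visibly apart, which is the kind of effect the study quantifies for mass and momentum conservation in the LB solver.

```python
# Single vs. double precision under repeated updates (logistic map amplifies
# rounding error, so the two precisions quickly disagree).
import numpy as np

def iterate(x0, n, dtype):
    """Repeatedly apply x -> 4x(1 - x) in the given precision."""
    x = dtype(x0)
    for _ in range(n):
        x = dtype(4.0) * x * (dtype(1.0) - x)
    return x

print("float32:", iterate(0.3, 50, np.float32))
print("float64:", iterate(0.3, 50, np.float64))
```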

  12. Progress in The Semantic Analysis of Scientific Code

    NASA Technical Reports Server (NTRS)

    Stewart, Mark

    2000-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Mark D.; McPherson, Brian J.; Grigg, Reid B.

    Numerical simulation is an invaluable analytical tool for scientists and engineers in making predictions about the fate of carbon dioxide injected into deep geologic formations for long-term storage. Current numerical simulators for assessing storage in deep saline formations have capabilities for modeling strongly coupled processes involving multifluid flow, heat transfer, chemistry, and rock mechanics in geologic media. Except for moderate pressure conditions, numerical simulators for deep saline formations only require the tracking of two immiscible phases and a limited number of phase components, beyond those comprising the geochemical reactive system. The requirements for numerically simulating the utilization and storage of carbon dioxide in partially depleted petroleum reservoirs are more numerous than those for deep saline formations. The minimum number of immiscible phases increases to three, the number of phase components may easily increase fourfold, and the coupled processes of heat transfer, geochemistry, and geomechanics remain. Public and scientific confidence in the numerical simulators used for carbon dioxide sequestration in deep saline formations has advanced via a natural progression of the simulators being proven against benchmark problems, code comparisons, laboratory-scale experiments, pilot-scale injections, and commercial-scale injections. This paper describes a new numerical simulator for the scientific investigation of carbon dioxide utilization and storage in partially depleted petroleum reservoirs, with an emphasis on its unique features for scientific investigations, and documents the numerical simulation of carbon dioxide utilization for enhanced oil recovery in the western section of the Farnsworth Unit; this work represents an early stage in the progression of numerical simulators for carbon utilization and storage in depleted oil reservoirs.

  14. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    NASA Astrophysics Data System (ADS)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes at ever larger scales. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.

  15. The Brian Simulator

    PubMed Central

    Goodman, Dan F. M.; Brette, Romain

    2009-01-01

    “Brian” is a simulator for spiking neural networks (http://www.briansimulator.org). The focus is on making the writing of simulation code as quick and easy as possible for the user, and on flexibility: new and non-standard models are no more difficult to define than standard ones. This allows scientists to spend more time on the details of their models, and less on their implementation. Neuron models are defined by writing differential equations in standard mathematical notation, facilitating scientific communication. Brian is written in the Python programming language, and uses vector-based computation to allow for efficient simulations. It is particularly useful for neuroscientific modelling at the systems level, and for teaching computational neuroscience. PMID:20011141

  16. Extraordinary Tools for Extraordinary Science: The Impact ofSciDAC on Accelerator Science&Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryne, Robert D.

    2006-08-10

    Particle accelerators are among the most complex and versatile instruments of scientific exploration. They have enabled remarkable scientific discoveries and important technological advances that span all programs within the DOE Office of Science (DOE/SC). The importance of accelerators to the DOE/SC mission is evident from an examination of the DOE document, ''Facilities for the Future of Science: A Twenty-Year Outlook''. Of the 28 facilities listed, 13 involve accelerators. Thanks to SciDAC, a powerful suite of parallel simulation tools has been developed that represent a paradigm shift in computational accelerator science. Simulations that used to take weeks or more now take hours, and simulations that were once thought impossible are now performed routinely. These codes have been applied to many important projects of DOE/SC including existing facilities (the Tevatron complex, the Relativistic Heavy Ion Collider), facilities under construction (the Large Hadron Collider, the Spallation Neutron Source, the Linac Coherent Light Source), and to future facilities (the International Linear Collider, the Rare Isotope Accelerator). The new codes have also been used to explore innovative approaches to charged particle acceleration. These approaches, based on the extremely intense fields that can be present in lasers and plasmas, may one day provide a path to the outermost reaches of the energy frontier. Furthermore, they could lead to compact, high-gradient accelerators that would have huge consequences for US science and technology, industry, and medicine. In this talk I will describe the new accelerator modeling capabilities developed under SciDAC, the essential role of multi-disciplinary collaboration with applied mathematicians, computer scientists, and other IT experts in developing these capabilities, and provide examples of how the codes have been used to support DOE/SC accelerator projects.

  17. Extraordinary tools for extraordinary science: the impact of SciDAC on accelerator science and technology

    NASA Astrophysics Data System (ADS)

    Ryne, Robert D.

    2006-09-01

    Particle accelerators are among the most complex and versatile instruments of scientific exploration. They have enabled remarkable scientific discoveries and important technological advances that span all programs within the DOE Office of Science (DOE/SC). The importance of accelerators to the DOE/SC mission is evident from an examination of the DOE document, ''Facilities for the Future of Science: A Twenty-Year Outlook.'' Of the 28 facilities listed, 13 involve accelerators. Thanks to SciDAC, a powerful suite of parallel simulation tools has been developed that represent a paradigm shift in computational accelerator science. Simulations that used to take weeks or more now take hours, and simulations that were once thought impossible are now performed routinely. These codes have been applied to many important projects of DOE/SC including existing facilities (the Tevatron complex, the Relativistic Heavy Ion Collider), facilities under construction (the Large Hadron Collider, the Spallation Neutron Source, the Linac Coherent Light Source), and to future facilities (the International Linear Collider, the Rare Isotope Accelerator). The new codes have also been used to explore innovative approaches to charged particle acceleration. These approaches, based on the extremely intense fields that can be present in lasers and plasmas, may one day provide a path to the outermost reaches of the energy frontier. Furthermore, they could lead to compact, high-gradient accelerators that would have huge consequences for US science and technology, industry, and medicine. In this talk I will describe the new accelerator modeling capabilities developed under SciDAC, the essential role of multi-disciplinary collaboration with applied mathematicians, computer scientists, and other IT experts in developing these capabilities, and provide examples of how the codes have been used to support DOE/SC accelerator projects.

  18. BlazeDEM3D-GPU A Large Scale DEM simulation code for GPUs

    NASA Astrophysics Data System (ADS)

    Govender, Nicolin; Wilke, Daniel; Pizette, Patrick; Khinast, Johannes

    2017-06-01

    Accurately predicting the dynamics of particulate materials is of importance to numerous scientific and industrial areas, with applications ranging across particle scales from powder flow to ore crushing. Computational discrete element simulation is a viable option to aid in the understanding of particulate dynamics and the design of devices such as mixers, silos, and ball mills, as laboratory-scale tests come at a significant cost. However, the computational time required for an industrial-scale simulation consisting of tens of millions of particles can be months on large CPU clusters, making the Discrete Element Method (DEM) unfeasible for industrial applications. Simulations are therefore typically restricted to tens of thousands of particles with highly detailed particle shapes, or a few million particles with often oversimplified particle shapes. However, a number of applications require accurate representation of the particle shape to capture the macroscopic behaviour of the particulate system. In this paper we give an overview of the recent extensions to the open source GPU-based DEM code, BlazeDEM3D-GPU, that can simulate millions of polyhedra and tens of millions of spheres on a desktop computer with a single or multiple GPUs.

  19. Computer Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  20. System Simulation of Nuclear Power Plant by Coupling RELAP5 and Matlab/Simulink

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng Lin; Dong Hou; Zhihong Xu

    2006-07-01

    Since the RELAP5 code has general and advanced features in thermal-hydraulic computation, it has been widely used in transient and accident safety analysis, experiment planning analysis, and system simulation. We therefore wish to design, analyze, and verify a new Instrumentation And Control (I and C) system of a Nuclear Power Plant (NPP) based on this best-estimate code, and even develop our own engineering simulator. However, because of the limited capability for simulating control and protection systems in RELAP5, it is necessary to expand this function for efficient, accurate, and flexible design and simulation of I and C systems. Matlab/Simulink, a scientific computation package and a powerful tool for research and simulation of plant process control, can compensate for this limitation. This software is selected as the I and C part to be coupled with the RELAP5 code to realize system simulation of NPPs. There are two key problems to be solved. One is dynamic data exchange, by which Matlab/Simulink receives plant parameters and returns control results. A database is used for communication between the two codes. Accordingly, a Dynamic Link Library (DLL) is applied to link the database in RELAP5, while a DLL and an S-Function are applied in Matlab/Simulink. The other problem is synchronization between the two codes to ensure consistency in global simulation time. Because Matlab/Simulink always computes faster than RELAP5, the simulation time is sent by RELAP5 and received by Matlab/Simulink. A time control subroutine is added into the simulation procedure of Matlab/Simulink to control its simulation advancement. Through these means, Matlab/Simulink is dynamically coupled with RELAP5. Thus, in Matlab/Simulink, we can freely design control and protection logic of NPPs and test it with best-estimate plant model feedback. A test is shown to demonstrate that the results of the coupled calculation are nearly the same as those of RELAP5 alone with control logic. In practice, a real Pressurized Water Reactor (PWR) is modeled with the RELAP5 code, and its main control and protection system is duplicated in Matlab/Simulink. Some steady states and transients are calculated under the control of these I and C systems, and the results are compared with plant test curves. The application showed that exact system simulation of NPPs can be performed by coupling RELAP5 and Matlab/Simulink. This paper will mainly focus on the coupling method, the plant thermal-hydraulic model, the main control logics, and the test and application results. (authors)
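
    A highly simplified sketch of the coupling loop, assuming nothing about the actual RELAP5/Simulink interfaces: a plain dict stands in for the shared database, both "codes" run in one Python process, and the controller only acts after the plant's clock has advanced, mirroring the time-control idea described above. All names and numbers are hypothetical.

```python
# Toy coupling loop: a plant model and a controller exchange values through a
# shared "database" and are kept in lock step by the plant's simulation time.
database = {"t": 0.0, "plant_temperature": 560.0, "control_signal": 0.0}

def plant_step(db, dt):
    """Stand-in for a RELAP5 advancement: reacts to the control signal."""
    db["plant_temperature"] += dt * (5.0 - db["control_signal"])
    db["t"] += dt                      # the plant owns the simulation clock

def controller_step(db, setpoint=565.0, gain=2.0):
    """Stand-in for the Simulink controller: simple proportional control."""
    db["control_signal"] = gain * (db["plant_temperature"] - setpoint)

last_seen_time = -1.0
while database["t"] < 10.0:
    plant_step(database, dt=0.5)
    # Synchronisation rule: the controller acts only after the plant's clock
    # has advanced past the last value it saw.
    if database["t"] > last_seen_time:
        controller_step(database)
        last_seen_time = database["t"]

print(f"t = {database['t']:.1f}, T = {database['plant_temperature']:.2f}")
```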

  1. Chaste: An Open Source C++ Library for Computational Physiology and Biology

    PubMed Central

    Mirams, Gary R.; Arthurs, Christopher J.; Bernabeu, Miguel O.; Bordas, Rafel; Cooper, Jonathan; Corrias, Alberto; Davit, Yohan; Dunn, Sara-Jane; Fletcher, Alexander G.; Harvey, Daniel G.; Marsh, Megan E.; Osborne, James M.; Pathmanathan, Pras; Pitt-Francis, Joe; Southern, James; Zemzemi, Nejib; Gavaghan, David J.

    2013-01-01

    Chaste — Cancer, Heart And Soft Tissue Environment — is an open source C++ library for the computational simulation of mathematical models developed for physiology and biology. Code development has been driven by two initial applications: cardiac electrophysiology and cancer development. A large number of cardiac electrophysiology studies have been enabled and performed, including high-performance computational investigations of defibrillation on realistic human cardiac geometries. New models for the initiation and growth of tumours have been developed. In particular, cell-based simulations have provided novel insight into the role of stem cells in the colorectal crypt. Chaste is constantly evolving and is now being applied to a far wider range of problems. The code provides modules for handling common scientific computing components, such as meshes and solvers for ordinary and partial differential equations (ODEs/PDEs). Re-use of these components avoids the need for researchers to ‘re-invent the wheel’ with each new project, accelerating the rate of progress in new applications. Chaste is developed using industrially-derived techniques, in particular test-driven development, to ensure code quality, re-use and reliability. In this article we provide examples that illustrate the types of problems Chaste can be used to solve, which can be run on a desktop computer. We highlight some scientific studies that have used or are using Chaste, and the insights they have provided. The source code, both for specific releases and the development version, is available to download under an open source Berkeley Software Distribution (BSD) licence at http://www.cs.ox.ac.uk/chaste, together with details of a mailing list and links to documentation and tutorials. PMID:23516352

  2. Simulating a transmon implementation of the surface code, Part I

    NASA Astrophysics Data System (ADS)

    Tarasinski, Brian; O'Brien, Thomas; Rol, Adriaan; Bultink, Niels; Dicarlo, Leo

    Current experimental efforts aim to realize Surface-17, a distance-3 surface-code logical qubit, using transmon qubits in a circuit QED architecture. Following experimental proposals for this device, and currently achieved fidelities on physical qubits, we define a detailed error model that takes experimentally relevant error sources into account, such as amplitude and phase damping, imperfect gate pulses, and coherent errors due to low-frequency flux noise. Using the GPU-accelerated software package 'quantumsim', we simulate the density matrix evolution of the logical qubit under this error model. Combining the simulation results with a minimum-weight matching decoder, we obtain predictions for the error rate of the resulting logical qubit when used as a quantum memory, and estimate the contribution of different error sources to the logical error budget. Research funded by the Foundation for Fundamental Research on Matter (FOM), the Netherlands Organization for Scientific Research (NWO/OCW), IARPA, an ERC Synergy Grant, the China Scholarship Council, and Intel Corporation.
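
    As a rough, self-contained illustration of the matching step mentioned above (not the decoder actually coupled to quantumsim), syndrome defects can be paired by a minimum-weight matching over a complete graph; here Manhattan distance stands in for a proper likelihood-based weight and matching to boundaries is omitted.

      # Minimal sketch of minimum-weight matching of syndrome defects. Assumes an
      # even number of defects; real surface-code decoders also allow matching to
      # the lattice boundary and use error-model-derived edge weights.
      import itertools
      import networkx as nx

      def match_defects(defects):
          """defects: list of (row, col) ancilla positions that reported a flip."""
          g = nx.Graph()
          for (i, a), (j, b) in itertools.combinations(enumerate(defects), 2):
              dist = abs(a[0] - b[0]) + abs(a[1] - b[1])
              # max_weight_matching maximises, so negative distance gives
              # a minimum-weight perfect matching.
              g.add_edge(i, j, weight=-dist)
          matching = nx.max_weight_matching(g, maxcardinality=True)
          return [(defects[i], defects[j]) for i, j in matching]

      # Example: four defects pair up into two nearest-neighbour corrections.
      print(match_defects([(0, 0), (0, 1), (2, 2), (3, 2)]))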

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krasheninnikov, Sergei I.; Angus, Justin; Lee, Wonjae

    The goal of the Edge Simulation Laboratory (ESL) multi-institutional project is to advance scientific understanding of the edge plasma region of magnetic fusion devices via a coordinated effort utilizing modern computing resources, advanced algorithms, and ongoing theoretical development. The UCSD team was involved in the development of the COGENT code for kinetic studies across a magnetic separatrix. This work included a kinetic treatment of electrons and multiple ion species (impurities) and accurate collision operators.

  4. SU-E-T-254: Optimization of GATE and PHITS Monte Carlo Code Parameters for Uniform Scanning Proton Beam Based On Simulation with FLUKA General-Purpose Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurosu, K; Department of Medical Physics ' Engineering, Osaka University Graduate School of Medicine, Osaka; Takashina, M

    Purpose: Monte Carlo codes are becoming important tools for proton beam dosimetry. However, the relationships between the customizing parameters and the percentage depth dose (PDD) of the GATE and PHITS codes have not been reported; here these parameters are studied with respect to PDD and proton range, with comparison to the FLUKA code and to experimental data. Methods: The beam delivery system of the Indiana University Health Proton Therapy Center was modeled for the uniform scanning beam in FLUKA and transferred identically into GATE and PHITS. This computational model was built from the blueprint and validated with the commissioning data. The three parameters evaluated are the maximum step size, the cut-off energy, and the physics and transport models. The dependence of the PDDs on the customizing parameters was compared with published results of previous studies. Results: The optimal parameters for the simulation of the whole beam delivery system were defined by referring to the calculation results obtained with each parameter. Although the PDDs from FLUKA and the experimental data show good agreement, those of GATE and PHITS obtained with our optimal parameters show a minor discrepancy. The measured proton range R90 was 269.37 mm, compared to calculated ranges of 269.63 mm, 268.96 mm, and 270.85 mm with FLUKA, GATE and PHITS, respectively. Conclusion: We evaluated the dependence of the PDDs obtained with the GATE and PHITS general-purpose Monte Carlo codes on the customizing parameters by using the whole computational model of the treatment nozzle. The optimal parameters for the simulation were then defined by referring to the calculation results. The physics model, particle transport mechanics, and the different geometry-based descriptions need accurate customization in all three simulation codes to agree with experimental data for artifact-free Monte Carlo simulation. This study was supported by Grants-in-Aid for Cancer Research (H22-3rd Term Cancer Control-General-043) from the Ministry of Health, Labor and Welfare of Japan, Grants-in-Aid for Scientific Research (No. 23791419), and the JSPS Core-to-Core program (No. 23003). The authors have no conflict of interest.
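
    The R90 comparison quoted above can be reproduced, in spirit, with a few lines of analysis code; the following sketch extracts the distal 90% range from a percentage-depth-dose curve. The curve used here is a made-up placeholder, not data from the study.

      # Sketch: extract the distal R90 range (depth beyond the Bragg peak where the
      # dose falls to 90% of its maximum) from a PDD curve, e.g. for comparing
      # FLUKA/GATE/PHITS output against measurement.
      import numpy as np

      def distal_r90(depth_mm, dose):
          dose = np.asarray(dose, dtype=float)
          dose = dose / dose.max() * 100.0          # normalise to percent
          i_peak = int(np.argmax(dose))
          # Walk the distal fall-off and interpolate the 90% crossing.
          for i in range(i_peak, len(dose) - 1):
              if dose[i] >= 90.0 > dose[i + 1]:
                  frac = (dose[i] - 90.0) / (dose[i] - dose[i + 1])
                  return depth_mm[i] + frac * (depth_mm[i + 1] - depth_mm[i])
          raise ValueError("no 90% crossing found beyond the peak")

      # Hypothetical smooth PDD with a peak near 265 mm (placeholder data only).
      z = np.linspace(0.0, 300.0, 301)
      pdd = 35.0 + 65.0 * np.exp(-0.5 * ((z - 265.0) / 10.0) ** 2)
      print(f"R90 = {distal_r90(z, pdd):.2f} mm")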

  5. Developing and Implementing the Data Mining Algorithms in RAVEN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea

    The RAVEN code is becoming a comprehensive tool for probabilistic risk assessment, uncertainty quantification, and verification and validation. The RAVEN code is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data, and post-processing and analyzing such data can, in some cases, take longer than the initial software runtime. Data mining algorithms and methods help in recognizing and understanding patterns in the data, and thus in discovering knowledge in databases. The methodologies used in dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation while the system/physics codes model the dynamics deterministically. A typical analysis is performed by sampling values of a set of parameters. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is analyzing the large number of scenarios generated. Data mining techniques are typically used to better organize and understand the data, i.e., to recognize patterns in the data. This report focuses on the development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms, and on the application of these algorithms to different databases.
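
    As an illustration of the kind of pattern recognition the report describes (this is not RAVEN's API), a large table of sampled scenarios can be clustered so that qualitatively similar system responses are grouped together; the scenario features below are synthetic placeholders.

      # Sketch: cluster a table of sampled scenarios to recognize patterns.
      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      # Hypothetical scenario table: each row is one sampled run, columns are
      # summary quantities (e.g. peak temperature, time of peak, safety margin).
      scenarios = rng.normal(size=(5000, 3)) + rng.choice([0.0, 4.0], size=(5000, 1))

      features = StandardScaler().fit_transform(scenarios)
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

      for k in range(2):
          members = scenarios[labels == k]
          print(f"cluster {k}: {len(members)} runs, mean response {members.mean(axis=0)}")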

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, W.

    Building something which could be called "virtual reality" (VR) is something of a challenge, particularly when nobody really seems to agree on a definition of VR. The author wanted to combine scientific visualization with VR, resulting in an environment useful for assisting scientific research. He demonstrates the combination of VR and scientific visualization in a prototype application. The VR application constructed consists of a dataflow-based system for performing scientific visualization (AVS), extensions to the system to support VR input devices, and a numerical simulation ported into the dataflow environment. The VR system includes two inexpensive, off-the-shelf VR devices and some custom code. A working system was assembled with about two man-months of effort. The system allows the user to specify parameters for a chemical flooding simulation as well as some viewing parameters using VR input devices, and to view the output using VR output devices. In chemical flooding, there is a subsurface region that contains chemicals which are to be removed. Secondary oil recovery and environmental remediation are typical applications of chemical flooding. The process assumes one or more injection wells and one or more production wells. Chemicals or water are pumped into the ground, mobilizing and displacing hydrocarbons or contaminants. The placement of the production and injection wells, and other parameters of the wells, are the most important variables in the simulation.

  7. gadfly: A pandas-based Framework for Analyzing GADGET Simulation Data

    NASA Astrophysics Data System (ADS)

    Hummel, Jacob A.

    2016-11-01

    We present the first public release (v0.1) of the open-source gadget Dataframe Library: gadfly. The aim of this package is to leverage the capabilities of the broader python scientific computing ecosystem by providing tools for analyzing simulation data from the astrophysical simulation codes gadget and gizmo using pandas, a thoroughly documented, open-source library providing high-performance, easy-to-use data structures that is quickly becoming the standard for data analysis in python. Gadfly is a framework for analyzing particle-based simulation data stored in the HDF5 format using pandas DataFrames. The package enables efficient memory management, includes utilities for unit handling, coordinate transformations, and parallel batch processing, and provides highly optimized routines for visualizing smoothed-particle hydrodynamics data sets.
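
    The core pattern gadfly builds on, loading particle fields from an HDF5 snapshot into a pandas DataFrame, can be sketched as follows; the group and dataset names follow the usual GADGET HDF5 layout but should be checked against the snapshot actually in hand.

      # Sketch: read gas-particle fields from a GADGET-style HDF5 snapshot into pandas.
      import h5py
      import pandas as pd

      def load_gas_particles(snapshot_path):
          with h5py.File(snapshot_path, "r") as f:
              gas = f["PartType0"]                       # gas particles
              coords = gas["Coordinates"][:]             # shape (N, 3)
              df = pd.DataFrame(coords, columns=["x", "y", "z"])
              df["density"] = gas["Density"][:]
              df["mass"] = gas["Masses"][:]
          return df

      # Example usage (snapshot name is a placeholder):
      # df = load_gas_particles("snapshot_000.hdf5"); print(df.describe())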

  8. PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Frederick, J. M.

    2016-12-01

    In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification confirms that the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describe four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References: Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
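
    A generic example of the verification pattern described above (not one of the PFLOTRAN benchmarks) is to run a numerical solver on a problem with a closed-form solution and report an error norm, as in this sketch of 1D diffusion into a half-space with a fixed boundary value, whose analytical solution is erfc(x / (2*sqrt(D*t))).

      # Sketch: explicit finite-difference diffusion compared against the
      # closed-form erfc solution, reporting the maximum absolute error.
      import numpy as np
      from scipy.special import erfc

      D, t_end = 1.0e-6, 5.0e4                 # diffusivity [m^2/s], end time [s]
      nx, L = 200, 1.0
      x = np.linspace(0.0, L, nx)
      dx = x[1] - x[0]
      dt = 0.4 * dx**2 / D                     # below the explicit stability limit
      u = np.zeros(nx)
      u[0] = 1.0                               # step change at the left boundary

      t = 0.0
      while t < t_end:
          u[1:-1] += D * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
          t += dt

      exact = erfc(x / (2.0 * np.sqrt(D * t)))
      print("max abs error:", np.max(np.abs(u - exact)))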

  9. Quantum simulations of nuclei and nuclear pasta with the multiresolution adaptive numerical environment for scientific simulations

    NASA Astrophysics Data System (ADS)

    Sagert, I.; Fann, G. I.; Fattoyev, F. J.; Postnikov, S.; Horowitz, C. J.

    2016-05-01

    Background: Neutron star and supernova matter at densities just below the nuclear matter saturation density is expected to form a lattice of exotic shapes. These so-called nuclear pasta phases are caused by Coulomb frustration. Their elastic and transport properties are believed to play an important role for thermal and magnetic field evolution, rotation, and oscillation of neutron stars. Furthermore, they can impact neutrino opacities in core-collapse supernovae. Purpose: In this work, we present proof-of-principle three-dimensional (3D) Skyrme Hartree-Fock (SHF) simulations of nuclear pasta with the Multi-resolution ADaptive Numerical Environment for Scientific Simulations (MADNESS). Methods: We perform benchmark studies of 16O, 208Pb, and 238U nuclear ground states and calculate binding energies via 3D SHF simulations. Results are compared with experimentally measured binding energies as well as with theoretically predicted values from an established SHF code. The nuclear pasta simulation is initialized in the so-called waffle geometry as obtained by the Indiana University Molecular Dynamics (IUMD) code. The size of the unit cell is 24 fm with an average density of about ρ =0.05 fm-3 , proton fraction of Yp=0.3 , and temperature of T =0 MeV. Results: Our calculations reproduce the binding energies and shapes of light and heavy nuclei with different geometries. For the pasta simulation, we find that the final geometry is very similar to the initial waffle state. We compare calculations with and without spin-orbit forces. We find that while subtle differences are present, the pasta phase remains in the waffle geometry. Conclusions: Within the MADNESS framework, we can successfully perform calculations of inhomogeneous nuclear matter. By using pasta configurations from IUMD it is possible to explore different geometries and test the impact of self-consistent calculations on the latter.

  10. Gyrokinetic Particle Simulation of Turbulent Transport in Burning Plasmas (GPS - TTBP) Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chame, Jacqueline

    2011-05-27

    The goal of this project is the development of the Gyrokinetic Toroidal Code (GTC) Framework and its applications to problems related to the physics of turbulence and turbulent transport in tokamaks. The project involves physics studies, code development, noise effect mitigation, supporting computer science efforts, diagnostics and advanced visualizations, verification and validation. Its main scientific themes are mesoscale dynamics and non-locality effects on transport, the physics of secondary structures such as zonal flows, and strongly coherent wave-particle interaction phenomena at magnetic precession resonances. Special emphasis is placed on the implications of these themes for rho-star and current scalings and for the turbulent transport of momentum. GTC-TTBP also explores applications to electron thermal transport, particle transport, ITB formation, and cross-cuts such as edge-core coupling, interaction of energetic particles with turbulence, and neoclassical tearing mode trigger dynamics. Code development focuses on major initiatives in the development of full-f formulations and the capacity to simulate flux-driven transport. In addition to the full-f formulation, the project includes the development of numerical collision models and methods for coarse graining in phase space. Verification is pursued by linear stability study comparisons with the FULL and HD7 codes and by benchmarking with the GKV, GYSELA and other gyrokinetic simulation codes. Validation of gyrokinetic models of ion and electron thermal transport is pursued by systematic stressing comparisons with fluctuation and transport data from the DIII-D and NSTX tokamaks. The physics and code development research programs are supported by complementary efforts in computer sciences, high performance computing, and data management.

  11. Collisionless stellar hydrodynamics as an efficient alternative to N-body methods

    NASA Astrophysics Data System (ADS)

    Mitchell, Nigel L.; Vorobyov, Eduard I.; Hensler, Gerhard

    2013-01-01

    The dominant constituents of the Universe's matter are believed to be collisionless in nature and thus their modelling in any self-consistent simulation is extremely important. For simulations that deal only with dark matter or stellar systems, the conventional N-body technique is fast, memory efficient and relatively simple to implement. However when extending simulations to include the effects of gas physics, mesh codes are at a distinct disadvantage compared to Smooth Particle Hydrodynamics (SPH) codes. Whereas implementing the N-body approach into SPH codes is fairly trivial, the particle-mesh technique used in mesh codes to couple collisionless stars and dark matter to the gas on the mesh has a series of significant scientific and technical limitations. These include spurious entropy generation resulting from discreteness effects, poor load balancing and increased communication overhead which spoil the excellent scaling in massively parallel grid codes. In this paper we propose the use of the collisionless Boltzmann moment equations as a means to model the collisionless material as a fluid on the mesh, implementing it into the massively parallel FLASH Adaptive Mesh Refinement (AMR) code. This approach which we term `collisionless stellar hydrodynamics' enables us to do away with the particle-mesh approach and since the parallelization scheme is identical to that used for the hydrodynamics, it preserves the excellent scaling of the FLASH code already demonstrated on peta-flop machines. We find that the classic hydrodynamic equations and the Boltzmann moment equations can be reconciled under specific conditions, allowing us to generate analytic solutions for collisionless systems using conventional test problems. We confirm the validity of our approach using a suite of demanding test problems, including the use of a modified Sod shock test. By deriving the relevant eigenvalues and eigenvectors of the Boltzmann moment equations, we are able to use high order accurate characteristic tracing methods with Riemann solvers to generate numerical solutions which show excellent agreement with our analytic solutions. We conclude by demonstrating the ability of our code to model complex phenomena by simulating the evolution of a two-armed spiral galaxy whose properties agree with those predicted by the swing amplification theory.

  12. Agricultural Baseline (BL0) scenario

    DOE Data Explorer

    Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000181319328); Hellwinckel, Chad M [University of Tennessee] (ORCID:0000000173085058); Eaton, Laurence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000312709626); Turhollow, Anthony [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000228159350); Brandt, Craig [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000214707379); Langholtz, Matthew H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000281537154)

    2016-07-13

    Scientific reason for data generation: to serve as the reference case for the BT16 volume 1 agricultural scenarios. The agricultural baseline runs from 2015 through 2040; a starting year of 2014 is used. Date the data set was last modified: 02/12/2016. How each parameter was produced (methods), format, and relationship to other data in the data set: the simulation was developed without offering a farmgate price to energy crops or residues (i.e., building on both the USDA 2015 baseline and the agricultural census data (USDA NASS 2014)). Data generated are .txt output files by year, simulation identifier, and county code (1-3109). Instruments used: POLYSYS (version POLYS2015_V10_alt_JAN22B) supplied by the University of Tennessee APAC. The quality assurance and quality control that have been applied: • Check for negative planted area, harvested area, production, yield, and cost values. • Check whether harvested area exceeds planted area for annuals. • Check FIPS codes.
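
    The listed QA/QC checks could be applied to the .txt output with a few lines of pandas, as in the sketch below; the column names are hypothetical placeholders rather than the actual POLYSYS field names.

      # Sketch of the QA/QC checks described above, using placeholder column names.
      import pandas as pd

      def qa_checks(df, valid_fips):
          problems = {}
          value_cols = ["planted_area", "harvested_area", "production", "yield", "cost"]
          # 1. No negative planted area, harvested area, production, yield, or cost.
          problems["negative_values"] = df[(df[value_cols] < 0).any(axis=1)]
          # 2. Harvested area must not exceed planted area for annual crops.
          annuals = df[df["crop_type"] == "annual"]
          problems["harvested_gt_planted"] = annuals[
              annuals["harvested_area"] > annuals["planted_area"]
          ]
          # 3. County FIPS codes must be in the expected set.
          problems["bad_fips"] = df[~df["fips"].isin(valid_fips)]
          return problems

      # Usage (file name and FIPS list are placeholders):
      # report = qa_checks(pd.read_csv("bl0_output.txt", sep="\t"), valid_fips=set(fips_list))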

  13. European Scientific Notes. Volume 37, Number 1.

    DTIC Science & Technology

    1983-01-31

    instantoneous sea-state condition can be tions vary widely in their realism , with computed from a special data base coded some producing dynamic color pictures...between the variables of accuracy, approach channels, the alignment of practicality, realism , and expense. jetties, and the establishment of Because the...tidal current variables The system certainly seems to be valid, have been played into some of the and the smooth dynamics, realism , and simulator runs

  14. MPAS-Ocean NESAP Status Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petersen, Mark Roger; Arndt, William; Keen, Noel

    NESAP performance improvements on MPAS-Ocean have resulted in a 5% to 7% speed-up on each of the examined systems, including Cori-KNL, Cori-Haswell, and Edison. These tests were configured to emulate a production workload by using 128 nodes and a high-resolution ocean domain. Overall, the gap between standard and many-core architecture performance has been narrowed, but Cori-KNL still performs considerably worse than Edison. NESAP code alterations affected 600 lines of code, and most of these improvements will benefit other MPAS codes (sea ice, land ice) that are also components within ACME. The modifications are fully tested within MPAS. Testing in ACME across many platforms is underway and must be completed before the code is merged. In addition, a ten-year production ACME global simulation was conducted on Cori-KNL in late 2016 with the pre-NESAP code in order to test readiness and configurations for scientific studies. Next steps include assessing performance across a range of node counts, threads per node, and ocean resolutions on Cori-KNL.

  15. Gyrokinetic micro-turbulence simulations on the NERSC 16-way SMP IBM SP computer: experiences and performance results

    NASA Astrophysics Data System (ADS)

    Ethier, Stephane; Lin, Zhihong

    2001-10-01

    Earlier this year, the National Energy Research Scientific Computing center (NERSC) took delivery of the second most powerful computer in the world. With its 2,528 processors running at a peak performance of 1.5 GFlops, this IBM SP machine has a theoretical performance of almost 3.8 TFlops. To efficiently harness such computing power in one single code is not an easy task and requires a good knowledge of the computer's architecture. Here we present the steps that we followed to improve our gyrokinetic micro-turbulence code GTC in order to take advantage of the new 16-way shared memory nodes of the NERSC IBM SP. Performance results are shown as well as details about the improved mixed-mode MPI-OpenMP model that we use. The enhancements to the code allowed us to tackle much bigger problem sizes, getting closer to our goal of simulating an ITER-size tokamak with both kinetic ions and electrons.(This work is supported by DOE Contract No. DE-AC02-76CH03073 (PPPL), and in part by the DOE Fusion SciDAC Project.)

  16. What can the programming language Rust do for astrophysics?

    NASA Astrophysics Data System (ADS)

    Blanco-Cuaresma, Sergi; Bolmont, Emeline

    2017-06-01

    The astrophysics community uses different tools for computational tasks such as complex systems simulations, radiative transfer calculations or big data. Programming languages like Fortran, C or C++ are commonly present in these tools and, generally, the language choice was made based on the need for performance. However, this comes at a cost: safety. For instance, a common source of error is the access to invalid memory regions, which produces random execution behaviors and affects the scientific interpretation of the results. In 2015, Mozilla Research released the first stable version of a new programming language named Rust. Many features make this new language attractive for the scientific community, it is open source and it guarantees memory safety while offering zero-cost abstraction. We explore the advantages and drawbacks of Rust for astrophysics by re-implementing the fundamental parts of Mercury-T, a Fortran code that simulates the dynamical and tidal evolution of multi-planet systems.

  17. Semantic Interoperability for Computational Mineralogy: Experiences of the eMinerals Consortium

    NASA Astrophysics Data System (ADS)

    Walker, A. M.; White, T. O.; Dove, M. T.; Bruin, R. P.; Couch, P. A.; Tyer, R. P.

    2006-12-01

    The use of atomic scale computer simulation of minerals to obtain information for geophysics and environmental science has grown enormously over the past couple of decades. It is now routine to probe mineral behavior in the Earth's deep interior and in the surface environment by borrowing methods and simulation codes from computational chemistry and physics. It is becoming increasingly important to use methods embodied in more than one of these codes to solve any single scientific problem. However, scientific codes are rarely designed for easy interoperability and data exchange; data formats are often code-specific, poorly documented and fragile, liable to frequent change between software versions, and even compiler versions. This means that the scientist's simple desire to use the methodological approaches offered by multiple codes is frustrated, and even the sharing of data between collaborators becomes fraught with difficulties. The eMinerals consortium was formed in the early stages of the UK eScience program with the aim of developing the tools needed to apply atomic scale simulation to environmental problems in a grid-enabled world, and to harness the computational power offered by grid technologies to address some outstanding mineralogical problems. One example of the kind of problem we can tackle is the origin of the compressibility anomaly in silica glass. By passing data directly between simulation and analysis tools we were able to probe this effect in more detail than has previously been possible and have shown how the anomaly is related to the details of the amorphous structure. In order to approach this kind of problem we have constructed a mini-grid, a small scale and extensible combined compute- and data-grid that allows the execution of many calculations in parallel, and the transparent storage of semantically-rich marked-up result data. Importantly, we automatically capture multiple kinds of metadata and key results from each calculation. We believe that the lessons learned and tools developed will be useful in many areas of science beyond the computational mineralogy. Key tools that will be described include: a pure Fortran XML library (FoX) that presents XPath, SAX and DOM interfaces as well as permitting the easy production of valid XML from legacy Fortran programs; a job submission framework that automatically schedules calculations to remote grid resources, handles data staging and metadata capture; and a tool (AgentX) that map concepts from an ontology onto locations in documents of various formats that we use to enable data exchange.

  18. Parallel Computation of the Regional Ocean Modeling System (ROMS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, P; Song, Y T; Chao, Y

    2005-04-05

    The Regional Ocean Modeling System (ROMS) is a regional ocean general circulation modeling system solving the free surface, hydrostatic, primitive equations over varying topography. It is free software distributed world-wide for studying both complex coastal ocean problems and the basin-to-global scale ocean circulation. The original ROMS code could only be run on shared-memory systems. With the increasing need to simulate larger model domains with finer resolutions and on a variety of computer platforms, there is a need in the ocean-modeling community to have a ROMS code that can be run on any parallel computer ranging from 10 to hundreds of processors. Recently, we have explored parallelization for ROMS using the MPI programming model. In this paper, an efficient parallelization strategy for such a large-scale scientific software package, based on an existing shared-memory computing model, is presented. In addition, scientific applications and data-performance issues on a couple of SGI systems, including Columbia, the world's third-fastest supercomputer, are discussed.

  19. RELAP-7 Level 2 Milestone Report: Demonstration of a Steady State Single Phase PWR Simulation with RELAP-7

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Andrs; Ray Berry; Derek Gaston

    The document contains the simulation results of a steady-state model PWR problem with the RELAP-7 code. The RELAP-7 code is the next generation nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on INL's modern scientific software development framework - MOOSE (Multiphysics Object-Oriented Simulation Environment). This report summarizes the initial results of simulating a model steady-state single-phase PWR problem using the current version of the RELAP-7 code. The major purpose of this demonstration simulation is to show that the RELAP-7 code can be rapidly developed to simulate single-phase reactor problems. RELAP-7 is a new project started on October 1, 2011. It will become the main reactor systems simulation toolkit for RISMC (Risk Informed Safety Margin Characterization) and the next generation tool in the RELAP reactor safety/systems analysis application series (the replacement for RELAP5). The key to the success of RELAP-7 is the simultaneous advancement of physical models, numerical methods, and software design while maintaining a solid user perspective. Physical models include both PDEs (Partial Differential Equations) and ODEs (Ordinary Differential Equations) and experiment-based closure models. RELAP-7 will eventually utilize well-posed governing equations for multiphase flow, which can be strictly verified. Closure models used in RELAP5 and newly developed models will be reviewed and selected to reflect the progress made during the past three decades. RELAP-7 uses modern numerical methods, which allow implicit time integration, higher order schemes in both time and space, and strongly coupled multi-physics simulations. RELAP-7 is written in the object-oriented programming language C++. Its development follows modern software design paradigms. The code is easy to read, develop, maintain, and couple with other codes. Most importantly, the modern software design allows the RELAP-7 code to evolve with time. RELAP-7 is a MOOSE-based application. MOOSE (Multiphysics Object-Oriented Simulation Environment) is a framework for solving computational engineering problems in a well-planned, managed, and coordinated way. By leveraging millions of lines of open source software packages, such as PETSc (a nonlinear solver developed at Argonne National Laboratory) and libMesh (a Finite Element Analysis package developed at the University of Texas), MOOSE significantly reduces the expense and time required to develop new applications. Numerical integration methods and mesh management for parallel computation are provided by MOOSE. Therefore RELAP-7 code developers need only focus on physics and the user experience. By using the MOOSE development environment, the RELAP-7 code is developed following the same modern software design paradigms used for other MOOSE development efforts. There are currently over 20 different MOOSE-based applications, ranging from 3-D transient neutron transport and detailed 3-D transient fuel performance analysis to long-term material aging. Multi-physics and multi-dimensional analysis capabilities can be obtained by coupling RELAP-7 with other MOOSE-based applications and by leveraging capabilities developed by other DOE programs. This allows restricting the focus of RELAP-7 to systems analysis-type simulations and gives priority to retaining and significantly extending RELAP5's capabilities.

  20. Software engineering and automatic continuous verification of scientific software

    NASA Astrophysics Data System (ADS)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness of the pitfalls of software engineering by scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best-practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features, and bug reporting, gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. The testing of code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical solutions via the method of manufactured solutions. By developing and verifying code in tandem we avoid a number of pitfalls in scientific software development and advocate similar procedures for other scientific code applications.
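
    The "gold standard" check mentioned above can be made concrete with a small sketch: given error norms from the same manufactured-solution problem at successively refined resolutions, the observed order of convergence is estimated and compared with the scheme's formal order. The numbers below are placeholders, not Fluidity results.

      # Sketch: estimate the observed order of convergence from errors at
      # successive mesh refinements and assert it matches the expected order.
      import math

      def observed_order(h_coarse, h_fine, err_coarse, err_fine):
          return math.log(err_coarse / err_fine) / math.log(h_coarse / h_fine)

      resolutions = [0.1, 0.05, 0.025]          # mesh spacings (placeholders)
      errors = [4.1e-3, 1.05e-3, 2.7e-4]        # hypothetical L2 error norms
      for i in range(len(errors) - 1):
          p = observed_order(resolutions[i], resolutions[i + 1], errors[i], errors[i + 1])
          print(f"h={resolutions[i + 1]}: observed order ~ {p:.2f}")
          assert p > 1.8, "convergence order below the expected second order"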

  1. An Experiment in Scientific Code Semantic Analysis

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.

    1998-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, distributed expert parsers. These semantic parsers are designed to recognize formulae in different disciplines, including physical and mathematical formulae and geometrical position in a numerical scheme. The parsers will automatically recognize and document some static, semantic concepts and locate some program semantic errors. Results are shown for a subroutine test case and a collection of combustion code routines. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.
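
    A tiny illustration of the underlying idea (not the paper's parser): once primitive variables carry semantic declarations, here physical dimensions expressed as exponent tuples over (mass, length, time), simple formulae can be checked mechanically for consistency.

      # Sketch: dimensional-consistency checking from per-variable declarations.
      DECLS = {
          "rho": (1, -3, 0),    # density
          "u":   (0, 1, -1),    # velocity
          "p":   (1, -1, -2),   # pressure
      }

      def dim_mul(a, b):
          """Dimension of a product: add exponents component-wise."""
          return tuple(x + y for x, y in zip(a, b))

      def check_sum(terms):
          """All terms in a sum must share one dimension; return it or raise."""
          dims = {tuple(t) for t in terms}
          if len(dims) != 1:
              raise ValueError(f"inconsistent dimensions in sum: {dims}")
          return dims.pop()

      # Dynamic pressure rho*u*u has the dimension of pressure, so p + rho*u*u is legal.
      dyn_pressure = dim_mul(DECLS["rho"], dim_mul(DECLS["u"], DECLS["u"]))
      print(check_sum([DECLS["p"], dyn_pressure]))   # -> (1, -1, -2)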

  2. Supercomputing with TOUGH2 family codes for coupled multi-physics simulations of geologic carbon sequestration

    NASA Astrophysics Data System (ADS)

    Yamamoto, H.; Nakajima, K.; Zhang, K.; Nanai, S.

    2015-12-01

    Powerful numerical codes that are capable of modeling complex coupled processes of physics and chemistry have been developed for predicting the fate of CO2 in reservoirs as well as its potential impacts on groundwater and subsurface environments. However, they are often computationally demanding for solving highly non-linear models at sufficient spatial and temporal resolutions. Geological heterogeneity and uncertainties further increase the challenges in modeling work. Two-phase flow simulations in heterogeneous media usually require much longer computational time than those in homogeneous media, and uncertainties in reservoir properties may necessitate stochastic simulations with multiple realizations. Recently, massively parallel supercomputers with many thousands of processors have become available in scientific and engineering communities. Such supercomputers may attract attention from geoscientists and reservoir engineers for solving large, non-linear models at higher resolutions within a reasonable time. However, to make them a useful tool, it is essential to tackle several practical obstacles to utilizing large numbers of processors effectively for general-purpose reservoir simulators. We have implemented massively-parallel versions of two TOUGH2 family codes (the multi-phase flow simulator TOUGH2 and the chemically reactive transport simulator TOUGHREACT) on two different types (vector- and scalar-type) of supercomputers with a thousand to tens of thousands of processors. After completing implementation and extensive tune-up on the supercomputers, the computational performance was measured for three simulations with multi-million grid models, including a simulation of the dissolution-diffusion-convection process that requires high spatial and temporal resolutions to simulate the growth of small convective fingers of CO2-dissolved water into larger ones at the reservoir scale. The performance measurement confirmed that both simulators exhibit excellent scalability, showing almost linear speedup with the number of processors up to over ten thousand cores. Generally this allows us to perform coupled multi-physics (THC) simulations on high-resolution geologic models with multi-million-cell grids in a practical time (e.g., less than a second per time step).
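
    The scalability statement above rests on straightforward bookkeeping: speedup and parallel efficiency computed from measured wall-clock times at several core counts. The sketch below shows the calculation with made-up timings.

      # Sketch: compute speedup and parallel efficiency from wall-clock timings.
      # The timings below are placeholders, not measurements from the study.
      cores    = [128, 512, 2048, 8192, 16384]
      walltime = [1000.0, 252.0, 64.5, 17.1, 9.2]     # seconds per simulated period

      t_ref, p_ref = walltime[0], cores[0]
      for p, t in zip(cores, walltime):
          speedup = t_ref / t
          efficiency = speedup / (p / p_ref)
          print(f"{p:6d} cores: speedup {speedup:7.1f}, efficiency {efficiency:5.2f}")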

  3. JPRS Report, Science & Technology Europe

    DTIC Science & Technology

    1988-09-08

    with good temperature dependence. In the use of the 1B2B balance code, the average value of the optical power emitted by the photodiode equals one...Workers Clerical staff Total 9.7 6.6 18.8 10.3 28.8 8.4 9.4 General facilities 8.0 Table 2. MANPOWER ( Average staff in 1986) 170 180...Propulsion and High Temperatures Scientific Assistant Technical Assistant Special Assistant, Gas Turbines Modeling and Numerical Simulation in

  4. A new 3D viewer as an interface between the ASDEX Upgrade CAD models and data from plasma modelling and experiment

    NASA Astrophysics Data System (ADS)

    Lunt, T.; Fuchs, J. C.; Mank, K.; Feng, Y.; Brochard, F.; Herrmann, A.; Rohde, V.; Endstrasser, N.; ASDEX Upgrade Team

    2010-11-01

    A generally available and easy-to-use viewer for the simultaneous visualisation of the ASDEX Upgrade vacuum vessel computer aided design models, diagnostics and magnetic geometry, solutions of 3D plasma simulation codes and 2D camera images was developed. Here we report on the working principle of this software and give several examples of its technical and scientific application.

  5. Investigating the impact of the cielo cray XE6 architecture on scientific application codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rajan, Mahesh; Barrett, Richard; Pedretti, Kevin Thomas Tauke

    2010-12-01

    Cielo, a Cray XE6, is the Department of Energy NNSA Advanced Simulation and Computing (ASC) campaign's newest capability machine. Rated at 1.37 PFLOPS, it consists of 8,944 dual-socket oct-core AMD Magny-Cours compute nodes, linked using Cray's Gemini interconnect. Its primary mission objective is to enable a suite of the ASC applications implemented using MPI to scale to tens of thousands of cores. Cielo is an evolutionary improvement to a successful architecture previously available to many of our codes, thus providing a basis for understanding the capabilities of this new architecture. Using three codes strategically important to the ASC campaign, supplemented with some micro-benchmarks that expose the fundamental capabilities of the XE6, we report on the performance characteristics and capabilities of Cielo.

  6. Remote control system for high-perfomance computer simulation of crystal growth by the PFC method

    NASA Astrophysics Data System (ADS)

    Pavlyuk, Evgeny; Starodumov, Ilya; Osipov, Sergei

    2017-04-01

    Modeling of the crystallization process by the phase field crystal (PFC) method is one of the important directions of modern computational materials science. In this paper, the practical side of computer simulation of the crystallization process by the PFC method is investigated. Solving problems with this method requires high-performance computing clusters, data storage systems, and other often expensive and complex computer systems. Access to such resources is frequently limited, unstable, and accompanied by various administrative problems. In addition, the variety of software and settings across different computing clusters often prevents researchers from using a unified program code; the code has to be adapted to each configuration of the computing complex. The practical experience of the authors has shown that a dedicated computation control system with remote-use capability can greatly simplify the execution of simulations and increase the productivity of scientific research. In the current paper we present the principal idea of such a system and justify its efficiency.

  7. Validation of numerical codes for impact and explosion cratering: Impacts on strengthless and metal targets

    NASA Astrophysics Data System (ADS)

    Pierazzo, E.; Artemieva, N.; Asphaug, E.; Baldwin, E. C.; Cazamias, J.; Coker, R.; Collins, G. S.; Crawford, D. A.; Davison, T.; Elbeshausen, D.; Holsapple, K. A.; Housen, K. R.; Korycansky, D. G.; Wünnemann, K.

    2008-12-01

    Over the last few decades, rapid improvement of computer capabilities has allowed impact cratering to be modeled with increasing complexity and realism, and has paved the way for a new era of numerical modeling of the impact process, including full, three-dimensional (3D) simulations. When properly benchmarked and validated against observation, computer models offer a powerful tool for understanding the mechanics of impact crater formation. This work presents results from the first phase of a project to benchmark and validate shock codes. A variety of 2D and 3D codes were used in this study, from commercial products like AUTODYN, to codes developed within the scientific community like SOVA, SPH, ZEUS-MP, iSALE, and codes developed at U.S. National Laboratories like CTH, SAGE/RAGE, and ALE3D. Benchmark calculations of shock wave propagation in aluminum-on-aluminum impacts were performed to examine the agreement between codes for simple idealized problems. The benchmark simulations show that variability in code results is to be expected due to differences in the underlying solution algorithm of each code, artificial stability parameters, spatial and temporal resolution, and material models. Overall, the inter-code variability in peak shock pressure as a function of distance is around 10 to 20%. In general, if the impactor is resolved by at least 20 cells across its radius, the underestimation of peak shock pressure due to spatial resolution is less than 10%. In addition to the benchmark tests, three validation tests were performed to examine the ability of the codes to reproduce the time evolution of crater radius and depth observed in vertical laboratory impacts in water and two well-characterized aluminum alloys. Results from these calculations are in good agreement with experiments. There appears to be a general tendency of shock physics codes to underestimate the radius of the forming crater. Overall, the discrepancy between the model and experiment results is between 10 and 20%, similar to the inter-code variability.
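
    An inter-code variability figure such as the 10 to 20% quoted above can be computed by comparing, at each distance, the spread of the codes' peak shock pressures with their mean; the numbers in this sketch are made up for illustration.

      # Sketch: quantify inter-code spread in peak shock pressure versus distance.
      import numpy as np

      distance = np.array([2.0, 4.0, 8.0, 16.0])            # distance in impactor radii
      # Rows: hypothetical codes; columns: peak shock pressure (GPa) at each distance.
      peak_pressure = np.array([
          [95.0, 41.0, 12.5, 3.1],
          [102.0, 44.5, 13.8, 3.5],
          [88.0, 39.0, 11.9, 2.9],
      ])

      mean = peak_pressure.mean(axis=0)
      spread = (peak_pressure.max(axis=0) - peak_pressure.min(axis=0)) / mean * 100.0
      for d, s in zip(distance, spread):
          print(f"r = {d:5.1f} R_imp: inter-code spread {s:4.1f}%")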

  8. Unsteady flow simulations around complex geometries using stationary or rotating unstructured grids

    NASA Astrophysics Data System (ADS)

    Sezer-Uzol, Nilay

    In this research, the computational analysis of three-dimensional, unsteady, separated, vortical flows around complex geometries is studied by using stationary or moving unstructured grids. Two main engineering problems are investigated. The first problem is the unsteady simulation of a ship airwake, where helicopter operations become even more challenging, by using stationary unstructured grids. The second problem is the unsteady simulation of wind turbine rotor flow fields by using moving unstructured grids which are rotating with the whole three-dimensional rigid rotor geometry. The three dimensional, unsteady, parallel, unstructured, finite volume flow solver, PUMA2, is used for the computational fluid dynamics (CFD) simulations considered in this research. The code is modified to have a moving grid capability to perform three-dimensional, time-dependent rotor simulations. An instantaneous log-law wall model for Large Eddy Simulations is also implemented in PUMA2 to investigate the very large Reynolds number flow fields of rotating blades. To verify the code modifications, several sample test cases are also considered. In addition, interdisciplinary studies, which are aiming to provide new tools and insights to the aerospace and wind energy scientific communities, are done during this research by focusing on the coupling of ship airwake CFD simulations with the helicopter flight dynamics and control analysis, the coupling of wind turbine rotor CFD simulations with the aeroacoustic analysis, and the analysis of these time-dependent and large-scale CFD simulations with the help of a computational monitoring, steering and visualization tool, POSSE.

  9. Bottled SAFT: A Web App Providing SAFT-γ Mie Force Field Parameters for Thousands of Molecular Fluids.

    PubMed

    Ervik, Åsmund; Mejía, Andrés; Müller, Erich A

    2016-09-26

    Coarse-grained molecular simulation has become a popular tool for modeling simple and complex fluids alike. The defining aspects of a coarse grained model are the force field parameters, which must be determined for each particular fluid. Because the number of molecular fluids of interest in nature and in engineering processes is immense, constructing force field parameter tables by individually fitting to experimental data is a futile task. A step toward solving this challenge was taken recently by Mejía et al., who proposed a correlation that provides SAFT-γ Mie force field parameters for a fluid provided one knows the critical temperature, the acentric factor and a liquid density, all relatively accessible properties. Building on this, we have applied the correlation to more than 6000 fluids, and constructed a web application, called "Bottled SAFT", which makes this data set easily searchable by CAS number, name or chemical formula. Alternatively, the application allows the user to calculate parameters for components not present in the database. Once the intermolecular potential has been found through Bottled SAFT, code snippets are provided for simulating the desired substance using the "raaSAFT" framework, which leverages established molecular dynamics codes to run the simulations. The code underlying the web application is written in Python using the Flask microframework; this allows us to provide a modern high-performance web app while also making use of the scientific libraries available in Python. Bottled SAFT aims at taking the complexity out of obtaining force field parameters for a wide range of molecular fluids, and facilitates setting up and running coarse-grained molecular simulations. The web application is freely available at http://www.bottledsaft.org . The underlying source code is available on Bitbucket under a permissive license.
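
    A hypothetical sketch of the kind of Flask route such an app needs (this is not the actual Bottled SAFT source, and the parameter values are placeholders): look up pre-computed SAFT-γ Mie parameters by CAS number from a local table and return them as JSON.

      # Hypothetical Flask route: search a local parameter table by CAS number.
      from flask import Flask, jsonify, abort

      app = Flask(__name__)

      # Placeholder parameter table; values are illustrative, not fitted results.
      PARAMS = {
          "110-54-3": {"name": "n-hexane", "segments": 2.0, "sigma_A": 4.51,
                       "epsilon_K": 376.4, "lambda_r": 19.6},
      }

      @app.route("/api/cas/<cas_number>")
      def by_cas(cas_number):
          entry = PARAMS.get(cas_number)
          if entry is None:
              abort(404)
          return jsonify(entry)

      # Run locally with app.run(debug=True), then GET /api/cas/110-54-3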

  10. A PICKSC Science Gateway for enabling the common plasma physicist to run kinetic software

    NASA Astrophysics Data System (ADS)

    Hu, Q.; Winjum, B. J.; Zonca, A.; Youn, C.; Tsung, F. S.; Mori, W. B.

    2017-10-01

    Computer simulations offer tremendous opportunities for studying plasmas, ranging from simulations for students that illuminate fundamental educational concepts to research-level simulations that advance scientific knowledge. Nevertheless, there is a significant hurdle to using simulation tools. Users must navigate codes and software libraries, determine how to wrangle output into meaningful plots, and oftentimes confront a significant cyberinfrastructure with powerful computational resources. Science gateways offer a Web-based environment to run simulations without needing to learn or manage the underlying software and computing cyberinfrastructure. We discuss our progress on creating a Science Gateway for the Particle-in-Cell and Kinetic Simulation Software Center that enables users to easily run and analyze kinetic simulations with our software. We envision that this technology could benefit a wide range of plasma physicists, both in the use of our simulation tools as well as in its adaptation for running other plasma simulation software. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.

  11. Forward and adjoint spectral-element simulations of seismic wave propagation using hardware accelerators

    NASA Astrophysics Data System (ADS)

    Peter, Daniel; Videau, Brice; Pouget, Kevin; Komatitsch, Dimitri

    2015-04-01

    Improving the resolution of tomographic images is crucial to answer important questions on the nature of Earth's subsurface structure and internal processes. Seismic tomography is the most prominent approach where seismic signals from ground-motion records are used to infer physical properties of internal structures such as compressional- and shear-wave speeds, anisotropy and attenuation. Recent advances in regional- and global-scale seismic inversions move towards full-waveform inversions which require accurate simulations of seismic wave propagation in complex 3D media, providing access to the full 3D seismic wavefields. However, these numerical simulations are computationally very expensive and need high-performance computing (HPC) facilities for further improving the current state of knowledge. During recent years, many-core architectures such as graphics processing units (GPUs) have been added to available large HPC systems. Such GPU-accelerated computing together with advances in multi-core central processing units (CPUs) can greatly accelerate scientific applications. There are mainly two possible choices of language support for GPU cards, the CUDA programming environment and OpenCL language standard. CUDA software development targets NVIDIA graphic cards while OpenCL was adopted mainly by AMD graphic cards. In order to employ such hardware accelerators for seismic wave propagation simulations, we incorporated a code generation tool BOAST into an existing spectral-element code package SPECFEM3D_GLOBE. This allows us to use meta-programming of computational kernels and generate optimized source code for both CUDA and OpenCL languages, running simulations on either CUDA or OpenCL hardware accelerators. We show here applications of forward and adjoint seismic wave propagation on CUDA/OpenCL GPUs, validating results and comparing performances for different simulations and hardware usages.

  12. NEAMS Update. Quarterly Report for October - December 2011.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, K.

    2012-02-16

    The Advanced Modeling and Simulation Office within the DOE Office of Nuclear Energy (NE) has been charged with revolutionizing the design tools used to build nuclear power plants during the next 10 years. To accomplish this, the DOE has brought together the national laboratories, U.S. universities, and the nuclear energy industry to establish the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Program. The mission of NEAMS is to modernize computer modeling of nuclear energy systems and improve the fidelity and validity of modeling results using contemporary software environments and high-performance computers. NEAMS will create a set of engineering-level codes aimed at designing and analyzing the performance and safety of nuclear power plants and reactor fuels. The truly predictive nature of these codes will be achieved by modeling the governing phenomena at the spatial and temporal scales that dominate the behavior. These codes will be executed within a simulation environment that orchestrates code integration with respect to spatial meshing, computational resources, and execution to give the user a common 'look and feel' for setting up problems and displaying results. NEAMS is building upon a suite of existing simulation tools, including those developed by the federal Scientific Discovery through Advanced Computing and Advanced Simulation and Computing programs. NEAMS also draws upon existing simulation tools for materials and nuclear systems, although many of these are limited in terms of scale, applicability, and portability (their ability to be integrated into contemporary software and hardware architectures). NEAMS investments have directly and indirectly supported additional NE research and development programs, including those devoted to waste repositories, safeguarded separations systems, and long-term storage of used nuclear fuel. NEAMS is organized into two broad efforts, each comprising four elements. The quarterly highlights for October-December 2011 are: (1) Version 1.0 of AMP, the fuel assembly performance code, was tested on the JAGUAR supercomputer and released on November 1, 2011; a detailed discussion of this new simulation tool is given; (2) A coolant sub-channel model and a preliminary UO2 smeared-cracking model were implemented in BISON, the single-pin fuel code; more information on how these models were developed and benchmarked is given; (3) The Object Kinetic Monte Carlo model was implemented to account for nucleation events in meso-scale simulations, and a discussion of the significance of this advance is given; (4) The SHARP neutronics module, PROTEUS, was expanded to be applicable to all types of reactors, and a discussion of the importance of PROTEUS is given; (5) A plan has been finalized for integrating the high-fidelity, three-dimensional reactor code SHARP with both the systems-level code RELAP7 and the fuel assembly code AMP; this is a new initiative; (6) Work began to evaluate the applicability of AMP to the problem of dry storage of used fuel and to define a relevant problem to test the applicability; (7) A code to obtain phonon spectra from the force-constant matrix for a crystalline lattice has been completed; this important bridge between subcontinuum and continuum phenomena is discussed; (8) Benchmarking was begun on the meso-scale, finite-element fuels code MARMOT to validate its new variable splitting algorithm; (9) A very computationally demanding simulation of diffusion-driven nucleation of new microstructural features has been completed, and an explanation of the difficulty of this simulation is given; (10) Experiments were conducted with deformed steel to validate a crystal plasticity finite-element code for body-centered cubic iron; (11) The Capability Transfer Roadmap was completed and published as an internal laboratory technical report; (12) The AMP fuel assembly code input generator was integrated into the NEAMS Integrated Computational Environment (NiCE), and more details on the planned NEAMS computing environment are given; and (13) The NEAMS program website (neams.energy.gov) is nearly ready to launch.

  13. Scientific Studies of the High-Latitude Ionosphere with the Ionosphere Dynamics and ElectroDynamics - Data Assimilation (IDED-DA) Model

    DTIC Science & Technology

    2014-09-23

    conduct simulations with a high-latitude data assimilation model. The specific objectives are to study magnetosphere-ionosphere (M-I) coupling processes...based on three physics-based models, including a magnetosphere-ionosphere (M-I) electrodynamics model, an ionosphere model, and a magnetic...inversion code. The ionosphere model is a high-resolution version of the Ionosphere Forecast Model (IFM), which is a 3-D, multi-ion model of the ionosphere

  14. SCEC Earthquake System Science Using High Performance Computing

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Jordan, T. H.; Archuleta, R.; Beroza, G.; Bielak, J.; Chen, P.; Cui, Y.; Day, S.; Deelman, E.; Graves, R. W.; Minster, J. B.; Olsen, K. B.

    2008-12-01

    The SCEC Community Modeling Environment (SCEC/CME) collaboration performs basic scientific research using high performance computing with the goal of developing a predictive understanding of earthquake processes and seismic hazards in California. SCEC/CME research areas include dynamic rupture modeling, wave propagation modeling, probabilistic seismic hazard analysis (PSHA), and full 3D tomography. SCEC/CME computational capabilities are organized around the development and application of robust, re-usable, well-validated simulation systems we call computational platforms. The SCEC earthquake system science research program includes a wide range of numerical modeling efforts, and we continue to extend our numerical modeling codes to include more realistic physics and to run at higher and higher resolution. During this year, the SCEC/USGS OpenSHA PSHA computational platform was used to calculate PSHA hazard curves and hazard maps using the new UCERF2.0 ERF and new 2008 attenuation relationships. Three SCEC/CME modeling groups ran 1 Hz ShakeOut simulations using different codes and computer systems and carefully compared the results. The DynaShake Platform was used to calculate several dynamic rupture-based source descriptions equivalent in magnitude and final surface slip to the ShakeOut 1.2 kinematic source description. A SCEC/CME modeler produced 10 Hz synthetic seismograms for the ShakeOut 1.2 scenario rupture by combining 1 Hz deterministic simulation results with 10 Hz stochastic seismograms. SCEC/CME modelers ran an ensemble of seven ShakeOut-D simulations to investigate the variability of ground motions produced by dynamic rupture-based source descriptions. The CyberShake Platform was used to calculate more than 15 new probabilistic seismic hazard analysis (PSHA) hazard curves using full 3D waveform modeling and the new UCERF2.0 ERF. The SCEC/CME group has also produced significant computer science results this year. Large-scale SCEC/CME high performance codes were run on NSF TeraGrid sites, including simulations that used the full PSC Big Ben supercomputer (4096 cores) and simulations that ran on more than 10K cores at TACC Ranger. The SCEC/CME group used scientific workflow tools and grid computing to run more than 1.5 million jobs at NCSA for the CyberShake project. Visualizations of the 10 Hz ShakeOut 1.2 scenario simulation data produced by a SCEC/CME researcher were used by the USGS in ShakeOut publications and public outreach efforts. OpenSHA was ported onto an NSF supercomputer and was used to produce very high-resolution PSHA maps that contained more than 1.6 million hazard curves.

  15. Publicly Releasing a Large Simulation Dataset with NDS Labs

    NASA Astrophysics Data System (ADS)

    Goldbaum, Nathan

    2016-03-01

    Optimally, all publicly funded research should be accompanied by the tools, code, and data necessary to fully reproduce the analysis performed in journal articles describing the research. This ideal can be difficult to attain, particularly when dealing with large (>10 TB) simulation datasets. In this lightning talk, we describe the process of publicly releasing a large simulation dataset to accompany the submission of a journal article. The simulation was performed using Enzo, an open source, community-developed N-body/hydrodynamics code and was analyzed using a wide range of community-developed tools in the scientific Python ecosystem. Although the simulation was performed and analyzed using an ecosystem of sustainably developed tools, we enable sustainable science using our data by making it publicly available. Combining the data release with the NDS Labs infrastructure allows a substantial amount of added value, including web-based access to analysis and visualization using the yt analysis package through an IPython notebook interface. In addition, we are able to accompany the paper submission to the arXiv preprint server with links to the raw simulation data as well as interactive real-time data visualizations that readers can explore on their own or share with colleagues during journal club discussions. It is our hope that the value added by these services will substantially increase the impact and readership of the paper.
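
    As a flavor of the kind of web-based analysis such a release enables, the sketch below uses the public yt interface to open one Enzo output and make a quick-look density slice; the dataset path is a placeholder, not the actual released file layout.

        # Minimal sketch of exploring an Enzo output with yt (the dataset path
        # is hypothetical; any Enzo parameter file from the release would do).
        import yt

        ds = yt.load("DD0046/DD0046")          # load one Enzo snapshot
        ad = ds.all_data()
        print(ds.domain_width.to("kpc"))       # basic dataset metadata
        print(ad.quantities.total_mass())      # total gas + particle mass

        # Quick-look visualization of the gas density field
        slc = yt.SlicePlot(ds, "z", ("gas", "density"))
        slc.save("density_slice.png")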

  16. Numerical ‘health check’ for scientific codes: the CADNA approach

    NASA Astrophysics Data System (ADS)

    Scott, N. S.; Jézéquel, F.; Denis, C.; Chesneaux, J.-M.

    2007-04-01

    Scientific computation has unavoidable approximations built into its very fabric. One important source of error that is difficult to detect and control is round-off error propagation which originates from the use of finite precision arithmetic. We propose that there is a need to perform regular numerical 'health checks' on scientific codes in order to detect the cancerous effect of round-off error propagation. This is particularly important in scientific codes that are built on legacy software. We advocate the use of the CADNA library as a suitable numerical screening tool. We present a case study to illustrate the practical use of CADNA in scientific codes that are of interest to the Computer Physics Communications readership. In doing so we hope to stimulate a greater awareness of round-off error propagation and present a practical means by which it can be analyzed and managed.
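
    CADNA itself is a Fortran/C++ library implementing Discrete Stochastic Arithmetic; the toy Python sketch below only illustrates the underlying idea, namely repeating a computation with randomly perturbed rounding and estimating how many digits survive, on a cancellation-prone Taylor-series evaluation.

        # Toy illustration of the Discrete Stochastic Arithmetic idea behind
        # CADNA (not CADNA itself): perturb every intermediate result at the
        # level of one unit in the last place, repeat the computation, and
        # estimate the number of reliable digits from the scatter.
        import math
        import random

        def perturb(x: float) -> float:
            """Randomly shift x by roughly one ulp to mimic a rounding choice."""
            return x * (1.0 + random.choice((-1.0, 1.0)) * 1.1e-16)

        def unstable_sum(n_terms: int = 80) -> float:
            """A cancellation-prone evaluation of exp(-20) via its Taylor series."""
            x, total, term = -20.0, 1.0, 1.0
            for k in range(1, n_terms):
                term = perturb(term * x / k)
                total = perturb(total + term)
            return total

        samples = [unstable_sum() for _ in range(10)]
        mean = sum(samples) / len(samples)
        spread = max(samples) - min(samples)
        digits = -math.log10(abs(spread / mean)) if spread and mean else 16.0
        print(f"mean = {mean:.6e}, estimated reliable digits ~ {digits:.1f}")
        print(f"exact exp(-20) = {math.exp(-20):.6e}")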

  17. Approaching the investigation of plasma turbulence through a rigorous verification and validation procedure: A practical example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricci, P., E-mail: paolo.ricci@epfl.ch; Riva, F.; Theiler, C.

    In the present work, a Verification and Validation procedure is presented and applied showing, through a practical example, how it can contribute to advancing our physics understanding of plasma turbulence. Bridging the gap between plasma physics and other scientific domains, in particular the computational fluid dynamics community, a rigorous methodology for the verification of a plasma simulation code is presented, based on the method of manufactured solutions. This methodology verifies that the model equations are correctly solved, within the order of accuracy of the numerical scheme. The technique to carry out a solution verification is described to provide a rigorous estimate of the uncertainty affecting the numerical results. A methodology for plasma turbulence code validation is also discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The Verification and Validation methodology is then applied to the study of plasma turbulence in the basic plasma physics experiment TORPEX [Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulations carried out with the GBS code [Ricci et al., Plasma Phys. Controlled Fusion 54, 124047 (2012)]. The validation procedure enables progress in the understanding of the turbulent dynamics in TORPEX by pinpointing the presence of a turbulent regime transition due to the competition between the resistive and ideal interchange instabilities.
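
    The method of manufactured solutions mentioned above can be illustrated on a trivial problem (a generic sketch, unrelated to the GBS code): choose an analytic solution, derive the source term it implies, and confirm that the numerical error decreases at the scheme's theoretical order.

        # Method of manufactured solutions on -u''(x) = f(x), u(0)=u(1)=0,
        # using second-order central differences. Manufactured solution:
        # u(x) = sin(pi x)  =>  f(x) = pi^2 sin(pi x).
        import numpy as np

        def solve(n: int) -> float:
            """Return the max-norm error on a grid with n interior points."""
            h = 1.0 / (n + 1)
            x = np.linspace(h, 1.0 - h, n)
            f = np.pi**2 * np.sin(np.pi * x)
            # Tridiagonal matrix of the -u'' operator
            A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
                 - np.diag(np.ones(n - 1), -1)) / h**2
            u = np.linalg.solve(A, f)
            return np.max(np.abs(u - np.sin(np.pi * x)))

        # Halve the grid spacing (h = 1/40 then h = 1/80) and measure the order
        e_coarse, e_fine = solve(39), solve(79)
        print(f"observed order of accuracy: {np.log2(e_coarse / e_fine):.2f}")
        # A value close to 2 verifies that the discretization is coded correctly.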

  18. A system for environmental model coupling and code reuse: The Great Rivers Project

    NASA Astrophysics Data System (ADS)

    Eckman, B.; Rice, J.; Treinish, L.; Barford, C.

    2008-12-01

    As part of the Great Rivers Project, IBM is collaborating with The Nature Conservancy and the Center for Sustainability and the Global Environment (SAGE) at the University of Wisconsin, Madison to build a Modeling Framework and Decision Support System (DSS) designed to help policy makers and a variety of stakeholders (farmers, fish & wildlife managers, hydropower operators, et al.) to assess, come to consensus, and act on land use decisions representing effective compromises between human use and ecosystem preservation/restoration. Initially focused on Brazil's Paraguay-Parana, China's Yangtze, and the Mississippi Basin in the US, the DSS integrates data and models from a wide variety of environmental sectors, including water balance, water quality, carbon balance, crop production, hydropower, and biodiversity. In this presentation we focus on the modeling framework aspect of this project. In our approach to these and other environmental modeling projects, we view a flexible, extensible modeling framework infrastructure for defining and running multi-step analytic simulations as critical. In this framework, we divide monolithic models into atomic components with clearly defined semantics encoded via rich metadata representation. Once models and their semantics and composition rules have been registered with the system by their authors or other experts, non-expert users may construct simulations as workflows of these atomic model components. A model composition engine enforces rules/constraints for composing model components into simulations, to avoid the creation of Frankenmodels: models that execute but produce scientifically invalid results. A common software environment and common representations of data and models are required, as well as an adapter strategy for code written in, e.g., Fortran or Python, that still enables efficient simulation runs, including parallelization. Since each new simulation, as a new composition of model components, requires calibration of parameters (fudge factors) to produce scientifically valid results, we are also developing an autocalibration engine. Finally, visualization is a key element of this modeling framework strategy, both to convey complex scientific data effectively and to enable non-expert users to make full use of the relevant features of the framework. We are developing a visualization environment with a strong data model, to enable visualizations, model results, and data all to be handled similarly.
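
    A minimal sketch of the kind of composition rule described above (component names and units are hypothetical, and this is not the IBM/SAGE framework itself): each atomic component declares typed inputs and outputs, and the engine refuses to chain components whose interfaces do not match, which is what keeps "Frankenmodels" from being built.

        # Sketch of a model-composition check: components declare their inputs
        # and outputs (name -> unit), and a chain is accepted only if every
        # downstream input is satisfied by an upstream output.
        from dataclasses import dataclass, field

        @dataclass
        class Component:
            name: str
            inputs: dict = field(default_factory=dict)   # e.g. {"runoff": "mm/day"}
            outputs: dict = field(default_factory=dict)

        def validate_chain(chain):
            """Raise ValueError if the workflow would be a 'Frankenmodel'."""
            available = {}
            for comp in chain:
                for var, unit in comp.inputs.items():
                    if available.get(var) != unit:
                        raise ValueError(
                            f"{comp.name}: input '{var}' ({unit}) not provided upstream")
                available.update(comp.outputs)

        # Hypothetical atomic components
        water_balance = Component("water_balance", outputs={"runoff": "mm/day"})
        water_quality = Component("water_quality",
                                  inputs={"runoff": "mm/day"},
                                  outputs={"nitrate_load": "kg/day"})

        validate_chain([water_balance, water_quality])       # accepted
        try:
            validate_chain([water_quality, water_balance])   # wrong order
        except ValueError as err:
            print("rejected:", err)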

  19. The SCEC Broadband Platform: A Collaborative Open-Source Software Package for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Silva, F.; Maechling, P. J.; Goulet, C.; Somerville, P.; Jordan, T. H.

    2013-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving SCEC researchers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform is open-source scientific software that can generate broadband (0-100 Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low- and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Broadband Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms of a historical earthquake for which observed strong ground motion data is available. Also in validation mode, the Broadband Platform calculates a number of goodness-of-fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. During the past year, we have modified the software to enable the addition of a large number of historical events, and we are now adding validation simulation inputs and observational data for 23 historical events covering the Eastern and Western United States, Japan, Taiwan, Turkey, and Italy. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. By establishing an interface between scientific modules with a common set of input and output files, the Broadband Platform facilitates the addition of new scientific methods, which are written by earth scientists in a number of languages such as C, C++, Fortran, and Python. The Broadband Platform's modular design also supports the reuse of existing software modules as building blocks to create new scientific methods. Additionally, the Platform implements a wrapper around each scientific module, converting input and output files to and from the specific formats required (or produced) by individual scientific codes. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. Our latest release includes the addition of 3 new simulation methods and several new data products, such as map and distance-based goodness-of-fit plots. Finally, as the number and complexity of scenarios simulated using the Broadband Platform increase, we have added batching utilities to substantially improve support for running large-scale simulations on computing clusters.

  20. OpenGeoSys: Performance-Oriented Computational Methods for Numerical Modeling of Flow in Large Hydrogeological Systems

    NASA Astrophysics Data System (ADS)

    Naumov, D.; Fischer, T.; Böttcher, N.; Watanabe, N.; Walther, M.; Rink, K.; Bilke, L.; Shao, H.; Kolditz, O.

    2014-12-01

    OpenGeoSys (OGS) is a scientific open source code for numerical simulation of thermo-hydro-mechanical-chemical processes in porous and fractured media. Its basic concept is to provide a flexible numerical framework for solving multi-field problems for applications in geoscience and hydrology, e.g., CO2 storage, geothermal power plant forecast simulation, saltwater intrusion, and water resources management. Advances in computational mathematics have revolutionized the variety and nature of the problems that environmental scientists and engineers can address, and intensive code development in recent years now enables the solution of much larger numerical problems and applications. However, simulating environmental processes along the water cycle at large scales, such as for complete catchments or reservoirs, remains computationally challenging. Therefore, we started a new OGS code development with a focus on execution speed and parallelization. In the new version, a local data structure concept improves the instruction and data cache performance by tightly bundling data with an element-wise numerical integration loop. Dedicated analysis methods enable the investigation of memory-access patterns in the local and global assembler routines, which leads to further data structure optimization for an additional performance gain. The concept is presented together with a technical code analysis of the recent development and a large case study including transient flow simulation in the unsaturated/saturated zone of the Thuringian Syncline, Germany. The analysis is performed on a high-resolution mesh (up to 50M elements) with embedded fault structures.
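
    The element-wise data bundling described above can be caricatured as follows (a generic finite-element assembly sketch, not OGS code): everything one element needs is gathered into a small local bundle, the local matrix is integrated, and only then scattered into the global system.

        # Sketch of an element-wise assembly loop in the spirit described above:
        # per-element data is gathered contiguously, the local matrix is formed,
        # then scattered into the global system. (Illustrative only, not OGS.)
        import numpy as np

        n_nodes = 6
        elements = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]     # 1D mesh
        coords = np.linspace(0.0, 1.0, n_nodes)

        K = np.zeros((n_nodes, n_nodes))                        # global matrix
        for conn in elements:
            # --- local bundle: everything this element needs, kept together ---
            x = coords[list(conn)]
            h = x[1] - x[0]
            k_local = np.array([[1.0, -1.0], [-1.0, 1.0]]) / h  # local integration
            # --- scatter into the global system ---
            for a, ga in enumerate(conn):
                for b, gb in enumerate(conn):
                    K[ga, gb] += k_local[a, b]

        print(K)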

  1. Agricultural Baseline (BL0) scenario of the 2016 Billion-Ton Report

    DOE Data Explorer

    Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000181319328); Hellwinkel, Chad [University of Tennessee, APAC] (ORCID:0000000173085058); Eaton, Laurence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000312709626); Langholtz, Matthew H [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000281537154); Turhollow, Anthony [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000228159350); Brandt, Craig [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000214707379); Myers, Aaron [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000320373827)

    2016-07-13

    Scientific reason for data generation: to serve as the reference case for the BT16 volume 1 agricultural scenarios. The agricultural baseline runs from 2015 through 2040; a starting year of 2014 is used. Date the data set was last modified: 02/12/2016. How each parameter was produced (methods), format, and relationship to other data in the data set: the simulation was developed without offering a farmgate price to energy crops or residues (i.e., building on both the USDA 2015 baseline and the agricultural census data (USDA NASS 2014)). Data generated are .txt output files by year, simulation identifier, and county code (1-3109). Instruments used: POLYSYS (version POLYS2015_V10_alt_JAN22B) supplied by the University of Tennessee APAC. The quality assurance and quality control that have been applied: checks for negative planted area, harvested area, production, yield, and cost values; checks that harvested area does not exceed planted area for annuals; and checks of FIPS codes.
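
    The listed checks are straightforward to script; the sketch below shows one possible form of them (column names are hypothetical, chosen only for illustration, and the toy table stands in for the released .txt files).

        # Sketch of the listed QA/QC checks applied to a POLYSYS-style output
        # table (column names are hypothetical).
        import pandas as pd

        df = pd.DataFrame({
            "fips":       [1001, 1003, 99999],
            "planted":    [120.0, 80.0, 50.0],     # acres
            "harvested":  [118.0, 85.0, 45.0],
            "production": [5000.0, -10.0, 2000.0],
        })

        issues = []
        for col in ("planted", "harvested", "production"):
            bad = df[df[col] < 0]
            if not bad.empty:
                issues.append(f"negative {col} in counties {bad['fips'].tolist()}")

        over = df[df["harvested"] > df["planted"]]
        if not over.empty:
            issues.append(f"harvested > planted in counties {over['fips'].tolist()}")

        valid_fips = {1001, 1003}                  # would come from a county list
        unknown = df[~df["fips"].isin(valid_fips)]
        if not unknown.empty:
            issues.append(f"unknown FIPS codes {unknown['fips'].tolist()}")

        print("\n".join(issues) or "all checks passed")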

  2. A Semantic Analysis Method for Scientific and Engineering Code

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.

    1998-01-01

    This paper develops a procedure to statically analyze aspects of the meaning or semantics of scientific and engineering code. The analysis involves adding semantic declarations to a user's code and parsing this semantic knowledge with the original code using multiple expert parsers. These semantic parsers are designed to recognize formulae in different disciplines including physical and mathematical formulae and geometrical position in a numerical scheme. In practice, a user would submit code with semantic declarations of primitive variables to the analysis procedure, and its semantic parsers would automatically recognize and document some static, semantic concepts and locate some program semantic errors. A prototype implementation of this analysis procedure is demonstrated. Further, the relationship between the fundamental algebraic manipulations of equations and the parsing of expressions is explained. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.
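
    A toy illustration of the idea of semantic declarations (a concept sketch only; the paper's multi-parser analysis is far richer): once primitive variables carry declared physical dimensions, a checker can document consistent formulae and flag dimensionally inconsistent ones.

        # Toy illustration of checking formulae against declared semantics:
        # each variable carries dimensions (mass, length, time exponents), and
        # addition of quantities with different dimensions is flagged.
        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Quantity:
            value: float
            dims: tuple  # (mass, length, time)

            def __mul__(self, other):
                return Quantity(self.value * other.value,
                                tuple(a + b for a, b in zip(self.dims, other.dims)))

            def __add__(self, other):
                if self.dims != other.dims:
                    raise TypeError(f"dimension mismatch: {self.dims} vs {other.dims}")
                return Quantity(self.value + other.value, self.dims)

        # Semantic declarations for the primitive variables
        rho = Quantity(1.2, (1, -3, 0))    # density  [kg m^-3]
        u   = Quantity(30.0, (0, 1, -1))   # velocity [m s^-1]

        momentum_flux = rho * u * u        # dimensions (1, -1, -2): consistent
        try:
            nonsense = rho + u             # caught: density plus velocity
        except TypeError as err:
            print("semantic error:", err)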

  3. Development of Spectral and Atomic Models for Diagnosing Energetic Particle Characteristics in Fast Ignition Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacFarlane, Joseph J.; Golovkin, I. E.; Woodruff, P. R.

    2009-08-07

    This Final Report summarizes work performed under DOE STTR Phase II Grant No. DE-FG02-05ER86258 during the project period from August 2006 to August 2009. The project, “Development of Spectral and Atomic Models for Diagnosing Energetic Particle Characteristics in Fast Ignition Experiments,” was led by Prism Computational Sciences (Madison, WI), and involved collaboration with subcontractors University of Nevada-Reno and Voss Scientific (Albuquerque, NM). In this project, we have: Developed and implemented a multi-dimensional, multi-frequency radiation transport model in the LSP hybrid fluid-PIC (particle-in-cell) code [1,2]. Updated the LSP code to support the use of accurate equation-of-state (EOS) tables generated by Prism’s PROPACEOS [3] code to compute more accurate temperatures in high energy density physics (HEDP) plasmas. Updated LSP to support the use of Prism’s multi-frequency opacity tables. Generated equation of state and opacity data for LSP simulations for several materials being used in plasma jet experimental studies. Developed and implemented parallel processing techniques for the radiation physics algorithms in LSP. Benchmarked the new radiation transport and radiation physics algorithms in LSP and compared simulation results with analytic solutions and results from numerical radiation-hydrodynamics calculations. Performed simulations using Prism radiation physics codes to address issues related to radiative cooling and ionization dynamics in plasma jet experiments. Performed simulations to study the effects of radiation transport and radiation losses due to electrode contaminants in plasma jet experiments. Updated the LSP code to generate output using NetCDF to provide a better, more flexible interface to SPECT3D [4] in order to post-process LSP output. Updated the SPECT3D code to better support the post-processing of large-scale 2-D and 3-D datasets generated by simulation codes such as LSP. Updated atomic physics modeling to provide for more comprehensive and accurate atomic databases that feed into the radiation physics modeling (spectral simulations and opacity tables). Developed polarization spectroscopy modeling techniques suitable for diagnosing energetic particle characteristics in HEDP experiments. A description of these items is provided in this report. The above efforts lay the groundwork for utilizing the LSP and SPECT3D codes in providing simulation support for DOE-sponsored HEDP experiments, such as plasma jet and fast ignition physics experiments. We believe that taken together, the LSP and SPECT3D codes have unique capabilities for advancing our understanding of the physics of these HEDP plasmas. Based on conversations early in this project with our DOE program manager, Dr. Francis Thio, our efforts emphasized developing radiation physics and atomic modeling capabilities that can be utilized in the LSP PIC code, and performing radiation physics studies for plasma jets. A relatively minor component focused on the development of methods to diagnose energetic particle characteristics in short-pulse laser experiments related to fast ignition physics. The period of performance for the grant was extended by one year to August 2009 with a one-year no-cost extension, at the request of subcontractor University of Nevada-Reno.

  4. Simulating Hydrologic Flow and Reactive Transport with PFLOTRAN and PETSc on Emerging Fine-Grained Parallel Computer Architectures

    NASA Astrophysics Data System (ADS)

    Mills, R. T.; Rupp, K.; Smith, B. F.; Brown, J.; Knepley, M.; Zhang, H.; Adams, M.; Hammond, G. E.

    2017-12-01

    As the high-performance computing community pushes towards the exascale horizon, power and heat considerations have driven the increasing importance and prevalence of fine-grained parallelism in new computer architectures. High-performance computing centers have become increasingly reliant on GPGPU accelerators and "manycore" processors such as the Intel Xeon Phi line, and 512-bit SIMD registers have even been introduced in the latest generation of Intel's mainstream Xeon server processors. The high degree of fine-grained parallelism and more complicated memory hierarchy considerations of such "manycore" processors present several challenges to existing scientific software. Here, we consider how the massively parallel, open-source hydrologic flow and reactive transport code PFLOTRAN - and the underlying Portable, Extensible Toolkit for Scientific Computation (PETSc) library on which it is built - can best take advantage of such architectures. We will discuss some key features of these novel architectures and our code optimizations and algorithmic developments targeted at them, and present experiences drawn from working with a wide range of PFLOTRAN benchmark problems on these architectures.

  5. Comparison of PASCAL and FORTRAN for solving problems in the physical sciences

    NASA Technical Reports Server (NTRS)

    Watson, V. R.

    1981-01-01

    The paper compares PASCAL and FORTRAN for problem solving in the physical sciences, prompted by requests NASA has received to make PASCAL available on the Numerical Aerodynamic Simulator (scheduled to be operational in 1986). PASCAL disadvantages include the lack of scientific utility procedures equivalent to the IBM scientific subroutine package or the IMSL package, which are available in FORTRAN. Advantages include well-organized code that is easy to read and maintain, range checking to prevent errors, and a broad selection of data types. It is concluded that FORTRAN may be the better language, although ADA (patterned after PASCAL) may surpass FORTRAN due to its ability to add complex and vector math and to specify the precision and range of variables.

  6. Design Considerations of a Virtual Laboratory for Advanced X-ray Sources

    NASA Astrophysics Data System (ADS)

    Luginsland, J. W.; Frese, M. H.; Frese, S. D.; Watrous, J. J.; Heileman, G. L.

    2004-11-01

    The field of scientific computation has greatly advanced in the last few years, resulting in the ability to perform complex computer simulations that can predict the performance of real-world experiments in a number of fields of study. Among the forces driving this new computational capability is the advent of parallel algorithms, allowing calculations in three-dimensional space with realistic time scales. Electromagnetic radiation sources driven by high-voltage, high-current electron beams offer an area in which to further push the state of the art in high-fidelity, first-principles simulation tools. The physics of these x-ray sources combines kinetic plasma physics (electron beams) with dense fluid-like plasma physics (anode plasmas) and x-ray generation (bremsstrahlung). There are a number of mature techniques and software packages for dealing with the individual aspects of these sources, such as Particle-In-Cell (PIC), Magneto-Hydrodynamics (MHD), and radiation transport codes. The current effort is focused on developing an object-oriented software environment using the Rational Unified Process and the Unified Modeling Language (UML) to provide a framework in which multiple 3D parallel physics packages, such as a PIC code (ICEPIC), an MHD code (MACH), and an x-ray transport code (ITS), can co-exist in a system-of-systems approach to modeling advanced x-ray sources. Initial software design and assessments of the fidelity of the various physics algorithms will be presented.

  7. MeshVoro: A Three-Dimensional Voronoi Mesh Building Tool for the TOUGH Family of Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeman, C. M.; Boyle, K. L.; Reagan, M.

    2013-09-30

    Few tools exist for creating and visualizing complex three-dimensional simulation meshes, and these have limitations that restrict their application to particular geometries and circumstances. Mesh generation needs to trend toward ever more general applications. To that end, we have developed MeshVoro, a tool that is based on the Voro (Rycroft 2009) library and is capable of generating complex three-dimensional Voronoi tessellation-based (unstructured) meshes for the solution of problems of flow and transport in subsurface geologic media that are addressed by the TOUGH (Pruess et al. 1999) family of codes. MeshVoro, which includes built-in data visualization routines, is a particularly useful tool because it extends the applicability of the TOUGH family of codes by enabling the scientifically robust and relatively easy discretization of systems with challenging 3D geometries. We describe several applications of MeshVoro. We illustrate the ability of the tool to straightforwardly transform a complex geological grid into a simulation mesh that conforms to the specifications of the TOUGH family of codes. We demonstrate how MeshVoro can describe complex system geometries with a relatively small number of grid blocks, and we construct meshes for geometries that would have been practically intractable with a standard Cartesian grid approach. We also discuss the limitations and appropriate applications of this new technology.
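
    For readers unfamiliar with Voronoi tessellations, the short sketch below shows the basic tessellation step in 2D using SciPy; MeshVoro itself builds three-dimensional tessellations on the Voro library and writes TOUGH-format meshes, so this is only a generic illustration, not MeshVoro.

        # A 2D taste of the Voronoi tessellation underlying unstructured meshing
        # (generic SciPy illustration, not MeshVoro itself).
        import numpy as np
        from scipy.spatial import Voronoi

        rng = np.random.default_rng(0)
        seeds = rng.random((20, 2))          # generator points ("grid block" centers)
        vor = Voronoi(seeds)

        print(f"{len(seeds)} seeds -> {len(vor.vertices)} Voronoi vertices")
        for i, region_index in enumerate(vor.point_region[:3]):
            region = vor.regions[region_index]
            bounded = -1 not in region       # unbounded cells contain index -1
            print(f"cell {i}: {len(region)} vertices, bounded = {bounded}")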

  8. General purpose molecular dynamics simulations fully implemented on graphics processing units

    NASA Astrophysics Data System (ADS)

    Anderson, Joshua A.; Lorenz, Chris D.; Travesset, A.

    2008-05-01

    Graphics processing units (GPUs), originally developed for rendering real-time effects in computer games, now provide unprecedented computational power for scientific applications. In this paper, we develop a general purpose molecular dynamics code that runs entirely on a single GPU. It is shown that our GPU implementation provides performance equivalent to that of a fast 30-processor-core distributed memory cluster. Our results show that GPUs already provide an inexpensive alternative to such clusters; we discuss the implications for the future.

  9. Ethical conduct for research : a code of scientific ethics

    Treesearch

    Marcia Patton-Mallory; Kathleen Franzreb; Charles Carll; Richard Cline

    2000-01-01

    The USDA Forest Service recently developed and adopted a code of ethical conduct for scientific research and development. The code addresses issues related to research misconduct, such as fabrication, falsification, or plagiarism in proposing, performing, or reviewing research or in reporting research results, as well as issues related to professional misconduct, such...

  10. Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.

    2013-12-01

    Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion while the same code base can be used as a third-party library to provide hydrologic flow, energy transport, and biogeochemical capability to the community land model, CLM, part of the open-source community earth system model (CESM) for climate. In this presentation, the advantages and disadvantages of open source software development in support of geoscience research at government laboratories, universities, and the private sector are discussed. Since the code is open-source (i.e. it's transparent and readily available to competitors), the PFLOTRAN team's development strategy within a competitive research environment is presented. Finally, the developers discuss their approach to object-oriented programming and the leveraging of modern Fortran in support of collaborative geoscience research as the Fortran standard evolves among compiler vendors.

  11. Real simulation tools in introductory courses: packaging and repurposing our research code.

    NASA Astrophysics Data System (ADS)

    Heagy, L. J.; Cockett, R.; Kang, S.; Oldenburg, D.

    2015-12-01

    Numerical simulations are an important tool for scientific research and applications in industry. They provide a means to experiment with physics in a tangible, visual way, often providing insights into the problem. Over the last two years, we have been developing course and laboratory materials for an undergraduate geophysics course primarily taken by non-geophysics majors, including engineers and geologists. Our aim is to provide the students with resources to build intuition about geophysical techniques, promote curiosity driven exploration, and help them develop the skills necessary to communicate across disciplines. Using open-source resources and our existing research code, we have built modules around simulations, with supporting content to give student interactive tools for exploration into the impacts of input parameters and visualization of the resulting fields, fluxes and data for a variety of problems in applied geophysics, including magnetics, seismic, electromagnetics, and direct current resistivity. The content provides context for the problems, along with exercises that are aimed at getting students to experiment and ask 'what if...?' questions. In this presentation, we will discuss our approach for designing the structure of the simulation-based modules, the resources we have used, challenges we have encountered, general feedback from students and instructors, as well as our goals and roadmap for future improvement. We hope that our experiences and approach will be beneficial to other instructors who aim to put simulation tools in the hands of students.

  12. SIM_EXPLORE: Software for Directed Exploration of Complex Systems

    NASA Technical Reports Server (NTRS)

    Burl, Michael; Wang, Esther; Enke, Brian; Merline, William J.

    2013-01-01

    Physics-based numerical simulation codes are widely used in science and engineering to model complex systems that would be infeasible to study otherwise. While such codes may provide the highest-fidelity representation of system behavior, they are often so slow to run that insight into the system is limited. Trying to understand the effects of inputs on outputs by conducting an exhaustive grid-based sweep over the input parameter space is simply too time-consuming. An alternative approach called "directed exploration" has been developed to harvest information from numerical simulators more efficiently. The basic idea is to employ active learning and supervised machine learning to cleverly choose at each step which simulation trials to run next, based on the results of previous trials. SIM_EXPLORE is a new computer program that uses directed exploration to efficiently explore complex systems represented by numerical simulations. The software sequentially identifies and runs simulation trials that it believes will be most informative given the results of previous trials. The results of new trials are incorporated into the software's model of the system behavior. The updated model is then used to pick the next round of new trials. This process, implemented as a closed-loop system wrapped around existing simulation code, provides a means to improve the speed and efficiency with which a set of simulations can yield scientifically useful results. The software focuses on the case in which the feedback from the simulation trials is binary-valued, i.e., the learner is only informed of the success or failure of the simulation trial to produce a desired output. The software offers a number of choices for the supervised learning algorithm (the method used to model the system behavior given the results so far) and a number of choices for the active learning strategy (the method used to choose which new simulation trials to run given the current behavior model). The software also makes use of the LEGION distributed computing framework to leverage the power of a set of compute nodes. The approach has been demonstrated on a planetary science application in which numerical simulations are used to study the formation of asteroid families.
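
    The closed-loop idea can be sketched in a few lines (an illustration of directed exploration with binary feedback, not SIM_EXPLORE itself): a simple surrogate model is refit after every trial and the next trial is chosen where the predicted success probability is most ambiguous; the "simulator" here is a toy stand-in for a physics code.

        # Sketch of a directed-exploration loop with binary (success/failure)
        # feedback, using a logistic-regression surrogate as one possible
        # supervised learner and "most ambiguous candidate" as the strategy.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def simulator(x):
            """Toy binary outcome: 'success' inside a band of parameter space."""
            return int(0.3 < x[0] + 0.5 * x[1] < 0.9)

        rng = np.random.default_rng(1)
        candidates = rng.random((500, 2))          # untried input settings

        tried_x, tried_y, i = [], [], 0
        while len(set(tried_y)) < 2:               # seed trials with both outcomes
            tried_x.append(candidates[i])
            tried_y.append(simulator(candidates[i]))
            i += 1

        for step in range(20):
            model = LogisticRegression().fit(tried_x, tried_y)
            p = model.predict_proba(candidates)[:, 1]
            next_i = int(np.argmin(np.abs(p - 0.5)))   # most uncertain candidate
            tried_x.append(candidates[next_i])
            tried_y.append(simulator(candidates[next_i]))

        print(f"ran {len(tried_y)} trials, {sum(tried_y)} successes")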

  13. Novel 3D/VR interactive environment for MD simulations, visualization and analysis.

    PubMed

    Doblack, Benjamin N; Allis, Tim; Dávila, Lilian P

    2014-12-18

    The increasing development of computing (hardware and software) in the last decades has impacted scientific research in many fields including materials science, biology, chemistry and physics among many others. A new computational system for the accurate and fast simulation and 3D/VR visualization of nanostructures is presented here, using the open-source molecular dynamics (MD) computer program LAMMPS. This alternative computational method uses modern graphics processors, NVIDIA CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model materials, this enhancement allows the addition of accelerated MD simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal is to investigate the structure and properties of inorganic nanostructures (e.g., silica glass nanosprings) under different conditions using this innovative computational system. The work presented outlines a description of the 3D/VR Visualization System and basic components, an overview of important considerations such as the physical environment, details on the setup and use of the novel system, a general procedure for the accelerated MD enhancement, technical information, and relevant remarks. The impact of this work is the creation of a unique computational system combining nanoscale materials simulation, visualization and interactivity in a virtual environment, which is both a research and teaching instrument at UC Merced.

  14. Load Balancing Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pearce, Olga Tkachyshyn

    2014-12-01

    The largest supercomputers have millions of independent processors, and concurrency levels are rapidly increasing. For ideal efficiency, developers of the simulations that run on these machines must ensure that computational work is evenly balanced among processors. Assigning work evenly is challenging because many large modern parallel codes simulate behavior of physical systems that evolve over time, and their workloads change over time. Furthermore, the cost of imbalanced load increases with scale because most large-scale scientific simulations today use a Single Program Multiple Data (SPMD) parallel programming model, and an increasing number of processors will wait for the slowest one at the synchronization points. To address load imbalance, many large-scale parallel applications use dynamic load balance algorithms to redistribute work evenly. The research objective of this dissertation is to develop methods to decide when and how to load balance the application, and to balance it effectively and affordably. We measure and evaluate the computational load of the application, and develop strategies to decide when and how to correct the imbalance. Depending on the simulation, a fast, local load balance algorithm may be suitable, or a more sophisticated and expensive algorithm may be required. We developed a model for comparison of load balance algorithms for a specific state of the simulation that enables the selection of a balancing algorithm that will minimize overall runtime.
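
    A toy version of the "when to rebalance" decision reads as follows (an illustrative cost model only, not the model developed in the dissertation): keep the current partition if waiting on the slowest rank for the remaining steps is cheaper than paying for a redistribution now.

        # Toy "when to rebalance" decision: compare the cost of waiting on the
        # slowest rank for the remaining steps against the cost of
        # redistributing the work now. (Illustrative cost model only.)
        def imbalance(loads):
            """Ratio of the slowest rank's load to the average load (>= 1)."""
            return max(loads) / (sum(loads) / len(loads))

        def should_rebalance(loads, steps_left, step_time, rebalance_cost):
            wasted_per_step = (imbalance(loads) - 1.0) * step_time
            return wasted_per_step * steps_left > rebalance_cost

        per_rank_seconds = [1.0, 1.1, 0.9, 1.8]        # measured work per step
        print(should_rebalance(per_rank_seconds, steps_left=50,
                               step_time=1.2, rebalance_cost=8.0))   # -> True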

  15. Novel 3D/VR Interactive Environment for MD Simulations, Visualization and Analysis

    PubMed Central

    Doblack, Benjamin N.; Allis, Tim; Dávila, Lilian P.

    2014-01-01

    The increasing development of computing (hardware and software) in the last decades has impacted scientific research in many fields including materials science, biology, chemistry and physics among many others. A new computational system for the accurate and fast simulation and 3D/VR visualization of nanostructures is presented here, using the open-source molecular dynamics (MD) computer program LAMMPS. This alternative computational method uses modern graphics processors, NVIDIA CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model materials, this enhancement allows the addition of accelerated MD simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal is to investigate the structure and properties of inorganic nanostructures (e.g., silica glass nanosprings) under different conditions using this innovative computational system. The work presented outlines a description of the 3D/VR Visualization System and basic components, an overview of important considerations such as the physical environment, details on the setup and use of the novel system, a general procedure for the accelerated MD enhancement, technical information, and relevant remarks. The impact of this work is the creation of a unique computational system combining nanoscale materials simulation, visualization and interactivity in a virtual environment, which is both a research and teaching instrument at UC Merced. PMID:25549300

  16. Lessons Learned through the Development and Publication of AstroImageJ

    NASA Astrophysics Data System (ADS)

    Collins, Karen

    2018-01-01

    As lead author of the scientific image processing software package AstroImageJ (AIJ), I will discuss the reasoning behind why we decided to release AIJ to the public, and the lessons we learned related to the development, publication, distribution, and support of AIJ. I will also summarize the AIJ code language selection, code documentation and testing approaches, code distribution, update, and support facilities used, and the code citation and licensing decisions. Since AIJ was initially developed as part of my graduate research and was my first scientific open source software publication, many of my experiences and difficulties encountered may parallel those of others new to scientific software publication. Finally, I will discuss the benefits and disadvantages of releasing scientific software that I now recognize after having AIJ in the public domain for more than five years.

  17. OASYS (OrAnge SYnchrotron Suite): an open-source graphical environment for x-ray virtual experiments

    NASA Astrophysics Data System (ADS)

    Rebuffi, Luca; Sanchez del Rio, Manuel

    2017-08-01

    The evolution of hardware platforms, the modernization of software tools, the access of large numbers of young people to the codes, and the popularization of open source software for scientific applications drove us to design OASYS (OrAnge SYnchrotron Suite), a completely new graphical environment for modelling X-ray experiments. The implemented software architecture provides not only an intuitive and very easy-to-use graphical interface, but also high flexibility and rapidity for interactive simulations, allowing configuration changes to be made quickly in order to compare multiple beamline configurations. Its purpose is to integrate in a synergetic way the most powerful calculation engines available. OASYS integrates different simulation strategies via the implementation of adequate simulation tools for X-ray optics (e.g., ray tracing and wave optics packages). It provides a language to make them communicate by sending and receiving encapsulated data. Python has been chosen as the main programming language because of its universality and popularity in scientific computing. The software Orange, developed at the University of Ljubljana (SLO), is the high-level workflow engine that provides the interaction with the user and the communication mechanisms.

  18. Imposing a Lagrangian Particle Framework on an Eulerian Hydrodynamics Infrastructure in Flash

    NASA Technical Reports Server (NTRS)

    Dubey, A.; Daley, C.; ZuHone, J.; Ricker, P. M.; Weide, K.; Graziani, C.

    2012-01-01

    In many astrophysical simulations, both Eulerian and Lagrangian quantities are of interest. For example, in a galaxy cluster merger simulation, the intracluster gas can have Eulerian discretization, while dark matter can be modeled using particles. FLASH, a component-based scientific simulation code, superimposes a Lagrangian framework atop an adaptive mesh refinement Eulerian framework to enable such simulations. The discretization of the field variables is Eulerian, while the Lagrangian entities occur in many different forms including tracer particles, massive particles, charged particles in particle-in-cell mode, and Lagrangian markers to model fluid-structure interactions. These widely varying roles for Lagrangian entities are possible because of the highly modular, flexible, and extensible architecture of the Lagrangian framework. In this paper, we describe the Lagrangian framework in FLASH in the context of two very different applications, Type Ia supernovae and galaxy cluster mergers, which use the Lagrangian entities in fundamentally different ways.
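
    The Lagrangian-on-Eulerian idea can be illustrated with plain tracer particles (a conceptual sketch, not FLASH's AMR particle framework): particles are advanced through a velocity field stored on a fixed grid by interpolating the grid values to each particle position.

        # Sketch of Lagrangian tracer particles on an Eulerian grid: interpolate
        # the fixed-grid velocity field to each particle position and advance
        # the particles with a simple forward-Euler step.
        import numpy as np

        n = 32
        x = np.linspace(0.0, 1.0, n)
        X, Y = np.meshgrid(x, x, indexing="ij")
        vx, vy = -(Y - 0.5), (X - 0.5)             # solid-body rotation field

        def interp(field, px, py):
            """Bilinear interpolation of a grid field at particle positions."""
            fx, fy = px * (n - 1), py * (n - 1)
            i = np.clip(fx.astype(int), 0, n - 2)
            j = np.clip(fy.astype(int), 0, n - 2)
            tx, ty = fx - i, fy - j
            return ((1 - tx) * (1 - ty) * field[i, j] + tx * (1 - ty) * field[i + 1, j]
                    + (1 - tx) * ty * field[i, j + 1] + tx * ty * field[i + 1, j + 1])

        rng = np.random.default_rng(0)
        px, py = rng.random(100), rng.random(100)  # tracer particle positions
        dt = 0.01
        for _ in range(200):                       # forward-Euler advection
            px += dt * interp(vx, px, py)
            py += dt * interp(vy, px, py)
            px, py = np.clip(px, 0, 1), np.clip(py, 0, 1)

        print("particle 0 ends at", (px[0], py[0]))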

  19. Imposing a Lagrangian Particle Framework on an Eulerian Hydrodynamics Infrastructure in FLASH

    NASA Astrophysics Data System (ADS)

    Dubey, A.; Daley, C.; ZuHone, J.; Ricker, P. M.; Weide, K.; Graziani, C.

    2012-08-01

    In many astrophysical simulations, both Eulerian and Lagrangian quantities are of interest. For example, in a galaxy cluster merger simulation, the intracluster gas can have Eulerian discretization, while dark matter can be modeled using particles. FLASH, a component-based scientific simulation code, superimposes a Lagrangian framework atop an adaptive mesh refinement Eulerian framework to enable such simulations. The discretization of the field variables is Eulerian, while the Lagrangian entities occur in many different forms including tracer particles, massive particles, charged particles in particle-in-cell mode, and Lagrangian markers to model fluid-structure interactions. These widely varying roles for Lagrangian entities are possible because of the highly modular, flexible, and extensible architecture of the Lagrangian framework. In this paper, we describe the Lagrangian framework in FLASH in the context of two very different applications, Type Ia supernovae and galaxy cluster mergers, which use the Lagrangian entities in fundamentally different ways.

  20. Web Services Provide Access to SCEC Scientific Research Application Software

    NASA Astrophysics Data System (ADS)

    Gupta, N.; Gupta, V.; Okaya, D.; Kamb, L.; Maechling, P.

    2003-12-01

    Web services offer scientific communities a new paradigm for sharing research codes and communicating results. While there are formal technical definitions of what constitutes a web service, for a user community such as the Southern California Earthquake Center (SCEC), we may conceptually consider a web service to be functionality provided on-demand by an application which is run on a remote computer located elsewhere on the Internet. The value of a web service is that it can (1) run a scientific code without the user needing to install and learn the intricacies of running the code; (2) provide the technical framework which allows a user's computer to talk to the remote computer which performs the service; (3) provide the computational resources to run the code; and (4) bundle several analysis steps and provide the end results in digital or (post-processed) graphical form. Within an NSF-sponsored ITR project coordinated by SCEC, we are constructing web services using architectural protocols and programming languages (e.g., Java). However, because the SCEC community has a rich pool of scientific research software (written in traditional languages such as C and FORTRAN), we also emphasize making existing scientific codes available by constructing web service frameworks which wrap around and directly run these codes. In doing so we attempt to broaden community usage of these codes. Web service wrapping of a scientific code can be done using a "web servlet" construction or by using a SOAP/WSDL-based framework. This latter approach is widely adopted in IT circles although it is subject to rapid evolution. Our wrapping framework attempts to "honor" the original codes with as little modification as is possible. For versatility we identify three methods of user access: (A) a web-based GUI (written in HTML and/or Java applets); (B) a Linux/OSX/UNIX command line "initiator" utility (shell-scriptable); and (C) direct access from within any Java application (and with the correct API interface from within C++ and/or C/Fortran). This poster presentation will provide descriptions of the following selected web services and their origin as scientific application codes: 3D community velocity models for Southern California, geocoordinate conversions (latitude/longitude to UTM), execution of GMT graphical scripts, data format conversions (Gocad to Matlab format), and implementation of Seismic Hazard Analysis application programs that calculate hazard curve and hazard map data sets.
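
    A modern, minimal analogue of such a wrapper is sketched below using only the Python standard library (the SCEC services used servlet and SOAP/WSDL frameworks, and the executable name here is a placeholder): an HTTP endpoint accepts parameters, runs the legacy command-line code, and returns its output.

        # Minimal sketch of wrapping a command-line scientific code as a web
        # service. Generic illustration only; "./legacy_code" is a placeholder
        # for whatever executable is being exposed.
        import json
        import subprocess
        from http.server import BaseHTTPRequestHandler, HTTPServer
        from urllib.parse import urlparse, parse_qs

        class WrapperHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                query = parse_qs(urlparse(self.path).query)
                lat = query.get("lat", ["34.0"])[0]
                lon = query.get("lon", ["-118.0"])[0]
                # Invoke the legacy code exactly as a user would on the command line
                result = subprocess.run(["./legacy_code", lat, lon],
                                        capture_output=True, text=True)
                body = json.dumps({"stdout": result.stdout,
                                   "returncode": result.returncode}).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)

        if __name__ == "__main__":
            HTTPServer(("localhost", 8080), WrapperHandler).serve_forever()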

  1. A pedagogical walkthrough of computational modeling and simulation of Wnt signaling pathway using static causal models in MATLAB.

    PubMed

    Sinha, Shriprakash

    2016-12-01

    Simulation studies in systems biology involving computational experiments dealing with Wnt signaling pathways abound in the literature but often lack a pedagogical perspective that might ease the understanding of beginner students and researchers in transition, who intend to work on the modeling of the pathway. This paucity might happen due to restrictive business policies which enforce an unwanted embargo on the sharing of important scientific knowledge. A tutorial introduction to computational modeling of the Wnt signaling pathway in a human colorectal cancer dataset using static Bayesian network models is provided. The walkthrough might aid biologists/informaticians in understanding the design of computational experiments that is interleaved with exposition of the Matlab code and causal models from the Bayesian network toolbox. The manuscript elucidates the coding contents of the advance article by Sinha (Integr. Biol. 6:1034-1048, 2014) and takes the reader in a step-by-step process of how (a) the collection and the transformation of the available biological information from literature is done, (b) the integration of the heterogeneous data and prior biological knowledge in the network is achieved, (c) the simulation study is designed, (d) the hypothesis regarding a biological phenomenon is transformed into a computational framework, and (e) results and inferences drawn using d-connectivity/separability are reported. The manuscript finally ends with a programming assignment to help the readers get hands-on experience of a perturbation project. Description of Matlab files is made available under the GNU GPL v3 license at the Google code project on https://code.google.com/p/static-bn-for-wnt-signaling-pathway and https://sites.google.com/site/shriprakashsinha/shriprakashsinha/projects/static-bn-for-wnt-signaling-pathway. Latest updates can be found at the latter website.

  2. A Parallel Numerical Micromagnetic Code Using FEniCS

    NASA Astrophysics Data System (ADS)

    Nagy, L.; Williams, W.; Mitchell, L.

    2013-12-01

    Many problems in the geosciences depend on understanding the ability of magnetic minerals to provide stable paleomagnetic recordings. Numerical micromagnetic modelling allows us to calculate the domain structures found in naturally occurring magnetic materials. However, the computational cost rises exceedingly quickly with respect to the size and complexity of the geometries that we wish to model. This problem is compounded by the fact that modern processor design no longer focuses on the speed at which calculations are performed, but rather on the number of computational units amongst which we may distribute our calculations. Consequently, to better exploit modern computational resources, our micromagnetic simulations must "go parallel". We present a parallel and scalable micromagnetics code written using FEniCS. FEniCS is a multinational collaboration involving several institutions (University of Cambridge, University of Chicago, The Simula Research Laboratory, etc.) that aims to provide a set of tools for writing scientific software, in particular software that employs the finite element method. The advantages of this approach are the leveraging of pre-existing projects from the world of scientific computing (PETSc, Trilinos, Metis/Parmetis, etc.) and the exposure of these so that researchers may pose problems in a manner closer to the mathematical language of their domain. Our code provides a scriptable interface (in Python) that allows users not only to run micromagnetic models in parallel, but also to perform pre/post processing of data.
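
    The flavor of the FEniCS approach the code builds on is shown below with a generic Poisson example in the legacy dolfin Python interface (an illustration under stated assumptions, not the micromagnetics code itself): the PDE is posed almost in its mathematical form, and the same script can be run under MPI, e.g. "mpirun -n 8 python poisson.py".

        # Generic legacy-FEniCS (dolfin) Poisson example illustrating the style
        # the micromagnetics code builds on; not the micromagnetics code itself.
        from dolfin import (UnitSquareMesh, FunctionSpace, TrialFunction,
                            TestFunction, Function, DirichletBC, Constant,
                            Expression, dot, grad, dx, solve, File)

        mesh = UnitSquareMesh(64, 64)
        V = FunctionSpace(mesh, "P", 1)

        u, v = TrialFunction(V), TestFunction(V)
        f = Expression("10*exp(-50*(pow(x[0]-0.5,2)+pow(x[1]-0.5,2)))", degree=2)

        a = dot(grad(u), grad(v)) * dx          # bilinear form
        L = f * v * dx                          # linear form
        bc = DirichletBC(V, Constant(0.0), "on_boundary")

        uh = Function(V)
        solve(a == L, uh, bc)                   # PETSc solves the linear system
        File("solution.pvd") << uh              # parallel-aware output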

  3. Field-scale multi-phase LNAPL remediation: Validating a new computational framework against sequential field pilot trials.

    PubMed

    Sookhak Lari, Kaveh; Johnston, Colin D; Rayner, John L; Davis, Greg B

    2018-03-05

    Remediation of subsurface systems, including groundwater, soil and soil gas, contaminated with light non-aqueous phase liquids (LNAPLs) is challenging. Field-scale pilot trials of multi-phase remediation were undertaken at a site to determine the effectiveness of recovery options. Sequential LNAPL skimming and vacuum-enhanced skimming, with and without water table drawdown, were trialled over 78 days; in total extracting over 5 m3 of LNAPL. For the first time, a multi-component simulation framework (including the multi-phase multi-component code TMVOC-MP and processing codes) was developed and applied to simulate the broad range of multi-phase remediation and recovery methods used in the field trials. This framework was validated against the sequential pilot trials by comparing predicted and measured LNAPL mass removal rates and compositional changes. The framework was tested on both a Cray supercomputer and a cluster. Simulations mimicked trends in LNAPL recovery rates (from 0.14 to 3 mL/s) across all remediation techniques, each operating over periods of 4-14 days over the 78-day trial. The code also approximated order-of-magnitude compositional changes of hazardous chemical concentrations in extracted gas during vacuum-enhanced recovery. The verified framework enables longer-term prediction of the effectiveness of remediation approaches, allowing better determination of remediation endpoints and long-term risks. Copyright © 2017 Commonwealth Scientific and Industrial Research Organisation. Published by Elsevier B.V. All rights reserved.

  4. Code of Practice for Scientific Diving: Principles for the Safe Practice of Scientific Diving in Different Environments. Unesco Technical Papers in Marine Science 53.

    ERIC Educational Resources Information Center

    Flemming, N. C., Ed.; Max, M. D., Ed.

    This publication has been prepared to provide scientific divers with guidance on safe practice under varying experimental and environmental conditions. The Code offers advice and recommendations on administrative practices, insurance, terms of employment, medical standards, training standards, dive planning, safety with different breathing gases…

  5. Scientific and Technical Publishing at Goddard Space Flight Center in Fiscal Year 1994

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This publication is a compilation of scientific and technical material that was researched, written, prepared, and disseminated by the Center's scientists and engineers during FY94. It is presented in numerical order of the GSFC author's sponsoring technical directorate; i.e., Code 300 is the Office of Flight Assurance, Code 400 is the Flight Projects Directorate, Code 500 is the Mission Operations and Data Systems Directorate, Code 600 is the Space Sciences Directorate, Code 700 is the Engineering Directorate, Code 800 is the Suborbital Projects and Operations Directorate, and Code 900 is the Earth Sciences Directorate. The publication database contains publication or presentation title, author(s), document type, sponsor, and organizational code. This is the second annual compilation for the Center.

  6. Nonlinear three-dimensional verification of the SPECYL and PIXIE3D magnetohydrodynamics codes for fusion plasmas

    NASA Astrophysics Data System (ADS)

    Bonfiglio, D.; Chacón, L.; Cappello, S.

    2010-08-01

    With the increasing impact of scientific discovery via advanced computation, there is presently a strong emphasis on ensuring the mathematical correctness of computational simulation tools. Such endeavor, termed verification, is now at the center of most serious code development efforts. In this study, we address a cross-benchmark nonlinear verification study between two three-dimensional magnetohydrodynamics (3D MHD) codes for fluid modeling of fusion plasmas, SPECYL [S. Cappello and D. Biskamp, Nucl. Fusion 36, 571 (1996)] and PIXIE3D [L. Chacón, Phys. Plasmas 15, 056103 (2008)], in their common limit of application: the simple viscoresistive cylindrical approximation. SPECYL is a serial code in cylindrical geometry that features a spectral formulation in space and a semi-implicit temporal advance, and has been used extensively to date for reversed-field pinch studies. PIXIE3D is a massively parallel code in arbitrary curvilinear geometry that features a conservative, solenoidal finite-volume discretization in space, and a fully implicit temporal advance. The present study is, in our view, a first mandatory step in assessing the potential of any numerical 3D MHD code for fluid modeling of fusion plasmas. Excellent agreement is demonstrated over a wide range of parameters for several fusion-relevant cases in both two- and three-dimensional geometries.
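
    In practice, this kind of cross-benchmark verification reduces to quantitative comparison of the same diagnostics produced by the two codes. The fragment below is a generic illustration of that step only (the interpolation-and-discrepancy bookkeeping); the time traces are placeholders, not SPECYL or PIXIE3D output.

    ```python
    import numpy as np

    def relative_discrepancy(t_a, y_a, t_b, y_b):
        """Max pointwise relative difference between two codes' time histories of the
        same diagnostic, after interpolating code B onto code A's time base."""
        y_b_on_a = np.interp(t_a, t_b, y_b)
        return np.max(np.abs(y_a - y_b_on_a)) / np.max(np.abs(y_a))

    # placeholder traces standing in for a dominant mode-energy history from each code
    t_a = np.linspace(0.0, 1.0, 200)
    y_a = np.exp(3.0 * t_a)
    t_b = np.linspace(0.0, 1.0, 157)
    y_b = np.exp(3.0 * t_b) * (1.0 + 1e-3 * np.sin(40.0 * t_b))

    print(f"max relative discrepancy: {relative_discrepancy(t_a, y_a, t_b, y_b):.2e}")
    ```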

  7. Nonlinear three-dimensional verification of the SPECYL and PIXIE3D magnetohydrodynamics codes for fusion plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonfiglio, Daniele; Chacon, Luis; Cappello, Susanna

    2010-01-01

    With the increasing impact of scientific discovery via advanced computation, there is presently a strong emphasis on ensuring the mathematical correctness of computational simulation tools. Such endeavor, termed verification, is now at the center of most serious code development efforts. In this study, we address a cross-benchmark nonlinear verification study between two three-dimensional magnetohydrodynamics (3D MHD) codes for fluid modeling of fusion plasmas, SPECYL [S. Cappello and D. Biskamp, Nucl. Fusion 36, 571 (1996)] and PIXIE3D [L. Chacon, Phys. Plasmas 15, 056103 (2008)], in their common limit of application: the simple viscoresistive cylindrical approximation. SPECYL is a serial code in cylindrical geometry that features a spectral formulation in space and a semi-implicit temporal advance, and has been used extensively to date for reversed-field pinch studies. PIXIE3D is a massively parallel code in arbitrary curvilinear geometry that features a conservative, solenoidal finite-volume discretization in space, and a fully implicit temporal advance. The present study is, in our view, a first mandatory step in assessing the potential of any numerical 3D MHD code for fluid modeling of fusion plasmas. Excellent agreement is demonstrated over a wide range of parameters for several fusion-relevant cases in both two- and three-dimensional geometries.

  8. Plasma particle simulations on interactions between spacecraft and cold streaming plasmas

    NASA Astrophysics Data System (ADS)

    Miyake, Y.; Usui, H.; Nakashima, H.

    2012-12-01

    In order to better assess space weather effects on spacecraft systems, we require in-depth understanding of the fundamental processes of spacecraft-plasma interactions. Particularly in scientific spacecraft missions, the wake and photoelectron cloud formation as well as the spacecraft charging are significant factors influencing operations, because onboard scientific instruments are often susceptible to such plasma disturbances. In this paper, we focus on the wake formation resulting from spacecraft interactions with a cold streaming plasma and study it by means of numerical simulations using modern supercomputers. We apply the particle-in-cell (PIC) method to the study of the wake structure around a scientific spacecraft. We use our original plasma particle simulation code EMSES [2], which enables us to include solid spacecraft and sensor surfaces as internal boundaries. Although there are a number of preceding PIC simulation works regarding the wake structure behind a spacecraft [3], we here extend those studies by including numerical models of both the spacecraft body and conducting booms simultaneously in the simulation system. The current analysis focuses on the wake structures behind the Cluster satellite in a tenuous plasma flow. We have included the conducting surfaces of wire booms as well as the spacecraft body in the simulations, both of which can contribute to the wake formation. The major outcomes of the simulations are summarized as follows [4]: (1) not only a spacecraft body but also a thin (on the order of mm) wire boom contributes substantially to the formation of an electrostatic wake, particularly when the spacecraft has a positive potential of a few tens of volts; (2) in such a condition, the spatial scale of the wake reaches up to 100 m, leading to the detection of a wake electric field pattern that is very similar to that observed in the presence of a uniform ambient electric field; (3) spurious electric fields can occasionally be detected even in subsonic ion flows, which is caused by an asymmetric potential pattern between the upstream and downstream sides of the spacecraft. We will report some details of these results as well as a comparison of the numerical results with observational data. [References] [1] André, M., and C. M. Cully (2012), Low-energy ions: A previously hidden solar system particle population, Geophys. Res. Lett., 39, L03101, doi:10.1029/2011GL050242. [2] Miyake, Y., and H. Usui (2009), New electromagnetic particle simulation code for the analysis of spacecraft-plasma interactions, Phys. Plasmas, 16, 062904, doi:10.1063/1.3147922. [3] Engwall, E., A. I. Eriksson, and J. Forest (2006), Wake formation behind positively charged spacecraft in flowing tenuous plasmas, Phys. Plasmas, 13, 062904, doi:10.1063/1.2199207. [4] Miyake, Y., and H. Usui (2012), Particle simulations of wake effects on electric field measurements in multi-species ion flows, Proc. of 12th Spacecraft Charging Technology Conference, Kitakyushu, Japan.
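
    The PIC method referred to above can be sketched in a few dozen lines in one dimension. The following is an illustrative 1D electrostatic PIC loop in normalized units (periodic box, linear weighting, FFT Poisson solve, leapfrog push); it is not the EMSES code and omits the internal spacecraft boundaries, photoelectrons and 3D electromagnetics that the study relies on.

    ```python
    import numpy as np

    ng, n_part = 64, 10000           # grid cells and macro-particles
    L = 2 * np.pi                    # box length (normalized units)
    dx, dt = L / ng, 0.1
    qm = -1.0                        # electron charge-to-mass ratio
    q = -L / n_part                  # particle charge so the mean electron density is 1
    rho_ion = 1.0                    # fixed neutralizing ion background

    rng = np.random.default_rng(0)
    x = rng.uniform(0, L, n_part)
    v = 0.01 * np.sin(2 * np.pi * x / L)         # small perturbation to excite oscillations

    k = 2 * np.pi * np.fft.rfftfreq(ng, d=dx)    # angular wavenumbers

    for step in range(200):
        # charge deposition with cloud-in-cell (linear) weighting
        xg = x / dx
        i0 = np.floor(xg).astype(int) % ng
        w1 = xg - np.floor(xg)
        rho = np.zeros(ng)
        np.add.at(rho, i0, q * (1 - w1) / dx)
        np.add.at(rho, (i0 + 1) % ng, q * w1 / dx)
        rho += rho_ion

        # Poisson solve d^2(phi)/dx^2 = -rho in Fourier space, then E = -dphi/dx
        rho_hat = np.fft.rfft(rho)
        phi_hat = np.zeros_like(rho_hat)
        phi_hat[1:] = rho_hat[1:] / k[1:] ** 2
        E = np.fft.irfft(-1j * k * phi_hat, n=ng)

        # gather the field at particle positions and leapfrog push
        Ep = E[i0] * (1 - w1) + E[(i0 + 1) % ng] * w1
        v += qm * Ep * dt
        x = (x + v * dt) % L
    ```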

  9. Semiannual Technical Summary, 1 April-30 September 1990 (Royal Norwegian Council for Scientific and Industrial Research)

    DTIC Science & Technology

    1990-11-01

    Royal Norwegian Council for Scientific and Industrial Research (NTNF), NORSAR Scientific Report No. 1-90/91, Semiannual Technical Summary, 1 April - 30 September 1990 (AD-A241 670). ARPA Order No. 4138, AMD #16, Program Code No. OF10. Name of contractor: Royal Norwegian Council for Scientific and Industrial Research. Contract No. F08606-89-C-0005, Defense Advanced Research Projects Agency (NMRO).

  10. Cultural and Technological Issues and Solutions for Geodynamics Software Citation

    NASA Astrophysics Data System (ADS)

    Heien, E. M.; Hwang, L.; Fish, A. E.; Smith, M.; Dumit, J.; Kellogg, L. H.

    2014-12-01

    Computational software and custom-written codes play a key role in scientific research and teaching, providing tools to perform data analysis and forward modeling through numerical computation. However, development of these codes is often hampered by the fact that there is no well-defined way for the authors to receive credit or professional recognition for their work through the standard methods of scientific publication and subsequent citation of the work. This in turn may discourage researchers from publishing their codes or making them easier for other scientists to use. We investigate the issues involved in citing software in a scientific context, and introduce features that should be components of a citation infrastructure, particularly oriented towards the codes and scientific culture in the area of geodynamics research. The codes used in geodynamics are primarily specialized numerical modeling codes for continuum mechanics problems; they may be developed by individual researchers, teams of researchers, geophysicists in collaboration with computational scientists and applied mathematicians, or by coordinated community efforts such as the Computational Infrastructure for Geodynamics. Some but not all geodynamics codes are open-source. These characteristics are common to many areas of geophysical software development and use. We provide background on the problem of software citation and discuss some of the barriers preventing adoption of such citations, including social/cultural barriers, insufficient technological support infrastructure, and an overall lack of agreement about what a software citation should consist of. We suggest solutions in an initial effort to create a system to support citation of software and promotion of scientific software development.

  11. ORNL Cray X1 evaluation status report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, P.K.; Alexander, R.A.; Apra, E.

    2004-05-01

    On August 15, 2002 the Department of Energy (DOE) selected the Center for Computational Sciences (CCS) at Oak Ridge National Laboratory (ORNL) to deploy a new scalable vector supercomputer architecture for solving important scientific problems in climate, fusion, biology, nanoscale materials and astrophysics. "This program is one of the first steps in an initiative designed to provide U.S. scientists with the computational power that is essential to 21st century scientific leadership," said Dr. Raymond L. Orbach, director of the department's Office of Science. In FY03, CCS procured a 256-processor Cray X1 to evaluate the processors, memory subsystem, scalability of the architecture, and software environment, and to predict the expected sustained performance on key DOE applications codes. The results of the micro-benchmarks and kernel benchmarks show the architecture of the Cray X1 to be exceptionally fast for most operations. The best results are shown on large problems, where it is not possible to fit the entire problem into the cache of the processors. These large problems are exactly the types of problems that are important for the DOE and ultra-scale simulation. Application performance is found to be markedly improved by this architecture: - Large-scale simulations of high-temperature superconductors run 25 times faster than on an IBM Power4 cluster using the same number of processors. - Best performance of the parallel ocean program (POP v1.4.3) is 50 percent higher than on Japan's Earth Simulator and 5 times higher than on an IBM Power4 cluster. - A fusion application, global GYRO transport, was found to be 16 times faster on the X1 than on an IBM Power3. The increased performance allowed simulations to fully resolve questions raised by a prior study. - The transport kernel in the AGILE-BOLTZTRAN astrophysics code runs 15 times faster than on an IBM Power4 cluster using the same number of processors. - Molecular dynamics simulations related to the phenomenon of photon echo run 8 times faster than previously achieved. Even at 256 processors, the Cray X1 system is already outperforming other supercomputers with thousands of processors for a certain class of applications such as climate modeling and some fusion applications. This evaluation is the outcome of a number of meetings with both high-performance computing (HPC) system vendors and application experts over the past 9 months and has received broad-based support from the scientific community and other agencies.

  12. Resilient workflows for computational mechanics platforms

    NASA Astrophysics Data System (ADS)

    Nguyên, Toàn; Trifan, Laurentiu; Désidéri, Jean-Antoine

    2010-06-01

    Workflow management systems have recently been the focus of much interest and of many research and deployment efforts for scientific applications worldwide [26, 27]. Their ability to abstract applications by wrapping application codes has also underscored the usefulness of such systems for multidiscipline applications [23, 24]. When complex applications need to provide seamless interfaces hiding the technicalities of the computing infrastructures, their high-level modeling, monitoring and execution functionalities help give production teams seamless and effective facilities [25, 31, 33]. Software integration infrastructures based on programming paradigms such as Python, Matlab and Scilab have also provided evidence of the usefulness of such approaches for the tight coupling of multidiscipline application codes [22, 24]. High-performance computing based on multi-core multi-cluster infrastructures also opens new opportunities for more accurate, more extensive and more robust multi-discipline simulations in the decades to come [28]. This supports the goal of full flight dynamics simulation for 3D aircraft models within the next decade, opening the way to virtual flight-tests and certification of aircraft in the future [23, 24, 29].

  13. Characterization of a plasma photonic crystal using a multi-fluid plasma model

    NASA Astrophysics Data System (ADS)

    Thomas, W. R.; Shumlak, U.; Wang, B.; Righetti, F.; Cappelli, M. A.; Miller, S. T.

    2017-10-01

    Plasma photonic crystals have the potential to significantly expand the capabilities of current microwave filtering and switching technologies by providing high-speed (μs) control of energy band-gap/pass characteristics in the GHz through low-THz range. While photonic crystals consisting of dielectric, semiconductor, and metallic matrices have seen thousands of articles published over the last several decades, plasma-based photonic crystals remain a relatively unexplored field. Numerical modeling efforts so far have largely used the standard methods of analysis for photonic crystals (the Plane Wave Expansion Method, Finite Difference Time Domain, and the ANSYS finite element electromagnetic code HFSS), none of which capture nonlinear plasma-radiation interactions. In this study, a 5N-moment multi-fluid plasma model is implemented using the University of Washington's WARPXM finite element multi-physics code. A two-dimensional plasma-vacuum photonic crystal is simulated and its behavior is characterized through the generation of dispersion diagrams and transmission spectra. These results are compared with theory, experimental data, and ANSYS HFSS simulation results. This research is supported by a grant from the United States Air Force Office of Scientific Research.
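
    A much simpler, linear frequency-domain picture of why plasma columns create tunable band gaps can be had from a 1D transfer-matrix calculation with a cold-plasma (Drude) permittivity. The sketch below is only that illustration; the plasma frequency, collision rate and lattice period are assumed values, and none of the nonlinear multi-fluid dynamics captured by WARPXM appears here.

    ```python
    import numpy as np

    c = 3e8  # speed of light [m/s]

    def layer_matrix(n, d, omega):
        """Characteristic matrix of one homogeneous layer at normal incidence."""
        delta = omega / c * n * d
        return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                         [1j * n * np.sin(delta), np.cos(delta)]])

    def transmission(omega, n_layers, d_layers):
        """Power transmission of a layered stack with vacuum on both sides."""
        m = np.eye(2, dtype=complex)
        for n, d in zip(n_layers, d_layers):
            m = m @ layer_matrix(n, d, omega)
        t = 2.0 / (m[0, 0] + m[0, 1] + m[1, 0] + m[1, 1])
        return abs(t) ** 2

    omega_p = 2 * np.pi * 10e9        # assumed plasma frequency (10 GHz)
    nu = 2 * np.pi * 0.1e9            # assumed collision rate
    a = 5e-3                          # assumed lattice period, split evenly

    freqs = np.linspace(1e9, 60e9, 500)
    spectrum = []
    for f in freqs:
        omega = 2 * np.pi * f
        eps_plasma = 1 - omega_p**2 / (omega * (omega + 1j * nu))   # Drude permittivity
        layers = [1.0 + 0j, np.sqrt(eps_plasma)] * 8                # 8 vacuum/plasma cells
        spectrum.append(transmission(omega, layers, [a / 2] * 16))

    spectrum = np.array(spectrum)
    print(f"minimum transmission {spectrum.min():.2e} at {freqs[spectrum.argmin()]/1e9:.1f} GHz")
    ```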

  14. Optimizing CyberShake Seismic Hazard Workflows for Large HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2014-12-01

    The CyberShake computational platform is a well-integrated collection of scientific software and middleware that calculates 3D simulation-based probabilistic seismic hazard curves and hazard maps for the Los Angeles region. Currently each CyberShake model comprises about 235 million synthetic seismograms from about 415,000 rupture variations computed at 286 sites. CyberShake integrates large-scale parallel and high-throughput serial seismological research codes into a processing framework in which early stages produce files used as inputs by later stages. Scientific workflow tools are used to manage the jobs, data, and metadata. The Southern California Earthquake Center (SCEC) developed the CyberShake platform using USC High Performance Computing and Communications systems and open-science NSF resources. CyberShake calculations were migrated to the NSF Track 1 system NCSA Blue Waters when it became operational in 2013, via an interdisciplinary team approach including domain scientists, computer scientists, and middleware developers. Due to the excellent performance of Blue Waters and CyberShake software optimizations, we reduced the makespan (a measure of wallclock time-to-solution) of a CyberShake study from 1467 to 342 hours. We will describe the technical enhancements behind this improvement, including judicious introduction of new GPU software, improved scientific software components, increased workflow-based automation, and Blue Waters-specific workflow optimizations. Our CyberShake performance improvements highlight the benefits of scientific workflow tools. The CyberShake workflow software stack includes the Pegasus Workflow Management System (Pegasus-WMS, which includes Condor DAGMan), HTCondor, and Globus GRAM, with Pegasus-mpi-cluster managing the high-throughput tasks on the HPC resources. The workflow tools handle data management, automatically transferring about 13 TB back to SCEC storage. We will present performance metrics from the most recent CyberShake study, executed on Blue Waters. We will compare the performance of CPU and GPU versions of our large-scale parallel wave propagation code, AWP-ODC-SGT. Finally, we will discuss how these enhancements have enabled SCEC to move forward with plans to increase the CyberShake simulation frequency to 1.0 Hz.

  15. TOUGH3: A new efficient version of the TOUGH suite of multiphase flow and transport simulators

    NASA Astrophysics Data System (ADS)

    Jung, Yoojin; Pau, George Shu Heng; Finsterle, Stefan; Pollyea, Ryan M.

    2017-11-01

    The TOUGH suite of nonisothermal multiphase flow and transport simulators has been updated by various developers over many years to address a vast range of challenging subsurface problems. The increasing complexity of the simulated processes as well as the growing size of model domains that need to be handled call for an improvement in the simulator's computational robustness and efficiency. Moreover, modifications have been frequently introduced independently, resulting in multiple versions of TOUGH that (1) led to inconsistencies in feature implementation and usage, (2) made code maintenance and development inefficient, and (3) caused confusion to users and developers. TOUGH3, a new base version of TOUGH, addresses these issues. It consolidates both the serial (TOUGH2 V2.1) and parallel (TOUGH2-MP V2.0) implementations, enabling simulations to be performed on desktop computers and supercomputers using a single code. New PETSc parallel linear solvers are added to the existing serial solvers of TOUGH2 and the Aztec solver used in TOUGH2-MP. The PETSc solvers generally perform better than the Aztec solvers in parallel and the internal TOUGH3 linear solver in serial. TOUGH3 also incorporates many new features, addresses bugs, and improves the flexibility of data handling. Due to the improved capabilities and usability, TOUGH3 is more robust and efficient for solving tough and computationally demanding problems in diverse scientific and practical applications related to subsurface flow modeling.

  16. Enabling Data Intensive Science through Service Oriented Science: Virtual Laboratories and Science Gateways

    NASA Astrophysics Data System (ADS)

    Lescinsky, D. T.; Wyborn, L. A.; Evans, B. J. K.; Allen, C.; Fraser, R.; Rankine, T.

    2014-12-01

    We present collaborative work on a generic, modular infrastructure for virtual laboratories (VLs, similar to science gateways) that combine online access to data, scientific code, and computing resources as services that support multiple data intensive scientific computing needs across a wide range of science disciplines. We are leveraging access to 10+ PB of earth science data on Lustre filesystems at Australia's National Computational Infrastructure (NCI) Research Data Storage Infrastructure (RDSI) node, co-located with NCI's 1.2 PFlop Raijin supercomputer and a 3000 CPU core research cloud. The development, maintenance and sustainability of VLs is best accomplished through modularisation and standardisation of interfaces between components. Our approach has been to break up tightly-coupled, specialised application packages into modules, with identified best techniques and algorithms repackaged either as data services or as scientific tools that are accessible across domains. The data services can be used to manipulate, visualise and transform multiple data types whilst the scientific tools can be used in concert with multiple scientific codes. We are currently designing a scalable generic infrastructure that will handle scientific code as modularised services and thereby enable the rapid/easy deployment of new codes or versions of codes. The goal is to build open source libraries/collections of scientific tools, scripts and modelling codes that can be combined in specially designed deployments. Additional services in development include: provenance, publication of results, monitoring, workflow tools, etc. The generic VL infrastructure will be hosted at NCI, but can access alternative computing infrastructures (i.e., public/private cloud, HPC). The Virtual Geophysics Laboratory (VGL) was developed as a pilot project to demonstrate the underlying technology. This base is now being redesigned and generalised to develop a Virtual Hazards Impact and Risk Laboratory (VHIRL); any enhancements and new capabilities will be incorporated into the generic VL infrastructure. At the same time, we are scoping seven new VLs and, in the process, identifying other common components to prioritise and focus development.

  17. UCLA Final Technical Report for the “Community Petascale Project for Accelerator Science and Simulation”.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mori, Warren

    The UCLA Plasma Simulation Group is a major partner of the “Community Petascale Project for Accelerator Science and Simulation”. This is the final technical report. We include an overall summary, a list of publications, progress for the most recent year, and individual progress reports for each year. We have made tremendous progress during the three years. SciDAC funds have contributed to the development of a large number of skeleton codes that illustrate how to write PIC codes with a hierarchy of parallelism. These codes cover 2D and 3D as well as electrostatic solvers (which are used in beam dynamics codes and quasi-static codes) and electromagnetic solvers (which are used in plasma based accelerator codes). We also used these ideas to develop a GPU enabled version of OSIRIS. SciDAC funds also contributed to the development of strategies to eliminate the Numerical Cerenkov Instability (NCI), which is an issue when carrying out laser wakefield accelerator (LWFA) simulations in a boosted frame and when quantifying the emittance and energy spread of self-injected electron beams. This work included the development of a new code called UPIC-EMMA, which is an FFT based electromagnetic PIC code, and of new hybrid algorithms in OSIRIS. A new hybrid (PIC in r-z and gridless in φ) algorithm was implemented into OSIRIS. In this algorithm the fields and current are expanded into azimuthal harmonics and the complex amplitude for each harmonic is calculated separately. The contributions from each harmonic are summed and then used to push the particles. This algorithm permits modeling plasma based acceleration with some 3D effects but with the computational load of a 2D r-z PIC code. We developed a rigorously charge conserving current deposit for this algorithm. Very recently, we made progress in combining the speedup from the quasi-3D algorithm with that from the Lorentz boosted frame. SciDAC funds also contributed to the improvement and speedup of the quasi-static PIC code QuickPIC. We have also used our suite of PIC codes to make scientific discoveries. Highlights include supporting FACET experiments which achieved the milestones of showing high beam loading and energy transfer efficiency from a drive electron beam to a witness electron beam, and the discovery of a self-loading regime for high gradient acceleration of a positron beam. Both of these experimental milestones were published in Nature together with supporting QuickPIC simulation results. Simulation results from QuickPIC were used on the cover of Nature in one case. We are also making progress on using highly resolved QuickPIC simulations to show that ion motion may not lead to catastrophic emittance growth for tightly focused electron bunches loaded into nonlinear wakefields. This could mean that fully self-consistent beam loading scenarios are possible. This work remains in progress. OSIRIS simulations were used to discover how 200 MeV electron rings are formed in LWFA experiments, how to generate electrons in a series of bunches on the nanometer scale, and how to transport electron beams from (into) plasma sections into (from) conventional beam optic sections.

  18. High performance Python for direct numerical simulations of turbulent flows

    NASA Astrophysics Data System (ADS)

    Mortensen, Mikael; Langtangen, Hans Petter

    2016-06-01

    Direct Numerical Simulation (DNS) of the Navier-Stokes equations is an invaluable research tool in fluid dynamics. Still, there are few publicly available research codes and, due to the heavy number crunching implied, available codes are usually written in low-level languages such as C/C++ or Fortran. In this paper we describe a pure scientific Python pseudo-spectral DNS code that nearly matches the performance of C++ for thousands of processors and billions of unknowns. We also describe a version optimized through Cython that is found to match the speed of C++. The solvers are written from scratch in Python, including the mesh, the MPI domain decomposition, and the temporal integrators. The solvers have been verified and benchmarked on the Shaheen supercomputer at the KAUST supercomputing laboratory, and we are able to show very good scaling up to several thousand cores. A very important part of the implementation is the mesh decomposition (we implement both slab and pencil decompositions) and the 3D parallel Fast Fourier Transforms (FFT). The mesh decomposition and FFT routines have been implemented in Python using serial FFT routines (either NumPy, pyFFTW or any other serial FFT module), NumPy array manipulations, and with MPI communications handled by MPI for Python (mpi4py). We show how we are able to execute a 3D parallel FFT in Python for a slab mesh decomposition using 4 lines of compact Python code, for which the parallel performance on Shaheen is found to be slightly better than that of similar routines provided through the FFTW library; a sketch of the idea is shown below. For a pencil mesh decomposition, 7 lines of code are required to execute a transform.
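
    The slab-decomposed transform can be expressed compactly with NumPy and mpi4py. The following is an illustrative reimplementation of the idea rather than the authors' code; names and the mesh size are ours, and the global mesh size N is assumed to be divisible by the number of MPI ranks.

    ```python
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    P = comm.Get_size()

    N = 64                    # global mesh size (assumed divisible by P)
    Np = N // P               # planes in the local slab
    Nf = N // 2 + 1           # last-axis length after a real-to-complex FFT

    # local slab of a real field: Np planes of the global N^3 mesh along axis 0
    u = np.random.rand(Np, N, N)

    def fftn_mpi(u):
        """Forward 3D real FFT with slab decomposition: local 2D FFTs, a global
        transpose via Alltoall, then the remaining 1D FFT."""
        u_hat = np.fft.rfft2(u, axes=(1, 2))                      # (Np, N, Nf)
        # group data by destination rank so Alltoall delivers contiguous blocks
        send = np.ascontiguousarray(
            np.rollaxis(u_hat.reshape(Np, P, Np, Nf), 1))         # (P, Np, Np, Nf)
        recv = np.empty_like(send)
        comm.Alltoall(send, recv)                                 # global transpose
        fu = recv.reshape(N, Np, Nf)                              # first axis now fully local
        return np.fft.fft(fu, axis=0)

    fu = fftn_mpi(u)          # spectral data, distributed along the second axis
    ```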

  19. Advances and Challenges In Uncertainty Quantification with Application to Climate Prediction, ICF design and Science Stockpile Stewardship

    NASA Astrophysics Data System (ADS)

    Klein, R.; Woodward, C. S.; Johannesson, G.; Domyancic, D.; Covey, C. C.; Lucas, D. D.

    2012-12-01

    Uncertainty Quantification (UQ) is a critical field within 21st century simulation science that resides at the very center of the web of emerging predictive capabilities. The science of UQ holds the promise of giving much greater meaning to the results of complex large-scale simulations, allowing uncertainties to be quantified and bounded. This powerful capability will yield new insights into scientific predictions (e.g. Climate) of great impact in both national and international arenas, allow informed decisions on the design of critical experiments (e.g. ICF capsule design, MFE, NE) in many scientific fields, and assign confidence bounds to scientifically predictable outcomes (e.g. nuclear weapons design). In this talk I will discuss a major new strategic initiative (SI) we have developed at Lawrence Livermore National Laboratory to advance the science of Uncertainty Quantification at LLNL, focusing in particular on (a) the research and development of new algorithms and methodologies of UQ as applied to multi-physics multi-scale codes, (b) the incorporation of these advancements into a global UQ Pipeline (i.e. a computational superstructure) that will simplify user access to sophisticated tools for UQ studies as well as act as a self-guided, self-adapting UQ engine for UQ studies on extreme computing platforms, and (c) the use of laboratory applications as a test bed for new algorithms and methodologies. The initial SI focus has been on applications for the quantification of uncertainty associated with Climate prediction, but the validated UQ methodologies we have developed are now being fed back into Science Based Stockpile Stewardship (SSS) and ICF UQ efforts. To make advancements in several of these UQ grand challenges, I will focus in this talk on the following three research areas in our Strategic Initiative: error estimation in multi-physics and multi-scale codes; tackling the "curse of high dimensionality"; and the development of an advanced UQ Computational Pipeline to enable complete UQ workflow and analysis for ensemble runs at the extreme scale (e.g. exascale) with self-guiding adaptation in the UQ Pipeline engine. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).
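
    The elementary operation such a UQ pipeline automates at scale is forward propagation of input uncertainty through ensembles of model runs. The toy sketch below shows that operation only; the stand-in model, parameter names and ranges are illustrative assumptions, not LLNL's codes or methodologies.

    ```python
    import numpy as np
    from scipy.stats import qmc

    def toy_model(params):
        """Stand-in for an expensive simulation; returns one scalar output."""
        sensitivity, forcing, feedback = params
        return sensitivity * forcing / (1.0 - feedback)

    # space-filling Latin hypercube design over three uncertain inputs
    sampler = qmc.LatinHypercube(d=3, seed=0)
    unit_samples = sampler.random(n=256)
    samples = qmc.scale(unit_samples, [0.5, 1.0, 0.0], [1.5, 3.0, 0.6])

    outputs = np.array([toy_model(p) for p in samples])   # the "ensemble run"
    print(f"mean = {outputs.mean():.2f}, 95% interval = "
          f"[{np.percentile(outputs, 2.5):.2f}, {np.percentile(outputs, 97.5):.2f}]")
    ```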

  20. Report on the Brazilian Scientific Balloon Program

    NASA Astrophysics Data System (ADS)

    Braga, Joao

    We report on the recent scientific ballooning activities in Brazil, including important international collaborations, and present the plans for the next few years. We also present the recent progress achieved in the development and calibration of the protoMIRAX balloon experiment, especially of the detector system. protoMIRAX is a balloon-borne X-ray imaging telescope under development at INPE as a pathfinder for the MIRAX (Monitor e Imageador de Raios X) satellite mission. The experiment consists essentially of a hard X-ray (30-200 keV) coded-aperture imager which employs a square array of 196 10 mm x 10 mm x 2 mm CdZnTe (CZT) planar detectors. A collimator defines a fully-coded field-of-view of 20° x 20°, with 4° x 4° of full sensitivity. The angular resolution will be 1.7°, defined by the use of a 1 mm-thick lead coded mask with an extended (~4x4) 13x13 MURA pattern with 20 mm-side cells, placed at a distance of 650 mm from the detector plane. We describe the design and development of the front-end electronics, with charge preamplifiers and shaping amplifiers customized for these detectors. We present spectral results obtained in the laboratory as well as initial calibration results of the acquisition system designed to obtain positions and energies in the detector plane. We show simulations of the flight background and the expected flight images of bright sources.
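
    For readers unfamiliar with MURA masks, the basic 13x13 pattern referred to above can be generated from quadratic residues. The sketch below follows the standard Gottesman & Fenimore (1989) construction for a prime-sized MURA; it illustrates the pattern family only and is not a description of the actual protoMIRAX mask layout beyond what the record states.

    ```python
    import numpy as np

    def mura_pattern(p):
        """p x p MURA coded-aperture pattern (p prime): 1 = open element, 0 = opaque."""
        residues = {(n * n) % p for n in range(1, p)}       # quadratic residues mod p
        c = np.array([1 if i in residues else -1 for i in range(p)])
        a = np.zeros((p, p), dtype=int)
        for i in range(p):
            for j in range(p):
                if i == 0:
                    a[i, j] = 0
                elif j == 0:
                    a[i, j] = 1
                elif c[i] * c[j] == 1:
                    a[i, j] = 1
        return a

    mask = mura_pattern(13)
    print(mask.sum() / mask.size)    # open fraction, close to one half
    ```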

  1. On the Efficacy of Source Code Optimizations for Cache-Based Systems

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Saphir, William C.

    1998-01-01

    Obtaining high performance without machine-specific tuning is an important goal of scientific application programmers. Since most scientific processing is done on commodity microprocessors with hierarchical memory systems, this goal of "portable performance" can be achieved if a common set of optimization principles is effective for all such systems. It is widely believed, or at least hoped, that portable performance can be realized. The rule of thumb for optimization on hierarchical memory systems is to maximize temporal and spatial locality of memory references by reusing data and minimizing memory access stride. We investigate the effects of a number of optimizations on the performance of three related kernels taken from a computational fluid dynamics application. Timing the kernels on a range of processors, we observe an inconsistent and often counterintuitive impact of the optimizations on performance. In particular, code variations that have a positive impact on one architecture can have a negative impact on another, and variations expected to be unimportant can produce large effects. Moreover, we find that cache miss rates - as reported by a cache simulation tool, and confirmed by hardware counters - only partially explain the results. By contrast, the compiler-generated assembly code provides more insight by revealing the importance of processor-specific instructions and of compiler maturity, both of which strongly, and sometimes unexpectedly, influence performance. We conclude that it is difficult to obtain performance portability on modern cache-based computers, and comment on the implications of this result.
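
    The paper's central observation is that the payoff from locality-oriented optimizations differs between processors. A quick, if crude, way to probe a given machine is to time the same operation with contiguous and with strided memory access. The snippet below is an illustrative probe only, written in Python/NumPy rather than the compiled kernels studied in the paper; the absolute numbers, and even the relative ordering of the two timings, will vary by architecture, which is precisely the paper's point.

    ```python
    import time

    import numpy as np

    a = np.random.rand(4096, 4096)            # C order: each row is contiguous in memory

    t0 = time.perf_counter()
    b = a.copy()                              # unit-stride reads and writes
    t_contiguous = time.perf_counter() - t0

    t0 = time.perf_counter()
    c = np.ascontiguousarray(a.T)             # transposed copy: strided access on one side
    t_strided = time.perf_counter() - t0

    print(f"contiguous copy: {t_contiguous:.3f} s, transposed copy: {t_strided:.3f} s")
    ```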

  2. On the Efficacy of Source Code Optimizations for Cache-Based Systems

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Saphir, William C.; Saini, Subhash (Technical Monitor)

    1998-01-01

    Obtaining high performance without machine-specific tuning is an important goal of scientific application programmers. Since most scientific processing is done on commodity microprocessors with hierarchical memory systems, this goal of "portable performance" can be achieved if a common set of optimization principles is effective for all such systems. It is widely believed, or at least hoped, that portable performance can be realized. The rule of thumb for optimization on hierarchical memory systems is to maximize temporal and spatial locality of memory references by reusing data and minimizing memory access stride. We investigate the effects of a number of optimizations on the performance of three related kernels taken from a computational fluid dynamics application. Timing the kernels on a range of processors, we observe an inconsistent and often counterintuitive impact of the optimizations on performance. In particular, code variations that have a positive impact on one architecture can have a negative impact on another, and variations expected to be unimportant can produce large effects. Moreover, we find that cache miss rates - as reported by a cache simulation tool, and confirmed by hardware counters - only partially explain the results. By contrast, the compiler-generated assembly code provides more insight by revealing the importance of processor-specific instructions and of compiler maturity, both of which strongly, and sometimes unexpectedly, influence performance. We conclude that it is difficult to obtain performance portability on modern cache-based computers, and comment on the implications of this result.

  3. National Fusion Collaboratory: Grid Computing for Simulations and Experiments

    NASA Astrophysics Data System (ADS)

    Greenwald, Martin

    2004-05-01

    The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.

  4. PARLO: PArallel Run-Time Layout Optimization for Scientific Data Explorations with Heterogeneous Access Pattern

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gong, Zhenhuan; Boyuka, David; Zou, X

    The size and scope of cutting-edge scientific simulations are growing much faster than the I/O and storage capabilities of their run-time environments. The growing gap is exacerbated by exploratory, data-intensive analytics, such as querying simulation data with multivariate, spatio-temporal constraints, which induces heterogeneous access patterns that stress the performance of the underlying storage system. Previous work addresses data layout and indexing techniques to improve query performance for a single access pattern, which is not sufficient for complex analytics jobs. We present PARLO, a parallel run-time layout optimization framework, to achieve multi-level data layout optimization for scientific applications at run-time before data is written to storage. The layout schemes optimize for heterogeneous access patterns with user-specified priorities. PARLO is integrated with ADIOS, a high-performance parallel I/O middleware for large-scale HPC applications, to achieve user-transparent, light-weight layout optimization for scientific datasets. It offers simple XML-based configuration for users to achieve flexible layout optimization without the need to modify or recompile application codes. Experiments show that PARLO improves performance by 2 to 26 times for queries with heterogeneous access patterns compared to state-of-the-art scientific database management systems. Compared to traditional post-processing approaches, its underlying run-time layout optimization achieves a 56% savings in processing time and a reduction in storage overhead of up to 50%. PARLO also exhibits a low run-time resource requirement, while also limiting the performance impact on running applications to a reasonable level.

  5. Optimal Design of a Planar Textile Antenna for Industrial Scientific Medical (ISM) 2.4 GHz Wireless Body Area Networks (WBAN) with the CRO-SL Algorithm.

    PubMed

    Sánchez-Montero, Rocío; Camacho-Gómez, Carlos; López-Espí, Pablo-Luís; Salcedo-Sanz, Sancho

    2018-06-21

    This paper proposes a low-profile textile-modified meander line Inverted-F Antenna (IFA) with variable width and spacing meanders, for Industrial Scientific Medical (ISM) 2.4-GHz Wireless Body Area Networks (WBAN), optimized with a novel metaheuristic algorithm. Specifically, a metaheuristic known as Coral Reefs Optimization with Substrate Layer (CRO-SL) is used to obtain an optimal antenna for sensor systems, which allows the 2.4-2.45 GHz industrial scientific medical bandwidth to be covered properly and resiliently. Flexible pad foam has been used to make the designed prototype with a 1.1-mm thickness. We have used a version of the algorithm that is able to combine different searching operators within a single population of solutions. This approach is ideal for dealing with hard optimization problems, such as the design of the proposed meander line IFA. During the optimization phase with the CRO-SL, the proposed antenna has been simulated using CST Microwave Studio software, linked to the CRO-SL by means of a MATLAB implementation and Visual Basic Applications (VBA) code. We fully describe the antenna design process, the adaptation of the CRO-SL approach to this problem, several practical aspects of the optimization, and details on the algorithm's performance. To validate the simulation results, we have constructed and measured two prototypes of the antenna, designed with the proposed algorithm. Several practical aspects such as sensitivity during the antenna manufacturing or the agreement between the simulated and constructed antenna are also detailed in the paper.

  6. Proceedings of the 14th International Conference on the Numerical Simulation of Plasmas

    NASA Astrophysics Data System (ADS)

    Partial Contents are as follows: Numerical Simulations of the Vlasov-Maxwell Equations by Coupled Particle-Finite Element Methods on Unstructured Meshes; Electromagnetic PIC Simulations Using Finite Elements on Unstructured Grids; Modelling Travelling Wave Output Structures with the Particle-in-Cell Code CONDOR; SST--A Single-Slice Particle Simulation Code; Graphical Display and Animation of Data Produced by Electromagnetic, Particle-in-Cell Codes; A Post-Processor for the PEST Code; Gray Scale Rendering of Beam Profile Data; A 2D Electromagnetic PIC Code for Distributed Memory Parallel Computers; 3-D Electromagnetic PIC Simulation on the NRL Connection Machine; Plasma PIC Simulations on MIMD Computers; Vlasov-Maxwell Algorithm for Electromagnetic Plasma Simulation on Distributed Architectures; MHD Boundary Layer Calculation Using the Vortex Method; and Eulerian Codes for Plasma Simulations.

  7. Performance analysis of LDPC codes on OOK terahertz wireless channels

    NASA Astrophysics Data System (ADS)

    Chun, Liu; Chang, Wang; Jun-Cheng, Cao

    2016-02-01

    Atmospheric absorption, scattering, and scintillation are the major causes of degraded transmission quality in terahertz (THz) wireless communications. An error control coding scheme based on low-density parity-check (LDPC) codes with a soft-decision decoding algorithm is proposed to improve the bit-error-rate (BER) performance of an on-off keying (OOK) modulated THz signal transmitted through an atmospheric channel. The THz wave propagation characteristics and the channel model in the atmosphere are set up. Numerical simulations validate the strong performance of LDPC codes against atmospheric fading and demonstrate their huge potential for future ultra-high-speed (beyond Gbps) THz communications. Project supported by the National Key Basic Research Program of China (Grant No. 2014CB339803), the National High Technology Research and Development Program of China (Grant No. 2011AA010205), the National Natural Science Foundation of China (Grant Nos. 61131006, 61321492, and 61204135), the Major National Development Project of Scientific Instrument and Equipment (Grant No. 2011YQ150021), the National Science and Technology Major Project (Grant No. 2011ZX02707), the International Collaboration and Innovation Program on High Mobility Materials Engineering of the Chinese Academy of Sciences, and the Shanghai Municipal Commission of Science and Technology (Grant No. 14530711300).
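
    The baseline against which such a coded scheme is judged is the uncoded OOK link. The sketch below is only that baseline: a Monte-Carlo BER estimate for uncoded OOK over an additive Gaussian noise channel with threshold detection, using one common convention for average SNR. It does not include the atmospheric fading model or the LDPC soft-decision decoder studied in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_bits = 200_000

    for snr_db in (4, 8, 12):
        bits = rng.integers(0, 2, n_bits)
        symbols = bits.astype(float)                      # OOK: "on" = 1, "off" = 0
        signal_power = 0.5                                # average power of the OOK symbols
        noise_power = signal_power / 10 ** (snr_db / 10)
        received = symbols + rng.normal(0.0, np.sqrt(noise_power), n_bits)
        decisions = (received > 0.5).astype(int)          # threshold at half the "on" amplitude
        ber = np.mean(decisions != bits)
        print(f"SNR {snr_db:>2} dB: uncoded BER = {ber:.2e}")
    ```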

  8. Scientific Ethics in Chemical Education

    NASA Astrophysics Data System (ADS)

    Kovac, Jeffrey

    1996-10-01

    Scientific ethics is a subset of professional ethics, the special rules of conduct adhered to by people engaged in those pursuits called professions. It is distinct from, but consistent with, both ordinary morality and moral theory. The codes of professional ethics derive from the two bargains that define a profession: the internal code of practice and the external bargain between the profession and society. While the informal code of professional conduct is well understood by working scientists, it is rarely explicitly included in the chemistry curriculum. Instead, we have relied on informal methods to teach students scientific ethics, a strategy that is haphazard at best. In this paper I argue that scientific ethics can and must be taught as part of the chemistry curriculum and that this is the best done through the case-study method. Many decisions made by working scientists have both a technical and an ethical component. Students need to learn how to make good decisions in professional ethics. The alternative is, at best, sloppy science and, at worst, scientific misconduct.

  9. Trident and MISTY: a universal pipeline for generating and sharing synthetic spectra

    NASA Astrophysics Data System (ADS)

    Hummels, Cameron; Smith, Britton; Silvia, Devin; Peeples, Molly; Prochaska, X.; Tejos, Nicolas

    2016-03-01

    Astrophysical simulations are useful insofar as they aid in the interpretation of telescopic observations. Thus, a primary task in simulation analysis is producing synthetic observations for direct comparison against observational data. Furthermore, we as a field need an effective means for storing these synthetic observable data products, such that they are accessible and searchable by the entire population of researchers. In this talk, we present Trident, a universal pipeline for producing synthetic spectra from any of the major hydrodynamics codes, and MISTY, a means of storing these spectra on the HST MAST data archive. Trident and MISTY are our attempts to solve the difficult problems of synthetic data production and publicly-accessible storage for the scientific communities studying the intergalactic medium and circumgalactic medium.

  10. European Code against Cancer 4th Edition: Process of reviewing the scientific evidence and revising the recommendations.

    PubMed

    Minozzi, Silvia; Armaroli, Paola; Espina, Carolina; Villain, Patricia; Wiseman, Martin; Schüz, Joachim; Segnan, Nereo

    2015-12-01

    The European Code Against Cancer is a set of recommendations to give advice on cancer prevention. Its 4th edition is an update of the 3rd edition, from 2003. Working Groups of independent experts from different fields of cancer prevention were appointed to review the recommendations, supported by a Literature Group to provide scientific and technical support in the assessment of the scientific evidence, through systematic reviews of the literature. Common procedures were developed to guide the experts in identifying, retrieving, assessing, interpreting and summarizing the scientific evidence in order to revise the recommendations. The Code strictly followed the concept of providing advice to European Union citizens based on the current best available science. The advice, if followed, would be expected to reduce cancer risk, referring both to avoiding or reducing exposure to carcinogenic agents or changing behaviour related to cancer risk and to participating in medical interventions able to avert specific cancers or their consequences. The information sources and procedures for the review of the scientific evidence are described here in detail. The 12 recommendations of the 4th edition of the European Code Against Cancer were ultimately approved by a Scientific Committee of leading European cancer and public health experts. Copyright © 2015 International Agency for Research on Cancer. Published by Elsevier Ltd. All rights reserved.

  11. Testing Scientific Software: A Systematic Literature Review.

    PubMed

    Kanewala, Upulee; Bieman, James M

    2014-10-01

    Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software such as oracle problems and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software such as oracle problems when developing testing techniques.
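
    One widely used response to the oracle problem named above is metamorphic testing: when the exact expected output is unknown, test how the output must change when the input is transformed in a known way. The sketch below illustrates the idea with a hypothetical toy kernel and pytest-style assertions; the function and relations are ours, not drawn from the surveyed studies.

    ```python
    import numpy as np

    def kinetic_energy(velocities, masses):
        """Hypothetical scientific kernel under test."""
        return 0.5 * np.sum(masses * np.sum(velocities**2, axis=1))

    def test_kinetic_energy_metamorphic_relations():
        rng = np.random.default_rng(42)
        v = rng.normal(size=(100, 3))
        m = rng.uniform(1.0, 2.0, size=100)
        e = kinetic_energy(v, m)
        # relation 1: doubling every velocity must quadruple the energy
        assert np.isclose(kinetic_energy(2.0 * v, m), 4.0 * e)
        # relation 2: the result must not depend on particle ordering
        perm = rng.permutation(100)
        assert np.isclose(kinetic_energy(v[perm], m[perm]), e)
    ```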

  12. Accelerating Pseudo-Random Number Generator for MCNP on GPU

    NASA Astrophysics Data System (ADS)

    Gong, Chunye; Liu, Jie; Chi, Lihua; Hu, Qingfeng; Deng, Li; Gong, Zhenghu

    2010-09-01

    Pseudo-random number generators (PRNGs) are intensively used in many stochastic algorithms in particle simulations, artificial neural networks and other scientific computations. The PRNG in the Monte Carlo N-Particle Transport Code (MCNP) requires a long period, high quality, flexible jump-ahead, and high speed. In this paper, we implement such a PRNG for MCNP on NVIDIA's GTX200 Graphics Processing Units (GPU) using the CUDA programming model. Results show that speedups of 3.80 to 8.10 times are achieved compared with 4- to 6-core CPUs, and that more than 679.18 million double precision random numbers can be generated per second on the GPU.
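
    The "flexible jump" requirement refers to jump-ahead: every GPU thread must be able to start its own independent subsequence in O(log k) time rather than stepping the generator k times. For a linear congruential generator this is done with the standard arbitrary-stride technique (as described, for example, by F. B. Brown for Monte Carlo transport codes). The sketch below is a plain-Python illustration of that technique; the constants are Knuth's well-known MMIX LCG parameters, used purely for illustration and not claimed to be MCNP's.

    ```python
    def lcg_skip_ahead(seed, k, a, c, m):
        """State of the LCG x -> (a*x + c) mod m, k steps ahead of `seed`, in O(log k)."""
        A, C = 1, 0              # net multiplier and increment of the k-step jump
        mult, add = a, c
        while k > 0:
            if k & 1:
                A = (A * mult) % m
                C = (C * mult + add) % m
            add = ((mult + 1) * add) % m     # compose the doubled jump's increment
            mult = (mult * mult) % m         # and its multiplier
            k >>= 1
        return (A * seed + C) % m

    # illustrative constants (Knuth's MMIX LCG), not MCNP's actual parameters
    a, c, m = 6364136223846793005, 1442695040888963407, 2**64
    seed, k = 1, 1000

    # verify the O(log k) jump against explicit stepping
    state = seed
    for _ in range(k):
        state = (a * state + c) % m
    assert state == lcg_skip_ahead(seed, k, a, c, m)
    ```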

  13. Modeling Subsurface Reactive Flows Using Leadership-Class Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, Richard T; Hammond, Glenn; Lichtner, Peter

    2009-01-01

    We describe our experiences running PFLOTRAN - a code for simulation of coupled hydro-thermal-chemical processes in variably saturated, non-isothermal, porous media - on leadership-class supercomputers, including initial experiences running on the petaflop incarnation of Jaguar, the Cray XT5 at the National Center for Computational Sciences at Oak Ridge National Laboratory. PFLOTRAN utilizes fully implicit time-stepping and is built on top of the Portable, Extensible Toolkit for Scientific Computation (PETSc). We discuss some of the hurdles to 'at scale' performance with PFLOTRAN and the progress we have made in overcoming them on leadership-class computer architectures.

  14. Sirepo: a web-based interface for physical optics simulations - its deployment and use at NSLS-II

    NASA Astrophysics Data System (ADS)

    Rakitin, Maksim S.; Chubar, Oleg; Moeller, Paul; Nagler, Robert; Bruhwiler, David L.

    2017-08-01

    "Sirepo" is an open source cloud-based software framework which provides a convenient and user-friendly web-interface for scientific codes such as Synchrotron Radiation Workshop (SRW) running on a local machine or a remote server side. SRW is a physical optics code allowing to simulate the synchrotron radiation from various insertion devices (undulators and wigglers) and bending magnets. Another feature of SRW is a support of high-accuracy simulation of fully- and partially-coherent radiation propagation through X-ray optical beamlines, facilitated by so-called "Virtual Beamline" module. In the present work, we will discuss the most important features of Sirepo/SRW interface with emphasis on their use for commissioning of beamlines and simulation of experiments at National Synchrotron Light Source II. In particular, "Flux through Finite Aperture" and "Intensity" reports, visualizing results of the corresponding SRW calculations, are being routinely used for commissioning of undulators and X-ray optical elements. Material properties of crystals, compound refractive lenses, and some other optical elements can be dynamically obtained for the desired photon energy from the databases publicly available at Argonne National Lab and at Lawrence Berkeley Lab. In collaboration with the Center for Functional Nanomaterials (CFN) of BNL, a library of samples for coherent scattering experiments has been implemented in SRW and the corresponding Sample optical element was added to Sirepo. Electron microscope images of artificially created nanoscale samples can be uploaded to Sirepo to simulate scattering patterns created by synchrotron radiation in different experimental schemes that can be realized at beamlines.

  15. An assessment of multibody simulation tools for articulated spacecraft

    NASA Technical Reports Server (NTRS)

    Man, Guy K.; Sirlin, Samuel W.

    1989-01-01

    A survey of multibody simulation codes was conducted in the spring of 1988, to obtain an assessment of the state of the art in multibody simulation codes from the users of the codes. This survey covers the most often used articulated multibody simulation codes in the spacecraft and robotics community. There was no attempt to perform a complete survey of all available multibody codes in all disciplines. Furthermore, this is not an exhaustive evaluation of even robotics and spacecraft multibody simulation codes, as the survey was designed to capture feedback on issues most important to the users of simulation codes. We must keep in mind that the information received was limited and the technical background of the respondents varied greatly. Therefore, only the most often cited observations from the questionnaire are reported here. In this survey, it was found that no one code had both many users (reports) and no limitations. The first section is a report on multibody code applications. Following applications is a discussion of execution time, which is the most troublesome issue for flexible multibody codes. The representation of component flexible bodies, which affects both simulation setup time as well as execution time, is presented next. Following component data preparation, two sections address the accessibility or usability of a code, evaluated by considering its user interface design and examining the overall simulation integrated environment. A summary of user efforts at code verification is reported, before a tabular summary of the questionnaire responses. Finally, some conclusions are drawn.

  16. GPU accelerated simulations of 3D deterministic particle transport using discrete ordinates method

    NASA Astrophysics Data System (ADS)

    Gong, Chunye; Liu, Jie; Chi, Lihua; Huang, Haowei; Fang, Jingyue; Gong, Zhenghu

    2011-07-01

    The Graphics Processing Unit (GPU), originally developed for real-time, high-definition 3D graphics in computer games, now provides great capability for solving scientific applications. The basis of particle transport simulation is the time-dependent, multi-group, inhomogeneous Boltzmann transport equation. The numerical solution of the Boltzmann equation involves the discrete ordinates (Sn) method and the procedure of source iteration. In this paper, we present a GPU accelerated simulation of one-energy-group, time-independent, deterministic discrete ordinates particle transport in 3D Cartesian geometry (Sweep3D). The performance of the GPU simulations is reported for runs with vacuum boundary conditions. The relative advantages and disadvantages of the GPU implementation, simulation on multiple GPUs, the programming effort and code portability are also discussed. The results show that the overall performance speedup of one NVIDIA Tesla M2050 GPU ranges from 2.56 compared with one Intel Xeon X5670 chip to 8.14 compared with one Intel Core Q6600 chip for no flux fixup. The simulation with flux fixup on one M2050 is 1.23 times faster than on one X5670.
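
    The sweep-plus-source-iteration structure named above is easiest to see in one dimension. The following is a minimal serial sketch of the discrete ordinates method for a 1D slab with isotropic scattering, diamond-difference closure, and vacuum boundaries; the cross sections and mesh are arbitrary assumptions, and nothing here reflects the 3D Sweep3D kernel or its GPU mapping.

    ```python
    import numpy as np

    nx, sn = 100, 8
    dx = 0.1
    sigma_t, sigma_s = 1.0, 0.5                    # total and scattering cross sections
    q_ext = np.ones(nx)                            # uniform external source

    mu, w = np.polynomial.legendre.leggauss(sn)    # ordinates and weights (weights sum to 2)
    phi = np.zeros(nx)                             # scalar flux

    for iteration in range(200):
        source = 0.5 * (sigma_s * phi + q_ext)     # isotropic source per unit direction cosine
        phi_new = np.zeros(nx)
        for n in range(sn):
            psi_in = 0.0                           # vacuum boundary
            m = abs(mu[n])
            cells = range(nx) if mu[n] > 0 else range(nx - 1, -1, -1)
            for i in cells:                        # transport sweep along the direction of travel
                psi_out = (source[i] * dx + (m - 0.5 * sigma_t * dx) * psi_in) \
                          / (m + 0.5 * sigma_t * dx)
                psi_cell = 0.5 * (psi_in + psi_out)       # diamond-difference closure
                phi_new[i] += w[n] * psi_cell
                psi_in = psi_out
        diff = np.max(np.abs(phi_new - phi))
        phi = phi_new
        if diff < 1e-8:
            break

    print(f"converged in {iteration + 1} source iterations, peak scalar flux {phi.max():.4f}")
    ```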

  17. Software Attribution for Geoscience Applications in the Computational Infrastructure for Geodynamics

    NASA Astrophysics Data System (ADS)

    Hwang, L.; Dumit, J.; Fish, A.; Soito, L.; Kellogg, L. H.; Smith, M.

    2015-12-01

    Scientific software is largely developed by individual scientists and represents a significant intellectual contribution to the field. As the scientific culture and funding agencies move towards an expectation that software be open source, there is a corresponding need for mechanisms to cite software, both to provide credit and recognition to developers and to aid in the discoverability of software and scientific reproducibility. We assess the geodynamic modeling community's current citation practices by examining more than 300 predominantly self-reported publications from the past 5 years that utilize scientific software available through the Computational Infrastructure for Geodynamics (CIG). Preliminary results indicate that authors cite and attribute software by citing (in rank order) peer-reviewed scientific publications, a user's manual, and/or a paper describing the software code. Attributions may be found directly in the text, in acknowledgements, in figure captions, or in footnotes. What is considered citable varies widely. Citations predominantly lack the software version numbers or persistent identifiers needed to find the software package. Versioning may be implied through reference to a versioned user manual. Authors sometimes report code features used and whether they have modified the code. As an open-source community, CIG requests that researchers contribute their modifications to the repository. However, such modifications may not be contributed back to a repository code branch, decreasing the chances of discoverability and reproducibility. Survey results from CIG's Software Attribution for Geoscience Applications (SAGA) project suggest that lack of knowledge, tools, and workflows for citing codes are barriers to effectively implementing the emerging citation norms. On-demand attribution generation on software landing pages and a prototype extensible plug-in that automatically generates attributions in codes are first steps towards reproducibility.

  18. Code Samples Used for Complexity and Control

    NASA Astrophysics Data System (ADS)

    Ivancevic, Vladimir G.; Reid, Darryn J.

    2015-11-01

    The following sections are included: * MathematicaⓇ Code * Generic Chaotic Simulator * Vector Differential Operators * NLS Explorer * C++ Code * C++ Lambda Functions for Real Calculus * Accelerometer Data Processor * Simple Predictor-Corrector Integrator * Solving the BVP with the Shooting Method * Linear Hyperbolic PDE Solver * Linear Elliptic PDE Solver * Method of Lines for a Set of the NLS Equations * C# Code * Iterative Equation Solver * Simulated Annealing: A Function Minimum * Simple Nonlinear Dynamics * Nonlinear Pendulum Simulator * Lagrangian Dynamics Simulator * Complex-Valued Crowd Attractor Dynamics * Freeform Fortran Code * Lorenz Attractor Simulator * Complex Lorenz Attractor * Simple SGE Soliton * Complex Signal Presentation * Gaussian Wave Packet * Hermitian Matrices * Euclidean L2-Norm * Vector/Matrix Operations * Plain C-Code: Levenberg-Marquardt Optimizer * Free Basic Code: 2D Crowd Dynamics with 3000 Agents

  19. PetIGA: A framework for high-performance isogeometric analysis

    DOE PAGES

    Dalcin, Lisandro; Collier, Nathaniel; Vignal, Philippe; ...

    2016-05-25

    We present PetIGA, a code framework to approximate the solution of partial differential equations using isogeometric analysis. PetIGA can be used to assemble matrices and vectors which come from a Galerkin weak form, discretized with Non-Uniform Rational B-spline basis functions. We base our framework on PETSc, a high-performance library for the scalable solution of partial differential equations, which simplifies the development of large-scale scientific codes, provides a rich environment for prototyping, and separates parallelism from algorithm choice. We describe the implementation of PetIGA, and exemplify its use by solving a model nonlinear problem. To illustrate the robustness and flexibility of PetIGA, we solve some challenging nonlinear partial differential equations that include problems in both solid and fluid mechanics. Lastly, we show strong scaling results on up to 4096 cores, which confirm the suitability of PetIGA for large-scale simulations.
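
    As a generic illustration of assembling a matrix from a Galerkin weak form, the sketch below builds the 1D Poisson stiffness matrix with piecewise-linear hat functions. It is only a toy analogue: PetIGA works with NURBS/B-spline bases and PETSc data structures, neither of which appears here.

        import numpy as np

        # Minimal sketch: Galerkin assembly of the 1D Poisson problem
        # -u'' = 1 on (0, 1) with homogeneous Dirichlet BCs, linear elements.
        # Illustrative only; PetIGA uses NURBS bases and PETSc structures.

        n_el = 8
        x = np.linspace(0.0, 1.0, n_el + 1)
        K = np.zeros((n_el + 1, n_el + 1))
        F = np.zeros(n_el + 1)

        for e in range(n_el):
            h = x[e + 1] - x[e]
            ke = (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])  # element stiffness
            fe = 0.5 * h * np.ones(2)                              # f = 1 load vector
            idx = [e, e + 1]
            K[np.ix_(idx, idx)] += ke
            F[idx] += fe

        # Apply Dirichlet BCs u(0) = u(1) = 0 and solve the interior system
        u = np.zeros(n_el + 1)
        u[1:-1] = np.linalg.solve(K[1:-1, 1:-1], F[1:-1])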

  20. Paradigms and strategies for scientific computing on distributed memory concurrent computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, I.T.; Walker, D.W.

    1994-06-01

    In this work we examine recent advances in parallel languages and abstractions that have the potential for improving the programmability and maintainability of large-scale, parallel, scientific applications running on high-performance architectures and networks. This paper focuses on Fortran M, a set of extensions to Fortran 77 that supports the modular design of message-passing programs. We describe the Fortran M implementation of a particle-in-cell (PIC) plasma simulation application and discuss issues in the optimization of the code. The use of two other methodologies for parallelizing the PIC application is also considered. The first is based on the shared-object abstraction as embodied in the Orca language. The second approach is the Split-C language. In Fortran M, Orca, and Split-C, the ability of the programmer to control the granularity of communication is important in designing an efficient implementation.
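
    For context on the application, the core of an electrostatic particle-in-cell (PIC) step is charge deposition to a grid, a field solve, and a particle push. The sketch below is a generic one-dimensional PIC step in Python, not the Fortran M, Orca, or Split-C implementations discussed above, and all parameters are chosen arbitrarily.

        import numpy as np

        # Minimal sketch of one 1D electrostatic PIC step on a periodic domain.
        # Generic illustration only; not the Fortran M / Orca / Split-C codes above.

        np.random.seed(0)
        L, ng, npart, dt, qm = 1.0, 64, 10_000, 0.01, -1.0   # arbitrary parameters
        dx = L / ng
        xp = np.random.uniform(0, L, npart)          # particle positions
        vp = np.random.normal(0, 0.05, npart)        # particle velocities
        q = -L / npart                               # electron charge per particle

        # 1) deposit charge to the grid with linear (cloud-in-cell) weighting
        cell = (xp / dx).astype(int) % ng
        frac = xp / dx - (xp / dx).astype(int)
        rho = np.zeros(ng)
        np.add.at(rho, cell, q * (1 - frac) / dx)
        np.add.at(rho, (cell + 1) % ng, q * frac / dx)
        rho += 1.0                                   # neutralizing ion background

        # 2) field solve: Poisson equation in Fourier space, then E = -dphi/dx
        k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
        rho_k = np.fft.fft(rho)
        phi_k = np.zeros_like(rho_k)
        phi_k[1:] = rho_k[1:] / k[1:] ** 2
        E = np.real(np.fft.ifft(-1j * k * phi_k))

        # 3) gather the field to particles and push (leapfrog)
        Ep = E[cell] * (1 - frac) + E[(cell + 1) % ng] * frac
        vp += qm * Ep * dt
        xp = (xp + vp * dt) % L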

  1. High Performance Input/Output for Parallel Computer Systems

    NASA Technical Reports Server (NTRS)

    Ligon, W. B.

    1996-01-01

    The goal of our project is to study the I/O characteristics of parallel applications used in Earth science data processing systems such as Regional Data Centers (RDCs) or EOSDIS. Our approach is to study the runtime behavior of typical programs and the effect of key parameters of the I/O subsystem, both under simulation and with direct experimentation on parallel systems. Our three-year activity has focused on two items: developing a test bed that facilitates experimentation with parallel I/O, and studying representative programs from the Earth science data processing application domain. The Parallel Virtual File System (PVFS) has been developed for use on a number of platforms including the Tiger Parallel Architecture Workbench (TPAW) simulator, the Intel Paragon, a cluster of DEC Alpha workstations, and the Beowulf system (at CESDIS). PVFS provides considerable flexibility in configuring I/O in a UNIX-like environment. Access to key performance parameters facilitates experimentation. We have studied several key applications from levels 1, 2, and 3 of the typical RDC processing scenario, including instrument calibration and navigation, image classification, and numerical modeling codes. We have also considered large-scale scientific database codes used to organize image data.

  2. CDAC Student Report: Summary of LLNL Internship

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herriman, Jane E.

    Multiple objectives motivated me to apply for an internship at LLNL: I wanted to experience the work environment at a national lab, to learn about research and job opportunities at LLNL in particular, and to gain greater experience with code development, particularly within the realm of high performance computing (HPC). This summer I was selected to participate in LLNL's Computational Chemistry and Material Science Summer Institute (CCMS). CCMS is a 10-week program hosted by the Quantum Simulations group leader, Dr. Eric Schwegler. CCMS connects graduate students to mentors at LLNL involved in similar research and provides weekly seminars on a broad array of topics from within chemistry and materials science. Dr. Xavier Andrade and Dr. Erik Draeger served as my co-mentors over the summer, and Dr. Andrade continues to mentor me now that CCMS has concluded. Dr. Andrade is a member of the Quantum Simulations group within the Physical and Life Sciences at LLNL, and Dr. Draeger leads the HPC group within the Center for Applied Scientific Computing (CASC). The two have worked together to develop Qb@ll, an open-source first-principles molecular dynamics code that was the platform for my summer research project.

  3. Real science at the petascale.

    PubMed

    Saksena, Radhika S; Boghosian, Bruce; Fazendeiro, Luis; Kenway, Owain A; Manos, Steven; Mazzeo, Marco D; Sadiq, S Kashif; Suter, James L; Wright, David; Coveney, Peter V

    2009-06-28

    We describe computational science research that uses petascale resources to achieve scientific results at unprecedented scales and resolution. The applications span a wide range of domains, from investigation of fundamental problems in turbulence through computational materials science research to biomedical applications at the forefront of HIV/AIDS research and cerebrovascular haemodynamics. This work was mainly performed on the US TeraGrid 'petascale' resource, Ranger, at Texas Advanced Computing Center, in the first half of 2008 when it was the largest computing system in the world available for open scientific research. We have sought to use this petascale supercomputer optimally across application domains and scales, exploiting the excellent parallel scaling performance found on up to at least 32 768 cores for certain of our codes in the so-called 'capability computing' category as well as high-throughput intermediate-scale jobs for ensemble simulations in the 32-512 core range. Furthermore, this activity provides evidence that conventional parallel programming with MPI should be successful at the petascale in the short to medium term. We also report on the parallel performance of some of our codes on up to 65 536 cores on the IBM Blue Gene/P system at the Argonne Leadership Computing Facility, which has recently been named the fastest supercomputer in the world for open science.

  4. Modeling the Blast Load Simulator Airblast Environment using First Principles Codes. Report 1, Blast Load Simulator Environment

    DTIC Science & Technology

    2016-11-01

    ERDC/GSL TR-16-31. Modeling the Blast Load Simulator Airblast Environment Using First Principles Codes, Report 1: Blast Load Simulator Environment. Gregory C. Bessette, James L. O'Daniel. ... evaluate several first principles codes (FPCs) for modeling airblast environments typical of those encountered in the BLS. The FPCs considered were ...

  5. Final Report for ALCC Allocation: Predictive Simulation of Complex Flow in Wind Farms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barone, Matthew F.; Ananthan, Shreyas; Churchfield, Matt

    This report documents work performed using ALCC computing resources granted under a proposal submitted in February 2016, with the resource allocation spanning July 2016 through June 2017. The award allocation was 10.7 million processor-hours at the National Energy Research Scientific Computing Center. The simulations performed were in support of two projects: the Atmosphere to Electrons (A2e) project, supported by the DOE EERE office; and the Exascale Computing Project (ECP), supported by the DOE Office of Science. The project team for both efforts consists of staff scientists and postdocs from Sandia National Laboratories and the National Renewable Energy Laboratory. At the heart of these projects is the open-source computational-fluid-dynamics (CFD) code, Nalu. Nalu solves the low-Mach-number Navier-Stokes equations using an unstructured-grid discretization. Nalu leverages the open-source Trilinos solver library and the Sierra Toolkit (STK) for parallelization and I/O. This report documents baseline computational performance of the Nalu code on problems of direct relevance to the wind plant physics application, namely Large Eddy Simulation (LES) of an atmospheric boundary layer (ABL) flow and wall-modeled LES of a flow past a static wind turbine rotor blade. Parallel performance of Nalu and its constituent solver routines residing in the Trilinos library has been assessed previously under various campaigns. However, both Nalu and Trilinos have been, and remain, in active development, and resources have not been available previously to rigorously track code performance over time. With the initiation of the ECP, it is important to establish and document baseline code performance on the problems of interest. This will allow the project team to identify and target any deficiencies in performance, as well as highlight any performance bottlenecks as we exercise the code on a greater variety of platforms and at larger scales. The current study is rather modest in scale, examining performance on problem sizes of O(100 million) elements and core counts up to 8k cores. This will be expanded as more computational resources become available to the projects.

  6. Core domains of shared decision-making during psychiatric visits: scientific and preference-based discussions.

    PubMed

    Fukui, Sadaaki; Matthias, Marianne S; Salyers, Michelle P

    2015-01-01

    Shared decision-making (SDM) is imperative to person-centered care, yet little is known about which aspects of SDM are targeted during psychiatric visits. This secondary data analysis (191 psychiatric visits with 11 providers, coded with a validated SDM coding system) revealed two factors underlying SDM communication: scientific and preference-based discussions. Preference-based discussion occurred less often. Both provider and consumer initiation of SDM elements and decision complexity were associated with more discussion in both factors, but were more strongly associated with scientific discussion. Longer visit length correlated only with scientific discussion. Providers' understanding of these core domains could facilitate engaging consumers in SDM.

  7. How Do We Ensure Research and Scientific Integrity? A Diverse Panel Discusses the Critical Components and Challenges of Crafting and Implementing Effective Scientific Integrity Policies.

    NASA Astrophysics Data System (ADS)

    Werkheiser, W. H.

    2016-12-01

    10 Years of Scientific Integrity Policy at the U.S. Geological Survey The U.S. Geological Survey implemented its first scientific integrity policy in January 2007. Following the 2009 and 2010 executive memoranda aimed at creating scientific integrity policies throughout the federal government, USGS' policy served as a template to inform the U.S. Department of Interior's policy set forth in January 2011. Scientific integrity policy at the USGS and DOI continues to evolve as best practices come to the fore and the broader Federal scientific integrity community evolves in its understanding of a vital and expanding endeavor. We find that scientific integrity is best served by: formal and informal mechanisms through which to resolve scientific integrity issues; a well-communicated and enforceable code of scientific conduct that is accessible to multiple audiences; an unfailing commitment to the code on the part of all parties; awareness through mandatory training; robust protection to encourage whistleblowers to come forward; and outreach with the scientific integrity community to foster consistency and share experiences.

  8. How Do We Ensure Research and Scientific Integrity? A Diverse Panel Discusses the Critical Components and Challenges of Crafting and Implementing Effective Scientific Integrity Policies.

    NASA Astrophysics Data System (ADS)

    Werkheiser, W. H.

    2017-12-01

    10 Years of Scientific Integrity Policy at the U.S. Geological Survey The U.S. Geological Survey implemented its first scientific integrity policy in January 2007. Following the 2009 and 2010 executive memoranda aimed at creating scientific integrity policies throughout the federal government, USGS' policy served as a template to inform the U.S. Department of Interior's policy set forth in January 2011. Scientific integrity policy at the USGS and DOI continues to evolve as best practices come to the fore and the broader Federal scientific integrity community evolves in its understanding of a vital and expanding endeavor. We find that scientific integrity is best served by: formal and informal mechanisms through which to resolve scientific integrity issues; a well-communicated and enforceable code of scientific conduct that is accessible to multiple audiences; an unfailing commitment to the code on the part of all parties; awareness through mandatory training; robust protection to encourage whistleblowers to come forward; and outreach with the scientific integrity community to foster consistency and share experiences.

  9. The Bern Simple Climate Model (BernSCM) v1.0: an extensible and fully documented open-source re-implementation of the Bern reduced-form model for global carbon cycle-climate simulations

    NASA Astrophysics Data System (ADS)

    Strassmann, Kuno M.; Joos, Fortunat

    2018-05-01

    The Bern Simple Climate Model (BernSCM) is a free open-source re-implementation of a reduced-form carbon cycle-climate model which has been used widely in previous scientific work and IPCC assessments. BernSCM represents the carbon cycle and climate system with a small set of equations for the heat and carbon budget, the parametrization of major nonlinearities, and the substitution of complex component systems with impulse response functions (IRFs). The IRF approach allows cost-efficient yet accurate substitution of detailed parent models of climate system components with near-linear behavior. Illustrative simulations of scenarios from previous multimodel studies show that BernSCM is broadly representative of the range of the climate-carbon cycle response simulated by more complex and detailed models. Model code (in Fortran) was written from scratch with transparency and extensibility in mind, and is provided open source. BernSCM makes scientifically sound carbon cycle-climate modeling available for many applications. Supporting up to decadal time steps with high accuracy, it is suitable for studies with high computational load and for coupling with integrated assessment models (IAMs), for example. Further applications include climate risk assessment in a business, public, or educational context and the estimation of CO2 and climate benefits of emission mitigation options.
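
    To illustrate the IRF idea in general terms, the response of a near-linear component can be approximated by convolving its forcing with a sum of exponentials. The sketch below uses invented weights and time scales, not BernSCM's parameter values or its Fortran implementation.

        import numpy as np

        # Minimal sketch of the impulse-response-function (IRF) substitution idea:
        # approximate a near-linear climate-system component by convolving its
        # forcing with a sum of exponentials. Weights and time scales below are
        # illustrative inventions, NOT the BernSCM parameter values.

        dt = 1.0                                    # time step [yr]
        t = np.arange(0, 200, dt)
        weights = np.array([0.2, 0.3, 0.5])         # hypothetical IRF weights (sum to 1)
        taus = np.array([5.0, 50.0, 500.0])         # hypothetical time scales [yr]
        irf = (weights[:, None] * np.exp(-t[None, :] / taus[:, None])).sum(axis=0)

        forcing = np.where(t < 50, 1.0, 0.0)        # a 50-year pulse of forcing
        response = np.convolve(forcing, irf)[: len(t)] * dt   # discrete convolution

        print(float(response[49]), float(response[-1]))       # during and after the pulse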

  10. Testing Scientific Software: A Systematic Literature Review

    PubMed Central

    Kanewala, Upulee; Bieman, James M.

    2014-01-01

    Context: Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. Objective: This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. Method: We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. Results: We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software, such as oracle problems, and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community, such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally, we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Conclusions: Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software, make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software, such as oracle problems, when developing testing techniques. PMID:25125798
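
    One common response to the oracle problem mentioned above is metamorphic testing (our example, not a technique proposed in the paper): instead of checking an exact expected output, the test checks a relation that must hold between outputs of related inputs. A minimal sketch:

        import numpy as np

        # Minimal sketch of a metamorphic test: when the exact expected output is
        # unknown (the "oracle problem"), check a relation between related runs.
        # Here the property is linearity of the trapezoidal integrator:
        # integral(a * f) == a * integral(f).

        def trapz_integrate(f, x):
            y = f(x)
            return float(np.trapz(y, x))

        def test_scaling_relation():
            x = np.linspace(0.0, np.pi, 1001)
            f = np.sin
            a = 3.7
            lhs = trapz_integrate(lambda t: a * f(t), x)
            rhs = a * trapz_integrate(f, x)
            assert abs(lhs - rhs) < 1e-12

        test_scaling_relation()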

  11. Assessment of Molecular Modeling & Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2002-01-03

    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  12. New developments in the McStas neutron instrument simulation package

    NASA Astrophysics Data System (ADS)

    Willendrup, P. K.; Knudsen, E. B.; Klinkby, E.; Nielsen, T.; Farhi, E.; Filges, U.; Lefmann, K.

    2014-07-01

    The McStas neutron ray-tracing software package is a versatile tool for building accurate simulators of neutron scattering instruments at reactors, short- and long-pulsed spallation sources such as the European Spallation Source. McStas is extensively used for design and optimization of instruments, virtual experiments, data analysis and user training. McStas was founded as a scientific, open-source collaborative code in 1997. This contribution presents the project at its current state and gives an overview of the main new developments in McStas 2.0 (December 2012) and McStas 2.1 (expected fall 2013), including many new components, component parameter uniformisation, partial loss of backward compatibility, updated source brilliance descriptions, developments toward new tools and user interfaces, web interfaces and a new method for estimating beam losses and background from neutron optics.

  13. Implementing Scientific Simulation Codes Highly Tailored for Vector Architectures Using Custom Configurable Computing Machines

    NASA Technical Reports Server (NTRS)

    Rutishauser, David

    2006-01-01

    The motivation for this work comes from an observation that amidst the push for Massively Parallel (MP) solutions to high-end computing problems such as numerical physical simulations, large amounts of legacy code exist that are highly optimized for vector supercomputers. Because re-hosting legacy code often requires a complete re-write of the original code, which can be a very long and expensive effort, this work examines the potential to exploit reconfigurable computing machines in place of a vector supercomputer to implement an essentially unmodified legacy source code. Custom and reconfigurable computing resources could be used to emulate an original application's target platform to the extent required to achieve high performance. To arrive at an architecture that delivers the desired performance subject to limited resources involves solving a multi-variable optimization problem with constraints. Prior research in the area of reconfigurable computing has demonstrated that designing an optimum hardware implementation of a given application under hardware resource constraints is an NP-complete problem. The premise of the approach is that the general issue of applying reconfigurable computing resources to the implementation of an application, maximizing the performance of the computation subject to physical resource constraints, can be made a tractable problem by assuming a computational paradigm, such as vector processing. This research contributes a formulation of the problem and a methodology to design a reconfigurable vector processing implementation of a given application that satisfies a performance metric. A generic, parametric, architectural framework for vector processing implemented in reconfigurable logic is developed as a target for a scheduling/mapping algorithm that maps an input computation to a given instance of the architecture. This algorithm is integrated with an optimization framework to arrive at a specification of the architecture parameters that attempts to minimize execution time, while staying within resource constraints. The flexibility of using a custom reconfigurable implementation is exploited in a unique manner to leverage the lessons learned in vector supercomputer development. The vector processing framework is tailored to the application, with variable parameters that are fixed in traditional vector processing. Benchmark data that demonstrates the functionality and utility of the approach is presented. The benchmark data includes an identified bottleneck in a real case study example vector code, the NASA Langley Terminal Area Simulation System (TASS) application.

  14. Determining Attitudes of Postgraduate Students towards Scientific Research and Codes of Conduct, Supported by Digital Script

    ERIC Educational Resources Information Center

    Tavukcu, Tahir

    2016-01-01

    This research aims to determine the attitudes of postgraduate students towards scientific research and codes of conduct when instruction is supported by a digital script. It is a quantitative study designed around a pre-test and post-test model with an experiment group and a control group. In both groups, lessons…

  15. A comparison between implicit and hybrid methods for the calculation of steady and unsteady inlet flows

    NASA Technical Reports Server (NTRS)

    Coakley, T. J.; Hsieh, T.

    1985-01-01

    Numerical simulations of steady and unsteady transonic diffuser flows using two different computer codes are discussed and compared with experimental data. The codes solve the Reynolds-averaged, compressible Navier-Stokes equations using various turbulence models. One of the codes has been applied extensively to diffuser flows and uses the hybrid method of MacCormack; this code is relatively inefficient numerically. The second code, which was developed more recently, is fully implicit and relatively efficient numerically. Simulations of steady flows using the implicit code are shown to be in good agreement with simulations using the hybrid code, and both are in good agreement with experimental results. Simulations of unsteady flows using the two codes are in good qualitative agreement with each other, although the quantitative agreement is not as good as in the steady-flow cases. The implicit code is shown to be eight times faster than the hybrid code for unsteady flow calculations and up to 32 times faster for steady flow calculations. Results of calculations using alternative turbulence models are also discussed.
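
    As a generic illustration of why implicit schemes can be far more efficient than explicit ones on stiff problems, the sketch below compares explicit and implicit Euler on a stiff linear ODE. It is not taken from either Navier-Stokes code discussed in the record; the equation and step sizes are arbitrary.

        # Minimal sketch: explicit vs. implicit Euler on the stiff ODE y' = -1000*y,
        # illustrating in general terms why implicit methods tolerate much larger
        # time steps. Not related to the hybrid or implicit Navier-Stokes codes above.

        lam, y0, t_end = -1000.0, 1.0, 0.1

        def explicit_euler(dt):
            y, n = y0, int(t_end / dt)
            for _ in range(n):
                y = y + dt * lam * y
            return y

        def implicit_euler(dt):
            y, n = y0, int(t_end / dt)
            for _ in range(n):
                y = y / (1.0 - dt * lam)     # solve y_new = y + dt*lam*y_new
            return y

        for dt in (1e-2, 1e-3, 1e-5):
            # explicit Euler is unstable for dt > 2/|lam| = 2e-3 and blows up
            print(dt, explicit_euler(dt), implicit_euler(dt))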

  16. Validation of the Electromagnetic Code FACETS for Numerical Simulation of Radar Target Images

    DTIC Science & Technology

    2009-12-01

    Validation of the electromagnetic code FACETS for numerical simulation of radar target images. S. Wong, DRDC Ottawa. ... for simulating radar images of a target is obtained, through direct simulation-to-measurement comparisons. A 3-dimensional computer-aided design ...

  17. The science behind codes and standards for safe walkways: changes in level, stairways, stair handrails and slip resistance.

    PubMed

    Nemire, Kenneth; Johnson, Daniel A; Vidal, Keith

    2016-01-01

    Walkway codes and standards are often created by consensus committees based on a number of factors, including historical precedence, common practice, cost, and empirical data. The authors maintain that in the formulation of codes and standards that impact pedestrian safety, the results of pertinent scientific research should be given significant weight. This article examines many elements of common walkway codes and standards related to changes in level, stairways, stair handrails, and slip resistance. It identifies which portions are based on or supported by empirical data and which could benefit from additional scientific research. The article also identifies areas in which additional research, codes, and standards may be beneficial for enhancing pedestrian safety. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  18. Practices in source code sharing in astrophysics

    NASA Astrophysics Data System (ADS)

    Shamir, Lior; Wallin, John F.; Allen, Alice; Berriman, Bruce; Teuben, Peter; Nemiroff, Robert J.; Mink, Jessica; Hanisch, Robert J.; DuPrie, Kimberly

    2013-02-01

    While software and algorithms have become increasingly important in astronomy, the majority of authors who publish computational astronomy research do not share the source code they develop, making it difficult to replicate and reuse the work. In this paper we discuss the importance of sharing scientific source code with the entire astrophysics community, and propose that journals require authors to make their code publicly available when a paper is published. That is, we suggest that a paper that involves a computer program not be accepted for publication unless the source code becomes publicly available. The adoption of such a policy by editors, editorial boards, and reviewers will improve the ability to replicate scientific results, and will also make computational astronomy methods more available to other researchers who wish to apply them to their data.

  19. Biosemiotics: a new understanding of life.

    PubMed

    Barbieri, Marcello

    2008-07-01

    Biosemiotics is the idea that life is based on semiosis, i.e., on signs and codes. This idea has been strongly suggested by the discovery of the genetic code, but so far it has made little impact in the scientific world and is largely regarded as a philosophy rather than a science. The main reason for this is that modern biology assumes that signs and meanings do not exist at the molecular level, and that the genetic code was not followed by any other organic code for almost four billion years, which implies that it was an utterly isolated exception in the history of life. These ideas have effectively ruled out the existence of semiosis in the organic world, and yet there are experimental facts against all of them. If we look at the evidence of life without the preconditions of the present paradigm, we discover that semiosis is there, in every single cell, and that it has been there since the very beginning. This is what biosemiotics is really about. It is not a philosophy. It is a new scientific paradigm that is rigorously based on experimental facts. Biosemiotics claims that the genetic code (1) is a real code and (2) has been the first of a long series of organic codes that have shaped the history of life on our planet. The reality of the genetic code and the existence of other organic codes imply that life is based on two fundamental processes--copying and coding--and this in turn implies that evolution took place by two distinct mechanisms, i.e., by natural selection (based on copying) and by natural conventions (based on coding). It also implies that the copying of genes works on individual molecules, whereas the coding of proteins operates on collections of molecules, which means that different mechanisms of evolution exist at different levels of organization. This review intends to underline the scientific nature of biosemiotics, and to this purpose, it aims to prove (1) that the cell is a real semiotic system, (2) that the genetic code is a real code, (3) that evolution took place by natural selection and by natural conventions, and (4) that it was natural conventions, i.e., organic codes, that gave origin to the great novelties of macroevolution. Biological semiosis, in other words, is a scientific reality because the codes of life are experimental realities. The time has come, therefore, to acknowledge this fact of life, even if that means abandoning the present theoretical framework in favor of a more general one where biology and semiotics finally come together and become biosemiotics.

  20. GPU Multi-Scale Particle Tracking and Multi-Fluid Simulations of the Radiation Belts

    NASA Astrophysics Data System (ADS)

    Ziemba, T.; Carscadden, J.; O'Donnell, D.; Winglee, R.; Harnett, E.; Cash, M.

    2007-12-01

    The properties of the radiation belts can vary dramatically under the influence of magnetic storms and storm-time substorms. The task of understanding and predicting radiation belt properties is made difficult because those properties are determined by global processes as well as small-scale wave-particle interactions. A full solution to the problem will require major innovations in technique and computer hardware. The proposed work demonstrates linked particle-tracking codes with new multi-scale/multi-fluid global simulations that provide the first means to include small-scale processes within the global magnetospheric context. A major hurdle is having sufficient computer hardware to handle the disparate temporal and spatial scales. A key innovation of the work is that the codes are designed to run on graphics processing units (GPUs). GPUs are intrinsically highly parallel systems that provide more than an order of magnitude greater computing speed than CPU-based systems, for little more than the cost of a high-end workstation. Recent advancements in GPU technologies allow for full IEEE floating-point specifications with performance up to several hundred GFLOPs per GPU, and new software architectures have recently become available to ease the transition from graphics-based to scientific applications. This provides a cheap alternative to standard supercomputing methods and should shorten the time to discovery. A demonstration of the code pushing more than 500,000 particles faster than real time is presented and used to provide new insight into radiation belt dynamics.

  1. Neuromas

    MedlinePlus


  2. A combined Compton and coded-aperture telescope for medium-energy gamma-ray astrophysics

    NASA Astrophysics Data System (ADS)

    Galloway, Michelle; Zoglauer, Andreas; Boggs, Steven E.; Amman, Mark

    2018-06-01

    A future mission in medium-energy gamma-ray astrophysics would allow for many scientific advancements, such as a possible explanation for the excess positron emission from the Galactic center, a better understanding of nucleosynthesis and explosion mechanisms in Type Ia supernovae, and a look at the physical forces at play in compact objects such as black holes and neutron stars. Additionally, further observation in this energy regime would significantly extend the search parameter space for low-mass dark matter. In order to achieve these objectives, an instrument with good energy resolution, good angular resolution, and high sensitivity is required. In this paper we present the design and simulation of a Compton telescope consisting of cubic-centimeter cadmium zinc telluride detectors as absorbers behind a silicon tracker with the addition of a passive coded mask. The goal of the design was to create a very sensitive instrument that is capable of high angular resolution. The simulated telescope achieved energy resolutions of 1.68% FWHM at 511 keV and 1.11% at 1809 keV, on-axis angular resolutions in Compton mode of 2.63° FWHM at 511 keV and 1.30° FWHM at 1809 keV, and is capable of resolving sources to at least 0.2° at lower energies with the use of the coded mask. An initial assessment of the instrument in Compton-imaging mode yields an effective area of 183 cm2 at 511 keV and an anticipated all-sky sensitivity of 3.6 × 10-6 photons cm-2 s-1 for a broadened 511 keV source over a two-year observation time. Additionally, combining a coded mask with a Compton imager to improve point-source localization for positron detection has been demonstrated.

  3. Computational Cosmology at the Bleeding Edge

    NASA Astrophysics Data System (ADS)

    Habib, Salman

    2013-04-01

    Large-area sky surveys are providing a wealth of cosmological information to address the mysteries of dark energy and dark matter. Observational probes based on tracking the formation of cosmic structure are essential to this effort, and rely crucially on N-body simulations that solve the Vlasov-Poisson equation in an expanding Universe. As statistical errors from survey observations continue to shrink, and cosmological probes increase in number and complexity, simulations are entering a new regime in their use as tools for scientific inference. Changes in supercomputer architectures provide another rationale for developing new parallel simulation and analysis capabilities that can scale to computational concurrency levels measured in the millions to billions. In this talk I will outline the motivations behind the development of the HACC (Hardware/Hybrid Accelerated Cosmology Code) extreme-scale cosmological simulation framework and describe its essential features. By exploiting a novel algorithmic structure that allows flexible tuning across diverse computer architectures, including accelerated and many-core systems, HACC has attained a performance of 14 PFlops on the IBM BG/Q Sequoia system at 69% of peak, using more than 1.5 million cores.
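
    For orientation, the sketch below shows a direct-summation gravitational N-body step with a leapfrog update, the textbook ancestor of such solvers. It is illustrative only: HACC uses particle-mesh/tree algorithms, comoving coordinates, and an expanding background, none of which appear here, and all units are made up.

        import numpy as np

        # Minimal sketch of a direct-summation gravitational N-body step with a
        # kick-drift-kick leapfrog update. Illustrative only: HACC uses
        # particle-mesh/tree methods and an expanding background, not this.

        np.random.seed(1)
        n, G, dt, eps = 256, 1.0, 1e-3, 1e-2       # arbitrary units and softening
        pos = np.random.randn(n, 3)
        vel = np.zeros((n, 3))
        mass = np.ones(n) / n

        def accelerations(pos):
            d = pos[None, :, :] - pos[:, None, :]              # pairwise separations
            r2 = (d ** 2).sum(axis=-1) + eps ** 2              # softened squared distance
            inv_r3 = r2 ** -1.5
            np.fill_diagonal(inv_r3, 0.0)                      # no self-interaction
            return G * (d * inv_r3[:, :, None] * mass[None, :, None]).sum(axis=1)

        acc = accelerations(pos)
        vel += 0.5 * dt * acc          # kick
        pos += dt * vel                # drift
        vel += 0.5 * dt * accelerations(pos)   # kick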

  4. Toward Exascale Earthquake Ground Motion Simulations for Near-Fault Engineering Analysis

    DOE PAGES

    Johansen, Hans; Rodgers, Arthur; Petersson, N. Anders; ...

    2017-09-01

    Modernizing SW4 for massively parallel time-domain simulations of earthquake ground motions in 3D earth models increases resolution and provides ground motion estimates for critical infrastructure risk evaluations. Simulations of ground motions from large (M ≥ 7.0) earthquakes require domains on the order of 100 to 500 km and spatial granularity on the order of 1 to 5 m, resulting in hundreds of billions of grid points. Surface-focused structured mesh refinement (SMR) allows for more constant grid-points-per-wavelength scaling in typical Earth models, where wavespeeds increase with depth. In fact, mesh refinement allows simulations to double the frequency content relative to a fixed-grid calculation on a given resource. The authors report improvements to the SW4 algorithm developed while porting the code to the Cori Phase 2 (Intel Xeon Phi) systems at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. As a result, investigations of the performance of the innermost loop of the calculations found that reorganizing the order of operations can improve performance for massive problems.
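
    A quick back-of-the-envelope calculation shows how these domain sizes and grid spacings reach hundreds of billions of grid points; the domain dimensions and refinement layers below are assumed for illustration, since the abstract gives only ranges.

        # Back-of-the-envelope grid-point count for a regional ground-motion run.
        # Domain dimensions and refinement layers are assumed for illustration; the
        # abstract gives only the ranges 100-500 km horizontally and 1-5 m spacing.

        lx, ly, lz = 100e3, 100e3, 30e3          # assumed domain extent [m]
        h_surface = 5.0                          # fine spacing near the surface [m]

        uniform = (lx / h_surface) * (ly / h_surface) * (lz / h_surface)
        print(f"uniform 5 m grid: {uniform:.1e} points")          # ~2.4e12

        # With surface-focused mesh refinement, only a thin near-surface layer keeps
        # the fine spacing; deeper layers coarsen as wavespeeds increase with depth.
        layers = [(1e3, 5.0), (4e3, 10.0), (25e3, 20.0)]          # (thickness, spacing), assumed
        refined = sum((lx / h) * (ly / h) * (t / h) for t, h in layers)
        print(f"with mesh refinement: {refined:.1e} points")      # ~1.5e11, hundreds of billions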

  5. Toward Exascale Earthquake Ground Motion Simulations for Near-Fault Engineering Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johansen, Hans; Rodgers, Arthur; Petersson, N. Anders

    Modernizing SW4 for massively parallel time-domain simulations of earthquake ground motions in 3D earth models increases resolution and provides ground motion estimates for critical infrastructure risk evaluations. Simulations of ground motions from large (M ≥ 7.0) earthquakes require domains on the order of 100 to 500 km and spatial granularity on the order of 1 to 5 m, resulting in hundreds of billions of grid points. Surface-focused structured mesh refinement (SMR) allows for more constant grid-points-per-wavelength scaling in typical Earth models, where wavespeeds increase with depth. In fact, mesh refinement allows simulations to double the frequency content relative to a fixed-grid calculation on a given resource. The authors report improvements to the SW4 algorithm developed while porting the code to the Cori Phase 2 (Intel Xeon Phi) systems at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. As a result, investigations of the performance of the innermost loop of the calculations found that reorganizing the order of operations can improve performance for massive problems.

  6. Foot Surgery

    MedlinePlus


  7. Athlete's Foot

    MedlinePlus


  8. Heel Pain

    MedlinePlus


  9. Using GTO-Velo to Facilitate Communication and Sharing of Simulation Results in Support of the Geothermal Technologies Office Code Comparison Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Signe K.; Purohit, Sumit; Boyd, Lauren W.

    The Geothermal Technologies Office Code Comparison Study (GTO-CCS) aims to support the DOE Geothermal Technologies Office in organizing and executing a model comparison activity. This project is directed at testing, diagnosing differences, and demonstrating modeling capabilities of a worldwide collection of numerical simulators for evaluating geothermal technologies. Teams of researchers are collaborating in this code comparison effort, and it is important to be able to share results in a forum where technical discussions can easily take place without requiring teams to travel to a common location. Pacific Northwest National Laboratory has developed an open-source, flexible framework called Velo that provides a knowledge management infrastructure and tools to support modeling and simulation for a variety of types of projects in a number of scientific domains. GTO-Velo is a customized version of the Velo Framework that is being used as the collaborative tool in support of the GTO-CCS project. Velo is designed around a novel integration of a collaborative Web-based environment and a scalable enterprise Content Management System (CMS). The underlying framework provides a flexible and unstructured data storage system that allows for easy upload of files that can be in any format. Data files are organized in hierarchical folders, and each folder and each file has a corresponding wiki page for metadata. The user interacts with Velo through a web-browser-based wiki technology, providing the benefit of familiarity and ease of use. High-level folders have been defined in GTO-Velo for the benchmark problem descriptions, descriptions of simulator/code capabilities, a project notebook, and folders for participating teams. Each team has a subfolder with write access limited only to the team members, where they can upload their simulation results. The GTO-CCS participants are charged with defining the benchmark problems for the study, and as each GTO-CCS benchmark problem is defined, the problem creator can provide a description using a template on the metadata page corresponding to the benchmark problem folder. Project documents, references, and videos of the weekly online meetings are shared via GTO-Velo. A results comparison tool allows users to plot their uploaded simulation results on the fly, along with those of other teams, to facilitate weekly discussions of the benchmark problem results being generated by the teams. GTO-Velo is an invaluable tool providing the project coordinators and team members with a framework for collaboration among geographically dispersed organizations.

  10. Fast high-energy X-ray imaging for Severe Accidents experiments on the future PLINIUS-2 platform

    NASA Astrophysics Data System (ADS)

    Berge, L.; Estre, N.; Tisseur, D.; Payan, E.; Eck, D.; Bouyer, V.; Cassiaut-Louis, N.; Journeau, C.; Tellier, R. Le; Pluyette, E.

    2018-01-01

    The future PLINIUS-2 platform of CEA Cadarache will be dedicated to the study of corium interactions in severe nuclear accidents and will host innovative large-scale experiments. The Nuclear Measurement Laboratory of CEA Cadarache is in charge of real-time high-energy X-ray imaging set-ups for the study of the corium-water and corium-sodium interactions and of the corium stratification process. Imaging such large, high-density objects requires a 15 MeV linear electron accelerator coupled to a tungsten target creating a high-energy Bremsstrahlung X-ray flux, with a corresponding dose rate of about 100 Gy/min at 1 m. The signal is detected by phosphor screens coupled to high-framerate scientific CMOS cameras. The imaging set-up is designed using experimentally validated in-house simulation software (MODHERATO). The code computes quantitative radiographic signals from a description of the source, object geometry and composition, detector, and geometrical configuration (magnification factor, etc.). It accounts for several noise sources (photonic and electronic noise, Swank noise, and readout noise) and for image blur due to the source spot size and the detector unsharpness. With a view to PLINIUS-2, the simulation has been improved to account for the scattered flux, which is expected to be significant. The paper presents the scattered-flux calculation using the MCNP transport code and its integration into the MODHERATO simulation. The improved simulation is then validated through comparison with real measurement images taken on a small-scale equivalent set-up on the PLINIUS platform. Excellent agreement is achieved. This improved simulation is therefore being used to design the PLINIUS-2 imaging set-ups (source, detectors, cameras, etc.).
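
    As a simplified illustration of the primary-beam part of such a radiographic signal model, the sketch below applies Beer-Lambert attenuation through a few layered materials. The attenuation coefficients, densities, and thicknesses are illustrative guesses, not MODHERATO data, and the real code also models spot size, detector blur, noise sources, and scattered flux.

        import numpy as np

        # Minimal sketch of the primary-beam part of a radiographic signal model:
        # Beer-Lambert attenuation through layered materials. All values below are
        # illustrative guesses, NOT MODHERATO data or PLINIUS-2 design values.

        # (label, mass attenuation coefficient [cm^2/g], density [g/cm^3], thickness [cm])
        layers = [
            ("steel vessel", 0.030, 7.8, 4.0),
            ("corium-like melt", 0.028, 7.0, 20.0),
            ("water", 0.040, 1.0, 10.0),
        ]

        i0 = 1.0                     # incident intensity (arbitrary units)
        transmission = np.exp(-sum(mu * rho * t for _, mu, rho, t in layers))
        print(f"transmitted fraction of the primary beam: {i0 * transmission:.3e}")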

  11. Biosemiotics: a new understanding of life

    NASA Astrophysics Data System (ADS)

    Barbieri, Marcello

    2008-07-01

    Biosemiotics is the idea that life is based on semiosis, i.e., on signs and codes. This idea has been strongly suggested by the discovery of the genetic code, but so far it has made little impact in the scientific world and is largely regarded as a philosophy rather than a science. The main reason for this is that modern biology assumes that signs and meanings do not exist at the molecular level, and that the genetic code was not followed by any other organic code for almost four billion years, which implies that it was an utterly isolated exception in the history of life. These ideas have effectively ruled out the existence of semiosis in the organic world, and yet there are experimental facts against all of them. If we look at the evidence of life without the preconditions of the present paradigm, we discover that semiosis is there, in every single cell, and that it has been there since the very beginning. This is what biosemiotics is really about. It is not a philosophy. It is a new scientific paradigm that is rigorously based on experimental facts. Biosemiotics claims that the genetic code (1) is a real code and (2) has been the first of a long series of organic codes that have shaped the history of life on our planet. The reality of the genetic code and the existence of other organic codes imply that life is based on two fundamental processes—copying and coding—and this in turn implies that evolution took place by two distinct mechanisms, i.e., by natural selection (based on copying) and by natural conventions (based on coding). It also implies that the copying of genes works on individual molecules, whereas the coding of proteins operates on collections of molecules, which means that different mechanisms of evolution exist at different levels of organization. This review intends to underline the scientific nature of biosemiotics, and to this purpose, it aims to prove (1) that the cell is a real semiotic system, (2) that the genetic code is a real code, (3) that evolution took place by natural selection and by natural conventions, and (4) that it was natural conventions, i.e., organic codes, that gave origin to the great novelties of macroevolution. Biological semiosis, in other words, is a scientific reality because the codes of life are experimental realities. The time has come, therefore, to acknowledge this fact of life, even if that means abandoning the present theoretical framework in favor of a more general one where biology and semiotics finally come together and become biosemiotics.

  12. Advanced computations in plasma physics

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2002-05-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.

  13. Advanced Computation in Plasma Physics

    NASA Astrophysics Data System (ADS)

    Tang, William

    2001-10-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. This talk will review recent progress and future directions for advanced simulations in magnetically-confined plasmas with illustrative examples chosen from areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop MPP's to produce 3-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for tens of thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.

  14. Sweaty Feet (Hyperhidrosis)

    MedlinePlus


  15. Corns and Calluses

    MedlinePlus


  16. Diabetic Wound Care

    MedlinePlus


  17. Toenail Fungus (Onychomycosis)

    MedlinePlus


  18. XML-Based Generator of C++ Code for Integration With GUIs

    NASA Technical Reports Server (NTRS)

    Hua, Hook; Oyafuso, Fabiano; Klimeck, Gerhard

    2003-01-01

    An open-source computer program has been developed to satisfy a need for simplified organization of structured input data for scientific simulation programs. Typically, such input data are parsed in from a flat American Standard Code for Information Interchange (ASCII) text file into computational data structures. Also typically, when a graphical user interface (GUI) is used, there is a need to completely duplicate the input information while providing it to a user in a more structured form. Heretofore, the duplication of the input information has entailed duplication of software efforts and increased susceptibility to software errors because of the concomitant need to maintain two independent input-handling mechanisms. The present program implements a method in which the input data for a simulation program are completely specified in an Extensible Markup Language (XML)-based text file. The key benefit of XML is that it stores input data in a structured manner. More importantly, XML allows not just storing of data but also describing what each data item is. The XML file thus contains information that other applications can use to render the data. The program then generates data structures in the C++ language that are used in the simulation program. In this method, all input data are specified in one place only, and it is easy to integrate the data structures into both the simulation program and the GUI. XML-to-C is useful in two ways: (1) as an executable, it generates the corresponding C++ classes; and (2) as a library, it automatically fills the objects with the input data values.
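
    The code-generation idea can be sketched in a few lines: read a structured input specification and emit a C++ data structure for it. The tag and attribute names below are hypothetical, not the schema of the actual XML-to-C tool, and the sketch is written in Python purely for illustration.

        import xml.etree.ElementTree as ET

        # Minimal sketch of the XML-to-C++ idea: read a structured input spec and
        # emit a C++ struct for it. Tag and attribute names are hypothetical, not
        # the schema used by the actual XML-to-C tool described above.

        spec = """
        <parameters name="SimulationInput">
          <param name="temperature" type="double" default="300.0"/>
          <param name="num_steps"   type="int"    default="1000"/>
        </parameters>
        """

        root = ET.fromstring(spec)
        lines = [f"struct {root.get('name')} {{"]
        for p in root.findall("param"):
            lines.append(f"    {p.get('type')} {p.get('name')} = {p.get('default')};")
        lines.append("};")
        print("\n".join(lines))   # C++ declaration ready to compile into the simulator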

  19. ANNarchy: a code generation approach to neural simulations on parallel hardware

    PubMed Central

    Vitay, Julien; Dinkelbach, Helge Ü.; Hamker, Fred H.

    2015-01-01

    Many modern neural simulators focus on the simulation of networks of spiking neurons on parallel hardware. Another important framework in computational neuroscience, rate-coded neural networks, is mostly difficult or impossible to implement using these simulators. We present here the ANNarchy (Artificial Neural Networks architect) neural simulator, which makes it easy to define and simulate rate-coded and spiking networks, as well as combinations of both. The Python interface has been designed to be close to the PyNN interface, while the definition of neuron and synapse models can be specified using an equation-oriented mathematical description similar to the Brian neural simulator. This information is used to generate C++ code that will efficiently perform the simulation on the chosen parallel hardware (multi-core system or graphical processing unit). Several numerical methods are available to transform ordinary differential equations into efficient C++ code. We compare the parallel performance of the simulator to existing solutions. PMID:26283957
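    For readers unfamiliar with the rate-coded formalism mentioned above, the generic NumPy sketch below (an illustration of the modeling style, not the ANNarchy API) integrates a first-order rate equation of the form tau dr/dt = -r + f(W r + I) with an explicit Euler step:

```python
# Generic illustration of a rate-coded network (NOT the ANNarchy API):
# tau * dr/dt = -r + f(W @ r + I), integrated with explicit Euler.
import numpy as np

rng = np.random.default_rng(0)
N, tau, dt = 100, 10.0, 0.1          # neurons, time constant (ms), step (ms)
W = rng.normal(0.0, 0.05, (N, N))    # random recurrent weights
I = rng.uniform(0.0, 1.0, N)         # constant external input
r = np.zeros(N)                      # firing rates

f = lambda x: np.maximum(x, 0.0)     # rectified-linear transfer function

for _ in range(int(100.0 / dt)):     # simulate 100 ms
    r += dt / tau * (-r + f(W @ r + I))

print("mean firing rate:", r.mean())
```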

  20. Error coding simulations in C

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1994-01-01

    When data is transmitted through a noisy channel, errors are produced within the data, rendering it indecipherable. Through the use of error control coding techniques, the bit error rate can be reduced to any desired level without sacrificing the transmission data rate. The Astrionics Laboratory at Marshall Space Flight Center has decided to use a modular, end-to-end telemetry data simulator to simulate the transmission of data from flight to ground and various methods of error control. The simulator includes modules for random data generation, data compression, Consultative Committee for Space Data Systems (CCSDS) transfer frame formation, error correction/detection, error generation and error statistics. The simulator utilizes a concatenated coding scheme which includes the CCSDS standard (255,223) Reed-Solomon (RS) code over GF(2(exp 8)) with interleave depth of 5 as the outermost code, a (7, 1/2) convolutional code as an inner code, and the CCSDS recommended (n, n-16) cyclic redundancy check (CRC) code as the innermost code, where n is the number of information bits plus 16 parity bits. The received signal-to-noise ratio required for a desired bit error rate is greatly reduced through the use of forward error correction techniques. Even greater coding gain is provided through the use of a concatenated coding scheme. Interleaving/deinterleaving is necessary to randomize burst errors which may appear at the input of the RS decoder. The burst correction capability is increased in proportion to the interleave depth. The modular nature of the simulator allows for inclusion or exclusion of modules as needed. This paper describes the development and operation of the simulator, the verification of a C-language Reed-Solomon code, and the possibility of using Comdisco SPW(tm) as a tool for determining optimal error control schemes.
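    Two of the simulator's modules can be illustrated with short sketches: a depth-5 block interleaver and a 16-bit CRC. The CRC parameters below (CCITT polynomial 0x1021, initial value 0xFFFF) are assumed for illustration; the exact CCSDS parameters should be checked against the standard:

```python
# Sketches of two simulator modules: a depth-5 block interleaver (spreads burst
# errors across Reed-Solomon codewords) and a 16-bit CRC. Polynomial and initial
# value are assumed (CCITT 0x1021 / 0xFFFF); check the CCSDS recommendation.

def interleave(symbols, depth=5):
    """Write symbols row-wise into `depth` rows, read them out column-wise."""
    pad = (-len(symbols)) % depth
    padded = list(symbols) + [0] * pad
    width = len(padded) // depth
    rows = [padded[r * width:(r + 1) * width] for r in range(depth)]
    return [rows[r][c] for c in range(width) for r in range(depth)]

def deinterleave(symbols, depth=5):
    """Inverse of interleave(): write column-wise, read row-wise."""
    width = len(symbols) // depth
    rows = [[0] * width for _ in range(depth)]
    it = iter(symbols)
    for c in range(width):
        for r in range(depth):
            rows[r][c] = next(it)
    return [s for row in rows for s in row]

def crc16(data, poly=0x1021, crc=0xFFFF):
    """Bitwise 16-bit CRC over a byte sequence (MSB-first)."""
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ poly) & 0xFFFF if crc & 0x8000 else (crc << 1) & 0xFFFF
    return crc

frame = list(range(20))
assert deinterleave(interleave(frame)) == frame     # round trip preserves the frame
print(f"CRC-16 parity word: 0x{crc16(b'example frame'):04X}")
```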

  1. Open-Source Python Tools for Deploying Interactive GIS Dashboards for a Billion Datapoints on a Laptop

    NASA Astrophysics Data System (ADS)

    Steinberg, P. D.; Bednar, J. A.; Rudiger, P.; Stevens, J. L. R.; Ball, C. E.; Christensen, S. D.; Pothina, D.

    2017-12-01

    The rich variety of software libraries available in the Python scientific ecosystem provides a flexible and powerful alternative to traditional integrated GIS (geographic information system) programs. Each such library focuses on doing a certain set of general-purpose tasks well, and Python makes it relatively simple to glue the libraries together to solve a wide range of complex, open-ended problems in Earth science. However, choosing an appropriate set of libraries can be challenging, and it is difficult to predict how much "glue code" will be needed for any particular combination of libraries and tasks. Here we present a set of libraries that have been designed to work well together to build interactive analyses and visualizations of large geographic datasets, in standard web browsers. The resulting workflows run on ordinary laptops even for billions of data points, and easily scale up to larger compute clusters when available. The declarative top-level interface used in these libraries means that even complex, fully interactive applications can be built and deployed as web services using only a few dozen lines of code, making it simple to create and share custom interactive applications even for datasets too large for most traditional GIS systems. The libraries we will cover include GeoViews (HoloViews extended for geographic applications) for declaring visualizable/plottable objects, Bokeh for building visual web applications from GeoViews objects, Datashader for rendering arbitrarily large datasets faithfully as fixed-size images, Param for specifying user-modifiable parameters that model your domain, Xarray for computing with n-dimensional array data, Dask for flexibly dispatching computational tasks across processors, and Numba for compiling array-based Python code down to fast machine code. We will show how to use the resulting workflow with static datasets and with simulators such as GSSHA or AdH, allowing you to deploy flexible, high-performance web-based dashboards for your GIS data or simulations without needing major investments in code development or maintenance.
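    As a minimal sketch of the rendering step described above (column names and data are illustrative, not tied to a particular dataset), Datashader aggregates the points onto a fixed-size grid and shades the counts, so display cost no longer grows with the number of rows:

```python
# Minimal Datashader sketch: rasterize many points into a fixed-size image.
# Column names and data are illustrative, not tied to any particular dataset.
import numpy as np
import pandas as pd
import datashader as ds
import datashader.transfer_functions as tf

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "lon": rng.normal(0.0, 1.0, 1_000_000),
    "lat": rng.normal(0.0, 1.0, 1_000_000),
})

canvas = ds.Canvas(plot_width=600, plot_height=400)
agg = canvas.points(df, "lon", "lat", agg=ds.count())  # per-pixel point counts
img = tf.shade(agg, how="log")                         # log-scaled shading
img.to_pil().save("points.png")                        # or display `img` in a notebook
```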

  2. The ePLAS Code for Ignition Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mason, Rodney J

    2012-09-20

    Inertial Confinement Fusion (ICF) presents unique opportunities for the extraction of clean energy from fusion. Intense lasers and particle beams can create and interact with such plasmas, potentially yielding sufficient energy to satisfy all our national needs. However, few models are available to help aid the scientific community in the study and optimization of such interactions. This project enhanced and disseminated the computer code ePLAS for the early understanding and control of ignition in ICF. ePLAS is a unique simulation code that tracks the transport of laser light to a target, the absorption of that light resulting in the generation and transport of hot electrons, and the heating and flow dynamics of the background plasma. It uses an implicit electromagnetic field-solving method to greatly reduce computing demands, so that useful target interaction studies can often be completed in 15 minutes on a portable 2.1 GHz PC. The code permits the rapid scoping of calculations for the optimization of laser target interactions aimed at fusion. Recent efforts have initiated the use of analytic equations of state (EOS), K-alpha image rendering graphics, allocatable memory for source-free usage, and adaptation to the latest Mac and Linux operating systems. The speed and utility of ePLAS are unequaled in the ICF simulation community. This project evaluated the effects of its new EOSs on target heating, compared fluid and particle models for the ions, initiated the simultaneous use of both ion models in the code, and studied long time scale 500 ps hot electron deposition for shock ignition. ePLAS has been granted EAR99 export control status, permitting export without a license to most foreign countries. Beta-test versions of ePLAS have been granted to several universities and commercial users. The project was aimed at achieving early success in the laboratory ignition of thermonuclear targets and the mastery of controlled fusion power for the nation.

  3. Sprains, Strains and Fractures

    MedlinePlus


  4. Arthritis and the Feet

    MedlinePlus


  5. Continuous integration and quality control for scientific software

    NASA Astrophysics Data System (ADS)

    Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.

    2013-08-01

    Modern software has to be stable, portable, fast and reliable. This is becoming more and more important for scientific software as well. But this requires a sophisticated way to inspect, check and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, used to manage the code base and to control the different development versions. While each project can be compiled separately, the whole code base can also be compiled with one central “Makefile”. This is used to create automated, nightly builds. Additionally, all sources are inspected automatically with static code analysis and inspection tools, which check for well-known error situations, memory and resource leaks, performance issues, and style issues. In combination with an automatic documentation generator it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a web server. Because this environment increased the stability and quality of the software of the Geodetic Observatory Wettzell tremendously, it is now also available for scientific communities. One regular user is already the developer group of the DiFX software correlator project.
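    A stripped-down sketch of such a nightly job is given below, assuming the tools mentioned (make, a static analyzer such as cppcheck, and Doxygen) are installed; the commands, options, and report locations are placeholders rather than the Wettzell configuration:

```python
# Hypothetical nightly-build driver: compile, run static analysis, build docs,
# and collect the output for a report page. Commands and paths are placeholders.
import subprocess
from pathlib import Path

STEPS = [
    ("build",    ["make", "-j4"]),
    ("analysis", ["cppcheck", "--enable=all", "src/"]),
    ("docs",     ["doxygen", "Doxyfile"]),
]

report_dir = Path("nightly_reports")
report_dir.mkdir(exist_ok=True)

for name, cmd in STEPS:
    result = subprocess.run(cmd, capture_output=True, text=True)
    (report_dir / f"{name}.log").write_text(result.stdout + result.stderr)
    print(f"{name}: {'ok' if result.returncode == 0 else 'FAILED'}")
```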

  6. Program Code Generator for Cardiac Electrophysiology Simulation with Automatic PDE Boundary Condition Handling

    PubMed Central

    Punzalan, Florencio Rusty; Kunieda, Yoshitoshi; Amano, Akira

    2015-01-01

    Clinical and experimental studies involving human hearts can have certain limitations. Methods such as computer simulations can be an important alternative or supplemental tool. Physiological simulation at the tissue or organ level typically involves the handling of partial differential equations (PDEs). Boundary conditions and distributed parameters, such as those used in pharmacokinetics simulation, add to the complexity of the PDE solution. These factors can tailor PDE solutions and their corresponding program code to specific problems. Boundary condition and parameter changes in the customized code are usually error-prone and time-consuming. We propose a general approach for handling PDEs and boundary conditions in computational models using a replacement scheme for discretization. This study is an extension of a program generator that we introduced in a previous publication. The program generator can generate code for multi-cell simulations of cardiac electrophysiology. Improvements to the system allow it to handle simultaneous equations in the biological function model as well as implicit PDE numerical schemes. The replacement scheme involves substituting all partial differential terms with numerical solution equations. Once the model and boundary equations are discretized with the numerical solution scheme, instances of the equations are generated to undergo dependency analysis. The result of the dependency analysis is then used to generate the program code. The resulting program code is in the Java or C programming language. To validate the automatic handling of boundary conditions in the program code generator, we generated simulation code using the FHN, Luo-Rudy 1, and Hund-Rudy cell models and ran cell-to-cell coupling and action potential propagation simulations. One of the simulations is based on a published experiment, and the simulation results are compared with the experimental data. We conclude that the proposed program code generator can be used to generate code for physiological simulations and provides a tool for studying cardiac electrophysiology. PMID:26356082
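    To make the replacement-scheme idea concrete, the sketch below advances the FitzHugh-Nagumo (FHN) cable equation with a generic explicit finite-difference scheme; it illustrates the kind of substitution the generator automates and is not code emitted by the generator itself:

```python
# Explicit finite-difference sketch of the FitzHugh-Nagumo cable equation
# (a generic illustration of the replacement scheme, not generator output).
# The no-flux (Neumann) boundary condition is handled by mirroring end nodes.
import numpy as np

nx, dx, dt = 200, 0.5, 0.01
D, eps, a, b = 1.0, 0.08, 0.7, 0.8

v = np.full(nx, -1.2)          # membrane potential variable
w = np.full(nx, -0.6)          # recovery variable
v[:10] = 1.5                   # stimulate the left end to launch a wave

for _ in range(5000):
    # d2v/dx2 is replaced by its central difference, with mirrored ghost nodes
    vg = np.concatenate(([v[1]], v, [v[-2]]))
    lap = (vg[2:] - 2.0 * vg[1:-1] + vg[:-2]) / dx**2
    dv = v - v**3 / 3.0 - w + D * lap
    dw = eps * (v + a - b * w)
    v += dt * dv
    w += dt * dw

print("membrane variable range:", float(v.min()), float(v.max()))
```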

  7. Tristan code and its application

    NASA Astrophysics Data System (ADS)

    Nishikawa, K.-I.

    Since TRISTAN: The 3-D Electromagnetic Particle Code was introduced in 1990, it has been used for many applications including simulations of global solar wind-magnetosphere interaction. The most essential ingredients of this code have been published in the ISSS-4 book. In this abstract we describe some of the issues and an application of this code for the study of global solar wind-magnetosphere interaction, including a substorm study. The basic code (tristan.f) for the global simulation and a local simulation of reconnection with a Harris model (issrec2.f) are available at http:/www.physics.rutger.edu/˜kenichi. For beginners, the code (isssrc2.f) with simpler boundary conditions is a suitable starting point for running simulations. The future of global particle simulations for a global geospace general circulation (GGCM) model with predictive capability (for the Space Weather Program) is discussed.
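    The particle-advance kernel at the heart of electromagnetic particle codes of this kind is the Boris rotation; a minimal non-relativistic sketch in normalized units (a single particle in uniform fields, not TRISTAN source code) is shown below:

```python
# Minimal Boris particle pusher (normalized units, q/m = 1): the standard
# electromagnetic PIC update split into a half electric kick, a magnetic
# rotation, and a second half kick. Illustrative only, not TRISTAN source code.
import numpy as np

def boris_push(x, v, E, B, dt, qm=1.0):
    v_minus = v + 0.5 * qm * dt * E          # first half electric kick
    t = 0.5 * qm * dt * B                    # rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)  # magnetic rotation
    v_new = v_plus + 0.5 * qm * dt * E       # second half electric kick
    return x + dt * v_new, v_new

x = np.zeros(3)
v = np.array([1.0, 0.0, 0.0])
E = np.zeros(3)
B = np.array([0.0, 0.0, 1.0])                # uniform magnetic field along z

for _ in range(628):                         # roughly one gyro-orbit with dt = 0.01
    x, v = boris_push(x, v, E, B, dt=0.01)

print("speed conserved:", np.isclose(np.linalg.norm(v), 1.0))
```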

  8. X-Ray Astronomy

    NASA Technical Reports Server (NTRS)

    Wu, S. T.

    2000-01-01

    Dr. S. N. Zhang has led a seven-member group (Dr. Yuxin Feng, Mr. Xuejun Sun, Mr. Yongzhong Chen, Mr. Jun Lin, Mr. Yangsen Yao, and Ms. Xiaoling Zhang). This group has carried out the following activities: continued data analysis from the space astrophysical missions CGRO, RXTE, ASCA and Chandra. Significant scientific results have been produced as a result of their work. They discovered the three-layered accretion disk structure around black holes in X-ray binaries; their paper on this discovery is to appear in the prestigious Science magazine. They have also developed a new method for energy spectral analysis of black hole X-ray binaries; four papers on this topic were presented at the most recent Atlanta AAS meeting. They have also carried out Monte-Carlo simulations of X-ray detectors, in support of the hardware development efforts at Marshall Space Flight Center (MSFC). These computation-intensive simulations have been carried out entirely on the computers at UAH. They have also carried out extensive simulations for astrophysical applications, taking advantage of the Monte-Carlo simulation codes developed previously at MSFC and further improved at UAH for detector simulations. One refereed paper and one contribution to conference proceedings have resulted from this effort.

  9. Comparison of conversion coefficients for equivalent dose in terms of air kerma for photons using a male adult voxel simulator in sitting and standing posture with geometry of irradiation antero-posterior

    NASA Astrophysics Data System (ADS)

    Galeano, D. C.; Cavalcante, F. R.; Carvalho, A. B.; Hunt, J.

    2014-02-01

    The dose conversion coefficient (DCC) is important to quantify and assess effective doses associated with medical, occupational and public exposures. The calculation of DCCs using anthropomorphic simulators and radiation transport codes is justified since in-vivo measurement of effective dose is extremely difficult and not practical for occupational dosimetry. DCCs have been published by the ICRP using simulators in a standing posture, which is not applicable to all exposure scenarios and can provide an inaccurate dose estimate. The aim of this work was to calculate DCCs for equivalent dose in terms of air kerma (H/Kair) using the Visual Monte Carlo (VMC) code and the VOXTISS8 adult male voxel simulator in sitting and standing postures. In both postures, the simulator was irradiated by a plane source of monoenergetic photons in antero-posterior (AP) geometry. The photon energy ranged from 15 keV to 2 MeV. The DCCs for both postures were compared, and the DCCs for the standing simulator were higher. For certain organs, the difference in DCCs was more significant, as in the gonads (48% higher), bladder (16% higher) and colon (11% higher). As these organs are positioned in the abdominal region, the posture of the anthropomorphic simulator modifies how the radiation is transported and how the energy is deposited. It was also noted that the average percentage difference of conversion coefficients was 33% for the bone marrow, 11% for the skin, 13% for the bone surface and 31% for the muscle. For other organs, the percentage difference of the DCCs for both postures was not relevant (less than 5%) due to the absence of anatomical changes in the organs of the head, chest and upper abdomen. We can conclude that it is important to obtain DCCs using postures different from those present in the scientific literature.

  10. Skin Cancers of the Feet

    MedlinePlus


  11. Do Over or Make Do? Climate Models as a Software Development Challenge (Invited)

    NASA Astrophysics Data System (ADS)

    Easterbrook, S. M.

    2010-12-01

    We present the results of a comparative study of the software engineering culture and practices at four different earth system modeling centers: the UK Met Office Hadley Centre, the National Center for Atmospheric Research (NCAR), the Max-Planck-Institut für Meteorologie (MPI-M), and the Institut Pierre Simon Laplace (IPSL). The study investigated the software tools and techniques used at each center to assess their effectiveness. We also investigated how differences in the organizational structures, collaborative relationships, and technical infrastructures constrain the software development and affect software quality. Specific questions for the study included: 1) Verification and Validation - What techniques are used to ensure that the code matches the scientists’ understanding of what it should do? How effective are these at eliminating errors of correctness and errors of understanding? 2) Coordination - How are the contributions from across the modeling community coordinated? For coupled models, how are the differences in the priorities of different, overlapping communities of users addressed? 3) Division of responsibility - How are the responsibilities for coding, verification, and coordination distributed between different roles (scientific, engineering, support) in the organization? 4) Planning and release processes - How do modelers decide on priorities for model development, and how do they decide which changes to tackle in a particular release of the model? 5) Debugging - How do scientists debug the models, what types of bugs do they find in their code, and how do they find them? The results show that each center has evolved a set of model development practices that are tailored to their needs and organizational constraints. These practices emphasize scientific validity, but tend to neglect other software qualities, and all the centers struggle frequently with software problems. The testing processes are effective at removing software errors prior to release, but the code is hard to understand and hard to change. Software errors and model configuration problems are common during model development, and appear to have a serious impact on scientific productivity. These problems have grown dramatically in recent years with the growth in size and complexity of earth system models. Much of the success in obtaining valid simulations from the models depends on the scientists developing their own code, experimenting with alternatives, running frequent full system tests, and exploring patterns in the results. Blind application of generic software engineering processes is unlikely to work well. Instead, each center needs to learn how to balance the need for better coordination through a more disciplined approach with the freedom to explore, and the value of having scientists work directly with the code. This suggests that each center can learn a lot from comparing its practices with those of others, but that each might need to develop a different set of best practices.

  12. Prescription Custom Orthotics and Shoe Inserts

    MedlinePlus


  13. Case studies of fifth-grade student modeling in science through programming: Comparison of modeling practices and conversations

    NASA Astrophysics Data System (ADS)

    Louca, Loucas

    This is a descriptive case study investigating the use of two computer-based programming environments (CPEs), MicroWorlds(TM) (MW) and Stagecast Creator(TM) (SC), as modeling tools for collaborative fifth grade science learning. In this study I investigated how CPEs might support fifth grade student work and inquiry in science. There is a longstanding awareness of the need to help students learn about models and modeling in science, and CPEs are promising tools for this. A computer program can be a model of a physical system, and modeling through programming may make the process more tangible: programming involves making decisions and assumptions; the code is used to express ideas; running the program shows the implications of those ideas. In this study I have analyzed and compared students' activities and conversations in two after-school clubs, one working with MW and the other with SC. The findings confirm the promise of CPEs as tools for teaching practices of modeling and science, and they suggest advantages and disadvantages of particular aspects of CPE designs for that purpose. MW is an open-ended, textual CPE that uses procedural programming. MW students focused on breaking down phenomena into small programmable pieces, which is useful for scientific modeling. Developing their programs, the students focused on writing, testing and debugging code, which are also useful for scientific modeling. SC is a non-linear, object-oriented CPE that uses a visual programming language. SC students saw their work as creating games. They focused on the overall story, which they then translated into SC rules; this was in conflict with SC's object-oriented interface. However, telling the story of individual causal agents was useful for scientific modeling. Programming in SC was easier, whereas reading code in MW was more tangible. The latter helped MW students to use the code as the representation of the phenomenon rather than merely as a tool for creating a simulation. The analyses also pointed to three emerging "frames" that describe students' work focus, based on their goals, strategies, and criteria for success. The emerging frames are the programming frame, the visualization frame, and the modeling frame. One way to understand the respective advantages and disadvantages of the two CPEs is with respect to which frames they engendered in students.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janjusic, Tommy; Kartsaklis, Christos

    Application analysis is facilitated through a number of program profiling tools. The tools vary in their complexity, ease of deployment, design, and profiling detail. Specifically, understanding, analyzing, and optimizing are of particular importance for scientific applications, where minor changes in code paths and data-structure layout can have profound effects. Understanding how intricate data-structures are accessed and how a given memory system responds is a complex task. In this paper we describe a trace profiling tool, Glprof, specifically aimed at lessening the burden on the programmer to pin-point heavily involved data-structures during an application's run-time, and to understand data-structure run-time usage. Moreover, we showcase the tool's modularity using additional cache simulation components. We elaborate on the tool's design and features. Finally we demonstrate the application of our tool in the context of Spec benchmarks using the Glprof profiler and two concurrently running cache simulators, PPC440 and AMD Interlagos.

  15. The Particle-in-Cell and Kinetic Simulation Software Center

    NASA Astrophysics Data System (ADS)

    Mori, W. B.; Decyk, V. K.; Tableman, A.; Fonseca, R. A.; Tsung, F. S.; Hu, Q.; Winjum, B. J.; An, W.; Dalichaouch, T. N.; Davidson, A.; Hildebrand, L.; Joglekar, A.; May, J.; Miller, K.; Touati, M.; Xu, X. L.

    2017-10-01

    The UCLA Particle-in-Cell and Kinetic Simulation Software Center (PICKSC) aims to support an international community of PIC and plasma kinetic software developers, users, and educators; to increase the use of this software for accelerating the rate of scientific discovery; and to be a repository of knowledge and history for PIC. We discuss progress towards making available and documenting illustrative open-source software programs and distinct production programs; developing and comparing different PIC algorithms; coordinating the development of resources for the educational use of kinetic software; and the outcomes of our first sponsored OSIRIS users workshop. We also welcome input and discussion from anyone interested in using or developing kinetic software, in obtaining access to our codes, in collaborating, in sharing their own software, or in commenting on how PICKSC can better serve the DPP community. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.

  16. Auto Code Generation for Simulink-Based Attitude Determination Control System

    NASA Technical Reports Server (NTRS)

    MolinaFraticelli, Jose Carlos

    2012-01-01

    This paper details the work done to auto generate C code from a Simulink-Based Attitude Determination Control System (ADCS) to be used on target platforms. NASA Marshall engineers have developed an ADCS Simulink simulation to be used as a component of the flight software of a satellite. The generated code can be used for carrying out hardware-in-the-loop testing of components for a satellite in a convenient manner with easily tunable parameters. Due to the nature of embedded hardware components such as microcontrollers, this simulation code cannot be used directly, as it is, on the target platform and must first be converted into C code; this process is known as auto code generation. In order to generate C code from this simulation, it must be modified to follow specific standards set in place by the auto code generation process. Some of these modifications include changing certain simulation models into their atomic representations, which can bring new complications into the simulation. The execution order of these models can change based on these modifications. Great care must be taken in order to maintain a working simulation that can also be used for auto code generation. After modifying the ADCS simulation for the auto code generation process, it is shown that the difference between the output data of the former and that of the latter is within acceptable bounds. Thus, it can be said that the process is a success since all the output requirements are met. Based on these results, it can be argued that this generated C code can be effectively used by any desired platform as long as it follows the specific memory requirements established in the Simulink model.

  17. Using Jupyter Notebooks for Interactive Space Science Simulations

    NASA Astrophysics Data System (ADS)

    Schmidt, Albrecht

    2016-04-01

    Jupyter Notebooks can be used as an effective means to communicate scientific ideas through Web-based visualisations and, at the same time, give a user more than a pre-defined set of options to manipulate the visualisations. To some degree, even computations can be done without too much knowledge of the underlying data structures and infrastructure, to discover novel aspects of the data or tailor views to users' needs. Here, we show how to combine Jupyter Notebooks with other open-source tools to provide rich and interactive views of space data, especially the visualisation of spacecraft operations. Topics covered are orbit visualisation, spacecraft orientation, instrument timelines as well as performance analysis of mission segments. Technically, the re-use and integration of existing components will also be shown, both on the code level and on the visualisation level, so that the effort put into the development of new components could be reduced. Another important aspect is the bridging of the gap between operational data and the scientific exploitation of the payload data, for which a way forward will also be shown. A lesson learned from the implementation and use of a prototype is the synergy between the team who provisions the notebooks and the consumers, who both share access to the same code base, if not resources; this often simplifies communication and deployment.
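    As a small example of the kind of interactive orbit view mentioned above (generic, not the prototype's actual code), the sketch below plots a Keplerian ellipse in a notebook cell and exposes the eccentricity as an ipywidgets slider:

```python
# Notebook sketch: interactive Keplerian orbit plot with an eccentricity slider.
# Generic illustration, not the prototype described in the abstract.
import numpy as np
import matplotlib.pyplot as plt
from ipywidgets import interact

def plot_orbit(e=0.3, a=1.0):
    theta = np.linspace(0.0, 2.0 * np.pi, 500)
    r = a * (1.0 - e**2) / (1.0 + e * np.cos(theta))   # conic-section orbit
    plt.figure(figsize=(4, 4))
    plt.plot(r * np.cos(theta), r * np.sin(theta))
    plt.plot(0.0, 0.0, "o")                            # central body at a focus
    plt.axis("equal")
    plt.title(f"e = {e:.2f}")
    plt.show()

interact(plot_orbit, e=(0.0, 0.9, 0.05))               # slider appears in the notebook
```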

  18. High performance computing aspects of a dimension independent semi-Lagrangian discontinuous Galerkin code

    NASA Astrophysics Data System (ADS)

    Einkemmer, Lukas

    2016-05-01

    The recently developed semi-Lagrangian discontinuous Galerkin approach is used to discretize hyperbolic partial differential equations (usually first order equations). Since these methods are conservative, local in space, and able to limit numerical diffusion, they are considered a promising alternative to more traditional semi-Lagrangian schemes (which are usually based on polynomial or spline interpolation). In this paper, we consider a parallel implementation of a semi-Lagrangian discontinuous Galerkin method for distributed memory systems (so-called clusters). Both strong and weak scaling studies are performed on the Vienna Scientific Cluster 2 (VSC-2). In the case of weak scaling we observe a parallel efficiency above 0.8 for both two and four dimensional problems and up to 8192 cores. Strong scaling results show good scalability to at least 512 cores (we consider problems that can be run on a single processor in reasonable time). In addition, we study the scaling of a two dimensional Vlasov-Poisson solver that is implemented using the framework provided. All of the simulations are conducted in the context of worst case communication overhead; i.e., in a setting where the CFL (Courant-Friedrichs-Lewy) number increases linearly with the problem size. The framework introduced in this paper facilitates a dimension independent implementation of scientific codes (based on C++ templates) using both an MPI and a hybrid approach to parallelization. We describe the essential ingredients of our implementation.
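    The core of a semi-Lagrangian step, independent of the discontinuous Galerkin representation and the parallelization discussed above, can be sketched in a few lines: trace each grid point back along its characteristic and interpolate the old solution there. The toy example below uses linear interpolation on a periodic 1-D grid:

```python
# 1-D constant-coefficient advection u_t + c u_x = 0 with a semi-Lagrangian step:
# follow characteristics backward and interpolate the previous solution there.
# (Linear interpolation on a periodic grid; the paper's method uses a
# discontinuous Galerkin representation instead.)
import numpy as np

nx, L, c, dt = 256, 1.0, 1.0, 0.01
x = np.linspace(0.0, L, nx, endpoint=False)
u = np.exp(-200.0 * (x - 0.5) ** 2)          # initial Gaussian pulse

for _ in range(100):
    feet = (x - c * dt) % L                  # departure points of characteristics
    u = np.interp(feet, x, u, period=L)      # periodic interpolation

print("pulse centre after advection:", x[np.argmax(u)])
```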

  19. Fully accelerating quantum Monte Carlo simulations of real materials on GPU clusters

    NASA Astrophysics Data System (ADS)

    Esler, Kenneth

    2011-03-01

    Quantum Monte Carlo (QMC) has proved to be an invaluable tool for predicting the properties of matter from fundamental principles, combining very high accuracy with extreme parallel scalability. By solving the many-body Schrödinger equation through a stochastic projection, it achieves greater accuracy than mean-field methods and better scaling with system size than quantum chemical methods, enabling scientific discovery across a broad spectrum of disciplines. In recent years, graphics processing units (GPUs) have provided a high-performance and low-cost new approach to scientific computing, and GPU-based supercomputers are now among the fastest in the world. The multiple forms of parallelism afforded by QMC algorithms make the method an ideal candidate for acceleration in the many-core paradigm. We present the results of porting the QMCPACK code to run on GPU clusters using the NVIDIA CUDA platform. Using mixed precision on GPUs and MPI for intercommunication, we observe typical full-application speedups of approximately 10x to 15x relative to quad-core CPUs alone, while reproducing the double-precision CPU results within statistical error. We discuss the algorithm modifications necessary to achieve good performance on this heterogeneous architecture and present the results of applying our code to molecules and bulk materials. Supported by the U.S. DOE under Contract No. DOE-DE-FG05-08OR23336 and by the NSF under No. 0904572.

  20. Assessment and Application of the ROSE Code for Reactor Outage Thermal-Hydraulic and Safety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liang, Thomas K.S.; Ko, F.-K.; Dai, L.-C

    The currently available tools, such as RELAP5, RETRAN, and others, cannot easily and correctly perform the task of analyzing the system behavior during plant outages. Therefore, a medium-sized program aiming at reactor outage simulation and evaluation, such as midloop operation (MLO) with loss of residual heat removal (RHR), has been developed. Important thermal-hydraulic processes involved during MLO with loss of RHR can be properly simulated by the newly developed reactor outage simulation and evaluation (ROSE) code. The two-region approach with a modified two-fluid model has been adopted to be the theoretical basis of the ROSE code. To verify the analytical model in the first step, posttest calculations against the integral midloop experiments with loss of RHR have been performed. The excellent simulation capacity of the ROSE code against the Institute of Nuclear Energy Research Integral System Test Facility test data is demonstrated. To further mature the ROSE code in simulating a full-sized pressurized water reactor, assessment against the WGOTHIC code and the Maanshan momentary-loss-of-RHR event has been undertaken. The successfully assessed ROSE code is then applied to evaluate the abnormal operation procedure (AOP) with loss of RHR during MLO (AOP 537.4) for the Maanshan plant. The ROSE code also has been successfully transplanted into the Maanshan training simulator to support operator training. How the simulator was upgraded by the ROSE code for MLO will be presented in the future.

  1. Comparison of DAC and MONACO DSMC Codes with Flat Plate Simulation

    NASA Technical Reports Server (NTRS)

    Padilla, Jose F.

    2010-01-01

    Various implementations of the direct simulation Monte Carlo (DSMC) method exist in academia, government and industry. By comparing implementations, deficiencies and merits of each can be discovered. This document reports comparisons between DSMC Analysis Code (DAC) and MONACO. DAC is NASA's standard DSMC production code and MONACO is a research DSMC code developed in academia. These codes have various differences; in particular, they employ distinct computational grid definitions. In this study, DAC and MONACO are compared by having each simulate a blunted flat plate wind tunnel test, using an identical volume mesh. Simulation expense and DSMC metrics are compared. In addition, flow results are compared with available laboratory data. Overall, this study revealed that both codes, excluding grid adaptation, performed similarly. For parallel processing, DAC was generally more efficient. As expected, code accuracy was mainly dependent on physical models employed.

  2. Simulation of Ionospheric Response During Solar Eclipse Events

    NASA Astrophysics Data System (ADS)

    Kordella, L.; Earle, G. D.; Huba, J.

    2016-12-01

    Total solar eclipses are rare, short duration events that present interesting case studies of ionospheric behavior because the structure of the ionosphere is determined and stabilized by varying energies of solar radiation (Lyman alpha, X-ray, U.V., etc.). The ionospheric response to eclipse events is a source of scientific intrigue that has been studied in various capacities over the past 50 years. Unlike the daily terminator crossings, eclipses cause highly localized, steep gradients of ionization efficiency due to their comparatively small solar zenith angle. However, the corona remains present even at full obscuration, meaning that the energy reduction never falls to the levels seen at night. Previous eclipse studies performed by research groups in the US, UK, China and Russia have shown a range of effects, some counter-intuitive and others contradictory. In the shadowed region of an eclipse (i.e. the umbra) it is logical to assume a reduction in ionization rates correlating with the reduction of incident solar radiation. Results have shown that even this straightforward hypothesis may not be true; effects on plasma distribution, motion and temperature are more appreciable than might be expected. Recent advancements in ionospheric simulation codes present the opportunity to investigate the relationship between geophysical conditions, geomagnetic location, and the resulting eclipse-time ionosphere. Here we present computational simulation results using the Naval Research Lab (NRL) developed ionospheric modeling codes Sami2 and Sami3 (Sami2 is Another Model of the Ionosphere), modified with spatio-temporal photoionization attenuation functions derived from theory and empirical data.

  3. A computational geometry approach to pore network construction for granular packings

    NASA Astrophysics Data System (ADS)

    van der Linden, Joost H.; Sufian, Adnan; Narsilio, Guillermo A.; Russell, Adrian R.; Tordesillas, Antoinette

    2018-03-01

    Pore network construction provides the ability to characterize and study the pore space of inhomogeneous and geometrically complex granular media in a range of scientific and engineering applications. Various approaches to the construction have been proposed, however subtle implementational details are frequently omitted, open access to source code is limited, and few studies compare multiple algorithms in the context of a specific application. This study presents, in detail, a new pore network construction algorithm, and provides a comprehensive comparison with two other, well-established Delaunay triangulation-based pore network construction methods. Source code is provided to encourage further development. The proposed algorithm avoids the expensive non-linear optimization procedure in existing Delaunay approaches, and is robust in the presence of polydispersity. Algorithms are compared in terms of structural, geometrical and advanced connectivity parameters, focusing on the application of fluid flow characteristics. Sensitivity of the various networks to permeability is assessed through network (Stokes) simulations and finite-element (Navier-Stokes) simulations. Results highlight strong dependencies of pore volume, pore connectivity, throat geometry and fluid conductance on the degree of tetrahedra merging and the specific characteristics of the throats targeted by the merging algorithm. The paper concludes with practical recommendations on the applicability of the three investigated algorithms.
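    The Delaunay-based starting point that such algorithms share can be sketched briefly: tetrahedralize the particle centres, treat each tetrahedron as a candidate pore, and treat each shared face as a candidate throat. A minimal SciPy version is shown below; the merging criteria and geometric refinements discussed in the paper are deliberately omitted:

```python
# Minimal Delaunay-based pore-network skeleton: tetrahedra of particle centres
# become candidate pores, shared faces become candidate throats. The merging
# and geometric refinement discussed in the paper are deliberately omitted.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(1)
centres = rng.uniform(0.0, 1.0, (200, 3))     # packing of 200 particle centres

tri = Delaunay(centres)
n_pores = len(tri.simplices)                  # one candidate pore per tetrahedron

# Each pair (i, j) of neighbouring tetrahedra shares a triangular face -> throat.
throats = {
    tuple(sorted((i, j)))
    for i, nbrs in enumerate(tri.neighbors)
    for j in nbrs if j != -1
}

print(f"{n_pores} candidate pores, {len(throats)} candidate throats")
```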

  4. Fostering successful scientific software communities

    NASA Astrophysics Data System (ADS)

    Bangerth, W.; Heister, T.; Hwang, L.; Kellogg, L. H.

    2016-12-01

    Developing sustainable open source software packages for the sciences appears at first to be primarily a technical challenge: How can one create stable and robust algorithms, appropriate software designs, sufficient documentation, quality assurance strategies such as continuous integration and test suites, or backward compatibility approaches that yield high-quality software usable not only by the authors, but also the broader community of scientists? However, our experience from almost two decades of leading the development of the deal.II software library (http://www.dealii.org, a widely-used finite element package) and the ASPECT code (http://aspect.dealii.org, used to simulate convection in the Earth's mantle) has taught us that technical aspects are not the most difficult ones in scientific open source software. Rather, it is the social challenge of building and maintaining a community of users and developers interested in answering questions on user forums, contributing code, and jointly finding solutions to common technical and non-technical challenges. These problems are posed in an environment where project leaders typically have no resources to reward the majority of contributors, where very few people are specifically paid for the work they do on the project, and where there is frequent turnover of contributors as project members rotate into and out of jobs. In particular, much software work is done by graduate students who may become fluent in a software package only a year or two before they leave academia. We will discuss strategies we have found do and do not work in maintaining and growing communities around the scientific software projects we lead. Specifically, we will discuss the management style necessary to keep contributors engaged, ways to give credit where credit is due, and structuring documentation to decrease reliance on forums and thereby allow user communities to grow without straining those who answer questions.

  5. 50 CFR Table 2 to Part 680 - Crab Species Code

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Table 2 to Part 680—Crab Species Code (excerpt):
    Species code   Common name     Scientific name
    900            Box             Lopholithodes mandtii
    910            Dungeness       Cancer magister
    921            Red king crab   Paralithodes camtshaticus
    922            Blue king ...

  6. 50 CFR Table 2 to Part 680 - Crab Species Code

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Table 2 to Part 680—Crab Species Code (excerpt):
    Species code   Common name     Scientific name
    900            Box             Lopholithodes mandtii
    910            Dungeness       Cancer magister
    921            Red king crab   Paralithodes camtshaticus
    922            Blue king ...

  7. The Particle Accelerator Simulation Code PyORBIT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorlov, Timofey V; Holmes, Jeffrey A; Cousineau, Sarah M

    2015-01-01

    The particle accelerator simulation code PyORBIT is presented. The structure, implementation, history, parallel and simulation capabilities, and future development of the code are discussed. The PyORBIT code is a new implementation and extension of algorithms of the original ORBIT code that was developed for the Spallation Neutron Source accelerator at the Oak Ridge National Laboratory. The PyORBIT code has a two-level structure. The upper level uses the Python programming language to control the flow of intensive calculations performed by the lower level code implemented in the C++ language. The parallel capabilities are based on MPI communications. PyORBIT is an open source code accessible to the public through the Google Open Source Projects Hosting service.

  8. Harnessing the power of emerging petascale platforms

    NASA Astrophysics Data System (ADS)

    Mellor-Crummey, John

    2007-07-01

    As part of the US Department of Energy's Scientific Discovery through Advanced Computing (SciDAC-2) program, science teams are tackling problems that require computational simulation and modeling at the petascale. A grand challenge for computer science is to develop software technology that makes it easier to harness the power of these systems to aid scientific discovery. As part of its activities, the SciDAC-2 Center for Scalable Application Development Software (CScADS) is building open source software tools to support efficient scientific computing on the emerging leadership-class platforms. In this paper, we describe two tools for performance analysis and tuning that are being developed as part of CScADS: a tool for analyzing scalability and performance, and a tool for optimizing loop nests for better node performance. We motivate these tools by showing how they apply to S3D, a turbulent combustion code under development at Sandia National Laboratory. For S3D, our node performance analysis tool helped uncover several performance bottlenecks. Using our loop nest optimization tool, we transformed S3D's most costly loop nest to reduce execution time by a factor of 2.94 for a processor working on a 50^3 domain.

  9. Open-source Framework for Storing and Manipulation of Plasma Chemical Reaction Data

    NASA Astrophysics Data System (ADS)

    Jenkins, T. G.; Averkin, S. N.; Cary, J. R.; Kruger, S. E.

    2017-10-01

    We present a new open-source framework for the storage and manipulation of plasma chemical reaction data that has emerged from our in-house project MUNCHKIN. This framework consists of Python scripts and C++ programs. It stores data in an SQL database for fast retrieval and manipulation. For example, it is possible to fit cross-section data to the most widely used analytical expressions, calculate reaction rates for Maxwellian distribution functions of colliding particles, and fit them to different analytical expressions. Another important feature of this framework is the ability to calculate transport properties based on the cross-section data and supplied distribution functions. In addition, this framework allows the export of chemical reaction descriptions in LaTeX format for ease of inclusion in scientific papers. With the help of this framework it is possible to generate corresponding VSim (Particle-In-Cell simulation code) and USim (unstructured multi-fluid code) input blocks with appropriate cross-sections.
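    One of the operations described above, computing a reaction rate coefficient for Maxwellian colliding particles from cross-section data, reduces to a single energy integral; the sketch below uses a made-up step-function cross section with a 10 eV threshold purely for illustration:

```python
# Sketch: rate coefficient k(Te) = integral of sigma(E) * v(E) * f_M(E) dE for a
# Maxwellian electron energy distribution. The cross section is a made-up step
# function with a 10 eV threshold; real data would come from the database.
import numpy as np

QE, ME = 1.602176634e-19, 9.1093837015e-31   # elementary charge [C], electron mass [kg]

def sigma(E_eV):                              # fictitious cross section [m^2]
    return np.where(E_eV > 10.0, 1.0e-20, 0.0)

def rate_coefficient(Te_eV, n=4000, Emax_eV=200.0):
    E = np.linspace(1e-3, Emax_eV, n)                                # energy grid [eV]
    f = 2.0 * np.sqrt(E / np.pi) * Te_eV**-1.5 * np.exp(-E / Te_eV)  # Maxwellian EEDF [1/eV]
    v = np.sqrt(2.0 * E * QE / ME)                                   # electron speed [m/s]
    return np.trapz(sigma(E) * v * f, E)                             # k [m^3/s]

for Te in (2.0, 5.0, 10.0):
    print(f"Te = {Te:4.1f} eV  ->  k = {rate_coefficient(Te):.3e} m^3/s")
```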

  10. Combustor Simulation

    NASA Technical Reports Server (NTRS)

    Norris, Andrew

    2003-01-01

    The goal was to perform a 3D simulation of the GE90 combustor as part of a full turbofan engine simulation. The requirements of high fidelity and fast turn-around time call for a massively parallel code. The National Combustion Code (NCC) was chosen for this task, as it supports up to 999 processors and includes state-of-the-art combustion models. Also required is the ability to take inlet conditions from the compressor code and give exit conditions to the turbine code.

  11. An approach for coupled-code multiphysics core simulations from a common input

    DOE PAGES

    Schmidt, Rodney; Belcourt, Kenneth; Hooper, Russell; ...

    2014-12-10

    This study describes an approach for coupled-code multiphysics reactor core simulations that is being developed by the Virtual Environment for Reactor Applications (VERA) project in the Consortium for Advanced Simulation of Light-Water Reactors (CASL). In this approach a user creates a single problem description, called the “VERAIn” common input file, to define and set up the desired coupled-code reactor core simulation. A preprocessing step accepts the VERAIn file and generates a set of fully consistent input files for the different physics codes being coupled. The problem is then solved using a single-executable coupled-code simulation tool applicable to the problem, which is built using VERA infrastructure software tools and the set of physics codes required for the problem of interest. The approach is demonstrated by performing an eigenvalue and power distribution calculation of a typical three-dimensional 17 × 17 assembly with thermal–hydraulic and fuel temperature feedback. All neutronics aspects of the problem (cross-section calculation, neutron transport, power release) are solved using the Insilico code suite and are fully coupled to a thermal–hydraulic analysis calculated by the Cobra-TF (CTF) code. The single-executable coupled-code (Insilico-CTF) simulation tool is created using several VERA tools, including LIME (Lightweight Integrating Multiphysics Environment for coupling codes), DTK (Data Transfer Kit), Trilinos, and TriBITS. Parallel calculations are performed on the Titan supercomputer at Oak Ridge National Laboratory using 1156 cores, and a synopsis of the solution results and code performance is presented. Finally, ongoing development of this approach is also briefly described.
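    Independent of the specific VERA components, the coupling pattern itself is a fixed-point (Picard) iteration in which the physics codes exchange fields until the feedback converges; the toy sketch below uses invented surrogate models for the neutronics power and fuel temperature, not the actual Insilico or CTF physics:

```python
# Toy operator-split (Picard) coupling loop: a surrogate "neutronics" model and
# a surrogate "thermal-hydraulics" model exchange power and fuel temperature
# until the feedback converges. Both surrogate models are invented purely for
# illustration; they are not the Insilico or CTF physics.
import numpy as np

n_cells = 17
z = np.linspace(-1.0, 1.0, n_cells)

def thermal(power):
    """Made-up fuel temperature [K] rising linearly with local power."""
    return 600.0 + 300.0 * power

def neutronics(T_fuel):
    """Made-up Doppler-like feedback: power sags where the fuel is hotter."""
    p = np.exp(-1.0e-4 * (T_fuel - 600.0))
    return p / p.mean()                       # keep the shape normalized to unit mean

power = 1.0 + 0.3 * np.cos(0.5 * np.pi * z)   # initial cosine-like shape
power /= power.mean()

for it in range(50):
    new_power = neutronics(thermal(power))    # T-H solve, then neutronics solve
    if np.max(np.abs(new_power - power)) < 1e-8:
        break
    power = new_power

print(f"converged after {it} iterations; peak-to-mean power = {power.max():.4f}")
```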

  12. Investigation on the Capability of a Non Linear CFD Code to Simulate Wave Propagation

    DTIC Science & Technology

    2003-02-01

    Pedro de la Calzada; Pablo Quintana; Manuel Antonio Burgos (ITP, S.A., Parque Empresarial Fernando, avenida ...). "... mechanisms above presented, simulation of unsteady aerodynamics with linear and nonlinear CFD codes is an ongoing activity within the turbomachinery industry."

  13. Production code control system for hydrodynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slone, D.M.

    1997-08-18

    We describe how the Production Code Control System (PCCS), written in Perl, has been used to control and monitor the execution of a large hydrodynamics simulation code in a production environment. We have been able to integrate new, disparate, and often independent applications into the PCCS framework without the need to modify any of our existing application codes. Both users and code developers see a consistent interface to the simulation code and associated applications regardless of the physical platform, whether an MPP, SMP, server, or desktop workstation. We will also describe our use of Perl to develop a configuration management system for the simulation code, as well as a code usage database and report generator. We used Perl to write a backplane that allows us to plug in preprocessors, the hydrocode, postprocessors, visualization tools, persistent storage requests, and other codes. We need only teach PCCS a minimal amount about any new tool or code to essentially plug it in and make it usable to the hydrocode. PCCS has made it easier to link together disparate codes, since using Perl has removed the need to learn the idiosyncrasies of system or RPC programming. The text handling in Perl makes it easy to teach PCCS about new codes, or changes to existing codes.

  14. Simulation of spacecraft attitude dynamics using TREETOPS and model-specific computer Codes

    NASA Technical Reports Server (NTRS)

    Cochran, John E.; No, T. S.; Fitz-Coy, Norman G.

    1989-01-01

    The simulation of spacecraft attitude dynamics and control using the generic, multi-body code called TREETOPS and other codes written especially to simulate particular systems is discussed. Differences in the methods used to derive equations of motion--Kane's method for TREETOPS and the Lagrangian and Newton-Euler methods, respectively, for the other two codes--are considered. Simulation results from the TREETOPS code are compared with those from the other two codes for two example systems. One system is a chain of rigid bodies; the other consists of two rigid bodies attached to a flexible base body. Since the computer codes were developed independently, consistent results serve as a verification of the correctness of all the programs. Differences in the results are discussed. Results for the two-rigid-body, one-flexible-body system are useful also as information on multi-body, flexible, pointing payload dynamics.

  15. Test Driven Development: Lessons from a Simple Scientific Model

    NASA Astrophysics Data System (ADS)

    Clune, T. L.; Kuo, K.

    2010-12-01

    In the commercial software industry, unit testing frameworks have emerged as a disruptive technology that has permanently altered the process by which software is developed. Unit testing frameworks significantly reduce traditional barriers, both practical and psychological, to creating and executing tests that verify software implementations. A new development paradigm, known as test driven development (TDD), has emerged from unit testing practices, in which low-level tests (i.e. unit tests) are created by developers prior to implementing new pieces of code. Although somewhat counter-intuitive, this approach actually improves developer productivity. In addition to reducing the average time for detecting software defects (bugs), the requirement to provide procedure interfaces that enable testing frequently leads to superior design decisions. Although TDD is widely accepted in many software domains, its applicability to scientific modeling still warrants reasonable skepticism. While the technique is clearly relevant for infrastructure layers of scientific models such as the Earth System Modeling Framework (ESMF), numerical and scientific components pose a number of challenges to TDD that are not often encountered in commercial software. Nonetheless, our experience leads us to believe that the technique has great potential not only for developer productivity, but also as a tool for understanding and documenting the basic scientific assumptions upon which our models are implemented. We will provide a brief introduction to test driven development and then discuss our experience in using TDD to implement a relatively simple numerical model that simulates the growth of snowflakes. Many of the lessons learned are directly applicable to larger scientific models.
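    To give a concrete flavor of the workflow, the sketch below writes the tests first for a hypothetical snowflake-growth routine (the function name and linear growth law are invented for illustration, not taken from the model described here) and then supplies an implementation that makes them pass:

```python
# Test-first sketch: the unit tests below are written before the implementation.
# The function name and the simple linear-growth assumption are invented for
# illustration; they are not the snowflake model discussed in the abstract.
import unittest

def snowflake_mass(t, growth_rate=1.0e-9, initial_mass=1.0e-12):
    """Mass in kg after t seconds, assuming a constant deposition rate."""
    if t < 0:
        raise ValueError("time must be non-negative")
    return initial_mass + growth_rate * t

class TestSnowflakeMass(unittest.TestCase):
    def test_initial_mass_returned_at_t_zero(self):
        self.assertAlmostEqual(snowflake_mass(0.0), 1.0e-12)

    def test_mass_increases_monotonically(self):
        self.assertGreater(snowflake_mass(10.0), snowflake_mass(1.0))

    def test_negative_time_is_rejected(self):
        with self.assertRaises(ValueError):
            snowflake_mass(-1.0)

if __name__ == "__main__":
    unittest.main()
```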

  16. Improving Hall Thruster Plume Simulation through Refined Characterization of Near-field Plasma Properties

    NASA Astrophysics Data System (ADS)

    Huismann, Tyler D.

    Due to the rapidly expanding role of electric propulsion (EP) devices, it is important to evaluate their integration with other spacecraft systems. Specifically, EP device plumes can play a major role in spacecraft integration, and as such, accurate characterization of plume structure bears on mission success. This dissertation addresses issues related to accurate prediction of plume structure in a particular type of EP device, a Hall thruster. This is done in two ways: first, by coupling current plume simulation models with current models that simulate a Hall thruster's internal plasma behavior; second, by improving plume simulation models and thereby increasing physical fidelity. These methods are assessed by comparing simulated results to experimental measurements. Assessment indicates the two methods improve plume modeling capabilities significantly: using far-field ion current density as a metric, these approaches used in conjunction improve agreement with measurements by a factor of 2.5, as compared to previous methods. Based on comparison to experimental measurements, recent computational work on discharge chamber modeling has been largely successful in predicting properties of internal thruster plasmas. This model can provide detailed information on plasma properties at a variety of locations. Frequently, experimental data is not available at many locations that are of interest regarding computational models. In the absence of experimental data, there are limited alternatives for scientifically determining the plasma properties that are necessary as inputs into plume simulations. Therefore, this dissertation focuses on coupling current models that simulate internal thruster plasma behavior with plume simulation models. Further, recent experimental work on atom-ion interactions has provided a better understanding of particle collisions within plasmas. This experimental work is used to update collision models in a current plume simulation code. Previous versions of the code assume an unknown dependence between particles' pre-collision velocities and post-collision scattering angles. This dissertation focuses on updating several of these types of collisions by assuming a curve fit based on the measurements of atom-ion interactions, such that previously unknown angular dependences are well-characterized.

  17. Main steam line break accident simulation of APR1400 using the model of ATLAS facility

    NASA Astrophysics Data System (ADS)

    Ekariansyah, A. S.; Deswandri; Sunaryo, Geni R.

    2018-02-01

    A main steam line break simulation for the APR1400, an advanced PWR design, has been performed using the RELAP5 code. The simulation was conducted with a model of the thermal-hydraulic test facility ATLAS, which represents a scaled-down facility of the APR1400 design. The main steam line break event is described in an open-access safety report document, whose initial conditions and assumptions for the analysis were utilized in performing the simulation and the analysis of the selected parameters. The objective of this work was to conduct benchmark activities by comparing the simulation results of the CESEC-III code, a conservative code, with the results of RELAP5, a best-estimate code. Based on the simulation results, a general similarity in the behavior of the selected parameters was observed between the two codes. However, the degree of accuracy still needs further research and analysis through comparison with other best-estimate codes. Uncertainties arising from the ATLAS model should be minimized by taking into account much more specific data in developing the APR1400 model.

  18. High Energy Physics Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and High Energy Physics, June 10-12, 2015, Bethesda, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habib, Salman; Roser, Robert; Gerber, Richard

    The U.S. Department of Energy (DOE) Office of Science (SC) Offices of High Energy Physics (HEP) and Advanced Scientific Computing Research (ASCR) convened a programmatic Exascale Requirements Review on June 10–12, 2015, in Bethesda, Maryland. This report summarizes the findings, results, and recommendations derived from that meeting. The high-level findings and observations are as follows. Larger, more capable computing and data facilities are needed to support HEP science goals in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of the demand at the 2025 timescale is at least two orders of magnitude — and in some cases greater — than that available currently. The growth rate of data produced by simulations is overwhelming the current ability of both facilities and researchers to store and analyze it. Additional resources and new techniques for data analysis are urgently needed. Data rates and volumes from experimental facilities are also straining the current HEP infrastructure in its ability to store and analyze large and complex data volumes. Appropriately configured leadership-class facilities can play a transformational role in enabling scientific discovery from these datasets. A close integration of high-performance computing (HPC) simulation and data analysis will greatly aid in interpreting the results of HEP experiments. Such an integration will minimize data movement and facilitate interdependent workflows. Long-range planning between HEP and ASCR will be required to meet HEP’s research needs. To best use ASCR HPC resources, the experimental HEP program needs (1) an established, long-term plan for access to ASCR computational and data resources, (2) the ability to map workflows to HPC resources, (3) the ability for ASCR facilities to accommodate workflows run by collaborations potentially comprising thousands of individual members, (4) to transition codes to the next-generation HPC platforms that will be available at ASCR facilities, (5) to build up and train a workforce capable of developing and using simulations and analysis to support HEP scientific research on next-generation systems.

  19. A need for a code of ethics in science communication?

    NASA Astrophysics Data System (ADS)

    Benestad, R. E.

    2009-09-01

    The modern western civilization and high standard of living are to a large extent the 'fruits' of scientific endeavor over generations. Some examples include the longer life expectancy due to progress in the medical sciences, and changes in infrastructure associated with the utilization of electromagnetism. Modern meteorology is not possible without state-of-the-art digital computers, satellites, remote sensing, and communications. Science is also relevant to policy making, e.g. the present hot topic of climate change. Climate scientists have recently become much exposed to media focus and mass communications, a task for which many are not trained. Furthermore, science, communication, and politics have different objectives, and do not necessarily mix. Scientists have an obligation to provide unbiased information, and a code of ethics is needed to give guidance on acceptable and unacceptable conduct. Some examples of questionable conduct in Norway include using the title 'Ph.D' to imply scientific authority when the person has never obtained such an academic degree, or writing biased, one-sided articles in a Norwegian encyclopedia that do not reflect the scientific consensus. It is proposed here that a set of guidelines (for scientists and journalists) and a code of conduct could provide recommendations on how to act in the media - similar to existing codes of conduct for carrying out research - to which everyone could agree, even when disagreeing on specific scientific questions.

  20. Implementing Subduction Models in the New Mantle Convection Code Aspect

    NASA Astrophysics Data System (ADS)

    Arredondo, Katrina; Billen, Magali

    2014-05-01

    The geodynamic community has utilized various numerical modeling codes as scientific questions arise and computer processing power increases. Citcom, a widely used mantle convection code, has limitations and vulnerabilities such as temperature overshoots of hundreds or thousands of kelvin (e.g., Kommu et al., 2013). Aspect, intended as a more powerful successor, is in active development with additions such as Adaptive Mesh Refinement (AMR) and improved solvers (Kronbichler et al., 2012). The validity and ease of use of Aspect are important to its survival and to its role as a possible upgrade and replacement for Citcom. Development of publishable models illustrates the capacity of Aspect. We present work on the addition of non-linear solvers and stress-dependent rheology to Aspect. With a solid foundational knowledge of C++, these additions were readily incorporated into Aspect and tested against CitcomS. Time-dependent subduction models akin to those in Billen and Hirth (2007) are built and compared in CitcomS and Aspect. Comparison with CitcomS assists Aspect development and showcases its flexibility, usability and capabilities. References: Billen, M. I., and G. Hirth, 2007. Rheologic controls on slab dynamics. Geochemistry, Geophysics, Geosystems. Kommu, R., E. Heien, L. H. Kellogg, W. Bangerth, T. Heister, E. Studley, 2013. The Overshoot Phenomenon in Geodynamics Codes. American Geophysical Union Fall Meeting. Kronbichler, M., T. Heister, W. Bangerth, 2012. High Accuracy Mantle Convection Simulation through Modern Numerical Methods. Geophys. J. Int.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehe, Remi

    Many simulation codes produce data in the form of a set of field values or a set of particle positions (one such example is particle-in-cell codes, which produce data on the electromagnetic fields that they simulate). However, each particular code uses its own format and layout for the output data. This makes it difficult to compare the results of different simulation codes, or to have a common visualization tool for these results. A standardized layout for fields and particles has recently been developed: the openPMD format (www.openpmd.org). This format is open-source and specifies a standard way in which field data and particle data should be written. The openPMD format is already implemented in the particle-in-cell code Warp (developed at LBL) and in PIConGPU (developed at HZDR, Germany). In this context, the proposed software (openPMD-viewer) is a Python package which allows users to access and visualize any data that has been formatted according to the openPMD standard. This package contains two main components: a Python API, which allows one to read and extract the data from an openPMD file so as to work with it within the Python environment (e.g., plot the data and reprocess it with particular Python functions); and a graphical interface, which works with the IPython notebook and allows one to quickly visualize the data and browse through a set of openPMD files. The proposed software will typically be used when analyzing the results of numerical simulations. It will be useful for quickly extracting scientific meaning from a set of numerical data.
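    As a rough illustration of the Python API described above, the sketch below opens a directory of openPMD files and extracts one field and one particle quantity. It is a minimal sketch only: the import path, the class name OpenPMDTimeSeries, and the method signatures follow the openly documented openPMD-viewer package to the best of my knowledge, and the directory path, iteration choice, and species name are placeholders.

```python
# Minimal sketch of reading openPMD-formatted output with openPMD-viewer.
# Paths, iteration numbers, and species names are placeholders.
from openpmd_viewer import OpenPMDTimeSeries

ts = OpenPMDTimeSeries('./diags/hdf5/')        # directory of openPMD files
print(ts.iterations)                           # available output iterations

# Extract the longitudinal electric field at the last iteration
Ez, info = ts.get_field(field='E', coord='z', iteration=ts.iterations[-1])

# Extract particle positions and momenta for one species
x, uz = ts.get_particle(var_list=['x', 'uz'], species='electrons',
                        iteration=ts.iterations[-1])
```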

  2. Python Radiative Transfer Emission code (PyRaTE): non-LTE spectral lines simulations

    NASA Astrophysics Data System (ADS)

    Tritsis, A.; Yorke, H.; Tassis, K.

    2018-05-01

    We describe PyRaTE, a new non-local thermodynamic equilibrium (non-LTE) line radiative transfer code developed specifically for post-processing astrochemical simulations. Population densities are estimated using the escape probability method. When computing the escape probability, the optical depth is calculated towards all directions, with density, molecular abundance, temperature and velocity variations all taken into account. A very easy-to-use interface, capable of importing data from simulation outputs produced with all major astrophysical codes, has also been developed. The code is written in PYTHON using an "embarrassingly parallel" strategy and can handle all geometries and projection angles. We benchmark the code by comparing our results with those from RADEX (van der Tak et al. 2007) and against analytical solutions, and we present case studies using hydrochemical simulations. The code will be released for public use.
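    The escape probability approach referenced above has a standard closed form in idealized cases; in the Sobolev (large velocity gradient) approximation the probability that a line photon escapes is beta = (1 - exp(-tau)) / tau. The snippet below is a generic sketch of that textbook relation, not code from PyRaTE itself.

```python
import numpy as np

def escape_probability(tau):
    """Textbook Sobolev/LVG escape probability beta(tau) = (1 - exp(-tau)) / tau.

    Uses a first-order expansion for very small optical depth to avoid
    division by zero.
    """
    tau = np.asarray(tau, dtype=float)
    small = np.abs(tau) < 1e-6
    safe_tau = np.where(small, 1.0, tau)
    return np.where(small, 1.0 - tau / 2.0, (1.0 - np.exp(-tau)) / safe_tau)

print(escape_probability([1e-8, 0.1, 1.0, 10.0]))  # approaches 1 at low tau, ~0.1 at tau = 10
```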

  3. The Use of a Code-generating System for the Derivation of the Equations for Wind Turbine Dynamics

    NASA Astrophysics Data System (ADS)

    Ganander, Hans

    2003-10-01

    For many reasons, the size of wind turbines on the rapidly growing wind energy market is increasing, and the relations between the aeroelastic properties of these new large turbines are changing. Modifications of turbine designs and control concepts are also influenced by the growing size. All these trends require the development of computer codes for design and certification. Moreover, there is a strong desire for design optimization procedures, which require fast codes. General codes, e.g. finite element codes, normally allow such modifications and improvements of existing wind turbine models relatively easily. However, the calculation times of such codes are unfavourably long, certainly for optimization use. The use of an automatic code-generating system is an alternative that addresses both key issues: the code itself and design optimization. This technique can be used for the rapid generation of codes for particular wind turbine simulation models. These ideas have been followed in the development of new versions of the wind turbine simulation code VIDYN. The equations of the simulation model were derived according to the Lagrange equation using Mathematica®, which was directed to output the results in Fortran code format. In this way the simulation code is automatically adapted to an actual turbine model, in terms of subroutines containing the equations of motion, definitions of parameters and degrees of freedom. Since the start in 1997, these methods, constituting a systematic way of working, have been used to develop specific, efficient calculation codes. The experience with this technique has been very encouraging, inspiring the continued development of new versions of the simulation code as the need has arisen and as the interest in design optimization grows.
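    The abstract describes deriving equations of motion symbolically and emitting them as Fortran. The sketch below reproduces that workflow in miniature using SymPy rather than Mathematica (a substitution made purely for illustration), and for a single-degree-of-freedom pendulum instead of a full turbine model.

```python
import sympy as sp

t = sp.symbols('t')
m, l, g = sp.symbols('m l g', positive=True)
theta = sp.Function('theta')(t)

# Lagrangian of a simple pendulum: L = T - V
T = sp.Rational(1, 2) * m * (l * theta.diff(t))**2
V = -m * g * l * sp.cos(theta)
L = T - V

# Lagrange's equation: d/dt(dL/d(theta_dot)) - dL/d(theta) = 0
eom = sp.diff(L.diff(theta.diff(t)), t) - L.diff(theta)

# Solve for the angular acceleration and emit it as Fortran source
theta_ddot = sp.solve(sp.Eq(eom, 0), theta.diff(t, 2))[0]
expr = theta_ddot.subs(theta, sp.Symbol('th'))   # plain symbol for code generation
print(sp.fcode(expr, assign_to='thetaddot', standard=95))
```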

  4. MOCCA code for star cluster simulation: comparison with optical observations using COCOA

    NASA Astrophysics Data System (ADS)

    Askar, Abbas; Giersz, Mirek; Pych, Wojciech; Olech, Arkadiusz; Hypki, Arkadiusz

    2016-02-01

    We introduce and present preliminary results from COCOA (Cluster simulatiOn Comparison with ObservAtions) code for a star cluster after 12 Gyr of evolution simulated using the MOCCA code. The COCOA code is being developed to quickly compare results of numerical simulations of star clusters with observational data. We use COCOA to obtain parameters of the projected cluster model. For comparison, a FITS file of the projected cluster was provided to observers so that they could use their observational methods and techniques to obtain cluster parameters. The results show that the similarity of cluster parameters obtained through numerical simulations and observations depends significantly on the quality of observational data and photometric accuracy.

  5. VERAIn

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simunovic, Srdjan

    2015-02-16

    CASL's modeling and simulation technology, the Virtual Environment for Reactor Applications (VERA), incorporates coupled physics and science-based models, state-of-the-art numerical methods, modern computational science, integrated uncertainty quantification (UQ) and validation against data from operating pressurized water reactors (PWRs), single-effect experiments, and integral tests. The computational simulation component of VERA is the VERA Core Simulator (VERA-CS). The core simulator is the specific collection of multi-physics computer codes used to model and deplete a LWR core over multiple cycles. The core simulator has a single common input file that drives all of the different physics codes. The parser code, VERAIn, converts VERA Input into an XML file that is used as input to different VERA codes.

  6. Mean Line Pump Flow Model in Rocket Engine System Simulation

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.; Lavelle, Thomas M.

    2000-01-01

    A mean line pump flow modeling method has been developed to provide a fast capability for modeling turbopumps of rocket engines. Based on this method, a mean line pump flow code PUMPA has been written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The pump code can model axial flow inducers, mixed-flow and centrifugal pumps. The code can model multistage pumps in series. The code features rapid input setup and computer run time, and is an effective analysis and conceptual design tool. The map generation capability of the code provides the map information needed for interfacing with a rocket engine system modeling code. The off-design and multistage modeling capabilities of the code permit parametric design space exploration of candidate pump configurations and provide pump performance data for engine system evaluation. The PUMPA code has been integrated with the Numerical Propulsion System Simulation (NPSS) code and an expander rocket engine system has been simulated. The mean line pump flow code runs as an integral part of the NPSS rocket engine system simulation and provides key pump performance information directly to the system model at all operating conditions.
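    A mean line analysis of the sort described above typically starts from the Euler pump equation, which relates the head rise across a rotor to the change in tangential (swirl) velocity at the mean line. The snippet below is a generic textbook sketch of that relation, not an excerpt from PUMPA; the stage geometry and flow numbers are invented for illustration.

```python
G = 9.81  # gravitational acceleration, m/s^2

def euler_head(u2, cu2, u1=0.0, cu1=0.0):
    """Ideal head rise (m) from the Euler pump equation:
    H = (U2*Cu2 - U1*Cu1) / g, using mean-line blade speeds U and
    tangential flow velocities Cu at rotor inlet (1) and exit (2)."""
    return (u2 * cu2 - u1 * cu1) / G

# Invented example: centrifugal stage with negligible inlet swirl
u2 = 250.0    # blade speed at impeller exit, m/s
cu2 = 180.0   # tangential flow velocity at exit, m/s
print(f"ideal head: {euler_head(u2, cu2):.0f} m")
```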

  7. NSSDC index of international scientific rocket launches ordered by sponsoring country/agency

    NASA Technical Reports Server (NTRS)

    1972-01-01

    International scientific rocket launches are listed by discipline codes and by sponsoring country/agency identifications. Launch sites, experiments, approximate apogee, success, and principal experimenters are also shown.

  8. Salt, time, and metaphor: examining norms in scientific culture

    NASA Astrophysics Data System (ADS)

    Brady, Anna G.

    2017-06-01

    As has been widely discussed, the National Research Council's (NRC) current policy in United States education advocates supporting students toward acquiring skills to engage in scientific practices. NRC policy also suggests that supporting students in the practices of science may require different approaches than what is required for supporting student engagement with scientific content. Further, acquiring skills in scientific practices is not limited to gaining proficiency in utilizing tools that support scientific inquiry: students must also understand how to interpret information generated from such tools. These tools of scientific practice are embedded within scientific culture, which, from Sewell's perspective, comprises both practice and a semiotic code (symbols and meanings). To become scientifically literate, students must learn to utilize this code in practice. Author Germà Garcia-Belmonte identified one example of learning to utilize the semiotic code in scientific practice and considers challenges faced by undergraduate physics and engineering students within that context. Garcia-Belmonte observes that students struggle to interpret symbols and meaning (the visual display generated) while engaging in practice (utilizing an oscilloscope) and posits that two culturally bound, competing linguistic metaphors of time may be the cause. Ultimately, however, the author does not explore beyond hypotheses. Although his theory may be correct, the paper serves as a reminder of the responsibility we have to students. As educators, it is useful and beneficial to make observations and develop theories surrounding why our students struggle. However, in addition to theorizing on why, for example, a particular scientific norm might present challenges for our students, we must remain mindful that challenges may not be uniform and may vary considerably according to students' culture(s). Engaging with students and soliciting specific information regarding the challenges they face allows us, as educators, to both examine whether students' reported challenges align or conflict with our own perceptions of those challenges, and subsequently devise and test methods toward supporting students in overcoming their challenges.

  9. Three-dimensional Monte-Carlo simulation of gamma-ray scattering and production in the atmosphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, D.J.

    1989-05-15

    Monte Carlo codes have been developed to simulate gamma-ray scattering and production in the atmosphere. The scattering code simulates interactions of low-energy gamma rays (20 to several hundred keV) from an astronomical point source in the atmosphere; a modified code also simulates scattering in a spacecraft. Four incident spectra, typical of gamma-ray bursts, solar flares, and the Crab pulsar, and 511 keV line radiation have been studied. These simulations are consistent with observations of solar flare radiation scattered from the atmosphere. The production code simulates the interactions of cosmic rays which produce high-energy (above 10 MeV) photons and electrons. It has been used to calculate gamma-ray and electron albedo intensities at Palestine, Texas and at the equator; the results agree with observations in most respects. With minor modifications this code can be used to calculate intensities of other high-energy particles. Both codes are fully three-dimensional, incorporating a curved atmosphere; the production code also incorporates the variation with both zenith and azimuth of the incident cosmic-ray intensity due to geomagnetic effects. These effects are clearly reflected in the calculated albedo by intensity contrasts between the horizon and nadir, and between the east and west horizons.

  10. Development of MCNPX-ESUT computer code for simulation of neutron/gamma pulse height distribution

    NASA Astrophysics Data System (ADS)

    Abolfazl Hosseini, Seyed; Vosoughi, Naser; Zangian, Mehdi

    2015-05-01

    In this paper, the development of the MCNPX-ESUT (MCNPX-Energy Engineering of Sharif University of Technology) computer code for the simulation of neutron/gamma pulse height distributions is reported. Since liquid organic scintillators like NE-213 are well suited and routinely used for spectrometry in mixed neutron/gamma fields, this type of detector is selected for simulation in the present study. The proposed algorithm includes four main steps. The first step is the modeling of neutron/gamma particle transport and the interactions with the materials in the environment and the detector volume. In the second step, the number of scintillation photons due to charged particles such as electrons, alphas, protons and carbon nuclei in the scintillator material is calculated. In the third step, the transport of scintillation photons in the scintillator and light guide is simulated. Finally, the resolution corresponding to the experiment is applied in the last step of the simulation. Unlike similar computer codes such as SCINFUL, NRESP7 and PHRESP, the developed code is applicable to both neutron and gamma sources. Hence, the discrimination of neutrons and gammas in mixed fields may be performed using the MCNPX-ESUT computer code. The main feature of the MCNPX-ESUT computer code is that the neutron/gamma pulse height simulation may be performed without any post-processing. In the present study, the pulse height distributions due to a monoenergetic neutron/gamma source in an NE-213 detector are simulated using the MCNPX-ESUT computer code. The simulated neutron pulse height distributions are validated through comparison with experimental data (Gohil et al., Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 664 (2012) 304-309) and with the results obtained from similar computer codes like SCINFUL, NRESP7 and Geant4. The simulated gamma pulse height distribution for a 137Cs source is also compared with experimental data.
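    The last step of the algorithm, applying the detector's experimental resolution, usually amounts to smearing the ideal pulse-height spectrum with an energy-dependent Gaussian. The code below is a generic sketch of such a broadening step; the resolution function and its coefficients are placeholders, not the ones used in MCNPX-ESUT.

```python
import numpy as np

def broaden_spectrum(light_output, a=0.05, b=0.10, bins=256, lo=0.0, hi=5.0):
    """Smear ideal scintillation light output (MeVee) with a Gaussian whose
    width follows a placeholder resolution model sigma(L) = a*L + b*sqrt(L)."""
    rng = np.random.default_rng(1)
    L = np.asarray(light_output, dtype=float)
    sigma = a * L + b * np.sqrt(L)
    smeared = rng.normal(L, sigma)
    hist, edges = np.histogram(smeared, bins=bins, range=(lo, hi))
    return hist, edges

# Example: a fake mono-energetic light-output sample at 2 MeVee
hist, edges = broaden_spectrum(np.full(100000, 2.0))
print(edges[hist.argmax()])  # the broadened peak should remain near 2 MeVee
```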

  11. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spentzouris, Panagiotis; /Fermilab; Cary, John

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  12. SPH for impact force and ricochet behavior of water-entry bodies

    NASA Astrophysics Data System (ADS)

    Omidvar, Pourya; Farghadani, Omid; Nikeghbali, Pooyan

    The numerical modeling of fluid interaction with a bouncing body has many scientific and engineering applications. In this paper, the problem of the water impact of a body on a free surface is investigated, where the fixed ghost boundary condition is added to the open source code SPHysics2D to rectify the oscillations in pressure distributions seen with the repulsive boundary condition. First, after introducing the SPH methodology and the boundary condition options, the still water problem is simulated using the two types of boundary conditions. It is shown that the fixed ghost boundary condition gives a better result for the hydrostatic pressure. Then the dam-break problem, which is a benchmark test case in SPH, is simulated and compared with available data. In order to show the behavior of the hydrostatic forces on bodies, a fixed/floating cylinder is placed on the free surface, looking carefully at the force and heave profiles. Finally, the impact of a body on the free surface is successfully simulated for different impact angles and velocities.
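    SPH schemes of the kind used here interpolate field values with a compactly supported smoothing kernel; the cubic spline kernel is a common default in SPHysics-type codes. The function below is a generic 2D cubic spline kernel, included only to illustrate the basic building block, and is not taken from SPHysics2D.

```python
import numpy as np

def cubic_spline_w(r, h):
    """Standard 2D cubic spline SPH kernel W(r, h) with support radius 2h."""
    q = np.asarray(r, dtype=float) / h
    sigma = 10.0 / (7.0 * np.pi * h**2)          # 2D normalization constant
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

# Kernel values for a few particle separations with smoothing length h = 0.05 m
print(cubic_spline_w([0.0, 0.05, 0.1], h=0.05))
```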

  13. a Framework for Distributed Mixed Language Scientific Applications

    NASA Astrophysics Data System (ADS)

    Quarrie, D. R.

    The Object Management Group has defined an architecture (CORBA) for distributed object applications based on an Object Request Broker and Interface Definition Language. This project builds upon this architecture to establish a framework for the creation of mixed language scientific applications. A prototype compiler has been written that generates FORTRAN 90 or Eiffel stubs and skeletons and the required C++ glue code from an input IDL file that specifies object interfaces. This generated code can be used directly for non-distributed mixed language applications or in conjunction with the C++ code generated from a commercial IDL compiler for distributed applications. A feasibility study is presently underway to see whether a fully integrated software development environment for distributed, mixed-language applications can be created by modifying the back-end code generator of a commercial CASE tool to emit IDL.

  14. Object-oriented approach for gas turbine engine simulation

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.; Felder, James L.

    1995-01-01

    An object-oriented gas turbine engine simulation program was developed. This program is a prototype for a more complete, commercial-grade engine performance program now being proposed as part of the Numerical Propulsion System Simulator (NPSS). This report discusses the architectural issues of this complex software system and the lessons learned from developing the prototype code. The prototype code is a fully functional, general purpose engine simulation program; however, only the component models necessary to model a transient compressor test rig have been written. The production system will be capable of steady-state and transient modeling of almost any turbine engine configuration. Chief among the architectural considerations for this code was the framework in which the various software modules interact. These modules include the equation solver, simulation code, data model, event handler, and user interface. Also documented in this report are the component-based design of the simulation module and the inter-component communication paradigm. Object class hierarchies for some of the code modules are given.

  15. Two-dimensional implosion simulations with a kinetic particle code [2D implosion simulations with a kinetic particle code]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sagert, Irina; Even, Wesley Paul; Strother, Terrance Timothy

    Here, we perform two-dimensional implosion simulations using a Monte Carlo kinetic particle code. The application of a kinetic transport code is motivated, in part, by the occurrence of nonequilibrium effects in inertial confinement fusion capsule implosions, which cannot be fully captured by hydrodynamic simulations. Kinetic methods, on the other hand, are able to describe both continuum and rarefied flows. We perform simple two-dimensional disk implosion simulations using one-particle species and compare the results to simulations with the hydrodynamics code rage. The impact of the particle mean free path on the implosion is also explored. In a second study, we focus on the formation of fluid instabilities from induced perturbations. We find good agreement with hydrodynamic studies regarding the location of the shock and the implosion dynamics. Differences are found in the evolution of fluid instabilities, originating from the higher resolution of rage and statistical noise in the kinetic studies.

  16. Two-dimensional implosion simulations with a kinetic particle code [2D implosion simulations with a kinetic particle code]

    DOE PAGES

    Sagert, Irina; Even, Wesley Paul; Strother, Terrance Timothy

    2017-05-17

    Here, we perform two-dimensional implosion simulations using a Monte Carlo kinetic particle code. The application of a kinetic transport code is motivated, in part, by the occurrence of nonequilibrium effects in inertial confinement fusion capsule implosions, which cannot be fully captured by hydrodynamic simulations. Kinetic methods, on the other hand, are able to describe both continuum and rarefied flows. We perform simple two-dimensional disk implosion simulations using one-particle species and compare the results to simulations with the hydrodynamics code rage. The impact of the particle mean free path on the implosion is also explored. In a second study, we focus on the formation of fluid instabilities from induced perturbations. We find good agreement with hydrodynamic studies regarding the location of the shock and the implosion dynamics. Differences are found in the evolution of fluid instabilities, originating from the higher resolution of rage and statistical noise in the kinetic studies.

  17. Toward a first-principles integrated simulation of tokamak edge plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, C S; Klasky, Scott A; Cummings, Julian

    2008-01-01

    The performance of ITER is anticipated to be highly sensitive to the edge plasma condition. The edge pedestal in ITER needs to be predicted from an integrated simulation of the necessary first-principles, multi-scale physics codes. The mission of the SciDAC Fusion Simulation Project (FSP) Prototype Center for Plasma Edge Simulation (CPES) is to deliver such a code integration framework by (1) building new kinetic codes XGC0 and XGC1, which can simulate the edge pedestal buildup; (2) using and improving the existing MHD codes ELITE, M3D-OMP, M3D-MPP and NIMROD for the study of large-scale edge instabilities called Edge Localized Modes (ELMs); and (3) integrating the codes into a framework using cutting-edge computer science technology. The collaborative effort among physics, computer science, and applied mathematics within CPES has created the first working version of the End-to-end Framework for Fusion Integrated Simulation (EFFIS), which can be used to study pedestal-ELM cycles.

  18. LavaSIM: the effect of heat transfer in 3D on lava flow characteristics (Invited)

    NASA Astrophysics Data System (ADS)

    Fujita, E.

    2013-12-01

    The characteristics of lava flow are governed by many parameters, such as lava viscosity, effusion rate, and ground topography. The accuracy and applicability of a lava flow simulation code are evaluated by whether the numerical simulation can reproduce these features quantitatively, which is important from both strategic and scientific points of view. Many lava flow simulation codes have been proposed, and they fall into two categories, deterministic and probabilistic models. LavaSIM belongs to the former category and has the disadvantage of being time consuming. However, LavaSIM solves the equations of continuity, motion and energy step by step and has the advantage of three-dimensional analysis of solid-liquid two-phase flow, including the heat transfer between lava, solidified crust, air, water and ground, and three-dimensional convection in the liquid lava. In other words, LavaSIM lets us examine the detailed structure of a lava flow. Therefore, this code can produce both channeled and fan-dispersive flows. The margin of the flow is solidified by cooling, and these solidified crusts control the behavior of the successive lava flow. In the case of a channel flow, the solidified margin supports a stable central main flow and elongates the lava flow distance. The cross section of the lava flow shows that the liquid lava flows between the solidified crusts. As for the lava extrusion flow rate, LavaSIM can include a time function as well as the location of the vents. In some cases, parts of the solidified wall may be broken by the pressure of the successive flow and/or by re-melting. These mechanisms could explain complex features of the lava flows observed at many volcanoes in the world. Applying LavaSIM to the benchmark tests organized by V-hub is important for improving lava flow evaluation techniques.
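    The heat-transfer coupling described above rests on solving the energy equation between liquid lava, growing crust, and the surroundings. As a toy illustration only (not LavaSIM's actual solver or parameters), the sketch below advances a 1D explicit finite-difference conduction step for a cooling lava layer.

```python
import numpy as np

# Toy 1D explicit conduction in a cooling lava layer (illustrative values only)
kappa = 5e-7           # thermal diffusivity, m^2/s
dx, dt = 0.05, 1000.0  # grid spacing (m) and time step (s); dt < dx^2/(2*kappa)
n_steps = 500

T = np.full(100, 1400.0)   # interior initially at magmatic temperature (K)
T[0] = 300.0               # surface held at ambient temperature

for _ in range(n_steps):
    # Forward-time, centered-space update of the interior nodes
    T[1:-1] += kappa * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])

print(f"temperature 0.5 m below the surface: {T[10]:.0f} K")
```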

  19. OSIRIS - an object-oriented parallel 3D PIC code for modeling laser and particle beam-plasma interaction

    NASA Astrophysics Data System (ADS)

    Hemker, Roy

    1999-11-01

    Advances in computational speed now make it possible to perform full 3D PIC simulations of laser-plasma and beam-plasma interactions, but at the same time the increased complexity of these problems makes it necessary to apply modern approaches such as object-oriented programming to the development of simulation codes. We report here on our progress in developing an object-oriented parallel 3D PIC code using Fortran 90. In its current state the code contains algorithms for 1D, 2D, and 3D simulations in Cartesian coordinates and for 2D cylindrically symmetric geometry. For all of these algorithms the code allows for a moving simulation window and arbitrary domain decomposition in any number of dimensions. Recent 3D simulation results on the propagation of intense laser and electron beams through plasmas will be presented.
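    At the heart of any electromagnetic PIC loop such as the one described here is a particle push; the Boris scheme is the de facto standard and is sketched below in plain Python purely to illustrate the algorithm (OSIRIS itself is written in Fortran 90, and its internals are not reproduced here).

```python
import numpy as np

def boris_push(x, v, E, B, q_m, dt):
    """One Boris step for a single particle (non-relativistic form).

    x, v, E, B are 3-vectors; q_m is the charge-to-mass ratio; dt the time step.
    Two electric half-kicks bracket a pure magnetic rotation of the velocity.
    """
    v_minus = v + 0.5 * q_m * dt * E            # first electric half-kick
    t = 0.5 * q_m * dt * B                      # rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)     # rotated velocity
    v_new = v_plus + 0.5 * q_m * dt * E         # second electric half-kick
    x_new = x + v_new * dt                      # leapfrog position update
    return x_new, v_new

x, v = np.zeros(3), np.array([1.0e5, 0.0, 0.0])
E, B = np.zeros(3), np.array([0.0, 0.0, 1.0e-2])
print(boris_push(x, v, E, B, q_m=-1.76e11, dt=1.0e-10))
```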

  20. 5D Tempest simulations of kinetic edge turbulence

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.; Xiong, Z.; Cohen, B. I.; Cohen, R. H.; Dorr, M. R.; Hittinger, J. A.; Kerbel, G. D.; Nevins, W. M.; Rognlien, T. D.; Umansky, M. V.; Qin, H.

    2006-10-01

    Results are presented from the development and application of TEMPEST, a nonlinear five-dimensional (3d2v) gyrokinetic continuum code. The simulation results and theoretical analysis include studies of H-mode edge plasma neoclassical transport and turbulence in real divertor geometry and its relationship to plasma flow generation with zero external momentum input, including the important orbit-squeezing effect due to the large electric-field flow shear in the edge. In order to extend the code to 5D, we have formulated a set of fully nonlinear electrostatic gyrokinetic equations and a fully nonlinear gyrokinetic Poisson's equation that are valid for both neoclassical and turbulence simulations. Our 5D gyrokinetic code is built on the 4D version of the TEMPEST neoclassical code, with an extension to a fifth dimension in the binormal direction. The code is able to simulate either a full torus or a toroidal segment. Progress on performing 5D turbulence simulations will be reported.

  1. Optical, electrical and elastic properties of ferroelectric domain walls in lithium niobate and lithium tantalate

    NASA Astrophysics Data System (ADS)

    Kim, Sungwon

    Ferroelectric LiNbO3 and LiTaO3 crystals have developed over the last 50 years into key materials for integrated and nonlinear optics due to their large electro-optic and nonlinear optical coefficients and a broad transparency range from 0.4 μm to 4.5 μm. Applications include high-speed optical modulation and switching in the 40 GHz range, second harmonic generation, optical parametric amplification, pulse compression and so on. Ferroelectric domain microengineering has led to electro-optic scanners, dynamic focusing lenses, total internal reflection switches, and quasi-phase-matched (QPM) frequency doublers. Most of these applications have so far been based on non-stoichiometric compositions of these crystals. Recent breakthroughs in crystal growth have, however, opened up an entirely new window of opportunity from both scientific and technological viewpoints. The growth of stoichiometric-composition crystals has led to the discovery of many fascinating effects arising from the presence or absence of atomic defects, such as order-of-magnitude changes in coercive fields, internal fields, domain backswitching and stabilization phenomena. On the nanoscale, unexpected features such as the presence of wide regions of optical contrast and strain have been discovered at 180° domain walls. Such strong influence of small amounts of nonstoichiometric defects on material properties has led to new device applications, particularly those involving domain patterning and shaping, such as QPM devices in thick bulk crystals and improved photorefractive-damage compositions. The central focus of this dissertation is to explore the role of nonstoichiometry and its precise influence on macroscale and nanoscale properties in lithium niobate and tantalate. Macroscale properties are studied using a combination of in-situ and high-speed electro-optic imaging microscopy and electrical switching experiments. Local static and dynamic strain properties at individual domain walls are studied using X-ray synchrotron imaging with and without in-situ electric fields. Nanoscale optical properties are studied using Near-Field Scanning Optical Microscopy (NSOM). Finite-Difference Time-Domain (FDTD) codes, Beam Propagation Method (BPM) codes and X-ray tracing codes have been developed to successfully simulate NSOM images and X-ray topography images in order to extract the local optical and strain properties, respectively. A 3-D ferroelectric domain simulation code based on Time-Dependent Ginzburg-Landau (TDGL) theory and group theory has been developed to understand the nature of these local wall strains and the preferred wall orientations. By combining these experimental and numerical tools, we have also proposed a defect-dipole model and a mechanism by which the defects interact with the domain walls. This thesis has thus built a more comprehensive picture of the influence of defects on domain walls at the nanoscale and macroscale, and raises new scientific questions about the exact nature of domain wall-defect interactions. Beyond the specific problem of ferroelectrics, the experimental and simulation tools developed in this thesis will have wider application in the area of materials science.

  2. The Space Telescope SI C&DH system. [Scientific Instrument Control and Data Handling Subsystem

    NASA Technical Reports Server (NTRS)

    Gadwal, Govind R.; Barasch, Ronald S.

    1990-01-01

    The Hubble Space Telescope Scientific Instrument Control and Data Handling Subsystem (SI C&DH) is designed to interface with five scientific instruments of the Space Telescope to provide ground and autonomous control and to collect health and status information using the Standard Telemetry and Command Components (STACC) multiplex data bus. It also formats high-throughput science data into packets. The packetized data is interleaved, Reed-Solomon encoded for error correction, and pseudo-random encoded. An inner convolutional code combined with the outer Reed-Solomon code provides excellent error-correction capability. The subsystem is designed with the capacity for orbital replacement in order to meet a mission life of fifteen years. The spacecraft computer and the SI C&DH computer coordinate the activities of the spacecraft and the scientific instruments to achieve the mission objectives.
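    The downlink scheme described here concatenates Reed-Solomon block coding with convolutional coding and interleaves the packetized data; interleaving spreads burst errors across many code blocks so that the outer code can correct them. The snippet below is a generic block-interleaver sketch for illustration only; it is not flight code, and the interleaving depth and message are arbitrary.

```python
def interleave(data: bytes, depth: int) -> bytes:
    """Simple block interleaver: write row-by-row, read column-by-column.

    Spreading adjacent bytes 'depth' positions apart means a burst of channel
    errors lands in different code blocks after de-interleaving.
    """
    rows = [data[i:i + depth] for i in range(0, len(data), depth)]
    return bytes(row[c] for c in range(depth) for row in rows if c < len(row))

def deinterleave(data: bytes, depth: int, original_len: int) -> bytes:
    """Invert the column-major read-out performed by interleave()."""
    full_rows, rem = divmod(original_len, depth)
    out = bytearray(original_len)
    idx = 0
    for c in range(depth):
        col_len = full_rows + (1 if c < rem else 0)
        for r in range(col_len):
            out[r * depth + c] = data[idx]
            idx += 1
    return bytes(out)

msg = b"HUBBLE SI C&DH SCIENCE DATA PACKET"
assert deinterleave(interleave(msg, 5), 5, len(msg)) == msg
```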

  3. Quality improvement utilizing in-situ simulation for a dual-hospital pediatric code response team.

    PubMed

    Yager, Phoebe; Collins, Corey; Blais, Carlene; O'Connor, Kathy; Donovan, Patricia; Martinez, Maureen; Cummings, Brian; Hartnick, Christopher; Noviski, Natan

    2016-09-01

    Given the rarity of in-hospital pediatric emergency events, identification of gaps and inefficiencies in the code response can be difficult. In-situ, simulation-based medical education programs can identify unrecognized systems-based challenges. We hypothesized that developing an in-situ, simulation-based pediatric emergency response program would identify latent inefficiencies in a complex, dual-hospital pediatric code response system and allow rapid intervention testing to improve performance before implementation at an institutional level. Pediatric leadership from two hospitals with a shared pediatric code response team employed the Institute for Healthcare Improvement's (IHI) Breakthrough Model for Collaborative Improvement to design a program consisting of Plan-Do-Study-Act cycles occurring in a simulated environment. The objectives of the program were to 1) identify inefficiencies in our pediatric code response; 2) correlate to current workflow; 3) employ an iterative process to test quality improvement interventions in a safe environment; and 4) measure performance before actual implementation at the institutional level. Twelve dual-hospital, in-situ, simulated, pediatric emergencies occurred over one year. The initial simulated event allowed identification of inefficiencies including delayed provider response, delayed initiation of cardiopulmonary resuscitation (CPR), and delayed vascular access. These gaps were linked to process issues including unreliable code pager activation, slow elevator response, and lack of responder familiarity with layout and contents of code cart. From first to last simulation with multiple simulated process improvements, code response time for secondary providers coming from the second hospital decreased from 29 to 7 min, time to CPR initiation decreased from 90 to 15 s, and vascular access obtainment decreased from 15 to 3 min. Some of these simulated process improvements were adopted into the institutional response while others continue to be trended over time for evidence that observed changes represent a true new state of control. Utilizing the IHI's Breakthrough Model, we developed a simulation-based program to 1) successfully identify gaps and inefficiencies in a complex, dual-hospital, pediatric code response system and 2) provide an environment in which to safely test quality improvement interventions before institutional dissemination. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  4. Slow Decomposition of Silicone Rubber.

    DTIC Science & Technology

    1982-09-01


  5. Applications of Functional Analytic and Martingale Methods to Problems in Queueing Network Theory.

    DTIC Science & Technology

    1983-05-14

    Abstract not fully recoverable from the scanned report documentation page. Recoverable fragments identify a scientific report on Air Force Grant #82-0167 (Principal Investigator: Professor Walter A. Rosenkrantz), with publications including a calculation of the Laplace transform used to determine whether a protocol for accessing a communications channel is stable, and Report No. 3 under AFOSR 82-0167 concerning the slotted ALOHA multi-access protocol.

  6. Bibliography on Metrication, January 1977 to August 1989

    DTIC Science & Technology

    1990-08-01

    Abstract not recoverable; the record consists of a name-index fragment and report documentation fields identifying the Redstone Scientific Information Center (AMSMI-RD-CS-R), U.S. Army Missile Command, Redstone Arsenal, AL 35898-5241.

  7. MHD Simulation of Magnetic Nozzle Plasma with the NIMROD Code: Applications to the VASIMR Advanced Space Propulsion Concept

    NASA Astrophysics Data System (ADS)

    Tarditi, Alfonso G.; Shebalin, John V.

    2002-11-01

    A simulation study with the NIMROD code [1] is being carried out to investigate the efficiency of the thrust generation process and the properties of the plasma detachment in a magnetic nozzle. In the simulation, hot plasma is injected into the magnetic nozzle, modeled as a 2D, axi-symmetric domain. NIMROD has two-fluid, 3D capabilities, but the present runs are being conducted within the MHD, 2D approximation. As the plasma travels through the magnetic field, part of its thermal energy is converted into longitudinal kinetic energy along the axis of the nozzle. The plasma eventually detaches from the magnetic field at a certain distance from the nozzle throat, where the kinetic energy becomes larger than the magnetic energy. Preliminary NIMROD 2D runs have been benchmarked with a particle trajectory code, showing satisfactory results [2]. Further testing is reported here, with emphasis on the analysis of the diffusion rate across the field lines and of the overall nozzle efficiency. These simulation runs are specifically designed to obtain comparisons with laboratory measurements of the VASIMR experiment, by looking at the evolution of the radial plasma density and temperature profiles in the nozzle. VASIMR (Variable Specific Impulse Magnetoplasma Rocket, [3]) is an advanced space propulsion concept currently under experimental development at the Advanced Space Propulsion Laboratory, NASA Johnson Space Center. A plasma (typically ionized hydrogen or helium) is generated by an RF (helicon) discharge and heated by an Ion Cyclotron Resonance Heating antenna. The heated plasma is then guided into a magnetic nozzle to convert the thermal plasma energy into effective thrust. The VASIMR system has no electrodes, and a solenoidal magnetic field produced by an asymmetric mirror configuration ensures magnetic insulation of the plasma from the material surfaces. By powering the plasma source and the heating antenna at different levels it is possible to vary the thrust-to-specific-impulse ratio smoothly while maintaining maximum power utilization. [1] http://www.nimrodteam.org [2] A. V. Ilin et al., Proc. 40th AIAA Aerospace Sciences Meeting, Reno, NV, Jan. 2002 [3] F. R. Chang-Diaz, Scientific American, p. 90, Nov. 2000
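    The detachment criterion mentioned above compares the plasma's directed kinetic energy density with the local magnetic energy density; detachment is expected roughly where the ratio exceeds unity. The snippet below is an illustrative calculation of that ratio along a nozzle axis using made-up profiles, not output from NIMROD or the VASIMR experiment.

```python
import numpy as np

MU0 = 4.0e-7 * np.pi   # vacuum permeability, H/m
M_I = 1.67e-27         # hydrogen ion mass, kg

z = np.linspace(0.0, 2.0, 200)                 # axial distance, m
n = 1.0e19 * np.exp(-z / 0.5)                  # made-up density profile, m^-3
v = 2.0e4 + 3.0e4 * (1.0 - np.exp(-z / 0.5))   # made-up axial speed, m/s
B = 0.5 * np.exp(-z / 0.3)                     # made-up nozzle field, T

kinetic = 0.5 * M_I * n * v**2                 # directed kinetic energy density
magnetic = B**2 / (2.0 * MU0)                  # magnetic energy density
ratio = kinetic / magnetic

detach_idx = np.argmax(ratio > 1.0)            # first axial point where ratio > 1
print(f"estimated detachment at z = {z[detach_idx]:.2f} m")
```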

  8. Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, S R; Bihari, B L; Salari, K

    As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the test problems: Su Olson non-equilibrium radiation diffusion, Sod shock tube, Sedov point blast modeled with shock hydrodynamics, and Noh implosion.
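    Verification exercises like those listed above usually quantify agreement with an exact solution by computing an observed order of convergence from errors on successively refined meshes, p = ln(e_coarse / e_fine) / ln(r), where r is the refinement ratio. The helper below is a generic sketch of that calculation; the error values in the example are invented.

```python
import math

def observed_order(e_coarse, e_fine, refinement_ratio=2.0):
    """Observed order of accuracy from errors on two grids that differ by
    a uniform refinement ratio: p = ln(e_coarse / e_fine) / ln(r)."""
    return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

# Invented L2 errors against an exact solution (e.g. a Sod shock tube profile)
print(observed_order(4.0e-3, 1.1e-3))   # close to 2, i.e. roughly second order
```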

  9. Flux-driven turbulence GDB simulations of the IWL Alcator C-Mod L-mode edge compared with experiment

    NASA Astrophysics Data System (ADS)

    Francisquez, Manaure; Zhu, Ben; Rogers, Barrett

    2017-10-01

    Prior to predicting confinement-regime transitions in tokamaks, one may need an accurate description of L-mode profiles and turbulence properties. These features determine the heat-flux width upon which wall integrity depends, a topic of major interest for research in aid of ITER. To this end our work uses the GDB model to simulate the Alcator C-Mod edge and contributes support for its use in studying critical edge phenomena in current and future tokamaks. We carried out 3D electromagnetic flux-driven two-fluid turbulence simulations of inner wall limited (IWL) C-Mod shots spanning closed and open flux surfaces. These simulations are compared with gas puff imaging (GPI) and mirror Langmuir probe (MLP) data, examining global features and statistical properties of turbulent dynamics. GDB reproduces important qualitative aspects of the C-Mod edge regarding global density and temperature profiles, within reasonable margins, and although the statistics of the simulated turbulence follow similar quantitative trends, questions remain about the code's difficulty in exactly predicting quantities like the autocorrelation time. A proposed breakpoint in the near-SOL pressure, and the posited separation between drift and ballooning dynamics it represents, are also examined. This work was supported by DOE-SC-0010508. This research used resources of the National Energy Research Scientific Computing Center (NERSC).

  10. SAFETY IN THE DESIGN OF SCIENCE LABORATORIES AND BUILDING CODES.

    ERIC Educational Resources Information Center

    HOROWITZ, HAROLD

    THE DESIGN OF COLLEGE AND UNIVERSITY BUILDINGS USED FOR SCIENTIFIC RESEARCH AND EDUCATION IS DISCUSSED IN TERMS OF LABORATORY SAFETY AND BUILDING CODES AND REGULATIONS. MAJOR TOPIC AREAS ARE--(1) SAFETY RELATED DESIGN FEATURES OF SCIENCE LABORATORIES, (2) LABORATORY SAFETY AND BUILDING CODES, AND (3) EVIDENCE OF UNSAFE DESIGN. EXAMPLES EMPHASIZE…

  11. Simulation of Weld Mechanical Behavior to Include Welding-Induced Residual Stress and Distortion: Coupling of SYSWELD and Abaqus Codes

    DTIC Science & Technology

    2015-11-01

    A method for translating welding-induced residual stresses and distortions from weld simulations performed in the SYSWELD software code into structural Finite Element Analysis (FEA) simulations performed in the Abaqus FEA code is presented. The translation of these results is accomplished using a newly developed Python script.
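    Translating a residual stress field between two dissimilar meshes typically requires mapping nodal or integration-point values from the source mesh onto the target mesh. The sketch below shows one common approach, nearest-neighbor interpolation with a k-d tree; it is a generic illustration and is not the Python script developed in the report.

```python
import numpy as np
from scipy.spatial import cKDTree

def map_field(source_points, source_values, target_points):
    """Map a nodal field (e.g. a residual stress component) from a source
    mesh to a target mesh by nearest-neighbor lookup."""
    tree = cKDTree(source_points)
    _, idx = tree.query(target_points)
    return source_values[idx]

# Invented example: 1000 source nodes in a unit cube, 10 target nodes
rng = np.random.default_rng(0)
src_xyz = rng.random((1000, 3))
src_sigma_xx = 200.0 * (src_xyz[:, 0] - 0.5)   # fake stress field, MPa
tgt_xyz = rng.random((10, 3))
print(map_field(src_xyz, src_sigma_xx, tgt_xyz))
```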

  12. An Open-Source Sandbox for Increasing the Accessibility of Functional Programming to the Bioinformatics and Scientific Communities

    PubMed Central

    Fenwick, Matthew; Sesanker, Colbert; Schiller, Martin R.; Ellis, Heidi JC; Hinman, M. Lee; Vyas, Jay; Gryk, Michael R.

    2012-01-01

    Scientists are continually faced with the need to express complex mathematical notions in code. The renaissance of functional languages such as LISP and Haskell is often credited to their ability to implement complex data operations and mathematical constructs in an expressive and natural idiom. The slow adoption of functional computing in the scientific community does not, however, reflect the congeniality of these fields. Unfortunately, the learning curve for adopting functional programming techniques is steeper than that for the languages more traditional in the scientific community, such as Python and Java, and this is partially due to the relative sparseness of available learning resources. To fill this gap, we demonstrate and provide applied, scientifically substantial examples of functional programming. We present a multi-language source-code repository for software integration and algorithm development, which focuses generally on the fields of machine learning, data processing, and bioinformatics. We encourage scientists who are interested in learning the basics of functional programming to adopt, reuse, and learn from these examples. The source code is available at: https://github.com/CONNJUR/CONNJUR-Sandbox (see also http://www.connjur.org). PMID:25328913
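    As a flavor of the functional idiom the authors advocate, here is a tiny Python example composing pure functions to compute a statistic without mutating shared state; it is illustrative only and is not drawn from the CONNJUR-Sandbox repository.

```python
from functools import reduce

# Pure functions: no shared mutable state, results depend only on the inputs.
def square(x):
    return x * x

def add(a, b):
    return a + b

def root_mean_square(samples):
    """RMS via map/reduce: fold the squared samples into a sum, then average."""
    total = reduce(add, map(square, samples), 0.0)
    return (total / len(samples)) ** 0.5

print(root_mean_square([3.0, 4.0, 12.0]))  # -> 7.5055...
```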

  13. An Open-Source Sandbox for Increasing the Accessibility of Functional Programming to the Bioinformatics and Scientific Communities.

    PubMed

    Fenwick, Matthew; Sesanker, Colbert; Schiller, Martin R; Ellis, Heidi Jc; Hinman, M Lee; Vyas, Jay; Gryk, Michael R

    2012-01-01

    Scientists are continually faced with the need to express complex mathematical notions in code. The renaissance of functional languages such as LISP and Haskell is often credited to their ability to implement complex data operations and mathematical constructs in an expressive and natural idiom. The slow adoption of functional computing in the scientific community does not, however, reflect the congeniality of these fields. Unfortunately, the learning curve for adopting functional programming techniques is steeper than that for the languages more traditional in the scientific community, such as Python and Java, and this is partially due to the relative sparseness of available learning resources. To fill this gap, we demonstrate and provide applied, scientifically substantial examples of functional programming. We present a multi-language source-code repository for software integration and algorithm development, which focuses generally on the fields of machine learning, data processing, and bioinformatics. We encourage scientists who are interested in learning the basics of functional programming to adopt, reuse, and learn from these examples. The source code is available at: https://github.com/CONNJUR/CONNJUR-Sandbox (see also http://www.connjur.org).

  14. Scientific journals and conflict of interest disclosure: what progress has been made?

    PubMed

    Ruff, Kathleen

    2015-05-30

    The article addresses the failure of the scientific community to create an effective mechanism to protect the integrity of the scientific literature from improper influence by vested interests. The seriousness of this threat is increasingly recognized. Scientists willing to distort scientific research to serve vested interests receive millions of dollars for their services. Organizations such as the International Committee of Medical Journal Editors, the World Association of Medical Editors and the Committee on Publication Ethics (COPE) have launched initiatives to establish international standards for Conflict of Interest (COI) disclosure. COPE requires its 7,000 member journals to comply with its Code of Conduct for Journal Editors. While these initiatives are encouraging, they are internal educational endeavours only. Five examples are given showing failure of COPE member journals to comply with COPE's Code of Conduct. While COPE offers a complaint process, it involves only discussion and voluntary compliance. COPE neither polices nor enforces its Code. Instead of the current feeble, un-resourced process, which delivers neither transparency nor accountability, the article proposes the creation of a mechanism that will employ specific, effective measures to address contraventions of COI disclosure requirements.

  15. A Modular Environment for Geophysical Inversion and Run-time Autotuning using Heterogeneous Computing Systems

    NASA Astrophysics Data System (ADS)

    Myre, Joseph M.

    Heterogeneous computing systems have recently come to the forefront of the High-Performance Computing (HPC) community's interest. HPC computer systems that incorporate special-purpose accelerators, such as Graphics Processing Units (GPUs), are said to be heterogeneous. Large-scale heterogeneous computing systems have consistently ranked highly on the Top500 list since the beginning of the heterogeneous computing trend. By using heterogeneous computing systems that consist of both general-purpose processors and special-purpose accelerators, the speed and problem size of many simulations could be dramatically increased. Ultimately this results in enhanced simulation capabilities that allow, in some cases for the first time, the execution of parameter space and uncertainty analyses, model optimizations, and other inverse modeling techniques that are critical for scientific discovery and engineering analysis. However, simplifying the usage and optimization of codes for heterogeneous computing systems remains a challenge. This is particularly true for scientists and engineers for whom understanding HPC architectures and undertaking performance analysis may not be primary research objectives. To enable scientists and engineers to remain focused on their primary research objectives, a modular environment for geophysical inversion and run-time autotuning on heterogeneous computing systems is presented. This environment is composed of three major components: 1) CUSH, a framework for reducing the complexity of programming heterogeneous computer systems; 2) geophysical inversion routines which can be used to characterize physical systems; and 3) run-time autotuning routines designed to determine configurations of heterogeneous computing systems in an attempt to maximize the performance of scientific and engineering codes. Using three case studies, a lattice-Boltzmann method, a non-negative least squares inversion, and a finite-difference fluid flow method, it is shown that this environment provides scientists and engineers with means to reduce the programmatic complexity of their applications, to perform geophysical inversions for characterizing physical systems, and to determine high-performing run-time configurations of heterogeneous computing systems using a run-time autotuner.
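    Run-time autotuning of the kind described here can be as simple as timing a kernel over a set of candidate configurations and keeping the fastest. The sketch below does exactly that for a placeholder kernel; the configuration space and the kernel are invented for illustration, and a real autotuner for GPU codes would instead sweep parameters such as thread-block sizes.

```python
import time

def run_kernel(block_size, n=2_000_000):
    """Placeholder compute kernel whose performance depends on a tunable
    'block_size' parameter (here it just controls a chunked summation)."""
    total = 0.0
    data = range(n)
    for start in range(0, n, block_size):
        total += sum(data[start:start + block_size])
    return total

def autotune(candidates):
    """Time each candidate configuration and return the fastest one."""
    timings = {}
    for block in candidates:
        t0 = time.perf_counter()
        run_kernel(block)
        timings[block] = time.perf_counter() - t0
    return min(timings, key=timings.get), timings

best, timings = autotune([256, 1024, 4096, 16384])
print("best block size:", best)
```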

  16. Python/Lua Benchmarks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Busby, L.

    This is an adaptation of the pre-existing Scimark benchmark code to a variety of Python and Lua implementations. It also measures performance of the Fparser expression parser and C and C++ code on a variety of simple scientific expressions.
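    The benchmark described here times the evaluation of simple scientific expressions across language implementations. A minimal Python-side analogue using the standard timeit module is sketched below; the expression and repetition counts are arbitrary, and this is not the adapted Scimark code itself.

```python
import timeit

# Time a simple scientific expression, as a stand-in for one benchmark kernel.
stmt = "sum(math.sin(0.001 * i) ** 2 for i in range(10_000))"
t = timeit.timeit(stmt, setup="import math", number=100)
print(f"{t / 100 * 1e3:.3f} ms per evaluation")
```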

  17. Numerical simulation of experiments in the Giant Planet Facility

    NASA Technical Reports Server (NTRS)

    Green, M. J.; Davy, W. C.

    1979-01-01

    Utilizing a series of existing computer codes, ablation experiments in the Giant Planet Facility are numerically simulated. Of primary importance is the simulation of the low Mach number shock layer that envelops the test model. The RASLE shock-layer code, used in the Jupiter entry probe heat-shield design, is adapted to the experimental conditions. RASLE predictions for radiative and convective heat fluxes are in good agreement with calorimeter measurements. In simulating carbonaceous ablation experiments, the RASLE code is coupled directly with the CMA material response code. For the graphite models, predicted and measured recessions agree very well. Predicted recession for the carbon phenolic models is 50% higher than that measured. This is the first time codes used for the Jupiter probe design have been compared with experiments.

  18. Hypothesis testing of scientific Monte Carlo calculations.

    PubMed

    Wallerberger, Markus; Gull, Emanuel

    2017-11-01

    The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitate rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.
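
    As a concrete illustration of the technique described in the abstract, the sketch below applies a two-sided z-test to batch means of a simple Monte Carlo estimator with a known reference value. It is a generic example of hypothesis testing for stochastic codes, not the authors' test suite.

    ```python
    import math
    import random

    def mc_estimate_pi(n):
        """Plain Monte Carlo estimate of pi by sampling the unit square."""
        hits = sum(1 for _ in range(n)
                   if random.random()**2 + random.random()**2 < 1.0)
        return 4.0 * hits / n

    def z_test(samples, reference):
        """Two-sided z-test that independent estimator samples agree with `reference`."""
        n = len(samples)
        mean = sum(samples) / n
        var = sum((x - mean) ** 2 for x in samples) / (n - 1)
        stderr = math.sqrt(var / n)
        z = (mean - reference) / stderr
        p_value = math.erfc(abs(z) / math.sqrt(2.0))
        return z, p_value

    if __name__ == "__main__":
        batches = [mc_estimate_pi(20_000) for _ in range(30)]
        z, p = z_test(batches, math.pi)
        print(f"z = {z:.2f}, p = {p:.3f}")
        # A tiny p-value (say < 0.01) would flag a biased estimator or a bug.
    ```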

  19. Hypothesis testing of scientific Monte Carlo calculations

    NASA Astrophysics Data System (ADS)

    Wallerberger, Markus; Gull, Emanuel

    2017-11-01

    The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitate rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.

  20. NOAA draft scientific integrity policy: Comment period open through 20 August

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2011-08-01

    The National Oceanic and Atmospheric Administration (NOAA) is aiming to finalize its draft scientific integrity policy possibly by the end of the year, Larry Robinson, NOAA assistant secretary for conservation and management, indicated during a 28 July teleconference. The policy “is key to fostering an environment where science is encouraged, nurtured, respected, rewarded, and protected,” Robinson said, adding that the agency's comment period for the draft policy, which was released on 16 June, ends on 20 August. “Science underpins all that NOAA does. This policy is one piece of a broader effort to strengthen NOAA science,” Robinson said, noting that the draft “represents the first ever scientific integrity policy for NOAA. Previously, our policy only addressed research misconduct and focused on external grants. What's new about this policy is that it establishes NOAA's principles for scientific integrity, a scientific code of conduct, and a code of ethics for science supervision and management.”

  1. What makes computational open source software libraries successful?

    NASA Astrophysics Data System (ADS)

    Bangerth, Wolfgang; Heister, Timo

    2013-01-01

    Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects.

  2. Extreme scale multi-physics simulations of the tsunamigenic 2004 Sumatra megathrust earthquake

    NASA Astrophysics Data System (ADS)

    Ulrich, T.; Gabriel, A. A.; Madden, E. H.; Wollherr, S.; Uphoff, C.; Rettenberger, S.; Bader, M.

    2017-12-01

    SeisSol (www.seissol.org) is an open-source software package based on an arbitrary high-order derivative Discontinuous Galerkin method (ADER-DG). It solves spontaneous dynamic rupture propagation on pre-existing fault interfaces according to non-linear friction laws, coupled to seismic wave propagation with high-order accuracy in space and time (minimal dispersion errors). SeisSol exploits unstructured meshes to account for complex geometries, e.g. high-resolution topography and bathymetry, 3D subsurface structure, and fault networks. We present the largest (1500 km of faults) and longest (500 s) dynamic rupture simulation to date, modeling the 2004 Sumatra-Andaman earthquake. We demonstrate the need for end-to-end optimization and petascale performance of scientific software to realize realistic simulations on the extreme scales of subduction zone earthquakes: considering the full complexity of subduction zone geometries leads inevitably to huge differences in element sizes. The main code improvements include a cache-aware wave propagation scheme and optimizations of the dynamic rupture kernels using code generation. In addition, a novel clustered local-time-stepping scheme for dynamic rupture has been established. Finally, asynchronous output has been implemented to overlap I/O and compute time. We resolve the frictional sliding process on the curved megathrust and a system of splay faults, as well as the seismic wave field and seafloor displacement with frequency content up to 2.2 Hz. We validate the scenario by geodetic, seismological and tsunami observations. The resulting rupture dynamics shed new light on the activation and importance of splay faults.

  3. Implementing Molecular Dynamics on Hybrid High Performance Computers - Three-Body Potentials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, W Michael; Yamada, Masako

    The use of coprocessors or accelerators such as graphics processing units (GPUs) has become popular in scientific computing applications due to their low cost, impressive floating-point capabilities, high memory bandwidth, and low electrical power requirements. Hybrid high-performance computers, defined as machines with nodes containing more than one type of floating-point processor (e.g. CPU and GPU), are now becoming more prevalent due to these advantages. Although there has been extensive research into methods to efficiently use accelerators to improve the performance of molecular dynamics (MD) employing pairwise potential energy models, little is reported in the literature for models that include many-body effects. 3-body terms are required for many popular potentials such as MEAM, Tersoff, REBO, AIREBO, Stillinger-Weber, Bond-Order Potentials, and others. Because the per-atom simulation times are much higher for models incorporating 3-body terms, there is a clear need for efficient algorithms usable on hybrid high performance computers. Here, we report a shared-memory force-decomposition for 3-body potentials that avoids memory conflicts to allow for a deterministic code with substantial performance improvements on hybrid machines. We describe modifications necessary for use in distributed memory MD codes and show results for the simulation of water with Stillinger-Weber on the hybrid Titan supercomputer. We compare performance of the 3-body model to the SPC/E water model when using accelerators. Finally, we demonstrate that our approach can attain a speedup of 5.1 with acceleration on Titan for production simulations to study water droplet freezing on a surface.

  4. Computer Simulation of the VASIMR Engine

    NASA Technical Reports Server (NTRS)

    Garrison, David

    2005-01-01

    The goal of this project is to develop a magneto-hydrodynamic (MHD) computer code for simulation of the VASIMR engine. This code is designed to be easy to modify and use. We achieve this using the Cactus framework, a system originally developed for research in numerical relativity. Since its release, Cactus has become an extremely powerful and flexible open source framework. The development of the code will be done in stages, starting with a basic fluid dynamic simulation and working towards a more complex MHD code. Once developed, this code can be used by students and researchers in order to further test and improve the VASIMR engine.

  5. Aerodynamic Analysis of the M33 Projectile Using the CFX Code

    DTIC Science & Technology

    2011-12-01

    The M33 projectile has been analyzed using the ANSYS CFX code, which is based on the numerical solution of the full Navier-Stokes equations. Simulation data were obtained using the CFX code. ANSYS CFX is a commercial CFD program used to simulate fluid flow in a variety of applications such as gas turbines.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strout, Michelle

    Programming parallel machines is fraught with difficulties: the obfuscation of algorithms due to implementation details such as communication and synchronization, the need for transparency between language constructs and performance, the difficulty of performing program analysis to enable automatic parallelization techniques, and the existence of important "dusty deck" codes. The SAIMI project developed abstractions that enable the orthogonal specification of algorithms and implementation details within the context of existing DOE applications. The main idea is to enable the injection of small programming models, such as expressions involving transcendental functions, polyhedral iteration spaces with sparse constraints, and task graphs, into full programs through the use of pragmas. These smaller, more restricted programming models enable orthogonal specification of many implementation details, such as how to map the computation onto parallel processors, how to schedule the computation, and how to allocate storage for the computation. At the same time, these small programming models enable the expression of the most computationally intense and communication-heavy portions of many scientific simulations. The ability to orthogonally manipulate the implementation of such computations will significantly ease performance programming efforts and expose transformation possibilities and parameters to automated approaches such as autotuning. At Colorado State University, the SAIMI project was supported through DOE grant DE-SC3956 from April 2010 through August 2015. The SAIMI project has contributed a number of important results to programming abstractions that enable the orthogonal specification of implementation details in scientific codes. This final report summarizes the research that was funded by the SAIMI project.

  7. Muon simulation codes MUSIC and MUSUN for underground physics

    NASA Astrophysics Data System (ADS)

    Kudryavtsev, V. A.

    2009-03-01

    The paper describes two Monte Carlo codes dedicated to muon simulations: MUSIC (MUon SImulation Code) and MUSUN (MUon Simulations UNderground). MUSIC is a package for muon transport through matter. It is particularly useful for propagating muons through large thicknesses of rock or water, for instance from the surface down to an underground or underwater laboratory. MUSUN is designed to use the results of muon transport through rock or water to generate muons in or around an underground laboratory, taking into account their energy spectrum and angular distribution.

  8. Simulation Studies for Inspection of the Benchmark Test with PATRASH

    NASA Astrophysics Data System (ADS)

    Shimosaki, Y.; Igarashi, S.; Machida, S.; Shirakata, M.; Takayama, K.; Noda, F.; Shigaki, K.

    2002-12-01

    In order to delineate the halo-formation mechanisms in a typical FODO lattice, a 2-D simulation code PATRASH (PArticle TRAcking in a Synchrotron for Halo analysis) has been developed. The electric field originating from the space charge is calculated by the Hybrid Tree code method. Benchmark tests utilizing the three simulation codes ACCSIM, PATRASH, and SIMPSONS were carried out. The results have been confirmed to be in fair agreement with each other. The details of the PATRASH simulation are discussed with some examples.

  9. Validation: Codes to compare simulation data to various observations

    NASA Astrophysics Data System (ADS)

    Cohn, J. D.

    2017-02-01

    Validation provides codes to compare simulated data against several observations: simulated stellar mass and star formation rate, the simulated stellar mass function against the observed stellar mass function from PRIMUS or SDSS-GALEX in several redshift bins from 0.01 to 1.0, and the simulated B-band luminosity function against the observed stellar mass function. It also creates plots for various attributes, including stellar mass functions and stellar mass to halo mass. These codes can model predictions (in some cases alongside observational data) to test other mock catalogs.

  10. Simulation Data as Data Streams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdulla, G; Arrighi, W; Critchlow, T

    2003-11-18

    Computational or scientific simulations are increasingly being applied to solve a variety of scientific problems. Domains such as astrophysics, engineering, chemistry, biology, and environmental studies are benefiting from this important capability. Simulations, however, produce enormous amounts of data that need to be analyzed and understood. In this overview paper, we describe scientific simulation data, its characteristics, and the way scientists generate and use the data. We then compare and contrast simulation data to data streams. Finally, we describe our approach to analyzing simulation data, present the AQSim (Ad-hoc Queries for Simulation data) system, and discuss some of the challenges that result from handling this kind of data.

  11. NASA One-Dimensional Combustor Simulation--User Manual for S1D_ML

    NASA Technical Reports Server (NTRS)

    Stueber, Thomas J.; Paxson, Daniel E.

    2014-01-01

    The work presented in this paper promotes research leading to a closed-loop control system to actively suppress thermo-acoustic instabilities. To serve as a model for such a closed-loop control system, a one-dimensional combustor simulation has been written using MATLAB software tools. This MATLAB-based process is similar to a precursor one-dimensional combustor simulation that was formatted as FORTRAN 77 source code. The previous simulation process requires modification of the FORTRAN 77 source code, compiling, and linking when creating a new combustor simulation executable file. The MATLAB-based simulation does not require making changes to the source code, recompiling, or linking. Furthermore, the MATLAB-based simulation can be run from script files within the MATLAB environment or with a compiled copy of the executable file running in the Command Prompt window without requiring a licensed copy of MATLAB. This report presents a general simulation overview. Details regarding how to set up and initiate a simulation are also presented. Finally, the post-processing section describes the two types of files created while running the simulation, and it also includes simulation results for a default simulation included with the source code.

  12. Parcels v0.9: prototyping a Lagrangian ocean analysis framework for the petascale age

    NASA Astrophysics Data System (ADS)

    Lange, Michael; van Sebille, Erik

    2017-11-01

    As ocean general circulation models (OGCMs) move into the petascale age, where the output of single simulations exceeds petabytes of storage space, tools to analyse the output of these models will need to scale up too. Lagrangian ocean analysis, where virtual particles are tracked through hydrodynamic fields, is an increasingly popular way to analyse OGCM output, by mapping pathways and connectivity of biotic and abiotic particulates. However, the current software stack of Lagrangian ocean analysis codes is not dynamic enough to cope with the increasing complexity, scale and need for customization of use-cases. Furthermore, most community codes are developed for stand-alone use, making it a nontrivial task to integrate virtual particles at runtime of the OGCM. Here, we introduce the new Parcels code, which was designed from the ground up to be sufficiently scalable to cope with petascale computing. We highlight its API design that combines flexibility and customization with the ability to optimize for HPC workflows, following the paradigm of domain-specific languages. Parcels is primarily written in Python, utilizing the wide range of tools available in the scientific Python ecosystem, while generating low-level C code and using just-in-time compilation for performance-critical computation. We show a worked-out example of its API, and validate the accuracy of the code against seven idealized test cases. This version 0.9 of Parcels is focused on laying out the API, with future work concentrating on support for curvilinear grids, optimization, efficiency and at-runtime coupling with OGCMs.
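
    The sketch below illustrates the underlying idea of Lagrangian ocean analysis, advecting virtual particles through a velocity field with a fourth-order Runge-Kutta step. It is a generic illustration using an analytic single-gyre flow, not the Parcels API.

    ```python
    import numpy as np

    def velocity(x, y, t):
        """Analytic stand-in for an OGCM velocity field (a steady single gyre)."""
        u = -np.pi * np.sin(np.pi * x) * np.cos(np.pi * y)
        v = np.pi * np.cos(np.pi * x) * np.sin(np.pi * y)
        return u, v

    def advect_rk4(x, y, t, dt):
        """One fourth-order Runge-Kutta step for particles at positions (x, y)."""
        k1u, k1v = velocity(x, y, t)
        k2u, k2v = velocity(x + 0.5 * dt * k1u, y + 0.5 * dt * k1v, t + 0.5 * dt)
        k3u, k3v = velocity(x + 0.5 * dt * k2u, y + 0.5 * dt * k2v, t + 0.5 * dt)
        k4u, k4v = velocity(x + dt * k3u, y + dt * k3v, t + dt)
        x_new = x + dt * (k1u + 2 * k2u + 2 * k3u + k4u) / 6.0
        y_new = y + dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0
        return x_new, y_new

    if __name__ == "__main__":
        # Ten particles seeded across the domain, advected for 100 steps.
        xs = np.linspace(0.1, 0.9, 10)
        ys = np.full_like(xs, 0.5)
        t, dt = 0.0, 0.01
        for _ in range(100):
            xs, ys = advect_rk4(xs, ys, t, dt)
            t += dt
        print(np.column_stack((xs, ys)))
    ```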

  13. Extreme Scale Computing to Secure the Nation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D L; McGraw, J R; Johnson, J R

    2009-11-10

    Since the dawn of modern electronic computing in the mid-1940s, U.S. national security programs have been dominant users of every new generation of high-performance computer. Indeed, the first general-purpose electronic computer, ENIAC (the Electronic Numerical Integrator and Computer), was used to calculate the expected explosive yield of early thermonuclear weapons designs. Even the U.S. numerical weather prediction program, another early application for high-performance computing, was initially funded jointly by sponsors that included the U.S. Air Force and Navy, agencies interested in accurate weather predictions to support U.S. military operations. For the decades of the cold war, national security requirements continued to drive the development of high performance computing (HPC), including advancement of the computing hardware and development of sophisticated simulation codes to support weapons and military aircraft design, numerical weather prediction, as well as data-intensive applications such as cryptography and cybersecurity. U.S. national security concerns continue to drive the development of high-performance computers and software in the U.S., and in fact events following the end of the cold war have driven an increase in the growth rate of computer performance at the high end of the market. This mainly derives from our nation's observance of a moratorium on underground nuclear testing beginning in 1992, followed by our voluntary adherence to the Comprehensive Test Ban Treaty (CTBT) beginning in 1995. The CTBT prohibits further underground nuclear tests, which in the past had been a key component of the nation's science-based program for assuring the reliability, performance and safety of U.S. nuclear weapons. In response to this change, the U.S. Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship (SBSS) program under the Fiscal Year 1994 National Defense Authorization Act, which requires, 'in the absence of nuclear testing, a program to: (1) Support a focused, multifaceted program to increase the understanding of the enduring stockpile; (2) Predict, detect, and evaluate potential problems of the aging of the stockpile; (3) Refurbish and re-manufacture weapons and components, as required; and (4) Maintain the science and engineering institutions needed to support the nation's nuclear deterrent, now and in the future'. This program continues to fulfill its national security mission by adding significant new capabilities for producing scientific results through large-scale computational simulation coupled with careful experimentation, including sub-critical nuclear experiments permitted under the CTBT. To develop the computational science and the computational horsepower needed to support its mission, SBSS initiated the Accelerated Strategic Computing Initiative, later renamed the Advanced Simulation & Computing (ASC) program (sidebar: 'History of ASC Computing Program Computing Capability'). The modern 3D computational simulation capability of the ASC program supports the assessment and certification of the current nuclear stockpile through calibration with past underground test (UGT) data. While an impressive accomplishment, continued evolution of national security mission requirements will demand computing resources at a significantly greater scale than we have today.
In particular, continued observance and potential Senate confirmation of the Comprehensive Test Ban Treaty (CTBT), together with the U.S. administration's promise of a significant reduction in the size of the stockpile and the inexorable aging and consequent refurbishment of the stockpile, all demand increasing refinement of our computational simulation capabilities. Assessment of the present and future stockpile with increased confidence in safety and reliability, without reliance upon calibration with past or future test data, is a long-term goal of the ASC program. This will be accomplished through significant increases in the scientific bases that underlie the computational tools. Computer codes must be developed that replace phenomenology with increased levels of scientific understanding together with an accompanying quantification of uncertainty. These advanced codes will place significantly higher demands on the computing infrastructure than do the current 3D ASC codes. This article discusses not only the need for a future computing capability at the exascale for the SBSS program, but also high-performance computing requirements for broader national security questions. For example, the increasing concern over potential nuclear terrorist threats demands a capability to assess threats and potential disablement technologies as well as a rapid forensic capability for determining a nuclear weapons design from post-detonation evidence (nuclear counterterrorism).

  14. 78 FR 13338 - Exposure Modeling Public Meeting; Notice of Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-27

    ... code 22 Professional, Scientific and Technical NAICS code 54 B. How can I get copies of this document... dates and abstract requests are announced through the ``empmlist'' forum on the LYRIS list server at...

  15. Indoor Fast Neutron Generator for Biophysical and Electronic Applications

    NASA Astrophysics Data System (ADS)

    Cannuli, A.; Caccamo, M. T.; Marchese, N.; Tomarchio, E. A.; Pace, C.; Magazù, S.

    2018-05-01

    This study focuses on an indoor fast neutron generator for biophysical and electronic applications. More specifically, the findings obtained from several simulations with the MCNP Monte Carlo code, necessary for the realization of a shield for indoor measurements, are presented. Furthermore, an evaluation of the neutron spectrum modification caused by the shielding is reported. Fast neutron generators are a valid and interesting available source of neutrons, increasingly employed in a wide range of research fields in science and engineering. The employed portable pulsed neutron source is a MP320 Thermo Scientific neutron generator, able to generate 2.5 MeV neutrons with a neutron yield of 2.0 x 10^6 n/s, a pulse rate of 250 Hz to 20 kHz, and a duty factor varying from 5% to 100%. The neutron generator, based on Deuterium-Deuterium nuclear fusion reactions, is employed in conjunction with a solid-state photon detector made of n-type high-purity germanium (PINS-GMX by ORTEC), and it is mainly addressed to biophysical and electronic studies. The study proposes the design of a shield necessary for indoor applications of the MP320 neutron generator, analyzes the transport of neutrons simulated with the Monte Carlo code, and describes the two main lines of research in which the source will be used.

  16. Visual Computing Environment Workshop

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles (Compiler)

    1998-01-01

    The Visual Computing Environment (VCE) is a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis.

  17. Some Experimental and Monte Carlo Investigations of the Plastic Scintillators for the Current Mode Measurements at Pulsed Neutron Sources

    NASA Astrophysics Data System (ADS)

    Rogov, A.; Pepyolyshev, Yu.; Carta, M.; d'Angelo, A.

    The scintillation detector (SD) is widely used in neutron and gamma spectrometry in a counting mode. Organic scintillators for the counting mode of detector operation are rather well investigated. Usually, they are applied to measure the amplitude and time distributions of pulses caused by single interaction events of neutrons or gammas with the scintillator material. But in a large area of scientific research, scintillation detectors can alternatively be used in a current mode, by recording the average current from the detector; for example, in measurements of the neutron pulse shape at pulsed reactors or other pulsed neutron sources. To obtain a rather large volume of experimental data at pulsed neutron sources, it is necessary to use a current-mode detector for the registration of fast neutrons. Many parameters of the SD change with the transition from a counting mode to a current mode. For example, the detector efficiency is different in the counting and current modes. Many effects connected with time accuracy become substantial. Besides, for the registration of solely fast neutrons, as required in many measurements in the mixed radiation field of pulsed neutron sources, the SD efficiency has to be determined with a gamma-radiation shield present. There have been no calculations or experimental data on SD current-mode operation up to now. The response functions of the detectors can be either measured in high-precision reference fields or calculated by computer simulation. We have used the MCNP code [1] and carried out some experiments to investigate the performance of plastic scintillators in a current mode. There are numerous programs performing simulations similar to the MCNP code; for example, for neutrons there are [2-4], and for photons [5-8]. However, all known codes (SCINFUL, NRESP4, SANDYL, EGS49) have more stringent restrictions on the source, geometry and detector characteristics. In the MCNP code many of these restrictions are absent, and one needs only to write special additions for proton and electron recoil and for the transfer of deposited energy to light output. These code modifications allow taking into account all processes in the organic scintillator that influence the light yield.

  18. Massive Data, the Digitization of Science, and Reproducibility of Results

    ScienceCinema

    Stodden, Victoria

    2018-04-27

    As the scientific enterprise becomes increasingly computational and data-driven, the nature of the information communicated must change. Without inclusion of the code and data with published computational results, we are engendering a credibility crisis in science. Controversies such as ClimateGate, the microarray-based drug sensitivity clinical trials under investigation at Duke University, and retractions from prominent journals due to unverified code suggest the need for greater transparency in our computational science. In this talk I argue that the scientific method be restored to (1) a focus on error control as central to scientific communication and (2) complete communication of the underlying methodology producing the results, i.e. reproducibility. I outline barriers to these goals based on recent survey work (Stodden 2010), and suggest solutions such as the “Reproducible Research Standard” (Stodden 2009), giving open licensing options designed to create an intellectual property framework for scientists consonant with longstanding scientific norms.

  19. Advancing the LSST Operations Simulator

    NASA Astrophysics Data System (ADS)

    Saha, Abhijit; Ridgway, S. T.; Cook, K. H.; Delgado, F.; Chandrasekharan, S.; Petry, C. E.; Operations Simulator Group

    2013-01-01

    The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://lsst.org) allows the planning of LSST observations that obey explicit science driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by design-specific opto-mechanical system performance of the telescope facility, site specific conditions (including weather and seeing), as well as additional scheduled and unscheduled downtime. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history database are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. This poster reports recent work which has focussed on an architectural restructuring of the code that will allow us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator will be used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities, and assist with performance margin investigations of the LSST system.

  20. The UPSF code: a metaprogramming-based high-performance automatically parallelized plasma simulation framework

    NASA Astrophysics Data System (ADS)

    Gao, Xiatian; Wang, Xiaogang; Jiang, Binhao

    2017-10-01

    UPSF (Universal Plasma Simulation Framework) is a new plasma simulation code designed for maximum flexibility using cutting-edge techniques supported by the C++17 standard. Through the use of metaprogramming, UPSF provides arbitrary-dimensional data structures and methods to support various kinds of plasma simulation models, such as Vlasov, particle-in-cell (PIC), fluid, and Fokker-Planck models, as well as their variants and hybrid methods. Through C++ metaprogramming, a single code can be applied to systems of arbitrary dimension with no loss of performance. UPSF can also automatically parallelize the distributed data structure and accelerate matrix and tensor operations using BLAS. A three-dimensional particle-in-cell code has been developed based on UPSF. Two test cases, Landau damping and the Weibel instability for the electrostatic and electromagnetic situations respectively, are presented to show the validity and performance of the UPSF code.

  1. The Visualization Toolkit (VTK): Rewriting the rendering code for modern graphics cards

    NASA Astrophysics Data System (ADS)

    Hanwell, Marcus D.; Martin, Kenneth M.; Chaudhary, Aashish; Avila, Lisa S.

    2015-09-01

    The Visualization Toolkit (VTK) is an open source, permissively licensed, cross-platform toolkit for scientific data processing, visualization, and data analysis. It is over two decades old, originally developed for a very different graphics card architecture. Modern graphics cards feature fully programmable, highly parallelized architectures with large core counts. VTK's rendering code was rewritten to take advantage of modern graphics cards, maintaining most of the toolkit's programming interfaces. This offers the opportunity to compare the performance of old and new rendering code on the same systems/cards. Significant improvements in rendering speeds and memory footprints mean that scientific data can be visualized in greater detail than ever before. The widespread use of VTK means that these improvements will reap significant benefits.

  2. Scientific Programming Using Java: A Remote Sensing Example

    NASA Technical Reports Server (NTRS)

    Prados, Don; Mohamed, Mohamed A.; Johnson, Michael; Cao, Changyong; Gasser, Jerry

    1999-01-01

    This paper presents results of a project to port remote sensing code from the C programming language to Java. The advantages and disadvantages of using Java versus C as a scientific programming language in remote sensing applications are discussed. Remote sensing applications deal with voluminous data that require effective memory management, such as buffering operations, when processed. Some of these applications also implement complex computational algorithms, such as Fast Fourier Transformation analysis, that are very performance intensive. Factors considered include performance, precision, complexity, rapidity of development, ease of code reuse, ease of maintenance, memory management, and platform independence. The performance of radiometric calibration code that uses Java for the graphical user interface and C for the domain model is also presented.

  3. The origins of informed consent: the International Scientific Commission on Medical War Crimes, and the Nuremberg code.

    PubMed

    Weindling, P

    2001-01-01

    The Nuremberg Code has generally been seen as arising from the Nuremberg Medical Trial. This paper examines developments prior to the Trial, involving the physiologist Andrew Conway Ivy and an inter-Allied Scientific Commission on Medical War Crimes. The paper traces the formulation of the concept of a medical war crime by the physiologist John West Thompson, as part of the background to Ivy's code on human experiments of 1 August 1946. It evaluates subsequent responses by the American Medical Association, and by other war crimes experts, notably Leo Alexander, who developed Ivy's conceptual framework. Ivy's interaction with the judges at Nuremberg alerted them to the importance of formulating ethical guidelines for clinical research.

  4. Comparison of depth-dose distributions of proton therapeutic beams calculated by means of logical detectors and ionization chamber modeled in Monte Carlo codes

    NASA Astrophysics Data System (ADS)

    Pietrzak, Robert; Konefał, Adam; Sokół, Maria; Orlef, Andrzej

    2016-08-01

    The success of proton therapy depends strongly on the precision of treatment planning. Dose distribution in biological tissue may be obtained from Monte Carlo simulations using various scientific codes making it possible to perform very accurate calculations. However, there are many factors affecting the accuracy of modeling. One of them is the structure of the objects, called bins, registering a dose. In this work the influence of bin structure on the dose distributions was examined. The MCNPX code calculations of the Bragg curve for the 60 MeV proton beam were done in two ways: using simple logical detectors, i.e. volumes defined in water, and using a precise model of the ionization chamber used in clinical dosimetry. The results of the simulations were verified experimentally in a water phantom with a Marcus ionization chamber. The average local dose difference between the measured relative doses in the water phantom and those calculated by means of the logical detectors was 1.4% over the first 25 mm, whereas over the full depth range this difference was 1.6%, for a maximum uncertainty in the calculations of less than 2.4% and a maximum measuring error of 1%. In the case of the relative doses calculated with the ionization chamber model this average difference was somewhat greater, being 2.3% at depths up to 25 mm and 2.4% over the full range of depths, for a maximum uncertainty in the calculations of 3%. In the dose calculations the ionization chamber model does not offer any additional advantages over the logical detectors. The results provided by both models are similar and in good agreement with the measurements; however, the logical detector approach is a more time-effective method.
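
    The record does not spell out exactly how the average local dose difference was evaluated; the sketch below shows one straightforward reading of that metric, a mean of pointwise relative differences between measured and calculated relative doses on a common depth grid, with purely hypothetical numbers.

    ```python
    import numpy as np

    def mean_local_dose_difference(measured, calculated):
        """Average local difference (in %) between measured and calculated relative doses."""
        measured = np.asarray(measured, dtype=float)
        calculated = np.asarray(calculated, dtype=float)
        return 100.0 * np.mean(np.abs(calculated - measured) / measured)

    if __name__ == "__main__":
        # Hypothetical relative depth-dose values on a common depth grid (not the paper's data).
        measured   = [0.31, 0.35, 0.42, 0.55, 0.78, 1.00]
        calculated = [0.30, 0.36, 0.43, 0.54, 0.79, 0.98]
        print(f"{mean_local_dose_difference(measured, calculated):.1f}%")
    ```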

  5. TOPICAL REVIEW: Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.; Chan, V. S.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This should produce the scientific excitement which will help to (a) stimulate enhanced cross-cutting collaborations with other fields and (b) attract the bright young talent needed for the future health of the field of plasma science.

  6. Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This should produce the scientific excitement which will help to (a) stimulate enhanced cross-cutting collaborations with other fields and (b) attract the bright young talent needed for the future health of the field of plasma science.

  7. Electron-beam-ion-source (EBIS) modeling progress at FAR-TECH, Inc

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, J. S., E-mail: kim@far-tech.com; Zhao, L., E-mail: kim@far-tech.com; Spencer, J. A., E-mail: kim@far-tech.com

    FAR-TECH, Inc. has been developing a numerical modeling tool for Electron-Beam-Ion-Sources (EBISs). The tool consists of two codes. One is the Particle-Beam-Gun-Simulation (PBGUNS) code to simulate a steady state electron beam and the other is the EBIS-Particle-In-Cell (EBIS-PIC) code to simulate ion charge breeding with the electron beam. PBGUNS, a 2D (r,z) electron gun and ion source simulation code, has been extended for efficient modeling of EBISs and the work was presented previously. EBIS-PIC is a space charge self-consistent PIC code and is written to simulate charge breeding in an axisymmetric 2D (r,z) device allowing for full three-dimensional ion dynamics. This 2D code has been successfully benchmarked with Test-EBIS measurements at Brookhaven National Laboratory. For long timescale (< tens of ms) ion charge breeding, the 2D EBIS-PIC simulations take a long computational time making the simulation less practical. Most of the EBIS charge breeding, however, may be modeled in 1D (r) as the axial dependence of the ion dynamics may be ignored in the trap. Where 1D approximations are valid, simulations of charge breeding in an EBIS over long time scales become possible, using EBIS-PIC together with PBGUNS. Initial 1D results are presented. The significance of the magnetic field to ion dynamics, ion cooling effects due to collisions with neutral gas, and the role of Coulomb collisions are presented.

  8. Software Writing Skills for Your Research - Lessons Learned from Workshops in the Geosciences

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin

    2016-04-01

    Findings presented in scientific papers are based on data and software. Once in a while they come along with data, but not commonly with software. However, the software used to obtain the findings plays a crucial role in the scientific work. Nevertheless, software is rarely seen as publishable. Thus researchers may not be able to reproduce the findings without the software, which is in conflict with the principle of reproducibility in the sciences. For both the writing of publishable software and the reproducibility issue, the quality of software is of utmost importance. For many programming scientists the treatment of source code, e.g. with code design, version control, documentation, and testing, is associated with additional work that is not covered in the primary research task. This includes the adoption of processes following the software development life cycle. However, the adoption of software engineering rules and best practices has to be recognized and accepted as part of the scientific performance. Most scientists have little incentive to improve code and do not publish code, because software engineering habits are rarely practised by researchers or students. Software engineering skills are not passed on to followers as paper-writing skills are. Thus it is often felt that the software or code produced is not publishable. The quality of software and its source code has a decisive influence on the quality of the research results obtained and their traceability. So establishing best practices from software engineering to serve scientific needs is crucial for the success of scientific software. Even though scientists use existing software and code, e.g. from open source software repositories, only a few contribute their code back into the repositories. Writing and opening code for Open Science means that subsequent users are able to run the code, e.g. through the provision of sufficient documentation, sample data sets, tests and comments, which in turn can be proven by adequate and qualified reviews. This assumes that scientists learn to write and release code and software as they learn to write and publish papers. With this in mind, software could be valued and assessed as a contribution to science. But this requires the relevant skills that can be passed to colleagues and followers. Therefore, the GFZ German Research Centre for Geosciences performed three workshops in 2015 to address the passing of software writing skills to young scientists, the next generation of researchers in the Earth, planetary and space sciences. Experiences in running these workshops and the lessons learned will be summarized in this presentation. The workshops have received support and funding from Software Carpentry, a volunteer organization whose goal is to make scientists more productive, and their work more reliable, by teaching them basic computing skills, and from FOSTER (Facilitate Open Science Training for European Research), a two-year, EU-funded (FP7) project whose goal is to produce a Europe-wide training programme that will help to incorporate Open Access approaches into existing research methodologies and to integrate Open Science principles and practice into the current research workflow by targeting young researchers and other stakeholders.

  9. Convolutional coding results for the MVM '73 X-band telemetry experiment

    NASA Technical Reports Server (NTRS)

    Layland, J. W.

    1978-01-01

    Results of simulation of several short-constraint-length convolutional codes using a noisy symbol stream obtained via the turnaround ranging channels of the MVM'73 spacecraft are presented. First operational use of this coding technique is on the Voyager mission. The relative performance of these codes in this environment is as previously predicted from computer-based simulations.
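
    The specific codes simulated in the experiment are not listed in this record; as background, the sketch below implements a generic short-constraint-length, rate-1/2 convolutional encoder (the classic K=3 pair of generator polynomials 7 and 5 in octal) of the kind such studies evaluate.

    ```python
    def conv_encode(bits, polys=(0b111, 0b101), k=3):
        """Encode a bit sequence with a rate-1/len(polys) convolutional code.

        `polys` are the generator polynomials (here the classic K=3 pair, 7 and 5
        octal); the encoder is flushed with K-1 zero bits at the end.
        """
        state = 0
        out = []
        for bit in list(bits) + [0] * (k - 1):
            state = ((state << 1) | bit) & ((1 << k) - 1)
            for poly in polys:
                # Output bit is the parity of the taps selected by the polynomial.
                out.append(bin(state & poly).count("1") % 2)
        return out

    if __name__ == "__main__":
        message = [1, 0, 1, 1, 0, 0, 1]
        print(conv_encode(message))
    ```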

  10. Implementation and evaluation of a simulation curriculum for paediatric residency programs including just-in-time in situ mock codes.

    PubMed

    Sam, Jonathan; Pierse, Michael; Al-Qahtani, Abdullah; Cheng, Adam

    2012-02-01

    To develop, implement and evaluate a simulation-based acute care curriculum in a paediatric residency program using an integrated and longitudinal approach. Curriculum framework consisting of three modular, year-specific courses and longitudinal just-in-time, in situ mock codes. Paediatric residency program at BC Children's Hospital, Vancouver, British Columbia. The three year-specific courses focused on the critical first 5 min, complex medical management and crisis resource management, respectively. The just-in-time in situ mock codes simulated the acute deterioration of an existing ward patient, prepared the actual multidisciplinary code team, and primed the surrounding crisis support systems. Each curriculum component was evaluated with surveys using a five-point Likert scale. A total of 40 resident surveys were completed after each of the modular courses, and an additional 28 surveys were completed for the overall simulation curriculum. The highest Likert scores were for hands-on skill stations, immersive simulation environment and crisis resource management teaching. Survey results also suggested that just-in-time mock codes were realistic, reinforced learning, and prepared ward teams for patient deterioration. A simulation-based acute care curriculum was successfully integrated into a paediatric residency program. It provides a model for integrating simulation-based learning into other training programs, as well as a model for any hospital that wishes to improve paediatric resuscitation outcomes using just-in-time in situ mock codes.

  11. Visual Computing Environment

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Putt, Charles W.

    1997-01-01

    The Visual Computing Environment (VCE) is a NASA Lewis Research Center project to develop a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis. The objectives of VCE are to (1) develop a visual computing environment for controlling the execution of individual simulation codes that are running in parallel and are distributed on heterogeneous host machines in a networked environment, (2) develop numerical coupling algorithms for interchanging boundary conditions between codes with arbitrary grid matching and different levels of dimensionality, (3) provide a graphical interface for simulation setup and control, and (4) provide tools for online visualization and plotting. VCE was designed to provide a distributed, object-oriented environment. Mechanisms are provided for creating and manipulating objects, such as grids, boundary conditions, and solution data. This environment includes parallel virtual machine (PVM) for distributed processing. Users can interactively select and couple any set of codes that have been modified to run in a parallel distributed fashion on a cluster of heterogeneous workstations. A scripting facility allows users to dictate the sequence of events that make up the particular simulation.

  12. Mock Code: A Code Blue Scenario Requested by and Developed for Registered Nurses

    PubMed Central

    Rideout, Janice; Pritchett-Kelly, Sherry; McDonald, Melissa; Mullins-Richards, Paula; Dubrowski, Adam

    2016-01-01

    The use of simulation in medical training is quickly becoming more common, with applications in emergency, surgical, and nursing education. Recently, registered nurses working in surgical inpatient units requested a mock code simulation to practice skills, improve knowledge, and build self-confidence in a safe and controlled environment. A simulation scenario using a high-fidelity mannequin was developed and will be discussed herein. PMID:28123919

  13. An Overview of the Greyscales Lethality Assessment Methodology

    DTIC Science & Technology

    2011-01-01

    The code has already been integrated into the Weapon Systems Division MECA and DUEL missile engagement simulations. It is also capable of being incorporated into a variety of other simulations.

  14. Assessing the Effects of Data Compression in Simulations Using Physically Motivated Metrics

    DOE PAGES

    Laney, Daniel; Langer, Steven; Weber, Christopher; ...

    2014-01-01

    This paper examines whether lossy compression can be used effectively in physics simulations as a possible strategy to combat the expected data-movement bottleneck in future high performance computing architectures. We show that, for the codes and simulations we tested, compression levels of 3–5X can be applied without causing significant changes to important physical quantities. Rather than applying signal processing error metrics, we utilize physics-based metrics appropriate for each code to assess the impact of compression. We evaluate three different simulation codes: a Lagrangian shock-hydrodynamics code, an Eulerian higher-order hydrodynamics turbulence modeling code, and an Eulerian coupled laser-plasma interaction code. We compress relevant quantities after each time-step to approximate the effects of tightly coupled compression and study the compression rates to estimate memory and disk-bandwidth reduction. We find that the error characteristics of compression algorithms must be carefully considered in the context of the underlying physics being modeled.
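
    As a toy illustration of assessing lossy compression with a physics-motivated metric rather than a signal-processing one, the sketch below quantizes a stand-in density field and reports the relative change in total mass alongside the maximum pointwise error. The compressor and field are hypothetical and unrelated to the codes studied in the paper.

    ```python
    import numpy as np

    def lossy_quantize(field, bits=8):
        """Toy lossy compressor: uniform quantization of a field to `bits` bits."""
        lo, hi = field.min(), field.max()
        levels = 2**bits - 1
        q = np.round((field - lo) / (hi - lo) * levels)   # the "compressed" integers
        return q * (hi - lo) / levels + lo                # the reconstructed field

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        density = np.abs(rng.normal(1.0, 0.1, size=(128, 128)))  # stand-in state variable
        recon = lossy_quantize(density, bits=8)

        # Physics-motivated metric: relative change in total mass, not pointwise error.
        mass_error = abs(recon.sum() - density.sum()) / density.sum()
        pointwise = np.max(np.abs(recon - density))
        print(f"relative mass error = {mass_error:.2e}, max pointwise error = {pointwise:.2e}")
    ```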

  15. A Radiation Chemistry Code Based on the Green's Function of the Diffusion Equation

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Wu, Honglu

    2014-01-01

    Stochastic radiation track structure codes are of great interest for space radiation studies and hadron therapy in medicine. These codes are used for many purposes, notably for microdosimetry and DNA damage studies. In the last two decades, they were also used with the Independent Reaction Times (IRT) method in the simulation of chemical reactions, to calculate the yield of various radiolytic species produced during the radiolysis of water and in chemical dosimeters. Recently, we have developed a Green's function based code to simulate reversible chemical reactions with an intermediate state, which yielded results in excellent agreement with those obtained by using the IRT method. This code was also used to simulate the interaction of particles with membrane receptors. We are in the process of including this program for use with the Monte-Carlo track structure code Relativistic Ion Tracks (RITRACKS). This recent addition should greatly expand the capabilities of RITRACKS, notably to simulate DNA damage by both the direct and indirect effect.
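
    For orientation, the sketch below evaluates the free-space Green's function of the 3D diffusion equation, G(r, t) = (4*pi*D*t)^(-3/2) * exp(-r^2 / (4*D*t)), and samples a diffusive jump from it. This is the standard propagator that such codes build on, not code from RITRACKS, and the diffusion coefficient used is only illustrative.

    ```python
    import math
    import random

    def diffusion_green_function(r, t, D):
        """Free-space Green's function of the 3D diffusion equation.

        G(r, t) = (4*pi*D*t)^(-3/2) * exp(-r^2 / (4*D*t)); it gives the probability
        density of finding a diffusing particle at distance r after time t.
        """
        norm = (4.0 * math.pi * D * t) ** -1.5
        return norm * math.exp(-r * r / (4.0 * D * t))

    def sample_jump(t, D):
        """Sample a diffusive displacement over time t (each Cartesian component is Gaussian)."""
        sigma = math.sqrt(2.0 * D * t)
        return tuple(random.gauss(0.0, sigma) for _ in range(3))

    if __name__ == "__main__":
        D = 2.3e-9   # illustrative diffusion coefficient, m^2/s (roughly a small aqueous species)
        t = 1e-9     # 1 ns
        print(diffusion_green_function(1e-9, t, D))
        print(sample_jump(t, D))
    ```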

  16. Design of spherical electron gun for ultra high frequency, CW power inductive output tube

    NASA Astrophysics Data System (ADS)

    Kaushik, Meenu; Joshi, L. M.

    2016-03-01

    The Inductive Output Tube (IOT) is an amplifier that usually operates in the UHF range. It is an electron tube whose basic structure is similar to conventional vacuum devices. This device is widely used in broadcast applications but is now being explored for scientific applications as well, specifically particle accelerators and fusion plasma heating. The paper describes the design approach of a spherical gridded electron gun for a 500 MHz, 100 kW CW power IOT. The electron gun structure has been simulated and optimized for an operating voltage and current of 40 kV and 3.5 A, respectively. The electromagnetic analysis of this spherical electron gun has been carried out with the CST and TRAK codes.

  17. DEVELOPMENTS IN GRworkbench

    NASA Astrophysics Data System (ADS)

    Moylan, Andrew; Scott, Susan M.; Searle, Anthony C.

    2006-02-01

    The software tool GRworkbench is an ongoing project in visual, numerical General Relativity at The Australian National University. Recently, GRworkbench has been significantly extended to facilitate numerical experimentation in analytically-defined space-times. The numerical differential geometric engine has been rewritten using functional programming techniques, enabling objects which are normally defined as functions in the formalism of differential geometry and General Relativity to be directly represented as function variables in the C++ code of GRworkbench. The new functional differential geometric engine allows for more accurate and efficient visualisation of objects in space-times and makes new, efficient computational techniques available. Motivated by the desire to investigate a recent scientific claim using GRworkbench, new tools for numerical experimentation have been implemented, allowing for the simulation of complex physical situations.

  18. Design of spherical electron gun for ultra high frequency, CW power inductive output tube

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaushik, Meenu, E-mail: mkceeri@gmail.com; Joshi, L. M., E-mail: lmj1953@gmail.com; Academy of Scientific and Innovative Research

Inductive Output Tube (IOT) is an amplifier that usually operates in the UHF range. It is an electron tube whose basic structure is similar to conventional vacuum devices. This device is widely used in broadcast applications but is now also being explored for scientific applications, specifically particle accelerators and fusion plasma heating. The paper describes the design approach of a spherical gridded electron gun for a 500 MHz, 100 kW CW power IOT. The electron gun structure has been simulated and optimized for an operating voltage and current of 40 kV and 3.5 A, respectively. The electromagnetic analysis of this spherical electron gun has been carried out in the CST and TRAK codes.

  19. Study of premixing phase of steam explosion with JASMINE code in ALPHA program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moriyama, Kiyofumi; Yamano, Norihiro; Maruyama, Yu

The premixing phase of steam explosion has been studied in the ALPHA Program at the Japan Atomic Energy Research Institute (JAERI). An analytical model to simulate the premixing phase, JASMINE (JAERI Simulator for Multiphase Interaction and Explosion), has been developed based on a multi-dimensional multi-phase thermal hydraulics code, MISTRAL (by Fuji Research Institute Co.). The original code was extended to simulate the physics in the premixing phenomena. The first stage of the code validation was performed by analyzing two mixing experiments with solid particles and water: the isothermal experiment by Gilbertson et al. (1992) and the hot particle experiment by Angelini et al. (1993) (MAGICO). The code predicted the experiments reasonably well. Effectiveness of the TVD scheme employed in the code was also demonstrated.

  20. Articulating uncertainty as part of scientific argumentation during model-based exoplanet detection tasks

    NASA Astrophysics Data System (ADS)

    Lee, Hee-Sun; Pallant, Amy; Pryputniewicz, Sarah

    2015-08-01

    Teaching scientific argumentation has emerged as an important goal for K-12 science education. In scientific argumentation, students are actively involved in coordinating evidence with theory based on their understanding of the scientific content and thinking critically about the strengths and weaknesses of the cited evidence in the context of the investigation. We developed a one-week-long online curriculum module called "Is there life in space?" where students conduct a series of four model-based tasks to learn how scientists detect extrasolar planets through the “wobble” and transit methods. The simulation model allows students to manipulate various parameters of an imaginary star and planet system such as planet size, orbit size, planet-orbiting-plane angle, and sensitivity of telescope equipment, and to adjust the display settings for graphs illustrating the relative velocity and light intensity of the star. Students can use model-based evidence to formulate an argument on whether particular signals in the graphs guarantee the presence of a planet. Students' argumentation is facilitated by the four-part prompts consisting of multiple-choice claim, open-ended explanation, Likert-scale uncertainty rating, and open-ended uncertainty rationale. We analyzed 1,013 scientific arguments formulated by 302 high school student groups taught by 7 teachers. We coded these arguments in terms of the accuracy of their claim, the sophistication of explanation connecting evidence to the established knowledge base, the uncertainty rating, and the scientific validity of uncertainty. We found that (1) only 18% of the students' uncertainty rationale involved critical reflection on limitations inherent in data and concepts, (2) 35% of students' uncertainty rationale reflected their assessment of personal ability and knowledge, rather than scientific sources of uncertainty related to the evidence, and (3) the nature of task such as the use of noisy data or the framing of critiquing scientists' discovery encouraged students' articulation of scientific uncertainty sources in different ways.

  1. BridgeUP: STEM and Learning Astrophysics Interactively

    NASA Astrophysics Data System (ADS)

    Hernandez, Betsy; Geogdzhayeva, Maria; Beltre, Chasity; Ocasio, Adrienne; Skarbinski, Maya; Zbib, Daniela; Swar, Prachi; Mac Low, Mordecai

    2018-01-01

    BridgeUP: STEM is an initiative responding to the gender and opportunity gaps that exist in the STEM pipeline for women, girls, and under-resourced youth. The program engages high school girls in experiences at the intersection of computer science, scientific research, and visualization that will position them to succeed and lead in these fields. Students work on projects closely aligned with research taking place at the American Museum of Natural History. One of the current astronomy research projects at the museum simulates migration of black holes in active galactic nucleus disks using the Pencil Code. The work presented here focuses on interactive tools used to teach dynamical concepts pertaining to this project. These include Logger Pro, along with Vernier equipment, PhET Interactive Simulations, and Python. Throughout the internship, students also learn qualitative astrophysics via presentations, animations and videos. We discuss the success of utilizing the aforementioned tools in teaching, as well as showing work conducted by the six current students participating in this Astronomy research project.

  2. Preparing for in situ processing on upcoming leading-edge supercomputers

    DOE PAGES

    Kress, James; Churchill, Randy Michael; Klasky, Scott; ...

    2016-10-01

High performance computing applications are producing increasingly large amounts of data and placing enormous stress on current capabilities for traditional post-hoc visualization techniques. Because of the growing compute and I/O imbalance, data reductions, including in situ visualization, are required. These reduced data are used for analysis and visualization in a variety of different ways. Many of the visualization and analysis requirements are known a priori, but when they are not, scientists are dependent on the reduced data to accurately represent the simulation in post hoc analysis. The contribution of this paper is a description of the directions we are pursuing to assist a large scale fusion simulation code succeed on the next generation of supercomputers. These directions include the role of in situ processing for performing data reductions, as well as the tradeoffs between data size and data integrity within the context of complex operations in a typical scientific workflow.

  3. Evaluating the Efficacy of Wavelet Configurations on Turbulent-Flow Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Shaomeng; Gruchalla, Kenny; Potter, Kristin

    2015-10-25

I/O is increasingly becoming a significant constraint for simulation codes and visualization tools on modern supercomputers. Data compression is an attractive workaround, and, in particular, wavelets provide a promising solution. However, wavelets can be applied in multiple configurations, and the variations in configuration impact accuracy, storage cost, and execution time. While the variation in these factors over wavelet configurations has been explored in image processing, it is not well understood for visualization and analysis of scientific data. To illuminate this issue, we evaluate multiple wavelet configurations on turbulent-flow data. Our approach is to repeat established analysis routines on uncompressed and lossy-compressed versions of a data set, and then quantitatively compare their outcomes. Our findings show that accuracy varies greatly based on wavelet configuration, while storage cost and execution time vary less. Overall, our study provides new insights for simulation analysts and visualization experts, who need to make tradeoffs between accuracy, storage cost, and execution time.
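
    A hedged sketch of what varying "wavelet configurations" can look like in practice: using PyWavelets on a synthetic 1D signal (the study itself used 3D turbulent-flow fields), the wavelet basis and decomposition level are varied while a fixed fraction of detail coefficients is kept.

        # Hedged sketch: compare wavelet configurations (basis and decomposition level)
        # on a synthetic 1D signal, using coefficient thresholding as the lossy step.
        import numpy as np
        import pywt  # PyWavelets

        rng = np.random.default_rng(2)
        x = np.linspace(0, 1, 4096)
        signal = np.sin(40 * np.pi * x) + 0.3 * rng.standard_normal(x.size)

        def compress(data, wavelet, level, keep=0.1):
            """Keep only the largest `keep` fraction of detail coefficients."""
            coeffs = pywt.wavedec(data, wavelet, level=level)
            flat = np.concatenate([np.abs(c) for c in coeffs[1:]])
            thresh = np.quantile(flat, 1.0 - keep)
            coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode='hard')
                                    for c in coeffs[1:]]
            return pywt.waverec(coeffs, wavelet)[:data.size]

        for wavelet in ('haar', 'db4', 'bior4.4'):
            for level in (3, 6):
                rec = compress(signal, wavelet, level)
                err = np.sqrt(np.mean((signal - rec)**2))
                print(f"{wavelet:8s} level {level}: RMS error {err:.4f}")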

  4. Neutronic calculation of fast reactors by the EUCLID/V1 integrated code

    NASA Astrophysics Data System (ADS)

    Koltashev, D. A.; Stakhanova, A. A.

    2017-01-01

This article considers the neutronic calculation of a fast-neutron lead-cooled reactor BREST-OD-300 by the EUCLID/V1 integrated code. The main goal of development and application of integrated codes is nuclear power plant safety justification. EUCLID/V1 is an integrated code designed for coupled neutronic, thermomechanical and thermohydraulic fast reactor calculations under normal and abnormal operating conditions. The EUCLID/V1 code is being developed at the Nuclear Safety Institute of the Russian Academy of Sciences. The integrated code has a modular structure and consists of three main modules: the thermohydraulic module HYDRA-IBRAE/LM/V1, the thermomechanical module BERKUT and the neutronic module DN3D. In addition, the integrated code includes databases with fuel, coolant and structural material properties. The neutronic module DN3D provides full-scale simulation of neutronic processes in fast reactors. Heat source distributions, control rod movements, reactivity changes and other processes can be simulated. The neutron transport equation is solved in the multigroup diffusion approximation. This paper contains calculations performed as part of EUCLID/V1 code validation: a transient simulation of the fast-neutron lead-cooled reactor BREST-OD-300 (fuel assembly floating, decompression of the passive feedback system channel) and a cross-validation against MCU-FR results. The calculations demonstrate the application of the EUCLID/V1 code to BREST-OD-300 simulation and safety justification.
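
    To make the phrase "neutron transport equation in multigroup diffusion approximation" concrete, a minimal one-group, one-dimensional k-eigenvalue solve by power iteration is sketched below; the cross-sections and slab geometry are invented and bear no relation to BREST-OD-300 or DN3D.

        # Hedged sketch: one-group, 1D finite-difference diffusion k-eigenvalue problem
        # solved by power (fission source) iteration. Material data are illustrative.
        import numpy as np

        n, length = 200, 100.0                 # mesh cells, slab width (cm)
        dx = length / n
        D, sig_a, nu_sig_f = 1.2, 0.03, 0.04   # diffusion coeff, absorption, nu*fission

        # Diffusion + absorption operator with zero-flux boundaries.
        main = 2 * D / dx**2 + sig_a
        A = (np.diag(np.full(n, main))
             - np.diag(np.full(n - 1, D / dx**2), 1)
             - np.diag(np.full(n - 1, D / dx**2), -1))

        phi, k = np.ones(n), 1.0
        for _ in range(500):
            source = nu_sig_f * phi / k
            phi_new = np.linalg.solve(A, source)
            k_new = k * np.sum(nu_sig_f * phi_new) / np.sum(nu_sig_f * phi)
            if abs(k_new - k) < 1e-8:
                k = k_new
                break
            phi, k = phi_new / np.linalg.norm(phi_new), k_new

        print(f"k-effective ≈ {k:.5f}")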

  5. The next-generation ESL continuum gyrokinetic edge code

    NASA Astrophysics Data System (ADS)

Cohen, R.; Dorr, M.; Hittinger, J.; Rognlien, T.; Colella, P.; Martin, D.

    2009-05-01

    The Edge Simulation Laboratory (ESL) project is developing continuum-based approaches to kinetic simulation of edge plasmas. A new code is being developed, based on a conservative formulation and fourth-order discretization of full-f gyrokinetic equations in parallel-velocity, magnetic-moment coordinates. The code exploits mapped multiblock grids to deal with the geometric complexities of the edge region, and utilizes a new flux limiter [P. Colella and M.D. Sekora, JCP 227, 7069 (2008)] to suppress unphysical oscillations about discontinuities while maintaining high-order accuracy elsewhere. The code is just becoming operational; we will report initial tests for neoclassical orbit calculations in closed-flux surface and limiter (closed plus open flux surfaces) geometry. It is anticipated that the algorithmic refinements in the new code will address the slow numerical instability that was observed in some long simulations with the existing TEMPEST code. We will also discuss the status and plans for physics enhancements to the new code.

  6. On the error statistics of Viterbi decoding and the performance of concatenated codes

    NASA Technical Reports Server (NTRS)

    Miller, R. L.; Deutsch, L. J.; Butman, S. A.

    1981-01-01

    Computer simulation results are presented on the performance of convolutional codes of constraint lengths 7 and 10 concatenated with the (255, 223) Reed-Solomon code (a proposed NASA standard). These results indicate that as much as 0.8 dB can be gained by concatenating this Reed-Solomon code with a (10, 1/3) convolutional code, instead of the (7, 1/2) code currently used by the DSN. A mathematical model of Viterbi decoder burst-error statistics is developed and is validated through additional computer simulations.

  7. Testability, Test Automation and Test Driven Development for the Trick Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Penn, John

    2014-01-01

    This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes the approach, and the significant benefits seen, such as fast, thorough and clear test feedback every time code is checked into the code repository. It also describes an approach that encourages development of code that is testable and adaptable.
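
    A minimal sketch of the test-first pattern described above, written with pytest; the stepping function and its tests are hypothetical examples, not part of the Trick test suite.

        # Hedged sketch of test-driven development: a small, testable integration
        # utility and the tests written for it (run with `pytest`).
        import pytest

        def euler_step(state, derivs, dt):
            """One explicit Euler step: state and derivatives are equal-length tuples."""
            if len(state) != len(derivs):
                raise ValueError("state and derivative lengths differ")
            return tuple(s + d * dt for s, d in zip(state, derivs))

        def test_euler_step_advances_position():
            # position 0, velocity 1 -> after dt=0.1 the position should be 0.1
            assert euler_step((0.0,), (1.0,), 0.1) == pytest.approx((0.1,))

        def test_euler_step_rejects_mismatched_lengths():
            with pytest.raises(ValueError):
                euler_step((0.0, 1.0), (1.0,), 0.1)

        def test_euler_step_zero_dt_is_identity():
            state = (2.0, -3.0)
            assert euler_step(state, (5.0, 7.0), 0.0) == pytest.approx(state)

    In a continuous integration setup, tests of this kind run automatically on every check-in, which is the source of the fast, thorough feedback the paper reports.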

  8. Towards a supported common NEAMS software stack

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cormac Garvey

    2012-04-01

The NEAMS IPSCs are developing multidimensional, multiphysics, multiscale simulation codes based on first principles that will be capable of predicting all aspects of current and future nuclear reactor systems. These new breeds of simulation codes will include rigorous verification, validation and uncertainty quantification checks to quantify the accuracy and quality of the simulation results. The resulting NEAMS IPSC simulation codes will be an invaluable tool in designing the next generation of nuclear reactors and will also contribute to a speedier process for acquiring licenses from the NRC for new reactor designs. Due to the high resolution of the models, the complexity of the physics and the added computational resources needed to quantify the accuracy and quality of the results, the NEAMS IPSC codes will require large HPC resources to carry out the production simulation runs.

  9. Overview of High-Fidelity Modeling Activities in the Numerical Propulsion System Simulations (NPSS) Project

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    2002-01-01

    A high-fidelity simulation of a commercial turbofan engine has been created as part of the Numerical Propulsion System Simulation Project. The high-fidelity computer simulation utilizes computer models that were developed at NASA Glenn Research Center in cooperation with turbofan engine manufacturers. The average-passage (APNASA) Navier-Stokes based viscous flow computer code is used to simulate the 3D flow in the compressors and turbines of the advanced commercial turbofan engine. The 3D National Combustion Code (NCC) is used to simulate the flow and chemistry in the advanced aircraft combustor. The APNASA turbomachinery code and the NCC combustor code exchange boundary conditions at the interface planes at the combustor inlet and exit. This computer simulation technique can evaluate engine performance at steady operating conditions. The 3D flow models provide detailed knowledge of the airflow within the fan and compressor, the high and low pressure turbines, and the flow and chemistry within the combustor. The models simulate the performance of the engine at operating conditions that include sea level takeoff and the altitude cruise condition.

  10. Mocking the weak lensing universe: The LensTools Python computing package

    NASA Astrophysics Data System (ADS)

    Petri, A.

    2016-10-01

We present a newly developed software package which implements a wide range of routines frequently used in Weak Gravitational Lensing (WL). With the continuously increasing size of the WL scientific community we feel that easy to use Application Program Interfaces (APIs) for common calculations are a necessity to ensure efficiency and coordination across different working groups. Coupled with existing open source codes, such as CAMB (Lewis et al., 2000) and Gadget2 (Springel, 2005), LensTools brings together a cosmic shear simulation pipeline which, complemented with a variety of WL feature measurement tools and parameter sampling routines, provides easy access to the numerics for theoretical studies of WL as well as for experiment forecasts. Being implemented in PYTHON (Rossum, 1995), LensTools takes full advantage of a range of state-of-the-art techniques developed by the large and growing open-source software community (Jones et al., 2001; McKinney, 2010; Astropy Collaboration, 2013; Pedregosa et al., 2011; Foreman-Mackey et al., 2013). We made the LensTools code available on the Python Package Index and published its documentation at http://lenstools.readthedocs.io.
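
    As a generic illustration of one routine such a package provides, the sketch below computes an azimuthally averaged power spectrum of a random stand-in convergence map with plain NumPy; it does not reproduce the LensTools API, and the binning and normalization conventions are simplified assumptions.

        # Hedged sketch: azimuthally averaged power spectrum of a synthetic kappa map.
        import numpy as np

        n, side_deg = 512, 3.5                       # map size in pixels and degrees
        rng = np.random.default_rng(3)
        kappa = rng.standard_normal((n, n))          # stand-in for a real kappa map

        side_rad = np.deg2rad(side_deg)
        fft = np.fft.fft2(kappa) * (side_rad / n) ** 2
        power2d = np.abs(fft) ** 2 / side_rad**2     # simplified flat-sky convention

        # Multipole grid and azimuthal binning.
        freq2 = np.fft.fftfreq(n, d=side_rad / n) ** 2
        ell = 2 * np.pi * np.sqrt(np.add.outer(freq2, freq2))
        bins = np.linspace(100, 20000, 30)
        centers = 0.5 * (bins[1:] + bins[:-1])
        idx = np.digitize(ell.ravel(), bins)
        spectrum = [power2d.ravel()[idx == i].mean() for i in range(1, len(bins))]

        for l, p in zip(centers[:5], spectrum[:5]):
            print(f"ell ≈ {l:7.0f}   P(ell) ≈ {p:.3e}")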

  11. Development of radiative transfer code for JUICE/SWI mission toward the atmosphere of icy moons of Jupiter

    NASA Astrophysics Data System (ADS)

    Yamada, Takayoshi; Kasai, Yasuko; Yoshida, Naohiro

    2016-07-01

The Submillimeter Wave Instrument (SWI) is one of the scientific instruments on the JUpiter ICy moons Explorer (JUICE). We plan to observe atmospheric compositions, including water vapor and its isotopomers, in the Galilean moons (Io, Europa, Ganymede, and Callisto). The frequency windows of SWI are 530 to 625 GHz and 1080 to 1275 GHz with 100 kHz spectral resolution. We are developing a radiative transfer code in Japan with a line-by-line method for the Ganymede atmosphere in the THz region (0 - 3 THz). Molecular line parameters (line intensity and partition function) were taken from the JPL (Jet Propulsion Laboratory) catalogue. A pencil beam was assumed to calculate a spectrum of H2O and CO rotational transitions in the THz region. We performed comparisons between our model and ARTS (Atmospheric Radiative Transfer Simulator). The differences were less than 10% and 5% for H2O and CO, respectively, under the condition of local thermodynamic equilibrium (LTE). Comparisons with several models under the non-LTE assumption will be presented.
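
    A heavily simplified line-by-line sketch: Lorentzian opacity profiles for two lines, one placed near the 557 GHz water transition, are summed on a frequency grid and propagated through a single homogeneous layer for a pencil beam. Line strengths, widths, and temperatures are invented; a real model would take its line parameters from the JPL catalogue.

        # Hedged sketch of a line-by-line forward model with a single absorbing layer.
        import numpy as np

        freq = np.linspace(530e9, 625e9, 20_000)            # SWI low-band grid (Hz)
        lines = [(556.936e9, 3.0e-3), (620.701e9, 1.5e-3)]  # (centre Hz, peak opacity), illustrative
        width = 2.0e6                                        # Lorentzian HWHM (Hz), illustrative

        tau = np.zeros_like(freq)
        for f0, peak in lines:
            tau += peak * width**2 / ((freq - f0) ** 2 + width**2)

        t_surface, t_atm = 110.0, 90.0                       # brightness temperatures (K)
        # Radiative transfer through one absorbing layer (Rayleigh-Jeans limit).
        t_b = t_surface * np.exp(-tau) + t_atm * (1.0 - np.exp(-tau))

        i_max = np.argmax(np.abs(t_b - t_surface))
        print(f"strongest line contrast at {freq[i_max] / 1e9:.3f} GHz: "
              f"{t_b[i_max] - t_surface:+.3f} K")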

  12. A microkernel design for component-based parallel numerical software systems.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balay, S.

    1999-01-13

What is the minimal software infrastructure and what type of conventions are needed to simplify development of sophisticated parallel numerical application codes using a variety of software components that are not necessarily available as source code? We propose an opaque object-based model where the objects are dynamically loadable from the file system or network. The microkernel required to manage such a system needs to include, at most: (1) a few basic services, namely a mechanism for loading objects at run time via dynamic link libraries, and consistent schemes for error handling and memory management; and (2) selected methods that all objects share, to deal with object life (destruction, reference counting, relationships), and object observation (viewing, profiling, tracing). We are experimenting with these ideas in the context of extensible numerical software within the ALICE (Advanced Large-scale Integrated Computational Environment) project, where we are building the microkernel to manage the interoperability among various tools for large-scale scientific simulations. This paper presents some preliminary observations and conclusions from our work with microkernel design.
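
    A minimal sketch of the run-time loading and reference-counting ideas in Python, with importlib standing in for dynamic link libraries; the Registry class is a hypothetical illustration, not the ALICE microkernel.

        # Hedged sketch: dynamically load "component" objects at run time and track
        # their lifetime with reference counts, with consistent error handling.
        import importlib

        class Registry:
            """Minimal object registry: load by dotted name, count references."""
            def __init__(self):
                self._objects = {}   # name -> [object, refcount]

            def acquire(self, dotted_name):
                module_name, _, attr = dotted_name.rpartition(".")
                if dotted_name not in self._objects:
                    try:
                        module = importlib.import_module(module_name)
                        obj = getattr(module, attr)
                    except (ImportError, AttributeError) as exc:
                        raise RuntimeError(f"cannot load component {dotted_name}") from exc
                    self._objects[dotted_name] = [obj, 0]
                entry = self._objects[dotted_name]
                entry[1] += 1
                return entry[0]

            def release(self, dotted_name):
                entry = self._objects[dotted_name]
                entry[1] -= 1
                if entry[1] == 0:
                    del self._objects[dotted_name]   # "destruction" once unreferenced

        registry = Registry()
        rand = registry.acquire("random.Random")   # load a standard-library class by name
        print(rand().random())
        registry.release("random.Random")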

  13. RAY-RAMSES: a code for ray tracing on the fly in N-body simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barreira, Alexandre; Llinares, Claudio; Bose, Sownak

    2016-05-01

We present a ray tracing code to compute integrated cosmological observables on the fly in AMR N-body simulations. Unlike conventional ray tracing techniques, our code takes full advantage of the time and spatial resolution attained by the N-body simulation by computing the integrals along the line of sight on a cell-by-cell basis through the AMR simulation grid. Moreover, since it runs on the fly in the N-body run, our code can produce maps of the desired observables without storing large (or any) amounts of data for post-processing. We implemented our routines in the RAMSES N-body code and tested the implementation using an example of a weak lensing simulation. We analyse basic statistics of lensing convergence maps and find good agreement with semi-analytical methods. The ray tracing methodology presented here can be used in several cosmological analyses, such as Sunyaev-Zel'dovich and integrated Sachs-Wolfe effect studies as well as modified gravity. Our code can also be used in cross-checks of the more conventional methods, which can be important in tests of theory systematics in preparation for upcoming large scale structure surveys.
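
    A schematic of the "integrate on the fly" idea: at each simulation step, whatever the rays currently see is folded into the running integral, so no snapshots need to be stored. The lensing kernel, prefactor, and density values below are placeholders, not the RAY-RAMSES implementation.

        # Hedged sketch: accumulate a line-of-sight integral step by step instead of
        # post-processing stored snapshots. All numbers are illustrative.
        import numpy as np

        n_los, n_steps = 1000, 50
        rng = np.random.default_rng(4)

        chi = np.linspace(50.0, 2000.0, n_steps)        # comoving distance per step (Mpc/h)
        d_chi = np.gradient(chi)
        chi_source = 2200.0
        kernel = chi * (chi_source - chi) / chi_source   # simplified lensing weight

        convergence = np.zeros(n_los)
        for step in range(n_steps):
            # In the real code this would be the density contrast interpolated from the
            # AMR cells each ray crosses at this step; here it is random.
            delta = rng.standard_normal(n_los) * 0.1
            convergence += kernel[step] * delta * d_chi[step] * 1e-5  # arbitrary prefactor

        print("mean kappa:", convergence.mean(), " rms kappa:", convergence.std())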

  14. Scalability study of parallel spatial direct numerical simulation code on IBM SP1 parallel supercomputer

    NASA Technical Reports Server (NTRS)

    Hanebutte, Ulf R.; Joslin, Ronald D.; Zubair, Mohammad

    1994-01-01

The implementation and the performance of a parallel spatial direct numerical simulation (PSDNS) code are reported for the IBM SP1 supercomputer. The spatially evolving disturbances that are associated with laminar-to-turbulent transition in three-dimensional boundary-layer flows are computed with the PSDNS code. By remapping the distributed data structure during the course of the calculation, optimized serial library routines can be utilized that substantially increase the computational performance. Although the remapping incurs a high communication penalty, the parallel efficiency of the code remains above 40% for all performed calculations. By using appropriate compile options and optimized library routines, the serial code achieves 52-56 Mflops on a single node of the SP1 (45% of theoretical peak performance). The actual performance of the PSDNS code on the SP1 is evaluated with a 'real world' simulation that consists of 1.7 million grid points. One time step of this simulation is calculated on eight nodes of the SP1 in the same time as required by a Cray Y/MP for the same simulation. The scalability information provides estimated computational costs that match the actual costs relative to changes in the number of grid points.

  15. Test code for the assessment and improvement of Reynolds stress models

    NASA Technical Reports Server (NTRS)

    Rubesin, M. W.; Viegas, J. R.; Vandromme, D.; Minh, H. HA

    1987-01-01

An existing two-dimensional, compressible flow, Navier-Stokes computer code, containing a full Reynolds stress turbulence model, was adapted for use as a test bed for assessing and improving turbulence models based on turbulence simulation experiments. To date, the results of using the code in comparison with simulated channel flow and flow over an oscillating flat plate have shown that the turbulence model used in the code needs improvement for these flows. It is also shown that direct simulations of turbulent flows over a range of Reynolds numbers are needed to guide subsequent improvement of turbulence models.

  16. Scalability of Several Asynchronous Many-Task Models for In Situ Statistical Analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe Pierre; Bennett, Janine Camille; Kolla, Hemanth

This report is a sequel to [PB16], in which we provided a first progress report on research and development towards a scalable, asynchronous many-task, in situ statistical analysis engine using the Legion runtime system. This earlier work included a prototype implementation of a proposed solution, using a proxy mini-application as a surrogate for a full-scale scientific simulation code. The first scalability studies were conducted with the above on modestly-sized experimental clusters. In contrast, in the current work we have integrated our in situ analysis engines with a full-size scientific application (S3D, using the Legion-SPMD model), and have conducted numerical tests on the largest computational platform currently available for DOE science applications. We also provide details regarding the design and development of a light-weight asynchronous collectives library. We describe how this library is utilized within our SPMD-Legion S3D workflow, and compare the data aggregation technique deployed herein to the approach taken within our previous work.

  17. Reproducible Computing: a new Technology for Statistics Education and Educational Research

    NASA Astrophysics Data System (ADS)

    Wessa, Patrick

    2009-05-01

This paper explains how the R Framework (http://www.wessa.net) and a newly developed Compendium Platform (http://www.freestatistics.org) allow us to create, use, and maintain documents that contain empirical research results which can be recomputed and reused in derived work. It is illustrated that this technological innovation can be used to create educational applications that can be shown to support effective learning of statistics and associated analytical skills. It is explained how a Compendium can be created by anyone, without the need to understand the technicalities of scientific word processing (LaTeX) or statistical computing (R code). The proposed Reproducible Computing system allows educational researchers to objectively measure key aspects of the actual learning process based on individual and constructivist activities such as: peer review, collaboration in research, computational experimentation, etc. The system was implemented and tested in three statistics courses in which Compendia were used to create an interactive e-learning environment that simulated the real-world process of empirical scientific research.

  18. Kameleon Live: An Interactive Cloud Based Analysis and Visualization Platform for Space Weather Researchers

    NASA Astrophysics Data System (ADS)

    Pembroke, A. D.; Colbert, J. A.

    2015-12-01

    The Community Coordinated Modeling Center (CCMC) provides hosting for many of the simulations used by the space weather community of scientists, educators, and forecasters. CCMC users may submit model runs through the Runs on Request system, which produces static visualizations of model output in the browser, while further analysis may be performed off-line via Kameleon, CCMC's cross-language access and interpolation library. Off-line analysis may be suitable for power-users, but storage and coding requirements present a barrier to entry for non-experts. Moreover, a lack of a consistent framework for analysis hinders reproducibility of scientific findings. To that end, we have developed Kameleon Live, a cloud based interactive analysis and visualization platform. Kameleon Live allows users to create scientific studies built around selected runs from the Runs on Request database, perform analysis on those runs, collaborate with other users, and disseminate their findings among the space weather community. In addition to showcasing these novel collaborative analysis features, we invite feedback from CCMC users as we seek to advance and improve on the new platform.

  19. Position Paper: Applying Machine Learning to Software Analysis to Achieve Trusted, Repeatable Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prowell, Stacy J; Symons, Christopher T

    2015-01-01

    Producing trusted results from high-performance codes is essential for policy and has significant economic impact. We propose combining rigorous analytical methods with machine learning techniques to achieve the goal of repeatable, trustworthy scientific computing.

  20. Benchmark Simulations of the Thermal-Hydraulic Responses during EBR-II Inherent Safety Tests using SAM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Rui; Sumner, Tyler S.

    2016-04-17

An advanced system analysis tool, SAM, is being developed for fast-running, improved-fidelity, whole-plant transient analyses at Argonne National Laboratory under DOE-NE's Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. As an important part of code development, companion validation activities are being conducted to ensure the performance and validity of the SAM code. This paper presents benchmark simulations of two EBR-II tests, SHRT-45R and BOP-302R, whose data are available through the support of DOE-NE's Advanced Reactor Technology (ART) program. The code predictions of major primary coolant system parameters are compared with the test results. Additionally, SAS4A/SASSYS-1 code simulation results are also included for a code-to-code comparison.

  1. A CellML simulation compiler and code generator using ODE solving schemes

    PubMed Central

    2012-01-01

Models written in description languages such as CellML are becoming a popular solution to the handling of complex cellular physiological models in biological function simulations. However, in order to fully simulate a model, boundary conditions and ordinary differential equation (ODE) solving schemes have to be combined with it. Though boundary conditions can be described in CellML, it is difficult to explicitly specify ODE solving schemes using existing tools. In this study, we define an ODE solving scheme description language based on XML and propose a code generation system for biological function simulations. In the proposed system, biological simulation programs using various ODE solving schemes can be easily generated. We designed a two-stage approach where the system generates the equation set associating the physiological model variable values at a certain time t with values at t + Δt in the first stage. The second stage generates the simulation code for the model. This approach enables the flexible construction of code generation modules that can support complex sets of formulas. We evaluate the relationship between models and their calculation accuracies by simulating complex biological models using various ODE solving schemes. Using the FHN model simulation, results showed good qualitative and quantitative correspondence with the theoretical predictions. Results for the Luo-Rudy 1991 model showed that only first order precision was achieved. In addition, running the generated code in parallel on a GPU made it possible to speed up the calculation time by a factor of 50. The CellML Compiler source code is available for download at http://sourceforge.net/projects/cellmlcompiler. PMID:23083065
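
    The separation between model and solving scheme can be sketched as follows: the FitzHugh-Nagumo (FHN) right-hand side is written once and handed to interchangeable stepping schemes, which is the kind of structure such a code generator would emit; the parameters below are standard textbook values, not those of the paper's generated code.

        # Hedged sketch: one model definition, two interchangeable ODE stepping schemes.
        import numpy as np

        def fhn(state, a=0.7, b=0.8, eps=0.08, i_ext=0.5):
            """FitzHugh-Nagumo right-hand side."""
            v, w = state
            return np.array([v - v**3 / 3 - w + i_ext, eps * (v + a - b * w)])

        def euler_step(f, y, dt):
            return y + dt * f(y)

        def rk4_step(f, y, dt):
            k1 = f(y)
            k2 = f(y + 0.5 * dt * k1)
            k3 = f(y + 0.5 * dt * k2)
            k4 = f(y + dt * k3)
            return y + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

        def integrate(step, dt=0.05, t_end=200.0):
            y, trace = np.array([-1.0, 1.0]), []
            for _ in range(int(t_end / dt)):
                y = step(fhn, y, dt)
                trace.append(y[0])
            return np.array(trace)

        for name, scheme in (("euler", euler_step), ("rk4", rk4_step)):
            v = integrate(scheme)
            print(f"{name:5s}: v range [{v.min():.3f}, {v.max():.3f}]")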

  2. TWANG-PIC, a novel gyro-averaged one-dimensional particle-in-cell code for interpretation of gyrotron experiments

    NASA Astrophysics Data System (ADS)

    Braunmueller, F.; Tran, T. M.; Vuillemin, Q.; Alberti, S.; Genoud, J.; Hogge, J.-Ph.; Tran, M. Q.

    2015-06-01

    A new gyrotron simulation code for simulating the beam-wave interaction using a monomode time-dependent self-consistent model is presented. The new code TWANG-PIC is derived from the trajectory-based code TWANG by describing the electron motion in a gyro-averaged one-dimensional Particle-In-Cell (PIC) approach. In comparison to common PIC-codes, it is distinguished by its computation speed, which makes its use in parameter scans and in experiment interpretation possible. A benchmark of the new code is presented as well as a comparative study between the two codes. This study shows that the inclusion of a time-dependence in the electron equations, as it is the case in the PIC-approach, is mandatory for simulating any kind of non-stationary oscillations in gyrotrons. Finally, the new code is compared with experimental results and some implications of the violated model assumptions in the TWANG code are disclosed for a gyrotron experiment in which non-stationary regimes have been observed and for a critical case that is of interest in high power gyrotron development.

  3. TWANG-PIC, a novel gyro-averaged one-dimensional particle-in-cell code for interpretation of gyrotron experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Braunmueller, F., E-mail: falk.braunmueller@epfl.ch; Tran, T. M.; Alberti, S.

A new gyrotron simulation code for simulating the beam-wave interaction using a monomode time-dependent self-consistent model is presented. The new code TWANG-PIC is derived from the trajectory-based code TWANG by describing the electron motion in a gyro-averaged one-dimensional Particle-In-Cell (PIC) approach. In comparison to common PIC-codes, it is distinguished by its computation speed, which makes its use in parameter scans and in experiment interpretation possible. A benchmark of the new code is presented as well as a comparative study between the two codes. This study shows that the inclusion of a time-dependence in the electron equations, as it is the case in the PIC-approach, is mandatory for simulating any kind of non-stationary oscillations in gyrotrons. Finally, the new code is compared with experimental results and some implications of the violated model assumptions in the TWANG code are disclosed for a gyrotron experiment in which non-stationary regimes have been observed and for a critical case that is of interest in high power gyrotron development.

  4. Cyclotron resonant scattering feature simulations. II. Description of the CRSF simulation process

    NASA Astrophysics Data System (ADS)

    Schwarm, F.-W.; Ballhausen, R.; Falkner, S.; Schönherr, G.; Pottschmidt, K.; Wolff, M. T.; Becker, P. A.; Fürst, F.; Marcu-Cheatham, D. M.; Hemphill, P. B.; Sokolova-Lapa, E.; Dauser, T.; Klochkov, D.; Ferrigno, C.; Wilms, J.

    2017-05-01

Context. Cyclotron resonant scattering features (CRSFs) are formed by scattering of X-ray photons off quantized plasma electrons in the strong magnetic field (of the order of 10^12 G) close to the surface of an accreting X-ray pulsar. Due to the complex scattering cross-sections, the line profiles of CRSFs cannot be described by an analytic expression. Numerical methods, such as Monte Carlo (MC) simulations of the scattering processes, are required in order to predict precise line shapes for a given physical setup, which can be compared to observations to gain information about the underlying physics in these systems. Aims: A versatile simulation code is needed for the generation of synthetic cyclotron lines, one that makes the simulation of sophisticated geometries possible for the first time. Methods: The simulation utilizes the mean free path tables described in the first paper of this series for the fast interpolation of propagation lengths. The code is parallelized to make the very time-consuming simulations possible on convenient time scales. Furthermore, it can generate responses to monoenergetic photon injections, producing Green's functions, which can be used later to generate spectra for arbitrary continua. Results: We develop a new simulation code to generate synthetic cyclotron lines for complex scenarios, allowing for unprecedented physical interpretation of the observed data. An associated XSPEC model implementation is used to fit synthetic line profiles to NuSTAR data of Cep X-4. The code has been developed with the main goal of overcoming previous geometrical constraints in MC simulations of CRSFs. By applying this code also to more simple, classic geometries used in previous works, we furthermore address issues of code verification and cross-comparison of various models. The XSPEC model and the Green's function tables are available online (see link in footnote, page 1).
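
    The Green's-function bookkeeping can be sketched with a toy response matrix: responses to monoenergetic injections are tabulated (here an invented Gaussian redistribution with an absorption notch) and then folded with any continuum to obtain an emergent spectrum. Nothing below uses the actual cyclotron-scattering tables.

        # Hedged sketch: fold a continuum through a tabulated monoenergetic response.
        import numpy as np

        e_in = np.linspace(10.0, 60.0, 200)            # injected photon energies (keV)
        e_out = np.linspace(10.0, 60.0, 200)           # emergent energy grid (keV)

        # Toy Green's function matrix G[j, i]: response at e_out[j] to a line at e_in[i]
        # (Gaussian redistribution plus an absorption notch near an invented 35 keV line).
        sigma = 1.5
        G = np.exp(-0.5 * ((e_out[:, None] - e_in[None, :]) / sigma) ** 2)
        G /= G.sum(axis=0, keepdims=True)
        G *= 1.0 - 0.6 * np.exp(-0.5 * ((e_out[:, None] - 35.0) / 2.0) ** 2)

        continuum = e_in ** -1.5 * np.exp(-e_in / 20.0)   # arbitrary cutoff power law
        spectrum = G @ continuum                           # fold continuum through the table

        depth = 1.0 - spectrum[np.argmin(np.abs(e_out - 35.0))] / np.interp(35.0, e_in, continuum)
        print("depth of the synthetic line feature:", depth)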

  5. Video Monitoring a Simulation-Based Quality Improvement Program in Bihar, India.

    PubMed

    Dyer, Jessica; Spindler, Hilary; Christmas, Amelia; Shah, Malay Bharat; Morgan, Melissa; Cohen, Susanna R; Sterne, Jason; Mahapatra, Tanmay; Walker, Dilys

    2018-04-01

    Simulation-based training has become an accepted clinical training andragogy in high-resource settings with its use increasing in low-resource settings. Video recordings of simulated scenarios are commonly used by facilitators. Beyond using the videos during debrief sessions, researchers can also analyze the simulation videos to quantify technical and nontechnical skills during simulated scenarios over time. Little is known about the feasibility and use of large-scale systems to video record and analyze simulation and debriefing data for monitoring and evaluation in low-resource settings. This manuscript describes the process of designing and implementing a large-scale video monitoring system. Mentees and Mentors were consented and all simulations and debriefs conducted at 320 Primary Health Centers (PHCs) were video recorded. The system design, number of video recordings, and inter-rater reliability of the coded videos were assessed. The final dataset included a total of 11,278 videos. Overall, a total of 2,124 simulation videos were coded and 183 (12%) were blindly double-coded. For the double-coded sample, the average inter-rater reliability (IRR) scores were 80% for nontechnical skills, and 94% for clinical technical skills. Among 4,450 long debrief videos received, 216 were selected for coding and all were double-coded. Data quality of simulation videos was found to be very good in terms of recorded instances of "unable to see" and "unable to hear" in Phases 1 and 2. This study demonstrates that video monitoring systems can be effectively implemented at scale in resource limited settings. Further, video monitoring systems can play several vital roles within program implementation, including monitoring and evaluation, provision of actionable feedback to program implementers, and assurance of program fidelity.
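
    The double-coding check reduces to standard inter-rater statistics; a minimal sketch with invented ratings computes percent agreement and Cohen's kappa.

        # Hedged sketch: percent agreement and Cohen's kappa for two raters scoring
        # the same items. The ratings below are invented, not study data.
        from collections import Counter

        rater_a = ["yes", "yes", "no", "yes", "no", "yes", "no", "no", "yes", "yes"]
        rater_b = ["yes", "no",  "no", "yes", "no", "yes", "no", "yes", "yes", "yes"]

        n = len(rater_a)
        agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / n

        # Expected agreement by chance, from each rater's marginal frequencies.
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        p_chance = sum(freq_a[c] / n * freq_b[c] / n for c in set(rater_a) | set(rater_b))
        kappa = (agreement - p_chance) / (1 - p_chance)

        print(f"percent agreement: {agreement:.0%}, Cohen's kappa: {kappa:.2f}")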

  6. Implementation and evaluation of a simulation curriculum for paediatric residency programs including just-in-time in situ mock codes

    PubMed Central

    Sam, Jonathan; Pierse, Michael; Al-Qahtani, Abdullah; Cheng, Adam

    2012-01-01

    OBJECTIVE: To develop, implement and evaluate a simulation-based acute care curriculum in a paediatric residency program using an integrated and longitudinal approach. DESIGN: Curriculum framework consisting of three modular, year-specific courses and longitudinal just-in-time, in situ mock codes. SETTING: Paediatric residency program at BC Children’s Hospital, Vancouver, British Columbia. INTERVENTIONS: The three year-specific courses focused on the critical first 5 min, complex medical management and crisis resource management, respectively. The just-in-time in situ mock codes simulated the acute deterioration of an existing ward patient, prepared the actual multidisciplinary code team, and primed the surrounding crisis support systems. Each curriculum component was evaluated with surveys using a five-point Likert scale. RESULTS: A total of 40 resident surveys were completed after each of the modular courses, and an additional 28 surveys were completed for the overall simulation curriculum. The highest Likert scores were for hands-on skill stations, immersive simulation environment and crisis resource management teaching. Survey results also suggested that just-in-time mock codes were realistic, reinforced learning, and prepared ward teams for patient deterioration. CONCLUSIONS: A simulation-based acute care curriculum was successfully integrated into a paediatric residency program. It provides a model for integrating simulation-based learning into other training programs, as well as a model for any hospital that wishes to improve paediatric resuscitation outcomes using just-in-time in situ mock codes. PMID:23372405

  7. Experimental benchmarking of a Monte Carlo dose simulation code for pediatric CT

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Samei, Ehsan; Yoshizumi, Terry; Colsher, James G.; Jones, Robert P.; Frush, Donald P.

    2007-03-01

    In recent years, there has been a desire to reduce CT radiation dose to children because of their susceptibility and prolonged risk for cancer induction. Concerns arise, however, as to the impact of dose reduction on image quality and thus potentially on diagnostic accuracy. To study the dose and image quality relationship, we are developing a simulation code to calculate organ dose in pediatric CT patients. To benchmark this code, a cylindrical phantom was built to represent a pediatric torso, which allows measurements of dose distributions from its center to its periphery. Dose distributions for axial CT scans were measured on a 64-slice multidetector CT (MDCT) scanner (GE Healthcare, Chalfont St. Giles, UK). The same measurements were simulated using a Monte Carlo code (PENELOPE, Universitat de Barcelona) with the applicable CT geometry including bowtie filter. The deviations between simulated and measured dose values were generally within 5%. To our knowledge, this work is one of the first attempts to compare measured radial dose distributions on a cylindrical phantom with Monte Carlo simulated results. It provides a simple and effective method for benchmarking organ dose simulation codes and demonstrates the potential of Monte Carlo simulation for investigating the relationship between dose and image quality for pediatric CT patients.

  8. Simulation of Combustion Systems with Realistic g-jitter

    NASA Technical Reports Server (NTRS)

    Mell, William E.; McGrattan, Kevin B.; Baum, Howard R.

    2003-01-01

    In this project a transient, fully three-dimensional computer simulation code was developed to simulate the effects of realistic g-jitter on a number of combustion systems. The simulation code is capable of simulating flame spread on a solid and nonpremixed or premixed gaseous combustion in nonturbulent flow with simple combustion models. Simple combustion models were used to preserve computational efficiency since this is meant to be an engineering code. Also, the use of sophisticated turbulence models was not pursued (a simple Smagorinsky type model can be implemented if deemed appropriate) because if flow velocities are large enough for turbulence to develop in a reduced gravity combustion scenario it is unlikely that g-jitter disturbances (in NASA's reduced gravity facilities) will play an important role in the flame dynamics. Acceleration disturbances of realistic orientation, magnitude, and time dependence can be easily included in the simulation. The simulation algorithm was based on techniques used in an existing large eddy simulation code which has successfully simulated fire dynamics in complex domains. A series of simulations with measured and predicted acceleration disturbances on the International Space Station (ISS) are presented. The results of this series of simulations suggested a passive isolation system and appropriate scheduling of crew activity would provide a sufficiently "quiet" acceleration environment for spherical diffusion flames.

  9. Neutrons Flux Distributions of the Pu-Be Source and its Simulation by the MCNP-4B Code

    NASA Astrophysics Data System (ADS)

    Faghihi, F.; Mehdizadeh, S.; Hadad, K.

The neutron fluence rate of a low-intensity Pu-Be source is measured by Neutron Activation Analysis (NAA) of 197Au foils. The neutron fluence rate distribution versus energy is also calculated using the MCNP-4B code based on the ENDF/B-V library. The combined theoretical and experimental work provides first-hand experience with the code and establishes its reliability for further research. In the theoretical investigation, an isotropic Pu-Be source with a cylindrical volume distribution is simulated and the relative neutron fluence rate versus energy is calculated using the MCNP-4B code. The fast and thermal neutron fluence rates obtained by the NAA method and by the MCNP code are compared.

  10. 3D Multispecies Nonlinear Perturbative Particle Simulation of Intense Nonneutral Particle Beams (Research supported by the Department of Energy and the Short Pulse Spallation Source Project and LANSCE Division of LANL.)

    NASA Astrophysics Data System (ADS)

    Qin, Hong; Davidson, Ronald C.; Lee, W. Wei-Li

    1999-11-01

    The Beam Equilibrium Stability and Transport (BEST) code, a 3D multispecies nonlinear perturbative particle simulation code, has been developed to study collective effects in intense charged particle beams described self-consistently by the Vlasov-Maxwell equations. A Darwin model is adopted for transverse electromagnetic effects. As a 3D multispecies perturbative particle simulation code, it provides several unique capabilities. Since the simulation particles are used to simulate only the perturbed distribution function and self-fields, the simulation noise is reduced significantly. The perturbative approach also enables the code to investigate different physics effects separately, as well as simultaneously. The code can be easily switched between linear and nonlinear operation, and used to study both linear stability properties and nonlinear beam dynamics. These features, combined with 3D and multispecies capabilities, provides an effective tool to investigate the electron-ion two-stream instability, periodically focused solutions in alternating focusing fields, and many other important problems in nonlinear beam dynamics and accelerator physics. Applications to the two-stream instability are presented.
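
    The noise advantage of simulating only the perturbed distribution can be illustrated with a toy density estimate: a full-f and a delta-f marker population recover the same small perturbation, but the delta-f weights carry only the perturbation, so the counting noise of the large background drops out. All numbers are illustrative and unrelated to BEST.

        # Hedged sketch: full-f versus delta-f estimates of delta_n = eps * n0 * cos(x).
        import numpy as np

        rng = np.random.default_rng(5)
        n0, eps, n_markers, n_cells = 1.0, 0.01, 200_000, 32
        length = 2 * np.pi
        edges = np.linspace(0, length, n_cells + 1)
        centers = 0.5 * (edges[1:] + edges[:-1])
        dx = length / n_cells
        g = n0 * length / n_markers                        # physical particles per marker

        # Full-f: sample markers from f = n0*(1 + eps*cos x) by rejection.
        x = rng.uniform(0, length, size=2 * n_markers)
        keep = rng.uniform(0, 1 + eps, size=x.size) < 1 + eps * np.cos(x)
        x_full = x[keep][:n_markers]
        counts, _ = np.histogram(x_full, bins=edges)
        dn_full = counts * g / dx - n0

        # Delta-f: markers sampled from the background f0; weights carry delta_f / f0.
        x_df = rng.uniform(0, length, size=n_markers)
        w = eps * np.cos(x_df)
        dn_df = np.histogram(x_df, bins=edges, weights=w)[0] * g / dx

        true = eps * n0 * np.cos(centers)
        print("full-f  RMS error:", np.sqrt(np.mean((dn_full - true) ** 2)))
        print("delta-f RMS error:", np.sqrt(np.mean((dn_df - true) ** 2)))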

  11. DBCC Software as Database for Collisional Cross-Sections

    NASA Astrophysics Data System (ADS)

    Moroz, Daniel; Moroz, Paul

    2014-10-01

    Interactions of species, such as atoms, radicals, molecules, electrons, and photons, in plasmas used for materials processing could be very complex, and many of them could be described in terms of collisional cross-sections. Researchers involved in plasma simulations must select reasonable cross-sections for collisional processes for implementing them into their simulation codes to be able to correctly simulate plasmas. However, collisional cross-section data are difficult to obtain, and, for some collisional processes, the cross-sections are still not known. Data on collisional cross-sections can be obtained from numerous sources including numerical calculations, experiments, journal articles, conference proceedings, scientific reports, various universities' websites, national labs and centers specifically devoted to collecting data on cross-sections. The cross-sections data received from different sources could be partial, corresponding to limited energy ranges, or could even not be in agreement. The DBCC software package was designed to help researchers in collecting, comparing, and selecting cross-sections, some of which could be constructed from others or chosen as defaults. This is important as different researchers may place trust in different cross-sections or in different sources. We will discuss the details of DBCC and demonstrate how it works and why it is beneficial to researchers working on plasma simulations.

  12. 41 CFR Appendix C to Chapter 301 - Standard Data Elements for Federal Travel [Traveler Identification

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... education, in scientific, professional, technical, mechanical, trade, clerical, fiscal, administrative, or... Data Elements for Federal Travel [Accounting & Certification] Group name Data elements Description Accounting Classification Accounting Code Agency accounting code. Non-Federal Source Indicator Per Diem...

  13. Three-dimensional simulation of triode-type MIG for 1 MW, 120 GHz gyrotron for ECRH applications

    NASA Astrophysics Data System (ADS)

    Singh, Udaybir; Kumar, Nitin; Kumar, Narendra; Kumar, Anil; Sinha, A. K.

    2012-01-01

In this paper, the three-dimensional simulation of a triode-type magnetron injection gun (MIG) for a 120 GHz, 1 MW gyrotron is presented. The operating voltages of the modulating anode and the accelerating anode are 57 kV and 80 kV, respectively. The high-order TE22,6 mode is selected as the operating mode and the electron beam is launched at the first radial maximum for fundamental beam-mode operation. The initial design is obtained by using the in-house developed code MIGSYN. The numerical simulation is performed by using the commercially available code CST Particle Studio (PS). The simulated results of the MIG obtained by using CST-PS are validated against the simulation codes EGUN and TRAK. The results for the design output parameters obtained by using these three codes are found to be in close agreement.

  14. A Novel Technique for Running the NASA Legacy Code LAPIN Synchronously With Simulations Developed Using Simulink

    NASA Technical Reports Server (NTRS)

    Vrnak, Daniel R.; Stueber, Thomas J.; Le, Dzu K.

    2012-01-01

    This report presents a method for running a dynamic legacy inlet simulation in concert with another dynamic simulation that uses a graphical interface. The legacy code, NASA's LArge Perturbation INlet (LAPIN) model, was coded using the FORTRAN 77 (The Portland Group, Lake Oswego, OR) programming language to run in a command shell similar to other applications that used the Microsoft Disk Operating System (MS-DOS) (Microsoft Corporation, Redmond, WA). Simulink (MathWorks, Natick, MA) is a dynamic simulation that runs on a modern graphical operating system. The product of this work has both simulations, LAPIN and Simulink, running synchronously on the same computer with periodic data exchanges. Implementing the method described in this paper avoided extensive changes to the legacy code and preserved its basic operating procedure. This paper presents a novel method that promotes inter-task data communication between the synchronously running processes.

  15. Code modernization and modularization of APEX and SWAT watershed simulation models

    USDA-ARS?s Scientific Manuscript database

SWAT (Soil and Water Assessment Tool) and APEX (Agricultural Policy / Environmental eXtender) are respectively large and small watershed simulation models derived from EPIC (Environmental Policy Integrated Climate), a field-scale agroecology simulation model. All three models are coded in FORTRAN an...

  16. Verifying a computational method for predicting extreme ground motion

    USGS Publications Warehouse

    Harris, R.A.; Barall, M.; Andrews, D.J.; Duan, B.; Ma, S.; Dunham, E.M.; Gabriel, A.-A.; Kaneko, Y.; Kase, Y.; Aagaard, Brad T.; Oglesby, D.D.; Ampuero, J.-P.; Hanks, T.C.; Abrahamson, N.

    2011-01-01

    In situations where seismological data is rare or nonexistent, computer simulations may be used to predict ground motions caused by future earthquakes. This is particularly practical in the case of extreme ground motions, where engineers of special buildings may need to design for an event that has not been historically observed but which may occur in the far-distant future. Once the simulations have been performed, however, they still need to be tested. The SCEC-USGS dynamic rupture code verification exercise provides a testing mechanism for simulations that involve spontaneous earthquake rupture. We have performed this examination for the specific computer code that was used to predict maximum possible ground motion near Yucca Mountain. Our SCEC-USGS group exercises have demonstrated that the specific computer code that was used for the Yucca Mountain simulations produces similar results to those produced by other computer codes when tackling the same science problem. We also found that the 3D ground motion simulations produced smaller ground motions than the 2D simulations.

  17. Three Dimensional Hybrid Simulations of Super-Alfvénic Laser Ablation Experiments in the Large Plasma Device

    NASA Astrophysics Data System (ADS)

    Clark, Stephen; Winske, Dan; Schaeffer, Derek; Everson, Erik; Bondarenko, Anton; Constantin, Carmen; Niemann, Christoph

    2014-10-01

We present 3D hybrid simulations of laser produced expanding debris clouds propagating through a magnetized ambient plasma in the context of magnetized collisionless shocks. New results from the 3D code are compared to previously obtained simulation results using a 2D hybrid code. The 3D code is an extension of a 2D code previously developed at Los Alamos National Laboratory. It has been parallelized and ported to execute on a cluster environment. The new simulations are used to verify scaling relationships, such as shock onset time and coupling parameter (Rm /ρd), developed via 2D simulations. Previous 2D results focus primarily on laboratory shock formation relevant to experiments being performed on the Large Plasma Device, where the shock propagates across the magnetic field. The new 3D simulations show wave structure and dynamics oblique to the magnetic field that introduce new physics to be considered in future experiments.

  18. Dakota Uncertainty Quantification Methods Applied to the CFD code Nek5000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delchini, Marc-Olivier; Popov, Emilian L.; Pointer, William David

    This report presents the state of advancement of a Nuclear Energy Advanced Modeling and Simulation (NEAMS) project to characterize the uncertainty of the computational fluid dynamics (CFD) code Nek5000 using the Dakota package for flows encountered in the nuclear engineering industry. Nek5000 is a high-order spectral element CFD code developed at Argonne National Laboratory for high-resolution spectral-filtered large eddy simulations (LESs) and unsteady Reynolds-averaged Navier-Stokes (URANS) simulations.

  19. Simulation of Weld Mechanical Behavior to Include Welding-Induced Residual Stress and Distortion: Coupling of SYSWELD and Abaqus Codes

    DTIC Science & Technology

    2015-11-01

Technical report by Charles R. Fisher, covering December 2013 – July 2015: Simulation of Weld Mechanical Behavior to Include Welding-Induced Residual Stress and Distortion: Coupling of SYSWELD and Abaqus Codes.

  20. Simulation of Weld Mechanical Behavior to Include Welding Induced Residual Stress and Distortion: Coupling of SYSWELD and Abaqus Codes

    DTIC Science & Technology

    2015-11-01

Technical report by Charles R. Fisher, covering December 2013 – July 2015: Simulation of Weld Mechanical Behavior to Include Welding-Induced Residual Stress and Distortion: Coupling of SYSWELD and Abaqus Codes.

  1. Creating and Testing Simulation Software

    NASA Technical Reports Server (NTRS)

    Heinich, Christina M.

    2013-01-01

The goal of this project is to learn about the software development process, specifically the process to test and fix components of the software. The paper will cover the techniques of testing code, and the benefits of using one style of testing over another. It will also discuss the overall software design and development lifecycle, and how code testing plays an integral role in it. Coding is notorious for always needing to be debugged due to coding errors or faulty program design. Writing tests either before or during program creation that cover all aspects of the code provides a relatively easy way to locate and fix errors, which will in turn decrease the necessity to fix a program after it is released for common use. The backdrop for this paper is the Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI), a project whose goal is to simulate a launch using simulated models of the ground systems and the connections between them and the control room. The simulations will be used for training and to ensure that all possible outcomes and complications are prepared for before the actual launch day. The code being tested is the Programmable Logic Controller Interface (PLCIF) code, the component responsible for transferring the information from the models to the model Programmable Logic Controllers (PLCs), basic computers that are used for very simple tasks.

  2. High dynamic range coding imaging system

    NASA Astrophysics Data System (ADS)

    Wu, Renfan; Huang, Yifan; Hou, Guangqi

    2014-10-01

    We present a high dynamic range (HDR) imaging system design scheme based on the coded aperture technique. The scheme yields HDR images with an extended depth of field. We adopt a sparse coding algorithm to design the coded patterns. We then use the sensor unit to acquire coded images under different exposure settings. Guided by the multiple exposure parameters, a series of low dynamic range (LDR) coded images is reconstructed. Existing algorithms are used to fuse those LDR images into an HDR image for display. We build an optical simulation model and generate simulation images to verify the proposed system.
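
    For context, a minimal sketch of the exposure-fusion step alone, assuming a linear sensor response and a simple hat weighting (this is not the coded-aperture or sparse-coding part of the system described above):

    ```python
    import numpy as np

    def merge_exposures(ldr_images, exposure_times):
        """Merge LDR frames (values in [0, 1]) into an HDR radiance estimate,
        assuming a linear response; mid-range pixels receive the most weight."""
        ldr = np.stack(ldr_images).astype(np.float64)             # shape (K, H, W)
        t = np.asarray(exposure_times, dtype=np.float64)[:, None, None]
        weights = 1.0 - np.abs(2.0 * ldr - 1.0)                   # hat weighting
        return np.sum(weights * ldr / t, axis=0) / (np.sum(weights, axis=0) + 1e-12)

    # Illustrative example with three synthetic exposures of the same scene
    rng = np.random.default_rng(0)
    scene = rng.uniform(0.0, 4.0, size=(8, 8))                    # "true" radiance
    times = [0.1, 0.4, 1.6]
    frames = [np.clip(scene * t, 0.0, 1.0) for t in times]
    hdr = merge_exposures(frames, times)
    ```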

  3. The Plasma Simulation Code: A modern particle-in-cell code with patch-based load-balancing

    NASA Astrophysics Data System (ADS)

    Germaschewski, Kai; Fox, William; Abbott, Stephen; Ahmadi, Narges; Maynard, Kristofor; Wang, Liang; Ruhl, Hartmut; Bhattacharjee, Amitava

    2016-08-01

    This work describes the Plasma Simulation Code (PSC), an explicit, electromagnetic particle-in-cell code with support for different-order particle shape functions. We review the basic components of the particle-in-cell method as well as the computational architecture of PSC, which supports modular algorithms and data structures. We then describe and analyze in detail a distinguishing feature of PSC: patch-based load balancing using space-filling curves, which is shown to lead to major efficiency gains over unbalanced methods and over a previously used, simpler balancing method.
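
    To illustrate the general idea behind patch-based balancing with a space-filling curve (a hedged sketch, not PSC's implementation): order the patches along a Morton/Z-order curve, then split the ordered list into contiguous segments of roughly equal work.

    ```python
    def morton_index(ix, iy, bits=16):
        """Interleave the bits of (ix, iy) to obtain a Z-order (Morton) index."""
        code = 0
        for b in range(bits):
            code |= ((ix >> b) & 1) << (2 * b)
            code |= ((iy >> b) & 1) << (2 * b + 1)
        return code

    def balance_patches(patch_loads, nranks):
        """patch_loads maps (ix, iy) -> estimated work (e.g. particle count).
        Returns a map (ix, iy) -> rank, assigning contiguous curve segments."""
        ordered = sorted(patch_loads, key=lambda p: morton_index(*p))
        total = sum(patch_loads.values())
        assignment, rank, acc = {}, 0, 0.0
        for patch in ordered:
            assignment[patch] = rank
            acc += patch_loads[patch]
            if acc >= total * (rank + 1) / nranks and rank < nranks - 1:
                rank += 1
        return assignment

    # Example: a 4x4 patch grid with uneven loads distributed over 3 ranks
    loads = {(i, j): 100 + 50 * i * j for i in range(4) for j in range(4)}
    print(balance_patches(loads, 3))
    ```

    Because the Morton curve keeps nearby patches close in the ordering, contiguous segments stay spatially compact, which keeps ghost-cell communication local while equalizing work across ranks.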

  4. Coupled Kinetic-MHD Simulations of Divertor Heat Load with ELM Perturbations

    NASA Astrophysics Data System (ADS)

    Cummings, Julian; Chang, C. S.; Park, Gunyoung; Sugiyama, Linda; Pankin, Alexei; Klasky, Scott; Podhorszki, Norbert; Docan, Ciprian; Parashar, Manish

    2010-11-01

    The effect of Type-I ELM activity on divertor plate heat load is a key component of the DOE OFES Joint Research Target milestones for this year. In this talk, we present simulations of kinetic edge physics, ELM activity, and the associated divertor heat loads in which we couple the discrete guiding-center neoclassical transport code XGC0 with the nonlinear extended MHD code M3D using the End-to-end Framework for Fusion Integrated Simulations, or EFFIS. In these coupled simulations, the kinetic code and the MHD code run concurrently on the same massively parallel platform and periodic data exchanges are performed using a memory-to-memory coupling technology provided by EFFIS. The M3D code models the fast ELM event and sends frequent updates of the magnetic field perturbations and electrostatic potential to XGC0, which in turn tracks particle dynamics under the influence of these perturbations and collects divertor particle and energy flux statistics. We describe here how EFFIS technologies facilitate these coupled simulations and discuss results for DIII-D, NSTX and Alcator C-Mod tokamak discharges.

  5. Parallelized direct execution simulation of message-passing parallel programs

    NASA Technical Reports Server (NTRS)

    Dickens, Phillip M.; Heidelberger, Philip; Nicol, David M.

    1994-01-01

    As massively parallel computers proliferate, there is growing interest in finding ways by which the performance of massively parallel codes can be efficiently predicted. This problem arises in diverse contexts such as parallelizing compilers, parallel performance monitoring, and parallel algorithm development. In this paper we describe one solution, in which one directly executes the application code but uses a discrete-event simulator to model details of the presumed parallel machine, such as operating system and communication network behavior. Because this approach is computationally expensive, we are interested in its own parallelization, specifically the parallelization of the discrete-event simulator. We describe methods suitable for parallelized direct execution simulation of message-passing parallel programs, and report on the performance of such a system, the Large Application Parallel Simulation Environment (LAPSE), which we have built on the Intel Paragon. On all codes measured to date, LAPSE predicts performance well, typically within 10 percent relative error. Depending on the nature of the application code, we have observed low slowdowns (relative to natively executing code) and high relative speedups using up to 64 processors.

  6. Umbra (core)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, Jon David; Oppel III, Fred J.; Hart, Brian E.

    Umbra is a flexible simulation framework for complex systems that can be used by itself for modeling, simulation, and analysis, or to create specific applications. It has been applied to many operations, primarily dealing with robotics and system-of-systems simulations. This version, from 4.8 to 4.8.3b, incorporates bug fixes, refactored code, and new managed C++ wrapper code that can be used to bridge new applications written in C# to the C++ libraries. The new managed C++ wrapper code includes the projects/directories BasicSimulation, CSharpUmbraInterpreter, LogFileView, UmbraAboutBox, UmbraControls, UmbraMonitor and UmbraWrapper.

  7. GAPD: a GPU-accelerated atom-based polychromatic diffraction simulation code.

    PubMed

    E, J C; Wang, L; Chen, S; Zhang, Y Y; Luo, S N

    2018-03-01

    GAPD, a graphics-processing-unit (GPU)-accelerated atom-based polychromatic diffraction simulation code for direct, kinematics-based, simulations of X-ray/electron diffraction of large-scale atomic systems with mono-/polychromatic beams and arbitrary plane detector geometries, is presented. This code implements GPU parallel computation via both real- and reciprocal-space decompositions. With GAPD, direct simulations are performed of the reciprocal lattice node of ultralarge systems (∼5 billion atoms) and diffraction patterns of single-crystal and polycrystalline configurations with mono- and polychromatic X-ray beams (including synchrotron undulator sources), and validation, benchmark and application cases are presented.
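
    The kinematic approximation underlying such codes can be summarized in a few lines; the hedged sketch below evaluates I(q) = |sum_j f_j exp(i q·r_j)|^2 for a small monochromatic test case with a constant form factor (no GPU acceleration, detector geometry, or polychromatic spectrum as in GAPD itself):

    ```python
    import numpy as np

    def kinematic_intensity(positions, q_vectors, form_factor=1.0):
        """Kinematic (single-scattering) diffraction intensity for each q vector."""
        phases = q_vectors @ positions.T                  # shape (Nq, Natoms)
        amplitude = form_factor * np.exp(1j * phases).sum(axis=1)
        return np.abs(amplitude) ** 2

    # Illustrative example: a tiny cubic crystallite probed along one reciprocal axis
    a = 4.05                                              # lattice constant (arbitrary units)
    grid = np.arange(5)
    positions = a * np.array([[i, j, k] for i in grid for j in grid for k in grid])
    q = np.linspace(0.1, 4.0, 200)
    q_vectors = np.column_stack([q, np.zeros_like(q), np.zeros_like(q)])
    intensity = kinematic_intensity(positions, q_vectors)
    ```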

  8. The NEST Dry-Run Mode: Efficient Dynamic Analysis of Neuronal Network Simulation Code.

    PubMed

    Kunkel, Susanne; Schenck, Wolfram

    2017-01-01

    NEST is a simulator for spiking neuronal networks that commits to a general purpose approach: It allows for high flexibility in the design of network models, and its applications range from small-scale simulations on laptops to brain-scale simulations on supercomputers. Hence, developers need to test their code for various use cases and ensure that changes to code do not impair scalability. However, running a full set of benchmarks on a supercomputer takes up precious compute-time resources and can entail long queuing times. Here, we present the NEST dry-run mode, which enables comprehensive dynamic code analysis without requiring access to high-performance computing facilities. A dry-run simulation is carried out by a single process, which performs all simulation steps except communication as if it was part of a parallel environment with many processes. We show that measurements of memory usage and runtime of neuronal network simulations closely match the corresponding dry-run data. Furthermore, we demonstrate the successful application of the dry-run mode in the areas of profiling and performance modeling.

  9. The NEST Dry-Run Mode: Efficient Dynamic Analysis of Neuronal Network Simulation Code

    PubMed Central

    Kunkel, Susanne; Schenck, Wolfram

    2017-01-01

    NEST is a simulator for spiking neuronal networks that commits to a general purpose approach: It allows for high flexibility in the design of network models, and its applications range from small-scale simulations on laptops to brain-scale simulations on supercomputers. Hence, developers need to test their code for various use cases and ensure that changes to code do not impair scalability. However, running a full set of benchmarks on a supercomputer takes up precious compute-time resources and can entail long queuing times. Here, we present the NEST dry-run mode, which enables comprehensive dynamic code analysis without requiring access to high-performance computing facilities. A dry-run simulation is carried out by a single process, which performs all simulation steps except communication as if it was part of a parallel environment with many processes. We show that measurements of memory usage and runtime of neuronal network simulations closely match the corresponding dry-run data. Furthermore, we demonstrate the successful application of the dry-run mode in the areas of profiling and performance modeling. PMID:28701946

  10. Exploring the Lived Experiences of Participants in Simulation-Based Learning Activities

    ERIC Educational Resources Information Center

    Beard, Rachael

    2013-01-01

    There is currently a small body of research on the experiences of participants, both facilitators and learners, during simulated mock codes (cardiac arrest) in the healthcare setting. This study was based on a practitioner's concerns that mock codes are facilitated differently among educators, mock codes are not aligned with andragogy theory of…

  11. Smoothed Particle Hydrodynamic Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-10-05

    This code is a highly modular framework for developing smoothed particle hydrodynamic (SPH) simulations running on parallel platforms. The compartmentalization of the code allows for rapid development of new SPH applications and modifications of existing algorithms. The compartmentalization also allows changes in one part of the code used by many applications to instantly be made available to all applications.

  12. Enhanced 2/3 four-ary modulation code using soft-decision Viterbi decoding for four-level holographic data storage systems

    NASA Astrophysics Data System (ADS)

    Kong, Gyuyeol; Choi, Sooyong

    2017-09-01

    An enhanced 2/3 four-ary modulation code using soft-decision Viterbi decoding is proposed for four-level holographic data storage systems. While previous four-ary modulation codes focus on preventing the worst two-dimensional intersymbol interference patterns, the proposed four-ary modulation code aims to maximize the coding gain for better bit error rate performance. To achieve significant coding gains from the four-ary modulation code, we design a new 2/3 four-ary modulation code that enlarges the free distance on the trellis, found through extensive simulation. The free distance of the proposed four-ary modulation code is extended from 1.21 to 2.04 compared with that of the conventional four-ary modulation code. Simulation results show that the proposed four-ary modulation code achieves a gain of more than 1 dB over the conventional four-ary modulation code.

  13. Development of a dynamic coupled hydro-geomechanical code and its application to induced seismicity

    NASA Astrophysics Data System (ADS)

    Miah, Md Mamun

    This research describes the importance of hydro-geomechanical coupling in the geologic subsurface environment arising from fluid injection at geothermal plants, large-scale geological CO2 sequestration for climate mitigation, enhanced oil recovery, and hydraulic fracturing during well construction in the oil and gas industries. A sequential computational code is developed to capture the multiphysics interaction by linking the flow simulation code TOUGH2 and the geomechanics modeling code PyLith. The numerical formulation of each code is discussed to demonstrate its modeling capabilities. The computational framework involves sequential coupling and the solution of two sub-problems: fluid flow through fractured and porous media, and reservoir geomechanics. For each time step of the flow calculation, the pressure field is passed to the geomechanics code to compute the effective stress field and fault slip. A simplified permeability model is implemented in the code that accounts for the permeability of porous and saturated rocks subject to confining stresses. The accuracy of the TOUGH-PyLith coupled simulator is tested by simulating Terzaghi's 1D consolidation problem. The coupled poroelastic modeling capability is validated by benchmarking it against Mandel's problem. The code is used to simulate both quasi-static and dynamic earthquake nucleation and slip distribution on a fault resulting from the combined effect of far-field tectonic loading and fluid injection, using an appropriate fault constitutive friction model. Results from the quasi-static induced earthquake simulations show a delayed response in earthquake nucleation. This is attributed to the increased total stress in the domain and to not accounting for pressure on the fault. However, this issue is resolved in the final chapter in simulating a single-event earthquake dynamic rupture. Simulation results show that fluid pressure has a positive effect on slip nucleation and subsequent crack propagation. This is confirmed by a sensitivity analysis showing that an increase in injection well distance results in delayed slip nucleation and rupture propagation on the fault.
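
    A schematic of the sequential coupling loop described above, in hedged form (all functions are stand-ins rather than the TOUGH2 or PyLith interfaces): advance the flow solver one step, hand the pore-pressure field to the mechanics solver, form effective stresses, and test a Coulomb friction criterion on the fault.

    ```python
    import numpy as np

    def flow_step(pressure, dt):
        """Stand-in for the flow solver: injection raises pressure near cell 0."""
        pressure = pressure.copy()
        pressure[0] += 0.5 * dt
        return pressure + 0.1 * dt * np.gradient(np.gradient(pressure))

    def mechanics_step(pressure):
        """Stand-in for the geomechanics solver returning fault tractions."""
        return {"normal": 10.0 + 0.0 * pressure, "shear": 5.5 + 0.2 * pressure}

    def run_coupled(nsteps, dt, alpha=1.0, mu=0.6, cohesion=0.0):
        pressure = np.zeros(100)                       # pore pressure on a shared grid
        for step in range(nsteps):
            pressure = flow_step(pressure, dt)         # flow sub-problem
            stress = mechanics_step(pressure)          # mechanics sub-problem
            effective_normal = stress["normal"] - alpha * pressure
            slipping = stress["shear"] > cohesion + mu * effective_normal
            if slipping.any():
                print(f"step {step}: slip nucleates on {slipping.sum()} fault cells")

    run_coupled(nsteps=50, dt=1.0)
    ```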

  14. Combining high performance simulation, data acquisition, and graphics display computers

    NASA Technical Reports Server (NTRS)

    Hickman, Robert J.

    1989-01-01

    Issues involved in the continuing development of an advanced simulation complex are discussed. This approach provides the capability to perform the majority of tests on advanced systems non-destructively. The controlled test environments can be replicated to examine the response of the systems under test to alternative treatments of the system control design, or to test the function and qualification of specific hardware. Field tests verify that the elements simulated in the laboratories are sufficient. The digital computer is hosted by a Digital Equipment Corp. MicroVAX computer, with an Aptec Computer Systems Model 24 I/O computer performing the communication function. An Applied Dynamics International AD100 performs the high-speed simulation computing, and an Evans and Sutherland PS350 performs on-line graphics display. A Scientific Computer Systems SCS40 acts as a high-performance FORTRAN program processor to support the complex by generating, from programs coded in FORTRAN, numerous large files that are required for the real-time processing. Four programming languages are involved in the process: FORTRAN, ADSIM, ADRIO, and STAPLE. FORTRAN is employed on the MicroVAX host to initialize and terminate the simulation runs on the system. The generation of the data files on the SCS40 is also performed with FORTRAN programs. ADSIM and ADRIO are used to program the processing elements of the AD100 and its IOCP processor. STAPLE is used to program the Aptec DIP and DIA processors.

  15. Parallel Grand Canonical Monte Carlo (ParaGrandMC) Simulation Code

    NASA Technical Reports Server (NTRS)

    Yamakov, Vesselin I.

    2016-01-01

    This report provides an overview of the Parallel Grand Canonical Monte Carlo (ParaGrandMC) simulation code. This is a highly scalable parallel FORTRAN code for simulating the thermodynamic evolution of metal alloy systems at the atomic level and predicting the thermodynamic state, phase diagram, chemical composition and mechanical properties. The code is designed to simulate multi-component alloy systems and to predict solid-state phase transformations such as austenite-martensite transformations, precipitate formation, recrystallization, capillary effects at interfaces, surface absorption, etc., which can aid the design of novel metallic alloys. While the software is mainly tailored for modeling metal alloys, it can also be used for other types of solid-state systems, and to some degree for liquid or gaseous systems, including multiphase systems forming solid-liquid-gas interfaces.
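
    As a hedged illustration of the kind of composition-changing Monte Carlo move such a code performs (a semi-grand-canonical transmutation step with a Metropolis acceptance rule; the numbers and the energy change are placeholders, not ParaGrandMC internals):

    ```python
    import math
    import random

    def accept_transmutation(delta_energy, delta_mu, temperature, kb=8.617e-5):
        """Metropolis acceptance for swapping an atom's species (A -> B).

        delta_energy : change in potential energy (eV)
        delta_mu     : chemical-potential difference mu_B - mu_A (eV)
        temperature  : kelvin; kb is Boltzmann's constant in eV/K
        """
        beta = 1.0 / (kb * temperature)
        return random.random() < min(1.0, math.exp(-beta * (delta_energy - delta_mu)))

    # Illustrative sweep over trial species flips with a toy energy distribution
    random.seed(1)
    accepted = sum(accept_transmutation(random.gauss(0.05, 0.1), 0.02, 600.0)
                   for _ in range(10_000))
    print(f"acceptance ratio: {accepted / 10_000:.2f}")
    ```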

  16. Some issues and subtleties in numerical simulation of X-ray FEL's

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fawley, William M.

    Part of the overall design effort for x-ray FEL's such as the LCLS and TESLA projects has involved extensive use of particle simulation codes to predict their output performance and underlying sensitivity to various input parameters (e.g. electron beam emittance). This paper discusses some of the numerical issues that must be addressed by simulation codes in this regime. We first give a brief overview of the standard approximations and simulation methods adopted by time-dependent (i.e. polychromatic) codes such as GINGER, GENESIS, and FAST3D, including the effects of temporal discretization and the resultant limited spectral bandpass, and then discuss the accuracies and inaccuracies of these codes in predicting incoherent spontaneous emission (i.e. the extremely low gain regime).

  17. Exploring the Ability of a Coarse-grained Potential to Describe the Stress-strain Response of Glassy Polystyrene

    DTIC Science & Technology

    2012-10-01

    Fragmentary excerpts indexed for this record note that the simulations were performed using the open-source code Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) (http://lammps.sandia.gov), that the commercial ... parameters are proprietary and cannot be ported to the LAMMPS simulation code, and that coarse-graining is performed by iterative Boltzmann inversion (IBI) from molecular dynamics simulations at the atomistic resolution using the Materials Processes and Simulations (MAPS) environment.

  18. plasmaFoam: An OpenFOAM framework for computational plasma physics and chemistry

    NASA Astrophysics Data System (ADS)

    Venkattraman, Ayyaswamy; Verma, Abhishek Kumar

    2016-09-01

    As emphasized in the 2012 Roadmap for low temperature plasmas (LTP), scientific computing has emerged as an essential tool for the investigation and prediction of the fundamental physical and chemical processes associated with these systems. While several in-house and commercial codes exist, each having its own advantages and disadvantages, a common framework that can be developed by researchers from all over the world will likely accelerate the impact of computational studies on advances in low-temperature plasma physics and chemistry. In this regard, we present a finite volume computational toolbox to perform high-fidelity simulations of LTP systems. This framework, primarily based on the OpenFOAM solver suite, allows us to enhance our understanding of multiscale plasma phenomena by performing massively parallel, three-dimensional simulations on unstructured meshes using well-established high performance computing tools that are widely used in the computational fluid dynamics community. In this talk, we will present preliminary results obtained using the OpenFOAM-based solver suite with benchmark three-dimensional simulations of microplasma devices including both dielectric and plasma regions. We will also discuss the future outlook for the solver suite.

  19. The Cell Collective: Toward an open and collaborative approach to systems biology

    PubMed Central

    2012-01-01

    Background Despite decades of new discoveries in biomedical research, the overwhelming complexity of cells has been a significant barrier to a fundamental understanding of how cells work as a whole. As such, the holistic study of biochemical pathways requires computer modeling. Due to the complexity of cells, it is not feasible for one person or group to model the cell in its entirety. Results The Cell Collective is a platform that allows the world-wide scientific community to create these models collectively. Its interface enables users to build and use models without specifying any mathematical equations or computer code - addressing one of the major hurdles with computational research. In addition, this platform allows scientists to simulate and analyze the models in real-time on the web, including the ability to simulate loss/gain of function and test what-if scenarios in real time. Conclusions The Cell Collective is a web-based platform that enables laboratory scientists from across the globe to collaboratively build large-scale models of various biological processes, and simulate/analyze them in real time. In this manuscript, we show examples of its application to a large-scale model of signal transduction. PMID:22871178

  20. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab-type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world; thus the research centered on verification of the need for replication and on methodologies for generating replicated data in a cost-effective manner. The context of the debugging graph was pursued by simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data, which is then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens which are in AIR-LAB to measure the performance of reliability models.

  1. Improving fast generation of halo catalogues with higher order Lagrangian perturbation theory

    NASA Astrophysics Data System (ADS)

    Munari, Emiliano; Monaco, Pierluigi; Sefusatti, Emiliano; Castorina, Emanuele; Mohammad, Faizan G.; Anselmi, Stefano; Borgani, Stefano

    2017-03-01

    We present the latest version of PINOCCHIO, a code that generates catalogues of dark matter haloes in an approximate but fast way with respect to an N-body simulation. This code version implements a new on-the-fly production of halo catalogues on the past light cone with continuous time sampling, and the computation of particle and halo displacements is extended up to third-order Lagrangian perturbation theory (LPT), in contrast with previous versions that used the Zel'dovich approximation. We run PINOCCHIO on the same initial configuration as a reference N-body simulation, so that the comparison extends to the object-by-object level. We consider haloes at redshifts 0 and 1, using different LPT orders either for halo construction or to compute halo final positions. We compare the clustering properties of PINOCCHIO haloes with those from the simulation by computing the power spectrum and two-point correlation function in real and redshift space (monopole and quadrupole), the bispectrum and the phase difference of halo distributions. We find that 2LPT and 3LPT give a noticeable improvement. 3LPT provides the best agreement with the N-body simulation when it is used to displace haloes, while 2LPT gives better results for constructing haloes. At the highest orders, linear bias is typically recovered at the few per cent level. In Fourier space and using 3LPT for halo displacements, the halo power spectrum is recovered to within 10 per cent up to kmax ∼ 0.5 h Mpc⁻¹. The results presented in this paper have interesting implications for the generation of large ensembles of mock surveys for the scientific exploitation of data from big surveys.
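
    For reference, the Lagrangian perturbation theory expansion that the displacement calculation is based on can be written schematically as follows (a hedged summary of the standard LPT form, with D_1, D_2, D_3 the growth factors of the successive orders; truncation at the first term recovers the Zel'dovich approximation):

    ```latex
    \mathbf{x}(\mathbf{q},t) = \mathbf{q} + \boldsymbol{\Psi}(\mathbf{q},t), \qquad
    \boldsymbol{\Psi}(\mathbf{q},t) \simeq D_1(t)\,\boldsymbol{\Psi}^{(1)}(\mathbf{q})
      + D_2(t)\,\boldsymbol{\Psi}^{(2)}(\mathbf{q})
      + D_3(t)\,\boldsymbol{\Psi}^{(3)}(\mathbf{q})
    ```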

  2. Global MHD simulation of magnetosphere using HPF

    NASA Astrophysics Data System (ADS)

    Ogino, T.

    We have translated a 3-dimensional magnetohydrodynamic (MHD) simulation code of the Earth's magnetosphere from VPP Fortran to HPF/JA on the Fujitsu VPP5000/56 vector-parallel supercomputer; the MHD code was fully vectorized and fully parallelized in VPP Fortran. The overall performance and capability of the HPF MHD code were shown to be almost comparable to those of the VPP Fortran version. A 3-dimensional global MHD simulation of the Earth's magnetosphere was performed at a speed of over 400 Gflops with an efficiency of 76.5% using 56 PEs of the Fujitsu VPP5000/56 in vector and parallel computation, permitting comparison with catalog values. We conclude that fluid and MHD codes that are fully vectorized and fully parallelized in VPP Fortran can be translated with relative ease to HPF/JA, and that a code in HPF/JA may be expected to perform comparably to the same code written in VPP Fortran.

  3. WEC3: Wave Energy Converter Code Comparison Project: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien

    This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases. Phase I consists of code-to-code verification and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency-domain modelling tools were not included in the WEC3 project.

  4. Nonlinear to Linear Elastic Code Coupling in 2-D Axisymmetric Media.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Preston, Leiph

    Explosions within the earth nonlinearly deform the local media, but at typical seismological observation distances, the seismic waves can be considered linear. Although nonlinear algorithms can simulate explosions in the very near field well, these codes are computationally expensive and inaccurate at propagating these signals to great distances. A linearized wave propagation code, coupled to a nonlinear code, provides an efficient mechanism to both accurately simulate the explosion itself and to propagate these signals to distant receivers. To this end we have coupled Sandia's nonlinear simulation algorithm CTH to a linearized elastic wave propagation code for 2-D axisymmetric media (axiElasti) by passing information from the nonlinear to the linear code via time-varying boundary conditions. In this report, we first develop the 2-D axisymmetric elastic wave equations in cylindrical coordinates. Next we show how we design the time-varying boundary conditions passing information from CTH to axiElasti, and finally we demonstrate the coupling code via a simple study of the elastic radius.
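
    For orientation, the 2-D axisymmetric elastodynamic momentum equations in cylindrical coordinates (r, z) that such a linear code solves take the standard form below (a hedged restatement of textbook elastodynamics with no theta dependence and body forces omitted, not a quotation from the report):

    ```latex
    \rho\,\frac{\partial^{2} u_{r}}{\partial t^{2}}
      = \frac{\partial \sigma_{rr}}{\partial r}
      + \frac{\partial \sigma_{rz}}{\partial z}
      + \frac{\sigma_{rr}-\sigma_{\theta\theta}}{r},
    \qquad
    \rho\,\frac{\partial^{2} u_{z}}{\partial t^{2}}
      = \frac{\partial \sigma_{rz}}{\partial r}
      + \frac{\partial \sigma_{zz}}{\partial z}
      + \frac{\sigma_{rz}}{r}
    ```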

  5. Reply to comment by Añel on "Most computational hydrology is not reproducible, so is it really science?"

    NASA Astrophysics Data System (ADS)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-03-01

    In this article, we reply to a comment made on our previous commentary regarding reproducibility in computational hydrology. Software licensing and version control of code are important technical aspects of making the code and workflows of scientific experiments open and reproducible. However, in our view, it is cultural change that is the greatest challenge to overcome in achieving reproducible scientific research in computational hydrology. We believe that, as the culture and attitude among hydrological scientists change, the details will evolve to cover more (technical) aspects over time.

  6. Developing Information Power Grid Based Algorithms and Software

    NASA Technical Reports Server (NTRS)

    Dongarra, Jack

    1998-01-01

    This was an exploratory study to enhance our understanding of problems involved in developing large-scale applications in a heterogeneous distributed environment. It is likely that the large-scale applications of the future will be built by coupling specialized computational modules together. For example, efforts now exist to couple ocean and atmospheric prediction codes to simulate a more complete climate system. These two applications differ in many respects. They have different grids, the data is in different unit systems, and the algorithms for integrating in time are different. In addition, the code for each application is likely to have been developed on different architectures and tends to have poor performance when run on an architecture for which the code was not designed, if it runs at all. Architectural differences may also induce differences in data representation, which affect precision and convergence criteria as well as data transfer issues. In order to couple such dissimilar codes, some form of translation must be present. This translation should be able to handle interpolation from one grid to another as well as construction of the correct data field in the correct units from available data. Even if a code is to be developed from scratch, a modular approach will likely be followed, in that standard scientific packages will be used to do the more mundane tasks such as linear algebra or Fourier transform operations. This approach allows the developers to concentrate on their science rather than becoming experts in linear algebra or signal processing. Problems associated with this development approach include difficulties associated with data extraction and translation from one module to another, module performance on different nodal architectures, and others. In addition to these data and software issues, there exist operational issues such as platform stability and resource management.

  7. Reproducibility and Transparency in Ocean-Climate Modeling

    NASA Astrophysics Data System (ADS)

    Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.

    2015-12-01

    Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation, achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and variety of supercomputing technology. We have made progress on this problem in the context of a large project - the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but for the entire experiment working directory, including configuration (run-time parameters, component versions), input data and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets we provide the tools for generating these from the sources, rather than provide raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style where we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility make using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model we provide (version controlled) digital notebooks that illustrate and record analysis of output. This has the dual role of providing a gross, platform-independent testing capability and a means to document model output and analysis.
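
    A minimal sketch of the output-checksum idea mentioned above (assuming a generic experiment directory layout; the path and manifest name are illustrative, not part of the MOM6/SIS2 tooling):

    ```python
    import hashlib
    import json
    from pathlib import Path

    def checksum_outputs(output_dir, manifest_path="checksums.json"):
        """Record a SHA-256 checksum for every file under an experiment's output
        directory, so that solution-changing updates show up as manifest diffs
        under version control."""
        manifest = {}
        for path in sorted(Path(output_dir).rglob("*")):
            if path.is_file():
                manifest[str(path)] = hashlib.sha256(path.read_bytes()).hexdigest()
        Path(manifest_path).write_text(json.dumps(manifest, indent=2))
        return manifest

    # Illustrative call: checksum_outputs("experiments/ocean_only/run001")
    ```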

  8. Further Studies of the NRL Collective Particle Accelerator VIA Numerical Modeling with the MAGIC Code.

    DTIC Science & Technology

    1984-08-01

    Final report on further studies of the NRL collective particle accelerator via numerical modeling with the MAGIC code, by Robert J. Barker, August 1984, covering the period 1 April 1984 - 30 September 1984 (report series MRC/WDC-R); only OCR-garbled cover-page metadata is indexed for this record.

  9. Conversion of HSPF Legacy Model to a Platform-Independent, Open-Source Language

    NASA Astrophysics Data System (ADS)

    Heaphy, R. T.; Burke, M. P.; Love, J. T.

    2015-12-01

    Since its initial development over 30 years ago, the Hydrologic Simulation Program - FORTRAN (HSPF) model has been used worldwide to support water quality planning and management. In the United States, HSPF receives widespread endorsement as a regulatory tool at all levels of government and is a core component of the EPA's Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) system, which was developed to support nationwide Total Maximum Daily Load (TMDL) analysis. However, the model's legacy code and data management systems have limitations in their ability to integrate with modern software and hardware and to leverage parallel computing, which has left voids in optimization, pre-, and post-processing tools. Advances in technology and in our scientific understanding of environmental processes over the last 30 years mandate that upgrades be made to HSPF to allow it to evolve and continue to be a premier tool for water resource planners. This work aims to mitigate the challenges currently facing HSPF through two primary tasks: (1) convert the code to a modern, widely accepted, open-source, high-performance computing (HPC) language; and (2) convert the model input and output files to a modern, widely accepted, open-source data model, library, and binary file format. Python was chosen as the new language for the code conversion. It is an interpreted, object-oriented language with dynamic semantics that has become one of the most popular open-source languages. While Python code execution can be slow compared to compiled, statically typed programming languages, such as C and FORTRAN, the integration of Numba (a just-in-time specializing compiler) has allowed this challenge to be overcome. For the legacy model data management conversion, HDF5 was chosen to store the model input and output. The code conversion for HSPF's hydrologic and hydraulic modules has been completed. The converted code has been tested against HSPF's suite of "test" runs and has shown good agreement and similar execution times when using the Numba compiler. Continued verification of the accuracy of the converted code against more complex legacy applications, and improvement of execution times by incorporating an intelligent network change detection tool, is currently underway; preliminary results will be presented.
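
    A hedged illustration of the two conversion choices named above, a just-in-time compiled time-stepping loop (Numba) and HDF5 storage (h5py); the linear-reservoir routing model is a toy stand-in, not an HSPF module:

    ```python
    import numpy as np
    import h5py
    from numba import njit

    @njit
    def route_storage(inflow, k, dt):
        """Toy linear-reservoir routing, dS/dt = I - S/k, outflow = S/k.
        Stands in for an HSPF-style time-stepping loop compiled by Numba."""
        storage = 0.0
        outflow = np.empty_like(inflow)
        for i in range(inflow.size):
            storage += dt * (inflow[i] - storage / k)
            outflow[i] = storage / k
        return outflow

    inflow = np.random.default_rng(0).gamma(2.0, 1.0, size=8760)   # hourly series
    outflow = route_storage(inflow, k=24.0, dt=1.0)

    # Store input and output together in HDF5 instead of legacy binary formats
    with h5py.File("run_results.h5", "w") as f:
        f.create_dataset("inflow", data=inflow)
        f.create_dataset("outflow", data=outflow)
    ```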

  10. Multi-Region Boundary Element Analysis for Coupled Thermal-Fracturing Processes in Geomaterials

    NASA Astrophysics Data System (ADS)

    Shen, Baotang; Kim, Hyung-Mok; Park, Eui-Seob; Kim, Taek-Kon; Wuttke, Manfred W.; Rinne, Mikael; Backers, Tobias; Stephansson, Ove

    2013-01-01

    This paper describes the development of a boundary element code for coupled thermal-mechanical processes of rock fracture propagation. The development was based on the fracture mechanics code FRACOD, previously developed by Shen and Stephansson (Int J Eng Fracture Mech 47:177-189, 1993) and FRACOM (A fracture propagation code—FRACOD, User's manual. FRACOM Ltd. 2002), which simulates complex fracture propagation in rocks governed by both tensile and shear mechanisms. For the coupled thermal-fracturing analysis, an indirect boundary element method, namely the fictitious heat source method, was implemented in FRACOD to simulate the temperature change and thermal stresses in rocks. This indirect method is particularly suitable for the thermal-fracturing coupling in FRACOD, where the displacement discontinuity method is used for the mechanical simulation. The coupled code was also extended to simulate multiple-region problems in which rock mass, concrete linings and insulation layers with different thermal and mechanical properties are present. Both verification and application cases are presented, in which a point heat source in a 2D infinite medium and a pilot LNG underground cavern were solved and studied using the coupled code. Good agreement was observed between the simulation results, analytical solutions and in situ measurements, which validates the applicability of the developed coupled code.

  11. Investigation of Different Constituent Encoders in a Turbo-code Scheme for Reduced Decoder Complexity

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.

    1998-01-01

    A large number of papers have been published attempting to give some analytical basis for the performance of Turbo-codes. It has been shown that performance improves with increased interleaver length. Procedures have also been given to pick the best constituent recursive systematic convolutional codes (RSCCs). However, testing by computer simulation is still required to verify these results. This thesis begins by describing the encoding and decoding schemes used. Next, simulation results on several memory-4 RSCCs are shown. It is found that the best BER performance at low Eb/N0 is not given by the RSCCs that were found using the analytic techniques given so far. Next, results are given from simulations using a smaller-memory RSCC for one of the constituent encoders. A significant reduction in decoding complexity is obtained with minimal loss in performance. Simulation results are then given for a rate 1/3 Turbo-code, with the result that this code performed as well as a rate 1/2 Turbo-code as measured by the distance from their respective Shannon limits. Finally, the results of simulations where an inaccurate noise variance measurement was used are given. From these it was observed that Turbo-decoding is fairly stable with regard to noise variance measurement.

  12. Analysis and Simulation of Narrowband GPS Jamming Using Digital Excision Temporal Filtering.

    DTIC Science & Technology

    1994-12-01

    Fragmentary excerpts indexed for this record describe the sequence of stored values from the P-code sampled at a 20 MHz rate, which is correlated with a reference vector of the same length to simulate a GPS signal; note that, at the sampling rate required for the GPS signals (a 20 MHz sampling rate for the P-code signal), the personal computer (PC) used to run the simulation could not perform ...; and include a subroutine that performs a fast FFT-based biased cross-correlation (written by Capt Gerry Falen, USAF, 16 Aug 94).

  13. Implementation issues in source coding

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Chen, Yun-Chung; Hadenfeldt, A. C.

    1989-01-01

    An edge-preserving image coding scheme which can be operated in both a lossy and a lossless manner was developed. The technique is an extension of the lossless encoding algorithm developed for the Mars Observer spectral data. It can also be viewed as a modification of the DPCM algorithm. A packet video simulator was also developed from an existing modified packet network simulator. The coding scheme for this system is a modification of the mixture block coding (MBC) scheme described in the last report. Coding algorithms for packet video were also investigated.
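
    The DPCM idea the scheme builds on is simple enough to sketch (a hedged, generic previous-sample predictor, not the report's edge-preserving predictor): encode residuals against a prediction and invert by accumulation, losslessly when the residuals are kept exact.

    ```python
    import numpy as np

    def dpcm_encode(signal):
        """Residuals against a previous-sample predictor (lossless DPCM)."""
        signal = np.asarray(signal, dtype=np.int64)
        residuals = np.empty_like(signal)
        residuals[0] = signal[0]
        residuals[1:] = signal[1:] - signal[:-1]
        return residuals

    def dpcm_decode(residuals):
        """Invert the encoder by cumulative summation."""
        return np.cumsum(residuals)

    row = np.array([100, 101, 103, 103, 99, 98], dtype=np.int64)
    assert np.array_equal(dpcm_decode(dpcm_encode(row)), row)
    ```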

  14. Large Eddy Simulation of Flow in Turbine Cascades Using LESTool and UNCLE Codes

    NASA Technical Reports Server (NTRS)

    Huang, P. G.

    2004-01-01

    Between December 23, 1997 and August 31, 2004, we completed the development of two CFD codes for DNS/LES/RANS simulation of turbine cascade flows, namely LESTool and UNCLE. LESTool is a structured code making use of a 5th-order upwind differencing scheme, and UNCLE is a second-order-accurate unstructured code. LESTool has both dynamic SGS and Spalart's DES models, and UNCLE makes use of URANS and DES models. The current report provides a description of the methodologies used in the codes.

  15. Large Eddy Simulation of Flow in Turbine Cascades Using LEST and UNCLE Codes

    NASA Technical Reports Server (NTRS)

    Ashpis, David (Technical Monitor); Huang, P. G.

    2004-01-01

    Between December 23, 1997 and August 31, 2004, we completed the development of two CFD codes for DNS/LES/RANS simulation of turbine cascade flows, namely LESTool and UNCLE. LESTool is a structured code making use of a 5th-order upwind differencing scheme, and UNCLE is a second-order-accurate unstructured code. LESTool has both dynamic SGS and Spalart's DES models, and UNCLE makes use of URANS and DES models. The current report provides a description of the methodologies used in the codes.

  16. QR code for medical information uses.

    PubMed

    Fontelo, Paul; Liu, Fang; Ducut, Erick G

    2008-11-06

    We developed online QR code tools, and simulated and tested QR code applications for medical information uses, including scanning QR code labels, URLs and authentication. Our results show possible applications for QR codes in medicine.

  17. ASCR/HEP Exascale Requirements Review Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habib, Salman; Roser, Robert; Gerber, Richard

    This draft report summarizes and details the findings, results, and recommendations derived from the ASCR/HEP Exascale Requirements Review meeting held in June 2015. The main conclusions are as follows. 1) Larger, more capable computing and data facilities are needed to support HEP science goals in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of the demand at the 2025 timescale is at least two orders of magnitude -- and in some cases greater -- than that available currently. 2) The growth rate of data produced by simulations is overwhelming the current ability of both facilities and researchers to store and analyze it. Additional resources and new techniques for data analysis are urgently needed. 3) Data rates and volumes from HEP experimental facilities are also straining the ability to store and analyze large and complex data volumes. Appropriately configured leadership-class facilities can play a transformational role in enabling scientific discovery from these datasets. 4) A close integration of HPC simulation and data analysis will aid greatly in interpreting results from HEP experiments. Such an integration will minimize data movement and facilitate interdependent workflows. 5) Long-range planning between HEP and ASCR will be required to meet HEP's research needs. To best use ASCR HPC resources the experimental HEP program needs a) an established long-term plan for access to ASCR computational and data resources, b) an ability to map workflows onto HPC resources, c) the ability for ASCR facilities to accommodate workflows run by collaborations that can have thousands of individual members, d) to transition codes to the next-generation HPC platforms that will be available at ASCR facilities, e) to build up and train a workforce capable of developing and using simulations and analysis to support HEP scientific research on next-generation systems.

  18. ASCR/HEP Exascale Requirements Review Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habib, Salman; et al.

    2016-03-30

    This draft report summarizes and details the findings, results, and recommendations derived from the ASCR/HEP Exascale Requirements Review meeting held in June 2015. The main conclusions are as follows. 1) Larger, more capable computing and data facilities are needed to support HEP science goals in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of the demand at the 2025 timescale is at least two orders of magnitude -- and in some cases greater -- than that available currently. 2) The growth rate of data produced by simulations is overwhelming the current ability of both facilities and researchers to store and analyze it. Additional resources and new techniques for data analysis are urgently needed. 3) Data rates and volumes from HEP experimental facilities are also straining the ability to store and analyze large and complex data volumes. Appropriately configured leadership-class facilities can play a transformational role in enabling scientific discovery from these datasets. 4) A close integration of HPC simulation and data analysis will aid greatly in interpreting results from HEP experiments. Such an integration will minimize data movement and facilitate interdependent workflows. 5) Long-range planning between HEP and ASCR will be required to meet HEP's research needs. To best use ASCR HPC resources the experimental HEP program needs a) an established long-term plan for access to ASCR computational and data resources, b) an ability to map workflows onto HPC resources, c) the ability for ASCR facilities to accommodate workflows run by collaborations that can have thousands of individual members, d) to transition codes to the next-generation HPC platforms that will be available at ASCR facilities, e) to build up and train a workforce capable of developing and using simulations and analysis to support HEP scientific research on next-generation systems.

  19. CBP Toolbox Version 3.0 “Beta Testing” Performance Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, III, F. G.

    2016-07-29

    One function of the Cementitious Barriers Partnership (CBP) is to assess available models of cement degradation and to assemble suitable models into a “Toolbox” that would be made available to members of the partnership, as well as the DOE Complex. To this end, SRNL and Vanderbilt University collaborated to develop an interface using the GoldSim software to the STADIUM code developed by SIMCO Technologies, Inc. and LeachXS/ORCHESTRA developed by the Energy Research Centre of the Netherlands (ECN). Release of Version 3.0 of the CBP Toolbox is planned in the near future. As a part of this release, an increased level of quality assurance for the partner codes and the GoldSim interface has been developed. This report documents results from evaluation testing of the ability of CBP Toolbox 3.0 to perform simulations of concrete degradation applicable to performance assessment of waste disposal facilities. Simulations of the behavior of Savannah River Saltstone Vault 2 and Vault 1/4 concrete subject to sulfate attack and carbonation over a 500- to 1000-year time period were run using a new and upgraded version of the STADIUM code and the version of LeachXS/ORCHESTRA released in Version 2.0 of the CBP Toolbox. Running both codes allowed comparison of results from two models which take very different approaches to simulating cement degradation. In addition, simulations of chloride attack on the two concretes were made using the STADIUM code. The evaluation sought to demonstrate that: 1) the codes are capable of running extended realistic simulations in a reasonable amount of time; 2) the codes produce “reasonable” results; the code developers have provided validation test results as part of their code QA documentation; and 3) the two codes produce results that are consistent with one another. Results of the evaluation testing showed that the three criteria listed above were met by the CBP partner codes. Therefore, it is concluded that the codes can be used to support performance assessment. This conclusion takes into account the QA documentation produced for the partner codes and for the CBP Toolbox.

  20. Preparation macroconstants to simulate the core of VVER-1000 reactor

    NASA Astrophysics Data System (ADS)

    Seleznev, V. Y.

    2017-01-01

    A dynamic model is used in VVER-1000 reactor simulators for the training of operating staff and students. The DYNCO code is used to simulate the neutron-physics characteristics; it allows calculations of stationary, transient, and emergency processes in real time for different reactor lattice geometries [1]. To perform calculations with this code, macroconstants must be prepared for each FA. One way of obtaining macroconstants is to use the WIMS code, which is based on its own 69-group macroconstants library. This paper presents the results of FA calculations obtained with the WIMS code for the VVER-1000 reactor with different fuel and coolant parameters, as well as the method of selecting energy groups for the further calculation of macroconstants.

  1. Physical models, cross sections, and numerical approximations used in MCNP and GEANT4 Monte Carlo codes for photon and electron absorbed fraction calculation.

    PubMed

    Yoriyaz, Hélio; Moralles, Maurício; Siqueira, Paulo de Tarso Dalledone; Guimarães, Carla da Costa; Cintra, Felipe Belonsi; dos Santos, Adimir

    2009-11-01

    Radiopharmaceutical applications in nuclear medicine require a detailed dosimetric estimate of the radiation energy delivered to the human tissues. Over the past years, several publications addressed the problem of internal dose estimation in volumes of several sizes considering photon and electron sources. Most of them used Monte Carlo radiation transport codes. Despite the widespread use of these codes, owing to the variety of resources and capabilities they offer to carry out dose calculations, several aspects like physical models, cross sections, and numerical approximations used in the simulations still remain an object of study. Accurate dose estimation depends on the correct selection of a set of simulation options that should be carefully chosen. This article presents an analysis of several simulation options provided by two of the most used codes worldwide: MCNP and GEANT4. For this purpose, comparisons of absorbed fraction estimates obtained with different physical models, cross sections, and numerical approximations are presented for spheres of several sizes composed of five different biological tissues. Considerable discrepancies have been found in some cases, not only between the different codes but also between different cross sections and algorithms in the same code. Maximum differences found between the two codes are 5.0% and 10%, respectively, for photons and electrons. Even for problems as simple as spheres and uniform radiation sources, the set of parameters chosen by any Monte Carlo code significantly affects the final results of a simulation, demonstrating the importance of the correct choice of parameters in the simulation.

  2. Real-time visual simulation of APT system based on RTW and Vega

    NASA Astrophysics Data System (ADS)

    Xiong, Shuai; Fu, Chengyu; Tang, Tao

    2012-10-01

    The Matlab/Simulink simulation model of an APT (acquisition, pointing and tracking) system is analyzed and established. The model's C code, which can be used for real-time simulation, is then generated by RTW (Real-Time Workshop). Practical experiments show that the simulation result of running the C code is the same as that of running the Simulink model directly in the Matlab environment. MultiGen-Vega is a real-time 3D scene simulation software system. With it and OpenGL, the APT scene simulation platform is developed and used to render and display the virtual scenes of the APT system. To add necessary graphics effects to the virtual scenes in real time, GLSL (OpenGL Shading Language) shaders running on the programmable GPU are used. By calling the C code, the scene simulation platform can adjust the system parameters on-line and obtain the APT system's real-time simulation data to drive the scenes. Practical application shows that this visual simulation platform has high efficiency, low cost and good simulation fidelity.

  3. Nexus: A modular workflow management system for quantum simulation codes

    NASA Astrophysics Data System (ADS)

    Krogel, Jaron T.

    2016-01-01

    The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.

  4. Turbulence dissipation challenge: particle-in-cell simulations

    NASA Astrophysics Data System (ADS)

    Roytershteyn, V.; Karimabadi, H.; Omelchenko, Y.; Germaschewski, K.

    2015-12-01

    We discuss the application of three particle-in-cell (PIC) codes to problems relevant to the turbulence dissipation challenge. VPIC is a fully kinetic code extensively used to study a variety of diverse problems ranging from laboratory plasmas to astrophysics. PSC is a flexible fully kinetic code offering a variety of algorithms that can be advantageous for turbulence simulations, including high-order particle shapes, dynamic load balancing, and the ability to run efficiently on Graphics Processing Units (GPUs). Finally, HYPERS is a novel hybrid (kinetic ions + fluid electrons) code, which utilizes asynchronous time advance and a number of other advanced algorithms. We present examples drawn both from large-scale turbulence simulations and from the test problems outlined by the turbulence dissipation challenge. Special attention is paid to such issues as the small-scale intermittency of inertial range turbulence, the mode content of the sub-proton range of scales, the formation of electron-scale current sheets and the role of magnetic reconnection, as well as the numerical challenges of applying PIC codes to simulations of astrophysical turbulence.

  5. HERCULES: A Pattern Driven Code Transformation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kartsaklis, Christos; Hernandez, Oscar R; Hsu, Chung-Hsing

    2012-01-01

    New parallel computers are emerging, but developing efficient scientific code for them remains difficult. A scientist must manage not only the science-domain complexity but also the performance-optimization complexity. HERCULES is a code transformation system designed to help the scientist to separate the two concerns, which improves code maintenance, and facilitates performance optimization. The system combines three technologies, code patterns, transformation scripts and compiler plugins, to provide the scientist with an environment to quickly implement code transformations that suit his needs. Unlike existing code optimization tools, HERCULES is unique in its focus on user-level accessibility. In this paper we discuss the design, implementation and an initial evaluation of HERCULES.

  6. Recent Developments in the Code RITRACKS (Relativistic Ion Tracks)

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Ponomarev, Artem L.; Blattnig, Steve R.

    2018-01-01

    The code RITRACKS (Relativistic Ion Tracks) was developed to simulate detailed stochastic radiation track structures of ions of different types and energies. Many new capabilities have been added to the code in recent years. Several options were added to specify the times at which the tracks appear in the irradiated volume, allowing the simulation of dose-rate effects. The code has been used to simulate energy deposition in several targets: spherical, ellipsoidal and cylindrical. More recently, density changes as well as a spherical shell were implemented for spherical targets, in order to simulate energy deposition in walled tissue-equivalent proportional counters. RITRACKS is used as a part of the new program BDSTracks (Biological Damage by Stochastic Tracks) to simulate several types of chromosome aberrations in various irradiation conditions. The simulation of damage to various DNA structures (linear and chromatin fiber) by direct and indirect effects has been improved and is ongoing. Many improvements were also made to the graphic user interface (GUI), including the addition of several labels allowing changes of units. A new GUI has been added to display the electron ejection vectors. The parallel calculation capabilities, notably the pre- and post-simulation processing on Windows and Linux machines, have been reviewed to make them more portable between different systems. The calculation part is currently maintained in an Atlassian Stash® repository for code tracking and possibly future collaboration.

  7. Kinetic modeling of x-ray laser-driven solid Al plasmas via particle-in-cell simulation

    NASA Astrophysics Data System (ADS)

    Royle, R.; Sentoku, Y.; Mancini, R. C.; Paraschiv, I.; Johzaki, T.

    2017-06-01

    Solid-density plasmas driven by intense x-ray free-electron laser (XFEL) radiation are seeded by sources of nonthermal photoelectrons and Auger electrons that ionize and heat the target via collisions. Simulation codes that are commonly used to model such plasmas, such as collisional-radiative (CR) codes, typically assume a Maxwellian distribution and thus instantaneous thermalization of the source electrons. In this study, we present a detailed description and initial applications of a collisional particle-in-cell code, picls, that has been extended with a self-consistent radiation transport model and Monte Carlo models for photoionization and KLL Auger ionization, enabling the fully kinetic simulation of XFEL-driven plasmas. The code is used to simulate two experiments previously performed at the Linac Coherent Light Source investigating XFEL-driven solid-density Al plasmas. It is shown that picls-simulated pulse transmissions using the Ecker-Kröll continuum-lowering model agree much better with measurements than do simulations using the Stewart-Pyatt model. Good quantitative agreement is also found between the time-dependent picls results and those of analogous simulations by the CR code scfly, which was used in the analysis of the experiments to accurately reproduce the observed Kα emissions and pulse transmissions. Finally, it is shown that the effects of the nonthermal electrons are negligible for the conditions of the particular experiments under investigation.

  8. Automated Concurrent Blackboard System Generation in C++

    NASA Technical Reports Server (NTRS)

    Kaplan, J. A.; McManus, J. W.; Bynum, W. L.

    1999-01-01

    In his 1992 Ph.D. thesis, "Design and Analysis Techniques for Concurrent Blackboard Systems", John McManus defined several performance metrics for concurrent blackboard systems and developed a suite of tools for creating and analyzing such systems. These tools allow a user to analyze a concurrent blackboard system design and predict the performance of the system before any code is written. The design can be modified until simulated performance is satisfactory. Then, the code generator can be invoked to generate automatically all of the code required for the concurrent blackboard system except for the code implementing the functionality of each knowledge source. We have completed the port of the source code generator and a simulator for a concurrent blackboard system. The source code generator generates the necessary C++ source code to implement the concurrent blackboard system using Parallel Virtual Machine (PVM) running on a heterogeneous network of UNIX(trademark) workstations. The concurrent blackboard simulator uses the blackboard specification file to predict the performance of the concurrent blackboard design. The only part of the source code for the concurrent blackboard system that the user must supply is the code implementing the functionality of the knowledge sources.
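
    The division of labor described above (generated control and communication infrastructure versus user-written knowledge sources) can be pictured with a minimal, hypothetical sketch of the blackboard pattern in Python; the class and method names are illustrative stand-ins, not the C++/PVM interfaces produced by the generator.

        class Blackboard:
            """Shared data store that knowledge sources read from and write to."""
            def __init__(self):
                self.data = {}

        class KnowledgeSource:
            """Only this part corresponds to the user-supplied code in the abstract."""
            def is_applicable(self, bb):
                raise NotImplementedError
            def execute(self, bb):
                raise NotImplementedError

        class DoubleIt(KnowledgeSource):
            def is_applicable(self, bb):
                return "x" in bb.data and "y" not in bb.data
            def execute(self, bb):
                bb.data["y"] = 2 * bb.data["x"]

        def control_loop(bb, sources, max_cycles=10):
            """Generated controller: repeatedly fire any applicable knowledge source."""
            for _ in range(max_cycles):
                fired = False
                for ks in sources:
                    if ks.is_applicable(bb):
                        ks.execute(bb)
                        fired = True
                if not fired:
                    break

        bb = Blackboard()
        bb.data["x"] = 21
        control_loop(bb, [DoubleIt()])
        print(bb.data)   # {'x': 21, 'y': 42}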

  9. A new paradigm for reproducing and analyzing N-body simulations of planetary systems

    NASA Astrophysics Data System (ADS)

    Rein, Hanno; Tamayo, Daniel

    2017-05-01

    The reproducibility of experiments is one of the main principles of the scientific method. However, numerical N-body experiments, especially those of planetary systems, are currently not reproducible. In the most optimistic scenario, they can only be replicated in an approximate or statistical sense. Even if authors share their full source code and initial conditions, differences in compilers, libraries, operating systems or hardware often lead to qualitatively different results. We provide a new set of easy-to-use, open-source tools that address the above issues, allowing for exact (bit-by-bit) reproducibility of N-body experiments. In addition to generating completely reproducible integrations, we show that our framework also offers novel and innovative ways to analyse these simulations. As an example, we present a high-accuracy integration of the Solar system spanning 10 Gyr, requiring several weeks to run on a modern CPU. In our framework, we can easily access simulation data not only at the predefined intervals for which we save snapshots, but also at any time during the integration. We achieve this by integrating an on-demand reconstructed simulation forward in time from the nearest snapshot. This allows us to extract arbitrary quantities at any point in the saved simulation exactly (bit-by-bit), and within seconds rather than weeks. We believe that the tools we present in this paper offer a new paradigm for how N-body simulations are run, analysed and shared across the community.
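
    The on-demand reconstruction idea (locate the nearest stored snapshot at or before the requested time, then reintegrate forward to that time) can be sketched generically; the archive class and integrator callback below are hypothetical placeholders and not the actual interface of the authors' tools.

        import bisect
        import copy

        class SnapshotArchive:
            """Stores (time, state) snapshots and reconstructs the state at any time
            by reintegrating forward from the nearest earlier snapshot."""
            def __init__(self, integrate):
                self.times, self.states = [], []
                self.integrate = integrate          # integrate(state, t0, t1) -> state

            def save(self, t, state):
                self.times.append(t)
                self.states.append(copy.deepcopy(state))

            def state_at(self, t):
                i = bisect.bisect_right(self.times, t) - 1
                if i < 0:
                    raise ValueError("requested time precedes first snapshot")
                # With a deterministic (bit-reproducible) integrator, the result is
                # identical no matter which snapshot the reconstruction starts from.
                return self.integrate(copy.deepcopy(self.states[i]), self.times[i], t)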

  10. Preliminary Analysis of the Transient Reactor Test Facility (TREAT) with PROTEUS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connaway, H. M.; Lee, C. H.

    The neutron transport code PROTEUS has been used to perform preliminary simulations of the Transient Reactor Test Facility (TREAT). TREAT is an experimental reactor designed for the testing of nuclear fuels and other materials under transient conditions. It operated from 1959 to 1994, when it was placed on non-operational standby. The restart of TREAT to support the U.S. Department of Energy’s resumption of transient testing is currently underway. Both single assembly and assembly-homogenized full core models have been evaluated. Simulations were performed using a historic set of WIMS-ANL-generated cross-sections as well as a new set of Serpent-generated cross-sections. To support this work, further analyses were also performed using additional codes in order to investigate particular aspects of TREAT modeling. DIF3D and the Monte-Carlo codes MCNP and Serpent were utilized in these studies. MCNP and Serpent were used to evaluate the effect of geometry homogenization on the simulation results and to support code-to-code comparisons. New meshes for the PROTEUS simulations were created using the CUBIT toolkit, with additional meshes generated via conversion of selected DIF3D models to support code-to-code verifications. All current analyses have focused on code-to-code verifications, with additional verification and validation studies planned. The analysis of TREAT with PROTEUS-SN is an ongoing project. This report documents the studies that have been performed thus far, and highlights key challenges to address in future work.

  11. 77 FR 31325 - National Fire Codes: Request for Comments on NFPA Technical Committee Reports

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-25

    ... DEPARTMENT OF COMMERCE National Institute of Standards and Technology National Fire Codes: Request... publishing this notice on behalf of the National Fire Protection Association (NFPA) to announce the... National Fire Protection Association (NFPA) has accomplished its mission by advocating scientifically based...

  12. Radiation Transport Tools for Space Applications: A Review

    NASA Technical Reports Server (NTRS)

    Jun, Insoo; Evans, Robin; Cherng, Michael; Kang, Shawn

    2008-01-01

    This slide presentation contains a brief discussion of nuclear transport codes widely used in the space radiation community for shielding and scientific analyses. Seven radiation transport codes are addressed. The two general methods (i.e., the Monte Carlo method and the deterministic method) are briefly reviewed.

  13. Scientific Assistant Virtual Laboratory (SAVL)

    NASA Astrophysics Data System (ADS)

    Alaghband, Gita; Fardi, Hamid; Gnabasik, David

    2007-03-01

    The Scientific Assistant Virtual Laboratory (SAVL) is a scientific discovery environment, an interactive simulated virtual laboratory, for learning physics and mathematics. The purpose of this computer-assisted intervention is to improve middle and high school student interest, insight and scores in physics and mathematics. SAVL develops scientific and mathematical imagination in a visual, symbolic, and experimental simulation environment. It directly addresses the issues of scientific and technological competency by providing critical thinking training through integrated modules. This on-going research provides a virtual laboratory environment in which the student directs the building of the experiment rather than observing a packaged simulation. SAVL:
    * Engages the persistent interest of young minds in physics and math by visually linking simulation objects and events with mathematical relations.
    * Teaches integrated concepts by the hands-on exploration and focused visualization of classic physics experiments within software.
    * Systematically and uniformly assesses and scores students by their ability to answer their own questions within the context of a Master Question Network.
    We will demonstrate how the Master Question Network uses polymorphic interfaces and C# lambda expressions to manage simulation objects.

  14. A hybrid gyrokinetic ion and isothermal electron fluid code for astrophysical plasma

    NASA Astrophysics Data System (ADS)

    Kawazura, Y.; Barnes, M.

    2018-05-01

    This paper describes a new code for simulating astrophysical plasmas that solves a hybrid model composed of gyrokinetic ions (GKI) and an isothermal electron fluid (ITEF) (Schekochihin et al., 2009 [9]). This model captures ion kinetic effects that are important near the ion gyro-radius scale, while electron kinetic effects are ordered out by an electron-ion mass ratio expansion. The code is developed by incorporating the ITEF approximation into AstroGK, an Eulerian δf gyrokinetics code specialized to a slab geometry (Numata et al., 2010 [41]). The new code treats the linear terms in the ITEF equations implicitly while the nonlinear terms are treated explicitly. We show linear and nonlinear benchmark tests to prove the validity and applicability of the simulation code. Since the fast electron timescale is eliminated by the mass ratio expansion, the Courant-Friedrichs-Lewy condition is much less restrictive than in full gyrokinetic codes; the present hybrid code runs ∼2√(mi/me) ∼ 100 times faster than AstroGK with a single ion species and kinetic electrons, where mi/me is the ion-electron mass ratio. The improvement in computational time makes it feasible to execute ion-scale gyrokinetic simulations with a high velocity-space resolution and to run multiple simulations to determine the dependence of turbulent dynamics on parameters such as the electron-ion temperature ratio and plasma beta.
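
    The quoted speedup follows directly from removing the fast electron timescale; a back-of-the-envelope check (our arithmetic for a hydrogen plasma, not taken from the paper) reproduces the stated order of magnitude.

        import math

        mass_ratio = 1836.15                        # proton-to-electron mass ratio (hydrogen, assumed)
        speedup = 2.0 * math.sqrt(mass_ratio)       # ~ 2 * sqrt(mi/me)
        print(f"expected speedup ~ {speedup:.0f}x") # ~86x, i.e. of order 100 as quoted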

  15. Particle kinetic simulation of high altitude hypervelocity flight

    NASA Technical Reports Server (NTRS)

    Boyd, Iain; Haas, Brian L.

    1994-01-01

    Rarefied flows about hypersonic vehicles entering the upper atmosphere or through nozzles expanding into a near vacuum may only be simulated accurately with a direct simulation Monte Carlo (DSMC) method. Under this grant, researchers enhanced the models employed in the DSMC method and performed simulations in support of existing NASA projects or missions. DSMC models were developed and validated for simulating rotational, vibrational, and chemical relaxation in high-temperature flows, including effects of quantized anharmonic oscillators and temperature-dependent relaxation rates. State-of-the-art advancements were made in simulating coupled vibration-dissociation recombination for post-shock flows. Models were also developed to compute vehicle surface temperatures directly in the code rather than requiring isothermal estimates. These codes were instrumental in simulating aerobraking of NASA's Magellan spacecraft during orbital maneuvers to assess heat transfer and aerodynamic properties of the delicate satellite. NASA also depended upon simulations of entry of the Galileo probe into the atmosphere of Jupiter to provide drag and flow field information essential for accurate interpretation of an onboard experiment. Finally, the codes have been used extensively to simulate expanding nozzle flows in low-power thrusters in support of propulsion activities at NASA-Lewis. Detailed comparisons between continuum calculations and DSMC results helped to quantify the limitations of continuum CFD codes in rarefied applications.

  16. Metrics for comparing dynamic earthquake rupture simulations

    USGS Publications Warehouse

    Barall, Michael; Harris, Ruth A.

    2014-01-01

    Earthquakes are complex events that involve a myriad of interactions among multiple geologic features and processes. One of the tools that is available to assist with their study is computer simulation, particularly dynamic rupture simulation. A dynamic rupture simulation is a numerical model of the physical processes that occur during an earthquake. Starting with the fault geometry, friction constitutive law, initial stress conditions, and assumptions about the condition and response of the near‐fault rocks, a dynamic earthquake rupture simulation calculates the evolution of fault slip and stress over time as part of the elastodynamic numerical solution (Ⓔ see the simulation description in the electronic supplement to this article). The complexity of the computations in a dynamic rupture simulation makes it challenging to verify that the computer code is operating as intended, because there are no exact analytic solutions against which these codes’ results can be directly compared. One approach for checking if dynamic rupture computer codes are working satisfactorily is to compare each code’s results with the results of other dynamic rupture codes running the same earthquake simulation benchmark. To perform such a comparison consistently, it is necessary to have quantitative metrics. In this paper, we present a new method for quantitatively comparing the results of dynamic earthquake rupture computer simulation codes.
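
    One simple metric of the kind motivated above, offered here only as an illustration and not as the specific metrics defined in the paper, is a normalized RMS misfit between time series produced by two codes, resampled onto a common time axis.

        import numpy as np

        def nrms_misfit(t1, y1, t2, y2, n=1000):
            """Normalized RMS difference between two time series (e.g., slip rate at the
            same fault location from two rupture codes), resampled onto a common time
            axis. 0 means identical; values are relative to the RMS amplitude of the
            first series."""
            t = np.linspace(max(t1[0], t2[0]), min(t1[-1], t2[-1]), n)
            a = np.interp(t, t1, y1)
            b = np.interp(t, t2, y2)
            return np.sqrt(np.mean((a - b) ** 2)) / np.sqrt(np.mean(a ** 2))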

  17. Analyzing simulation-based PRA data through traditional and topological clustering: A BWR station blackout case study

    DOE PAGES

    Maljovec, D.; Liu, S.; Wang, B.; ...

    2015-07-14

    Here, dynamic probabilistic risk assessment (DPRA) methodologies couple system simulator codes (e.g., RELAP and MELCOR) with simulation controller codes (e.g., RAVEN and ADAPT). Whereas system simulator codes model system dynamics deterministically, simulation controller codes introduce both deterministic (e.g., system control logic and operating procedures) and stochastic (e.g., component failures and parameter uncertainties) elements into the simulation. Typically, a DPRA is performed by sampling values of a set of parameters and simulating the system behavior for that specific set of parameter values. For complex systems, a major challenge in using DPRA methodologies is to analyze the large number of scenarios generated, where clustering techniques are typically employed to better organize and interpret the data. In this paper, we focus on the analysis of two nuclear simulation datasets that are part of the risk-informed safety margin characterization (RISMC) boiling water reactor (BWR) station blackout (SBO) case study. We provide the domain experts a software tool that encodes traditional and topological clustering techniques within an interactive analysis and visualization environment, for understanding the structures of such high-dimensional nuclear simulation datasets. We demonstrate through our case study that both types of clustering techniques complement each other for enhanced structural understanding of the data.
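
    As a hedged illustration of the traditional-clustering half of the workflow (the topological techniques are more involved), scenario end-state features from a DPRA run could be grouped with a standard k-means call; the feature values and names below are hypothetical.

        import numpy as np
        from sklearn.cluster import KMeans

        # Hypothetical scenario features: [time to core damage (h), peak clad temperature (K),
        # battery depletion time (h)] for each sampled station-blackout scenario.
        scenarios = np.array([
            [4.2, 1480.0, 3.9],
            [4.4, 1495.0, 4.0],
            [9.8,  910.0, 8.1],
            [10.3, 905.0, 8.4],
        ])
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scenarios)
        print(labels)   # e.g., [0 0 1 1]: two qualitatively different scenario families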

  18. Abiding by codes of ethics and codes of conduct imposed on members of learned and professional geoscience institutions - a tiresome formality or a win-win for scientific and professional integrity and protection of the public?

    NASA Astrophysics Data System (ADS)

    Allington, Ruth; Fernandez, Isabel

    2015-04-01

    In 2012, the International Union of Geological Sciences (IUGS) formed the Task Group on Global Geoscience Professionalism ("TG-GGP") to bring together the expanding network of organizations around the world whose primary purpose is self-regulation of geoscience practice. An important part of TG-GGP's mission is to foster a shared understanding of aspects of professionalism relevant to individual scientists and applied practitioners working in one or more sectors of the wider geoscience profession (e.g. research, teaching, industry, geoscience communication and government service). These may be summarised as competence, ethical practice, and professional, technical and scientific accountability. Legal regimes for the oversight of registered or licensed professionals differ around the world and in many jurisdictions there is no registration or licensure with the force of law. However, principles of peer-based self-regulation universally apply. This makes professional geoscience organisations ideal settings within which geoscientists can debate and agree what society should expect of us in the range of roles we fulfil. They can provide the structures needed to best determine what expectations, in the public interest, are appropriate for us collectively to impose on each other. They can also provide the structures for the development of associated procedures necessary to identify and discipline those who do not live up to the expected standards of behaviour established by consensus between peers. Codes of Ethics (sometimes referred to as Codes of Conduct), to which all members of all major professional and/or scientific geoscience organizations are bound (whether or not they are registered or hold professional qualifications awarded by those organisations), incorporate such traditional tenets as: safeguarding the health and safety of the public, scientific integrity, and fairness. Codes also increasingly include obligations concerning welfare of the environment and sustainability. This contribution is part of a series of presentations and papers by TG-GGP members in 2015 on a similar theme, including a paper submitted for the American Geophysical Union Joint Assembly meeting in Montreal, Canada, in May 2015 (Bonham and Allington). It will first describe common features of ethical codes/codes of conduct and associated complaints and disciplinary procedures, drawing on examples from the professional geoscience organisations which are members of TG-GGP. It will go on to examine the challenges associated with encouraging and policing compliance with such codes, especially where the need for compliance is not a legal obligation, but simply a condition of membership of that organisation.

  19. Electro-Thermal-Mechanical Simulation Capability Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, D

    This is the Final Report for LDRD 04-ERD-086, 'Electro-Thermal-Mechanical Simulation Capability'. The accomplishments are well documented in five peer-reviewed publications and six conference presentations and hence will not be detailed here. The purpose of this LDRD was to research and develop numerical algorithms for three-dimensional (3D) Electro-Thermal-Mechanical simulations. LLNL has long been a world leader in the area of computational mechanics, and recently several mechanics codes have become 'multiphysics' codes with the addition of fluid dynamics, heat transfer, and chemistry. However, these multiphysics codes do not incorporate the electromagnetics that is required for a coupled Electro-Thermal-Mechanical (ETM) simulation. There are numerous applications for an ETM simulation capability, such as explosively-driven magnetic flux compressors, electromagnetic launchers, inductive heating and mixing of metals, and MEMS. A robust ETM simulation capability will enable LLNL physicists and engineers to better support current DOE programs, and will prepare LLNL for some very exciting long-term DoD opportunities. We define a coupled Electro-Thermal-Mechanical (ETM) simulation as a simulation that solves, in a self-consistent manner, the equations of electromagnetics (primarily statics and diffusion), heat transfer (primarily conduction), and non-linear mechanics (elastic-plastic deformation, and contact with friction). There is no existing parallel 3D code for simulating ETM systems at LLNL or elsewhere. While there are numerous magnetohydrodynamic codes, these codes are designed for astrophysics, magnetic fusion energy, laser-plasma interaction, etc., and do not attempt to accurately model electromagnetically driven solid mechanics. This project responds to the Engineering R&D Focus Areas of Simulation and Energy Manipulation, and addresses the specific problem of Electro-Thermal-Mechanical simulation for design and analysis of energy manipulation systems such as magnetic flux compression generators and railguns. This project complements ongoing DNT projects that have an experimental emphasis. Our research efforts have been encapsulated in the Diablo and ALE3D simulation codes. This new ETM capability already has both internal and external users, and has spawned additional research in plasma railgun technology. By developing this capability Engineering has become a world leader in ETM design, analysis, and simulation. This research has positioned LLNL to be able to compete for new business opportunities with the DoD in the area of railgun design. We currently have a three-year $1.5M project with the Office of Naval Research to apply our ETM simulation capability to railgun bore life issues and we expect to be a key player in the railgun community.

  20. Graphical programming interface: A development environment for MRI methods.

    PubMed

    Zwart, Nicholas R; Pipe, James G

    2015-11-01

    To introduce a multiplatform, Python-based development environment called graphical programming interface for prototyping MRI techniques. The interface allows developers to interact with their scientific algorithm prototypes visually in an event-driven environment, making tasks such as parameterization, algorithm testing, data manipulation, and visualization an integrated part of the workflow. Algorithm developers extend the built-in functionality through simple code interfaces designed to facilitate rapid implementation. This article shows several examples of algorithms developed in graphical programming interface, including the non-Cartesian MR reconstruction algorithms for PROPELLER and spiral as well as spin simulation and trajectory visualization of a FLORET example. The graphical programming interface framework is shown to be a versatile prototyping environment for developing numeric algorithms used in the latest MR techniques. © 2014 Wiley Periodicals, Inc.
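
    The 'simple code interfaces' used to extend the environment can be pictured with a generic node sketch; the base class and method names below are hypothetical stand-ins and do not reproduce the actual graphical programming interface API.

        import numpy as np

        class Node:
            """Hypothetical base class: a processing node that maps named inputs to outputs."""
            def compute(self, inputs):
                raise NotImplementedError

        class ZeroFillFFT(Node):
            """Example reconstruction step: zero-pad k-space and inverse FFT to an image."""
            def compute(self, inputs):
                kspace = inputs["kspace"]
                padded = np.zeros((2 * kspace.shape[0], 2 * kspace.shape[1]), dtype=complex)
                padded[:kspace.shape[0], :kspace.shape[1]] = kspace
                image = np.fft.ifft2(padded)
                return {"image": np.abs(image)}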

  1. On Fusing Recursive Traversals of K-d Trees

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rajbhandari, Samyam; Kim, Jinsung; Krishnamoorthy, Sriram

    Loop fusion is a key program transformation for data locality optimization that is implemented in production compilers. But optimizing compilers currently cannot exploit fusion opportunities across a set of recursive tree traversal computations with producer-consumer relationships. In this paper, we develop a compile-time approach to dependence characterization and program transformation to enable fusion across recursively specified traversals over k-ary trees. We present the FuseT source-to-source code transformation framework to automatically generate fused composite recursive operators from an input program containing a sequence of primitive recursive operators. We use our framework to implement fused operators for MADNESS, Multiresolution Adaptive Numerical Environment for Scientific Simulation. We show that locality optimization through fusion can offer more than an order of magnitude performance improvement.

  2. A Proposal of Monitoring and Forecasting Method for Crustal Activity in and around Japan with 3-dimensional Heterogeneous Medium Using a Large-scale High-fidelity Finite Element Simulation

    NASA Astrophysics Data System (ADS)

    Hori, T.; Agata, R.; Ichimura, T.; Fujita, K.; Yamaguchi, T.; Takahashi, N.

    2017-12-01

    Although continuous, dense surface-deformation data can now be obtained on land and, in part, on the sea floor, these data are not fully utilized for monitoring and forecasting crustal activity, such as spatio-temporal variation in slip velocity on the plate interface (including earthquakes), seismic wave propagation, and crustal deformation. To construct a system for monitoring and forecasting, it is necessary to develop a physics-based data analysis system including (1) a structural model with the 3D geometry of the plate interface and material properties such as elasticity and viscosity, (2) calculation codes for crustal deformation and seismic wave propagation using (1), and (3) inverse analysis or data assimilation codes for both structure and fault slip using (1) and (2). To accomplish this, it is necessary at a minimum to develop highly reliable large-scale simulation codes to calculate crustal deformation and seismic wave propagation for 3D heterogeneous structure. An unstructured finite element (FE) non-linear seismic wave simulation code has been developed; it achieved physics-based urban earthquake simulation at 1.08 T degrees of freedom (DOF) and 6.6 K time steps. A high-fidelity FEM simulation code with a mesh generator has also been developed to calculate crustal deformation in and around Japan with complicated surface topography and subducting plate geometry at 1 km mesh resolution. The crustal deformation code has been further improved and achieved 2.05 T DOF with 45 m resolution on the plate interface. This high-resolution analysis enables computation of changes in the stress acting on the plate interface. Furthermore, for inverse analyses, a waveform inversion code for modeling 3D crustal structure has been developed, and the high-fidelity FEM code has been extended with an adjoint method for estimating fault slip and asthenosphere viscosity. Hence, we have large-scale simulation and analysis tools for monitoring. We are also developing methods for forecasting the slip-velocity variation on the plate interface; although the prototype is for an elastic half-space model, we are applying it to 3D heterogeneous structure with the high-fidelity FE model. In addition, the large-scale simulation codes for monitoring are being implemented on GPU clusters, and the analysis tools are being extended with other functions such as examination of model errors.
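
    At its core, the fault-slip estimation step mentioned above is a linear inverse problem relating surface displacements to slip through elastic Green's functions; the following minimal regularized least-squares sketch is schematic only and is not the adjoint-based, high-fidelity FE method described in the abstract.

        import numpy as np

        def invert_slip(G, d, alpha=1e-2):
            """Estimate fault slip m from surface displacement data d, given a Green's
            function matrix G (d ~= G m), using Tikhonov regularization of strength alpha."""
            n = G.shape[1]
            lhs = G.T @ G + alpha * np.eye(n)
            rhs = G.T @ d
            return np.linalg.solve(lhs, rhs)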

  3. Benchmark problems for numerical implementations of phase field models

    DOE PAGES

    Jokisaari, A. M.; Voorhees, P. W.; Guyer, J. E.; ...

    2016-10-01

    Here, we present the first set of benchmark problems for phase field models that are being developed by the Center for Hierarchical Materials Design (CHiMaD) and the National Institute of Standards and Technology (NIST). While many scientific research areas use a limited set of well-established software, the growing phase field community continues to develop a wide variety of codes and lacks benchmark problems to consistently evaluate the numerical performance of new implementations. Phase field modeling has become significantly more popular as computational power has increased and is now becoming mainstream, driving the need for benchmark problems to validate and verify new implementations. We follow the example set by the micromagnetics community to develop an evolving set of benchmark problems that test the usability, computational resources, numerical capabilities and physical scope of phase field simulation codes. In this paper, we propose two benchmark problems that cover the physics of solute diffusion and growth and coarsening of a second phase via a simple spinodal decomposition model and a more complex Ostwald ripening model. We demonstrate the utility of benchmark problems by comparing the results of simulations performed with two different adaptive time stepping techniques, and we discuss the needs of future benchmark problems. The development of benchmark problems will enable the results of quantitative phase field models to be confidently incorporated into integrated computational materials science and engineering (ICME), an important goal of the Materials Genome Initiative.
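
    For orientation, spinodal decomposition benchmarks of the kind mentioned above are typically built on a Cahn-Hilliard-type model; the following 1D explicit finite-difference sketch is illustrative only, and its equations and parameters are not taken from the CHiMaD/NIST benchmark specifications.

        import numpy as np

        def cahn_hilliard_1d(c, dx, dt, steps, M=1.0, W=1.0, kappa=1.0):
            """Explicit 1D Cahn-Hilliard update: dc/dt = M * d2(mu)/dx2, with
            mu = df/dc - kappa * d2c/dx2 and double-well free energy f = W c^2 (1-c)^2."""
            def lap(u):                       # periodic Laplacian
                return (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx ** 2
            for _ in range(steps):
                dfdc = 2.0 * W * c * (1.0 - c) * (1.0 - 2.0 * c)
                mu = dfdc - kappa * lap(c)
                c = c + dt * M * lap(mu)
            return c

        # A small random perturbation around c = 0.5 coarsens into two phases.
        c0 = 0.5 + 0.01 * (np.random.rand(128) - 0.5)
        c = cahn_hilliard_1d(c0, dx=1.0, dt=0.01, steps=20000)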

  4. Development of a Web Based Simulating System for Earthquake Modeling on the Grid

    NASA Astrophysics Data System (ADS)

    Seber, D.; Youn, C.; Kaiser, T.

    2007-12-01

    Existing cyberinfrastructure-based information, data, and computational networks now allow development of state-of-the-art, user-friendly simulation environments that democratize access to high-end computational resources and provide new research opportunities for many research and educational communities. Within the Geosciences cyberinfrastructure network, GEON, we have developed the SYNSEIS (SYNthetic SEISmogram) toolkit to enable efficient computations of 2D and 3D seismic waveforms for a variety of research purposes, especially to help analyze EarthScope's USArray seismic data in a speedy and efficient environment. The underlying simulation software in SYNSEIS is a finite difference code, E3D, developed by LLNL (S. Larsen). The code is embedded within the SYNSEIS portlet environment and is used by our toolkit to simulate seismic waveforms of earthquakes at regional distances (<1000 km). Architecturally, SYNSEIS uses both Web Service and Grid computing resources in a portal-based work environment and has a built-in access mechanism to connect to national supercomputer centers as well as to a dedicated, small-scale compute cluster for its runs. Even though Grid computing is well established in many computing communities, its use among domain scientists is still not trivial because of the multiple levels of complexity encountered. We grid-enabled E3D using our own XML input dialect that includes geological models accessible through standard Web services within the GEON network. The XML inputs for this application contain structural geometries, source parameters, seismic velocity, density, attenuation values, the number of time steps to compute, and the number of stations. By enabling portal-based access to such a computational environment, coupled with a dynamic user interface, we enable a large user community to take advantage of such high-end calculations in their research and educational activities. Our system can be used to promote an efficient and effective modeling environment to help scientists as well as educators in their daily activities and speed up the scientific discovery process.

  5. Is there a need for a code of ethics in science communication and Communicating Uncertainties on Climate Change?

    NASA Astrophysics Data System (ADS)

    Cegnar, T.; Benestad, R.; Billard, C.

    2010-09-01

    The EMS Media team recognises that: Scientific knowledge is valuable for society, but it also becomes fragile in a media-dominated society where the distortion of facts clouds the validity of the information. The use of scientific titles in communication normally brings expectations of high standards regarding the information content. Freedom of speech is fragile in the sense that it can be diluted by a high proportion of false information. The value of scientific and scholastic titles is degraded when they are used to give the impression of false validity. Science communication is powerful, and implies a certain responsibility and ethical standard. The scientific community operates with a more or less tacit ethics code in all areas touching the scientists' activities. Even though many scientific questions cannot be completely resolved, there is a set of established and unequivocal scientific practices, methods, and tests, on which our scientific knowledge rests. Scientists are assumed to master the scientific practices, methods, and tests. High standards in science-related communication and media exposure, openness, and honesty will increase the relevance of science, academies, and scientists in the society, in addition to benefiting the society itself. Science communication is important to maintain and enhance the general appreciation of science. The value of the role of science is likely to increase with a reduced distance between scientists and the society and a lower knowledge barrier. An awareness about the ethical aspects of science and science communication may aid scientists in making decisions about how and what to say. Scientists are often not trained in communication or ethics. A set of guidelines may lower the barrier for scientists concerned about tacit codes to come forward and talk to the media. Recommendations: The mass media should seek more insight into scientific knowledge, history, principles, and societies. Journalists and artists should be encouraged and receive support to attend the large scientific conferences organised by e.g. the EMS, EGU, AMS, and the AGU. National meteorological societies can contribute by promoting the idea of media participation, e.g. through statements and letters of opinion to newspapers, on TV and radio. They can point to media awards and best-practice examples (such as the Norwegian collaboration between the national broadcasting corporation and the meteorological service, yr.no). Tacit ethics codes and expectations from scientists should be spelled out. The role of scientists should be clear, and national academies and member organisations are encouraged to provide a clear list of expectations. Statements drawing on the authority of science should have a basis in well-established and unequivocal scientific practices, methods, and tests. This means, for instance, that analysis and statistics must conform to well-established robust methods, avoiding 'cherry picking' and the misrepresentation of data. The information should also - to the greatest possible degree - be based on open-source and transparent methods and data.

  6. LOOPREF: A Fluid Code for the Simulation of Coronal Loops

    NASA Technical Reports Server (NTRS)

    deFainchtein, Rosalinda; Antiochos, Spiro; Spicer, Daniel

    1998-01-01

    This report documents the code LOOPREF. LOOPREF is a semi-one-dimensional finite element code that is especially well suited to simulate coronal-loop phenomena. It has a full implementation of adaptive mesh refinement (AMR), which is crucial for this type of simulation. The AMR routines are an improved version of AMR1D. LOOPREF's versatility makes it suitable for simulating a wide variety of problems. In addition to efficiently providing very high resolution in rapidly changing regions of the domain, it is equipped to treat loops of variable cross section, any non-linear form of heat conduction, shocks, gravitational effects, and radiative loss.

  7. Development of the V4.2m5 and V5.0m0 Multigroup Cross Section Libraries for MPACT for PWR and BWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Kang Seog; Clarno, Kevin T.; Gentry, Cole

    2017-03-01

    The MPACT neutronics module of the Consortium for Advanced Simulation of Light Water Reactors (CASL) core simulator is a 3-D whole-core transport code being developed for the CASL toolset, the Virtual Environment for Reactor Analysis (VERA). Key characteristics of the MPACT code include (1) a subgroup method for resonance self-shielding and (2) a whole-core transport solver with a 2-D/1-D synthesis method. The MPACT code requires a cross-section library to support all of its core simulation capabilities; this library is among the components with the greatest influence on simulation accuracy.

  8. An evaluation of TRAC-PF1/MOD1 computer code performance during posttest simulations of Semiscale MOD-2C feedwater line break transients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, D.G.; Watkins, J.C.

    This report documents an evaluation of the TRAC-PF1/MOD1 reactor safety analysis computer code during computer simulations of feedwater line break transients. The experimental data base for the evaluation included the results of three bottom feedwater line break tests performed in the Semiscale Mod-2C test facility. The tests modeled 14.3% (S-FS-7), 50% (S-FS-11), and 100% (S-FS-6B) breaks. The test facility and the TRAC-PF1/MOD1 model used in the calculations are described. Evaluations of the accuracy of the calculations are presented in the form of comparisons of measured and calculated histories of selected parameters associated with the primary and secondary systems. In addition to evaluating the accuracy of the code calculations, the computational performance of the code during the simulations was assessed. A conclusion was reached that the code is capable of making feedwater line break transient calculations efficiently, but there is room for significant improvements in the simulations that were performed. Recommendations are made for follow-on investigations to determine how to improve future feedwater line break calculations and for code improvements to make the code easier to use.

  9. Coupled field effects in BWR stability simulations using SIMULATE-3K

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borkowski, J.; Smith, K.; Hagrman, D.

    1996-12-31

    The SIMULATE-3K code is the transient analysis version of the Studsvik advanced nodal reactor analysis code, SIMULATE-3. Recent developments have focused on further broadening the range of transient applications by refinement of core thermal-hydraulic models and on comparison with boiling water reactor (BWR) stability measurements performed at Ringhals unit 1, during the startups of cycles 14 through 17.

  10. Managing Scientific Software Complexity with Bocca and CCA

    DOE PAGES

    Allan, Benjamin A.; Norris, Boyana; Elwasif, Wael R.; ...

    2008-01-01

    In high-performance scientific software development, the emphasis is often on short time to first solution. Even when the development of new components mostly reuses existing components or libraries and only small amounts of new code must be created, dealing with the component glue code and software build processes to obtain complete applications is still tedious and error-prone. Component-based software meant to reduce complexity at the application level increases complexity to the extent that the user must learn and remember the interfaces and conventions of the component model itself. To address these needs, we introduce Bocca, the first tool to enable application developers to perform rapid component prototyping while maintaining robust software-engineering practices suitable to HPC environments. Bocca provides project management and a comprehensive build environment for creating and managing applications composed of Common Component Architecture components. Of critical importance for high-performance computing (HPC) applications, Bocca is designed to operate in a language-agnostic way, simultaneously handling components written in any of the languages commonly used in scientific applications: C, C++, Fortran, Python and Java. Bocca automates the tasks related to the component glue code, freeing the user to focus on the scientific aspects of the application. Bocca embraces the philosophy pioneered by Ruby on Rails for web applications: start with something that works, and evolve it to the user's purpose.

  11. BARTTest: Community-Standard Atmospheric Radiative-Transfer and Retrieval Tests

    NASA Astrophysics Data System (ADS)

    Harrington, Joseph; Himes, Michael D.; Cubillos, Patricio E.; Blecic, Jasmina; Challener, Ryan C.

    2018-01-01

    Atmospheric radiative transfer (RT) codes are used both to predict planetary and brown-dwarf spectra and in retrieval algorithms to infer atmospheric chemistry, clouds, and thermal structure from observations. Observational plans, theoretical models, and scientific results depend on the correctness of these calculations. Yet, the calculations are complex and the codes implementing them are often written without modern software-verification techniques. The community needs a suite of test calculations with analytically, numerically, or at least community-verified results. We therefore present the Bayesian Atmospheric Radiative Transfer Test Suite, or BARTTest. BARTTest has four categories of tests: analytically verified RT tests of simple atmospheres (single line in single layer, line blends, saturation, isothermal, multiple line-list combination, etc.), community-verified RT tests of complex atmospheres, synthetic retrieval tests on simulated data with known answers, and community-verified real-data retrieval tests. BARTTest is open-source software intended for community use and further development. It is available at https://github.com/ExOSPORTS/BARTTest. We propose this test suite as a standard for verifying atmospheric RT and retrieval codes, analogous to the Held-Suarez test for general circulation models. This work was supported by NASA Planetary Atmospheres grant NX12AI69G, NASA Astrophysics Data Analysis Program grant NNX13AF38G, and NASA Exoplanets Research Program grant NNX17AB62G.
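
    The simplest of the analytically verified cases listed above (a single line in a single homogeneous layer) reduces to Beer-Lambert attenuation through a line profile, so an RT code's output can be checked against a closed-form transmission; the numbers and the Lorentzian profile below are illustrative and are not the actual BARTTest configuration.

        import numpy as np

        def single_line_transmission(nu, nu0, strength, hwhm, column):
            """Transmission through one homogeneous layer containing one Lorentzian line:
            tau(nu) = N * S * phi(nu), T(nu) = exp(-tau)."""
            phi = (hwhm / np.pi) / ((nu - nu0) ** 2 + hwhm ** 2)   # Lorentzian, integrates to 1
            tau = column * strength * phi
            return np.exp(-tau)

        nu = np.linspace(999.0, 1001.0, 501)    # wavenumber grid (cm^-1), illustrative values
        T = single_line_transmission(nu, nu0=1000.0, strength=1e-20, hwhm=0.05, column=1e21)
        # An RT code's output for this configuration can be compared to T point by point.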

  12. Design of orbital debris shields for oblique hypervelocity impact

    NASA Technical Reports Server (NTRS)

    Fahrenthold, Eric P.

    1994-01-01

    A new impact debris propagation code was written to link CTH simulations of space debris shield perforation to the Lagrangian finite element code DYNA3D, for space structure wall impact simulations. This software (DC3D) simulates debris cloud evolution using a nonlinear elastic-plastic deformable particle dynamics model, and renders computationally tractable the supercomputer simulation of oblique impacts on Whipple shield protected structures. Comparison of three dimensional, oblique impact simulations with experimental data shows good agreement over a range of velocities of interest in the design of orbital debris shielding. Source code developed during this research is provided on the enclosed floppy disk. An abstract based on the work described was submitted to the 1994 Hypervelocity Impact Symposium.

  13. Modeling and Simulation of Explosively Driven Electromechanical Devices

    NASA Astrophysics Data System (ADS)

    Demmie, Paul N.

    2002-07-01

    Components that store electrical energy in ferroelectric materials and produce currents when their permittivity is explosively reduced are used in a variety of applications. The modeling and simulation of such devices is a challenging problem since one has to represent the coupled physics of detonation, shock propagation, and electromagnetic field generation. The high fidelity modeling and simulation of complicated electromechanical devices was not feasible prior to having the Accelerated Strategic Computing Initiative (ASCI) computers and the ASCI developed codes at Sandia National Laboratories (SNL). The EMMA computer code is used to model such devices and simulate their operation. In this paper, I discuss the capabilities of the EMMA code for the modeling and simulation of one such electromechanical device, a slim-loop ferroelectric (SFE) firing set.

  14. Computer and laboratory simulation of interactions between spacecraft surfaces and charged-particle environments

    NASA Technical Reports Server (NTRS)

    Stevens, N. J.

    1979-01-01

    Cases where the charged-particle environment acts on the spacecraft (e.g., spacecraft charging phenomena) and cases where a system on the spacecraft causes the interaction (e.g., high voltage space power systems) are considered. Both categories were studied in ground simulation facilities to understand the processes involved and to measure the pertinent parameters. Computer simulations are based on the NASA Charging Analyzer Program (NASCAP) code. Analytical models are developed in this code and verified against the experimental data. Extrapolations from the small test samples to space conditions are made with this code. Typical results from laboratory and computer simulations are presented for both types of interactions. Extrapolations from these simulations to performance in space environments are discussed.

  15. SIGMA: A Knowledge-Based Simulation Tool Applied to Ecosystem Modeling

    NASA Technical Reports Server (NTRS)

    Dungan, Jennifer L.; Keller, Richard; Lawless, James G. (Technical Monitor)

    1994-01-01

    The need for better technology to facilitate building, sharing and reusing models is generally recognized within the ecosystem modeling community. The Scientists' Intelligent Graphical Modelling Assistant (SIGMA) creates an environment for model building, sharing and reuse which provides an alternative to more conventional approaches which too often yield poorly documented, awkwardly structured model code. The SIGMA interface presents the user a list of model quantities which can be selected for computation. Equations to calculate the model quantities may be chosen from an existing library of ecosystem modeling equations, or built using a specialized equation editor. Inputs for the equations may be supplied by data or by calculation from other equations. Each variable and equation is expressed using ecological terminology and scientific units, and is documented with explanatory descriptions and optional literature citations. Automatic scientific unit conversion is supported and only physically-consistent equations are accepted by the system. The system uses knowledge-based semantic conditions to decide which equations in its library make sense to apply in a given situation, and supplies these to the user for selection. The equations and variables are graphically represented as a flow diagram which provides a complete summary of the model. Forest-BGC, a stand-level model that simulates photosynthesis and evapo-transpiration for conifer canopies, was originally implemented in Fortran and subsequently re-implemented using SIGMA. The SIGMA version reproduces daily results and also provides a knowledge base which greatly facilitates inspection, modification and extension of Forest-BGC.
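
    The automatic unit conversion and physical-consistency checking described above can be pictured with a toy sketch in which units are carried as exponent dictionaries; this representation is hypothetical and far simpler than SIGMA's knowledge base.

        # Represent units as exponent dictionaries, e.g. W/m^2 -> {"kg": 1, "s": -3}.
        def units_multiply(a, b):
            out = dict(a)
            for dim, exp in b.items():
                out[dim] = out.get(dim, 0) + exp
                if out[dim] == 0:
                    del out[dim]
            return out

        def check_equation(lhs_units, rhs_units):
            """An equation is accepted only when both sides reduce to the same units."""
            if lhs_units != rhs_units:
                raise ValueError(f"inconsistent units: {lhs_units} vs {rhs_units}")

        # energy_flux [W/m^2] = conductance [W/(m^2 K)] * temperature_difference [K]
        w_per_m2   = {"kg": 1, "s": -3}
        w_per_m2_k = {"kg": 1, "s": -3, "K": -1}
        kelvin     = {"K": 1}
        check_equation(w_per_m2, units_multiply(w_per_m2_k, kelvin))   # passes silently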

  16. Enabling a Scientific Cloud Marketplace: VGL (Invited)

    NASA Astrophysics Data System (ADS)

    Fraser, R.; Woodcock, R.; Wyborn, L. A.; Vote, J.; Rankine, T.; Cox, S. J.

    2013-12-01

    The Virtual Geophysics Laboratory (VGL) provides a flexible, web-based environment where researchers can browse data and use a variety of scientific software packaged into toolkits that run in the Cloud. Both data and toolkits are published by multiple researchers and registered with the VGL infrastructure, forming a data and application marketplace. The VGL provides the basic workflow of Discovery and Access to the disparate data sources and a Library for toolkits and scripting to drive the scientific codes. Computation is then performed on the Research or Commercial Clouds. Provenance information is collected throughout the workflow and can be published alongside the results, allowing for experiment comparison and sharing with other researchers. VGL's "mix and match" approach to data, computational resources and scientific codes enables a dynamic approach to scientific collaboration. VGL allows scientists to publish their specific contribution, be it data, code, compute or workflow, knowing the VGL framework will provide other components needed for a complete application. Other scientists can choose the pieces that suit them best to assemble an experiment. The coarse-grain workflow of the VGL framework combined with the flexibility of the scripting library and computational toolkits allows for significant customisation and sharing amongst the community. The VGL utilises the cloud computational and storage resources from the Australian academic research cloud provided by the NeCTAR initiative and a large variety of data accessible from national and state agencies via the Spatial Information Services Stack (SISS - http://siss.auscope.org). VGL v1.2 screenshot: http://vgl.auscope.org

  17. SAC: Sheffield Advanced Code

    NASA Astrophysics Data System (ADS)

    Griffiths, Mike; Fedun, Viktor; Mumford, Stuart; Gent, Frederick

    2013-06-01

    The Sheffield Advanced Code (SAC) is a fully non-linear MHD code designed for simulations of linear and non-linear wave propagation in gravitationally strongly stratified magnetized plasma. It was developed primarily for the forward modelling of helioseismological processes and for the coupling processes in the solar interior, photosphere, and corona; it is built on the well-known VAC platform that allows robust simulation of the macroscopic processes in gravitationally stratified (non-)magnetized plasmas. The code has no limitations of simulation length in time imposed by complications originating from the upper boundary, nor does it require implementation of special procedures to treat the upper boundaries. SAC inherited its modular structure from VAC, which allows new physics to be added easily.

  18. Large Eddy Simulations and Turbulence Modeling for Film Cooling

    NASA Technical Reports Server (NTRS)

    Acharya, Sumanta

    1999-01-01

    The objective of the research is to perform Direct Numerical Simulations (DNS) and Large Eddy Simulations (LES) of the film cooling process, and to evaluate and improve advanced forms of the two-equation turbulence models for turbine blade surface flow analysis. The DNS/LES were used to resolve the large eddies within the flow field near the coolant jet location. The work involved code development and application of the developed codes to film cooling problems. Five different codes were developed and utilized to perform this research. This report presents a summary of the development of the codes and their application to analyzing the turbulence properties at locations near coolant injection holes.

  19. Maintaining Quality and Confidence in Open-Source, Evolving Software: Lessons Learned with PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Frederick, J. M.; Hammond, G. E.

    2017-12-01

    Software evolution in an open-source framework poses a major challenge to a geoscientific simulator, but when properly managed, the pay-off can be enormous for both the developers and the community at large. Developers must juggle implementing new scientific process models, adopting increasingly efficient numerical methods and programming paradigms, and changing funding sources (or a total lack of funding), while also ensuring that legacy code remains functional and reported bugs are fixed in a timely manner. With robust software engineering and a plan for long-term maintenance, a simulator can evolve over time, incorporating and leveraging many advances in the computational and domain sciences. In this positive light, what practices in software engineering and code maintenance can be employed within open-source development to maximize the positive aspects of software evolution and community contributions while minimizing its negative side effects? This presentation discusses steps taken in the development of PFLOTRAN (www.pflotran.org), an open-source, massively parallel subsurface simulator for multiphase, multicomponent, and multiscale reactive flow and transport processes in porous media. As PFLOTRAN's user base and development team continue to grow, it has become increasingly important to implement strategies which ensure sustainable software development while maintaining software quality and community confidence. In this presentation, we will share our experiences and "lessons learned" within the context of our open-source development framework and community engagement efforts. Topics discussed will include how we've leveraged both standard software engineering principles, such as coding standards, version control, and automated testing, as well as the unique advantages of object-oriented design in process model coupling, to ensure software quality and confidence. We will also be prepared to discuss the major challenges faced by most open-source software teams, such as on-boarding new developers or one-time contributions, dealing with competitors or lookie-loos, and other downsides of complete transparency, as well as our approach to community engagement, including a user group email list, hosting short courses and workshops for new users, and maintaining a website. SAND2017-8174A
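
    One concrete form of the automated testing mentioned above is a regression test that re-runs a small benchmark and compares its output against a stored 'gold' result within a tolerance; the executable name, command-line flag, file names, and tolerance below are placeholders and are not PFLOTRAN's actual test harness.

        import subprocess
        import numpy as np

        def run_regression(executable, input_file, output_file, gold_file, rtol=1e-6):
            """Re-run a small benchmark and fail loudly if its results drift from the
            stored gold standard beyond the stated relative tolerance."""
            subprocess.run([executable, "-input", input_file], check=True)  # placeholder CLI
            new = np.loadtxt(output_file)
            gold = np.loadtxt(gold_file)
            if not np.allclose(new, gold, rtol=rtol):
                raise AssertionError(f"{output_file} deviates from {gold_file} (rtol={rtol})")

        # Hypothetical usage:
        # run_regression("./simulator", "tracer_1d.in", "tracer_1d.out", "tracer_1d.gold")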

  20. Collaborative Simulation Grid: Multiscale Quantum-Mechanical/Classical Atomistic Simulations on Distributed PC Clusters in the US and Japan

    NASA Technical Reports Server (NTRS)

    Kikuchi, Hideaki; Kalia, Rajiv; Nakano, Aiichiro; Vashishta, Priya; Iyetomi, Hiroshi; Ogata, Shuji; Kouno, Takahisa; Shimojo, Fuyuki; Tsuruta, Kanji; Saini, Subhash

    2002-01-01

    A multidisciplinary, collaborative simulation has been performed on a Grid of geographically distributed PC clusters. The multiscale simulation approach seamlessly combines i) atomistic simulation based on the molecular dynamics (MD) method and ii) quantum mechanical (QM) calculation based on the density functional theory (DFT), so that accurate but less scalable computations are performed only where they are needed. The multiscale MD/QM simulation code has been Grid-enabled using i) a modular, additive hybridization scheme, ii) multiple QM clustering, and iii) computation/communication overlapping. The Gridified MD/QM simulation code has been used to study environmental effects of water molecules on fracture in silicon. A preliminary run of the code has achieved a parallel efficiency of 94% on 25 PCs distributed over 3 PC clusters in the US and Japan, and a larger test involving 154 processors on 5 distributed PC clusters is in progress.
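
    The 'modular, additive hybridization scheme' referenced above combines the two descriptions by summing the energies of the QM cluster and the classical environment plus an explicit coupling term; the decomposition below is a generic hybrid-scheme sketch, not the specific formulation used in this work.

        def hybrid_energy(e_qm_cluster, e_md_environment, e_coupling):
            """Additive hybrid scheme: total energy = QM energy of the embedded cluster
            + classical MD energy of the surrounding region + an explicit QM/MD coupling
            term (e.g., electrostatic or mechanical embedding)."""
            # Forces follow by differentiating each term with respect to atomic positions.
            return e_qm_cluster + e_md_environment + e_coupling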

  1. Implementation of a 3D halo neutral model in the TRANSP code and application to projected NSTX-U plasmas

    NASA Astrophysics Data System (ADS)

    Medley, S. S.; Liu, D.; Gorelenkova, M. V.; Heidbrink, W. W.; Stagner, L.

    2016-02-01

    A 3D halo neutral code developed at the Princeton Plasma Physics Laboratory and implemented for analysis using the TRANSP code is applied to projected National Spherical Torus eXperiment-Upgrade (NSTX-U) plasmas. The legacy TRANSP code did not handle halo neutrals properly since they were distributed over the plasma volume rather than remaining in the vicinity of the neutral beam footprint as is actually the case. The 3D halo neutral code uses a ‘beam-in-a-box’ model that encompasses both injected beam neutrals and resulting halo neutrals. Upon deposition by charge exchange, a subset of the full, one-half and one-third beam energy components produce first-generation halo neutrals that are tracked through successive generations until an ionization event occurs or the descendant halos exit the box. The 3D halo neutral model and neutral particle analyzer (NPA) simulator in the TRANSP code have been benchmarked with the Fast-Ion D-Alpha simulation (FIDAsim) code, which provides Monte Carlo simulations of beam neutral injection, attenuation, halo generation, halo spatial diffusion, and photoemission processes. When using the same atomic physics database, TRANSP and FIDAsim simulations achieve excellent agreement on the spatial profile and magnitude of beam and halo neutral densities and the NPA energy spectrum. The simulations show that the halo neutral density can be comparable to the beam neutral density. These halo neutrals can double the NPA flux, but they have minor effects on the NPA energy spectrum shape. The TRANSP and FIDAsim simulations also suggest that the magnitudes of beam and halo neutral densities are relatively sensitive to the choice of the atomic physics databases.
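
    The generation-by-generation bookkeeping described above (each halo neutral is followed until it ionizes or leaves the box, otherwise it charge-exchanges into a next-generation halo) can be sketched as a simple Monte Carlo loop; the branching probabilities below are placeholders, not the TRANSP atomic-physics rates.

        import random

        def track_halo_generations(n_first_gen, p_ionize=0.3, p_exit=0.2, max_gen=20):
            """Count halo neutrals per generation. Each neutral either ionizes (removed),
            exits the box (removed), or charge-exchanges and spawns a next-generation
            halo neutral (illustrative branching only)."""
            counts, current = [], n_first_gen
            for _ in range(max_gen):
                if current == 0:
                    break
                counts.append(current)
                survivors = 0
                for _ in range(current):
                    if random.random() > p_ionize + p_exit:   # charge exchange -> new halo
                        survivors += 1
                current = survivors
            return counts

        print(track_halo_generations(10000))   # e.g., [10000, 5010, 2493, ...]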

  2. Implementation of a 3D halo neutral model in the TRANSP code and application to projected NSTX-U plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Medley, S. S.; Liu, D.; Gorelenkova, M. V.

    2016-01-12

    A 3D halo neutral code developed at the Princeton Plasma Physics Laboratory and implemented for analysis using the TRANSP code is applied to projected National Spherical Torus eXperiment-Upgrade (NSTX-U) plasmas. The legacy TRANSP code did not handle halo neutrals properly since they were distributed over the plasma volume rather than remaining in the vicinity of the neutral beam footprint as is actually the case. The 3D halo neutral code uses a 'beam-in-a-box' model that encompasses both injected beam neutrals and resulting halo neutrals. Upon deposition by charge exchange, a subset of the full, one-half and one-third beam energy components produce first-generation halo neutrals that are tracked through successive generations until an ionization event occurs or the descendant halos exit the box. The 3D halo neutral model and neutral particle analyzer (NPA) simulator in the TRANSP code have been benchmarked with the Fast-Ion D-Alpha simulation (FIDAsim) code, which provides Monte Carlo simulations of beam neutral injection, attenuation, halo generation, halo spatial diffusion, and photoemission processes. When using the same atomic physics database, TRANSP and FIDAsim simulations achieve excellent agreement on the spatial profile and magnitude of beam and halo neutral densities and the NPA energy spectrum. The simulations show that the halo neutral density can be comparable to the beam neutral density. These halo neutrals can double the NPA flux, but they have minor effects on the NPA energy spectrum shape. The TRANSP and FIDAsim simulations also suggest that the magnitudes of beam and halo neutral densities are relatively sensitive to the choice of the atomic physics databases.

  3. ogs6 - a new concept for porous-fractured media simulations

    NASA Astrophysics Data System (ADS)

    Naumov, Dmitri; Bilke, Lars; Fischer, Thomas; Rink, Karsten; Wang, Wenqing; Watanabe, Norihiro; Kolditz, Olaf

    2015-04-01

    OpenGeoSys (OGS) is a scientific open-source initiative for numerical simulation of thermo-hydro-mechanical/chemical (THMC) processes in porous and fractured media, continuously developed since the mid-eighties. The basic concept is to provide a flexible numerical framework for solving coupled multi-field problems. OGS mainly targets applications in environmental geoscience, e.g. in the fields of contaminant hydrology, water resources management, waste deposits, or geothermal energy systems, but it has also recently been applied successfully to new topics in energy storage. OGS actively participates in several international benchmarking initiatives, e.g. DECOVALEX (waste management), CO2BENCH (CO2 storage and sequestration), SeSBENCH (reactive transport processes) and HM-Intercomp (coupled hydrosystems). Despite the broad applicability of OGS in the geo-, hydro- and energy-sciences, several shortcomings became obvious: the computational efficiency was limited, and the code structure had become too complicated for further efficient development. OGS-5 was designed for object-oriented FEM applications. However, in many multi-field problems a certain flexibility of tailored numerical schemes is essential. Therefore, a new concept was designed to overcome existing bottlenecks. The paradigms for ogs6 are: flexibility of numerical schemes (FEM, FVM, FDM); computational efficiency (PetaScale ready); and developer- and user-friendliness. ogs6 has a module-oriented architecture based on thematic libraries (e.g. MeshLib, NumLib) on the large scale and uses an object-oriented approach for the small-scale interfaces. Usage of a linear algebra library (Eigen3) for the mathematical operations, together with the ISO C++11 standard, increases the expressiveness of the code and makes it more developer-friendly. The new C++ standard also makes the template meta-programming code used for compile-time optimizations more compact. We have transitioned the main code development to the GitHub code hosting system (https://github.com/ufz/ogs). The very flexible revision control system Git, in combination with issue tracking, developer feedback and code review options, improves the code quality and the development process in general. The continuous testing procedure of the benchmarks, as established for OGS-5, is maintained. Additionally, unit testing, which is automatically triggered by any code change, is executed by two continuous integration frameworks (Jenkins CI, Travis CI) which build and test the code on different operating systems (Windows, Linux, Mac OS), in multiple configurations and with different compilers (GCC, Clang, Visual Studio). To improve the testing possibilities further, XML-based file input formats are introduced, helping with automatic validation of the user-contributed benchmarks. The first ogs6 prototype, version 6.0.1, has been implemented for solving generic elliptic problems. Next steps envisage extension to transient, non-linear and coupled problems. Literature: [1] Kolditz O, Shao H, Wang W, Bauer S (eds) (2014): Thermo-Hydro-Mechanical-Chemical Processes in Fractured Porous Media: Modelling and Benchmarking - Closed Form Solutions. In: Terrestrial Environmental Sciences, Vol. 1, Springer, Heidelberg, ISBN 978-3-319-11893-2, 315pp.
http://www.springer.com/earth+sciences+and+geography/geology/book/978-3-319-11893-2 [2] Naumov D (2015): Computational Fluid Dynamics in Unconsolidated Sediments: Model Generation and Discrete Flow Simulations, PhD thesis, Technische Universität Dresden.

  4. Finite element methods in a simulation code for offshore wind turbines

    NASA Astrophysics Data System (ADS)

    Kurz, Wolfgang

    1994-06-01

    Offshore installation of wind turbines will become important for electricity supply in the future. Wind conditions above the sea are more favorable than on land, and appropriate locations on land are limited and restricted. The dynamic behavior of advanced wind turbines is investigated with digital simulations to reduce time and cost in the development and design phase. A wind turbine can be described and simulated as a multi-body system containing rigid and flexible bodies. Simulation of the non-linear motion of such a mechanical system using a multi-body system code is much faster than using a finite element code. However, a modal representation of the deformation field has to be incorporated in the multi-body system approach. The equations of motion of flexible bodies due to deformation are generated by finite element calculations. At Delft University of Technology the simulation code DUWECS has been developed, which simulates the non-linear behavior of wind turbines in the time domain. The wind turbine is divided into subcomponents which are represented by modules (e.g. rotor, tower, etc.).
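
    For reference, the modal representation mentioned above expands the deformation field of each flexible body in a truncated set of mode shapes; the notation below is the textbook form and is not necessarily the one used in DUWECS:

    \[
      \mathbf{u}(\mathbf{x},t) \;\approx\; \sum_{i=1}^{n} \boldsymbol{\phi}_i(\mathbf{x})\, q_i(t),
    \]

    where the mode shapes \(\boldsymbol{\phi}_i\) are obtained from a finite element eigenanalysis of the body and the modal coordinates \(q_i(t)\) become degrees of freedom of the multi-body equations of motion.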

  5. Schnek: A C++ library for the development of parallel simulation codes on regular grids

    NASA Astrophysics Data System (ADS)

    Schmitz, Holger

    2018-05-01

    A large number of algorithms across the field of computational physics are formulated on grids with a regular topology. We present Schnek, a library that enables fast development of parallel simulations on regular grids. Schnek contains a number of easy-to-use modules that greatly reduce the amount of administrative code for large-scale simulation codes. The library provides an interface for reading simulation setup files with a hierarchical structure. The structure of the setup file is translated into a hierarchy of simulation modules that the developer can specify. The reader parses and evaluates mathematical expressions and initialises variables or grid data. This enables developers to write modular and flexible simulation codes with minimal effort. Regular grids of arbitrary dimension are defined as well as mechanisms for defining physical domain sizes, grid staggering, and ghost cells on these grids. Ghost cells can be exchanged between neighbouring processes using MPI with a simple interface. The grid data can easily be written into HDF5 files using serial or parallel I/O.
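
    To make concrete the kind of administrative code such a library removes, the sketch below hand-codes a one-dimensional ghost-cell exchange with mpi4py. It illustrates the generic pattern only and does not use Schnek's actual interface; the grid size and data are arbitrary assumptions.

    ```python
    # Run with, e.g.: mpirun -n 4 python ghost_exchange.py
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    n_local = 8                                    # interior cells per process (assumed)
    u = np.full(n_local + 2, float(rank))          # one ghost cell on each side

    left, right = (rank - 1) % size, (rank + 1) % size
    recv = np.empty(1)

    # Send the rightmost interior cell to the right neighbour and receive the
    # left ghost cell from the left neighbour (periodic decomposition).
    comm.Sendrecv(u[n_local:n_local + 1], dest=right, recvbuf=recv, source=left)
    u[0] = recv[0]
    # Mirror exchange in the other direction fills the right ghost cell.
    comm.Sendrecv(u[1:2], dest=left, recvbuf=recv, source=right)
    u[n_local + 1] = recv[0]

    print(rank, u)
    ```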

  6. MuSim, a Graphical User Interface for Multiple Simulation Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, Thomas; Cummings, Mary Anne; Johnson, Rolland

    2016-06-01

    MuSim is a new user-friendly program designed to interface to many different particle simulation codes, regardless of their data formats or geometry descriptions. It presents the user with a compelling graphical user interface that includes a flexible 3-D view of the simulated world plus powerful editing and drag-and-drop capabilities. All aspects of the design can be parametrized so that parameter scans and optimizations are easy. It is simple to create plots and display events in the 3-D viewer (with a slider to vary the transparency of solids), allowing for an effortless comparison of different simulation codes. Supported simulation codes: G4beamline, MAD-X, and MCNP; more are coming. Many accelerator design tools and beam optics codes were written long ago, with primitive user interfaces by today's standards. MuSim is specifically designed to make it easy to interface to such codes, providing a common user experience for all, and permitting the construction and exploration of models with very little overhead. For today's technology-driven students, graphical interfaces meet their expectations far better than text-based tools, and education in accelerator physics is one of our primary goals.

  7. Simulation studies of chemical erosion on carbon based materials at elevated temperatures

    NASA Astrophysics Data System (ADS)

    Kenmotsu, T.; Kawamura, T.; Li, Zhijie; Ono, T.; Yamamura, Y.

    1999-06-01

    We simulated the fluence dependence of the methane reaction yield for hydrogen bombardment of carbon using the ACAT-DIFFUSE code. The ACAT-DIFFUSE code is a simulation code based on a Monte Carlo method with a binary collision approximation and on solving diffusion equations. The chemical reaction model in carbon was studied by Roth and other researchers. Roth's model is suitable for the steady-state methane reaction, but it cannot estimate the fluence dependence of the methane reaction. We therefore derived an empirical formula for the methane reaction based on Roth's model. In this empirical formula we assume a reaction region where chemical sputtering due to methane formation takes place; this region corresponds to the peak of the incident hydrogen range distribution in the target material. We incorporated this empirical formula into the ACAT-DIFFUSE code. The simulation results show a fluence dependence similar to the experimental results, but the fluence required to reach steady state differs between experiment and simulation.

  8. Computational methods for coupling microstructural and micromechanical materials response simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  9. Neptune: An astrophysical smooth particle hydrodynamics code for massively parallel computer architectures

    NASA Astrophysics Data System (ADS)

    Sandalski, Stou

    Smooth particle hydrodynamics is an efficient method for modeling the dynamics of fluids. It is commonly used to simulate astrophysical processes such as binary mergers. We present a newly developed GPU-accelerated smooth particle hydrodynamics code for astrophysical simulations. The code is named Neptune after the Roman god of water. It is written in OpenMP-parallelized C++ and OpenCL and includes octree-based hydrodynamic and gravitational acceleration. The design relies on object-oriented methodologies in order to provide a flexible and modular framework that can be easily extended and modified by the user. Several pre-built scenarios for simulating collisions of polytropes and black-hole accretion are provided. The code is released under the MIT Open Source license and publicly available at http://code.google.com/p/neptune-sph/.
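
    As background, the smoothed-particle estimate on which such codes are built approximates any field A at position r from neighbouring particles j (standard textbook form; Neptune's specific kernel and symmetrization are not stated in the abstract):

    \[
      A(\mathbf{r}) \;\approx\; \sum_j m_j\, \frac{A_j}{\rho_j}\, W\!\left(\left|\mathbf{r}-\mathbf{r}_j\right|, h\right),
    \]

    with particle masses \(m_j\), densities \(\rho_j\), smoothing length \(h\), and kernel \(W\).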

  10. Efficient Modeling of Laser-Plasma Accelerators with INF&RNO

    NASA Astrophysics Data System (ADS)

    Benedetti, C.; Schroeder, C. B.; Esarey, E.; Geddes, C. G. R.; Leemans, W. P.

    2010-11-01

    The numerical modeling code INF&RNO (INtegrated Fluid & paRticle simulatioN cOde, pronounced "inferno") is presented. INF&RNO is an efficient 2D cylindrical code to model the interaction of a short laser pulse with an underdense plasma. The code is based on an envelope model for the laser, while either a PIC or a fluid description can be used for the plasma. The effect of the laser pulse on the plasma is modeled with the time-averaged ponderomotive force. These and other features allow for a speedup of 2-4 orders of magnitude compared to standard full PIC simulations while still retaining physical fidelity. The code has been benchmarked against analytical solutions and 3D PIC simulations, and here a set of validation tests, together with a discussion of performance, is presented.
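
    For reference, the time-averaged ponderomotive force applied to the plasma by an envelope model is, in its standard textbook form (the specific normalization used inside INF&RNO may differ),

    \[
      \mathbf{F}_p \;=\; -\,\frac{e^{2}}{4\, m_e\, \omega_0^{2}}\, \nabla \left|\tilde{E}\right|^{2},
    \]

    where \(\tilde{E}\) is the slowly varying envelope of the laser electric field and \(\omega_0\) is the laser frequency.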

  11. Micromagnetic Code Development of Advanced Magnetic Structures Final Report CRADA No. TC-1561-98

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cerjan, Charles J.; Shi, Xizeng

    The specific goals of this project were to: Further develop the previously written micromagnetic code DADIMAG (DOE code release number 980017); Validate the code. The resulting code was expected to be more realistic and useful for simulations of magnetic structures of specific interest to Read-Rite programs. We also planned to further develop the code for use in internal LLNL programs. This project complemented LLNL CRADA TC-840-94 between LLNL and Read-Rite, which allowed for simulations of the advanced magnetic head development completed under the CRADA. TC-1561-98 was effective concurrently with LLNL non-exclusive copyright license (TL-1552-98) to Read-Rite for DADIMAG Version 2 executable code.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnold H. Kritz

    PTRANSP, which is the predictive version of the TRANSP code, was developed in a collaborative effort involving the Princeton Plasma Physics Laboratory, General Atomics Corporation, Lawrence Livermore National Laboratory, and Lehigh University. The PTRANSP/TRANSP suite of codes is the premier integrated tokamak modeling software in the United States. A production service for PTRANSP/TRANSP simulations is maintained at the Princeton Plasma Physics Laboratory; the server has a simple command line client interface and is subscribed to by about 100 researchers from tokamak projects in the US, Europe, and Asia. This service produced nearly 13000 PTRANSP/TRANSP simulations in the four year period FY 2005 through FY 2008. Major archives of TRANSP results are maintained at PPPL, MIT, General Atomics, and JET. Recent utilization, counting experimental analysis simulations as well as predictive simulations, more than doubled from slightly over 2000 simulations per year in FY 2005 and FY 2006 to over 4300 simulations per year in FY 2007 and FY 2008. PTRANSP predictive simulations applied to ITER increased eightfold from 30 simulations per year in FY 2005 and FY 2006 to 240 simulations per year in FY 2007 and FY 2008, accounting for more than half of combined PTRANSP/TRANSP service CPU resource utilization in FY 2008. PTRANSP studies focused on ITER played a key role in journal articles. Examples of validation studies carried out for momentum transport in PTRANSP simulations were presented at the 2008 IAEA conference. The increase in the number of PTRANSP simulations has continued (more than 7000 TRANSP/PTRANSP simulations in 2010), and results of PTRANSP simulations appear in conference proceedings, for example the 2010 IAEA conference, and in peer reviewed papers. PTRANSP provides a bridge to the Fusion Simulation Program (FSP) and to the future of integrated modeling. Through years of widespread usage, each of the many parts of the PTRANSP suite of codes has been thoroughly validated against experimental data and benchmarked against other codes. At the same time, architectural modernizations are improving the modularity of the PTRANSP code base. The NUBEAM neutral beam and fusion products fast ion model, the Plasma State data repository (developed originally in the SWIM SciDAC project and adapted for use in PTRANSP), and other components are already shared with the SWIM, FACETS, and CPES SciDAC FSP prototype projects. Thus, the PTRANSP code is already serving as a bridge between our present integrated modeling capability and future capability. As the Fusion Simulation Program builds toward the facility currently available in the PTRANSP suite of codes, early versions of the FSP core plasma model will need to be benchmarked against the PTRANSP simulations. This will be necessary to build user confidence in FSP, but this benchmarking can only be done if PTRANSP itself is maintained and developed.

  13. Integrated Devices and Systems | Grid Modernization | NREL

    Science.gov Websites

    Website navigation text only; listed topics include energy storage models, microgrids, grid simulation and power hardware-in-the-loop, and grid standards and codes. Contact: Barry Mather, Ph.D.

  14. The Application of SERS (Surface Enhanced Raman Scattering) to Study Surface Oxidation Reactions of Phosphonates.

    DTIC Science & Technology

    1988-02-15

  15. 75 FR 61139 - Board of Scientific Counselors (BOSC); Request for Nominations of Experts

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-04

    Notice text is fragmentary; recoverable details: contact Heather Drumm, Mail Code 8104-R, Office of Science Policy, Office of Research and Development; areas of expertise sought include bioinformatics, socioeconomics, environmental justice, and science policy; signed by the Acting Director, Office of Science Policy. [FR Doc. 2010-24805 Filed 10-1-10; 8:45 am] BILLING CODE 6560...

  16. Planck 2015 results. I. Overview of products and scientific results

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Adam, R.; Ade, P. A. R.; Aghanim, N.; Akrami, Y.; Alves, M. I. R.; Argüeso, F.; Arnaud, M.; Arroja, F.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Ballardini, M.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Bartolo, N.; Basak, S.; Battaglia, P.; Battaner, E.; Battye, R.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bertincourt, B.; Bielewicz, P.; Bikmaev, I.; Bock, J. J.; Böhringer, H.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burenin, R.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Carvalho, P.; Casaponsa, B.; Castex, G.; Catalano, A.; Challinor, A.; Chamballu, A.; Chary, R.-R.; Chiang, H. C.; Chluba, J.; Chon, G.; Christensen, P. R.; Church, S.; Clemens, M.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Comis, B.; Contreras, D.; Couchot, F.; Coulais, A.; Crill, B. P.; Cruz, M.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Désert, F.-X.; Di Valentino, E.; Dickinson, C.; Diego, J. M.; Dolag, K.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Eisenhardt, P. R. M.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Falgarone, E.; Fantaye, Y.; Farhang, M.; Feeney, S.; Fergusson, J.; Fernandez-Cobos, R.; Feroz, F.; Finelli, F.; Florido, E.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschet, C.; Franceschi, E.; Frejsel, A.; Frolov, A.; Galeotta, S.; Galli, S.; Ganga, K.; Gauthier, C.; Génova-Santos, R. T.; Gerbino, M.; Ghosh, T.; Giard, M.; Giraud-Héraud, Y.; Giusarma, E.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Grainge, K. J. B.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hamann, J.; Handley, W.; Hansen, F. K.; Hanson, D.; Harrison, D. L.; Heavens, A.; Helou, G.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huang, Z.; Huffenberger, K. M.; Hurier, G.; Ilić, S.; Jaffe, A. H.; Jaffe, T. R.; Jin, T.; Jones, W. C.; Juvela, M.; Karakci, A.; Keihänen, E.; Keskitalo, R.; Khamitov, I.; Kiiveri, K.; Kim, J.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Knox, L.; Krachmalnicoff, N.; Kunz, M.; Kurki-Suonio, H.; Lacasa, F.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Langer, M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Le Jeune, M.; Leahy, J. P.; Lellouch, E.; Leonardi, R.; León-Tavares, J.; Lesgourgues, J.; Levrier, F.; Lewis, A.; Liguori, M.; Lilje, P. B.; Lilley, M.; Linden-Vørnle, M.; Lindholm, V.; Liu, H.; López-Caniego, M.; Lubin, P. M.; Ma, Y.-Z.; Macías-Pérez, J. F.; Maggio, G.; Maino, D.; Mak, D. S. Y.; Mandolesi, N.; Mangilli, A.; Marchini, A.; Marcos-Caballero, A.; Marinucci, D.; Maris, M.; Marshall, D. J.; Martin, P. G.; Martinelli, M.; Martínez-González, E.; Masi, S.; Matarrese, S.; Mazzotta, P.; McEwen, J. D.; McGehee, P.; Mei, S.; Meinhold, P. R.; Melchiorri, A.; Melin, J.-B.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mikkelsen, K.; Millea, M.; Mitra, S.; Miville-Deschênes, M.-A.; Molinari, D.; Moneti, A.; Montier, L.; Moreno, R.; Morgante, G.; Mortlock, D.; Moss, A.; Mottet, S.; Münchmeyer, M.; Munshi, D.; Murphy, J. A.; Narimani, A.; Naselsky, P.; Nastasi, A.; Nati, F.; Natoli, P.; Negrello, M.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; Olamaie, M.; Oppermann, N.; Orlando, E.; Oxborrow, C. 
A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Pandolfi, S.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Peel, M.; Peiris, H. V.; Pelkonen, V.-M.; Perdereau, O.; Perotto, L.; Perrott, Y. C.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pogosyan, D.; Pointecouteau, E.; Polenta, G.; Popa, L.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Racine, B.; Reach, W. T.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Roman, M.; Romelli, E.; Rosset, C.; Rossetti, M.; Rotti, A.; Roudier, G.; Rouillé d'Orfeuil, B.; Rowan-Robinson, M.; Rubiño-Martín, J. A.; Ruiz-Granados, B.; Rumsey, C.; Rusholme, B.; Said, N.; Salvatelli, V.; Salvati, L.; Sandri, M.; Sanghera, H. S.; Santos, D.; Saunders, R. D. E.; Sauvé, A.; Savelainen, M.; Savini, G.; Schaefer, B. M.; Schammel, M. P.; Scott, D.; Seiffert, M. D.; Serra, P.; Shellard, E. P. S.; Shimwell, T. W.; Shiraishi, M.; Smith, K.; Souradeep, T.; Spencer, L. D.; Spinelli, M.; Stanford, S. A.; Stern, D.; Stolyarov, V.; Stompor, R.; Strong, A. W.; Sudiwala, R.; Sunyaev, R.; Sutter, P.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Tavagnacco, D.; Terenzi, L.; Texier, D.; Toffolatti, L.; Tomasi, M.; Tornikoski, M.; Tramonte, D.; Tristram, M.; Troja, A.; Trombetti, T.; Tucci, M.; Tuovinen, J.; Türler, M.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, F.; Vassallo, T.; Vibert, L.; Vidal, M.; Viel, M.; Vielva, P.; Villa, F.; Wade, L. A.; Walter, B.; Wandelt, B. D.; Watson, R.; Wehus, I. K.; Welikala, N.; Weller, J.; White, M.; White, S. D. M.; Wilkinson, A.; Yvon, D.; Zacchei, A.; Zibin, J. P.; Zonca, A.

    2016-09-01

    The European Space Agency's Planck satellite, which is dedicated to studying the early Universe and its subsequent evolution, was launched on 14 May 2009. It scanned the microwave and submillimetre sky continuously between 12 August 2009 and 23 October 2013. In February 2015, ESA and the Planck Collaboration released the second set of cosmology products based on data from the entire Planck mission, including both temperature and polarization, along with a set of scientific and technical papers and a web-based explanatory supplement. This paper gives an overview of the main characteristics of the data and the data products in the release, as well as the associated cosmological and astrophysical science results and papers. The data products include maps of the cosmic microwave background (CMB), the thermal Sunyaev-Zeldovich effect, diffuse foregrounds in temperature and polarization, catalogues of compact Galactic and extragalactic sources (including separate catalogues of Sunyaev-Zeldovich clusters and Galactic cold clumps), and extensive simulations of signals and noise used in assessing uncertainties and the performance of the analysis methods. The likelihood code used to assess cosmological models against the Planck data is described, along with a CMB lensing likelihood. Scientific results include cosmological parameters derived from CMB power spectra, gravitational lensing, and cluster counts, as well as constraints on inflation, non-Gaussianity, primordial magnetic fields, dark energy, and modified gravity, and new results on low-frequency Galactic foregrounds.

  17. MicroHH 1.0: a computational fluid dynamics code for direct numerical simulation and large-eddy simulation of atmospheric boundary layer flows

    NASA Astrophysics Data System (ADS)

    van Heerwaarden, Chiel C.; van Stratum, Bart J. H.; Heus, Thijs; Gibbs, Jeremy A.; Fedorovich, Evgeni; Mellado, Juan Pedro

    2017-08-01

    This paper describes MicroHH 1.0, a new and open-source (www.microhh.org) computational fluid dynamics code for the simulation of turbulent flows in the atmosphere. It is primarily made for direct numerical simulation but also supports large-eddy simulation (LES). The paper covers the description of the governing equations, their numerical implementation, and the parameterizations included in the code. Furthermore, the paper presents the validation of the dynamical core in the form of convergence and conservation tests, and comparison of simulations of channel flows and slope flows against well-established test cases. The full numerical model, including the associated parameterizations for LES, has been tested for a set of cases under stable and unstable conditions, under the Boussinesq and anelastic approximations, and with dry and moist convection under stationary and time-varying boundary conditions. The paper presents performance tests showing good scaling from 256 to 32 768 processes. The graphical processing unit (GPU)-enabled version of the code can reach a speedup of more than an order of magnitude for simulations that fit in the memory of a single GPU.

  18. electromagnetics, eddy current, computer codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gartling, David

    TORO Version 4 is designed for finite element analysis of steady, transient and time-harmonic, multi-dimensional, quasi-static problems in electromagnetics. The code allows simulation of electrostatic fields, steady current flows, magnetostatics and eddy current problems in plane or axisymmetric, two-dimensional geometries. TORO is easily coupled to heat conduction and solid mechanics codes to allow multi-physics simulations to be performed.

  19. Validation of a Monte Carlo code system for grid evaluation with interference effect on Rayleigh scattering

    NASA Astrophysics Data System (ADS)

    Zhou, Abel; White, Graeme L.; Davidson, Rob

    2018-02-01

    Anti-scatter grids are commonly used in x-ray imaging systems to reduce the scatter radiation reaching the image receptor. Anti-scatter grid performance and validation can be simulated through use of Monte Carlo (MC) methods. Our recently reported work has modified existing MC codes, resulting in improved performance when simulating x-ray imaging. The aim of this work is to validate the transmission of x-ray photons in grids from the recently reported new MC codes against experimental results and results previously reported in other literature. The results of this work show that the scatter-to-primary ratio (SPR) and the transmissions of primary (T_p), scatter (T_s), and total (T_t) radiation determined using this new MC code system agree strongly with the experimental results and the results reported in the literature. T_p, T_s, T_t, and SPR determined in this new MC simulation code system are valid. These results also show that the interference effect on Rayleigh scattering should not be neglected in the evaluation of both mammographic and general grids. Our new MC simulation code system has been shown to be valid and can be used for analysing and evaluating the designs of grids.
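
    A brief worked relation may help fix the quantities being compared. Assuming T_p and T_s denote the fractions of primary and scatter radiation transmitted by the grid, and SPR denotes the scatter-to-primary ratio incident on the grid, the total transmission follows as sketched below; these definitions and numbers are illustrative assumptions, not values taken from the study.

    ```python
    def total_transmission(t_p, t_s, spr_in):
        """Total transmission of a grid from its primary/scatter transmissions
        and the incident scatter-to-primary ratio (illustrative definitions)."""
        return (t_p + spr_in * t_s) / (1.0 + spr_in)

    def spr_behind_grid(t_p, t_s, spr_in):
        """Scatter-to-primary ratio behind the grid (illustrative definitions)."""
        return spr_in * t_s / t_p

    # Hypothetical example numbers, not measurements from the paper:
    print(total_transmission(0.70, 0.10, spr_in=4.0))  # ~0.22
    print(spr_behind_grid(0.70, 0.10, spr_in=4.0))     # ~0.57
    ```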

  20. Recent Progress and Future Plans for Fusion Plasma Synthetic Diagnostics Platform

    NASA Astrophysics Data System (ADS)

    Shi, Lei; Kramer, Gerrit; Tang, William; Tobias, Benjamin; Valeo, Ernest; Churchill, Randy; Hausammann, Loic

    2015-11-01

    The Fusion Plasma Synthetic Diagnostics Platform (FPSDP) is a Python package developed at the Princeton Plasma Physics Laboratory. It is dedicated to providing an integrated programmable environment for applying a modern ensemble of synthetic diagnostics to the experimental validation of fusion plasma simulation codes. The FPSDP will allow physicists to directly compare key laboratory measurements to simulation results. This enables deeper understanding of experimental data, more realistic validation of simulation codes, quantitative assessment of existing diagnostics, and new capabilities for the design and optimization of future diagnostics. The Fusion Plasma Synthetic Diagnostics Platform now has data interfaces for the GTS and XGC-1 global particle-in-cell simulation codes with synthetic diagnostic modules including: (i) 2D and 3D Reflectometry; (ii) Beam Emission Spectroscopy; and (iii) 1D Electron Cyclotron Emission. Results will be reported on the delivery of interfaces for the global electromagnetic PIC code GTC, the extended MHD M3D-C1 code, and the electromagnetic hybrid NOVAK eigenmode code. Progress toward development of a more comprehensive 2D Electron Cyclotron Emission module will also be discussed. This work is supported by DOE contract #DEAC02-09CH11466.

  1. Parser for Sabin-to-Mahoney Transition Model of Quasispecies Replication

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ecale Zhou, Carol

    2016-01-03

    This code is a data parser for preparing output from the Qspp agent-based stochastic simulation model for plotting in Excel. The code is specific to a set of simulations that were run for the purpose of preparing data for a publication. It is necessary to make this code open source in order to publish the model code (Qspp), which has already been released. There is a necessity of assuring that results from using Qspp for a publication

  2. Capabilities overview of the MORET 5 Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Cochet, B.; Jinaphanh, A.; Heulers, L.; Jacquet, O.

    2014-06-01

    The MORET code is a simulation tool that solves the transport equation for neutrons using the Monte Carlo method. It allows users to model complex three-dimensional geometrical configurations, describe the materials, and define their own tallies in order to analyse the results. The MORET code was initially designed to perform calculations for criticality safety assessments. New features have been introduced in the MORET 5 code to expand its use for reactor applications. This paper presents an overview of the MORET 5 code capabilities, going through the description of materials, the geometry modelling, the transport simulation and the definition of the outputs.

  3. An empirical analysis of journal policy effectiveness for computational reproducibility.

    PubMed

    Stodden, Victoria; Seiler, Jennifer; Ma, Zhaokun

    2018-03-13

    A key component of scientific communication is sufficient information for other researchers in the field to reproduce published findings. For computational and data-enabled research, this has often been interpreted to mean making available the raw data from which results were generated, the computer code that generated the findings, and any additional information needed such as workflows and input parameters. Many journals are revising author guidelines to include data and code availability. This work evaluates the effectiveness of journal policy that requires the data and code necessary for reproducibility be made available postpublication by the authors upon request. We assess the effectiveness of such a policy by (i) requesting data and code from authors and (ii) attempting replication of the published findings. We chose a random sample of 204 scientific papers published in the journal Science after the implementation of their policy in February 2011. We found that we were able to obtain artifacts from 44% of our sample and were able to reproduce the findings for 26%. We find this policy (author remission of data and code postpublication upon request) an improvement over no policy, but currently insufficient for reproducibility.

  4. An empirical analysis of journal policy effectiveness for computational reproducibility

    PubMed Central

    Seiler, Jennifer; Ma, Zhaokun

    2018-01-01

    A key component of scientific communication is sufficient information for other researchers in the field to reproduce published findings. For computational and data-enabled research, this has often been interpreted to mean making available the raw data from which results were generated, the computer code that generated the findings, and any additional information needed such as workflows and input parameters. Many journals are revising author guidelines to include data and code availability. This work evaluates the effectiveness of journal policy that requires the data and code necessary for reproducibility be made available postpublication by the authors upon request. We assess the effectiveness of such a policy by (i) requesting data and code from authors and (ii) attempting replication of the published findings. We chose a random sample of 204 scientific papers published in the journal Science after the implementation of their policy in February 2011. We found that we were able to obtain artifacts from 44% of our sample and were able to reproduce the findings for 26%. We find this policy—author remission of data and code postpublication upon request—an improvement over no policy, but currently insufficient for reproducibility. PMID:29531050

  5. LDPC Codes with Minimum Distance Proportional to Block Size

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Jones, Christopher; Dolinar, Samuel; Thorpe, Jeremy

    2009-01-01

    Low-density parity-check (LDPC) codes characterized by minimum Hamming distances proportional to block sizes have been demonstrated. Like the codes mentioned in the immediately preceding article, the present codes are error-correcting codes suitable for use in a variety of wireless data-communication systems that include noisy channels. The previously mentioned codes have low decoding thresholds and reasonably low error floors. However, the minimum Hamming distances of those codes do not grow linearly with code-block sizes. Codes that have this minimum-distance property exhibit very low error floors. Examples of such codes include regular LDPC codes with variable degrees of at least 3. Unfortunately, the decoding thresholds of regular LDPC codes are high. Hence, there is a need for LDPC codes characterized by both low decoding thresholds and, in order to obtain acceptably low error floors, minimum Hamming distances that are proportional to code-block sizes. The present codes were developed to satisfy this need. The minimum Hamming distances of the present codes have been shown, through consideration of ensemble-average weight enumerators, to be proportional to code block sizes. As in the cases of irregular ensembles, the properties of these codes are sensitive to the proportion of degree-2 variable nodes. A code having too few such nodes tends to have an iterative decoding threshold that is far from the capacity threshold. A code having too many such nodes tends not to exhibit a minimum distance that is proportional to block size. Results of computational simulations have shown that the decoding thresholds of codes of the present type are lower than those of regular LDPC codes. Included in the simulations were a few examples from a family of codes characterized by rates ranging from low to high and by thresholds that adhere closely to their respective channel capacity thresholds; the simulation results from these examples showed that the codes in question have low error floors as well as low decoding thresholds. As an example, the illustration shows the protograph (which represents the blueprint for overall construction) of one proposed code family for code rates greater than or equal to 1/2. Any size LDPC code can be obtained by copying the protograph structure N times, then permuting the edges. The illustration also provides Field Programmable Gate Array (FPGA) hardware performance simulations for this code family. In addition, the illustration provides minimum signal-to-noise ratios (Eb/No) in decibels (decoding thresholds) to achieve zero error rates as the code block size goes to infinity for various code rates. In comparison with the codes mentioned in the preceding article, these codes have slightly higher decoding thresholds.
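
    The "copy the protograph N times, then permute the edges" construction can be sketched as follows; the base matrix and lift size here are arbitrary illustrations, not the code family proposed in the article.

    ```python
    import numpy as np

    def lift_protograph(base, N, seed=0):
        """Copy-and-permute lifting of a binary protograph base matrix:
        every 1 becomes a random N x N permutation block, every 0 a zero block."""
        rng = np.random.default_rng(seed)
        rows, cols = base.shape
        H = np.zeros((rows * N, cols * N), dtype=np.uint8)
        for i in range(rows):
            for j in range(cols):
                if base[i, j]:
                    perm = rng.permutation(N)
                    H[i * N:(i + 1) * N, j * N:(j + 1) * N] = np.eye(N, dtype=np.uint8)[perm]
        return H

    # Toy protograph (3 checks x 6 variables) lifted by N = 4 gives a 12 x 24 parity-check matrix.
    B = np.array([[1, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 1],
                  [1, 0, 1, 1, 1, 1]], dtype=np.uint8)
    H = lift_protograph(B, N=4)
    print(H.shape)  # (12, 24)
    ```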

  6. Nexus: a modular workflow management system for quantum simulation codes

    DOE PAGES

    Krogel, Jaron T.

    2015-08-24

    The management of simulation workflows is a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.

  7. The Five 'R's' for Developing Trusted Software Frameworks to increase confidence in, and maximise reuse of, Open Source Software.

    NASA Astrophysics Data System (ADS)

    Fraser, Ryan; Gross, Lutz; Wyborn, Lesley; Evans, Ben; Klump, Jens

    2015-04-01

    Recent investments in HPC, cloud and petascale data stores have dramatically increased the scale and resolution at which earth science challenges can now be tackled. These new infrastructures are highly parallelised, and to fully utilise them and access the large volumes of earth science data now available, a new approach to software stack engineering needs to be developed. The size, complexity and cost of the new infrastructures mean any software deployed has to be reliable, trusted and reusable. Increasingly, software is available via open source repositories, but these usually only enable code to be discovered and downloaded. It is hard for a scientist, as a user, to judge the suitability and quality of individual codes: rarely is there information on how and where codes can be run, what the critical dependencies are, and in particular, on the version requirements and licensing of the underlying software stack. A trusted software framework is proposed to enable reliable software to be discovered, accessed and then deployed on multiple hardware environments. More specifically, this framework will enable those who generate the software, and those who fund the development of software, to gain credit for the effort, IP, time and dollars spent, and facilitate quantification of the impact of individual codes. For scientific users, the framework delivers reviewed and benchmarked scientific software with mechanisms to reproduce results. The trusted framework will have five separate, but connected components: Register, Review, Reference, Run, and Repeat. 1) The Register component will facilitate discovery of relevant software from multiple open source code repositories. The registration process should include information about licensing and the hardware environments the code can be run on, define appropriate validation (testing) procedures, and list the critical dependencies. 2) The Review component targets verification of the software, typically against a set of benchmark cases. This will be achieved by linking the code in the software framework to peer review forums such as Mozilla Science or appropriate journals (e.g. the Geoscientific Model Development journal) to assist users to know which codes to trust. 3) Referencing will be accomplished by linking the software framework to groups such as Figshare or ImpactStory that help disseminate and measure the impact of scientific research, including program code. 4) The Run component will draw on information supplied in the registration process, benchmark cases described in the review, and other relevant information to instantiate the scientific code on the selected environment. 5) The Repeat component will tap into existing provenance workflow engines that automatically capture information relating to a particular run of the software, including identification of all input and output artefacts, and all elements and transactions within that workflow. The proposed trusted software framework will enable users to rapidly discover and access reliable code, reduce the time to deploy it, and greatly facilitate sharing, reuse and reinstallation of code. Properly designed, it could scale out to massively parallel systems and be accessed nationally/internationally for multiple use cases, including supercomputer centres, cloud facilities, and local computers.

  8. NWChem: A comprehensive and scalable open-source solution for large scale molecular simulations

    NASA Astrophysics Data System (ADS)

    Valiev, M.; Bylaska, E. J.; Govind, N.; Kowalski, K.; Straatsma, T. P.; Van Dam, H. J. J.; Wang, D.; Nieplocha, J.; Apra, E.; Windus, T. L.; de Jong, W. A.

    2010-09-01

    The latest release of NWChem delivers an open-source computational chemistry package with extensive capabilities for large scale simulations of chemical and biological systems. Utilizing a common computational framework, diverse theoretical descriptions can be used to provide the best solution for a given scientific problem. Scalable parallel implementations and modular software design enable efficient utilization of current computational architectures. This paper provides an overview of NWChem focusing primarily on the core theoretical modules provided by the code and their parallel performance. Program summary: Program title: NWChem. Catalogue identifier: AEGI_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGI_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Open Source Educational Community License. No. of lines in distributed program, including test data, etc.: 11 709 543. No. of bytes in distributed program, including test data, etc.: 680 696 106. Distribution format: tar.gz. Programming language: Fortran 77, C. Computer: all Linux based workstations and parallel supercomputers, Windows and Apple machines. Operating system: Linux, OS X, Windows. Has the code been vectorised or parallelized?: Code is parallelized. Classification: 2.1, 2.2, 3, 7.3, 7.7, 16.1, 16.2, 16.3, 16.10, 16.13. Nature of problem: Large-scale atomistic simulations of chemical and biological systems require efficient and reliable methods for ground and excited solutions of many-electron Hamiltonian, analysis of the potential energy surface, and dynamics. Solution method: Ground and excited solutions of many-electron Hamiltonian are obtained utilizing density-functional theory, many-body perturbation approach, and coupled cluster expansion. These solutions or a combination thereof with classical descriptions are then used to analyze potential energy surface and perform dynamical simulations. Additional comments: Full documentation is provided in the distribution file. This includes an INSTALL file giving details of how to build the package. A set of test runs is provided in the examples directory. The distribution file for this program is over 90 Mbytes and therefore is not delivered directly when download or Email is requested. Instead a html file giving details of how the program can be obtained is sent. Running time: Running time depends on the size of the chemical system, complexity of the method, number of cpu's and the computational task. It ranges from several seconds for serial DFT energy calculations on a few atoms to several hours for parallel coupled cluster energy calculations on tens of atoms or ab-initio molecular dynamics simulation on hundreds of atoms.

  9. Semantic Information Processing of Physical Simulation Based on Scientific Concept Vocabulary Model

    NASA Astrophysics Data System (ADS)

    Kino, Chiaki; Suzuki, Yoshio; Takemiya, Hiroshi

    Scientific Concept Vocabulary (SCV) has been developed to actualize the Cognitive methodology based Data Analysis System (CDAS), which supports researchers in analyzing large scale data efficiently and comprehensively. SCV is an information model for processing semantic information for physics and engineering. In the SCV model, all semantic information is related to substantial data and algorithms. Consequently, SCV enables a data analysis system to recognize the meaning of execution results output from a numerical simulation. This method has allowed a data analysis system to extract important information from a scientific viewpoint. Previous research has shown that SCV is able to describe simple scientific indices and scientific perceptions. However, it is difficult to describe complex scientific perceptions with the currently proposed SCV. In this paper, a new data structure for SCV is proposed in order to describe scientific perceptions in more detail. Additionally, a prototype of the new model has been constructed and applied to actual numerical simulation data. The results show that the new SCV is able to describe more complex scientific perceptions.

  10. Global linear gyrokinetic simulations for LHD including collisions

    NASA Astrophysics Data System (ADS)

    Kauffmann, K.; Kleiber, R.; Hatzky, R.; Borchardt, M.

    2010-11-01

    The code EUTERPE uses a Particle-In-Cell (PIC) method to solve the gyrokinetic equation globally (full radius, full flux surface) for three-dimensional equilibria calculated with VMEC. Recently this code has been extended to include multiple kinetic species and electromagnetic effects. Additionally, a pitch-angle scattering operator has been implemented in order to include collisional effects in the simulation of instabilities and to be able to simulate neoclassical transport. As a first application of this extended code we study the effects of collisions on electrostatic ion-temperature-gradient (ITG) instabilities in LHD.
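
    For orientation, a Lorentz-type pitch-angle scattering operator of the kind referred to acts on the distribution function f through the pitch variable ξ = v∥/v; the generic textbook form is given below, and the operator actually implemented in EUTERPE may include additional terms:

    \[
      C(f) \;=\; \frac{\nu(v)}{2}\, \frac{\partial}{\partial \xi}\!\left[\left(1-\xi^{2}\right) \frac{\partial f}{\partial \xi}\right],
    \]

    with a velocity-dependent collision frequency \(\nu(v)\).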

  11. Simulations of the plasma dynamics in high-current ion diodes

    NASA Astrophysics Data System (ADS)

    Boine-Frankenheim, O.; Pointon, T. D.; Mehlhorn, T. A.

    Our time-implicit fluid/Particle-In-Cell (PIC) code DYNAID [1] is applied to problems relevant for applied-B ion diode operation. We present simulations of the laser ion source, which will soon be employed on the SABRE accelerator at SNL, and of the dynamics of the anode source plasma in the applied electric and magnetic fields. DYNAID is still a test-bed for a higher-dimensional simulation code. Nevertheless, the code can already give new theoretical insight into the dynamics of plasmas in pulsed power devices.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vassilevska, Tanya

    This is the first code, designed to run on a desktop, which models intracellular replication and cell-to-cell infection and demonstrates virus evolution at the molecular level. This code simulates the infection of a population of "idealized biological cells" (represented as objects that do not divide or have metabolism) with a "virus" (represented by its genetic sequence), and the replication and simultaneous mutation of the virus, which leads to evolution of a population of genetically diverse viruses. The code is built to simulate single-stranded RNA viruses. The input for the code is 1. the number of biological cells in the culture, 2. the initial composition of the virus population, 3. the reference genome of the RNA virus, 4. the coordinates of the genome regions and their significance and, 5. parameters determining the dynamics of virus replication, such as the mutation rate. The simulation ends when all cells have been infected or when no more infections occur after a given number of attempts. The code has the ability to simulate the evolution of the virus in serial passage of cell "cultures", i.e. after the end of a simulation, a new one is immediately scheduled with a new culture of infected cells. The code outputs characteristics of the resulting virus population dynamics and the genetic composition of the virus population, such as the top dominant genomes and the percentage of a genome with specific characteristics.
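
    A minimal sketch of the "replication with simultaneous mutation" step described above is given below; the genome length, alphabet, and mutation rate are placeholder assumptions and the sketch is not derived from the released code.

    ```python
    import random

    BASES = "ACGU"                 # single-stranded RNA alphabet
    MUTATION_RATE = 1e-4           # per-base substitution probability per replication (assumed)

    def replicate_with_mutation(genome, rate=MUTATION_RATE, rng=random):
        """Copy a genome, substituting each base independently with probability `rate`."""
        copy = []
        for base in genome:
            if rng.random() < rate:
                copy.append(rng.choice([b for b in BASES if b != base]))
            else:
                copy.append(base)
        return "".join(copy)

    # Arbitrary reference genome of 7,500 bases, purely for illustration.
    reference = "".join(random.choice(BASES) for _ in range(7500))
    progeny = [replicate_with_mutation(reference) for _ in range(10)]
    print(sum(a != b for a, b in zip(reference, progeny[0])))   # mutations in one copy
    ```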

  13. Using Large Signal Code TESLA for Wide Band Klystron Simulations

    DTIC Science & Technology

    2006-04-01

    Abstract text is garbled in this record; recoverable fragments indicate that the large-signal code TESLA is used to simulate wide-band, high-power klystrons with two-gap two-mode resonators, that the structure eigenmodes are computed accurately as part of the tuning procedure, and that results of TESLA simulations for an NRL klystron are discussed. (DTIC compilation part notice ADP022454.)

  14. Using Just-in-Time Information to Support Scientific Discovery Learning in a Computer-Based Simulation

    ERIC Educational Resources Information Center

    Hulshof, Casper D.; de Jong, Ton

    2006-01-01

    Students encounter many obstacles during scientific discovery learning with computer-based simulations. It is hypothesized that an effective type of support, one that does not interfere with the scientific discovery learning process, should be delivered on a "just-in-time" basis. This study explores the effect of facilitating access to…

  15. Aviation Safety Modeling and Simulation (ASMM) Propulsion Fleet Modeling: A Tool for Semi-Automatic Construction of CORBA-based Applications from Legacy Fortran Programs

    NASA Technical Reports Server (NTRS)

    Sang, Janche

    2003-01-01

    Within NASA's Aviation Safety Program, NASA GRC participates in the Modeling and Simulation Project called ASMM. NASA GRC's focus is to characterize the propulsion systems performance from a fleet management and maintenance perspective by modeling, and through simulation to predict the characteristics of two classes of commercial engines (CFM56 and GE90). In prior years, the High Performance Computing and Communication (HPCC) program funded NASA Glenn to develop a large scale, detailed simulation for the analysis and design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). Three major aspects of this modeling (the integration of different engine components, the coupling of multiple disciplines, and engine component zooming at the appropriate level of fidelity) require relatively tight coupling of different analysis codes. Most of these codes in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with distributed objects can increase these codes' reusability. Aviation Safety's modeling and simulation use in characterizing fleet management has similar needs. The modeling and simulation of these propulsion systems use existing Fortran and C codes that are instrumental in determining the performance of the fleet. The research centers on building a CORBA-based development environment for programmers to easily wrap and couple legacy Fortran codes. This environment consists of a C++ wrapper library to hide the details of CORBA and an efficient remote variable scheme to facilitate data exchange between the client and the server model. Additionally, a Web Service model should also be constructed for evaluation of this technology's use over the next two to three years.

  16. Adoption of Test Driven Development and Continuous Integration for the Development of the Trick Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Penn, John M.

    2013-01-01

    This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA/Johnson Space Center and many other NASA facilities. It describes what was learned and the significant benefits seen, such as fast, thorough, and clear test feedback every time code is checked in to the code repository. It also describes a system that encourages development of code that is much more flexible, maintainable, and reliable. The Trick Simulation Toolkit development environment provides a common architecture for user-defined simulations. Trick builds executable simulations using user-supplied simulation-definition files (S_define) and user-supplied "model code". For each Trick-based simulation, Trick automatically provides job scheduling, checkpoint/restore, data recording, interactive variable manipulation (variable server), and an input processor. Also included are tools for plotting recorded data and various other supporting tools and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX. Prior to adopting this new development approach, Trick testing consisted primarily of running a few large simulations, with the hope that their complexity and scale would exercise most of Trick's code and expose any recently introduced bugs. Unsurprisingly, this approach yielded inconsistent results. It was obvious that a more systematic, thorough approach was required. After seeing examples of some Java-based projects that used the JUnit test framework, similar test frameworks for C and C++ were sought. Several were found, all clearly inspired by JUnit. Googletest, a freely available open-source testing framework, was selected as the most appropriate and capable. The new approach was implemented while rewriting the Trick memory management component, to eliminate a fundamental design flaw. The benefits became obvious almost immediately, not just in the correctness of the individual functions and classes but also in the correctness and flexibility being added to the overall design. Creating code to be testable, and testing as it was created, resulted not only in better working code, but also in better-organized, flexible, and readable (i.e., articulate) code. This was, in essence, the test-driven development (TDD) methodology created by Kent Beck. Seeing the benefits of Test Driven Development, other Trick components were refactored to make them more testable, and tests were designed and implemented for them.
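
    To illustrate the workflow (a small, automatically run test written alongside each unit of code), here is a generic example using Python's unittest module rather than the Googletest framework actually adopted for Trick; the function under test is invented purely for the illustration.

    ```python
    import unittest

    def allocate_buffer(n_elements, element_size):
        """Toy stand-in for a memory-management routine: returns the byte count."""
        if n_elements < 0 or element_size <= 0:
            raise ValueError("invalid allocation request")
        return n_elements * element_size

    class AllocateBufferTest(unittest.TestCase):
        def test_normal_request(self):
            self.assertEqual(allocate_buffer(10, 8), 80)

        def test_zero_elements_is_legal(self):
            self.assertEqual(allocate_buffer(0, 8), 0)

        def test_invalid_request_raises(self):
            with self.assertRaises(ValueError):
                allocate_buffer(-1, 8)

    if __name__ == "__main__":
        unittest.main()   # a CI system would run this on every check-in
    ```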

  17. Moose: An Open-Source Framework to Enable Rapid Development of Collaborative, Multi-Scale, Multi-Physics Simulation Tools

    NASA Astrophysics Data System (ADS)

    Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.

    2014-12-01

    The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org) is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly-coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
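
    A compact way to see what "Jacobian-free" means in JFNK: the Jacobian-vector product is approximated by a finite difference of the residual, so the Krylov solver never needs an assembled Jacobian matrix. The sketch below shows that idea with SciPy on a tiny made-up nonlinear system; it is a generic illustration of the method, not MOOSE code.

    ```python
    import numpy as np
    from scipy.sparse.linalg import LinearOperator, gmres

    def residual(u):
        # Tiny made-up nonlinear system F(u) = 0 with solution u = (1, 2).
        return np.array([u[0]**2 + u[1] - 3.0,
                         u[0] + u[1]**2 - 5.0])

    def jfnk_solve(u, tol=1e-10, eps=1e-7, max_newton=20):
        for _ in range(max_newton):
            F = residual(u)
            if np.linalg.norm(F) < tol:
                break
            # Matrix-free Jacobian-vector product: J v ~= (F(u + eps*v) - F(u)) / eps,
            # so GMRES sees only the action of the Jacobian, never its entries.
            J = LinearOperator((2, 2),
                               matvec=lambda v, u=u, F=F: (residual(u + eps * v) - F) / eps)
            du, _ = gmres(J, -F, atol=1e-12)
            u = u + du
        return u

    print(jfnk_solve(np.array([1.0, 1.0])))   # converges to approximately [1. 2.]
    ```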

  18. Edge-relevant plasma simulations with the continuum code COGENT

    NASA Astrophysics Data System (ADS)

    Dorf, M.; Dorr, M.; Ghosh, D.; Hittinger, J.; Rognlien, T.; Cohen, R.; Lee, W.; Schwartz, P.

    2016-10-01

    We describe recent advances in cross-separatrix and other edge-relevant plasma simulations with COGENT, a continuum gyrokinetic code being developed by the Edge Simulation Laboratory (ESL) collaboration. The distinguishing feature of the COGENT code is its high-order finite-volume discretization, which employs arbitrary mapped multiblock grid technology (nearly field-aligned within blocks) to handle the complexity of tokamak divertor geometry with high accuracy. This paper discusses the 4D (axisymmetric) electrostatic version of the code, and the presented topics include: (a) initial simulations with kinetic electrons and the development of reduced fluid models; (b) development and application of implicit-explicit (IMEX) time integration schemes; and (c) conservative modeling of drift waves and the universal instability. Work performed for USDOE, at LLNL under contract DE-AC52-07NA27344 and at LBNL under contract DE-AC02-05CH11231.
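
    To illustrate the IMEX idea in its simplest form, the sketch below takes one first-order IMEX Euler step for du/dt = f_E(u) + a*u, treating the non-stiff term f_E explicitly and the stiff linear term a*u implicitly; the scalar model problem and coefficients are assumptions for the example, not COGENT's actual integrators.

        #include <cmath>
        #include <cstdio>

        // One first-order IMEX Euler step for du/dt = f_E(u) + a*u:
        //   u_{n+1} = u_n + dt*f_E(u_n) + dt*a*u_{n+1}
        //   =>  u_{n+1} = (u_n + dt*f_E(u_n)) / (1 - dt*a)
        double imex_euler_step(double u, double dt, double a,
                               double (*f_explicit)(double)) {
            return (u + dt * f_explicit(u)) / (1.0 - dt * a);
        }

        int main() {
            double u = 1.0;
            const double a  = -1000.0;  // stiff decay rate (illustrative)
            const double dt = 0.01;     // step size much larger than 1/|a|
            double (*f_E)(double) = [](double x) { return std::sin(x); };  // non-stiff part
            for (int n = 0; n < 10; ++n)
                u = imex_euler_step(u, dt, a, f_E);
            std::printf("u after 10 steps: %g\n", u);
            return 0;
        }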

  19. The VENUS/NWChem software package. Tight coupling between chemical dynamics simulations and electronic structure theory

    NASA Astrophysics Data System (ADS)

    Lourderaj, Upakarasamy; Sun, Rui; Kohale, Swapnil C.; Barnes, George L.; de Jong, Wibe A.; Windus, Theresa L.; Hase, William L.

    2014-03-01

    The interface between VENUS and NWChem, and the resulting software package for direct dynamics simulations, are described. The coupling of the two codes is considered tight because they are compiled and linked together into a single executable, with data passed between them through routine calls. The advantages of this type of coupling are discussed. The interface has been designed to interfere as little as possible with the core codes of both VENUS and NWChem. VENUS is the code that propagates the direct dynamics trajectories and therefore drives the overall execution of VENUS/NWChem. VENUS remains an essentially sequential code that takes advantage of the highly parallel structure of NWChem. The interface subroutines that accomplish the data transmission and communication between the two programs are described, and recent examples of the use of VENUS/NWChem for direct dynamics simulations are summarized.
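
    The structure of such a tight coupling can be sketched as a single executable in which the dynamics driver obtains energies and gradients through an ordinary routine call at each step. The C++ sketch below is purely schematic: the type and function names (Forces, compute_potential), the toy harmonic potential, and the simplified integrator are assumptions for illustration and do not reflect the actual VENUS/NWChem interface routines.

        #include <cstddef>
        #include <cstdio>
        #include <vector>

        // Data returned by the electronic-structure side on each call.
        struct Forces { double energy; std::vector<double> gradient; };

        // Stand-in for the electronic-structure routine (the "NWChem side"):
        // a toy harmonic potential so the sketch is self-contained.
        Forces compute_potential(const std::vector<double>& coords) {
            Forces f{0.0, std::vector<double>(coords.size(), 0.0)};
            for (std::size_t i = 0; i < coords.size(); ++i) {
                f.energy     += 0.5 * coords[i] * coords[i];
                f.gradient[i] = coords[i];
            }
            return f;
        }

        // Stand-in for the dynamics driver (the "VENUS side"): a trajectory loop
        // that calls compute_potential() directly once per step, passing data
        // through the routine call rather than through files.
        int main() {
            std::vector<double> x{1.0, 0.0}, v{0.0, 0.5};
            const double dt = 0.01, mass = 1.0;
            for (int step = 0; step < 100; ++step) {
                const Forces f = compute_potential(x);
                for (std::size_t i = 0; i < x.size(); ++i) {
                    v[i] -= dt * f.gradient[i] / mass;  // simple Euler update for brevity
                    x[i] += dt * v[i];
                }
            }
            std::printf("final position x[0] = %g\n", x[0]);
            return 0;
        }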

  20. The Monte Carlo photoionization and moving-mesh radiation hydrodynamics code CMACIONIZE

    NASA Astrophysics Data System (ADS)

    Vandenbroucke, B.; Wood, K.

    2018-04-01

    We present the public Monte Carlo photoionization and moving-mesh radiation hydrodynamics code CMACIONIZE, which can be used to simulate the self-consistent evolution of HII regions surrounding young O and B stars, or other sources of ionizing radiation. The code combines a Monte Carlo photoionization algorithm, which uses a complex mix of hydrogen, helium, and several coolants to self-consistently solve for the ionization and temperature balance at any given time, with a standard first-order hydrodynamics scheme. The code can be run as a post-processing tool to obtain the line emission from an existing simulation snapshot, or it can be used to run full radiation hydrodynamical simulations. Both the radiation transfer and the hydrodynamics are implemented in a general way that is independent of the grid structure used to discretize the system, allowing the code to be run both as a standard fixed-grid code and as a moving-mesh code.
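
    The core Monte Carlo step, sampling a random optical depth for each photon packet and depositing it in the cell where that depth is reached, can be sketched on a simple 1D grid as below; the grid size, opacities, and packet count are illustrative assumptions, and the sketch is not CMACIONIZE's actual implementation.

        #include <cmath>
        #include <cstdio>
        #include <random>
        #include <vector>

        // Minimal 1D sketch of Monte Carlo photon-packet propagation:
        // each packet draws a target optical depth tau = -ln(xi) and is
        // absorbed in the cell where the accumulated optical depth reaches it.
        int main() {
            const int ncells = 100;
            const double dx = 0.01;                    // cell width (arbitrary units)
            std::vector<double> opacity(ncells, 5.0);  // absorption coefficient per cell
            std::vector<long> absorbed(ncells, 0);     // absorption counts per cell

            std::mt19937 rng(42);
            std::uniform_real_distribution<double> uni(0.0, 1.0);

            const long npackets = 100000;
            for (long p = 0; p < npackets; ++p) {
                const double tau_target = -std::log(1.0 - uni(rng));  // sampled optical depth
                double tau = 0.0;
                for (int c = 0; c < ncells; ++c) {
                    tau += opacity[c] * dx;
                    if (tau >= tau_target) { ++absorbed[c]; break; }
                }
                // Packets whose target depth exceeds the total grid depth escape.
            }
            std::printf("packets absorbed in the first cell: %ld\n", absorbed[0]);
            return 0;
        }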
