Sample records for LLNL simulation codes

  1. Micromagnetic Code Development of Advanced Magnetic Structures Final Report CRADA No. TC-1561-98

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cerjan, Charles J.; Shi, Xizeng

    The specific goals of this project were to further develop the previously written micromagnetic code DADIMAG (DOE code release number 980017) and to validate the code. The resulting code was expected to be more realistic and useful for simulations of magnetic structures of specific interest to Read-Rite programs. We also planned to extend the code for use in internal LLNL programs. This project complemented LLNL CRADA TC-840-94 between LLNL and Read-Rite, which allowed for simulations of the advanced magnetic head development completed under that CRADA. TC-1561-98 was effective concurrently with LLNL non-exclusive copyright license TL-1552-98 to Read-Rite for DADIMAG Version 2 executable code.

  2. Electro-Thermal-Mechanical Simulation Capability Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, D

    This is the Final Report for LDRD 04-ERD-086, 'Electro-Thermal-Mechanical Simulation Capability'. The accomplishments are well documented in five peer-reviewed publications and six conference presentations and hence will not be detailed here. The purpose of this LDRD was to research and develop numerical algorithms for three-dimensional (3D) Electro-Thermal-Mechanical simulations. LLNL has long been a world leader in the area of computational mechanics, and recently several mechanics codes have become 'multiphysics' codes with the addition of fluid dynamics, heat transfer, and chemistry. However, these multiphysics codes do not incorporate the electromagnetics that is required for a coupled Electro-Thermal-Mechanical (ETM) simulation. There are numerous applications for an ETM simulation capability, such as explosively driven magnetic flux compressors, electromagnetic launchers, inductive heating and mixing of metals, and MEMS. A robust ETM simulation capability will enable LLNL physicists and engineers to better support current DOE programs, and will prepare LLNL for some very exciting long-term DoD opportunities. We define a coupled ETM simulation as one that solves, in a self-consistent manner, the equations of electromagnetics (primarily statics and diffusion), heat transfer (primarily conduction), and non-linear mechanics (elastic-plastic deformation, and contact with friction). There is no existing parallel 3D code for simulating ETM systems at LLNL or elsewhere. While there are numerous magnetohydrodynamic codes, they are designed for astrophysics, magnetic fusion energy, laser-plasma interaction, etc., and do not attempt to accurately model electromagnetically driven solid mechanics.
    This project responds to the Engineering R&D Focus Areas of Simulation and Energy Manipulation, and addresses the specific problem of Electro-Thermal-Mechanical simulation for the design and analysis of energy manipulation systems such as magnetic flux compression generators and railguns. This project complements ongoing DNT projects that have an experimental emphasis. Our research efforts have been encapsulated in the Diablo and ALE3D simulation codes. This new ETM capability already has both internal and external users, and has spawned additional research in plasma railgun technology. By developing this capability, Engineering has become a world leader in ETM design, analysis, and simulation. This research has positioned LLNL to compete for new business opportunities with the DoD in the area of railgun design. We currently have a three-year $1.5M project with the Office of Naval Research to apply our ETM simulation capability to railgun bore life issues, and we expect to be a key player in the railgun community.
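The self-consistent coupling described above can be illustrated with a toy operator-split loop, in which an electrical step deposits Joule heat and a thermal step diffuses it. This is a minimal sketch under invented material values and scalings, not the Diablo or ALE3D formulation:

```python
# Toy operator-split "electro-thermal" coupling on a 1D rod (illustrative):
# each step solves a quasi-static current for Joule heating, then an
# explicit heat-conduction update. All coefficients are invented.
n, dx, dt = 20, 0.01, 1e-3
T = [300.0] * n                 # temperature (K)
kappa = 1.0e-4                  # thermal diffusivity-like coefficient
I = 100.0                       # fixed current (A)

def resistivity(temp):
    # resistivity rises with temperature (illustrative linear model)
    return 1.7e-8 * (1.0 + 4.0e-3 * (temp - 300.0))

for _ in range(200):
    # electrical step: local Joule heating ~ I^2 * rho(T), scaled arbitrarily
    T = [t + dt * I * I * resistivity(t) * 1e4 for t in T]
    # thermal step: explicit diffusion on interior points (insulated ends)
    Tn = T[:]
    for i in range(1, n - 1):
        Tn[i] = T[i] + kappa * dt / dx**2 * (T[i+1] - 2*T[i] + T[i-1])
    T = Tn

print(max(T) > 300.0)  # the rod heats up
```

The point of the split is that each physics package advances with the other's latest state, which is the simplest (weakest) form of the self-consistent coupling the report describes; tightly coupled schemes iterate the two steps to convergence within each time step.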

  3. LLNL Mercury Project Trinity Open Science Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brantley, Patrick; Dawson, Shawn; McKinley, Scott

    2016-04-20

    The Mercury Monte Carlo particle transport code developed at Lawrence Livermore National Laboratory (LLNL) is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. As a result, a question arises as to the level of convergence of the calculations with Monte Carlo simulation particle count. In the Trinity Open Science calculations, one main focus was to investigate convergence of the relevant simulation quantities with Monte Carlo particle count in order to assess the current simulation methodology. Both for this application space and for more general applicability, we also investigated the impact of code algorithms on parallel scaling on the Trinity machine, as well as the utilization of the Trinity DataWarp burst buffer technology in Mercury via the LLNL Scalable Checkpoint/Restart (SCR) library.
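The convergence question above comes down to the statistical error of a Monte Carlo tally shrinking roughly as 1/sqrt(N) with particle count N. A minimal sketch (a toy path-length tally, not Mercury's transport physics):

```python
import random
import math

def mc_tally(n_particles, seed=0):
    """Toy Monte Carlo tally: estimate the mean of an exponential
    path-length distribution (true mean 1.0) from n_particles samples."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_particles):
        # sample a path length from an exponential distribution (mean 1.0)
        total += -math.log(1.0 - rng.random())
    return total / n_particles

# Quadrupling the particle count should roughly halve the statistical
# error, which is the 1/sqrt(N) convergence being assessed.
for n in (1000, 4000, 16000):
    print(n, round(mc_tally(n), 3))
```

A convergence study of the kind described repeats the full calculation at increasing N and checks that the quantities of interest stabilize at the expected statistical rate.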

  4. CDAC Student Report: Summary of LLNL Internship

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herriman, Jane E.

    Multiple objectives motivated me to apply for an internship at LLNL: I wanted to experience the work environment at a national lab, to learn about research and job opportunities at LLNL in particular, and to gain greater experience with code development, particularly within the realm of high performance computing (HPC). This summer I was selected to participate in LLNL's Computational Chemistry and Material Science Summer Institute (CCMS). CCMS is a 10-week program hosted by the Quantum Simulations group leader, Dr. Eric Schwegler. CCMS connects graduate students to mentors at LLNL involved in similar research and provides weekly seminars on a broad array of topics from within chemistry and materials science. Dr. Xavier Andrade and Dr. Erik Draeger served as my co-mentors over the summer, and Dr. Andrade continues to mentor me now that CCMS has concluded. Dr. Andrade is a member of the Quantum Simulations group within the Physical and Life Sciences at LLNL, and Dr. Draeger leads the HPC group within the Center for Applied Scientific Computing (CASC). The two have worked together to develop Qb@ll, an open-source first-principles molecular dynamics code that was the platform for my summer research project.

  5. Lawrence Livermore National Laboratories Perspective on Code Development and High Performance Computing Resources in Support of the National HED/ICF Effort

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clouse, C. J.; Edwards, M. J.; McCoy, M. G.

    2015-07-07

    Through its Advanced Simulation and Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world-leading numerical simulation capability for the national HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition, the ASC effort provides the high performance computing platform capabilities upon which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help ensure that numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.

  6. Comparison of the LLNL ALE3D and AKTS Thermal Safety Computer Codes for Calculating Times to Explosion in ODTX and STEX Thermal Cookoff Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wemhoff, A P; Burnham, A K

    2006-04-05

    Cross-comparison of the results of two computer codes for the same problem provides a mutual validation of their computational methods. This cross-validation exercise was performed for LLNL's ALE3D code and AKTS's Thermal Safety code, using the thermal ignition of HMX in two standard LLNL cookoff experiments: the One-Dimensional Time to Explosion (ODTX) test and the Scaled Thermal Explosion (STEX) test. The chemical kinetics model used in both codes was the extended Prout-Tompkins model, a relatively new addition to ALE3D. This model was applied using ALE3D's new pseudospecies feature. In addition, an advanced isoconversional kinetic approach was used in the AKTS code. The mathematical constants in the Prout-Tompkins model were calibrated using DSC data from hermetically sealed vessels and the LLNL optimization code Kinetics05. The isoconversional kinetic parameters were optimized using the AKTS Thermokinetics code. We found that the Prout-Tompkins model calculations agree fairly well between the two codes, and that the isoconversional kinetic model gives very similar results to the Prout-Tompkins model. We also found that an autocatalytic approach in the beta-delta phase transition model does affect the times to explosion for some conditions, especially STEX-like simulations at ramp rates above 100 °C/hr, and further exploration of that effect is warranted.
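As a hedged illustration of how a Prout-Tompkins-style kinetics model yields a time to explosion, the sketch below integrates one common extended Prout-Tompkins rate form, dα/dt = Z·exp(−E/RT)·(1−α)^n·(1+q·α), at constant temperature. Every parameter value here is invented for illustration; none are the HMX calibrations from Kinetics05:

```python
import math

def ignition_time(T_kelvin, Z=1e12, E=150e3, n=1.0, q=10.0, R=8.314,
                  alpha_crit=0.99, dt=0.01, alpha0=1e-6):
    """Integrate an extended Prout-Tompkins rate law
        d(alpha)/dt = Z*exp(-E/(R*T)) * (1-alpha)**n * (1 + q*alpha)
    at constant temperature T, returning the time for the reacted
    fraction alpha to reach alpha_crit (a proxy for time to explosion).
    Parameter values are illustrative, not calibrated to HMX."""
    k = Z * math.exp(-E / (R * T_kelvin))
    alpha, t = alpha0, 0.0
    while alpha < alpha_crit:
        # forward-Euler step of the autocatalytic rate law
        alpha += dt * k * (1.0 - alpha) ** n * (1.0 + q * alpha)
        t += dt
    return t

# Arrhenius behavior: a hotter bath gives a shorter time to explosion.
print(ignition_time(500.0) > ignition_time(520.0))
```

The (1+q·α) factor is the autocatalytic part: the rate accelerates as product accumulates, which is why cookoff codes see an induction period followed by runaway.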

  7. pF3D Simulations of SBS and SRS in NIF Hohlraum Experiments

    NASA Astrophysics Data System (ADS)

    Langer, Steven; Strozzi, David; Amendt, Peter; Chapman, Thomas; Hopkins, Laura; Kritcher, Andrea; Sepke, Scott

    2016-10-01

    We present simulations of stimulated Brillouin scattering (SBS) and stimulated Raman scattering (SRS) for NIF experiments using high foot pulses in cylindrical hohlraums and low foot pulses in rugby-shaped hohlraums. We use pF3D, a massively-parallel, paraxial-envelope laser plasma interaction code, with plasma profiles obtained from the radiation-hydrodynamics codes Lasnex and HYDRA. We compare the simulations to experimental data for SBS and SRS power and spectrum. We also show simulated SRS and SBS intensities at the target chamber wall and report the fractions of the backscattered light that pass through the lenses and that miss them. Work performed under the auspices of the U.S. Department of Energy by LLNL under Contract DE-AC52-07NA27344. Release number LLNL-ABS-697482.

  8. Modeling Multi-Bunch X-band Photoinjector Challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marsh, R A; Anderson, S G; Gibson, D J

    An X-band test station is being developed at LLNL to investigate accelerator optimization for future upgrades to mono-energetic gamma-ray technology at LLNL. The test station will consist of a 5.5-cell X-band rf photoinjector, a single accelerator section, and beam diagnostics. Of critical import to the functioning of the LLNL X-band system with multiple electron bunches is the performance of the photoinjector. In-depth modeling of the Mark 1 LLNL/SLAC X-band rf photoinjector performance will be presented, addressing important challenges that must be met in order to fabricate a multi-bunch Mark 2 photoinjector. Emittance performance is evaluated under different nominal electron bunch parameters using electrostatic codes such as PARMELA. Wake potential is analyzed using electromagnetic time-domain simulations with the ACE3P code T3P. Plans for multi-bunch experiments and implementation of photoinjector advances for the Mark 2 design will also be discussed.

  9. LLNL Mercury Project Trinity Open Science Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dawson, Shawn A.

    The Mercury Monte Carlo particle transport code is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. In the proposed Trinity Open Science calculations, I will investigate computer science aspects of the code which are relevant to convergence of the simulation quantities with increasing Monte Carlo particle counts.

  10. 2011 Computation Directorate Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, D L

    2012-04-11

    From its founding in 1952 until today, Lawrence Livermore National Laboratory (LLNL) has made significant strategic investments to develop high performance computing (HPC) and its application to national security and basic science. Now, 60 years later, the Computation Directorate and its myriad resources and capabilities have become a key enabler for LLNL programs and an integral part of the effort to support our nation's nuclear deterrent and, more broadly, national security. In addition, the technological innovation HPC makes possible is seen as vital to the nation's economic vitality. LLNL, along with other national laboratories, is working to make supercomputing capabilities and expertise available to industry to boost the nation's global competitiveness. LLNL is on the brink of an exciting milestone with the 2012 deployment of Sequoia, the National Nuclear Security Administration's (NNSA's) 20-petaFLOP/s resource that will apply uncertainty quantification to weapons science. Sequoia will bring LLNL's total computing power to more than 23 petaFLOP/s, all brought to bear on basic science and national security needs. The computing systems at LLNL provide game-changing capabilities. Sequoia and other next-generation platforms will enable predictive simulation in the coming decade and leverage industry trends, such as massively parallel and multicore processors, to run petascale applications. Efficient petascale computing necessitates refining accuracy in materials property data, improving models for known physical processes, identifying and then modeling missing physics, quantifying uncertainty, and enhancing the performance of complex models and algorithms in macroscale simulation codes. Nearly 15 years ago, NNSA's Accelerated Strategic Computing Initiative (ASCI), now called the Advanced Simulation and Computing (ASC) Program, was the critical element needed to shift from test-based confidence to science-based confidence.
    Specifically, ASCI/ASC accelerated the development of simulation capabilities necessary to ensure confidence in the nuclear stockpile, far exceeding what might have been achieved in the absence of a focused initiative. While stockpile stewardship research pushed LLNL scientists to develop new computer codes, better simulation methods, and improved visualization technologies, this work also stimulated the exploration of HPC applications beyond the standard sponsor base. As LLNL advances to a petascale platform and pursues exascale computing (1,000 times faster than Sequoia), ASC will be paramount to achieving predictive simulation and uncertainty quantification. Predictive simulation, and quantifying the uncertainty of numerical predictions where little or no data exist, demands exascale computing and represents an expanding area of scientific research important not only to nuclear weapons, but to nuclear attribution, nuclear reactor design, and understanding global climate issues, among other fields. Aside from these lofty goals and challenges, computing at LLNL is anything but 'business as usual.' International competition in supercomputing is nothing new, but the HPC community is now operating in an expanded, more aggressive climate of global competitiveness. More countries understand how science and technology research and development are inextricably linked to economic prosperity, and they are aggressively pursuing ways to integrate HPC technologies into their native industrial and consumer products. In the interest of the nation's economic security and the science and technology that underpins it, LLNL is expanding its portfolio and forging new collaborations. We must ensure that HPC remains an asymmetric engine of innovation for the Laboratory and for the U.S. and, in doing so, protect our research and development dynamism and the prosperity it makes possible.
    One untapped area of opportunity LLNL is pursuing is to help U.S. industry understand how supercomputing can benefit its business. Industrial investment in HPC applications has historically been limited by the prohibitive cost of entry, the inaccessibility of software to run the powerful systems, and the years it takes to grow the expertise to develop codes and run them in an optimal way. LLNL is helping industry better compete in the global marketplace by providing access to some of the world's most powerful computing systems, the tools to run them, and the experts who are adept at using them. Our scientists are collaborating side by side with industrial partners to develop solutions to some of industry's toughest problems. The goal of the Livermore Valley Open Campus High Performance Computing Innovation Center is to allow American industry the opportunity to harness the power of supercomputing by leveraging the scientific and computational expertise at LLNL in order to gain a competitive advantage in the global economy.

  11. Exploring Model Assumptions Through Three Dimensional Mixing Simulations Using a High-order Hydro Option in the Ares Code

    NASA Astrophysics Data System (ADS)

    White, Justin; Olson, Britton; Morgan, Brandon; McFarland, Jacob; Lawrence Livermore National Laboratory Team; University of Missouri-Columbia Team

    2015-11-01

    This work presents results from a large eddy simulation of a high Reynolds number Rayleigh-Taylor instability and Richtmyer-Meshkov instability. A tenth-order compact differencing scheme on a fixed Eulerian mesh is utilized within the Ares code developed at Lawrence Livermore National Laboratory (LLNL). We explore the self-similar limit of the mixing layer growth in order to evaluate the k-L-a Reynolds-Averaged Navier-Stokes (RANS) model (Morgan and Wickett, Phys. Rev. E, 2015). Furthermore, profiles of turbulent kinetic energy, turbulent length scale, mass flux velocity, and density-specific-volume correlation are extracted in order to aid the creation of a high-fidelity LES data set for RANS modeling. Prepared by LLNL under Contract DE-AC52-07NA27344.
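In the self-similar limit referenced above, the Rayleigh-Taylor bubble height is commonly fit as h(t) ≈ α_b·A·g·t², so the growth parameter α_b can be recovered from late-time h(t) data. A minimal sketch with invented values (this illustrates the self-similar fit, not the k-L-a model itself):

```python
# Self-similar Rayleigh-Taylor growth: h(t) = alpha_b * A * g * t**2,
# with A the Atwood number and g the acceleration. Values are invented.
A, g, alpha_b_true = 0.5, 9.8, 0.06

def h_selfsim(t):
    """Synthetic 'measured' mixing-layer (bubble) height at time t."""
    return alpha_b_true * A * g * t * t

# Recover alpha_b by a least-squares slope through the origin of
# h versus x = A*g*t^2: slope = sum(x*y) / sum(x*x).
times = [1.0, 1.5, 2.0, 2.5, 3.0]
xs = [A * g * t * t for t in times]
ys = [h_selfsim(t) for t in times]
alpha_fit = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
print(round(alpha_fit, 3))  # recovers 0.06
```

In practice the LES data are noisy and only asymptotically self-similar, so the fit is restricted to the late-time window where h grows quadratically.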

  12. Enhanced verification test suite for physics simulation codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.

    2008-09-01

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.
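A standard step in such verification analysis is computing the observed order of convergence from errors against the analytic solution on two successively refined grids. A minimal sketch (a generic Richardson-style estimate, not the tri-lab suite's own tooling):

```python
import math

def observed_order(err_coarse, err_fine, refinement=2.0):
    """Observed order of accuracy from errors on two grids whose spacing
    differs by the given refinement ratio: p = log(E_h/E_h/r) / log(r)."""
    return math.log(err_coarse / err_fine) / math.log(refinement)

# Example: errors of a nominally 2nd-order scheme on grids h and h/2.
# The observed order should approach the design order of 2.
print(round(observed_order(4.0e-4, 1.0e-4), 2))
```

A code "passes" verification on a test problem when the observed order matches the design order of the discretization as the mesh is refined.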

  13. Overview of the LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peck, T; Sparkman, D; Storch, N

    ''The LLNL Site-Specific Advanced Simulation and Computing (ASCI) Software Quality Engineering Recommended Practices V1.1'' document describes a set of recommended software quality engineering (SQE) practices for ASCI code projects at Lawrence Livermore National Laboratory (LLNL). In this context, SQE is defined as the process of building quality into software products by applying the appropriate guiding principles and management practices. Continual code improvement and ongoing process improvement are expected benefits. Certain practices are recommended, although projects may select the specific activities they wish to improve, and the appropriate time lines for such actions. Additionally, projects can rely on the guidance of this document when generating ASCI Verification and Validation (V&V) deliverables. ASCI program managers will gather information about their software engineering practices and improvement. This information can be shared to leverage the best SQE practices among development organizations. It will further be used to ensure the currency and vitality of the recommended practices. This Overview is intended to provide basic information to the LLNL ASCI software management and development staff from the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices V1.1'' document. Additionally, the Overview provides steps to using the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices V1.1'' document. For definitions of terminology and acronyms, refer to the Glossary and Acronyms sections in the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices V1.1''.

  14. Edge Simulation Laboratory Progress and Plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cohen, R

    The Edge Simulation Laboratory (ESL) is a project to develop a gyrokinetic code for MFE edge plasmas based on continuum (Eulerian) techniques. ESL is a base-program activity of OFES, with an allied algorithm research activity funded by the OASCR base math program. ESL OFES funds directly support about 0.8 FTE of career staff at LLNL, a postdoc and a small fraction of an FTE at GA, and a graduate student at UCSD. In addition, the allied OASCR program funds about 1/2 FTE each in the computations directorates at LBNL and LLNL. OFES ESL funding for LLNL and UCSD began in fall 2005, while funding for GA and the math team began about a year ago. ESL's continuum approach is a complement to the PIC-based methods of the CPES Project, and was selected (1) because of concerns about noise issues associated with PIC in the high-density-contrast environment of the edge pedestal, (2) to be able to exploit advanced numerical methods developed for fluid codes, and (3) to build upon the successes of core continuum gyrokinetic codes such as GYRO, GS2, and GENE. The ESL project presently has three components: TEMPEST, a full-f, full-geometry (single-null divertor, or arbitrary-shape closed flux surfaces) code in E, μ (energy, magnetic-moment) coordinates; EGK, a simple-geometry rapid-prototype code; and the math component, which is developing and implementing algorithms for a next-generation code. Progress would be accelerated if we could find funding for a fourth, computer science, component, which would develop software infrastructure, provide user support, and address needs for data handling and analysis. We summarize the status and plans for the three funded activities.

  15. Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, S R; Bihari, B L; Salari, K

    As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the following test problems: Su-Olson non-equilibrium radiation diffusion, the Sod shock tube, the Sedov point blast modeled with shock hydrodynamics, and the Noh implosion.

  16. Dynamic Fracture Simulations of Explosively Loaded Cylinders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arthur, Carly W.; Goto, D. M.

    2015-11-30

    This report documents the modeling results of high explosive experiments investigating dynamic fracture of steel (AerMet® 100 alloy) cylinders. The experiments were conducted at Lawrence Livermore National Laboratory (LLNL) during 2007 to 2008 [10]. A principal objective of this study was to gain an understanding of dynamic material failure through the analysis of hydrodynamic computer code simulations. Two-dimensional and three-dimensional computational cylinder models were analyzed using the ALE3D multi-physics computer code.

  17. Modeling of ion orbit loss and intrinsic toroidal rotation with the COGENT code

    NASA Astrophysics Data System (ADS)

    Dorf, M.; Dorr, M.; Cohen, R.; Rognlien, T.; Hittinger, J.

    2014-10-01

    We discuss recent advances in cross-separatrix neoclassical transport simulations with COGENT, a continuum gyro-kinetic code being developed by the Edge Simulation Laboratory (ESL) collaboration. The COGENT code models the axisymmetric transport properties of edge plasmas including the effects of nonlinear (Fokker-Planck) collisions and a self-consistent electrostatic potential. Our recent work has focused on studies of ion orbit loss and the associated toroidal rotation driven by this mechanism. The results of the COGENT simulations are discussed and analyzed for the parameters of the DIII-D experiment. Work performed for USDOE at LLNL under Contract DE-AC52-07NA27344.

  18. Characterization of Proxy Application Performance on Advanced Architectures. UMT2013, MCB, AMG2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Howell, Louis H.; Gunney, Brian T.; Bhatele, Abhinav

    2015-10-09

    Three codes were tested at LLNL as part of a Tri-Lab effort to make detailed assessments of several proxy applications on various advanced architectures, with the eventual goal of extending these assessments to codes of programmatic interest running more realistic simulations. Teams from Sandia and Los Alamos tested proxy apps of their own. The focus in this report is on the LLNL codes UMT2013, MCB, and AMG2013. We present weak and strong MPI scaling results and studies of OpenMP efficiency on a large BG/Q system at LLNL, with comparison against similar tests on an Intel Sandy Bridge TLCC2 system. The hardware counters on BG/Q provide detailed information on many aspects of on-node performance, while information from the mpiP tool gives insight into the reasons for the differing scaling behavior on these two architectures. Results from three more speculative tests are also included: one that exploits NVRAM as extended memory, one that studies performance under a power bound, and one that illustrates the effects of changing the torus network mapping on BG/Q.
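Weak and strong scaling results of the kind reported here are typically summarized as parallel efficiencies. A minimal sketch with illustrative timings (invented numbers, not measurements from this study):

```python
def strong_scaling_efficiency(t1, tp, p):
    """Strong scaling: fixed total problem size on p ranks.
    Ideal time is t1/p, so efficiency = t1 / (p * tp)."""
    return t1 / (p * tp)

def weak_scaling_efficiency(t1, tp):
    """Weak scaling: problem size grows with p (fixed work per rank).
    Ideal time is constant, so efficiency = t1 / tp."""
    return t1 / tp

# Illustrative timings in seconds: 100 s on 1 rank, 7 s on 16 ranks
# (strong), and 125 s at scale with fixed work per rank (weak).
print(round(strong_scaling_efficiency(100.0, 7.0, 16), 3))
print(round(weak_scaling_efficiency(100.0, 125.0), 2))
```

Comparing these efficiencies across the BG/Q and Sandy Bridge runs, together with mpiP's per-call communication profile, is what localizes the scaling differences the report describes.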

  19. Simulation of Deformation, Momentum and Energy Coupling Particles Deformed by Intense Shocks

    NASA Astrophysics Data System (ADS)

    Lieberthal, B.; Stewart, D. S.; Bdzil, J. B.; Najjar, F. M.; Balachandar, S.; Ling, Y.

    2011-11-01

    Modern energetic materials have embedded solids and inerts in an explosive matrix. A detonation in condensed-phase materials generates intense shocks that deform particles as the incident shock diffracts around them. The post-shock flow generates a wake behind the particle that is influenced by the shape changes of the particle. The gasdynamic flow in the explosive products and its interaction with the deformation of the particle must be treated simultaneously. Direct numerical simulations are carried out that vary the particle-to-surrounding density and impedance ratios to consider heavier and lighter particles. The vorticity deposited on the interface due to shock interaction with the particle, the resulting particle deformation, and the net momentum and energy transferred to the particle on the acoustic and longer viscous time scales are considered. The LLNL multi-physics hydrodynamic code ALE3D is used to carry out the simulations. BL, DSS, and JBB supported by AFRL/RW AF FA8651-10-1-0004 & DTRA, HDTRA1-10-1-0020 Off Campus. FMN's work supported by the U.S. DOE/LLNL, Contract DE-AC52-07NA27344. LLNL-ABS-491794.

  20. LATIS3D: The Gold Standard for Laser-Tissue-Interaction Modeling

    NASA Astrophysics Data System (ADS)

    London, R. A.; Makarewicz, A. M.; Kim, B. M.; Gentile, N. A.; Yang, T. Y. B.

    2000-03-01

    The goal of this LDRD project has been to create LATIS3D, the world's premier computer program for laser-tissue interaction modeling. The development was based on recent experience with the 2D LATIS code and the ASCI code KULL. With LATIS3D, important applications in laser medical therapy were researched, including dynamical calculations of tissue emulsification and ablation, photothermal therapy, and photon transport for photodynamic therapy. This project also enhanced LLNL's core competency in laser-matter interactions and high-energy-density physics by pushing simulation codes into new parameter regimes and by attracting external expertise. This will benefit both existing LLNL programs such as ICF and SBSS and emerging programs in medical technology and other laser applications. The purpose of this project was to develop and apply a computer program for laser-tissue interaction modeling to aid in the development of new instruments and procedures in laser medicine.

  1. Numerical Simulations of 3D Seismic Data Final Report CRADA No. TC02095.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedmann, S. J.; Kostov, C.

    This was a collaborative effort between Lawrence Livermore National Security, LLC (formerly The Regents of the University of California)/Lawrence Livermore National Laboratory (LLNL) and Schlumberger Cambridge Research (SCR) to develop synthetic seismic data sets and supporting codes.

  2. pF3D Simulations of Large Outer-Beam Brillouin Scattering from NIF Rugby Hohlraums

    NASA Astrophysics Data System (ADS)

    Langer, Steven; Strozzi, David; Chapman, Thomas; Amendt, Peter

    2015-11-01

    We assess the cause of large outer-beam stimulated Brillouin scattering (SBS) in a NIF shot with a rugby-shaped hohlraum, which has less wall surface loss and thus higher x-ray drive than a cylindrical hohlraum of the same radius. This shot differed from a prior rugby shot with low SBS in three ways: outer beam pointing, split-pointing of the four beams within each outer-beam quadruplet, and a small amount of neon added to the hohlraum helium fill gas. We use pF3D, a massively-parallel, paraxial-envelope laser plasma interaction code, with plasma profiles from the radiation-hydrodynamics code Lasnex. We determine which change between the two shots increased the SBS by adding them one at a time to the simulations. We compare the simulations to experimental data for total SBS power, its spatial distribution at the lens, and the SBS spectrum. For each shot, we use profiles from Lasnex simulations with and without a model for mix at the hohlraum wall-gas interface. Work performed under the auspices of the U.S. Department of Energy by LLNL under Contract DE-AC52-07NA27344. Release number LLNL-ABS-674893.

  3. A multigroup radiation diffusion test problem: Comparison of code results with analytic solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shestakov, A I; Harte, J A; Bolstad, J H

    2006-12-21

    We consider a 1D, slab-symmetric test problem for the multigroup radiation diffusion and matter energy balance equations. The test simulates diffusion of energy from a hot central region. Opacities vary with the cube of the frequency and radiation emission is given by a Wien spectrum. We compare results from two LLNL codes, Raptor and Lasnex, with tabular data that define the analytic solution.
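A multigroup treatment of the kind this test exercises partitions the emission spectrum into frequency groups. The sketch below integrates a Wien-like spectrum B(ν) ∝ ν³·exp(−ν/kT) over groups by simple quadrature; it illustrates group binning only, and is not the Raptor/Lasnex discretization or the analytic solution of this problem:

```python
import math

def wien_group_fractions(kT, group_edges, n_sub=200):
    """Fraction of total emission of a Wien-like spectrum
    B(nu) ~ nu**3 * exp(-nu/kT) in each frequency group, computed by
    midpoint quadrature over the span of group_edges (illustrative)."""
    def b(nu):
        return nu ** 3 * math.exp(-nu / kT)
    integrals = []
    for lo, hi in zip(group_edges[:-1], group_edges[1:]):
        d = (hi - lo) / n_sub
        integrals.append(sum(b(lo + (i + 0.5) * d) for i in range(n_sub)) * d)
    total = sum(integrals)
    return [g / total for g in integrals]

# Three groups spanning 0..10 kT; the middle group (2..5 kT) straddles
# the Wien peak at nu = 3 kT and so carries the largest fraction.
fracs = wien_group_fractions(1.0, [0.0, 2.0, 5.0, 10.0])
print([round(f, 3) for f in fracs])
```

In the actual test problem each group additionally carries its own frequency-dependent opacity, which is what makes the multigroup diffusion system a demanding target for code comparison.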

  4. MatProps: Material Properties Database and Associated Access Library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durrenberger, J K; Becker, R C; Goto, D M

    2007-08-13

    Coefficients for analytic constitutive and equation of state models (EOS), which are used by many hydro codes at LLNL, are currently stored in a legacy material database (Steinberg, UCRL-MA-106349). Parameters for numerous materials are available through this database, and include Steinberg-Guinan and Steinberg-Lund constitutive models for metals, JWL equations of state for high explosives, and Mie-Gruniesen equations of state for metals. These constitutive models are used in most of the simulations done by ASC codes today at Livermore. Analytic EOSs are also still used, but have been superseded in many cases by tabular representations in LEOS (http://leos.llnl.gov). Numerous advanced constitutivemore » models have been developed and implemented into ASC codes over the past 20 years. These newer models have more physics and better representations of material strength properties than their predecessors, and therefore more model coefficients. However, a material database of these coefficients is not readily available. Therefore incorporating these coefficients with those of the legacy models into a portable database that could be shared amongst codes would be most welcome. The goal of this paper is to describe the MatProp effort at LLNL to create such a database and associated access library that could be used by codes throughout the DOE complex and beyond. We have written an initial version of the MatProp database and access library and our DOE/ASC code ALE3D (Nichols et. al., UCRL-MA-152204) is able to import information from the database. The database, a link to which exists on the Sourceforge server at LLNL, contains coefficients for many materials and models (see Appendix), and includes material parameters in the following categories--flow stress, shear modulus, strength, damage, and equation of state. 
Future versions of the MatProp database and access library will include the ability to read and write material descriptions that can be exchanged between codes, as well as the ability to do unit changes, i.e., to have the library return parameters in user-specified unit systems. Additional material categories can also be added (e.g., phase-change kinetics). The MatProp database and access library are part of a larger set of tools used at LLNL for assessing material model behavior. One of these is MSlib, a shared constitutive material model library. Another is the Material Strength Database (MSD), which allows users to compare parameter fits for specific constitutive models against available experimental data. Together with MatProp, these tools create a suite of capabilities that provide state-of-the-art models, and parameters for those models, to integrated simulation codes. This document is organized into several appendices. Appendix A contains a code example that retrieves several material coefficients. Appendix B contains the API for the MatProp data access library. Appendix C lists the material names and model types currently available in the MatProp database. Appendix D lists the parameter names for the currently recognized model types. Appendix E contains a full XML description of the material Tantalum.
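
The access pattern described above (material name plus model type resolving to named coefficients) can be sketched with a toy lookup. This is a hypothetical illustration only: the function name `get_parameter`, the key scheme, and all coefficient values are assumptions, not the actual MatProp API or database contents.

```python
# Hypothetical sketch of a MatProp-style lookup: material name + model
# type -> named coefficients. Material names, model names, and values
# are illustrative placeholders, not real database entries.

MATPROP_DB = {
    ("Tantalum", "steinberg-guinan"): {
        "G0": 69.0,    # shear modulus, GPa (illustrative value)
        "Y0": 0.77,    # initial flow stress, GPa (illustrative value)
        "beta": 10.0,  # hardening coefficient (illustrative value)
    },
}

def get_parameter(material, model, name):
    """Return one coefficient, mimicking a keyed database query."""
    try:
        return MATPROP_DB[(material, model)][name]
    except KeyError:
        raise KeyError(f"no entry for {name!r} in {material}/{model}")

print(get_parameter("Tantalum", "steinberg-guinan", "G0"))  # -> 69.0
```

A real access library would layer unit conversion and XML import/export, as the abstract describes, on top of this kind of keyed retrieval.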

  5. Three- and Two-Dimensional Simulations of Re-shock Experiments at High Energy Densities at the National Ignition Facility

    NASA Astrophysics Data System (ADS)

    Wang, Ping; Raman, Kumar; MacLaren, Stephan; Huntington, Channing; Nagel, Sabrina

    2016-10-01

    We present simulations of recent high-energy-density (HED) re-shock experiments on the National Ignition Facility (NIF). The experiments study the Rayleigh-Taylor (RT) and Richtmyer-Meshkov (RM) instability growth that occurs after successive shocks transit a sinusoidally perturbed interface between materials of different densities. The shock tube is driven at one or both ends using indirect-drive laser cavities, or hohlraums. X-ray area-backlit imaging is used to visualize the growth at different times. Our simulations are done with the three-dimensional radiation hydrodynamics code ARES, developed at LLNL. We show that the instability growth rate inferred from the experimental radiographs agrees well with our 2D and 3D simulations. We also discuss some 3D geometrical effects, suggested by our simulations, which could deteriorate the images at late times unless properly accounted for in the experiment design. Work supported by U.S. Department of Energy under Contract DE-AC52-06NA27279. LLNL-ABS-680789.

  6. Running SW4 On New Commodity Technology Systems (CTS-1) Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodgers, Arthur J.; Petersson, N. Anders; Pitarka, Arben

    We have recently been running earthquake ground motion simulations with SW4 on the new capacity computing systems, called the Commodity Technology Systems-1 (CTS-1), at Lawrence Livermore National Laboratory (LLNL). SW4 is a fourth-order time-domain finite difference code developed by LLNL and distributed by the Computational Infrastructure for Geodynamics (CIG). SW4 simulates seismic wave propagation in complex three-dimensional Earth models, including anelasticity and surface topography. We are modeling near-fault earthquake strong ground motions for the purposes of evaluating the response of engineered structures, such as nuclear power plants and other critical infrastructure. Engineering analysis of structures requires the inclusion of the high frequencies that can cause damage, but these are often difficult to include in simulations because of the large memory needed to model fine grid spacing on large domains.
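
The memory pressure mentioned above can be made concrete with a back-of-envelope estimate: a finite-difference wave code must keep the grid spacing small enough to resolve the shortest wavelength, so the cell count, and hence memory, grows as the cube of the maximum frequency. All numbers below are illustrative assumptions, not SW4 defaults.

```python
# Why high frequencies are expensive in a finite-difference wave code:
# grid spacing h = v_min / (points_per_wavelength * f_max), so memory
# scales as f_max**3. All parameter values are illustrative.

def grid_memory_gb(domain_km, vmin_m_s, fmax_hz, points_per_wavelength=8,
                   arrays=20, bytes_per_value=8):
    h = vmin_m_s / (points_per_wavelength * fmax_hz)   # grid spacing, m
    n_per_km = 1000.0 / h                              # cells per km
    npts = (domain_km[0] * n_per_km) * (domain_km[1] * n_per_km) \
         * (domain_km[2] * n_per_km)
    return npts * arrays * bytes_per_value / 1e9

# Doubling the maximum frequency multiplies the memory by 8.
lo = grid_memory_gb((100, 100, 30), vmin_m_s=500.0, fmax_hz=1.0)
hi = grid_memory_gb((100, 100, 30), vmin_m_s=500.0, fmax_hz=2.0)
print(round(hi / lo))  # -> 8
```

This cubic scaling is why "including high frequencies" quickly pushes regional-scale simulations onto large-memory capacity systems.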

  7. Edge-relevant plasma simulations with the continuum code COGENT

    NASA Astrophysics Data System (ADS)

    Dorf, M.; Dorr, M.; Ghosh, D.; Hittinger, J.; Rognlien, T.; Cohen, R.; Lee, W.; Schwartz, P.

    2016-10-01

    We describe recent advances in cross-separatrix and other edge-relevant plasma simulations with COGENT, a continuum gyro-kinetic code being developed by the Edge Simulation Laboratory (ESL) collaboration. The distinguishing feature of the COGENT code is its high-order finite-volume discretization methods, which employ arbitrary mapped multiblock grid technology (nearly field-aligned on blocks) to handle the complexity of tokamak divertor geometry with high accuracy. This paper discusses the 4D (axisymmetric) electrostatic version of the code, and the presented topics include: (a) initial simulations with kinetic electrons and development of reduced fluid models; (b) development and application of implicit-explicit (IMEX) time integration schemes; and (c) conservative modeling of drift-waves and the universal instability. Work performed for USDOE, at LLNL under contract DE-AC52-07NA27344 and at LBNL under contract DE-AC02-05CH11231.

  8. Load Designs For MJ Dense Plasma Foci

    NASA Astrophysics Data System (ADS)

    Link, A.; Povlius, A.; Anaya, R.; Anderson, M. G.; Angus, J. R.; Cooper, C. M.; Falabella, S.; Goerz, D.; Higginson, D.; Holod, I.; McMahon, M.; Mitrani, J.; Koh, E. S.; Pearson, A.; Podpaly, Y. A.; Prasad, R.; van Lue, D.; Watson, J.; Schmidt, A. E.

    2017-10-01

    Dense plasma focus (DPF) Z-pinches are compact, pulsed-power-driven devices with coaxial electrodes. The discharge of a DPF consists of three distinct phases: first, generation of a plasma sheath; next, a plasma rail-gun phase, in which the sheath is accelerated down the electrodes; and finally an implosion phase, in which the plasma stagnates into a z-pinch geometry. During the z-pinch phase, DPFs can produce MeV ion beams, x-rays, and neutrons. Megaampere-class DPFs with deuterium fills have demonstrated neutron yields in the 10^12 neutrons/shot range with pulse durations of 10-100 ns. Kinetic simulations using the code Chicago are being used to evaluate various load configurations, from initial sheath formation to the final z-pinch phase, for DPFs with up to 5 MA and 1 MJ coupled to the load. Results will be presented from the preliminary design simulations. LLNL-ABS-734785. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory (LLNL) under Contract DE-AC52-07NA27344 and with support from the Computing Grand Challenge program at LLNL.

  9. Continuum Vlasov Simulation in Four Phase-space Dimensions

    NASA Astrophysics Data System (ADS)

    Cohen, B. I.; Banks, J. W.; Berger, R. L.; Hittinger, J. A.; Brunner, S.

    2010-11-01

    In the VALHALLA project, we are developing scalable algorithms for the continuum solution of the Vlasov-Maxwell equations in two spatial and two velocity dimensions. We use fourth-order temporal and spatial discretizations of the conservative form of the equations and a finite-volume representation to enable adaptive mesh refinement and nonlinear oscillation control [1]. The code has been implemented with and without adaptive mesh refinement, and with electromagnetic and electrostatic field solvers. A goal is to study the efficacy of continuum Vlasov simulations in four phase-space dimensions for laser-plasma interactions. We have verified the code on examples such as the two-stream instability, the weak beam-plasma instability, Landau damping, electron plasma waves with electron trapping and nonlinear frequency shifts [2] extended from 1D to 2D propagation, and light wave propagation. We will report progress on code development, computational methods, and physics applications. This work was performed under the auspices of the U.S. DOE by LLNL under contract no. DE-AC52-07NA27344. This work was funded by the Laboratory Directed Research and Development Program at LLNL under project tracking code 08-ERD-031. [1] J.W. Banks and J.A.F. Hittinger, to appear in IEEE Trans. Plasma Sci. (Sept. 2010). [2] G.J. Morales and T.M. O'Neil, Phys. Rev. Lett. 28, 417 (1972); R.L. Dewar, Phys. Fluids 15, 712 (1972).
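
The conservative finite-volume form referred to above updates each cell by the difference of the fluxes through its faces, so the total of the conserved quantity changes only through the domain boundary. A minimal first-order upwind sketch of periodic 1D advection (a toy, not the project's fourth-order scheme) shows the idea:

```python
# Conservative finite-volume update for 1D advection (a > 0), periodic:
#   u_i^{n+1} = u_i^n - (dt/dx) * (F_{i+1/2} - F_{i-1/2}),
# with upwind face flux F_{i-1/2} = a * u_{i-1}. Because each face flux
# is added to one cell and subtracted from its neighbor, the sum of u
# telescopes and is conserved to round-off.

def advect(u, a, dt, dx, steps):
    n = len(u)
    for _ in range(steps):
        flux = [a * u[(i - 1) % n] for i in range(n)]   # F at left face of cell i
        u = [u[i] - (dt / dx) * (flux[(i + 1) % n] - flux[i]) for i in range(n)]
    return u

u0 = [1.0 if 4 <= i < 8 else 0.0 for i in range(16)]
u1 = advect(u0, a=1.0, dt=0.5, dx=1.0, steps=10)       # CFL = 0.5
print(abs(sum(u1) - sum(u0)) < 1e-12)  # -> True
```

Higher-order variants like the one cited in [1] reconstruct more accurate face fluxes but keep exactly this telescoping, conservative structure.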

  10. Fast Model Generalized Pseudopotential Theory Interatomic Potential Routine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-03-18

    MGPT is an unclassified source code for the fast evaluation and application of quantum-based MGPT interatomic potentials for metals. The present version of MGPT has been developed entirely at LLNL, but is specifically designed for implementation in the open-source molecular-dynamics code LAMMPS maintained by Sandia National Laboratories. Using MGPT in LAMMPS, with separate input potential data, one can perform large-scale atomistic simulations of the structural, thermodynamic, defect, and mechanical properties of transition metals with quantum-mechanical realism.

  11. Crashworthiness analysis using advanced material models in DYNA3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Logan, R.W.; Burger, M.J.; McMichael, L.D.

    1993-10-22

    As part of an electric vehicle consortium, LLNL and Kaiser Aluminum are conducting experimental and numerical studies of crashworthy aluminum spaceframe designs. They have jointly explored the effect of heat treat on crush behavior and duplicated the experimental behavior with finite-element simulations. The major technical contributions to the state of the art in numerical simulation arise from the development and use of advanced material model descriptions for LLNL's DYNA3D code. Constitutive model enhancements in both flow and failure have been employed for conventional materials such as low-carbon steels, and also for lighter-weight materials such as aluminum and the fiber composites being considered for future vehicles. The constitutive model enhancements are developed as extensions of LLNL's work in anisotropic flow and multiaxial failure modeling. Analysis quality as a function of the level of simplification of material behavior and mesh is explored, as well as the penalty in computation cost that must be paid for using more complex models and meshes. The lightweight material modeling technology is being used at the vehicle component level to explore the safety implications of small neighborhood electric vehicles manufactured almost exclusively from these materials.

  12. Insensitive Munitions Modeling Improvement Efforts

    DTIC Science & Technology

    2010-10-01

    The hydrocodes most commonly used by munition designers are CTH and the SIERRA suite of codes, produced by Sandia National Laboratories (SNL), and ALE3D, produced by Lawrence Livermore National Laboratory (LLNL). ALE3D, an LLNL-developed code, is also used by various DoD participants; it was, however, designed differently than either CTH or SIERRA.

  13. PyORBIT: A Python Shell For ORBIT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jean-Francois Ostiguy; Jeffrey Holmes

    2003-07-01

    ORBIT is a code developed at SNS to simulate beam dynamics in accumulator rings and synchrotrons. The code is structured as a collection of external C++ modules for SuperCode, a high-level interpreter shell developed at LLNL in the early 1990s. SuperCode is no longer actively supported, and there has for some time been interest in replacing it with a modern scripting language while preserving the feel of the original ORBIT program. In this paper, we describe a new version of ORBIT where the role of SuperCode is assumed by Python, a free, well-documented, and widely supported object-oriented scripting language. We also compare PyORBIT to ORBIT from the standpoint of features, performance, and future expandability.

  14. Site 300 Spill Prevention, Control, and Countermeasures (SPCC) Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griffin, D.; Mertesdorf, E.

    This Spill Prevention, Control, and Countermeasure (SPCC) Plan describes the measures that are taken at Lawrence Livermore National Laboratory’s (LLNL) Experimental Test Site (Site 300) near Tracy, California, to prevent, control, and handle potential spills from aboveground containers that can contain 55 gallons or more of oil. This SPCC Plan complies with the Oil Pollution Prevention regulation in Title 40 of the Code of Federal Regulations, Part 112 (40 CFR 112) and with 40 CFR 761.65(b) and (c), which regulate the temporary storage of polychlorinated biphenyls (PCBs). This Plan has also been prepared in accordance with Division 20, Chapter 6.67 of the California Health and Safety Code (HSC 6.67) requirements for oil pollution prevention (referred to as the Aboveground Petroleum Storage Act [APSA]), and the United States Department of Energy (DOE) Order No. 436.1. This SPCC Plan establishes procedures, methods, equipment, and other requirements to prevent the discharge of oil into or upon the navigable waters of the United States or adjoining shorelines for aboveground oil storage and use at Site 300. This SPCC Plan has been prepared for the entire Site 300 facility and replaces the three previous plans prepared for Site 300: LLNL SPCC for Electrical Substations Near Buildings 846 and 865 (LLNL 2015), LLNL SPCC for Building 883 (LLNL 2015), and LLNL SPCC for Building 801 (LLNL 2014).

  15. Site 300 SPCC Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griffin, D.

    This Spill Prevention, Control, and Countermeasure (SPCC) Plan describes the measures that are taken at Lawrence Livermore National Laboratory’s (LLNL) Experimental Test Site (Site 300) near Tracy, California, to prevent, control, and handle potential spills from aboveground containers that can contain 55 gallons or more of oil. This SPCC Plan complies with the Oil Pollution Prevention regulation in Title 40 of the Code of Federal Regulations, Part 112 (40 CFR 112) and with 40 CFR 761.65(b) and (c), which regulate the temporary storage of polychlorinated biphenyls (PCBs). This Plan has also been prepared in accordance with Division 20, Chapter 6.67 of the California Health and Safety Code (HSC 6.67) requirements for oil pollution prevention (referred to as the Aboveground Petroleum Storage Act [APSA]), and the United States Department of Energy (DOE) Order No. 436.1. This SPCC Plan establishes procedures, methods, equipment, and other requirements to prevent the discharge of oil into or upon the navigable waters of the United States or adjoining shorelines for aboveground oil storage and use at Site 300. This SPCC Plan has been prepared for the entire Site 300 facility and replaces the three previous plans prepared for Site 300: LLNL SPCC for Electrical Substations Near Buildings 846 and 865 (LLNL 2015), LLNL SPCC for Building 883 (LLNL 2015), and LLNL SPCC for Building 801 (LLNL 2014).

  16. Particle-in-cell/accelerator code for space-charge dominated beam simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-05-08

    Warp is a multidimensional discrete-particle beam simulation program designed to be applicable where the beam space charge is non-negligible or dominant. It is being developed in a collaboration among LLNL, LBNL, and the University of Maryland. It was originally designed and optimized for heavy ion fusion accelerator physics studies, but has received use in a broader range of applications, including, for example, laser wakefield accelerators, e-cloud studies in high-energy accelerators, particle traps, and other areas. At present it incorporates 3-D, axisymmetric (r,z), planar (x-z), and transverse slice (x,y) descriptions, with both electrostatic and electromagnetic fields, and a beam envelope model. The code is built atop the Python interpreter language.
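
A core ingredient of a particle-in-cell code like Warp is depositing particle charge onto the field grid before the field solve. A minimal sketch of linear (cloud-in-cell) weighting on a periodic 1D grid, with illustrative parameters, shows the grid/particle coupling and its exact charge conservation; this is a generic PIC textbook scheme, not Warp's actual implementation.

```python
# Cloud-in-cell (linear) charge deposition on a periodic 1D grid: each
# particle's charge is split between its two nearest grid nodes in
# proportion to distance. Parameters are illustrative.

def deposit(positions, charge, nx, dx):
    rho = [0.0] * nx
    for x in positions:
        s = x / dx
        i = int(s) % nx           # left node index
        w = s - int(s)            # fractional distance to the right node
        rho[i] += charge * (1.0 - w) / dx
        rho[(i + 1) % nx] += charge * w / dx
    return rho

rho = deposit([0.3, 2.5, 7.9], charge=1.0, nx=8, dx=1.0)
# Linear weighting conserves total charge exactly (sum(rho) * dx = 3).
print(abs(sum(rho) * 1.0 - 3.0) < 1e-12)  # -> True
```

The same weights are reused when interpolating the solved fields back to the particle positions, which keeps the particle-grid coupling self-consistent.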

  17. Simulation of asteroid impact on ocean surfaces, subsequent wave generation and the effect on US shorelines

    DOE PAGES

    Ezzedine, Souheil M.; Lomov, Ilya; Miller, Paul L.; ...

    2015-05-19

    As part of a larger effort involving members of several other organizations, we have conducted numerical simulations in support of emergency-response exercises for postulated asteroid ocean impacts. We have addressed the problem from source (asteroid entry) to ocean impact (splash) to wave generation, propagation, and interaction with the U.S. shoreline. We simulated three impact sites. The first site is located off the East Coast, near Maryland's shoreline. The second site is located off the West Coast, in the San Francisco Bay area. The third set of sites is situated in the Gulf of Mexico. Asteroid impacts on the ocean surface are simulated using LLNL's hydrocode GEODYN to create the impact wave source for SWWP, a shallow-water, depth-averaged wave propagation code.

  18. Simulation of asteroid impact on ocean surfaces, subsequent wave generation and the effect on US shorelines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ezzedine, Souheil M.; Lomov, Ilya; Miller, Paul L.

    As part of a larger effort involving members of several other organizations, we have conducted numerical simulations in support of emergency-response exercises for postulated asteroid ocean impacts. We have addressed the problem from source (asteroid entry) to ocean impact (splash) to wave generation, propagation, and interaction with the U.S. shoreline. We simulated three impact sites. The first site is located off the East Coast, near Maryland's shoreline. The second site is located off the West Coast, in the San Francisco Bay area. The third set of sites is situated in the Gulf of Mexico. Asteroid impacts on the ocean surface are simulated using LLNL's hydrocode GEODYN to create the impact wave source for SWWP, a shallow-water, depth-averaged wave propagation code.

  19. Numerical Parameter Optimization of the Ignition and Growth Model for HMX Based Plastic Bonded Explosives

    NASA Astrophysics Data System (ADS)

    Gambino, James; Tarver, Craig; Springer, H. Keo; White, Bradley; Fried, Laurence

    2017-06-01

    We present a novel method for optimizing parameters of the Ignition and Growth (I&G) reactive flow model for high explosives. The I&G model can yield accurate predictions of experimental observations; however, calibrating the model is a time-consuming task, especially with multiple experiments. In this study, we couple the differential evolution global optimization algorithm to simulations of shock initiation experiments in the multi-physics code ALE3D. We develop parameter sets for the HMX-based explosives LX-07 and LX-10. The optimization finds the I&G model parameters that globally minimize the difference between calculated and experimental shock times of arrival at embedded pressure gauges. This work was performed under the auspices of the U.S. DOE by LLNL under contract DE-AC52-07NA27344. LLNS, LLC. LLNL-ABS-724898.
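
The calibration loop described above can be sketched with a toy problem: a bare-bones differential evolution driver recovering the parameters of a stand-in arrival-time model from synthetic gauge data. The two-parameter model, bounds, and population settings are illustrative assumptions; the real workflow evaluates the misfit by running ALE3D shock-initiation simulations, not a closed-form model.

```python
import random

# Toy differential evolution, in the spirit of the calibration described:
# minimize the squared misfit between "simulated" and "measured" shock
# arrival times at gauges. The (speed, delay) model is a stand-in.

random.seed(2)

def arrival(params, gauge_depths):
    speed, delay = params
    return [delay + d / speed for d in gauge_depths]

depths = [1.0, 2.0, 3.0, 4.0]
measured = arrival((2.0, 0.1), depths)            # synthetic "data"

def misfit(p):
    sim = arrival(p, depths)
    return sum((s - m) ** 2 for s, m in zip(sim, measured))

def diff_evolve(f, bounds, npop=20, gens=300, F=0.7, CR=0.9):
    """Simplified DE/rand/1/bin (no forced-crossover gene, for brevity)."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(npop)]
    for _ in range(gens):
        for i in range(npop):
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            trial = [a[k] + F * (b[k] - c[k]) if random.random() < CR
                     else pop[i][k] for k in range(dim)]
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            if f(trial) < f(pop[i]):              # greedy selection
                pop[i] = trial
    return min(pop, key=f)

best = diff_evolve(misfit, [(0.5, 5.0), (0.0, 1.0)])
print(best)
```

Because each misfit evaluation in the real problem costs a full hydrocode run, population size and generation count are the knobs that trade calibration quality against compute time.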

  20. Spherical harmonic results for the 3D Kobayashi Benchmark suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, P N; Chang, B; Hanebutte, U R

    1999-03-02

    Spherical harmonic solutions are presented for the Kobayashi benchmark suite. The results were obtained with Ardra, a scalable, parallel neutron transport code developed at Lawrence Livermore National Laboratory (LLNL). The calculations were performed on the IBM ASCI Blue-Pacific computer at LLNL.
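
The spherical harmonic (P_N) method expands the angular flux in Legendre polynomials. As a small illustration of the machinery such solvers rest on (a textbook identity, not Ardra code), the polynomials satisfy the three-term recurrence (l+1) P_{l+1}(x) = (2l+1) x P_l(x) - l P_{l-1}(x):

```python
# Evaluate the Legendre polynomial P_l(x) by the standard three-term
# recurrence used throughout P_N transport expansions.

def legendre(l, x):
    p0, p1 = 1.0, x
    if l == 0:
        return p0
    for k in range(1, l):
        p0, p1 = p1, ((2 * k + 1) * x * p1 - k * p0) / (k + 1)
    return p1

print(legendre(2, 0.5))  # -> -0.125, since P_2(x) = (3x^2 - 1)/2
```

A P_N solver truncates this expansion at order N and solves coupled moment equations for the expansion coefficients instead of discretizing angle directly.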

  1. Turbulent Simulations of Divertor Detachment Based on the BOUT++ Framework

    NASA Astrophysics Data System (ADS)

    Chen, Bin; Xu, Xueqiao; Xia, Tianyang; Ye, Minyou

    2015-11-01

    The China Fusion Engineering Testing Reactor is under conceptual design, acting as a bridge between ITER and DEMO. Detached divertor operation offers great promise for reducing the heat flux onto divertor target plates to achieve acceptable erosion. Therefore, a density scan is performed via an increase of D2 gas puffing rates in the range of 0.0-5.0 × 10^23 s^-1, using the B2-Eirene/SOLPS 5.0 code package, to study heat flux control and impurity screening properties. As the density increases, the divertor operation status changes gradually from the low-recycling regime to the high-recycling regime and finally to detachment. Significant radiation loss inside the confined plasma in the divertor region during detachment leads to strong parallel density and temperature gradients. Based on the SOLPS simulations, BOUT++ simulations will be presented to investigate the stability and turbulent transport under divertor plasma detachment, particularly the strong parallel-gradient-driven instabilities and enhanced plasma turbulence that spread heat flux over larger surface areas. The correlation between outer mid-plane and divertor turbulence, and the related transport, will be analyzed. Prepared by LLNL under Contract DE-AC52-07NA27344. LLNL-ABS-675075.

  2. Computational Electronics and Electromagnetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeFord, J.F.

    The Computational Electronics and Electromagnetics thrust area is a focal point for computer modeling activities in electronics and electromagnetics in the Electronics Engineering Department of Lawrence Livermore National Laboratory (LLNL). Traditionally, they have focused their efforts on technical areas of importance to existing and developing LLNL programs, and this continues to form the basis for much of their research. A relatively new and increasingly important emphasis for the thrust area is the formation of partnerships with industry and the application of their simulation technology and expertise to the solution of problems faced by industry. The activities of the thrust area fall into three broad categories: (1) the development of theoretical and computational models of electronic and electromagnetic phenomena, (2) the development of useful and robust software tools based on these models, and (3) the application of these tools to programmatic and industrial problems. In FY-92, they worked on projects in all of the areas outlined above. The object of their work on numerical electromagnetic algorithms continues to be the improvement of time-domain algorithms for electromagnetic simulation on unstructured conforming grids. The thrust area is also investigating various technologies for conforming-grid mesh generation to simplify the application of their advanced field solvers to design problems involving complicated geometries. They are developing a major code suite based on the three-dimensional (3-D), conforming-grid, time-domain code DSI3D. They continue to maintain and distribute the 3-D finite-difference time-domain (FDTD) code TSAR, which is installed at several dozen university, government, and industry sites.

  3. Workshop on Models for Plasma Spectroscopy

    NASA Astrophysics Data System (ADS)

    1993-09-01

    A meeting was held at St John's College, Oxford, from Monday 27th to Thursday 30th September 1993 to bring together a group of physicists working on computational modelling of plasma spectroscopy. The group came from the UK, France, Israel, and the USA. The meeting was organized by myself, Dr. Steven Rose of RAL, and Dr. R.W. Lee of LLNL. It was funded by the U.S. European Office of Aerospace Research and Development and by LLNL. The meeting grew out of a wish by a group of core participants to make available to practicing plasma physicists (particularly those engaged in the design and analysis of experiments) sophisticated numerical models of plasma physics. Additional plasma physicists attended the meeting in Oxford by invitation. These were experimentalists and users of plasma physics simulation codes, whose input to the meeting was to advise the core group as to what was really needed.

  4. Dislocation dynamics: simulation of plastic flow of bcc metals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lassila, D H

    This is the final report for the LDRD strategic initiative entitled ''Dislocation Dynamics: Simulation of Plastic Flow of bcc Metals'' (tracking code: 00-SI-011). This report comprises six individual sections. The first is an executive summary of the project; it describes the overall project goal, which is to establish an experimentally validated 3D dislocation dynamics simulation. This first section also gives some information on LLNL's multi-scale modeling efforts associated with the plasticity of bcc metals, and the role of this LDRD project in the multiscale modeling program. The last five sections of this report are journal articles that were produced during the course of the FY-2000 efforts.

  5. Los Alamos and Lawrence Livermore National Laboratories Code-to-Code Comparison of Inter Lab Test Problem 1 for Asteroid Impact Hazard Mitigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weaver, Robert P.; Miller, Paul; Howley, Kirsten

    The NNSA Laboratories have entered into an interagency collaboration with the National Aeronautics and Space Administration (NASA) to explore strategies for prevention of Earth impacts by asteroids. Assessment of such strategies relies upon the use of sophisticated multi-physics simulation codes. This document describes the task of verifying and cross-validating, between Lawrence Livermore National Laboratory (LLNL) and Los Alamos National Laboratory (LANL), the modeling capabilities and methods to be employed as part of the NNSA-NASA collaboration. The approach has been to develop a set of test problems and then to compare and contrast results obtained by use of a suite of codes, including MCNP, RAGE, Mercury, Ares, and Spheral. This document provides a short description of the codes, an overview of the idealized test problems, and discussion of the results for deflection by kinetic impactors and stand-off nuclear explosions.

  6. Kinetic turbulence simulations at extreme scale on leadership-class systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Bei; Ethier, Stephane; Tang, William

    2013-01-01

    Reliable predictive simulation capability addressing confinement properties in magnetically confined fusion plasmas is critically important for ITER, a 20-billion-dollar international burning plasma device under construction in France. The complex study of kinetic turbulence, which can severely limit the energy confinement and impact the economic viability of fusion systems, requires simulations at extreme scale for such an unprecedented device size. Our newly optimized, global, ab initio particle-in-cell code, which solves the nonlinear equations underlying gyrokinetic theory, achieves excellent performance with respect to time to solution at the full capacity of the IBM Blue Gene/Q, on the 786,432 cores of Mira at ALCF and recently on the 1,572,864 cores of Sequoia at LLNL. Recent multithreading and domain decomposition optimizations in the new GTC-P code represent critically important software advances for modern, low-memory-per-core systems, enabling routine simulations at unprecedented size (130 million grid points at ITER scale) and resolution (65 billion particles).

  7. Studies of Particle Wake Potentials in Plasmas

    NASA Astrophysics Data System (ADS)

    Ellis, Ian; Graziani, Frank; Glosli, James; Strozzi, David; Surh, Michael; Richards, David; Decyk, Viktor; Mori, Warren

    2011-10-01

    Fast Ignition studies require a detailed understanding of electron scattering, stopping, and energy deposition in plasmas with variable numbers of particles within a Debye sphere. Presently, there is disagreement in the literature concerning the proper description of these processes. Developing and validating proper descriptions requires studying the processes using first-principles electrostatic simulations, possibly including magnetic fields. We are using the particle-particle particle-mesh (PPPM) code ddcMD and the particle-in-cell (PIC) code BEPS to perform these simulations. As a starting point in our study, we examine the wake of a particle passing through a plasma in 3D electrostatic simulations performed with ddcMD and with BEPS using various cell sizes. In this poster, we compare the wakes we observe in these simulations with each other and with predictions from Vlasov theory. Prepared by LLNL under Contract DE-AC52-07NA27344 and by UCLA under Grant DE-FG52-09NA29552.

  8. Pulsed Magnetic Field System for Magnetized Target Experiments at the National Ignition Facility

    NASA Astrophysics Data System (ADS)

    Rhodes, M. A.; Solberg, J. M.; Logan, B. G.; Perkins, L. J.

    2014-10-01

    High-magnitude magnetic fields applied to inertially confined targets may improve fusion yield and enable basic science applications. We discuss the development of a pulsed magnetic field system for NIF with the goal of applying 10-70 T to various NIF targets. While the driver may be little more than a spark-gap-switched capacitor, numerous complex challenges exist in fielding such a system on NIF. The coil surrounding the metallic hohlraum drives induced current in the hohlraum wall. Both the coil and the hohlraum wall must survive ohmic heating and J × B forces for several microseconds. Pulsed power must couple to the coil in the NIF environment. The system must not cause late-time optics damage due to debris. There is very limited volume for the driver in a NIF Diagnostic Instrument Manipulator (DIM). We are modeling the coil and hohlraum MHD effects with the LLNL code ALE3D. However, the simulations lack complete and accurate data for all the required thermo-physical material properties over the expected range of temperatures (below vaporization) and pressures. Therefore, substantial experimental development is planned in the coming year. We present coil and hohlraum simulation results, the overall system design, and progress towards an operational prototype test stand. LLNL is operated by LLNS, LLC, for the U.S. D.O.E., NNSA, under Contract DE-AC52-07NA27344. This work was supported by LLNL LDRD 14-ER-028.
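
A capacitor driver of the kind mentioned above can be sized with textbook circuit formulas: an underdamped series RLC gives a ringing current I(t) = (V0/ωL) e^{-αt} sin ωt, and a short N-turn coil produces an on-axis field of roughly μ0 N I / length. Every parameter value below is an assumption chosen only to land in the tens-of-teslas range discussed, not the actual NIF system design.

```python
import math

# Illustrative estimate: capacitor discharged through a coil
# (underdamped series RLC), then peak on-axis solenoid field
# B ~ mu0 * N * I / length. All component values are assumptions.

mu0 = 4e-7 * math.pi
C, L, R = 50e-6, 0.5e-6, 10e-3      # farads, henries, ohms (assumed)
V0 = 5e3                            # charge voltage, volts (assumed)
N, length = 10, 0.01                # coil turns and length, m (assumed)

alpha = R / (2 * L)                             # damping rate, 1/s
omega = math.sqrt(1 / (L * C) - alpha ** 2)     # ringing frequency, rad/s

def current(t):
    return (V0 / (omega * L)) * math.exp(-alpha * t) * math.sin(omega * t)

# Peak current: d(current)/dt = 0 gives tan(omega * t) = omega / alpha.
t_peak = math.atan(omega / alpha) / omega
i_peak = current(t_peak)
b_peak = mu0 * N * i_peak / length
print(f"peak current ~ {i_peak/1e3:.0f} kA, peak field ~ {b_peak:.0f} T")
```

Such zeroth-order estimates bound the coil current and timescale; the abstract's point is that the hard part (ohmic heating, J × B survival, hohlraum coupling) requires MHD simulation and experiments, not circuit theory.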

  9. Simulation of Powder Layer Deposition in Additive Manufacturing Processes Using the Discrete Element Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herbold, E. B.; Walton, O.; Homel, M. A.

    2015-10-26

    This document serves as a final report on a small effort in which several improvements were added to the LLNL code GEODYN-L to develop Discrete Element Method (DEM) algorithms coupled to Lagrangian Finite Element (FE) solvers to investigate powder-bed formation problems for additive manufacturing. The results from these simulations will be assessed for inclusion as the initial conditions for Direct Metal Laser Sintering (DMLS) simulations performed with ALE3D. The algorithms were written and run on parallel computing platforms at LLNL. The total funding level was 3-4 weeks of an FTE, split between two staff scientists and one post-doc. The DEM simulations emulated, as much as was feasible, the physical process of depositing a new layer of powder over a bed of existing powder. The DEM simulations utilized truncated size distributions spanning realistic size ranges, with a size distribution profile consistent with a realistic sample set. A minimum simulation sample size on the order of 40 particles square by 10 particles deep was utilized in these scoping studies in order to evaluate the potential effects of size-segregation variation with distance displaced in front of a screed blade. A reasonable method for evaluating the problem was developed and validated. Several simulations were performed to show the viability of the approach. Future investigations will focus on running various simulations investigating powder particle sizing and screed geometries.
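
Seeding a DEM powder bed with a "truncated size distribution spanning realistic size ranges", as described above, can be sketched by rejection-sampling a lognormal diameter distribution inside a size window. The median, spread, and cutoffs below are illustrative assumptions, not the study's measured powder data.

```python
import math
import random

# Draw particle diameters from a lognormal distribution, rejecting any
# sample outside the [dmin, dmax] truncation window. Parameters are
# illustrative stand-ins for a metal-powder size distribution.

random.seed(0)

def truncated_lognormal(n, median_um=30.0, sigma=0.4, dmin=10.0, dmax=60.0):
    mu = math.log(median_um)          # lognormal location parameter
    out = []
    while len(out) < n:
        d = random.lognormvariate(mu, sigma)
        if dmin <= d <= dmax:         # reject sizes outside the window
            out.append(d)
    return out

sizes = truncated_lognormal(1000)
print(min(sizes) >= 10.0 and max(sizes) <= 60.0)  # -> True
```

Rejection sampling keeps the in-window shape of the parent distribution, which matters for the size-segregation effects the scoping studies were probing.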

  10. Level-2 Milestone 6007: Sierra Early Delivery System Deployed to Secret Restricted Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertsch, A. D.

    This report documents the delivery and installation of Shark, a CORAL Sierra early-delivery system deployed on the LLNL SRD network. Early ASC program users have run codes on the machine in support of application porting for the final Sierra system, which will be deployed at LLNL in CY2018. In addition to the SRD resource Shark, the unclassified resources Rzmanta and Ray have been deployed on the LLNL Restricted Zone and Collaboration Zone networks in support of application readiness for the Sierra platform.

  11. GEOS. User Tutorials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Pengchen; Settgast, Randolph R.; Johnson, Scott M.

    2014-12-17

    GEOS is a massively parallel, multi-physics simulation application utilizing high performance computing (HPC) to address subsurface reservoir stimulation activities with the goal of optimizing current operations and evaluating innovative stimulation methods. GEOS enables coupling of different solvers associated with the various physical processes occurring during reservoir stimulation in unique and sophisticated ways, adapted to various geologic settings, materials and stimulation methods. Developed at the Lawrence Livermore National Laboratory (LLNL) as a part of a Laboratory-Directed Research and Development (LDRD) Strategic Initiative (SI) project, GEOS represents the culmination of a multi-year ongoing code development and improvement effort that has leveraged existing code capabilities and staff expertise to design new computational geosciences software.

  12. Simulating Small-Scale Experiments of In-Tunnel Airblast Using STUN and ALE3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neuscamman, Stephanie; Glenn, Lewis; Schebler, Gregory

    2011-09-12

    This report details continuing validation efforts for the Sphere and Tunnel (STUN) and ALE3D codes. STUN has been validated previously for blast propagation through tunnels using several sets of experimental data with varying charge sizes and tunnel configurations, including the MARVEL nuclear-driven shock tube experiment (Glenn, 2001). The DHS-funded STUNTool version is compared to experimental data and the LLNL ALE3D hydrocode. In this study, we compare the performance of the STUN and ALE3D codes in modeling an in-tunnel airblast against experimental results obtained in a series of small-scale high-explosive experiments (Lunderman and Ohrt, 1997).

  13. ISCR FY2005 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D E; McGraw, J R

    2006-02-02

    Large-scale scientific computation and all of the disciplines that support and help validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of simulation as a fundamental tool of scientific and engineering research is underscored in the President's Information Technology Advisory Committee (PITAC) June 2005 finding that ''computational science has become critical to scientific leadership, economic competitiveness, and national security''. LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed, most notably the molecular dynamics simulation that sustained more than 100 Teraflop/s and won the 2005 Gordon Bell Prize. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use in an efficient manner. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to the core missions of LLNL than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence.
In FY 2005, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for both brief and extended visits with the aim of encouraging long-term academic research agendas that address LLNL research priorities. Through these collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's ''eyes and ears'' in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the ''hands and feet'' that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing Applications and Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other four institutes of the URP, the ISCR navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort. The pages of this annual report summarize the activities of the faculty members, postdoctoral researchers, students, and guests from industry and other laboratories who participated in LLNL's computational mission under the auspices of the ISCR during FY 2005.

  14. ISCR Annual Report: Fiscal Year 2004

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGraw, J R

    2005-03-03

    Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that, ''high performance computing is the backbone of the nation's science and technology enterprise''. LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series.
The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and long-term visits with the aim of encouraging long-term academic research agendas that address LLNL's research priorities. Through such collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's ''eyes and ears'' in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the ''feet and hands'' that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing and Applied Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other five institutes of the URP, it navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.

  15. Waves generated by Asteroid impacts and their effects on US shorelines

    NASA Astrophysics Data System (ADS)

    Ezzedine, S. M.; Miller, P. L.; Dearborn, D. S.; Dennison, D. S.; Glascoe, L. G.; Antoun, T.

    2013-12-01

    On February 15, 2013, an undetected ~17-20-m diameter asteroid entered earth's atmosphere and, due to its large entry speed of 18.6 km/s and its shallow entry angle, exploded in an airburst over Chelyabinsk, Russia, generating a bright flash, producing many small fragment meteorites, and causing a powerful shock wave that released the equivalent of ~440 kt TNT of energy. About 16 hours after the Chelyabinsk event, the elongated ~20 m by ~40 m (~30 m mean diameter) NEA 2012 DA14, with an estimated mass of 40 kt, passed within ~28,100 km of the earth's surface, about 2.2 earth diameters. These two consecutive events, which were unrelated and had drastically different orbits, generated considerable attention and awareness from the public, caused confusion among local residents, and raised the issue of emergency response and preparedness of local, state and federal agencies. LLNL and other government agencies have performed numerical simulations of a postulated asteroid impact onto the ocean and generated data to support an emergency preparedness exercise. We illustrate the exercise through the application of several codes from source (asteroid entry) to ocean impact (splash rim) to wave generation, propagation and interaction with the shoreline. Using state-of-the-art high performance computing codes, we simulate three impact sites: one off the east coast near Maryland's shoreline, and two on the west coast, near the San Francisco Bay and Los Angeles Bay shorelines. Simulations were conducted not only under deterministic conditions but also under conditions of uncertainty. Uncertainty assessment of flood hazard zones and structural integrity of infrastructure will be presented. This work performed under the auspices of the U.S.
Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, and partially funded by the Laboratory Directed Research and Development Program at LLNL under tracking code 12-ERD-005.

  16. LLNL NESHAPs 2015 Annual Report - June 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, K. R.; Gallegos, G. M.; MacQueen, D. H.

    2016-06-01

    Lawrence Livermore National Security, LLC operates facilities at Lawrence Livermore National Laboratory (LLNL) in which radionuclides are handled and stored. These facilities are subject to the U.S. Environmental Protection Agency (EPA) National Emission Standards for Hazardous Air Pollutants (NESHAPs) in Code of Federal Regulations (CFR) Title 40, Part 61, Subpart H, which regulates radionuclide emissions to air from Department of Energy (DOE) facilities. Specifically, NESHAPs limits the emission of radionuclides to the ambient air to levels resulting in an annual effective dose equivalent of 10 mrem (100 μSv) to any member of the public. Using measured and calculated emissions, and building-specific and common parameters, LLNL personnel applied the EPA-approved computer code, CAP88-PC, Version 4.0.1.17, to calculate the dose to the maximally exposed individual member of the public for the Livermore Site and Site 300.

  17. Integrated Predictive Tools for Customizing Microstructure and Material Properties of Additively Manufactured Aerospace Components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radhakrishnan, Balasubramaniam; Fattebert, Jean-Luc; Gorti, Sarma B.

    Additive Manufacturing (AM) refers to a process by which digital three-dimensional (3-D) design data is converted to build up a component by depositing material layer-by-layer. United Technologies Corporation (UTC) is currently involved in fabrication and certification of several AM aerospace structural components made from aerospace materials. This is accomplished by using optimized process parameters determined through numerous design-of-experiments (DOE)-based studies. Certification of these components is broadly recognized as a significant challenge, with long lead times, very expensive new product development cycles and very high energy consumption. Because of these challenges, United Technologies Research Center (UTRC), together with UTC business units, have been developing and validating an advanced physics-based process model. The specific goal is to develop a physics-based framework of an AM process and reliably predict fatigue properties of built-up structures as based on detailed solidification microstructures. Microstructures are predicted using process control parameters including energy source power, scan velocity, deposition pattern, and powder properties. The multi-scale multi-physics model requires solution and coupling of governing physics that will allow prediction of the thermal field and enable solution at the microstructural scale. The state-of-the-art approach to solve these problems requires a huge computational framework and this kind of resource is only available within academia and national laboratories. The project utilized the parallel phase-field codes at Oak Ridge National Laboratory (ORNL) and Lawrence Livermore National Laboratory (LLNL), along with the high-performance computing (HPC) capabilities existing at the two labs, to demonstrate the simulation of multiple dendrite growth in three dimensions (3-D).
The LLNL code AMPE was used to implement the UTRC phase field model that was previously developed for a model binary alloy, and the simulation results were compared against the UTRC simulation results, followed by extension of the UTRC model to simulate multiple dendrite growth in 3-D. The ORNL MEUMAPPS code was used to simulate dendritic growth in a model ternary alloy with the same equilibrium solidification range as the Ni-base alloy 718, using realistic model parameters, including thermodynamic integration with a Calphad-based model for the ternary alloy. Implementation of the UTRC model in AMPE encountered several numerical and parametric issues; once these were resolved, good agreement between the simulation results obtained by the two codes was demonstrated for two-dimensional (2-D) dendrites. 3-D dendrite growth was then demonstrated with the AMPE code using nondimensional parameters obtained in 2-D simulations. Multiple dendrite growth in 2-D and 3-D was demonstrated using ORNL's MEUMAPPS code with simple thermal boundary conditions. MEUMAPPS was then modified to incorporate the complex, time-dependent thermal boundary conditions obtained by UTRC's thermal modeling of single-track AM experiments to drive the phase field simulations. The results were in good agreement with UTRC's experimental measurements.

  18. Simulations of a Molecular Cloud experiment using CRASH

    NASA Astrophysics Data System (ADS)

    Trantham, Matthew; Keiter, Paul; Vandervort, Robert; Drake, R. Paul; Shvarts, Dov

    2017-10-01

    Recent laboratory experiments explore molecular cloud radiation hydrodynamics. The experiment irradiates a gold foil with a laser; the resulting x-rays drive the implosion or explosion of a foam ball. The CRASH code, an Eulerian code with block-adaptive mesh refinement, multigroup diffusive radiation transport, and electron heat conduction, developed at the University of Michigan to design and analyze high-energy-density experiments, is used to perform a parameter search to identify the optically thick, optically thin, and transition regimes suitable for these experiments. Specific design issues addressed by the simulations are the x-ray drive temperature, foam density, and distance from the x-ray source to the ball, as well as complicating issues such as the positioning of the stalk holding the foam ball. We present the results of this study and show ways the simulations helped improve the quality of the experiment. This work is funded by the LLNL under subcontract B614207 and NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, Grant Number DE-NA0002956.

  19. Computational Meso-Scale Study of Representative Unit Cubes for Inert Spheres Subject to Intense Shocks

    NASA Astrophysics Data System (ADS)

    Stewart, Cameron; Najjar, Fady; Stewart, D. Scott; Bdzil, John

    2012-11-01

    Modern engineered high explosive (HE) materials can consist of a matrix of solid, inert particles embedded in an HE charge. When this charge is detonated, intense shock waves are generated. As these shocks interact with the inert particles, large deformations occur in the particles while the incident shock diffracts around the particle interface. We present results from a series of 3-D direct numerical simulations (DNS) of an intense shock interacting with unit-cube configurations of inert particles embedded in nitromethane. The LLNL multi-physics massively parallel hydrodynamics code ALE3D is used to carry out high-resolution (4 million nodes) simulations. Three representative unit-cube configurations are considered: primitive cubic, face-centered cubic, and body-centered cubic, for two particle material types of varying impedance ratios. Previous work has only looked at in-line particle configurations. We investigate the time evolution of the unit-cell configurations, the vorticity generated by the shock interaction, and the velocity and acceleration of the particles until they reach the quasi-steady regime. LLNL-ABS-567694. CSS was supported by a summer internship through the HEDP program at LLNL. FMN's work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  20. Computational Study of 3-D Hot-Spot Initiation in Shocked Insensitive High-Explosive

    NASA Astrophysics Data System (ADS)

    Najjar, F. M.; Howard, W. M.; Fried, L. E.

    2011-06-01

    High explosive shock sensitivity is controlled by a combination of mechanical response, thermal properties, and chemical properties. A detailed understanding of the interplay of these physical phenomena in realistic condensed energetic materials is currently lacking. A multiscale computational framework is developed to investigate hot-spot (void) ignition in a single crystal of an insensitive HE, TATB. Atomistic MD simulations are performed to provide the key chemical reactions, and these reaction rates are used in 3-D multiphysics simulations. The multiphysics code, ALE3D, is linked to the chemistry software, Cheetah, and a three-way coupled approach is pursued including hydrodynamic, thermal and chemical analyses. A single spherical air bubble is embedded in the insensitive HE and its collapse due to shock initiation is evolved numerically in time, while the ignition processes due to chemical reactions are studied. Our current predictions showcase several interesting features regarding hot-spot dynamics, including the formation of a ``secondary'' jet. Results obtained with hydro-thermo-chemical processes leading to ignition growth will be discussed for various pore sizes and different shock pressures. LLNL-ABS-471438. This work performed under the auspices of the U.S. Department of Energy by LLNL under Contract DE-AC52-07NA27344.

  1. ALPHA SMP SYSTEM(S) Final Report CRADA No. TC-1404-97

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seager, M.; Beaudet, T.

    Within the scope of this subcontract, Digital Equipment Corporation (DIGITAL) and the University, through the Lawrence Livermore National Laboratory (LLNL), engaged in joint research and development activities of mutual interest and benefit. The primary objectives of these activities were, for LLNL, to improve its capability to perform its mission, and, for DIGITAL, to develop technical capability complementary to this mission. The collaborative activities had direct manpower investments by DIGITAL and LLNL. The project was divided into four areas of concern, which were handled concurrently: Gang Scheduling, Numerical Methods, Applications Development and Code Development Tools.

  2. Advances and Challenges In Uncertainty Quantification with Application to Climate Prediction, ICF design and Science Stockpile Stewardship

    NASA Astrophysics Data System (ADS)

    Klein, R.; Woodward, C. S.; Johannesson, G.; Domyancic, D.; Covey, C. C.; Lucas, D. D.

    2012-12-01

    Uncertainty Quantification (UQ) is a critical field within 21st century simulation science that resides at the very center of the web of emerging predictive capabilities. The science of UQ holds the promise of giving much greater meaning to the results of complex large-scale simulations, allowing for quantifying and bounding uncertainties. This powerful capability will yield new insights into scientific predictions (e.g. Climate) of great impact on both national and international arenas, allow informed decisions on the design of critical experiments (e.g. ICF capsule design, MFE, NE) in many scientific fields, and assign confidence bounds to scientifically predictable outcomes (e.g. nuclear weapons design). In this talk I will discuss a major new strategic initiative (SI) we have developed at Lawrence Livermore National Laboratory to advance the science of Uncertainty Quantification at LLNL focusing in particular on (a) the research and development of new algorithms and methodologies of UQ as applied to multi-physics multi-scale codes, (b) incorporation of these advancements into a global UQ Pipeline (i.e. a computational superstructure) that will simplify user access to sophisticated tools for UQ studies as well as act as a self-guided, self-adapting UQ engine for UQ studies on extreme computing platforms and (c) use laboratory applications as a test bed for new algorithms and methodologies. The initial SI focus has been on applications for the quantification of uncertainty associated with Climate prediction, but the validated UQ methodologies we have developed are now being fed back into Science Based Stockpile Stewardship (SSS) and ICF UQ efforts. 
To make advances on several of these UQ grand challenges, I will focus in this talk on the following three research areas of our Strategic Initiative: error estimation in multi-physics and multi-scale codes; tackling the "Curse of High Dimensionality"; and development of an advanced UQ Computational Pipeline to enable complete UQ workflow and analysis for ensemble runs at the extreme scale (e.g. exascale), with self-guiding adaptation in the UQ Pipeline engine. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).
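One building block of any such UQ pipeline is a space-filling ensemble design. A minimal Latin hypercube sampler is sketched below; the parameter count and bounds are hypothetical placeholders, not the initiative's actual pipeline code:

```python
import numpy as np

def latin_hypercube(n_samples, bounds, seed=0):
    """Latin hypercube design: each parameter range is split into
    n_samples equal strata and each stratum is sampled exactly once."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)      # shape (n_params, 2)
    n_params = bounds.shape[0]
    # one stratified uniform sample per stratum, per parameter
    u = (rng.random((n_samples, n_params))
         + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_params):                     # decouple the dimensions
        rng.shuffle(u[:, j])
    return bounds[:, 0] + u * (bounds[:, 1] - bounds[:, 0])

# an 8-member ensemble over three hypothetical model parameters
design = latin_hypercube(8, [[0.1, 1.0], [1e3, 1e5], [-2.0, 2.0]])
```

Each row is one candidate parameter set for an ensemble run; stratification keeps coverage uniform even with few samples, which matters as dimensionality grows.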

  3. Good Models Gone Bad: Quantifying and Predicting Parameter-Induced Climate Model Simulation Failures

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Brandon, S.; Covey, C. C.; Domyancic, D.; Ivanova, D. P.

    2012-12-01

    Simulations using IPCC-class climate models can fail or crash for a variety of reasons. Statistical analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation failures of the Parallel Ocean Program (POP2). About 8.5% of our POP2 runs failed for numerical reasons at certain combinations of parameter values. We apply support vector machine (SVM) classification from the fields of pattern recognition and machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. The SVM classifiers readily predict POP2 failures in an independent validation ensemble, and are subsequently used to determine the causes of the failures via a global sensitivity analysis. Four parameters related to ocean mixing and viscosity are identified as the major sources of POP2 failures. Our method can be used to improve the robustness of complex scientific models to parameter perturbations and to better steer UQ ensembles. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).
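The classification step described above can be sketched with an off-the-shelf SVM; the synthetic ensemble, the failure rule, and the kernel settings below are illustrative assumptions standing in for the actual POP2 runs:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Stand-in ensemble: 500 runs over 18 normalized parameters, where a run
# "fails" when two viscosity-like parameters are simultaneously large.
X = rng.random((500, 18))
failed = (X[:, 0] + X[:, 1] > 1.6).astype(int)

# RBF-kernel SVM classifier; probability=True enables failure-probability
# estimates for new candidate parameter sets before they are run.
clf = SVC(kernel="rbf", probability=True).fit(X, failed)
p_fail = clf.predict_proba(rng.random((1, 18)))[0, 1]
```

A classifier like this can screen proposed parameter combinations and steer the ensemble away from regions likely to crash, as the abstract describes.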

  4. FY06 L2C2 HE program report Zaug et al.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zaug, J M; Crowhurst, J C; Howard, W M

    2008-08-01

    The purpose of this project is to advance the improvement of LLNL thermochemical computational models that form the underlying basis of, or input for, laboratory hydrodynamic simulations. Our general approach utilizes, by design, tight experimental-theoretical research interactions that allow us to improve LLNL computational results on a scientific rather than purely empirical basis. The ultimate goal is to confidently predict, through computer models, the performance and safety parameters of currently maintained, modified, and newly designed stockpile systems. To attain our goal we make relevant experimental measurements on candidate detonation products constrained under static high-pressure and temperature conditions. The reduced information from these measurements is then used to construct analytical forms that describe the potential surface (repulsive energy as a function of interatomic separation distance) of single and mixed fluid or detonation-product species. These potential-surface shapes are also constructed using input from well-trusted shock wave physics and assorted thermodynamic data available in the open literature. Our potential surfaces permit one to determine the equations of state (P,V,T), the equilibrium chemistry, phase, and chemical interactions of detonation products under a very wide range of extreme pressure-temperature conditions. Using our foundation of experimentally refined potential surfaces, we are in a position to calculate, with confidence, the energetic output and chemical speciation occurring in a specific combustion and/or detonation reaction. The thermochemical model we developed and use for calculating the equilibrium chemistry, kinetics, and energy from ultrafast processes is named 'Cheetah'. Computational results from our Cheetah code are coupled to laboratory ALE3D hydrodynamic simulation codes, where the complete response behavior of an existing or proposed system is ultimately predicted.
The Cheetah thermochemical code is also used by well over 500 U.S. government DoD and DOE community users, who calculate the chemical properties of detonated high explosives, propellants, and pyrotechnics. To satisfy the growing needs of LLNL and the general user community we continue to improve the robustness of our Cheetah code. The P-T range of current speed-of-sound experiments will soon be extended by a factor of four, and our recently developed technological advancements permit us, for the first time, to study any chemical species or fluid mixture. New experiments will focus on determining the miscibility or coexistence curves of detonation-product mixtures. Our newly constructed ultrafast laser diagnostics will permit us to determine what chemical species exist under conditions approaching Chapman-Jouguet (CJ) detonation states. Furthermore, we will measure the time evolution of candidate species and use our chemical kinetics data to develop new rate laws and validate existing ones for future versions of our Cheetah thermochemical code.

  5. 2009.3 Revision of the Evaluated Nuclear Data Library (ENDL2009.3)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, I. J.; Beck, B.; Descalle, M. A.

    LLNL's Computational Nuclear Data and Theory Group has created a 2009.3 revised release of the Evaluated Nuclear Data Library (ENDL2009.3). This library is designed to support LLNL's current and future nuclear data needs and will be employed in nuclear reactor, nuclear security and stockpile stewardship simulations with ASC codes. The ENDL2009 database was the most complete nuclear database for Monte Carlo and deterministic transport of neutrons and charged particles. It was assembled with strong support from the ASC PEM and Attribution programs, leveraged with support from Campaign 4 and the DOE Office of Science's US Nuclear Data Program. This document lists the revisions and fixes made in the new release, ENDL2009.3, by comparing with the existing data in the previous release, ENDL2009.2. These changes are made in conjunction with the revisions for ENDL2011.3, so that both .3 releases are as free as possible of known defects.

  6. 2011.2 Revision of the Evaluated Nuclear Data Library (ENDL2011.2)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, B.; Descalles, M. A.; Mattoon, C.

    LLNL's Computational Nuclear Physics Group and Nuclear Theory and Modeling Group have collaborated to create the 2011.2 revised release of the Evaluated Nuclear Data Library (ENDL2011.2). ENDL2011.2 is designed to support LLNL's current and future nuclear data needs and will be employed in nuclear reactor, nuclear security and stockpile stewardship simulations with ASC codes. This database is currently the most complete nuclear database for Monte Carlo and deterministic transport of neutrons and charged particles. This library was assembled with strong support from the ASC PEM and Attribution programs, leveraged with support from Campaign 4 and the DOE Office of Science's US Nuclear Data Program. This document lists the revisions made in ENDL2011.2 compared with the data existing in the original ENDL2011.0 release and the ENDL2011.1-rc4 release candidate of April 2015. These changes are made in parallel with some similar revisions for ENDL2009.2.

  7. Multiple Independent File Parallel I/O with HDF5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, M. C.

    2016-07-13

    The HDF5 library has supported the I/O requirements of HPC codes at Lawrence Livermore National Laboratory (LLNL) since the late 1990s. In particular, HDF5 used in the Multiple Independent File (MIF) parallel I/O paradigm has supported LLNL codes' scalable I/O requirements and has recently been used successfully at scales as large as O(10^6) parallel tasks.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman, A.; Barnard, J.J.; Briggs, R.J.

    The Heavy Ion Fusion Science Virtual National Laboratory (HIFS-VNL), a collaboration of LBNL, LLNL, and PPPL, has achieved 60-fold pulse compression of ion beams on the Neutralized Drift Compression eXperiment (NDCX) at LBNL. In NDCX, a ramped voltage pulse from an induction cell imparts a velocity "tilt" to the beam; the beam's tail then catches up with its head in a plasma environment that provides neutralization. The HIFS-VNL's mission is to carry out studies of Warm Dense Matter (WDM) physics using ion beams as the energy source; an emerging thrust is basic target physics for heavy ion-driven Inertial Fusion Energy (IFE). These goals require an improved platform, labeled NDCX-II. Development of NDCX-II at modest cost was recently enabled by the availability of induction cells and associated hardware from the decommissioned Advanced Test Accelerator (ATA) facility at LLNL. Our initial physics design concept accelerates a ~30 nC pulse of Li+ ions to ~3 MeV, then compresses it to ~1 ns while focusing it onto a mm-scale spot. It uses the ATA cells themselves (with waveforms shaped by passive circuits) to impart the final velocity tilt; smart pulsers provide small corrections. The ATA accelerated electrons; acceleration of non-relativistic ions involves more complex beam dynamics both transversely and longitudinally. We are using analysis, an interactive one-dimensional kinetic simulation model, and multidimensional Warp-code simulations to develop the NDCX-II accelerator section. Both the LSP and Warp codes are being applied to the beam dynamics in the neutralized drift and final focus regions, and to the plasma injection process. The status of this effort is described.
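
    The velocity-tilt compression described above can be illustrated with a toy kinematic sketch (illustrative numbers only, not NDCX or NDCX-II parameters; neutralization is assumed perfect, so beam slices drift ballistically):

```python
# Toy sketch of longitudinal pulse compression via a velocity "tilt":
# the tail of the beam is launched faster than the head, so the pulse
# shortens as it drifts. All numbers are hypothetical, chosen only to
# show the kinematics; space charge and focusing are ignored.

def pulse_length(positions):
    return max(positions) - min(positions)

def drift(positions, velocities, t):
    return [z + v * t for z, v in zip(positions, velocities)]

# 11 beam slices spanning a 1.0 m pulse; linear tilt, fastest at the tail.
z0 = [0.1 * i for i in range(11)]                 # m, head at z = 1.0
v = [1.0e6 + 2.0e5 * (1.0 - z) for z in z0]       # m/s, tail moves fastest

t_focus = 1.0 / 2.0e5          # s, time at which the tilt closes the gap
zc = drift(z0, v, 0.9 * t_focus)  # stop just short of the focus

compression = pulse_length(z0) / pulse_length(zc)
print(f"compression factor ~ {compression:.0f}")
```

    Stopping at 90% of the focus time leaves 10% of the original length, i.e. a tenfold compression; drifting all the way to `t_focus` would collapse the pulse entirely in this idealized picture.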

  9. Simulations of the Richtmyer-Meshkov Instability in a two-shock vertical shock tube

    NASA Astrophysics Data System (ADS)

    Ferguson, Kevin; Olson, Britton; Jacobs, Jeffrey

    2017-11-01

    Simulations of the Richtmyer-Meshkov Instability (RMI) in a new two-shock vertical shock tube configuration are presented. The simulations are performed using the ARES code at Lawrence Livermore National Laboratory (LLNL). Two M=1.2 shock waves travel in opposing directions and impact an initially stationary interface formed by sulfur hexafluoride (SF6) and air. The delay between the two shocks is controlled to achieve a prescribed temporal separation in shock wave arrival time. Initial interface perturbations and diffusion profiles are generated in keeping with previously gathered experimental data. The effect of varying the inter-shock delay and initial perturbation structure on instability growth and mixing parameters is examined. Information on the design, construction, and testing of a new two-shock vertical shock tube is also presented.

  10. Particle-In-Cell Modeling For MJ Dense Plasma Focus with Varied Anode Shape

    NASA Astrophysics Data System (ADS)

    Link, A.; Halvorson, C.; Schmidt, A.; Hagen, E. C.; Rose, D.; Welch, D.

    2014-10-01

    Megajoule scale dense plasma focus (DPF) Z-pinches with deuterium gas fill are compact devices capable of producing 10^12 neutrons per shot, but past predictive models of large-scale DPF have not included kinetic effects such as ion beam formation or anomalous resistivity. We report on progress in developing a predictive DPF model by extending our 2D axisymmetric collisional kinetic particle-in-cell (PIC) simulations to the 1 MJ, 2 MA Gemini DPF using the PIC code LSP. These new simulations incorporate electrodes and an external pulsed-power driver circuit, and model the plasma from insulator lift-off through the pinch phase. The simulations were performed using a new hybrid fluid-to-kinetic model, transitioning from a fluid description to a fully kinetic PIC description during the run-in phase. Simulations are advanced through the final pinch phase using an adaptive variable time-step to capture the fs and sub-mm scales of the kinetic instabilities involved in ion beam formation and neutron production. Results will be presented on the predicted effects of different anode configurations. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory (LLNL) under Contract DE-AC52-07NA27344 and supported by the Laboratory Directed Research and Development Program (11-ERD-063) and the Computing Grand Challenge program at LLNL. This work was also supported by the Office of Defense Nuclear Nonproliferation Research and Development within the U.S. Department of Energy's National Nuclear Security Administration.

  11. Comparative ELM study between the observation by ECEI and linear/nonlinear simulation in the KSTAR plasmas

    NASA Astrophysics Data System (ADS)

    Kim, Minwoo; Park, Hyeon K.; Yun, Gunsu; Lee, Jaehyun; Lee, Jieun; Lee, Woochang; Jardin, Stephen; Xu, X. Q.; Kstar Team

    2015-11-01

    Modeling of the edge-localized mode (ELM) should be rigorously pursued for reliable and robust ELM control in steady-state long-pulse H-mode operation in ITER as well as DEMO. In the KSTAR discharge #7328, the linear stability of the ELMs is investigated using the M3D-C1 and BOUT++ codes. This is achieved by linear simulation of the n = 8 mode structure of the ELM observed by the KSTAR electron cyclotron emission imaging (ECEI) systems. In the course of the analysis, variations in the ELM growth rate due to the plasma equilibrium profiles and transport coefficients are investigated, and simulation results from the two codes are compared. The numerical simulations are extended to the nonlinear phase of the ELM dynamics, which includes saturation and crash of the modes. Preliminary results of the nonlinear simulations are compared with the measured images, especially from saturation through the crash. This work is supported by NRF of Korea under contract no. NRF-2014M1A7A1A03029865, US DoE by LLNL under contract DE-AC52-07NA27344 and US DoE by PPPL under contract DE-AC02-09CH11466.

  12. View Factor and Radiation-Hydrodynamic Simulations of Gas-Filled Outer-Quad-Only Hohlraums at the National Ignition Facility

    NASA Astrophysics Data System (ADS)

    Young, Christopher; Meezan, Nathan; Landen, Otto

    2017-10-01

    A cylindrical National Ignition Facility hohlraum irradiated exclusively by NOVA-like outer quads (44.5° and 50° beams) is proposed to minimize laser plasma interaction (LPI) losses and avoid problems with propagating the inner (23.5° and 30°) beams. Symmetry and drive are controlled by shortening the hohlraum, using a smaller laser entrance hole (LEH), beam phasing the 44.5° and 50° beams, and correcting the remaining P4 asymmetry with a capsule shim. Ensembles of time-resolved view factor simulations help narrow the design space of the new configuration, with fine tuning provided by the radiation-hydrodynamic code HYDRA. Prepared by LLNL under Contract DE-AC52-07NA27344.

  13. Non-local electron transport validation using 2D DRACO simulations

    NASA Astrophysics Data System (ADS)

    Cao, Duc; Chenhall, Jeff; Moll, Eli; Prochaska, Alex; Moses, Gregory; Delettrez, Jacques; Collins, Tim

    2012-10-01

    Comparison of 2D DRACO simulations, using a modified version (private communications with M. Marinak and G. Zimmerman, LLNL) of the Schurtz, Nicolai and Busquet (SNB) algorithm ("A nonlocal electron conduction model for multidimensional radiation hydrodynamics codes," Phys. Plasmas 7, 4238 (2000)) for non-local electron transport, with direct drive shock timing experiments (T. Boehly et al., "Multiple spherically converging shock waves in liquid deuterium," Phys. Plasmas 18, 092706 (2011)) and with the Goncharov non-local model (V. Goncharov et al., "Early stage of implosion in inertial confinement fusion: Shock timing and perturbation evolution," Phys. Plasmas 13, 012702 (2006)) in 1D LILAC will be presented. The addition of an improved SNB non-local electron transport algorithm in DRACO allows direct drive simulations with no need for an electron conduction flux limiter. Validation against shock timing experiments that mimic the laser pulse profile of direct drive ignition targets gives a higher confidence level in the predictive capability of the DRACO code. This research was supported by the University of Rochester Laboratory for Laser Energetics.

  14. Complete event simulations of nuclear fission

    NASA Astrophysics Data System (ADS)

    Vogt, Ramona

    2015-10-01

    For many years, the state of the art for treating fission in radiation transport codes has involved sampling from average distributions. In these average fission models, energy is not explicitly conserved and everything is uncorrelated because all particles are emitted independently. However, in a true fission event, the energies, momenta and multiplicities of the emitted particles are correlated. Such correlations are interesting for many modern applications. Event-by-event generation of complete fission events makes it possible to retain the kinematic information for all particles emitted: the fission products as well as prompt neutrons and photons. It is therefore possible to extract any desired correlation observables. Complete event simulations can be included in general Monte Carlo transport codes. We describe the general functionality of currently available fission event generators and compare results for several important observables. This work was performed under the auspices of the US DOE by LLNL under Contract DE-AC52-07NA27344. We acknowledge support of the Office of Defense Nuclear Nonproliferation Research and Development in DOE/NNSA.
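
    A toy sketch of the event-by-event idea (in the spirit of, but far simpler than, production fission event generators; the multiplicity model and the 20 MeV energy budget are invented for illustration):

```python
# Conceptual sketch of event-by-event fission sampling: each event draws a
# neutron multiplicity, then partitions a fixed available energy among the
# neutrons, with the remainder carried off by photons. Energy is conserved
# per event, and neutron energies are correlated with multiplicity; neither
# holds when sampling independently from average distributions.
import random

def sample_event(rng, e_available=20.0, nu_bar=2.4):
    """Return (neutron_energies, photon_energy) for one fission event."""
    nu = max(0, int(rng.gauss(nu_bar, 1.0) + 0.5))  # crude multiplicity draw
    weights = [rng.random() for _ in range(nu)]
    total_w = sum(weights) + 1.0      # the extra 1.0 reserves a photon share
    neutrons = [e_available * w / total_w for w in weights]
    photons = e_available - sum(neutrons)
    return neutrons, photons

rng = random.Random(12345)
events = [sample_event(rng) for _ in range(1000)]
# Per-event energy conservation holds exactly, by construction:
assert all(abs(sum(n) + g - 20.0) < 1e-9 for n, g in events)
```
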

  15. Experiments and simulations of Richtmyer-Meshkov Instability with measured,volumetric initial conditions

    NASA Astrophysics Data System (ADS)

    Sewell, Everest; Ferguson, Kevin; Jacobs, Jeffrey; Greenough, Jeff; Krivets, Vitaliy

    2016-11-01

    We describe experiments of single-shock Richtmyer-Meshkov Instability (RMI) performed on the shock tube apparatus at the University of Arizona in which the initial conditions are volumetrically imaged prior to shock wave arrival. Initial perturbations play a major role in the evolution of RMI, and previous experimental efforts captured only a single plane of the initial condition. The method presented uses a rastered laser sheet to capture additional images throughout the depth of the initial condition immediately before the shock arrival time. These images are then used to reconstruct a volumetric approximation of the experimental perturbation. Analysis of the initial perturbations is performed, and the results are then used as initial conditions in simulations using the hydrodynamics code ARES, developed at Lawrence Livermore National Laboratory (LLNL). Experiments are presented and comparisons are made with simulation results.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman, A.; Barnard, J. J.; Cohen, R. H.

    The Heavy Ion Fusion Science Virtual National Laboratory (a collaboration of LBNL, LLNL, and PPPL) is using intense ion beams to heat thin foils to the "warm dense matter" regime at ≲1 eV, and is developing capabilities for studying target physics relevant to ion-driven inertial fusion energy. The need for rapid target heating led to the development of plasma-neutralized pulse compression, with current amplification factors exceeding 50 now routine on the Neutralized Drift Compression Experiment (NDCX). Construction of an improved platform, NDCX-II, has begun at LBNL with planned completion in 2012. Using refurbished induction cells from the Advanced Test Accelerator at LLNL, NDCX-II will compress a ~500 ns pulse of Li+ ions to ~1 ns while accelerating it to 3-4 MeV over ~15 m. Strong space charge forces are incorporated into the machine design at a fundamental level. We are using analysis, an interactive 1D PIC code (ASP) with optimizing capabilities and centroid tracking, and multi-dimensional Warp-code PIC simulations to develop the NDCX-II accelerator. This paper describes the computational models employed, and the resulting physics design for the accelerator.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman, A; Barnard, J J; Cohen, R H

    The Heavy Ion Fusion Science Virtual National Laboratory (a collaboration of LBNL, LLNL, and PPPL) is using intense ion beams to heat thin foils to the "warm dense matter" regime at ≲1 eV, and is developing capabilities for studying target physics relevant to ion-driven inertial fusion energy. The need for rapid target heating led to the development of plasma-neutralized pulse compression, with current amplification factors exceeding 50 now routine on the Neutralized Drift Compression Experiment (NDCX). Construction of an improved platform, NDCX-II, has begun at LBNL with planned completion in 2012. Using refurbished induction cells from the Advanced Test Accelerator at LLNL, NDCX-II will compress a ~500 ns pulse of Li+ ions to ~1 ns while accelerating it to 3-4 MeV over ~15 m. Strong space charge forces are incorporated into the machine design at a fundamental level. We are using analysis, an interactive 1D PIC code (ASP) with optimizing capabilities and centroid tracking, and multi-dimensional Warp-code PIC simulations to develop the NDCX-II accelerator. This paper describes the computational models employed, and the resulting physics design for the accelerator.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman, A; Barnard, J J; Briggs, R J

    The Heavy Ion Fusion Science Virtual National Laboratory (HIFS-VNL), a collaboration of LBNL, LLNL, and PPPL, has achieved 60-fold pulse compression of ion beams on the Neutralized Drift Compression eXperiment (NDCX) at LBNL. In NDCX, a ramped voltage pulse from an induction cell imparts a velocity "tilt" to the beam; the beam's tail then catches up with its head in a plasma environment that provides neutralization. The HIFS-VNL's mission is to carry out studies of warm dense matter (WDM) physics using ion beams as the energy source; an emerging thrust is basic target physics for heavy ion-driven inertial fusion energy (IFE). These goals require an improved platform, labeled NDCX-II. Development of NDCX-II at modest cost was recently enabled by the availability of induction cells and associated hardware from the decommissioned Advanced Test Accelerator (ATA) facility at LLNL. Our initial physics design concept accelerates a ~30 nC pulse of Li+ ions to ~3 MeV, then compresses it to ~1 ns while focusing it onto a mm-scale spot. It uses the ATA cells themselves (with waveforms shaped by passive circuits) to impart the final velocity tilt; smart pulsers provide small corrections. The ATA accelerated electrons; acceleration of non-relativistic ions involves more complex beam dynamics both transversely and longitudinally. We are using an interactive one-dimensional kinetic simulation model and multidimensional Warp-code simulations to develop the NDCX-II accelerator section. Both the LSP and Warp codes are being applied to the beam dynamics in the neutralized drift and final focus regions, and to the plasma injection process. The status of this effort is described.

  19. Enhanced Verification Test Suite for Physics Simulation Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, J R; Brock, J S; Brandon, S T

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and the technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength of materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary, but not sufficient, step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of greater sophistication or other physics regimes (e.g., energetic material response, magneto-hydrodynamics), would represent a scientifically desirable complement to the fundamental test cases discussed in this report. The authors believe that this document can be used to enhance the verification analyses undertaken at the DOE WP Laboratories and, thus, to improve the quality, credibility, and usefulness of the simulation codes that are analyzed with these problems.

  20. Axial deformed solution of the Skyrme-Hartree-Fock-Bogolyubov equations using the transformed harmonic oscillator Basis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perez, R. Navarro; Schunck, N.; Lasseri, R.

    2017-03-09

    HFBTHO is a physics computer code that is used to model the structure of the nucleus. It is an implementation of nuclear energy Density Functional Theory (DFT), where the energy of the nucleus is obtained by integration over space of a phenomenological energy density, which is itself a functional of the neutron and proton densities. In HFBTHO, the energy density derives either from the zero-range Skyrme or the finite-range Gogny effective two-body interaction between nucleons. Nuclear superfluidity is treated at the Hartree-Fock-Bogoliubov (HFB) approximation, and axial symmetry of the nuclear shape is assumed. This version is the 3rd release of the program; the two previous versions were published in Computer Physics Communications [1,2]. The previous version was released at LLNL under the GPL 3 Open Source License and was given release code LLNL-CODE-573953.
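
    Schematically, the DFT energy described above can be written as follows (a sketch only; the actual Skyrme and Gogny functionals carry many additional gradient, spin-orbit, density-dependent and pairing terms):

```latex
E[\rho_n, \rho_p] \;=\; \int \mathrm{d}^3 r \;
  \mathcal{E}\!\left(\rho_n(\mathbf{r}),\, \rho_p(\mathbf{r}),\, \ldots\right)
```

    Minimizing this energy with respect to the densities, under particle-number constraints and the Bogoliubov quasiparticle transformation, yields the HFB equations that the code solves.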

  1. Rarefaction-driven Rayleigh–Taylor instability. Part 2. Experiments and simulations in the nonlinear regime

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morgan, R. V.; Cabot, W. H.; Greenough, J. A.

    Experiments and large eddy simulation (LES) were performed to study the development of the Rayleigh–Taylor instability into the saturated, nonlinear regime, produced between two gases accelerated by a rarefaction wave. Single-mode two-dimensional and single-mode three-dimensional initial perturbations were introduced on the diffuse interface between the two gases prior to acceleration. The rarefaction wave imparts a non-constant acceleration and a time-decreasing Atwood number, $A=(\rho_{2}-\rho_{1})/(\rho_{2}+\rho_{1})$, where $\rho_{2}$ and $\rho_{1}$ are the densities of the heavy and light gas, respectively. Experiments and simulations are presented for initial Atwood numbers of $A=0.49$, $A=0.63$, $A=0.82$ and $A=0.94$. Nominally two-dimensional (2-D) experiments (initiated with nearly 2-D perturbations) and 2-D simulations are observed to approach an intermediate-time velocity plateau that is in disagreement with the late-time velocity obtained from the incompressible model of Goncharov (Phys. Rev. Lett., vol. 88, 2002, 134502). Reacceleration from an intermediate velocity is observed for 2-D bubbles in large wavenumber, $k=2\pi/\lambda=0.247~\text{mm}^{-1}$, experiments and simulations, where $\lambda$ is the wavelength of the initial perturbation. At moderate Atwood numbers, the bubble and spike velocities approach larger values than those predicted by Goncharov's model. These late-time velocity trends are predicted well by numerical simulations using the LLNL Miranda code, and by the 2009 model of Mikaelian (Phys. Fluids, vol. 21, 2009, 024103) that extends Layzer-type models to variable acceleration and density. Large Atwood number experiments show a delayed roll-up and exhibit a free-fall-like behaviour. Finally, experiments initiated with three-dimensional perturbations tend to agree better with models and a simulation using the LLNL Ares code initiated with axisymmetric rather than Cartesian symmetry.

  2. Rarefaction-driven Rayleigh–Taylor instability. Part 2. Experiments and simulations in the nonlinear regime

    DOE PAGES

    Morgan, R. V.; Cabot, W. H.; Greenough, J. A.; ...

    2018-01-12

    Experiments and large eddy simulation (LES) were performed to study the development of the Rayleigh–Taylor instability into the saturated, nonlinear regime, produced between two gases accelerated by a rarefaction wave. Single-mode two-dimensional and single-mode three-dimensional initial perturbations were introduced on the diffuse interface between the two gases prior to acceleration. The rarefaction wave imparts a non-constant acceleration and a time-decreasing Atwood number, $A=(\rho_{2}-\rho_{1})/(\rho_{2}+\rho_{1})$, where $\rho_{2}$ and $\rho_{1}$ are the densities of the heavy and light gas, respectively. Experiments and simulations are presented for initial Atwood numbers of $A=0.49$, $A=0.63$, $A=0.82$ and $A=0.94$. Nominally two-dimensional (2-D) experiments (initiated with nearly 2-D perturbations) and 2-D simulations are observed to approach an intermediate-time velocity plateau that is in disagreement with the late-time velocity obtained from the incompressible model of Goncharov (Phys. Rev. Lett., vol. 88, 2002, 134502). Reacceleration from an intermediate velocity is observed for 2-D bubbles in large wavenumber, $k=2\pi/\lambda=0.247~\text{mm}^{-1}$, experiments and simulations, where $\lambda$ is the wavelength of the initial perturbation. At moderate Atwood numbers, the bubble and spike velocities approach larger values than those predicted by Goncharov's model. These late-time velocity trends are predicted well by numerical simulations using the LLNL Miranda code, and by the 2009 model of Mikaelian (Phys. Fluids, vol. 21, 2009, 024103) that extends Layzer-type models to variable acceleration and density. Large Atwood number experiments show a delayed roll-up and exhibit a free-fall-like behaviour. Finally, experiments initiated with three-dimensional perturbations tend to agree better with models and a simulation using the LLNL Ares code initiated with axisymmetric rather than Cartesian symmetry.

  3. Experiments and simulations of single shock Richtmyer-Meshkov Instability with measured, volumetric initial conditions

    NASA Astrophysics Data System (ADS)

    Sewell, Everest; Ferguson, Kevin; Greenough, Jeffrey; Jacobs, Jeffrey

    2014-11-01

    We describe new experiments of single-shock Richtmyer-Meshkov Instability (RMI) performed on the shock tube apparatus at the University of Arizona in which the initial conditions are volumetrically imaged prior to shock wave arrival. The initial perturbation plays a major role in the evolution of RMI, and previous experimental efforts captured only a narrow slice of the initial condition. The method presented uses a rastered laser sheet to capture additional images through the depth of the initial condition shortly before the experimental start time. These images are then used to reconstruct a volumetric approximation of the experimental perturbation, which is simulated using the hydrodynamics code ARES, developed at Lawrence Livermore National Laboratory (LLNL). Comparison is made between the time evolution of the interface width and the mixedness ratio measured from the experiments against the predictions of the numerical simulations.

  4. CRUNCH_PARALLEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shumaker, Dana E.; Steefel, Carl I.

    The code CRUNCH_PARALLEL is a parallel version of the CRUNCH code. CRUNCH code version 2.0 was previously released by LLNL (UCRL-CODE-200063). CRUNCH is a general-purpose reactive transport code developed by Steefel and Yabusaki (Steefel and Yabusaki, 1996). The code handles non-isothermal transport and reaction in one, two, and three dimensions. The reaction algorithm is generic in form, handling an arbitrary number of aqueous and surface complexation reactions as well as mineral dissolution/precipitation. A standardized database containing thermodynamic and kinetic data is used. The code includes advective, dispersive, and diffusive transport.
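
    A minimal sketch of the advective/dispersive/diffusive transport operators mentioned above (not CRUNCH itself; just a bare 1-D explicit upwind/central scheme on a periodic grid, with invented parameter values):

```python
# One explicit time step of 1-D advection plus diffusion/dispersion on a
# periodic grid: first-order upwind for advection (assumes u > 0), central
# differencing for diffusion. Chemistry is omitted; this only illustrates
# the transport half of a reactive transport step.
def step(c, u, d, dx, dt):
    """Advance concentrations c by one step; u = velocity, d = diffusivity."""
    n = len(c)
    out = []
    for i in range(n):
        up, dn = c[(i - 1) % n], c[(i + 1) % n]
        adv = -u * (c[i] - up) / dx               # upwind advective flux
        dif = d * (dn - 2.0 * c[i] + up) / dx**2  # central diffusion
        out.append(c[i] + dt * (adv + dif))
    return out

# A pulse of solute advects downstream and spreads; with periodic boundaries
# the total mass is conserved by construction.
c = [1.0 if 4 <= i <= 6 else 0.0 for i in range(32)]
for _ in range(100):
    c = step(c, u=0.5, d=0.01, dx=1.0, dt=0.5)
print(f"total mass = {sum(c):.6f}")
```

    The chosen step sizes keep the explicit update stable (CFL number 0.25, diffusion number 0.005); production codes use implicit or operator-split schemes to relax these limits.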

  5. Livermore Compiler Analysis Loop Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hornung, R. D.

    2013-03-01

    LCALS is designed to evaluate compiler optimizations and performance for a variety of loop kernels and loop traversal software constructs. Some of the loop kernels are pulled directly from "Livermore Loops Coded in C", developed at LLNL (see item 11 below for details of earlier code versions). The older suites were used to evaluate floating-point performance of hardware platforms prior to porting larger application codes. The LCALS suite is geared toward assessing C++ compiler optimizations and platform performance related to SIMD vectorization, OpenMP threading, and advanced C++ language features. LCALS contains 20 of the 24 loop kernels from the older Livermore Loop suites, plus various others representative of loops found in current production application codes at LLNL. The latter loops emphasize more diverse loop constructs and data access patterns than the others, such as multi-dimensional difference stencils. The loops are included in a configurable framework, which allows control over compilation, loop sampling for execution timing, which loops are run, and their lengths. It generates timing statistics for analyzing and comparing variants of individual loops. Also, it is easy to add loops to the suite as desired.
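
    For flavor, the classic Livermore "hydro fragment" kernel (Kernel 1 of the original Livermore Loops) can be sketched with a crude sampling harness; this Python transcription is only illustrative, since LCALS itself is a C++ timing framework:

```python
# Livermore Loops Kernel 1 ("hydro fragment"):
#   x[k] = q + y[k] * (r * z[k+10] + t * z[k+11])
# wrapped in a repeated-sampling timing harness of the kind LCALS provides.
import time

def hydro_fragment(n, q, r, t, y, z):
    return [q + y[k] * (r * z[k + 10] + t * z[k + 11]) for k in range(n)]

n = 1000
y = [0.001 * k for k in range(n + 11)]
z = [0.002 * k for k in range(n + 11)]

start = time.perf_counter()
for _ in range(100):   # sample the loop repeatedly to get a stable timing
    x = hydro_fragment(n, q=1.0, r=2.0, t=3.0, y=y, z=z)
elapsed = time.perf_counter() - start
print(f"{len(x)} elements, {elapsed:.4f} s for 100 samples")
```
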

  6. Construction safety program for the National Ignition Facility, July 30, 1999 (NIF-0001374-OC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benjamin, D W

    1999-07-30

    These rules apply to all LLNL employees, non-LLNL employees (including contract labor, supplemental labor, vendors, personnel matrixed/assigned from other National Laboratories, participating guests, visitors and students) and contractors/subcontractors. The General Rules-Code of Safe Practices shall be used by management to promote accident prevention through indoctrination, safety and health training, and on-the-job application. As a condition of contract award, all contractors and subcontractors and their employees must certify on Form S and H A-l that they have read and understand, or have been briefed and understand, the National Ignition Facility OCIP Project General Rules-Code of Safe Practices. (An interpreter must brief those employees who do not speak or read English fluently.) In addition, all contractors and subcontractors shall adopt a written General Rules-Code of Safe Practices that relates to their operations. The General Rules-Code of Safe Practices must be posted at a conspicuous location at the job site office or be provided to each supervisory employee, who shall have it readily available. Copies of the General Rules-Code of Safe Practices can also be included in employee safety pamphlets.

  7. Parallel Monte Carlo transport modeling in the context of a time-dependent, three-dimensional multi-physics code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Procassini, R.J.

    1997-12-31

    The fine-scale, multi-space resolution that is envisioned for accurate simulations of complex weapons systems in three spatial dimensions implies flop-rate and memory-storage requirements that will only be obtained in the near future through the use of parallel computational techniques. Since the Monte Carlo transport models in these simulations usually stress both of these computational resources, they are prime candidates for parallelization. The MONACO Monte Carlo transport package, which is currently under development at LLNL, will utilize two types of parallelism within the context of a multi-physics design code: decomposition of the spatial domain across processors (spatial parallelism) and distribution of particles in a given spatial subdomain across additional processors (particle parallelism). This implementation of the package will utilize explicit data communication between domains (message passing). Such a parallel implementation of a Monte Carlo transport model will result in non-deterministic communication patterns. The communication of particles between subdomains during a Monte Carlo time step may require a significant level of effort to achieve a high parallel efficiency.
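
    The spatial-parallelism idea can be sketched as follows (a conceptual stand-in: plain lists play the role of MPI messages, and the domain decomposition, step size and particle counts are invented for illustration):

```python
# Conceptual sketch of spatial parallelism for Monte Carlo transport: the
# 1-D "mesh" [0, 1) is split into subdomains, each owned by one simulated
# processor. After each tracking step, particles that left their subdomain
# are "communicated" (here, moved between lists) to the owner of their new
# position; the outboxes stand in for explicit message passing.
import random

NDOMAINS = 4

def owner(x):
    return min(int(x * NDOMAINS), NDOMAINS - 1)

def track_and_exchange(domains, rng):
    """Move each particle a random step, then migrate strays to their owner."""
    outboxes = [[] for _ in range(NDOMAINS)]   # stand-ins for MPI messages
    for d, particles in enumerate(domains):
        kept = []
        for x in particles:
            x = min(max(x + rng.uniform(-0.1, 0.1), 0.0), 0.999)
            (kept if owner(x) == d else outboxes[owner(x)]).append(x)
        domains[d] = kept
    for d in range(NDOMAINS):                  # "receive" migrated particles
        domains[d].extend(outboxes[d])
    return domains

rng = random.Random(7)
domains = [[rng.uniform(d / NDOMAINS, (d + 1) / NDOMAINS) for _ in range(25)]
           for d in range(NDOMAINS)]
for _ in range(10):
    domains = track_and_exchange(domains, rng)
# Every particle ends up on the processor that owns its position.
assert all(owner(x) == d for d, ps in enumerate(domains) for x in ps)
```

    Because how many particles cross each boundary depends on the random walks themselves, the message sizes and destinations vary from step to step, which is exactly the non-deterministic communication pattern the abstract describes.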

  8. Hydrodynamic Modeling of the Deep Impact Mission into Comet Tempel 1

    NASA Astrophysics Data System (ADS)

    Sorli, Kya; Remington, Tané; Bruck Syal, Megan

    2018-01-01

    Kinetic impact is one of the primary strategies to deflect hazardous objects off of an Earth-impacting trajectory. The only test of a small-body impact is the 2005 Deep Impact mission into comet Tempel 1, where a 366-kg mass impactor collided at ~10 km/s into the comet, liberating an enormous amount of vapor and ejecta. Code comparisons with observations of the event represent an important source of new information about the initial conditions of small bodies and an extraordinary opportunity to test our simulation capabilities on a rare, full-scale experiment. Using the Adaptive Smoothed Particle Hydrodynamics (ASPH) code, Spheral, we explore how variations in target material properties such as strength, composition, porosity, and layering affect impact results, in order to best match the observed crater size and ejecta evolution. Benchmarking against this unique small-body experiment provides an enhanced understanding of our ability to simulate asteroid or comet response to future deflection missions. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LLNL-ABS-739336-DRAFT.

  9. Simulation of Plasma Transport in a Toroidal Annulus with TEMPEST

    NASA Astrophysics Data System (ADS)

    Xiong, Z.

    2005-10-01

    TEMPEST is an edge gyro-kinetic continuum code currently under development at LLNL to study boundary plasma transport over a region extending from inside the H-mode pedestal across the separatrix to the divertor plates. Here we report simulation results from the 4D (θ, ψ, E, μ) TEMPEST, for benchmarking purposes, in an annulus region immediately inside the separatrix of a large-aspect-ratio, circular cross-section tokamak. Besides the normal poloidal trapping regions, there are radially inaccessible regions at fixed poloidal angle, energy, and magnetic moment, due to the radial variation of the B field. To handle such cases, a fifth-order WENO differencing scheme is used in the radial direction. The particle and heat transport coefficients are obtained for different collisional regimes and compared with neo-classical transport theory.
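    The fifth-order WENO reconstruction mentioned above can be sketched with the classic Jiang-Shu weights applied to 1-D point values; TEMPEST's actual radial discretization is more involved, so treat this as a generic illustration of the scheme, not the code's implementation.

```python
def weno5(fm2, fm1, f0, fp1, fp2, eps=1e-6):
    """Fifth-order WENO value at the right interface i+1/2 from the
    five-point stencil i-2..i+2 (Jiang-Shu formulation)."""
    # Three third-order candidate reconstructions
    p0 = (2*fm2 - 7*fm1 + 11*f0) / 6.0
    p1 = (-fm1 + 5*f0 + 2*fp1) / 6.0
    p2 = (2*f0 + 5*fp1 - fp2) / 6.0
    # Smoothness indicators penalize stencils crossing sharp features
    b0 = 13/12*(fm2 - 2*fm1 + f0)**2 + 0.25*(fm2 - 4*fm1 + 3*f0)**2
    b1 = 13/12*(fm1 - 2*f0 + fp1)**2 + 0.25*(fm1 - fp1)**2
    b2 = 13/12*(f0 - 2*fp1 + fp2)**2 + 0.25*(3*f0 - 4*fp1 + fp2)**2
    # Nonlinear weights biased toward the smoothest stencils
    a0, a1, a2 = 0.1/(eps + b0)**2, 0.6/(eps + b1)**2, 0.3/(eps + b2)**2
    s = a0 + a1 + a2
    return (a0*p0 + a1*p1 + a2*p2) / s

# Smooth (linear) data is reconstructed exactly at the interface:
h = 0.1
f = [2*(i*h) + 1 for i in range(5)]   # f(x) = 2x + 1 on x = 0, 0.1, ..., 0.4
mid = weno5(*f)                       # interface at x = 0.25
assert abs(mid - (2*0.25 + 1)) < 1e-12
```

    The adaptive weights are what let the scheme cross the inaccessible-region boundaries without the oscillations a fixed high-order stencil would produce.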

  10. Modeling and design of radiative hydrodynamic experiments with X-ray Thomson Scattering measurements on NIF

    NASA Astrophysics Data System (ADS)

    Ma, K. H.; Lefevre, H. J.; Belancourt, P. X.; MacDonald, M. J.; Doeppner, T.; Keiter, P. A.; Kuranz, C. C.; Johnsen, E.

    2017-10-01

    Recent experiments at the National Ignition Facility studied the effect of radiation on shock-driven hydrodynamic instability growth. X-ray radiography images from these experiments indicate that perturbation growth is lower in highly radiative shocks compared to shocks with negligible radiation flux. The reduction in instability growth is attributed to ablation from higher temperatures in the foam for highly radiative shocks. The proposed design implements the X-ray Thomson Scattering (XRTS) technique in the radiative shock tube platform to measure electron temperatures and densities in the shocked foam. We model these experiments with CRASH, an Eulerian radiation hydrodynamics code with block-adaptive mesh refinement, multi-group radiation transport, and electron heat conduction. Simulations are presented with SiO2 and carbon foams for both the high-temperature, radiative shock and the low-temperature, hydrodynamic shock cases. Calculations from CRASH give estimates of shock speed, electron temperature, effective ionization, and other quantities necessary for designing the XRTS diagnostic measurement. This work is funded by LLNL under subcontract B614207, and was performed under the auspices of the U.S. DOE by LLNL under Contract No. DE-AC52-07NA27344.

  11. Deflection by Kinetic Impact or Nuclear Ablation: Sensitivity to Asteroid Properties

    NASA Astrophysics Data System (ADS)

    Bruck Syal, M.

    2015-12-01

    Impulsive deflection of a threatening asteroid can be achieved by deploying either a kinetic impactor or a standoff nuclear device to impart a modest velocity change to the body. Response to each of these methods is sensitive to the individual asteroid's characteristics, some of which may not be well constrained before an actual deflection mission. Numerical simulations of asteroid deflection, using both hypervelocity impacts and nuclear ablation of the asteroid's surface, provide detailed information on asteroid response under a range of initial conditions. Here we present numerical results for the deflection of asteroids by both kinetic and nuclear methods, focusing on the roles of target body composition, strength, porosity, rotational state, shape, and internal structure. These results provide a framework for evaluating the planetary defense-related value of future asteroid characterization missions and capture some of the uncertainty that may be present in a real threat scenario. Part of this work was funded by the Laboratory Directed Research and Development Program at LLNL under project tracking code 12-ERD-005, performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LLNL-ABS-675914.

  12. 2005 White Paper on Institutional Capability Computing Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carnes, B; McCoy, M; Seager, M

    This paper documents the need for a significant increase in the computing infrastructure provided to scientists working in the unclassified domains at Lawrence Livermore National Laboratory (LLNL). This need could be viewed as the next step in a broad strategy outlined in the January 2002 White Paper (UCRL-ID-147449) that bears essentially the same name as this document. Therein we wrote: 'This proposed increase could be viewed as a step in a broader strategy linking hardware evolution to applications development that would take LLNL unclassified computational science to a position of distinction if not preeminence by 2006.' This position of distinction has certainly been achieved. This paper provides a strategy for sustaining this success but will diverge from its 2002 predecessor in that it will: (1) Amplify the scientific and external success LLNL has enjoyed because of the investments made in 2002 (MCR, 11 TF) and 2004 (Thunder, 23 TF). (2) Describe in detail the nature of additional investments that are important to meet both the institutional objectives of advanced capability for breakthrough science and the scientists' clearly stated request for adequate capacity and more rapid access to moderate-sized resources. (3) Put these requirements in the context of an overall strategy for simulation science and external collaboration. While our strategy for Multiprogrammatic and Institutional Computing (M&IC) has worked well, three challenges must be addressed to assure and enhance our position. The first is that while we now have over 50 important classified and unclassified simulation codes available for use by our computational scientists, we find ourselves coping with high demand for access and long queue wait times. This point was driven home in the 2005 Institutional Computing Executive Group (ICEG) 'Report Card' to the Deputy Director for Science and Technology (DDST) Office and Computation Directorate management. 
The second challenge is related to the balance that should be maintained in the simulation environment. With the advent of Thunder, the institution directed a change in course from past practice. Instead of making Thunder available to the large body of scientists, as was MCR, and effectively using it as a capacity system, the intent was to make it available to perhaps ten projects so that these teams could run very aggressive problems for breakthrough science. This usage model established Thunder as a capability system. The challenge this strategy raises is that the majority of scientists have not seen an improvement in capacity computing resources since MCR, thus creating significant tension in the system. The question then is: 'How do we address the institution's desire to maintain the potential for breakthrough science and also meet the legitimate requests from the ICEG to achieve balance?' Both the capability and the capacity environments must be addressed through this one procurement. The third challenge is to reach out more aggressively to the national science community to encourage access to LLNL resources as part of a strategy for sharpening our science through collaboration. Related to this, LLNL has been unable in the past to provide access for sensitive foreign nationals (SFNs) to the Livermore Computing (LC) unclassified 'yellow' network. Identifying some mechanism for data sharing between LLNL computational scientists and SFNs would be a first practical step in fostering cooperative, collaborative relationships with an important and growing sector of the American science community.

  13. Application Modernization at LLNL and the Sierra Center of Excellence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neely, J. Robert; de Supinski, Bronis R.

    In 2014, Lawrence Livermore National Laboratory began acquisition of Sierra, a pre-exascale system from IBM and Nvidia. It marks a significant shift in direction for LLNL by introducing the concept of heterogeneous computing via GPUs. LLNL’s mission requires application teams to prepare for this paradigm shift. Thus, the Sierra procurement required a proposed Center of Excellence that would align the expertise of the chosen vendors with laboratory personnel representing the application developers, system software, and tool providers in a concentrated effort to prepare the laboratory’s codes in advance of the system transitioning to production in 2018. Finally, this article presents LLNL’s overall application strategy, with a focus on how LLNL is collaborating with IBM and Nvidia to ensure a successful transition of its mission-oriented applications into the exascale era.

  14. Ferrenberg Swendsen Analysis of LLNL and NYBlue BG/L p4rhms Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soltz, R

    2007-12-05

    These results are from the continuing Lattice Quantum Chromodynamics runs on BG/L. These results are from the Ferrenberg-Swendsen analysis of the combined data from LLNL and NYBlue BG/L runs for 32^3 x 8 runs with the p4rhmc v2.0 QMP-MPI.X (semi-optimized p4 code using QMP over MPI). The jobs include beta values ranging from 3.525 to 3.535, with an alternate analysis extending to 3.540. The NYBlue data sets are from 9k trajectories from Oct 2007, and the LLNL data are from two independent streams of ~5k each, taken from the July 2007 runs. The following outputs are produced by the fs-2+1-chiub.c program. All outputs have had checksums produced by addCks.pl and checked by the checkCks.pl perl script after scanning.
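    The core of Ferrenberg-Swendsen reweighting is that energies sampled at one coupling beta0 can be reused to estimate observables at a nearby beta. A minimal sketch, using a two-level toy system rather than the QCD action analyzed in the report:

```python
import math, random

random.seed(1)
beta0, beta1 = 1.5, 1.0
# Sample E in {0, 1} from the Boltzmann distribution at beta0.
p1 = math.exp(-beta0) / (1.0 + math.exp(-beta0))
energies = [1 if random.random() < p1 else 0 for _ in range(200_000)]

# Reweight to beta1:
#   <E>_beta1 = sum E_i w_i / sum w_i,  w_i = exp(-(beta1 - beta0) E_i)
w = [math.exp(-(beta1 - beta0) * e) for e in energies]
mean_e = sum(e * wi for e, wi in zip(energies, w)) / sum(w)

exact = math.exp(-beta1) / (1.0 + math.exp(-beta1))  # analytic answer
assert abs(mean_e - exact) < 0.01
```

    The same weights interpolate between ensembles generated at the different beta values (3.525 to 3.540 here), which is why runs at a handful of couplings suffice to map out the transition region.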

  15. The Effect of Interchanging the Polarity of the Dense Plasma Focus on Neutron Yield

    NASA Astrophysics Data System (ADS)

    Jiang, Sheng; Higginson, Drew; Link, Anthony; Schmidt, Andrea

    2017-10-01

    Dense plasma focus (DPF) Z-pinch devices can serve as portable neutron sources when deuterium is used as the filling gas. DPF devices are normally operated with the inner electrode as the anode. It has been found that interchanging the polarity of the electrodes can cause an orders-of-magnitude decrease in the neutron yield. Here we use the particle-in-cell (PIC) code LSP to model a DPF with both polarities. We find differences in the shape of the sheath, the voltage and current traces, and the electric and magnetic fields in the pinch region between the two polarities. A detailed comparison will be presented. Prepared by LLNL under Contract DE-AC52-07NA27344 and supported by the Laboratory Directed Research and Development Program (15-ERD-034) at LLNL. Computing support for this work came from the LLNL Institutional Computing Grand Challenge program.

  16. Application Modernization at LLNL and the Sierra Center of Excellence

    DOE PAGES

    Neely, J. Robert; de Supinski, Bronis R.

    2017-09-01

    In 2014, Lawrence Livermore National Laboratory began acquisition of Sierra, a pre-exascale system from IBM and Nvidia. It marks a significant shift in direction for LLNL by introducing the concept of heterogeneous computing via GPUs. LLNL’s mission requires application teams to prepare for this paradigm shift. Thus, the Sierra procurement required a proposed Center of Excellence that would align the expertise of the chosen vendors with laboratory personnel representing the application developers, system software, and tool providers in a concentrated effort to prepare the laboratory’s codes in advance of the system transitioning to production in 2018. Finally, this article presents LLNL’s overall application strategy, with a focus on how LLNL is collaborating with IBM and Nvidia to ensure a successful transition of its mission-oriented applications into the exascale era.

  17. SDAV Viz July Progress Update: LANL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sewell, Christopher Meyer

    2012-07-30

    SDAV Viz July Progress Update: (1) VPIC (Vector Particle in Cell) Kinetic Plasma Simulation Code - (a) Implemented first version of an in-situ adapter based on the ParaView CoProcessing Library, (b) Three pipelines: vtkDataSetMapper, vtkContourFilter, vtkPistonContour, (c) Next, resolve issues at boundaries of processor domains; add more advanced viz/analysis pipelines; (2) Halo finding/merger trees - (a) Summer student Wathsala W. from the University of Utah is working on a data-parallel halo finder algorithm using PISTON, (b) Timo Bremer (LLNL), Valerio Pascucci (Utah), George Zagaris (Kitware), and LANL people are interested in using merger trees for tracking the evolution of halos in cosmo simulations; discussed possible overlap with work by Salman Habib and Katrin Heitmann (Argonne) during their visit to LANL 7/11; (3) PISTON integration in ParaView - Now available from the ParaView github.

  18. Development of a New System for Transport Simulation and Analysis at General Atomics

    NASA Astrophysics Data System (ADS)

    St. John, H. E.; Peng, Q.; Freeman, J.; Crotinger, J.

    1997-11-01

    General Atomics has begun a long-term program to improve all aspects of experimental data analysis related to DIII-D. The objective is to make local and visiting physicists as productive as possible, with only a small investment in training, by developing intuitive, sophisticated interfaces to existing and newly created computer programs. Here we describe our initial work and the results of a pilot project in this program. The pilot project is a collaborative effort between LLNL and GA which will ultimately result in the merger of Corsica and ONETWO (and selected modules from other codes) into a new advanced transport code system. The initial goal is to produce a graphical user interface to the transport code ONETWO which will couple to a programmable (steerable) front end designed for the transport system. This will be an object-oriented scheme written primarily in Python. The programmable application will integrate existing C, C++, and Fortran methods in a single computational paradigm. Its most important feature is the use of plug-in physics modules which will allow a high degree of customization.
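    The plug-in physics-module idea can be sketched as a registry that lets a steerable Python front end swap physics implementations at run time. All names here (register, run_transport, the toy diffusivity profiles) are hypothetical, not the actual ONETWO/Corsica interfaces.

```python
REGISTRY = {}

def register(name):
    """Decorator: make a physics module selectable by name."""
    def wrap(cls):
        REGISTRY[name] = cls
        return cls
    return wrap

@register("neoclassical")
class NeoclassicalTransport:
    def chi(self, rho):                 # toy thermal diffusivity profile
        return 0.1 + 0.4 * rho**2

@register("anomalous")
class AnomalousTransport:
    def chi(self, rho):
        return 1.0 + 2.0 * rho

def run_transport(model_name, rho_grid):
    model = REGISTRY[model_name]()      # the customization point
    return [model.chi(r) for r in rho_grid]

profile = run_transport("neoclassical", [0.0, 0.5, 1.0])
print(profile)                          # [0.1, 0.2, 0.5]
```

    Because the front end only sees the registry, C, C++, or Fortran implementations wrapped as Python classes slot in the same way, which is the "single computational paradigm" the abstract describes.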

  19. Waves Generated by Asteroid Impacts and Their Hazard Consequences on The Shorelines

    NASA Astrophysics Data System (ADS)

    Ezzedine, S. M.; Miller, P. L.; Dearborn, D. S.

    2014-12-01

    We have performed numerical simulations of a hypothetical asteroid impact onto the ocean in support of an emergency preparedness, planning, and management exercise. We addressed the scenario from asteroid entry; to ocean impact (splash rim); to wave generation, propagation, and interaction with the shoreline. For the analysis we used GEODYN, a hydrocode, to simulate the impact and generate the source wave for the large-scale shallow water wave program, SWWP. Using state-of-the-art, high-performance computing codes we simulated three impact areas: two located on the West Coast, near Los Angeles's shoreline and the San Francisco Bay, respectively, and the third in the Gulf of Mexico, with a possible impact location between Texas and Florida. On account of uncertainty in the exact impact location within the asteroid risk corridor, we examined multiple possibilities for impact points within each area. Uncertainty in the asteroid impact location was then convolved and represented as uncertainty in the shoreline flooding zones. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, and partially funded by the Laboratory Directed Research and Development Program at LLNL under tracking code 12-ERD-005.
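    A rough sketch of the shallow-water propagation step, assuming a 1-D periodic domain and a first-order Lax-Friedrichs update; SWWP itself is a far more capable 2-D solver, so this only illustrates the governing equations (depth h and momentum hu) it advances.

```python
G = 9.81
N, DX, DT = 200, 1.0, 0.05   # CFL: DT * max wave speed < DX

# Initial hump of water standing in for the impact-generated source wave.
h = [1.0 + (0.5 if 90 <= i < 110 else 0.0) for i in range(N)]
hu = [0.0] * N

def flux(h_i, hu_i):
    """Shallow-water fluxes for (h, hu)."""
    u = hu_i / h_i
    return hu_i, hu_i * u + 0.5 * G * h_i * h_i

mass0 = sum(h)
for _ in range(100):
    fh = [0.0] * N
    fhu = [0.0] * N
    for i in range(N):
        fh[i], fhu[i] = flux(h[i], hu[i])
    hn = [0.0] * N
    hun = [0.0] * N
    for i in range(N):
        l, r = (i - 1) % N, (i + 1) % N   # periodic neighbors
        hn[i] = 0.5*(h[l] + h[r]) - DT/(2*DX)*(fh[r] - fh[l])
        hun[i] = 0.5*(hu[l] + hu[r]) - DT/(2*DX)*(fhu[r] - fhu[l])
    h, hu = hn, hun

assert abs(sum(h) - mass0) < 1e-9    # conservative scheme: mass preserved
assert min(h) > 0.0                  # water depth stays positive
```

    The hump splits into left- and right-going gravity waves, the 1-D analogue of the ring wave spreading from the splash rim toward the shorelines.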

  20. PIC simulations of the trapped electron filamentation instability in finite-width electron plasma waves

    NASA Astrophysics Data System (ADS)

    Winjum, B. J.; Banks, J. W.; Berger, R. L.; Cohen, B. I.; Chapman, T.; Hittinger, J. A. F.; Rozmus, W.; Strozzi, D. J.; Brunner, S.

    2012-10-01

    We present results on the kinetic filamentation of finite-width nonlinear electron plasma waves (EPW). Using 2D simulations with the PIC code BEPS, we excite a traveling EPW with a Gaussian transverse profile and a wavenumber k0λDe = 1/3. The transverse wavenumber spectrum broadens during transverse EPW localization for small-width (but sufficiently large amplitude) waves, while the spectrum narrows to a dominant k as the initial EPW width increases to the plane-wave limit. For large EPW widths, filaments can grow and destroy the wave coherence before transverse localization destroys the wave; the filaments in turn evolve individually as self-focusing EPWs. Additionally, a transverse electric field develops that affects trapped electrons, and a beam-like distribution of untrapped electrons develops between filaments and on the sides of a localizing EPW. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and funded by the Laboratory Directed Research and Development Program at LLNL under project tracking code 12-ERD-061. Supported also under Grants DE-FG52-09NA29552 and NSF-Phy-0904039. Simulations were performed on UCLA's Hoffman2 and NERSC's Hopper.

  1. An atomic and molecular fluid model for efficient edge-plasma transport simulations at high densities

    NASA Astrophysics Data System (ADS)

    Rognlien, Thomas; Rensink, Marvin

    2016-10-01

    Transport simulations for the edge plasma of tokamaks and other magnetic fusion devices require the coupling of the plasma and recycled or injected neutral gas. Various neutral models are used for this purpose, e.g., atomic fluid models, Monte Carlo particle models, transition/escape-probability methods, and semi-analytic models. While the Monte Carlo method is generally viewed as the most accurate, it is time consuming, and it becomes even more demanding for devices of the high density and size typical of fusion power plants, because the neutral collisional mean free path becomes very small. Here we examine the behavior of an extended fluid neutral model for hydrogen that includes both atoms and molecules, and which easily includes nonlinear neutral-neutral collision effects. In addition to the strong charge exchange between hydrogen atoms and ions, elastic scattering is included among all species. Comparisons are made with the DEGAS 2 Monte Carlo code. Work performed for the U.S. DoE by LLNL under Contract DE-AC52-07NA27344.

  2. Institutional Computing Executive Group Review of Multi-programmatic & Institutional Computing, Fiscal Year 2005 and 2006

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langer, S; Rotman, D; Schwegler, E

    The Institutional Computing Executive Group (ICEG) review of FY05-06 Multiprogrammatic and Institutional Computing (M and IC) activities is presented in the attached report. In summary, we find that the M and IC staff does an outstanding job of acquiring and supporting a wide range of institutional computing resources to meet the programmatic and scientific goals of LLNL. The responsiveness and high quality of support given to users and the programs investing in M and IC reflects the dedication and skill of the M and IC staff. M and IC has successfully managed serial capacity, parallel capacity, and capability computing resources. Serial capacity computing supports a wide range of scientific projects which require access to a few high performance processors within a shared memory computer. Parallel capacity computing supports scientific projects that require a moderate number of processors (up to roughly 1000) on a parallel computer. Capability computing supports parallel jobs that push the limits of simulation science. M and IC has worked closely with Stockpile Stewardship, and together they have made LLNL a premier institution for computational and simulation science. Such a standing is vital to the continued success of laboratory science programs and to the recruitment and retention of top scientists. This report provides recommendations to build on M and IC's accomplishments and improve simulation capabilities at LLNL. 
We recommend that the institution fully fund (1) operation of the atlas cluster purchased in FY06 to support a few large projects; (2) operation of the thunder and zeus clusters to enable 'mid-range' parallel capacity simulations during normal operation and a limited number of large simulations during dedicated application time; (3) operation of the new yana cluster to support a wide range of serial capacity simulations; (4) improvements to the reliability and performance of the Lustre parallel file system; (5) support for the new GDO petabyte-class storage facility on the green network for use in data-intensive external collaborations; and (6) continued support for visualization and other methods for analyzing large simulations. We also recommend that M and IC begin planning in FY07 for the next upgrade of its parallel clusters. LLNL investments in M and IC have resulted in a world-class simulation capability leading to innovative science. We thank the LLNL management for its continued support and thank the M and IC staff for its vision and dedicated efforts to make it all happen.

  3. Vlasov Simulation of the Effects of Collisions on the Damping of Electron Plasma Waves

    NASA Astrophysics Data System (ADS)

    Banks, Jeff; Berger, Richard; Chapman, Thomas; Brunner, Stephan; Tran, T.

    2015-11-01

    Kinetic simulation of two-dimensional plasma waves through direct discretization of the Vlasov equation may be particularly attractive for situations where minimal numerical fluctuation levels are desired, such as when measuring growth rates of plasma wave instabilities. In many cases collisional effects can be important to the evolution of plasma waves, because they both set a minimum damping rate for plasma waves and can scatter particles out of resonance through pitch-angle scattering. Here we present Vlasov simulations of evolving electron plasma waves (EPWs) in plasmas of varying collisionality. We consider first the effects of electron-ion pitch-angle collisions on the frequency and damping, Landau and collisional, of small-amplitude EPWs for a range of collision rates. In addition, the wave phase velocities are extracted from the simulation results and compared with theory. For this study we use the Eulerian-based kinetic code LOKI, which evolves the Vlasov-Poisson system in 2+2-dimensional phase space. We then discuss extensions of the collision operator to include thermalization. Discretization of these collision operators using fourth-order accurate conservative finite differencing will be discussed. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and funded by the LDRD program at LLNL under project tracking code 15-ERD-038.
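    The point of conservative finite differencing of a collision operator is that writing it as a difference of interface fluxes makes the discrete density integral exactly conserved. A minimal sketch using the simple 1-D Lenard-Bernstein model operator C[f] = nu d/dv (v f + vt^2 df/dv) as a stand-in (LOKI's operator and its fourth-order stencils are more elaborate):

```python
import math

NV, VMAX = 64, 6.0
DV = 2 * VMAX / NV
NU, VT2 = 0.5, 1.0
v = [-VMAX + (j + 0.5) * DV for j in range(NV)]   # cell-centered grid

f = [math.exp(-(vj - 1.0)**2) for vj in v]        # shifted initial f

def collide(f, dt):
    """One explicit step of the model collision operator in flux form."""
    # Interface flux G_{j+1/2} = nu * (v f + vt^2 df/dv), with zero flux
    # at both ends of the velocity grid so density cannot leak out.
    G = [0.0] * (NV + 1)
    for j in range(NV - 1):
        vh = -VMAX + (j + 1) * DV                 # interface velocity
        G[j + 1] = NU * (vh * 0.5 * (f[j] + f[j + 1])
                         + VT2 * (f[j + 1] - f[j]) / DV)
    # C_j = (G_{j+1/2} - G_{j-1/2}) / dv  -- a telescoping sum over j
    return [f[j] + dt * (G[j + 1] - G[j]) / DV for j in range(NV)]

n0 = sum(f) * DV
for _ in range(200):
    f = collide(f, 0.01)
assert abs(sum(f) * DV - n0) < 1e-10              # density conserved
```

    The drag-plus-diffusion flux relaxes the shifted distribution toward a Maxwellian while the telescoping flux differences keep the particle number fixed to rounding error, which is exactly the property a conservative discretization buys.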

  4. DYNA3D: A computer code for crashworthiness engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hallquist, J.O.; Benson, D.J.

    1986-09-01

    A finite element program with crashworthiness applications has been developed at LLNL. DYNA3D, an explicit, fully vectorized, finite-deformation structural dynamics program, has four capabilities that are critical for efficient and realistic modeling of crash phenomena: (1) fully optimized nonlinear solid, shell, and beam elements for representing a structure; (2) a broad range of constitutive models for simulating material behavior; (3) sophisticated contact algorithms for impact interactions; (4) a rigid body capability to represent the bodies away from the impact region at a greatly reduced cost without sacrificing accuracy in the momentum calculations. Basic methodologies of the program are briefly presented along with several crashworthiness calculations. Efficiencies of the Hughes-Liu and Belytschko-Tsay shell formulations are considered.

  5. Modeling of induction-linac based free-electron laser amplifiers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jong, R.A.; Fawley, W.M.; Scharlemann, E.T.

    We describe the modeling of an induction-linac based free-electron laser (IFEL) amplifier for producing multimegawatt levels of microwave power. We have used the Lawrence Livermore National Laboratory (LLNL) free-electron laser simulation code, FRED, and the simulation code for sideband calculations, GINGER, for this study. For IFEL amplifiers in the frequency range of interest (200 to 600 GHz), we have devised a wiggler design strategy which incorporates a tapering algorithm that is suitable for free-electron laser (FEL) systems with moderate space-charge effects and that minimizes spontaneous noise growth at frequencies below the fundamental, while enhancing the growth of the signal at the fundamental. In addition, engineering design considerations of the waveguide wall loading and electron beam fill factor in the waveguide set limits on the waveguide dimensions, the wiggler magnet gap spacing, the wiggler period, and the minimum magnetic field strength in the tapered region of the wiggler. As an example, we describe an FEL amplifier designed to produce an average power of about 10 MW at a frequency of 280 GHz, to be used for electron cyclotron resonance heating of tokamak fusion devices. 17 refs., 4 figs.

  6. LLNL contributions to ANL Report ANL/NE-16/6 "Sharp User Manual"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solberg, J. M.

    Diablo is a multiphysics, implicit finite element code with an emphasis on coupled structural/thermal analysis. In the SHARP framework, it is used as the structural solver and may also be used as the mesh smoother.

  7. Exploding Pusher Targets for Electron-Ion Coupling Measurements

    NASA Astrophysics Data System (ADS)

    Whitley, Heather D.; Pino, Jesse; Schneider, Marilyn; Shepherd, Ronnie; Benedict, Lorin; Bauer, Joseph; Graziani, Frank; Garbett, Warren

    2015-11-01

    Over the past several years, we have conducted theoretical investigations of electron-ion coupling and electronic transport in plasmas. In the regime of weakly coupled plasmas, we have identified models that we believe describe the physics well, but experimental data is still needed to validate the models. We are currently designing spectroscopic experiments to study electron-ion equilibration and/or electron heat transport using exploding pusher (XP) targets for experiments at the National Ignition Facility. Two platforms are being investigated: an indirect drive XP (IDXP) with a plastic ablator and a polar-direct drive XP (PDXP) with a glass ablator. The fill gas for both designs is D2. We propose to use a higher-Z dopant, such as Ar, as a spectroscopic tracer for time-resolved electron and ion temperature measurements. We perform 1D simulations using the ARES hydrodynamic code, in order to produce the time-resolved plasma conditions, which are then post-processed with CRETIN to assess the feasibility of a spectroscopic measurement. We examine target performance with respect to variations in gas fill pressure, ablator thickness, atom fraction of the Ar dopant, and drive energy, and assess the sensitivity of the predicted spectra to variations in the models for electron-ion equilibration and thermal conductivity. Prepared by LLNL under Contract DE-AC52-07NA27344. LLNL-ABS-675219.

  8. Chemistry Resolved Kinetic Flow Modeling of TATB Based Explosives

    NASA Astrophysics Data System (ADS)

    Vitello, Peter; Fried, Lawrence; Howard, Mike; Levesque, George; Souers, Clark

    2011-06-01

    Detonation waves in insensitive, TATB-based explosives are believed to have multi-time-scale regimes. The initial burn rate of such explosives has a sub-microsecond time scale. However, significant late-time slow release of energy is believed to occur due to diffusion-limited growth of carbon. On the intermediate time scale, concentrations of product species likely change from being in equilibrium to being kinetic-rate controlled. We use the thermo-chemical code CHEETAH linked to ALE hydrodynamics codes to model detonations. We term our model chemistry-resolved kinetic flow, as CHEETAH tracks the time-dependent concentrations of individual species in the detonation wave and calculates EOS values based on those concentrations. A validation suite of model simulations compared to recent high-fidelity metal push experiments at ambient and cold temperatures has been developed. We present here a study of multi-time-scale kinetic rate effects for these experiments. Prepared by LLNL under Contract DE-AC52-07NA27344.

  9. Simulating Afterburn with LLNL Hydrocodes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daily, L D

    2004-06-11

    Presented here is a working methodology for adapting a Lawrence Livermore National Laboratory (LLNL) developed hydrocode, ALE3D, to simulate weapon damage effects when afterburn is a consideration in the blast propagation. Experiments have shown that afterburn is of great consequence in enclosed environments (e.g., a bomb-in-tunnel scenario, a penetrating conventional munition in a bunker, or a satchel charge placed in a deep underground facility). This empirical energy deposition methodology simulates the anticipated addition of kinetic energy that has been demonstrated by experiment (Kuhl et al. 1998), without explicitly solving the chemistry or resolving the mesh to capture small-scale vorticity. This effort is intended to complement the existing capability of either coupling ALE3D blast simulations with DYNA3D or performing fully coupled ALE3D simulations to predict building or component failure, for applications in National Security offensive strike planning as well as Homeland Defense infrastructure protection.

  10. Optimization and Control of Burning Plasmas Through High Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pankin, Alexei

    This project has revived the FACETS code, which was developed under SciDAC funding in 2008-2012. The code had been dormant for a number of years after the SciDAC funding stopped. FACETS depends on external packages. The external packages and libraries such as PETSc, FFTW, HDF5, and NETCDF that are included in FACETS have evolved during these years. Some packages in FACETS are also parts of other codes such as PlasmaState, NUBEAM, GACODES, and UEDGE. These packages have also evolved together with their host codes, which include TRANSP, TGYRO, and XPTOR. Finally, there is also a set of packages in FACETS that are being developed and maintained by Tech-X. These packages include BILDER, SciMake, and FcioWrappers. Many of these packages evolved significantly during the last several years, and FACETS had to be updated to synchronize with the recent progress in the external packages. The PI has introduced new changes to the BILDER package to support the updated interfaces to the external modules. During the last year of the project, the FACETS version of the UEDGE code was extracted from FACETS as a standalone package. The PI collaborates with scientists from LLNL on the updated UEDGE model in FACETS. Drs. T. Rognlien, M. Umansky, and A. Dimits from LLNL are contributing to this task.

  11. Petascale Simulation Initiative Tech Base: FY2007 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    May, J; Chen, R; Jefferson, D

    The Petascale Simulation Initiative began as an LDRD project in the middle of Fiscal Year 2004. The goal of the project was to develop techniques to allow large-scale scientific simulation applications to better exploit the massive parallelism that will come with computers running at petaflops per second. One of the major products of this work was the design and prototype implementation of a programming model and a runtime system that lets data-parallel applications use task parallelism. By adopting task parallelism, applications can use processing resources more flexibly, exploit multiple forms of parallelism, and support more sophisticated multiscale and multiphysics models. Our programming model was originally called the Symponents Architecture but is now known as Cooperative Parallelism, and the runtime software that supports it is called Coop. (However, we sometimes refer to the programming model as Coop for brevity.) We have documented the programming model and runtime system in a submitted conference paper [1]. This report focuses on the specific accomplishments of the Cooperative Parallelism project (as we now call it) under Tech Base funding in FY2007. Development and implementation of the model under LDRD funding alone proceeded to the point of demonstrating a large-scale materials modeling application using Coop on more than 1300 processors by the end of FY2006. Beginning in FY2007, the project received funding from both LDRD and the Computation Directorate Tech Base program. Later in the year, after the three-year term of the LDRD funding ended, the ASC program supported the project with additional funds. The goal of the Tech Base effort was to bring Coop from a prototype to a production-ready system that a variety of LLNL users could work with. 
Specifically, the major tasks that we planned for the project were: (1) Port SARS [former name of the Coop runtime system] to another LLNL platform, probably Thunder or Peloton (depending on when Peloton becomes available); (2) Improve SARS's robustness and ease-of-use, and develop user documentation; and (3) Work with LLNL code teams to help them determine how Symponents could benefit their applications. The original funding request was $296,000 for the year, and we eventually received $252,000. The remainder of this report describes our efforts and accomplishments for each of the goals listed above.« less

  12. DHS Summary Report -- Robert Weldon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weldon, Robert A.

    This summer I worked on benchmarking the Lawrence Livermore National Laboratory fission multiplicity capability used in the Monte Carlo particle transport code MCNPX. This work involved running simulations and then comparing the simulation results with experimental measurements. Outlined in this paper is a brief description of the work completed this summer, the skills and knowledge gained, and how the internship has impacted my planning for the future. Neutron multiplicity counting is a neutron detection technique that leverages the multiplicity of neutrons emitted from fission to identify various actinides in a lump of material. The identification of individual actinides in lumps of material crossing our borders, especially U-235 and Pu-239, is a key component of maintaining the safety of the country from nuclear threats. Several multiplicity emission options for spontaneous and induced fission already existed in MCNPX 2.4.0. These options can be accessed through the 6th entry on the PHYS:N card. Lawrence Livermore National Laboratory (LLNL) developed a physics model for the simulation of neutron and gamma-ray emission from fission and photofission that was included in MCNPX 2.7.B as an undocumented feature and then was documented in MCNPX 2.7.C. The LLNL multiplicity capability provided a different means for MCNPX to simulate neutron and gamma-ray distributions for neutron-induced, spontaneous, and photonuclear fission reactions. The original testing of the model for implementation into MCNPX was conducted by Gregg McKinney and John Hendricks. The model is an encapsulation of measured neutron multiplicity distribution data from Gwin, Spencer, and Ingle, along with the data from Zucker and Holden. One of the founding principles of MCNPX was that it would have several redundant capabilities, providing the means of testing and including various physics packages.
Though several multiplicity sampling methodologies already existed within MCNPX, the LLNL fission multiplicity was included to provide a separate capability for computing multiplicity as well as several new features not already included in MCNPX. These new features include: (1) prompt gamma emission/multiplicity from neutron-induced fission; (2) neutron multiplicity and gamma emission/multiplicity from photofission; and (3) an option to enforce energy correlation for gamma-neutron multiplicity emission. These new capabilities allow correlated signal detection for identifying the presence of special nuclear material (SNM). They therefore help meet the missions of the Domestic Nuclear Detection Office (DNDO), which is tasked with developing nuclear detection strategies for identifying potential radiological and nuclear threats, by providing new simulation capability for detection strategies that leverage the newly available physics in the LLNL multiplicity capability. Two types of tests were performed this summer to test the default LLNL neutron multiplicity capability: neutron-induced fission tests and spontaneous fission tests. Both cases set the 6th entry on the PHYS:N card to 5 (i.e., use LLNL multiplicity). The neutron-induced fission tests utilized a simple 0.001 cm radius sphere in which 0.0253 eV neutrons were released at the sphere center. Neutrons were forced to immediately collide in the sphere and release all progeny from the sphere, without further collision, using the LCA card, LCA 7j -2 (therefore the density and size of the sphere were irrelevant). Enough particles were run to ensure that the average error of any specific multiplicity did not exceed 0.36%. Neutron-induced fission multiplicities were computed for U-233, U-235, Pu-239, and Pu-241.
The spontaneous fission tests used the same spherical geometry, except that: (1) the LCA card was removed; (2) the density of the sphere was set to 0.001 g/cm³; and (3) instead of emitting a thermal neutron, the PAR keyword was set to PAR=SF. The purpose of the small density was to ensure that the spontaneous fission neutrons would not further interact and induce fissions (i.e., the mean free path greatly exceeded the size of the sphere). Enough particles were run to ensure that the average error of any specific spontaneous multiplicity did not exceed 0.23%. Spontaneous fission multiplicities were computed for U-238, Pu-238, Pu-240, Pu-242, Cm-242, and Cm-244. All of the computed results were compared against experimental results compiled by Holden at Brookhaven National Laboratory.
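    The neutron-induced test setup described above can be sketched as an MCNPX-style input deck. Only the details quoted in the report are used here (PHYS:N 6th entry = 5, the LCA 7j -2 entry, the 0.001 cm sphere, and the 0.0253 eV source); all other cards (material, density, particle count) are illustrative assumptions, not the author's actual deck.

```
Sketch: LLNL fission multiplicity, neutron-induced test (hypothetical deck)
1    1  -18.7  -10        $ U-235 sphere (density illustrative; report notes
2    0          10        $   density/size are irrelevant with LCA 7j -2)

10   so  0.001            $ sphere of radius 0.001 cm

m1     92235  1           $ pure U-235 (one of the nuclides tested)
sdef   pos=0 0 0 erg=2.53e-8   $ 0.0253 eV neutron at the sphere center
phys:n 5j 5               $ 6th entry = 5 selects LLNL multiplicity (per report)
lca    7j -2              $ force immediate collision, release progeny
nps    1e8                $ illustrative; report ran enough for <=0.36% error
```

    For the spontaneous fission variant, the report removes the LCA card, sets the sphere density to 0.001 g/cm³, and replaces the source energy specification with PAR=SF on the SDEF card.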

  13. Propagation of Reactions in Thermally-damaged PBX-9501

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tringe, J W; Glascoe, E A; Kercher, J R

    A thermally-initiated explosion in PBX-9501 (octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine) is observed in situ by flash x-ray imaging, and modeled with the LLNL multi-physics arbitrary-Lagrangian-Eulerian code ALE3D. The containment vessel deformation provides a useful estimate of the reaction pressure at the time of the explosion, which we calculate to be in the range 0.8-1.4 GPa. Closely-coupled ALE3D simulations of these experiments, utilizing the multi-phase convective burn model, provide detailed predictions of the reacted mass fraction and deflagration front acceleration. During the pre-initiation heating phase of these experiments, the solid HMX portion of the PBX-9501 undergoes a β-phase to δ-phase transition which damages the explosive and induces porosity. The multi-phase convective burn model results demonstrate that damaged particle size and pressure are critical for predicting reaction speed and violence. In the model, energetic parameters are taken from LLNL's thermochemical-kinetics code Cheetah and burn rate parameters from Son et al. (2000). Model predictions of an accelerating deflagration front are in qualitative agreement with the experimental images assuming a mode particle diameter in the range 300-400 µm. There is uncertainty in the initial porosity caused by thermal damage of PBX-9501 and, thus, in the effective surface area for burning. To better understand these structures, we employ x-ray computed tomography (XRCT) to examine the microstructure of PBX-9501 before and after thermal damage. Although lack of contrast between grains and binder prevents determination of the full grain size distribution in this material, there are many domains visible in thermally damaged PBX-9501 with diameters in the 300-400 µm range.

  14. Cross Domain Deterrence: Livermore Technical Report, 2014-2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, Peter D.; Bahney, Ben; Matarazzo, Celeste

    2016-08-03

    Lawrence Livermore National Laboratory (LLNL) is an original collaborator on the project titled “Deterring Complex Threats: The Effects of Asymmetry, Interdependence, and Multi-polarity on International Strategy” (CDD Project), led by the UC Institute on Global Conflict and Cooperation at UCSD under PIs Jon Lindsay and Erik Gartzke, and funded through the DoD Minerva Research Initiative. In addition to participating in workshops and facilitating interaction among UC social scientists, LLNL is leading the computational modeling effort and assisting with empirical case studies to probe the viability of analytic, modeling, and data analysis concepts. This report summarizes LLNL work on the CDD Project to date, primarily in Project Years 1-2, corresponding to Federal fiscal year 2015. LLNL brings two unique domains of expertise to bear on this Project: (1) access to scientific expertise on the technical dimensions of emerging threat technology, and (2) high performance computing (HPC) expertise, required for analyzing the complexity of bargaining interactions in the envisioned threat models. In addition, we have a small group of researchers trained as social scientists who are intimately familiar with International Relations research. We find that pairing simulation scientists, who are typically trained in computer science, with domain experts, social scientists in this case, is the most effective route to developing powerful new simulation tools capable of representing domain concepts accurately and answering challenging questions in the field.

  15. Initial development of 5D COGENT

    NASA Astrophysics Data System (ADS)

    Cohen, R. H.; Lee, W.; Dorf, M.; Dorr, M.

    2015-11-01

    COGENT is a continuum gyrokinetic edge code being developed by the Edge Simulation Laboratory (ESL) collaboration. Work to date has primarily focused on a 4D (axisymmetric) version that models transport properties of edge plasmas. We have begun development of an initial 5D version to study edge turbulence, with initial focus on kinetic effects on blob dynamics and drift-wave instability in a shearless magnetic field. We are employing compiler directives and preprocessor macros to create a single source code that can be compiled in 4D or 5D, which helps to ensure consistency of the physics representation between the two versions. A key aspect of COGENT is the use of a mapped multi-block grid capability to handle the complexity of divertor geometry. It is planned to eventually exploit this capability to handle magnetic shear through a series of successively skewed unsheared grid blocks. The initial version has an unsheared grid and will be used to explore the degree to which a radial domain must be block-decomposed. We report on the status of code development and initial tests. Work performed for USDOE at LLNL under contract DE-AC52-07NA27344.

  16. Efficient simulation of pitch angle collisions in a 2+2-D Eulerian Vlasov code

    NASA Astrophysics Data System (ADS)

    Banks, Jeff; Berger, R.; Brunner, S.; Tran, T.

    2014-10-01

    Here we discuss pitch angle scattering collisions in the context of the Eulerian-based kinetic code LOKI that evolves the Vlasov-Poisson system in 2+2-dimensional phase space. The collision operator is discretized using 4th order accurate conservative finite-differencing. The treatment of the Vlasov operator in phase-space uses an approach based on a minimally diffuse, fourth-order-accurate discretization (Banks and Hittinger, IEEE T. Plasma Sci. 39, 2198). The overall scheme is therefore discretely conservative and controls unphysical oscillations. Some details of the numerical scheme will be presented, and the implementation on modern highly concurrent parallel computers will be discussed. We will present results of collisional effects on linear and non-linear Landau damping of electron plasma waves (EPWs). In addition we will present initial results showing the effect of collisions on the evolution of EPWs in two space dimensions. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and funded by the LDRD program at LLNL under project tracking code 12-ERD-061.
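    The conservative, flux-form discretization idea described above can be sketched in a few lines. This is a generic second-order illustration of a pitch-angle diffusion operator, not LOKI's actual fourth-order scheme; every name, parameter, and the initial distribution below are assumptions for illustration only.

```python
import numpy as np

# Flux-form discretization of the pitch-angle scattering operator
#   d/dxi [ (1 - xi^2) df/dxi ],
# written so that the update telescopes: the total density is
# conserved to round-off.  LOKI uses 4th-order stencils; this sketch
# is 2nd order for brevity.

N = 64
xi = np.linspace(-1.0, 1.0, N + 1)      # cell edges in pitch angle
xic = 0.5 * (xi[:-1] + xi[1:])          # cell centers
dxi = xi[1] - xi[0]

def collision_rhs(f, nu=1.0):
    """RHS of df/dt = nu * d/dxi[(1-xi^2) df/dxi] in flux form."""
    D = 1.0 - xi**2                     # edge diffusion coefficient;
    flux = np.zeros(N + 1)              # vanishes at xi = +-1, which
    flux[1:-1] = D[1:-1] * (f[1:] - f[:-1]) / dxi   # gives zero flux
    return nu * (flux[1:] - flux[:-1]) / dxi        # at the boundary

f = np.exp(-xic**2 / 0.1)               # an arbitrary distribution
rhs = collision_rhs(f)

# Discrete conservation: density moves in xi, but the total is
# unchanged; the sum below is at round-off level.
print(abs(np.sum(rhs) * dxi))
```

    The telescoping of the flux differences is what makes the scheme "discretely conservative" in the sense the abstract uses; the same property carries over to higher-order flux reconstructions.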

  17. Large Survey of Neutron Spectrum Moments Due to ICF Drive Asymmetry

    NASA Astrophysics Data System (ADS)

    Field, J. E.; Munro, D.; Spears, B.; Peterson, J. L.; Brandon, S.; Gaffney, J. A.; Hammer, J.; Langer, S.; Nora, R. C.; Springer, P.; ICF Workflow Collaboration

    2016-10-01

    We have recently completed the largest HYDRA simulation survey to date (60,000 runs) of drive asymmetry on the new Trinity computer at LANL. The 2D simulations covered a large space of credible perturbations to the drive of ICF implosions on the NIF. Cumulants of the produced birth energy spectrum for DD and DT reaction neutrons were tallied using new methods. Comparison of the experimental spectra with our map of predicted spectra from simulation should provide a wealth of information about the burning plasma region. We report on our results, highlighting areas of agreement (and disagreement) with experimental spectra. We also identify features in the predicted spectra that might be amenable to measurement with improved diagnostics. Prepared by LLNL under Contract DE-AC52-07NA27344. IM release #: LLNL-PROC-697321.

  18. An Archive of Downscaled WCRP CMIP3 Climate Projections for Planning Applications in the Contiguous United States

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Pruitt, T.; Maurer, E. P.; Duffy, P. B.

    2007-12-01

    Incorporating climate change information into long-term evaluations of water and energy resources requires analysts to have access to climate projection data that have been spatially downscaled to "basin-relevant" resolution. This is necessary in order to develop system-specific hydrology and demand scenarios consistent with projected climate scenarios. Analysts currently have access to "climate model" resolution data (e.g., at LLNL PCMDI), but not spatially downscaled translations of these datasets. Motivated by a common interest in supporting regional and local assessments, the U.S. Bureau of Reclamation and LLNL (through support from the DOE National Energy Technology Laboratory) have teamed to develop an archive of downscaled climate projections (temperature and precipitation) with geographic coverage consistent with the North American Land Data Assimilation System domain, encompassing the contiguous United States. A web-based information service, hosted at the LLNL Green Data Oasis, has been developed to provide Reclamation, LLNL, and other interested analysts free access to archive content. A contemporary statistical method was used to bias-correct and spatially disaggregate projection datasets, and was applied to 112 projections included in the WCRP CMIP3 multi-model dataset hosted by LLNL PCMDI (i.e., 16 GCMs and their multiple simulations of the SRES A2, A1B, and B1 emissions pathways).
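    The bias-correction step in downscaling of this kind is commonly implemented as quantile mapping between the model and observed climatologies. The sketch below is a generic illustration of that technique under synthetic data, not the archive's actual code; all variable names and distributions are assumptions.

```python
import numpy as np

# Quantile-mapping bias correction: find a projected value's quantile
# in the model climatology, then return the same quantile of the
# observed climatology.  Spatial disaggregation (not shown) would
# interpolate the resulting corrections onto a fine grid.

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 2.0, 1000)   # "observed" monthly precipitation
gcm = rng.gamma(2.0, 3.0, 1000)   # biased GCM output, same period

def quantile_map(x, model_clim, obs_clim):
    """Map value x from the model distribution onto the observed one."""
    q = np.searchsorted(np.sort(model_clim), x) / len(model_clim)
    return np.quantile(obs_clim, np.clip(q, 0.0, 1.0))

projected = 9.0                   # a raw GCM-projected value
corrected = quantile_map(projected, gcm, obs)
print(corrected)                  # value with observed-climate statistics
```

    Because the mapping is monotone in the quantile, the rank ordering of projected values is preserved while their distribution is shifted to match observations.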

  19. Programming for 1.6 Million cores: Early experiences with IBM's BG/Q SMP architecture

    NASA Astrophysics Data System (ADS)

    Glosli, James

    2013-03-01

    With the stall in clock-speed improvements a decade ago, the drive for computational performance has continued along a path of increasing core counts per processor. The multi-core evolution has been expressed in both symmetric multiprocessor (SMP) architectures and CPU/GPU architectures. Debates rage in the high-performance computing (HPC) community over which architecture best serves HPC. In this talk I will not attempt to resolve that debate but perhaps fuel it. I will discuss the experience of exploiting Sequoia, a 98,304-node IBM Blue Gene/Q SMP at Lawrence Livermore National Laboratory. The advantages and challenges of leveraging the computational power of BG/Q will be detailed through the discussion of two applications. The first is a molecular dynamics code called ddcMD, developed over the last decade at LLNL and ported to BG/Q. The second is a cardiac modeling code called Cardioid, recently designed and developed at LLNL to exploit the fine-scale parallelism of BG/Q's SMP architecture. Through the lenses of these efforts I will illustrate the need to rethink how we express and implement our computational approaches. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  20. High density arrays of micromirrors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Folta, J. M.; Decker, J. Y.; Kolman, J.

    We established and achieved our goal to (1) fabricate and evaluate test structures based on the micromirror design optimized for maskless lithography applications, (2) perform system analysis and code development for the maskless lithography concept, and (3) identify specifications for micromirror arrays (MMAs) for LLNL's adaptive optics (AO) applications and conceptualize new devices.

  1. High-resolution 3D simulations of NIF ignition targets performed on Sequoia with HYDRA

    NASA Astrophysics Data System (ADS)

    Marinak, M. M.; Clark, D. S.; Jones, O. S.; Kerbel, G. D.; Sepke, S.; Patel, M. V.; Koning, J. M.; Schroeder, C. R.

    2015-11-01

    Developments in the multiphysics ICF code HYDRA enable it to perform large-scale simulations on the Sequoia machine at LLNL. With an aggregate computing power of 20 Petaflops, Sequoia offers an unprecedented capability to resolve the physical processes in NIF ignition targets for a more complete, consistent treatment of the sources of asymmetry. We describe modifications to HYDRA that enable it to scale to over one million processes on Sequoia. These include new options for replicating parts of the mesh over a subset of the processes, to avoid strong scaling limits. We consider results from a 3D full ignition capsule-only simulation performed using over one billion zones run on 262,000 processors which resolves surface perturbations through modes l = 200. We also report progress towards a high-resolution 3D integrated hohlraum simulation performed using 262,000 processors which resolves surface perturbations on the ignition capsule through modes l = 70. These aim for the most complete calculations yet of the interactions and overall impact of the various sources of asymmetry for NIF ignition targets. This work was performed under the auspices of the Lawrence Livermore National Security, LLC, (LLNS) under Contract No. DE-AC52-07NA27344.

  2. A Simulation Model for Drift Resistive Ballooning Turbulence Examining the Influence of Self-consistent Zonal Flows

    NASA Astrophysics Data System (ADS)

    Cohen, Bruce; Umansky, Maxim; Joseph, Ilon

    2015-11-01

    Progress is reported on including self-consistent zonal flows in simulations of drift-resistive ballooning turbulence using the BOUT++ framework. Previous published work addressed the simulation of L-mode edge turbulence in realistic single-null tokamak geometry using the BOUT three-dimensional fluid code that solves Braginskii-based fluid equations. The effects of imposed sheared E×B poloidal rotation were included, with a static radial electric field fitted to experimental data. In new work our goal is to include the self-consistent effects on the radial electric field driven by the microturbulence, which contributes to the sheared E×B poloidal rotation (zonal flow generation). We describe a model for including self-consistent zonal flows and an algorithm for maintaining underlying plasma profiles to enable the simulation of steady-state turbulence. We examine the role of Braginskii viscous forces in providing necessary dissipation when including axisymmetric perturbations. We also report on some of the numerical difficulties associated with including the axisymmetric component of the fluctuating fields. This work was performed under the auspices of the U.S. Department of Energy under contract DE-AC52-07NA27344 at the Lawrence Livermore National Laboratory (LLNL-ABS-674950).

  3. Simulation of the ELMs triggering by lithium pellet on EAST tokamak using BOUT++

    NASA Astrophysics Data System (ADS)

    Wang, Y. M.; Xu, X. Q.; Wang, Z.; Sun, Z.; Hu, J. S.; Gao, X.

    2017-10-01

    A new lithium granule injector (LGI) was developed on EAST. Using the LGI, lithium granules can be efficiently injected into the EAST tokamak with granule radii of 0.2-1 mm and granule velocities of 30-110 m/s. ELM pacing was realized in EAST shot #70123 in the time window 4.4-4.7 s; the average pellet velocity was 75 m/s and the average injection rate was 99 Hz. The BOUT++ 6-field electromagnetic turbulence code has been used to simulate the ELM pacing process. A neutral gas shielding (NGS) model has been implemented for the pellet ablation process. The neutral transport code is used to evaluate the ionized electron and Li ion densities, with charge exchange as a dominant factor in the neutral cloud diffusion process. The snapshot plasma profiles during the pellet ablation and toroidal symmetrization process are used in the 6-field turbulence code to evaluate the impact of the pellets on ELMs. Destabilizing effects on the peeling-ballooning modes are found with lithium pellet injection, which is consistent with the experimental results. A scan of the pellet size, shape, and injection velocity will be conducted, which will benefit pellet injection design in both present and future devices. Prepared by LLNL under Contract DE-AC52-07NA27344; this work is supported by the National Natural Science Foundation of China (Grant No. 11505221) and the China Scholarship Council (Grant No. 201504910132).

  4. Description of the Process Model for the Technoeconomic Evaluation of MEA versus Mixed Amines for Carbon Dioxide Removal from Stack Gas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Dale A.

    This model description is supplemental to the Lawrence Livermore National Laboratory (LLNL) report LLNL-TR-642494, Technoeconomic Evaluation of MEA versus Mixed Amines for CO2 Removal at Near-Commercial Scale at Duke Energy Gibson 3 Plant. We describe the assumptions and methodology used in the Laboratory's simulation of its understanding of Huaneng's novel amine solvent for CO2 capture with 35% mixed amine. The results of that simulation are described in LLNL-TR-642494. The simulation was performed using ASPEN 7.0. The composition of Huaneng's novel amine solvent was estimated based on information gleaned from Huaneng patents. The chemistry of the process was described using nine equations, representing reactions within the absorber and stripper columns using the ELECNRTL property method. As a rate-based ASPEN simulation model was not available to Lawrence Livermore at the time of writing, the height of a theoretical plate was estimated using open literature for similar processes. The composition of the flue gas was estimated based on information supplied by Duke Energy for Unit 3 of the Gibson plant. The simulation was scaled at one million short tons of CO2 absorbed per year. To aid stability of the model, convergence of the main solvent recycle loop was implemented manually, as described in the Blocks section below; automatic convergence of this loop led to instability during the model iterations. Manual convergence of the loop enabled accurate representation and maintained model stability.
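    Manual convergence of a flowsheet recycle (tear) stream is, in essence, damped successive substitution: re-evaluate the flowsheet, then take only a partial step toward the new value. The sketch below illustrates that generic technique only; the toy `flowsheet` function, the damping factor, and all names are hypothetical, not the ASPEN model's actual blocks.

```python
import numpy as np

# Damped successive substitution on a tear stream.  Each "flowsheet"
# call stands in for one pass through absorber + stripper, returning
# the recomputed recycle-stream state for a given guess.

def flowsheet(x):
    """Toy linear stand-in with fixed point x* = 10."""
    return 0.6 * x + 4.0

x = np.array([1.0])          # initial guess for the tear stream
alpha = 0.5                  # damping factor: smaller = more stable
for it in range(200):
    x_new = flowsheet(x)
    if np.max(np.abs(x_new - x)) < 1e-10:
        break                # tear stream converged
    x = x + alpha * (x_new - x)   # damped update avoids oscillation

print(float(x[0]))           # converged recycle state, ~10.0
```

    Full (undamped) substitution corresponds to alpha = 1; damping trades iteration count for stability, which mirrors the report's observation that automatic convergence of the loop was unstable while the manual approach held.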

  5. Annual Review of Progress in Applied Computational Electromagnetics (4th), Held in Monterey, California on March 22-24, 1988

    DTIC Science & Technology

    1988-03-24

    1430-1445 BREAK 1445-1645 EM CODE USERS PANEL DISCUSSION. Chaired by Wkn Breakal of LLNL. User community suggestions on needed enhancements for EM codes... "FINITE DIFFERENCE & FINITE ELEMENT METHODS" Moderator: David E. Stein, The LTV Aerospace and Defense Company. "A Finite Element Analysis of... conduction (resulting from charge movement) or displacement (ε₀ ∂E/∂t) terms. The sum of these current densities is referred to as the Maxwell current

  6. Assessment and Mitigation of Radiation, EMP, Debris & Shrapnel Impacts at Megajoule-Class Laser Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eder, D C; Anderson, R W; Bailey, D S

    2009-10-05

    The generation of neutron/gamma radiation, electromagnetic pulses (EMP), debris, and shrapnel at megajoule-class laser facilities (NIF and LMJ) impacts experiments conducted at these facilities. The complex 3D numerical codes used to assess these impacts range from an established code that required minor modifications (MCNP, which calculates neutron and gamma radiation levels in complex geometries), through a code that required significant modifications to treat new phenomena (EMSolve, which calculates EMP from electrons escaping from laser targets), to a new code, ALE-AMR, that is being developed through a joint collaboration between LLNL, CEA, and UC (UCSD, UCLA, and LBL) for debris and shrapnel modeling.

  7. Modern Chemistry Techniques Applied to Metal Behavior and Chelation in Medical and Environmental Systems ? Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sutton, M; Andresen, B; Burastero, S R

    2005-02-03

    This report details the research and findings generated over the course of a 3-year research project funded by Lawrence Livermore National Laboratory (LLNL) Laboratory Directed Research and Development (LDRD). Originally tasked with studying beryllium chemistry and chelation for the treatment of Chronic Beryllium Disease and environmental remediation of beryllium-contaminated environments, this work has yielded results in: beryllium and uranium solubility and speciation associated with toxicology; specific and effective chelation agents for beryllium, capable of lowering beryllium tissue burden and increasing urinary excretion in mice, and of dissolving beryllium contamination at LLNL Site 300; ⁹Be NMR studies previously unstudied at LLNL; secondary ion mass spectrometry (SIMS) imaging of beryllium in spleen and lung tissue; and beryllium interactions with aerogel/GAC material for environmental cleanup. The results show that chelator development using modern chemical techniques, such as chemical thermodynamic modeling, was successful in identifying and utilizing tried and tested beryllium chelators for use in medical and environmental scenarios. Additionally, a study of uranium speciation in simulated biological fluids identified uranium species present in urine, gastric juice, pancreatic fluid, airway surface fluid, simulated lung fluid, bile, saliva, plasma, interstitial fluid, and intracellular fluid.

  8. Advanced Concept Exploration for Fast Ignition Science Program, Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephens, Richard Burnite; McLean, Harry M.; Theobald, Wolfgang

    The Fast Ignition (FI) Concept for Inertial Confinement Fusion (ICF) has the potential to provide a significant advance in the technical attractiveness of Inertial Fusion Energy reactors. FI differs from conventional “central hot spot” (CHS) target ignition by decoupling compression from heating: a laser (or heavy ion beam or Z pinch) drive pulse (tens of nanoseconds) creates a dense fuel, and a second, much shorter (~10 picosecond) high-intensity pulse ignites a small volume within the dense fuel. The physics of the fast ignition process was the focus of our Advanced Concept Exploration (ACE) program. Ignition depends critically on two major issues involving Relativistic High Energy Density (RHED) physics: the laser-induced creation of fast electrons and their propagation in high-density plasmas. Our program has developed new experimental platforms, diagnostic packages, and computer modeling analyses, and has taken advantage of the increasing energy available at laser facilities to advance understanding of the fundamental physics underlying these issues. Our program had three thrust areas: • Understand the production and characteristics of fast electrons resulting from FI-relevant laser-plasma interactions and their dependence on laser prepulse and laser pulse length. • Investigate the subsequent fast electron transport in solid targets and through hot (FI-relevant) plasmas. • Conduct and understand integrated core-heating experiments by comparison to simulations. Over the whole period of this project (three years for this contract), we have greatly advanced our fundamental understanding of the underlying properties in all three areas: • Comprehensive studies on fast electron source characteristics have shown that they are controlled by the laser intensity distribution and the topology and plasma density gradient.
Laser pre-pulse-induced pre-plasma in front of a solid surface results in increased stand-off distances from the electron origin to the high-density target, as well as a large and erratic spread of the electron beam with increasing short-pulse duration. We have demonstrated, using newly available higher-contrast lasers, an improved energy coupling, painting a promising picture for FI feasibility. • Our detailed experiments and analyses of fast electron transport dependence on target material have shown that it is feasible to collimate the fast electron beam by self-generated resistive magnetic fields in engineered targets with a rather simple geometry. A stable, collimated electron beam with a spot size as small as 50 μm after >100 μm of propagation (an angular divergence of 20°) in solid-density plasma targets has been demonstrated with FI-relevant (10-ps, >1-kJ) laser pulses. Such a collimated beam would meet the required heating-beam size for FI. • Our new experimental platforms developed for the OMEGA laser (i.e., (i) a high-resolution 8 keV backlighter platform for cone-in-shell implosions and (ii) 8 keV imaging with Cu-doped shell targets for detailed transport characterization) have enabled us to experimentally confirm fuel assembly from cone-in-shell implosion with record-high areal density. We have also made the first direct measurement of fast electron transport and spatial energy deposition in integrated FI experiments, enabling the first experiment-based benchmarking of integrated simulation codes. Executing this program required a large team. It was managed as a collaboration between General Atomics (GA), Lawrence Livermore National Laboratory (LLNL), and the Laboratory for Laser Energetics (LLE). GA fulfills its responsibilities jointly with the University of California, San Diego (UCSD), The Ohio State University (OSU), and the University of Nevada at Reno (UNR).
The division of responsibility was as follows: (1) LLE had primary leadership for channeling studies and the integrated energy transfer, (2) LLNL led the development of measurement methods, analysis, and deployment of diagnostics, and (3) GA together with UCSD, OSU, and UNR studied the detailed energy-transfer physics. The experimental program was carried out using the Titan laser at the Jupiter Laser Facility at LLNL, the OMEGA and OMEGA EP lasers at LLE, and the Texas Petawatt laser at the University of Texas, Austin. Modeling has been pursued on large computing facilities at LLNL, OSU, and UCSD using codes developed (by us and others) within the HEDLP program, commercial codes, and by leveraging existing simulation codes developed by the National Nuclear Security Administration ICF program. One important aspect of this program was the involvement and training of young scientists, including postdoctoral fellows and graduate students. This project generated an impressive forty articles in high-quality journals, including nine (two under review) in Physical Review Letters, during the three years of this grant, and five graduate students completed their doctoral dissertations.

  9. LLNL - WRF-LES - Neutral - TTU

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kosovic, Branko

    This dataset includes large-eddy simulation (LES) output from a neutrally stratified atmospheric boundary layer (ABL) simulation of observations at the SWIFT tower near Lubbock, Texas on Aug. 17, 2012. The dataset was used to assess LES models for simulation of canonical neutral ABL. The dataset can be used for comparison with other LES and computational fluid dynamics model outputs.

  13. Fission In R-process Elements (FIRE) - Annual Report: Fiscal Year 2017

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schunck, Nicolas

    The goal of the FIRE topical collaboration in nuclear theory is to determine the astrophysical conditions of the rapid neutron capture process (r-process), which is responsible for the formation of heavy elements. This will be achieved by including in r-process simulations the most advanced models of fission (spontaneous, neutron-induced, beta-delayed) that have been developed at LLNL and LANL. The collaboration is composed of LLNL (lead) and LANL for work on nuclear data (ground-state properties, fission, beta-decay), BNL for nuclear data management, and the University of Notre Dame and North Carolina State University for r-process simulations. Under a DOE/NNSA agreement, both universities receive funds from the DOE Office of Science, while the national laboratories receive funds directly from NA221.

  11. Higher-Order Advection-Based Remap of Magnetic Fields in an Arbitrary Lagrangian-Eulerian Code

    NASA Astrophysics Data System (ADS)

    Cornille, Brian; White, Dan

    2017-10-01

    We will present methods formulated for the Eulerian advection stage of an arbitrary Lagrangian-Eulerian code for the new addition of magnetohydrodynamic (MHD) effects. The various physical fields are advanced in time using a Lagrangian formulation of the system. When this Lagrangian motion produces substantial distortion of the mesh, it can be difficult or impossible to progress the simulation forward. This is overcome by relaxation of the mesh while the physical fields are frozen. The code has already successfully been extended to include evolution of magnetic field diffusion during the Lagrangian motion stage. This magnetic field is discretized using an H(div) compatible finite element basis. The advantage of this basis is that the divergence-free constraint of magnetic fields is maintained exactly during the Lagrangian motion evolution. Our goal is to preserve this property during Eulerian advection as well. We will demonstrate this property and the importance of MHD effects in several numerical experiments. In pulsed-power experiments magnetic fields may be imposed or spontaneously generated. When these magnetic fields are present, the evolution of the experiment may differ from a comparable configuration without magnetic fields. Prepared by LLNL under Contract DE-AC52-07NA27344. Supported by DOE CSGF under Grant Number DE-FG02-97ER25308.
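To illustrate the divergence-free invariant the abstract emphasizes, here is a minimal NumPy sketch (ours, not the authors' finite element code) of the analogous property on a 2D staggered grid: a face-centered field built from a discrete vector potential has exactly zero cell-wise divergence, the same kind of constraint an H(div)-compatible basis maintains during Lagrangian motion.

```python
import numpy as np

nx, ny = 32, 24
dx, dy = 1.0 / nx, 1.0 / ny

# discrete vector potential A_z at mesh nodes (arbitrary values suffice)
rng = np.random.default_rng(0)
Az = rng.standard_normal((nx + 1, ny + 1))

# face-centered field: Bx lives on x-faces, By on y-faces (H(div)-like layout)
Bx = (Az[:, 1:] - Az[:, :-1]) / dy           # shape (nx+1, ny)
By = -(Az[1:, :] - Az[:-1, :]) / dx          # shape (nx, ny+1)

# cell-wise discrete divergence telescopes to zero by construction
div = (Bx[1:, :] - Bx[:-1, :]) / dx + (By[:, 1:] - By[:, :-1]) / dy
```

Because the divergence cancels identically rather than only to truncation order, any update expressed through such a potential preserves the constraint exactly, which is the property the remap method aims to carry over to the Eulerian advection stage.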

  12. Requirements for migration of NSSD code systems from LTSS to NLTSS

    NASA Technical Reports Server (NTRS)

    Pratt, M.

    1984-01-01

    The purpose of this document is to address the requirements necessary for a successful conversion of the Nuclear Design (ND) application code systems to the NLTSS environment. The ND application code systems are characterized by large-scale scientific computation carried out on supercomputers. NLTSS is a distributed operating system being developed at LLNL to replace the LTSS system currently in use. The implications of the change are examined, including a description of the computational environment and users in ND. The discussion then turns to requirements, first in a general way, followed by specific requirements, including a proposal for managing the transition.

  13. High Performance Parallel Processing (HPPP) Finite Element Simulation of Fluid Structure Interactions Final Report CRADA No. TC-0824-94-A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Couch, R.; Ziegler, D. P.

    This project was a multi-partner CRADA, a partnership between Alcoa and LLNL. Alcoa developed a system of numerical simulation modules that provided accurate and efficient three-dimensional modeling of combined fluid dynamics and structural response.

  14. Study of transport phenomena in laser-driven, non-equilibrium plasmas in the presence of external magnetic fields

    NASA Astrophysics Data System (ADS)

    Kemp, G. Elijah; Mariscal, D. A.; Williams, G. J.; Blue, B. E.; Colvin, J. D.; Fears, T. M.; Kerr, S. M.; May, M. J.; Moody, J. D.; Strozzi, D. J.; Lefevre, H. J.; Klein, S. R.; Kuranz, C. C.; Manuel, M. J.-E.; Gautier, D. C.; Montgomery, D. S.

    2017-10-01

    We present experimental and simulation results from a study of thermal transport inhibition in laser-driven, mid-Z, non-equilibrium plasmas in the presence of external magnetic fields. The experiments were performed at the Jupiter Laser Facility at LLNL, where x-ray spectroscopy, proton radiography, and Brillouin backscatter data were simultaneously acquired from sub-critical-density, Ti-doped silica aerogel foams driven by a 2ω laser at 5×10^14 W/cm^2. External B-field strengths up to 20 T (aligned antiparallel to the laser propagation axis) were provided by a capacitor-bank-driven Helmholtz coil. Pre-shot simulations with Hydra, a radiation-magnetohydrodynamics code, showed increasing electron plasma temperature with increasing B-field strength, the result of thermal transport inhibition perpendicular to the B-field. The influence of this thermal transport inhibition on the experimental observables as a function of external field strength and target density will be shown and compared with simulations. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract No. DE-AC52-07NA27344 and funded by LDRD project 17-ERD-027.
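The transport-inhibition mechanism invoked here can be sized up with a back-of-the-envelope magnetization estimate (the plasma parameters below are illustrative assumptions, not the experiment's measured conditions): cross-field electron heat conduction is suppressed roughly as 1/(1 + χ²), where χ = ω_ce τ_e is the electron Hall parameter.

```python
# Illustrative numbers only, chosen to be in the ballpark of a
# magnetized, sub-critical-density laser-heated foam; not measured values.
B   = 20.0       # applied field, T
Te  = 1.0e3      # electron temperature, eV
ne  = 1.0e21     # electron density, cm^-3
lnL = 7.0        # Coulomb logarithm

# NRL-formulary electron collision time and cyclotron frequency
tau_e    = 3.44e5 * Te**1.5 / (ne * lnL)   # seconds
omega_ce = 1.76e11 * B                     # rad/s
chi = omega_ce * tau_e                     # electron Hall parameter

# leading-order magnetization scaling of cross-field heat conduction
kappa_ratio = 1.0 / (1.0 + chi**2)         # kappa_perp / kappa_parallel
```

With these numbers χ is of order a few, so cross-field conduction is suppressed by more than an order of magnitude, the qualitative effect the Hydra simulations predict.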

  15. Investigations of Turbulent Transport Channels in Gyrokinetic Simulations

    NASA Astrophysics Data System (ADS)

    Dimits, A. M.; Candy, J.; Guttenfelder, W.; Holland, C.; Howard, N.; Nevins, W. M.; Wang, E.

    2014-10-01

    Magnetic-field stochasticity arises due to microtearing perturbations, which can be driven linearly or nonlinearly (in cases where they are linearly stable), even at very modest values of the plasma beta. The resulting magnetic-flutter contribution may or may not be a significant component of the overall electron (particle and thermal) transport. Investigations of the effect of ExB flow shear on the electron-drift magnetic-flutter diffusion coefficient D_edr(r, v||), using perturbed magnetic fields from GYRO simulations of ITG turbulence, show a significant effect for electrons with parallel velocities v|| surprisingly far from the resonant velocity. We further examine changes in the radial dependence of this diffusion coefficient vs. v|| and which resonant magnetic-field perturbations are important to the values and radial structure of D_edr. The resulting electron transport fluxes are compared with the simulation results. Improvements in treating the ambipolar field in the relationship between the magnetic (or drift) diffusion coefficients and the transport have been made in these comparisons. Prepared for US DOE by LLNL under Contract DE-AC52-07NA27344, by GA under Contract DE-FG03-95ER54309, and by PPPL under Contract DE-AC02-09CH11466.

  16. Correlated Production and Analog Transport of Fission Neutrons and Photons using Fission Models FREYA, FIFRELIN and the Monte Carlo Code TRIPOLI-4®.

    NASA Astrophysics Data System (ADS)

    Verbeke, Jérôme M.; Petit, Odile; Chebboubi, Abdelhazize; Litaize, Olivier

    2018-01-01

    Fission modeling in general-purpose Monte Carlo transport codes often relies on average nuclear data provided by international evaluation libraries. As such, only average fission multiplicities are available, and correlations between fission neutrons and photons are missing. Whereas uncorrelated fission physics is usually sufficient for standard reactor core and radiation shielding calculations, correlated fission secondaries are required for specialized nuclear instrumentation and detector modeling. For coincidence-counting detector optimization, for instance, precise simulation of fission neutrons and photons that remain correlated in time from birth to detection is essential. New developments were recently integrated into the Monte Carlo transport code TRIPOLI-4 to model fission physics more precisely, providing access to event-by-event fission data from two different fission models: FREYA and FIFRELIN. TRIPOLI-4 simulations can now be performed either by connecting via an API to the LLNL fission library, which includes FREYA, or by reading external fission event data files produced by FIFRELIN beforehand. These new capabilities enable us to easily compare results from Monte Carlo transport calculations using the two fission models in a nuclear instrumentation application. In the first part of this paper, the broad underlying principles of the two fission models are recalled. We then present experimental measurements of neutron angular correlations for 252Cf(sf) and 240Pu(sf). The correlations were measured for several neutron kinetic energy thresholds. In the latter part of the paper, simulation results are compared to experimental data. Spontaneous fissions in 252Cf and 240Pu are modeled by FREYA or FIFRELIN. Emitted neutrons and photons are subsequently transported to an array of scintillators by TRIPOLI-4 in analog mode to preserve their correlations.
Angular correlations between fission neutrons obtained independently from these TRIPOLI-4 simulations, using either FREYA or FIFRELIN, are compared to experimental results. For 240Pu(sf), the measured correlations were used to tune the model parameters.
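As a toy illustration of the observable being compared here, the sketch below (ours, with synthetic isotropic "events" standing in for FREYA or FIFRELIN output) histograms the cosine of the opening angle between all neutron pairs within each fission event; a correlated fission model would produce a non-flat distribution in place of this isotropic baseline.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)

def isotropic_dirs(n):
    # n unit vectors drawn uniformly on the sphere
    u = rng.uniform(-1.0, 1.0, n)            # cosine of polar angle
    phi = rng.uniform(0.0, 2 * np.pi, n)
    s = np.sqrt(1.0 - u**2)
    return np.column_stack((s * np.cos(phi), s * np.sin(phi), u))

# synthetic "events" of 2-4 neutrons each; an event-by-event fission
# model would supply correlated directions here instead
cos_th = []
for _ in range(2000):
    dirs = isotropic_dirs(int(rng.integers(2, 5)))
    for a, b in combinations(dirs, 2):       # all pairs within one event
        cos_th.append(float(np.dot(a, b)))

hist, edges = np.histogram(cos_th, bins=20, range=(-1.0, 1.0))
```

Swapping the synthetic generator for real event-by-event model output changes only the direction source; the pair-correlation bookkeeping is unchanged.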

  17. LIBMAKER

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-08-01

    Version 00 COG LibMaker contains various utilities to convert common data formats into a format usable by the COG Multi-particle Monte Carlo Code System package (C00777MNYCP01). Utilities included: ACEtoCOG - ACE formatted neutron data: currently ENDFB7R0.BNL, ENDFB7R1.BNL, JEFF3.1, JEFF3.1.1, JEFF3.1.2, MCNP.50c, MCNP.51c, MCNP.55c, MCNP.66c, and MCNP.70c. ACEUtoCOG - ACEU formatted photonuclear data: currently PN.MCNP.30c and PN.MCNP.70u. ACTLtoCOG - Creates a COG library from ENDL formatted activation data. EDDLtoCOG - Creates a COG library from ENDL formatted LLNL deuteron data. ENDLtoCOG - Creates a COG library from ENDL formatted LLNL neutron data. EPDLtoCOG - Creates a COG library from ENDL formatted LLNL photon data. LEX - Creates a COG dictionary file. SAB.ACEtoCOG - Creates a COG library from ACE formatted S(a,b) data. SABtoCOG - Creates a COG library from ENDF6 formatted S(a,b) data. URRtoCOG - Creates a COG library from ACE formatted probability table data. This package also includes library checking and bit-swapping capability.

  18. Simulation of decay processes and radiation transport times in radioactivity measurements

    NASA Astrophysics Data System (ADS)

    García-Toraño, E.; Peyres, V.; Bé, M.-M.; Dulieu, C.; Lépy, M.-C.; Salvat, F.

    2017-04-01

    The Fortran subroutine package PENNUC, which simulates random decay pathways of radioactive nuclides, is described. The decay scheme of the active nuclide is obtained from the NUCLEIDE database, whose web application has been complemented with the option of exporting nuclear decay data (possible nuclear transitions, branching ratios, type and energy of emitted particles) in a format that is readable by the simulation subroutines. In the case of beta emitters, the initial energy of the electron or positron is sampled from the theoretical Fermi spectrum. De-excitation of the atomic electron cloud following electron capture and internal conversion is described using transition probabilities from the LLNL Evaluated Atomic Data Library and empirical or calculated energies of released X rays and Auger electrons. The time evolution of radiation showers is determined by considering the lifetimes of nuclear and atomic levels, as well as radiation propagation times. Although PENNUC is designed to operate independently, here it is used in conjunction with the electron-photon transport code PENELOPE, and both together allow the simulation of experiments with radioactive sources in complex material structures consisting of homogeneous bodies limited by quadric surfaces. The reliability of these simulation tools is demonstrated through comparisons of simulated and measured energy spectra from radionuclides with complex multi-gamma spectra, nuclides with metastable levels in their decay pathways, nuclides with two daughters, and beta plus emitters.
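The random decay-pathway selection PENNUC performs can be illustrated with a minimal sketch (a hypothetical three-branch scheme, not NUCLEIDE data and not the package's Fortran implementation): each simulated decay picks a branch by inverting the cumulative distribution of branching ratios with one uniform deviate.

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical decay scheme: branch label -> branching ratio (sums to 1)
branches = {"beta- to level 1": 0.70,
            "beta- to level 2": 0.20,
            "beta- to ground":  0.10}

labels = list(branches)
cdf = np.cumsum(list(branches.values()))   # cumulative branching ratios

def sample_branch():
    # inverse-CDF sampling: first index whose cumulative sum exceeds r
    return labels[int(np.searchsorted(cdf, rng.random()))]

counts = {k: 0 for k in labels}
for _ in range(100_000):
    counts[sample_branch()] += 1
```

With many samples the observed branch frequencies converge to the branching ratios; a full decay simulation then repeats this selection at each level of the de-excitation cascade.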

  19. Characterization of UMT2013 Performance on Advanced Architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Howell, Louis

    2014-12-31

    This paper presents part of a larger effort to make detailed assessments of several proxy applications on various advanced architectures, with the eventual goal of extending these assessments to codes of programmatic interest running more realistic simulations. The focus here is on UMT2013, a proxy implementation of deterministic transport for unstructured meshes. I present weak and strong MPI scaling results and studies of OpenMP efficiency on the Sequoia BG/Q system at LLNL, with comparison against similar tests on an Intel Sandy Bridge TLCC2 system. The hardware counters on BG/Q provide detailed information on many aspects of on-node performance, while information from the mpiP tool gives insight into the reasons for the differing scaling behavior on these two different architectures. Preliminary tests that exploit NVRAM as extended memory on an Ivy Bridge machine designed for “Big Data” applications are also included.

  20. Trapped Electron Instability of Electron Plasma Waves: Vlasov simulations and theory

    NASA Astrophysics Data System (ADS)

    Berger, Richard; Chapman, Thomas; Brunner, Stephan

    2013-10-01

    The growth of sidebands of a large-amplitude electron plasma wave is studied with Vlasov simulations for a range of amplitudes (0.001 < eφ0/Te < 1) and wavenumbers (0.25

  1. Multiscale Models for the Two-Stream Instability

    NASA Astrophysics Data System (ADS)

    Joseph, Ilon; Dimits, Andris; Banks, Jeffrey; Berger, Richard; Brunner, Stephan; Chapman, Thomas

    2017-10-01

    Interpenetrating streams of plasma found in many important scenarios in nature and in the laboratory can develop kinetic two-stream instabilities that exchange momentum and energy between the streams. A quasilinear model for the electrostatic two-stream instability is under development as a component of a multiscale model that couples fluid simulations to kinetic theory. Parameters of the model will be validated by comparison with full kinetic simulations using LOKI, and efficient strategies for numerical solution of the quasilinear model and for coupling to the fluid model will be discussed. Extending the kinetic models into the collisional regime requires an efficient treatment of the collision operator. Useful reductions of the collision operator relative to the full multi-species Landau-Fokker-Planck operator are being explored. These are further motivated both by careful consideration of the parameter orderings relevant to two-stream scenarios and by the particular 2D+2V phase space used in the LOKI code. Prepared for US DOE by LLNL under Contract DE-AC52-07NA27344 and LDRD project 17-ERD-081.
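For context, the linear instability underlying this quasilinear model has a closed form in the cold-beam limit. The sketch below (a textbook calculation, ours, not the LOKI or quasilinear model itself) evaluates the growth rate of two symmetric counter-streaming cold beams from the dispersion relation 1 = ωp²/(ω − kv0)² + ωp²/(ω + kv0)²: modes with k v0 > √2 ωp are stable, and the peak growth rate ωp/2 occurs at k v0 = (√3/2) ωp.

```python
import numpy as np

# cold, symmetric counter-streaming beams, normalized so omega_p = v0 = 1;
# the quartic in omega reduces to a quadratic in omega^2
def growth_rate(k):
    a = k**2
    x_minus = a + 1.0 - np.sqrt(1.0 + 4.0 * a)   # lower root of omega^2
    return float(np.sqrt(-x_minus)) if x_minus < 0 else 0.0

ks = np.linspace(0.05, 2.0, 40)
gammas = [growth_rate(k) for k in ks]
# unstable band: k v0 < sqrt(2); peak gamma = 1/2 at k v0 = sqrt(3)/2
```

A quasilinear model evolves the velocity distribution under this spectrum of growing modes; the cold-limit curve above is the starting point it must reproduce.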

  2. Development of an EMC3-EIRENE Synthetic Imaging Diagnostic

    NASA Astrophysics Data System (ADS)

    Meyer, William; Allen, Steve; Samuell, Cameron; Lore, Jeremy

    2017-10-01

    2D and 3D flow measurements are critical for validating numerical codes such as EMC3-EIRENE. Toroidal symmetry assumptions preclude tomographic reconstruction of 3D flows from single camera views. In addition, the resolution of the grids used in numerical models can easily surpass the resolution of physical camera diagnostic geometries. For these reasons we have developed a Synthetic Imaging Diagnostic capability for forward-projection comparisons of EMC3-EIRENE model solutions with the line-integrated images from the Doppler Coherence Imaging diagnostic on DIII-D. The forward projection matrix is 2.8 Mpixels by 6.4 Mcells for the non-axisymmetric case we present. For flow comparisons, both simple line-integral and field-aligned-component matrices must be calculated. The calculation of these matrices is a large, embarrassingly parallel problem, performed with a custom dispatcher that allows processing platforms to join mid-problem as they become available, or drop out if resources are needed for higher-priority tasks. The matrices are handled using standard sparse-matrix techniques. Prepared by LLNL under Contract DE-AC52-07NA27344. This material is based upon work supported by the U.S. DOE, Office of Science, Office of Fusion Energy Sciences. LLNL-ABS-734800.
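The forward-projection step described above can be sketched in a few lines (a toy geometry, not the DIII-D camera model): each detector pixel's chord is sampled along its length, path-length weights are accumulated into a sparse matrix, and the synthetic image is then a single sparse matrix-vector product against the model emissivity.

```python
import numpy as np
from scipy.sparse import coo_matrix

nx = ny = 32                      # toy emission grid on the unit square
dx = dy = 1.0 / nx
npix = 16                         # one chord per detector pixel

rows, cols, vals = [], [], []
ds = dx / 4                       # sampling step along each chord
for p in range(npix):
    # hypothetical chord geometry: horizontal rays at different heights
    y = (p + 0.5) / npix
    s = np.arange(0.0, 1.0, ds)
    ix = np.minimum((s / dx).astype(int), nx - 1)
    iy = np.full_like(ix, min(int(y / dy), ny - 1))
    for i, j in zip(ix, iy):      # accumulate path length in each cell
        rows.append(p); cols.append(j * nx + i); vals.append(ds)

A = coo_matrix((vals, (rows, cols)), shape=(npix, nx * ny)).tocsr()
emissivity = np.ones(nx * ny)     # model solution flattened to a vector
image = A @ emissivity            # line-integrated synthetic image
```

Because the geometry matrix is built once, comparing many model solutions against measured images costs only one sparse product each, which is what makes the expensive matrix construction worth parallelizing.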

  3. Numerical Investigation of the Consequences of Land Impacts, Water Impacts, or Air Bursts of Asteroids

    NASA Astrophysics Data System (ADS)

    Ezzedine, S. M.; Dearborn, D. S.; Miller, P. L.

    2015-12-01

    The annual probability of an asteroid impact is low, but over time, such catastrophic events are inevitable. Interest in assessing the impact consequences has led us to develop a physics-based framework to seamlessly simulate an event from entry to impact, including air and water shock propagation and wave generation. The non-linear effects are simulated using the hydrodynamics code GEODYN. As effects propagate outward, they become a wave source for the linear elastic-wave propagation code WPP/WWP. The GEODYN-WPP/WWP coupling is based on the structured adaptive-mesh-refinement infrastructure SAMRAI, and has been used in FEMA table-top exercises conducted in 2013 and 2014, and more recently, the 2015 Planetary Defense Conference exercise. Results from these simulations provide an estimate of onshore effects and can inform more sophisticated inundation models. The capabilities of this methodology are illustrated by providing results for different impact locations and an exploration of the effect of asteroid size on the waves arriving at the shoreline of area cities. We constructed the maximum and minimum envelopes of water-wave heights given the size of the asteroid and the location of the impact along the risk corridor. Such profiles can inform emergency response and disaster-mitigation efforts, and may be used for the design of maritime protection or assessment of risk to shoreline structures of interest. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LLNL-ABS-675390-DRAFT.

  4. Monte Carlo treatment planning for molecular targeted radiotherapy within the MINERVA system

    NASA Astrophysics Data System (ADS)

    Lehmann, Joerg; Hartmann Siantar, Christine; Wessol, Daniel E.; Wemple, Charles A.; Nigg, David; Cogliati, Josh; Daly, Tom; Descalle, Marie-Anne; Flickinger, Terry; Pletcher, David; DeNardo, Gerald

    2005-03-01

    The aim of this project is to extend accurate and patient-specific treatment planning to new treatment modalities, such as molecular targeted radiation therapy, incorporating previously crafted and proven Monte Carlo and deterministic computation methods. A flexible software environment is being created that allows planning radiation treatment for these new modalities and combining different forms of radiation treatment with consideration of biological effects. The system uses common input interfaces, medical image sets for definition of patient geometry and dose reporting protocols. Previously, the Idaho National Engineering and Environmental Laboratory (INEEL), Montana State University (MSU) and Lawrence Livermore National Laboratory (LLNL) had accrued experience in the development and application of Monte Carlo based, three-dimensional, computational dosimetry and treatment planning tools for radiotherapy in several specialized areas. In particular, INEEL and MSU have developed computational dosimetry systems for neutron radiotherapy and neutron capture therapy, while LLNL has developed the PEREGRINE computational system for external beam photon-electron therapy. Building on that experience, the INEEL and MSU are developing the MINERVA (modality inclusive environment for radiotherapeutic variable analysis) software system as a general framework for computational dosimetry and treatment planning for a variety of emerging forms of radiotherapy. In collaboration with this development, LLNL has extended its PEREGRINE code to accommodate internal sources for molecular targeted radiotherapy (MTR), and has interfaced it with the plugin architecture of MINERVA. Results from the extended PEREGRINE code have been compared to published data from other codes, and found to be in general agreement (EGS4—2%, MCNP—10%) (Descalle et al 2003 Cancer Biother. Radiopharm. 18 71-9). The code is currently being benchmarked against experimental data. 
The interpatient variability of the drug pharmacokinetics in MTR can only be properly accounted for by image-based, patient-specific treatment planning, as has been common in external beam radiation therapy for many years. MINERVA offers 3D Monte Carlo-based MTR treatment planning as its first integrated operational capability. The new MINERVA system will ultimately incorporate capabilities for a comprehensive list of radiation therapies. In progress are modules for external beam photon-electron therapy and boron neutron capture therapy (BNCT). Brachytherapy and proton therapy are planned. Through the open application programming interface (API), other groups can add their own modules and share them with the community.

  5. Ideal MHD Stability and Characteristics of Edge Localized Modes on CFETR

    NASA Astrophysics Data System (ADS)

    Li, Zeyu; Chan, Vincent; Xu, Xueqiao; Wang, Xiaogang; Cfetr Physics Team

    2017-10-01

    An investigation of the equilibrium operating regime, ideal magnetohydrodynamic (MHD) stability, and edge-localized-mode (ELM) characteristics is performed for the China Fusion Engineering Test Reactor (CFETR). The CFETR operating-regime study starts with a baseline scenario derived from multi-code integrated modeling, with key parameters varied to build a systematic database. These parameters, under profile and pedestal constraints, provide the foundation for engineering design. The linear stability of low-n and intermediate-n peeling-ballooning modes for the CFETR baseline scenario is analyzed. Multi-code benchmarking, including GATO, ELITE, BOUT++ and NIMROD, demonstrated good agreement in predicting instabilities. Nonlinear behavior of ELMs for the baseline scenario is simulated using BOUT++. Instabilities are found both at the pedestal top and inside the pedestal region, which lead to a mix of grassy and type-I ELMs. Pedestal structures extending inward beyond the pedestal top are also varied to study their influence on ELM characteristics. Preliminary results on the dependence of the type-I ELM divertor heat-load scaling on machine size and pedestal pressure will also be presented. Prepared by LLNL under Contract DE-AC52-07NA27344 and the National Magnetic Confinement Fusion Research Program of China (Grant Nos. 2014GB110003 and 2014GB107004).

  6. Optimizing Dense Plasma Focus Neutron Yields With Fast Gas Jets

    NASA Astrophysics Data System (ADS)

    McMahon, Matthew; Stein, Elizabeth; Higginson, Drew; Kueny, Christopher; Link, Anthony; Schmidt, Andrea

    2017-10-01

    We report a study using the particle-in-cell code LSP to perform fully kinetic simulations modeling dense plasma focus (DPF) devices with high-density gas jets on axis. The high-density jets are modeled in the large-eddy Navier-Stokes code CharlesX, which is suitable for modeling both subsonic and supersonic gas flow. The gas pattern, which is essentially static on z-pinch time scales, is imported from CharlesX to LSP for neutron yield predictions. Fast gas puffs allow for more mass on axis while maintaining the optimal pressure for the DPF. As the density of a subsonic jet increases relative to the background fill, we find the neutron yield increases, as does the variability in the neutron yield. Introducing perturbations in the jet density via supersonic flow (also known as Mach diamonds) allows for consistent seeding of the m = 0 instability, leading to more consistent ion acceleration and higher neutron yields with less variability. Jets with higher on-axis density are found to have the greatest yield. The optimal jet configuration and the necessary jet conditions for increasing neutron yield and reducing yield variability are explored. Simulations of realistic jet profiles are performed and compared to the ideal scenario. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and supported by the Laboratory Directed Research and Development Program (15-ERD-034) at LLNL.

  7. Kinetic Simulations of Dense Plasma Focus Breakdown

    NASA Astrophysics Data System (ADS)

    Schmidt, A.; Higginson, D. P.; Jiang, S.; Link, A.; Povilus, A.; Sears, J.; Bennett, N.; Rose, D. V.; Welch, D. R.

    2015-11-01

    A dense plasma focus (DPF) device is a type of plasma gun that drives current through a set of coaxial electrodes to assemble gas inside the device and then implode that gas on axis to form a Z-pinch. This implosion drives hydrodynamic and kinetic instabilities that generate strong electric fields, which produce a short, intense pulse of x-rays, high-energy (>100 keV) electrons and ions, and (in deuterium gas) neutrons. A strong factor in pinch performance is the initial breakdown and ionization of the gas along the insulator surface separating the two electrodes. The smoothness and isotropy of this ionized sheath are imprinted on the current sheath that travels along the electrodes, making it an important portion of the DPF to both understand and optimize. Here we use kinetic simulations in the particle-in-cell code LSP to model the breakdown. Simulations are initiated with neutral gas, and the breakdown is modeled self-consistently as driven by a charged capacitor system. We also investigate novel geometries for the insulator and electrodes to attempt to control the electric-field profile. The initial ionization fraction of the gas is explored computationally to gauge possible advantages of pre-ionization, which could be created experimentally via lasers or a glow discharge. Prepared by LLNL under Contract DE-AC52-07NA27344.

  8. 2D Kinetic Particle in Cell Simulations of a Shear-Flow Stabilized Z-Pinch

    NASA Astrophysics Data System (ADS)

    Tummel, Kurt; Higginson, Drew; Schmidt, Andrea; Link, Anthony; McLean, Harry; Shumlak, Uri; Nelson, Brian; Golingo, Raymond; Claveau, Elliot; Lawrence Livermore National Lab Team; University of Washington Team

    2016-10-01

    The Z-pinch is a relatively simple and attractive potential fusion reactor design, but attempts to develop such a reactor have consistently struggled to overcome Z-pinch instabilities. The ``sausage'' and ``kink'' modes are among the most robust and prevalent Z-pinch instabilities, but theory and simulations suggest that axial flow shear, dvz/dr ≠ 0, can suppress these modes. Experiments have confirmed that Z-pinch plasmas with embedded axial flow shear display a significantly enhanced resilience to the sausage and kink modes at a demonstration current of 50 kA. A new experiment is under way to test the concept at higher current, and efforts to model these plasmas are being expanded. The performance and stability of these devices will depend on features like the plasma viscosity, anomalous resistivity, and finite-Larmor-radius effects, which are most accurately characterized in kinetic models. To predict these features, kinetic simulations using the particle-in-cell code LSP are now in development, and initial benchmarking and 2D stability analyses of the sausage mode are presented here. These results represent the first kinetic modeling of the flow-shear-stabilized Z-pinch. This work is funded by the USDOE/ARPA-E ALPHA Program. Prepared by LLNL under Contract DE-AC52-07NA27344.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, C. V.; Mendez, A. J.

    This was a collaborative effort between Lawrence Livermore National Security, LLC (formerly The Regents of the University of California)/Lawrence Livermore National Laboratory (LLNL) and Mendez R & D Associates (MRDA) to develop and demonstrate a reconfigurable and cost effective design for optical code division multiplexing (O-CDM) with high spectral efficiency and throughput, as applied to the field of distributed computing, including multiple accessing (sharing of communication resources) and bidirectional data distribution in fiber-to-the-premise (FTTx) networks.

  10. Spherical Harmonic Solutions to the 3D Kobayashi Benchmark Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, P.N.; Chang, B.; Hanebutte, U.R.

    1999-12-29

    Spherical harmonic solutions of order 5, 9 and 21 on spatial grids containing up to 3.3 million cells are presented for the Kobayashi benchmark suite. This suite of three problems, each with a simple geometry of a pure absorber containing a large void region, was proposed by Professor Kobayashi at an OECD/NEA meeting in 1996. Each of the three problems contains a source, a void and a shield region. Problem 1 can best be described as a box-in-a-box problem, where a source region is surrounded by a square void region which is itself embedded in a square shield region. Problems 2 and 3 represent a shield with a void duct: problem 2 has a straight duct and problem 3 a dog-leg-shaped duct. A pure-absorber and a 50% scattering case are considered for each of the three problems. The solutions have been obtained with Ardra, a scalable, parallel neutron transport code developed at Lawrence Livermore National Laboratory (LLNL). The Ardra code takes advantage of a two-level parallelization strategy, which combines message passing between processing nodes and thread-based parallelism amongst processors on each node. All calculations were performed on the IBM ASCI Blue-Pacific computer at LLNL.

  11. Using SW4 for 3D Simulations of Earthquake Strong Ground Motions: Application to Near-Field Strong Motion, Building Response, Basin Edge Generated Waves and Earthquakes in the San Francisco Bay Area

    NASA Astrophysics Data System (ADS)

    Rodgers, A. J.; Pitarka, A.; Petersson, N. A.; Sjogreen, B.; McCallen, D.; Miah, M.

    2016-12-01

    Simulation of earthquake ground motions is becoming more widely used due to improvements in numerical methods, development of ever more efficient computer programs (codes), and growth in and access to High-Performance Computing (HPC). We report on how SW4 can be used for accurate and efficient simulations of earthquake strong motions. SW4 is an anelastic finite difference code based on a fourth-order summation-by-parts displacement formulation. It is parallelized and can run on one or many processors. SW4 has many desirable features for seismic strong motion simulation: incorporation of surface topography; automatic mesh generation; mesh refinement; attenuation; and supergrid boundary conditions. It also has several ways to introduce 3D models and sources (including the Standard Rupture Format for extended sources). We are using SW4 to simulate strong ground motions for several applications. We are performing parametric studies of near-fault motions from moderate earthquakes to investigate basin-edge-generated waves, and from large earthquakes to provide motions to engineers studying building response. We show that 3D propagation near basin edges can generate significant amplifications relative to 1D analysis. SW4 is also being used to model earthquakes in the San Francisco Bay Area. This includes modeling moderate (M3.5-5) events to evaluate the United States Geological Survey's 3D model of regional structure, as well as strong motions from the 2014 South Napa earthquake and possible large scenario events. Recently SW4 was built on a Commodity Technology Systems-1 (CTS-1) machine at LLNL, one of the new systems for capacity computing at the DOE National Labs. We find SW4 scales well and runs faster on these systems compared to the previous generation of Linux clusters.
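As a small illustration of what "fourth order" means for a finite difference code, here is a self-contained sketch (an interior central stencil on a periodic grid, ours rather than SW4's summation-by-parts operators, which differ near boundaries): doubling the resolution should reduce the derivative error by roughly 2⁴ = 16.

```python
import numpy as np

def d4(f, h):
    # 4th-order central first derivative with periodic wrap-around
    return (-np.roll(f, -2) + 8 * np.roll(f, -1)
            - 8 * np.roll(f, 1) + np.roll(f, 2)) / (12 * h)

errs = []
for n in (64, 128):
    x = np.linspace(0, 2 * np.pi, n, endpoint=False)
    h = x[1] - x[0]
    # derivative of sin is cos; record the max pointwise error
    errs.append(float(np.max(np.abs(d4(np.sin(x), h) - np.cos(x)))))

ratio = errs[0] / errs[1]   # ~16 for a 4th-order scheme
```

This convergence rate is why higher-order schemes need far fewer grid points per seismic wavelength than second-order ones for the same accuracy, which matters at the problem sizes SW4 targets.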

  12. Does laser-driven heat front propagation depend on material microstructure?

    NASA Astrophysics Data System (ADS)

    Colvin, J. D.; Matsukuma, H.; Fournier, K. B.; Yoga, A.; Kemp, G. E.; Tanaka, N.; Zhang, Z.; Kota, K.; Tosaki, S.; Ikenouchi, T.; Nishimura, H.

    2016-10-01

    We showed earlier that the laser-driven heat front propagation velocity in low-density Ti-silica aerogel and TiO2 foam targets was slower than that simulated with a 2D radiation-hydrodynamics code incorporating an atomic kinetics model in non-LTE and assuming initially homogeneous material. Some theoretical models suggest that the heat front is slowed over what it would be in a homogeneous medium by the microstructure of the foam. In order to test this hypothesis we designed and conducted a comparison experiment on the GEKKO laser to measure heat front propagation velocity in two targets, one an Ar/CO2 gas mixture and the other a TiO2 foam, that had identical initial densities and average ionization states. We found that the heat front traveled about ten times faster in the gas than in the foam. We present the details of the experiment design and a comparison of the data with the simulations. This work was performed under the auspices of the U.S. Department of Energy by LLNL under Contract No. DE-AC52-07NA27344, and the joint research project of ILE Osaka U. (contract Nos. 2014A1-04 and 2015A1-02).

  13. Impact response of US Army and National Football League helmet pad systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moss, W C; King, M J

    Lawrence Livermore National Laboratory [LLNL] was tasked to compare the impact response of NFL helmet pad systems and U.S. Army pad systems compatible with an Advanced Combat Helmet [ACH] at impact velocities up to 20 ft/s. This was a one-year study funded by the U.S. Army and JIEDDO. The Army/JIEDDO point of contact is COL R. Todd Dombroski, DO, JIEDDO Surgeon. LLNL was chosen by committee to perform the research based on prior published computational studies of the mechanical response of helmets and skulls to blast. Our collaborators include the U.S. Army Aeromedical Research Laboratory [USAARL] (a DoD laboratory responsible for impact testing helmets), Team Wendy and Oregon Aero (current and former ACH pad manufacturers), Riddell and Xenith (NFL pad manufacturers), and d3o (general purpose sports pad manufacturer). The manufacturer-supplied pad systems that were studied are shown in the figure below. The first two are the Army systems, which are bilayer foam pads with both hard and soft foam and a water-resistant airtight wrapper (Team Wendy) or a water-resistant airtight coating (Oregon Aero). The next two are NFL pad systems. The Xenith system consists of a thin foam pad and a hollow air-filled cylinder that elastically buckles under load. The Riddell system is a bilayer foam pad that is encased in an inflatable airbag with relief channels to neighboring pads in the helmet. The inflatable airbag is for comfort and provides no enhancement to impact mitigation. The d3o system consists of a rate-sensitive homogeneous dense foam. LLNL performed experiments to characterize the material properties of the individual foam materials and the response of the complete pad systems, to obtain parameters needed for the simulations. LLNL also performed X-ray CT scans of an ACH helmet shell that were used to construct a geometrically accurate computational model of the helmet. Two complementary sets of simulations were performed. 
The first set of simulations reproduced the experimental helmet impact certification tests performed by USAARL, who provided data for comparison. The goal of this set of simulations was to demonstrate the overall validity of LLNL's computational analyses and methods and understand the general physics of helmet impacts. In these tests and the corresponding simulations, an inverted ACH containing pads and a head-form are dropped onto a hemispherical anvil, at 10 and 14.14 ft/s impact velocities. The simulations predicted peak accelerations (the metric used by USAARL for comparing the performance of pad systems), rebound velocities, and impact durations consistent with the experimental data, thus demonstrating the validity and relevance of the simulation methods. Because the NFL pad systems are approximately double the thickness of the U.S. Army pads, they do not fit into the ACH. As a result, the NFL pads could not be simply placed into an ACH shell in either a simulation or an experiment without modifying their size and shape. Since impact mitigation depends critically on the available stopping distance and the area over which the stopping force is applied, it is important to consider identically shaped pads in order to compare their performance in a fair and meaningful manner. Consequently, the second set of simulations utilized a simplified simulation geometry consisting of a 5 kg cylindrical impactor (equal in mass to a head) striking equally sized pads from each manufacturer. The simulated bilayer foam pads had the same proportions of hard and soft foam as the actual pad systems, while the Xenith pads were simulated as a bilayer foam pad with material properties adjusted to give the same response as the actual Xenith pads. The effects of trapped air were included in the simulations of the Team Wendy and Oregon Aero pads. All simulations used material properties derived from the experiments conducted at LLNL. 
    The acceleration history of the center of mass of the impactor was used to calculate the Head Injury Criterion (HIC) for each simulation, to assess the pad performance. The HIC is a well-established metric that combines both acceleration and duration of impact to assess the danger of injury, and is a more robust measure than peak acceleration. Our key findings are: (1) The performance of a pad depends on the range of impact velocities. At lower impact velocity, softer pads perform better. At higher impact velocity, harder pads perform better; (2) Thicker pads perform better at all velocities, but especially at high velocities; and (3) For comparable thicknesses, neither the NFL systems nor the Oregon Aero pads outperform the Team Wendy pads currently used in the ACH system in militarily-relevant impact scenarios (impact speeds less than 20 ft/s). The second finding suggests a commercial off-the-shelf solution for mitigating impact-related traumatic brain injury to soldiers.
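
    The Head Injury Criterion mentioned above has a standard published definition: the maximum, over time windows [t1, t2], of (t2 - t1) times the 2.5 power of the window-averaged acceleration in g. A minimal sketch follows, assuming the 15 ms window of the common HIC15 variant; this is just the textbook formula applied to sampled data, not LLNL's actual analysis code.

```python
import numpy as np

def hic(t, a, max_window=0.015):
    """Head Injury Criterion for a sampled acceleration history.

    t : sample times in seconds; a : acceleration magnitudes in g.
    HIC = max over windows [t1, t2] of (t2 - t1) * (mean accel)^2.5,
    restricted here to windows up to 15 ms (the HIC15 convention).
    Brute-force search over all sample-pair windows.
    """
    best = 0.0
    for i in range(len(t)):
        for j in range(i + 1, len(t)):
            dt = t[j] - t[i]
            if dt > max_window:
                break
            # trapezoidal-rule average acceleration over the window
            avg = np.sum(0.5 * (a[i:j] + a[i + 1:j + 1]) * np.diff(t[i:j + 1])) / dt
            best = max(best, dt * avg ** 2.5)
    return best

# constant 100 g pulse lasting 10 ms: HIC = 0.010 * 100**2.5 = 1000
t = np.linspace(0.0, 0.010, 101)
print(round(hic(t, np.full_like(t, 100.0))))
```

    The brute-force window search is O(n^2) but makes the definition transparent; production crash-test codes use the same formula with faster search strategies.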

  14. Laboratory Directed Research and Development FY2011 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craig, W; Sketchley, J; Kotta, P

    2012-03-22

    A premier applied-science laboratory, Lawrence Livermore National Laboratory (LLNL) has earned a reputation as a leader in providing science and technology solutions to the most pressing national and global security problems. The LDRD Program, established by Congress at all DOE national laboratories in 1991, is LLNL's single most important resource for fostering excellent science and technology for today's needs and tomorrow's challenges. LDRD, LLNL's internally directed research and development funding, enables high-risk, potentially high-payoff projects at the forefront of science and technology. The LDRD Program at Livermore serves to: (1) Support the Laboratory's missions, strategic plan, and foundational science; (2) Maintain the Laboratory's science and technology vitality; (3) Promote recruiting and retention; (4) Pursue collaborations; (5) Generate intellectual property; and (6) Strengthen the U.S. economy. Myriad LDRD projects over the years have made important contributions to every facet of the Laboratory's mission and strategic plan, including its commitment to nuclear, global, and energy and environmental security, as well as cutting-edge science, technology, and engineering in high-energy-density matter, high-performance computing and simulation, materials and chemistry at the extremes, information systems, measurements and experimental science, and energy manipulation. A summary of each project was submitted by the principal investigator. Project summaries include the scope, motivation, goals, relevance to DOE/NNSA and LLNL mission areas, the technical progress achieved in FY11, and a list of publications that resulted from the research. 
    The projects are: (1) Nuclear Threat Reduction; (2) Biosecurity; (3) High-Performance Computing and Simulation; (4) Intelligence; (5) Cybersecurity; (6) Energy Security; (7) Carbon Capture; (8) Material Properties, Theory, and Design; (9) Radiochemistry; (10) High-Energy-Density Science; (11) Laser Inertial-Fusion Energy; (12) Advanced Laser Optical Systems and Applications; (13) Space Security; (14) Stockpile Stewardship Science; (15) National Security; (16) Alternative Energy; and (17) Climate Change.

  15. Modeling the effect of laser heating on the strength and failure of 7075-T6 aluminum

    DOE PAGES

    Florando, J. N.; Margraf, J. D.; Reus, J. F.; ...

    2015-06-06

    The effect of rapid laser heating on the response of 7075-T6 aluminum has been characterized using 3-D digital image correlation and a series of thermocouples. The experimental results indicate that as the samples are held under a constant load, the heating from the laser profile causes non-uniform temperature and strain fields, and the strain rate increases dramatically as the sample nears failure. Simulations have been conducted using the LLNL multi-physics code ALE3D and compared to the experiments. The strength and failure of the material were modeled using the Johnson–Cook strength and damage models. Here, in order to capture the response, a dual-condition criterion was utilized in which one set of parameters was calibrated to low-temperature, quasi-static strain rate data, while the other set was calibrated to high-temperature, high strain rate data. The thermal effects were captured using temperature-dependent thermal constants and invoking thermal transport with conduction, convection, and thermal radiation.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The plpdfa software is a product of an LDRD project at LLNL entitled "Adaptive Sampling for Very High Throughput Data Streams" (tracking number 11-ERD-035). The software was developed by a graduate student summer intern, Chris Challis, who worked under project PI Dan Merl during the summer of 2011. The source code implements a statistical analysis technique for clustering and classification of text-valued data. The method had been previously published by the PI in the open literature.

  17. Progress report on Nuclear Density project with Lawrence Livermore National Lab Year 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, C W; Krastev, P; Ormand, W E

    2011-03-11

    The main goal for year 2010 was to improve parallelization of the configuration interaction code BIGSTICK, co-written by W. Erich Ormand (LLNL) and Calvin W. Johnson (SDSU), with the parallelization carried out primarily by Plamen Krastev, a postdoc at SDSU and funded in part by this grant. The central computational algorithm is the Lanczos algorithm, which consists of a matrix-vector multiplication (matvec), followed by a Gram-Schmidt reorthogonalization.
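
    The Lanczos cycle described above (a matvec followed by Gram-Schmidt reorthogonalization) can be sketched with dense NumPy arrays. This is a generic serial illustration under simple assumptions (symmetric operator, full reorthogonalization), not BIGSTICK's parallel implementation.

```python
import numpy as np

def lanczos(matvec, n, k, seed=0):
    """k-step Lanczos iteration with full Gram-Schmidt reorthogonalization.

    matvec : function returning A @ v for a symmetric n x n operator A.
    Returns the k x k tridiagonal matrix T and the Lanczos vectors Q; the
    eigenvalues of T (Ritz values) approximate extremal eigenvalues of A.
    Each step is one matvec followed by reorthogonalization, the same
    basic cycle described for BIGSTICK above.
    """
    rng = np.random.default_rng(seed)
    Q = np.zeros((n, k))
    alpha, beta = np.zeros(k), np.zeros(k - 1)
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    for j in range(k):
        Q[:, j] = q
        w = matvec(q)                              # the expensive matvec step
        alpha[j] = q @ w
        # Gram-Schmidt reorthogonalization against all previous vectors
        w -= Q[:, :j + 1] @ (Q[:, :j + 1].T @ w)
        if j < k - 1:
            beta[j] = np.linalg.norm(w)
            q = w / beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    return T, Q

# tiny symmetric test: with k = n the Ritz values match A's spectrum
A = np.diag([1.0, 2.0, 3.0, 10.0])
T, Q = lanczos(lambda v: A @ v, n=4, k=4)
print(round(float(np.linalg.eigvalsh(T).max()), 6))   # largest Ritz value
```

    In a production CI code the matvec is the dominant cost and is applied to a sparse, distributed Hamiltonian; the reorthogonalization step is what the Gram-Schmidt phase above stands in for.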

  18. Porting AMG2013 to Heterogeneous CPU+GPU Nodes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samfass, Philipp

    LLNL's future advanced technology system SIERRA will feature heterogeneous compute nodes that consist of IBM Power9 CPUs and NVIDIA Volta GPUs. Conceptually, the motivation for such an architecture is quite straightforward: while GPUs are optimized for throughput on massively parallel workloads, CPUs strive to minimize latency for rather sequential operations. Yet, making optimal use of heterogeneous architectures raises new challenges for the development of scalable parallel software, e.g., with respect to work distribution. Porting LLNL's parallel numerical libraries to upcoming heterogeneous CPU+GPU architectures is therefore a critical factor for ensuring LLNL's future success in fulfilling its national mission. One of these libraries, called HYPRE, provides parallel solvers and preconditioners for large, sparse linear systems of equations. In the context of this internship project, I consider AMG2013, a proxy application for major parts of HYPRE that implements a benchmark for setting up and solving different systems of linear equations. In the following, I describe in detail how I ported multiple parts of AMG2013 to the GPU (Section 2) and present results for different experiments that demonstrate a successful parallel implementation on the heterogeneous machines surface and ray (Section 3). In Section 4, I give guidelines on how my code should be used. Finally, I conclude and give an outlook for future work (Section 5).

  19. Optimizing LX-17 Thermal Decomposition Model Parameters with Evolutionary Algorithms

    NASA Astrophysics Data System (ADS)

    Moore, Jason; McClelland, Matthew; Tarver, Craig; Hsu, Peter; Springer, H. Keo

    2017-06-01

    We investigate and model the cook-off behavior of LX-17 because this knowledge is critical to understanding system response in abnormal thermal environments. Thermal decomposition of LX-17 has been explored in conventional ODTX (One-Dimensional Time-to-eXplosion), PODTX (ODTX with pressure measurement), TGA (thermogravimetric analysis), and DSC (differential scanning calorimetry) experiments using varied temperature profiles. These experimental data are the basis for developing multiple reaction schemes with coupled mechanics in LLNL's multi-physics hydrocode, ALE3D (Arbitrary Lagrangian-Eulerian code in 2D and 3D). We employ evolutionary algorithms to optimize reaction rate parameters on high performance computing clusters. Once experimentally validated, this model will be scalable to a number of applications involving LX-17 and can be used to develop more sophisticated experimental methods. Furthermore, the optimization methodology developed herein should be applicable to other high explosive materials. This work was performed under the auspices of the U.S. DOE by LLNL under contract DE-AC52-07NA27344.
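
    The evolutionary-optimization idea can be sketched as a simple (mu + lambda) evolution strategy. The toy exponential-decay "rate parameter" fit below is purely illustrative; the actual work couples the optimizer to ALE3D reaction models running on HPC clusters.

```python
import numpy as np

def evolve(loss, dim, pop=40, gens=60, sigma=0.3, seed=1):
    """Minimal (mu + lambda) evolution strategy for parameter fitting.

    loss : function mapping a parameter vector to a scalar misfit.
    Each generation mutates the parents with Gaussian noise, pools
    parents and children (elitism), and keeps the best `pop` members;
    sigma decays to refine the search over time.
    """
    rng = np.random.default_rng(seed)
    parents = rng.standard_normal((pop, dim))
    for _ in range(gens):
        children = parents + sigma * rng.standard_normal((pop, dim))
        both = np.vstack([parents, children])
        scores = np.array([loss(x) for x in both])
        parents = both[np.argsort(scores)[:pop]]   # survival of the fittest
        sigma *= 0.97                              # anneal the mutation size
    return parents[0]

# toy "rate parameter" fit: recover k in y = exp(-k t) from noiseless data
t = np.linspace(0.0, 2.0, 50)
y = np.exp(-1.7 * t)
best = evolve(lambda p: np.sum((np.exp(-p[0] * t) - y) ** 2), dim=1)
print(abs(best[0] - 1.7) < 0.1)
```

    Because each loss evaluation is independent, the generation loop parallelizes naturally across cluster nodes, which is what makes this family of optimizers attractive for expensive hydrocode-driven objective functions.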

  20. Near-Field to Far-Field Uncertainty Propagation and Quantification of Ground Motions Generated by the Source Physics Experiments (SPE)

    NASA Astrophysics Data System (ADS)

    Antoun, T.; Ezzedine, S. M.; Vorobiev, O.; Pitarka, A.; Hurley, R.; Hirakawa, E. T.; Glenn, L.; Walter, W. R.

    2016-12-01

    LLNL has developed a framework for uncertainty propagation and quantification using HPC numerical codes to simulate, end-to-end from source to receivers, the ground motions observed during the Source Physics Experiments (SPE) conducted in fractured granitic rock at the Nevada National Security Site (NNSS). SPE includes six underground chemical explosions of different yields initiated at different depths. To date we have successfully applied this framework to explain the near-field shear motions observed in the vicinity of SPE3 through SPE5. However, systematic uncertainty propagation to the far-field seismic receivers had not yet been addressed. In the current study, we used a coupling between the non-linear inelastic hydrodynamic regime in the near-field and the elastic seismic regime in the far-field to conduct the analysis. Several realizations of the stochastic discrete fracture network were generated, conditioned on the observed sparse data. These realizations were then used to calculate the ground motions generated by the SPE shots out to the elastic radius, which serves as the handshake interface for the far-field simulations. By creating several realizations of the near-field response, one can embed those sources into the far-field elastic wave code and carry the uncertainty propagation through to the receivers. We present a full end-to-end assessment for the near- and far-field measurements. Separate analyses of the effects of the different conceptual geological models are also carried out using a nested Monte Carlo scheme. We compare the observed frequency content at several gauges with the simulated ones, and conclude that the two regions sample frequencies differently: small-scale features are relevant to the near-field simulations, while larger-scale features dominate in the far-field. Finally, we rank the most sensitive parameters for both regions to drive and refine the field characterization data collection.
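
    The Monte Carlo style of uncertainty propagation used in such frameworks can be sketched generically: sample the uncertain inputs, push each realization through a forward model, and summarize the output distribution at the receiver. The toy amplitude-decay model below is an assumption for illustration only, not the SPE simulation chain.

```python
import numpy as np

def propagate(model, sampler, n=20000, seed=2):
    """Monte Carlo uncertainty propagation.

    sampler : draws one realization of the uncertain inputs.
    model   : forward model mapping inputs to the observed quantity.
    Returns the mean and standard deviation of the output distribution.
    """
    rng = np.random.default_rng(seed)
    outs = np.array([model(sampler(rng)) for _ in range(n)])
    return outs.mean(), outs.std()

# toy forward model: received amplitude = source strength / distance,
# with uncertain source strength S ~ N(1, 0.1) and fixed distance r = 10.
# Linearity means the output should be N(0.1, 0.01).
model = lambda s: s / 10.0
mean, std = propagate(model, lambda rng: rng.normal(1.0, 0.1))
print(abs(mean - 0.1) < 0.005, abs(std - 0.01) < 0.002)
```

    Nesting arises when one loop samples conceptual-model choices (e.g., alternative fracture-network realizations) and an inner loop samples parameters within each choice; the summary statistics are then computed per outer sample.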

  1. Development of a Web Based Simulating System for Earthquake Modeling on the Grid

    NASA Astrophysics Data System (ADS)

    Seber, D.; Youn, C.; Kaiser, T.

    2007-12-01

    Existing cyberinfrastructure-based information, data, and computational networks now allow development of state-of-the-art, user-friendly simulation environments that democratize access to high-end computational resources and provide new research opportunities for many research and educational communities. Within the Geosciences cyberinfrastructure network, GEON, we have developed the SYNSEIS (SYNthetic SEISmogram) toolkit to enable efficient computations of 2D and 3D seismic waveforms for a variety of research purposes, especially to help analyze EarthScope's USArray seismic data quickly and efficiently. The underlying simulation software in SYNSEIS is a finite difference code, E3D, developed by LLNL (S. Larsen). The code is embedded within the SYNSEIS portlet environment and is used by our toolkit to simulate seismic waveforms of earthquakes at regional distances (<1000 km). Architecturally, SYNSEIS uses both Web Service and Grid computing resources in a portal-based work environment and has a built-in access mechanism to connect to national supercomputer centers as well as to a dedicated, small-scale compute cluster for its runs. Even though Grid computing is well established in many computing communities, its use among domain scientists is still not trivial because of the multiple levels of complexity encountered. We grid-enabled E3D using our own XML input dialect, which includes geological models accessible through standard Web services within the GEON network. The XML inputs for this application contain structural geometries, source parameters, seismic velocity, density, attenuation values, the number of time steps to compute, and the number of stations. By providing portal-based access to such a computational environment, coupled with a dynamic user interface, we enable a large user community to take advantage of high-end calculations in their research and educational activities. 
    Our system promotes an efficient and effective modeling environment that helps scientists as well as educators in their daily activities and speeds up the scientific discovery process.

  2. Energy and technology review

    NASA Astrophysics Data System (ADS)

    Johnson, K. C.

    1991-04-01

    This issue of Energy and Technology Review discusses the various educational programs that Lawrence Livermore National Laboratory (LLNL) sponsors or participates in. LLNL has a long history of fostering educational programs for students from kindergarten through graduate school. A goal is to enhance the teaching of science, mathematics, and technology and thereby assist educational institutions in increasing the pool of scientists, engineers, and technicians. LLNL programs described include: (1) contributions to the improvement of U.S. science education; (2) the LESSON program; (3) collaborations with Bay Area Science and Technology Education; (4) project HOPES; (5) lasers and fusion energy education; (6) a curriculum on global climate change; (7) computer and technology instruction at LLNL's Science Education Center; (8) the National Education Supercomputer Program; (9) project STAR; (10) the American Indian Program; (11) LLNL programs with Historically Black Colleges and Universities; (12) the Undergraduate Summer Institute on Contemporary Topics in Applied Science; (13) the National Physical Science Consortium: A Fellowship Program for Minorities and Women; (14) LLNL's participation with AWU; (15) the apprenticeship programs at LLNL; and (16) the future of LLNL's educational programs. An appendix lists all of LLNL's educational programs and activities, with contacts and their respective telephone numbers.

  3. Milestone Completion Report STCO04-1 AAPS: engagements with code teams, vendors, collaborators, developers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draeger, E. W.

    The Advanced Architecture and Portability Specialists (AAPS) team worked with a select set of LLNL application teams to develop and/or implement a portability strategy for next-generation architectures. The team also investigated new and updated programming models and helped develop programming abstractions targeting maintainability and performance portability. Significant progress was made on both fronts in FY17, resulting in multiple applications being significantly better prepared for the next-generation machines than before.

  4. Effects of Ion-ion Collisions and Inhomogeneity in Two-dimensional Simulations of Stimulated Brillouin Backscattering*

    NASA Astrophysics Data System (ADS)

    Cohen, B. I.

    2005-10-01

    Two-dimensional simulations of stimulated Brillouin backscattering (SBBS) with the BZOHAR^1 code have been extended to include ion-ion collisions and spatial nonuniformity in the mean ion flow. BZOHAR hybrid simulations (particle-in-cell kinetic ions and Boltzmann fluid electrons) have shown^2 that SBBS saturation is dominated by ion trapping effects and secondary instability of the primary ion wave (decay into subharmonic ion waves and ion quasi-modes). Here we address the effects of ion collisions^3 on SBBS saturation and employ the efficient Langevin ion collision algorithm of Ref. 4 and the Fokker-Planck collision operator of Ref. 5. We also report simulations of SBBS with a linear gradient in the mean ion drift, which in conjunction with the nonlinear frequency shift due to ion trapping can introduce auto-resonance effects that may enhance reflectivities.^6 For SBBS in a high-gain limit with ion collisions or inhomogeneity, we find that ion trapping and secondary ion wave instabilities are robust saturation mechanisms. *Work performed for US DOE by UC LLNL under Contr. W-7405-ENG-48. ^1B.I. Cohen, et al., Phys. Plasmas 4, 956 (1997). ^2B.I. Cohen, et al., Phys. Plasmas 12, 052703 (2005). ^3P.W. Rambo, et al., Phys. Rev. Lett. 79, 83 (1997). ^4M.E. Jones, et al., J. Comp. Phys. 123, 169 (1996). ^5W.M. Manheimer, et al., J. Comp. Phys. 138, 563 (1997). ^6E.A. Williams, et al., Phys. Plasmas 11, 231 (2004).

  5. An analysis of options available for developing a common laser ray tracing package for Ares and Kull code frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weeratunga, S K

    Ares and Kull are mature code frameworks that support ALE hydrodynamics for a variety of HEDP applications at LLNL, using two widely different meshing approaches. While Ares is based on a 2-D/3-D block-structured mesh database, Kull is designed to support unstructured, arbitrary polygonal/polyhedral meshes. In addition, both frameworks are capable of running applications on large, distributed-memory parallel machines. Currently, these frameworks separately support assorted collections of physics packages related to HEDP, including one for energy deposition by laser/ion-beam ray tracing. This study analyzes the options available for developing a common laser/ion-beam ray tracing package that can be easily shared between these two code frameworks, and concludes with a set of recommendations for its development.

  6. Purple Computational Environment With Mappings to ACE Requirements for the General Availability User Environment Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barney, B; Shuler, J

    2006-08-21

    Purple is an Advanced Simulation and Computing (ASC) funded massively parallel supercomputer located at Lawrence Livermore National Laboratory (LLNL). The Purple Computational Environment documents the capabilities and the environment provided for the FY06 LLNL Level 1 General Availability Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of ASC users working in the secure computing environments at Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories, but also documents the needs of LLNL and Alliance users working in the unclassified environment. Additionally, the Purple Computational Environment maps the provided capabilities to the Tri-lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the General Availability user environment capabilities of the ASC community. Appendix A lists these requirements and includes a description of the ACE requirements met, and those not met, for each section of this document. The Purple Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the Tri-lab community.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodgers, A J; Petersson, N A; Morency, C E

    The California Academy of Sciences (CAS) Morrison Planetarium is producing a 'full-dome' planetarium show on earthquakes and asked LLNL to produce content for the show. Specifically, the show features numerical ground motion simulations of the M 7.9 1906 San Francisco earthquake and a possible future M 7.05 Hayward fault scenario earthquake. The show also illustrates concepts of plate tectonics and mantle convection using images from LLNL's G3D global seismic tomography. This document describes the data that were provided to the CAS in support of production of the 'Earthquake' show. The CAS is located in Golden Gate Park, San Francisco and hosts over 1.6 million visitors. The Morrison Planetarium, within the CAS, is the largest all-digital planetarium in the world. It features a 75-foot diameter spherical-section projection screen tilted at a 30-degree angle. Six projectors cover the entire field of view and give a three-dimensional immersive experience. CAS shows strive to use scientifically accurate digital data in their productions. The show, entitled simply 'Earthquake', will debut on 26 May 2012. The team is also working on graphics and animations based on the same data sets for display on LLNL powerwalls and flat screens, as well as for public release.

  8. Malignant melanoma slide review project: Patients from non-Kaiser hospitals in the San Francisco Bay Area. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reynolds, P.

    This project was initiated in response to concerns that the observed excess of malignant melanoma among employees of Lawrence Livermore National Laboratory (LLNL) might reflect the incidence of disease diagnostically different from that observed in the general population. LLNL sponsored a slide review project, inviting leading dermatopathology experts to independently evaluate pathology slides from LLNL employees diagnosed with melanoma and those from a matched sample of Bay Area melanoma patients who did not work at LLNL. The study objectives were to: identify all 1969-1984 newly diagnosed cases of malignant melanoma among LLNL employees resident in the San Francisco-Oakland Metropolitan Statistical Area and diagnosed at facilities other than Kaiser Permanente; identify a comparison series of melanoma cases also diagnosed between 1969-1984 in non-Kaiser facilities, matched as closely as possible to the LLNL case series by gender, race, age at diagnosis, year of diagnosis, and hospital of diagnosis; obtain pathology slides for the identified (LLNL) case and (non-LLNL) comparison patients for review by the LLNL-invited panel of dermatopathology experts; and compare the pathologic characteristics of the case and comparison melanoma patients, as recorded by the dermatopathology panel.

  9. Development of a PDXP platform on NIF

    NASA Astrophysics Data System (ADS)

    Whitley, Heather; Schneider, Marilyn; Garbett, Warren; Pino, Jesse; Shepherd, Ronnie; Brown, Colin; Castor, John; Scott, Howard; Ellison, C. Leland; Benedict, Lorin; Sio, Hong; Lahmann, Brandon; Petrasso, Richard; Graziani, Frank

    2016-10-01

    Over the past several years, we have conducted theoretical investigations of electron-ion coupling and electronic transport in plasmas. In the regime of weakly coupled plasmas, we have identified models that we believe describe the physics well, but experimental measurements are still needed to validate the models. We are developing spectroscopic experiments to study electron-ion equilibration and electron heat transport using a polar direct drive exploding pusher (PDXP) platform at the National Ignition Facility (NIF). Initial measurements are focused on characterizing the laser-target coupling, symmetry of the PDXP implosion, and overall neutron and x-ray signals. We present images from the first set of shots and make comparisons with simulations from ARES and discuss next steps in the platform development. Prepared by LLNL under Contract DE-AC52-07NA27344. LLNL-ABS-697489.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hornung, Richard D.; Hones, Holger E.

    The RAJA Performance Suite is designed to evaluate performance of the RAJA performance portability library on a wide variety of important high performance computing (HPC) algorithmic kernels. These kernels assess compiler optimizations and various parallel programming model backends accessible through RAJA, such as OpenMP, CUDA, etc. The initial version of the suite contains 25 computational kernels, each of which appears in 6 variants: Baseline Sequential, RAJA Sequential, Baseline OpenMP, RAJA OpenMP, Baseline CUDA, RAJA CUDA. All variants of each kernel perform essentially the same mathematical operations and the loop body code for each kernel is identical across all variants. There are a few kernels, such as those that contain reduction operations, that require CUDA-specific coding for their CUDA variants. The actual computer instructions executed, and how they run in parallel, differ depending on the parallel programming model backend used and which optimizations are performed by the compiler used to build the Performance Suite executable. The Suite will be used primarily by RAJA developers to perform regular assessments of RAJA performance across a range of hardware platforms and compilers as RAJA features are being developed. It will also be used by LLNL hardware and software vendor partners for defining requirements for future computing platform procurements and acceptance testing. In particular, the RAJA Performance Suite will be used for compiler acceptance testing of the upcoming CORAL/Sierra machine (initial LLNL delivery expected in late 2017/early 2018) and the CORAL-2 procurement. The Suite will also be used to generate concise source code reproducers of compiler and runtime issues we uncover so that we may provide them to relevant vendors to be fixed.

  11. KCAT, Xradia, ALS and APS Performance Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waters, A; Martz, H; Brown, W

    2004-09-30

    At Lawrence Livermore National Laboratory (LLNL) particular emphasis is being placed on the nondestructive characterization (NDC) of components, subassemblies, and assemblies of millimeter-size extent with micrometer-size features (mesoscale). These mesoscale objects include materials that vary widely in composition, density, geometry, and embedded features. Characterizing these mesoscale objects is critical for corroborating the physics codes that underlie LLNL's Stockpile Stewardship mission. In this report we present results from our efforts to quantitatively characterize the performance of several x-ray systems, in order to benchmark existing systems and to determine which may have the best potential for our mesoscale imaging needs. Several different x-ray digital radiography (DR) and computed tomography (CT) systems exist that may be applicable to our mesoscale object characterization requirements, including microfocus and synchrotron systems. The systems we have benchmarked include KCAT (LLNL developed) and Xradia µXCT (Xradia, Inc., Concord, CA), both microfocus systems, and Beamline 1-ID at the Advanced Photon Source (APS) and the Tomography Beamline at the Advanced Light Source (ALS), both synchrotron-based systems. The ALS Tomography Beamline is a new installation, and the data presented and analyzed here are some of the first to be acquired at the facility. It is important to note that the ALS system had not yet been optimized at the time we acquired the data. Results for each of these systems have been independently documented elsewhere. In this report we summarize and compare the characterization results for these systems.

  12. Seismic Analysis Code (SAC): Development, porting, and maintenance within a legacy code base

    NASA Astrophysics Data System (ADS)

    Savage, B.; Snoke, J. A.

    2017-12-01

    The Seismic Analysis Code (SAC) is the result of the toil of many developers over an almost 40-year history. Initially a Fortran-based code, it has undergone major transitions in underlying bit size, from 16 to 32 in the 1980s and from 32 to 64 in 2009, as well as a change in language from Fortran to C in the late 1990s. Maintenance of SAC, the program and its associated libraries, has tracked changes in hardware and operating systems, including the advent of Linux in the early 1990s, the emergence and demise of Sun/Solaris, variants of OS X (on PowerPC and x86 processors), and Windows (Cygwin). Traces of these systems are still visible in the source code and associated comments. A major concern while improving and maintaining a routinely used, legacy code is a fear of introducing bugs or inadvertently removing favorite features of long-time users. Prior to 2004, SAC was maintained and distributed by LLNL (Lawrence Livermore National Laboratory). In that year, the license was transferred from LLNL to IRIS (Incorporated Research Institutions for Seismology), but the license is not open source. Nevertheless, there have been thousands of downloads a year of the package, either source code or binaries for specific systems. Starting in 2004, the co-authors have maintained the SAC package for IRIS. In our updates, we fixed bugs, incorporated newly introduced seismic analysis procedures (such as EVALRESP), added new, accessible features (plotting and parsing), and improved the documentation (now in HTML and PDF formats). Moreover, we have added modern software engineering practices to the development of SAC, including the use of recent source control systems, high-level tests, and scripted, virtualized environments for rapid testing and building. Finally, a "sac-help" listserv (administered by IRIS) was set up for SAC-related issues and is the primary avenue for users seeking advice and reporting bugs. Attempts are always made to respond to issues and bugs in a timely fashion.
For the past thirty-plus years, SAC files have contained a fixed-length header. Time- and distance-related values are stored in single precision, which has become a problem as the precision desired for data has increased compared to thirty years ago. A future goal is to address this precision problem, but in a backward-compatible manner. We would also like to transition SAC to a more open source license.

  13. Coop Progress Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rivera, Zaylis Zayas; Bond, Essex

    2013-11-22

    My Coop at LLNL has been my first professional experience as an Electrical Engineer. I was tasked with carrying out signal processing to analyze data and with writing code in IDL following standard software development principles. The Coop has met all of my needs to continue my professional career, and I feel more confident as I continue working as a student and professional. It is now a big open question for me whether to pursue graduate research or industry after I graduate with my B.S. in Electrical Engineering.

  14. Uncrackable code for nuclear weapons

    ScienceCinema

    Hart, Mark

    2018-05-11

    Mark Hart, a scientist and engineer in Lawrence Livermore National Laboratory's (LLNL) Defense Technologies Division, has developed a new approach for ensuring nuclear weapons and their components can't fall prey to unauthorized use. The beauty of his approach: Let the weapon protect itself. "Using the random process of nuclear radioactive decay is the gold standard of random number generators," said Mark Hart. "You’d have a better chance of winning both Mega Millions and Powerball on the same day than getting control of IUC-protected components."

  15. Uncrackable code for nuclear weapons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Mark

    Mark Hart, a scientist and engineer in Lawrence Livermore National Laboratory's (LLNL) Defense Technologies Division, has developed a new approach for ensuring nuclear weapons and their components can't fall prey to unauthorized use. The beauty of his approach: Let the weapon protect itself. "Using the random process of nuclear radioactive decay is the gold standard of random number generators," said Mark Hart. "You’d have a better chance of winning both Mega Millions and Powerball on the same day than getting control of IUC-protected components."

  16. Plasma Interactions with Mixed Materials and Impurity Transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rognlien, T. D.; Beiersdorfer, Peter; Chernov, A.

    2016-10-28

    The project brings together three discipline areas at LLNL to develop an advanced capability to predict the impact of plasma/material interactions (PMI) on metallic surfaces in magnetic fusion energy (MFE) devices. These areas are (1) modeling transport of wall impurity ions through the edge plasma to the core plasma, (2) construction of a laser blow-off (LBO) system for injecting precise amounts of metallic atoms into a tokamak plasma, and (3) materials science analysis of fundamental processes that modify metallic surfaces during plasma bombardment. The focus is on tungsten (W), which is being used for the ITER divertor and in designs of future MFE devices. In area (1), we have worked with the University of California, San Diego (UCSD) on applications of the coupled UEDGE/DUSTT codes to predict the influx of impurity ions from W dust through the edge plasma, including periodic edge-plasma oscillations, and revived a parallel version of UEDGE to speed up these simulations. In addition, the impurity transport model in the 2D UEDGE code has been implemented in the 3D BOUT++ turbulence/transport code to allow fundamental analysis of the impact of strong plasma turbulence on impurity transport. In area (2), construction and testing of the LBO injection system has been completed. The original plan to install the LBO on the National Spherical Torus Experiment Upgrade (NSTX-U) at Princeton and use it to validate the impurity transport simulations is delayed owing to NSTX-U being offline for a substantial magnetic coil repair period. In area (3), an analytic model has been developed to explain the growth of W tendrils (or fuzz) observed for helium-containing plasmas. Molecular dynamics calculations of W sputtering by W and deuterium (D) ions show that a spatial blending of interatomic potentials is needed to describe the near-surface and deeper regions of the material.

  17. Calibration of the Lawrence Livermore National Laboratory Passive-Active Neutron Drum Shuffler for Measurement of Highly Enriched Uranium in Oxides within DOE-STD-3013-2000 Containers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mount, M E; O'Connell, W J

    2005-06-03

    Lawrence Livermore National Laboratory (LLNL) uses the LLNL passive-active neutron drum (PAN) shuffler (Canberra Model JCC-92) for accountability measurement of highly enriched uranium (HEU) oxide and HEU in mixed uranium-plutonium (U-Pu) oxide. In June 2002, at the 43rd Annual Meeting of the Institute of Nuclear Materials Management, LLNL reported on an extensive effort to calibrate this shuffler, based on standards measurements and extensive simulations, for HEU oxides and mixed U-Pu oxides in thin-walled primary and secondary containers. In August 2002, LLNL began to also use DOE-STD-3013-2000 containers for HEU oxide and mixed U-Pu oxide. These DOE-STD-3013-2000 containers consist of a stainless steel convenience can enclosed in welded stainless steel primary and secondary containers. Compared to the double thin-walled containers, the DOE-STD-3013-2000 containers have substantially thicker walls, and the density of materials in these containers was found to extend over a greater range (1.35 g/cm³ to 4.62 g/cm³) than foreseen for the double thin-walled containers. Further, the DOE-STD-3013-2000 standard allows for oxides containing at least 30 wt% Pu plus U, whereas the calibration algorithms for thin-walled containers were derived for virtually pure HEU or mixed U-Pu oxides. An initial series of Monte Carlo simulations of the PAN shuffler response to given quantities of HEU oxide and mixed U-Pu oxide in DOE-STD-3013-2000 containers was generated and compared with the response predicted by the calibration algorithms for thin-walled containers. Results showed a decrease on the order of 10% in the count rate, and hence a decrease in the calculated U mass for measured unknowns, with some varying trends versus U mass. Therefore a decision was made to develop a calibration algorithm for the PAN shuffler unique to the DOE-STD-3013-2000 container. This paper describes that effort and selected unknown item measurement results.

  18. Summary Report of Summer 2009 NGSI Human Capital Development Efforts at Lawrence Livermore National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dougan, A; Dreicer, M; Essner, J

    2009-11-16

    In 2009, Lawrence Livermore National Laboratory (LLNL) engaged in several activities to support NA-24's Next Generation Safeguards Initiative (NGSI). This report outlines LLNL's efforts to support Human Capital Development (HCD), one of five key components of NGSI managed by Dunbar Lockwood in the Office of International Regimes and Agreements (NA-243). There were five main LLNL summer safeguards HCD efforts sponsored by NGSI: (1) A joint Monterey Institute of International Studies/Center for Nonproliferation Studies-LLNL International Safeguards Policy and Information Analysis Course; (2) A Summer Safeguards Policy Internship Program at LLNL; (3) A Training in Environmental Sample Analysis for IAEA Safeguards Internship; (4) Safeguards Technology Internships; and (5) A joint LLNL-INL Summer Safeguards Lecture Series. In this report, we provide an overview of these five initiatives, an analysis of lessons learned, an update on the NGSI FY09 post-doc, and an update on students who participated in previous NGSI-sponsored LLNL safeguards HCD efforts.

  19. A Monte Carlo Simulation of the in vivo measurement of lung activity in the Lawrence Livermore National Laboratory torso phantom.

    PubMed

    Acha, Robert; Brey, Richard; Capello, Kevin

    2013-02-01

    A torso phantom was developed by the Lawrence Livermore National Laboratory (LLNL) that serves as a standard for intercomparison and intercalibration of detector systems used to measure low-energy photons from radionuclides, such as americium, deposited in the lungs. DICOM images of the second-generation Human Monitoring Laboratory-Lawrence Livermore National Laboratory (HML-LLNL) torso phantom were segmented and converted into three-dimensional (3D) voxel phantoms to simulate, using a Monte Carlo technique, the response of the high-purity germanium (HPGe) detector systems found in the HML's new lung counter. The photon energies of interest in this study were 17.5, 26.4, 45.4, 59.5, 122, 244, and 344 keV. The detection efficiencies at these photon energies were predicted for different chest wall thicknesses (1.49 to 6.35 cm) and compared to measured values obtained with lungs containing ²⁴¹Am (34.8 kBq) and ¹⁵²Eu (10.4 kBq). It was observed that no statistically significant differences exist at the 95% confidence level between the mean values of simulated and measured detection efficiencies. Comparisons between the simulated and measured detection efficiencies reveal a variation of 20% at 17.5 keV and 1% at 59.5 keV. It was found that small changes in the formulation of the tissue substitute material caused no significant change in the outcome of Monte Carlo simulations.

  20. 3D Vectorial Time Domain Computational Integrated Photonics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kallman, J S; Bond, T C; Koning, J M

    2007-02-16

    The design of integrated photonic structures poses considerable challenges. 3D time-domain design tools are fundamental in enabling technologies such as all-optical logic, photonic bandgap sensors, THz imaging, and fast radiation diagnostics. Such technologies are essential to LLNL and WFO sponsors for a broad range of applications: encryption for communications and surveillance sensors (NSA, NAI and IDIV/PAT); high density optical interconnects for high-performance computing (ASCI); high-bandwidth instrumentation for NIF diagnostics; micro-sensor development for weapon miniaturization within the Stockpile Stewardship and DNT programs; and applications within HSO for CBNP detection devices. While there exist a number of photonics simulation tools on the market, they primarily model devices of interest to the communications industry. We saw the need to extend our previous software to match the Laboratory's unique emerging needs. These include modeling novel material effects (such as those of radiation induced carrier concentrations on refractive index) and device configurations (RadTracker bulk optics with radiation induced details, Optical Logic edge emitting lasers with lateral optical inputs). In addition we foresaw significant advantages to expanding our own internal simulation codes: parallel supercomputing could be incorporated from the start, and the simulation source code would be accessible for modification and extension. This work addressed Engineering's Simulation Technology Focus Area, specifically photonics. Problems addressed from the Engineering roadmap of the time included modeling the Auston switch (an important THz source/receiver), modeling Vertical Cavity Surface Emitting Lasers (VCSELs, which had been envisioned as part of fast radiation sensors), and multi-scale modeling of optical systems (for a variety of applications).
We proposed to develop novel techniques to numerically solve the 3D multi-scale propagation problem for both the microchip laser logic devices as well as devices characterized by electromagnetic (EM) propagation in nonlinear materials with time-varying parameters. The deliverables for this project were extended versions of the laser logic device code Quench2D and the EM propagation code EMsolve with new modules containing the novel solutions incorporated by taking advantage of the existing software interface and structured computational modules. Our approach was multi-faceted since no single methodology can always satisfy the tradeoff between model runtime and accuracy requirements. We divided the problems to be solved into two main categories: those that required Full Wave Methods and those that could be modeled using Approximate Methods. Full Wave techniques are useful in situations where Maxwell's equations are not separable (or the problem is small in space and time), while approximate techniques can treat many of the remaining cases.

  1. Joint FAM/Line Management Assessment Report on LLNL Machine Guarding Safety Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armstrong, J. J.

    2016-07-19

    The LLNL Safety Program for Machine Guarding is implemented to comply with requirements in the ES&H Manual Document 11.2, "Hazards-General and Miscellaneous," Section 13 Machine Guarding (Rev 18, issued Dec. 15, 2015). The primary goal of this LLNL Safety Program is to ensure that LLNL operations involving machine guarding are managed so that workers, equipment and government property are adequately protected. This means that all such operations are planned and approved using the Integrated Safety Management System to provide the most cost effective and safest means available to support the LLNL mission.

  2. 2017 LLNL Nuclear Forensics Summer Internship Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zavarin, Mavrik

    The Lawrence Livermore National Laboratory (LLNL) Nuclear Forensics Summer Internship Program (NFSIP) is designed to give graduate students an opportunity to come to LLNL for 8-10 weeks of hands-on research. Students conduct research under the supervision of a staff scientist, attend a weekly lecture series, interact with other students, and present their work in poster format at the end of the program. Students can also meet staff scientists one-on-one, participate in LLNL facility tours (e.g., the National Ignition Facility and Center for Accelerator Mass Spectrometry), and gain a better understanding of the various science programs at LLNL.

  3. Lawrence Livermore National Laboratory Environmental Report 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, H E; Bertoldo, N A; Campbell, C G

    The purposes of the Lawrence Livermore National Laboratory Environmental Report 2010 are to record Lawrence Livermore National Laboratory's (LLNL's) compliance with environmental standards and requirements, describe LLNL's environmental protection and remediation programs, and present the results of environmental monitoring at the two LLNL sites - the Livermore site and Site 300. The report is prepared for the U.S. Department of Energy (DOE) by LLNL's Environmental Protection Department. Submittal of the report satisfies requirements under DOE Order 231.1A, Environmental Safety and Health Reporting, and DOE Order 5400.5, Radiation Protection of the Public and Environment. The report is distributed electronically and is available at https://saer.llnl.gov/, the website for the LLNL annual environmental report. Previous LLNL annual environmental reports beginning in 1994 are also on the website. Some references in the electronic report text are underlined, which indicates that they are clickable links. Clicking on one of these links will open the related document, data workbook, or website that it refers to. The report begins with an executive summary, which provides the purpose of the report and an overview of LLNL's compliance and monitoring results. The first three chapters provide background information: Chapter 1 is an overview of the location, meteorology, and hydrogeology of the two LLNL sites; Chapter 2 is a summary of LLNL's compliance with environmental regulations; and Chapter 3 is a description of LLNL's environmental programs with an emphasis on the Environmental Management System including pollution prevention.
The majority of the report covers LLNL's environmental monitoring programs and monitoring data for 2010: effluent and ambient air (Chapter 4); waters, including wastewater, storm water runoff, surface water, rain, and groundwater (Chapter 5); and terrestrial, including soil, sediment, vegetation, foodstuff, ambient radiation, and special status wildlife and plants (Chapter 6). Complete monitoring data, which are summarized in the body of the report, are provided in Appendix A. The remaining three chapters discuss the radiological impact on the public from LLNL operations (Chapter 7), LLNL's groundwater remediation program (Chapter 8), and quality assurance for the environmental monitoring programs (Chapter 9). The report uses Système International (SI) units, consistent with the federal Metric Conversion Act of 1975 and Executive Order 12770, Metric Usage in Federal Government Programs (1991). For ease of comparison to environmental reports issued prior to 1991, dose values and many radiological measurements are given in both metric and U.S. customary units. A conversion table is provided in the glossary.

  4. LLNL Scientists Use NERSC to Advance Global Aerosol Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergmann, D J; Chuang, C; Rotman, D

    2004-10-13

    While "greenhouse gases" have been the focus of climate change research for a number of years, DOE's "Aerosol Initiative" is now examining how aerosols (small particles of approximately micron size) affect the climate on both a global and regional scale. Scientists in the Atmospheric Science Division at Lawrence Livermore National Laboratory (LLNL) are using NERSC's IBM supercomputer and LLNL's IMPACT (atmospheric chemistry) model to perform simulations showing the historic effects of sulfur aerosols at a finer spatial resolution than ever done before. Simulations were carried out for five decades, from the 1950s through the 1990s. The results clearly show the effects of the changing global pattern of sulfur emissions. Whereas in 1950 the United States emitted 41 percent of the world's sulfur aerosols, this figure had dropped to 15 percent by 1990, due to conservation and anti-pollution policies. By contrast, the fraction of total sulfur emissions of European origin has dropped by only a factor of 2, and the Asian emission fraction jumped sixfold during the same time, from 7 percent in 1950 to 44 percent in 1990. Under a special allocation of computing time provided by the Office of Science INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program, Dan Bergmann, working with a team of LLNL scientists including Cathy Chuang, Philip Cameron-Smith, and Bala Govindasamy, was able to carry out a large number of calculations during the past month, making the aerosol project one of the largest users of NERSC resources. The applications ran on 128 and 256 processors. The objective was to assess the effects of anthropogenic (man-made) sulfate aerosols. The IMPACT model calculates the rate at which SO₂ (a gas emitted by industrial activity) is oxidized to form particles known as sulfate aerosols. These particles have a short lifespan in the atmosphere, often washing out in about a week.
This means that their effects on climate tend to be more regional, occurring near the area where the SO₂ is emitted. To accurately study these regional effects, Bergmann needed to run the simulations at a finer horizontal resolution, as the coarser resolution (typically 300 km by 300 km) of other climate models is insufficient for studying changes on a regional scale. Livermore's use of CAM3, the Community Atmosphere Model, a high-resolution climate model developed at NCAR (with collaboration from DOE), allows a 100 km by 100 km grid to be applied. NERSC's terascale computing capability provided the needed computational horsepower to run the application at the finer level.

  5. Gold Spectra Measurements from LLNL EBIT Plasmas

    NASA Astrophysics Data System (ADS)

    May, M.; Brown, G. V.; Chen, H.; Chung, H. K.; Gu, M.; Hansen, S. B.; Schneider, M. B.; Widmann, K.; Beiersdorfer, P.

    2008-11-01

    Spectra have been recorded from gold injected into the Lawrence Livermore Electron Beam Ion Trap (EBIT-II). Both mono-energetic and experimentally simulated Maxwell-Boltzmann (MB) plasmas were created for these measurements. The beam plasmas had energies of 2.75, 3.0, 3.6, 4.6, 5.5, 6.0, and 6.5 keV. The MB plasmas had electron temperatures of 2.0, 2.5 and 3.0 keV. M-band gold x-ray spectra (n = 4-3, 5-3, 6-3 and 7-3 transitions) from K-like to Kr-like ions were recorded between 1 and 8 keV. The emission of gold was recorded by crystal spectrometers and a micro-calorimeter from the Goddard Space Flight Center. A full survey of the recorded spectra will be presented along with line emission and charge state modeling from the Flexible Atomic Code (FAC). Some comparisons with laser-produced plasmas will be made. *This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  6. ARES Simulations of a Double Shell Surrogate Target

    NASA Astrophysics Data System (ADS)

    Sacks, Ryan; Tipton, Robert; Graziani, Frank

    2015-11-01

    Double shell targets provide an alternative path to ignition that allows for a less robust laser profile and non-cryogenic initial temperatures. The target designs call for a high-Z material to abut the gas/liquid DT fuel which is cause for concern due to possible mix of the inner shell with the fuel. This research concentrates on developing a surrogate target for a double shell capsule that can be fielded in a current NIF two-shock hohlraum. Through pressure-density scaling the hydrodynamic behavior of the high-Z pusher of a double shell can be approximated allowing for studies of performance and mix. Use of the ARES code allows for investigation of mix in one and two dimensions and analysis of instabilities in two dimensions. Development of a shell material that will allow for experiments similar to CD Mix is also discussed. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344, Lawrence Livermore National Security, LLC. Information Management release number LLNL-ABS-675098.

  7. 2016 LLNL Nuclear Forensics Summer Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zavarin, Mavrik

    The Lawrence Livermore National Laboratory (LLNL) Nuclear Forensics Summer Program is designed to give graduate students an opportunity to come to LLNL for 8–10 weeks for a hands-on research experience. Students conduct research under the supervision of a staff scientist, attend a weekly lecture series, interact with other students, and present their work in poster format at the end of the program. Students also have the opportunity to meet staff scientists one-on-one, participate in LLNL facility tours (e.g., the National Ignition Facility and Center for Accelerator Mass Spectrometry), and gain a better understanding of the various science programs at LLNL.

  8. Finite element analysis of constrained total Condylar Knee Prosthesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-07-13

    Exactech, Inc., is a prosthetic joint manufacturer based in Gainesville, FL. The company set the goal of developing a highly effective prosthetic articulation, based on scientific principles, not trial and error. They developed an evolutionary design for a total knee arthroplasty system that promised improved performance. They performed static load tests in the laboratory with similar previous designs, but dynamic laboratory testing was both difficult to perform and prohibitively expensive for a small business to undertake. Laboratory testing also cannot measure stress levels in the interior of the prosthesis where failures are known to initiate. To fully optimize their designs for knee arthroplasty revisions, they needed range-of-motion stress/strain data at interior as well as exterior locations within the prosthesis. LLNL developed computer software (especially NIKE3D) specifically designed to perform stress/strain computations (finite element analysis) for complex geometries in large displacement/large deformation conditions. Additionally, LLNL had developed a high fidelity knee model for other analytical purposes. The analysis desired by Exactech could readily be performed using NIKE3D and a modified version of the high fidelity knee that contained the geometry of the condylar knee components. The LLNL high fidelity knee model was a finite element computer model which would not be transferred to Exactech during the course of this CRADA effort. The previously performed laboratory studies by Exactech were beneficial to LLNL in verifying the analytical capabilities of NIKE3D for human anatomical modeling. This, in turn, gave LLNL further entree to perform work-for-others in the prosthetics field.
There were two purposes to the CRADA: (1) to modify the LLNL High Fidelity Knee Model to accept the geometry of the Exactech Total Knee; and (2) to perform parametric studies of the possible design options in appropriate ranges of motion so that an optimum design could be selected for production. Because of unanticipated delays in the CRADA funding, the knee design had to be finalized before the analysis could be accomplished. Thus, the scope of work was modified by the industrial partner. It was decided that it would be most beneficial to perform FEA that would closely replicate the lab tests that had been done as the basis of the design. Exactech was responsible for transmitting the component geometries to Livermore, as well as providing complete data from the quasi-static laboratory loading tests that were performed on various designs. LLNL was responsible for defining the basic finite element mesh and carrying out the analysis. We performed the initial computer simulation and verified model integrity using the laboratory data. After performing the parametric studies, the results were reviewed with Exactech. Also, the results were presented at the Orthopaedic Research Society meeting in a poster session.

  9. Laboratory Tests of Multiplex Detection of PCR Amplicons Using the Luminex 100 Flow Analyzer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venkateswaran, K.S.; Nasarabadi, S.; Langlois, R.G.

    2000-05-05

    Lawrence Livermore National Laboratory (LLNL) demonstrated the power of flow cytometry in detecting biological agent simulants at JFT III. LLNL pioneered the development of the advanced nucleic acid analyzer (ANAA) for portable real-time identification. Recent advances in flow cytometry provide a means for multiplexed nucleic acid detection and immunoassay of pathogenic microorganisms. We are presently developing multiplexed immunoassays for the simultaneous detection of different simulants. Our goal is to build an integrated instrument for both nucleic acid analysis and immuno detection. In this study we evaluated the Luminex LX 100 for concurrent identification of more than one PCR-amplified product. The ANAA has real-time TaqMan fluorescent detection capability for rapid identification of field samples. However, its multiplexing ability is limited by the combination of available fluorescent labels. Hence, integration of the ANAA with flow cytometry can combine the rapidity of ANAA amplification with the multiplex capability of flow cytometry. Multiplexed flow cytometric analysis is made possible using a set of fluorescent latex microspheres that are individually identified by their red and infrared fluorescence. A green fluorochrome is used as the assay signal. Methods were developed for the identification of specific nucleic acid sequences from Bacillus globigii (Bg), Bacillus thuringiensis (Bt) and Erwinia herbicola (Eh). Detection sensitivity using different reporter fluorochromes was tested with the LX 100, and different assay formats were evaluated for their suitability for rapid testing. A blind laboratory trial was carried out December 22-27, 1999 to evaluate bead assays for multiplex identification of Bg and Bt PCR products. This report summarizes the assay development, fluorochrome comparisons, and the results of the blind trial conducted at LLNL for the laboratory evaluation of the LX 100 flow analyzer.

  10. Environmental Report 2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallegos, G; Bertoldo, N A; Campbell, C G

    The purposes of the Lawrence Livermore National Laboratory Environmental Report 2008 are to record Lawrence Livermore National Laboratory's (LLNL's) compliance with environmental standards and requirements, describe LLNL's environmental protection and remediation programs, and present the results of environmental monitoring at the two LLNL sites - the Livermore site and Site 300. The report is prepared for the U.S. Department of Energy (DOE) by LLNL's Environmental Protection Department. Submittal of the report satisfies requirements under DOE Order 231.1A, Environmental Safety and Health Reporting, and DOE Order 5400.5, Radiation Protection of the Public and Environment. The report is distributed electronically and is available at https://saer.llnl.gov/, the website for the LLNL annual environmental report. Previous LLNL annual environmental reports beginning in 1994 are also on the website. Some references in the electronic report text are underlined, which indicates that they are clickable links. Clicking on one of these links will open the related document, data workbook, or website that it refers to. The report begins with an executive summary, which provides the purpose of the report and an overview of LLNL's compliance and monitoring results. The first three chapters provide background information: Chapter 1 is an overview of the location, meteorology, and hydrogeology of the two LLNL sites; Chapter 2 is a summary of LLNL's compliance with environmental regulations; and Chapter 3 is a description of LLNL's environmental programs with an emphasis on the Environmental Management System including pollution prevention.
The majority of the report covers LLNL's environmental monitoring programs and monitoring data for 2008: effluent and ambient air (Chapter 4); waters, including wastewater, storm water runoff, surface water, rain, and groundwater (Chapter 5); and terrestrial, including soil, sediment, vegetation, foodstuff, ambient radiation, and special status wildlife and plants (Chapter 6). Complete monitoring data, which are summarized in the body of the report, are provided in Appendix A. The remaining three chapters discuss the radiological impact on the public from LLNL operations (Chapter 7), LLNL's groundwater remediation program (Chapter 8), and quality assurance for the environmental monitoring programs (Chapter 9). The report uses Systeme International units, consistent with the federal Metric Conversion Act of 1975 and Executive Order 12770, Metric Usage in Federal Government Programs (1991). For ease of comparison to environmental reports issued prior to 1991, dose values and many radiological measurements are given in both metric and U.S. customary units. A conversion table is provided in the glossary. The report is the responsibility of LLNL's Environmental Protection Department. Monitoring data were obtained through the combined efforts of the Environmental Protection Department; Environmental Restoration Department; Physical and Life Sciences Environmental Monitoring Radiation Laboratory; and the Hazards Control Department.« less

  11. Environmental Report 2007

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mathews, S; Gallegos, G; Berg, L L

    2008-09-24

    The purposes of the 'Lawrence Livermore National Laboratory Environmental Report 2007' are to record Lawrence Livermore National Laboratory's (LLNL's) compliance with environmental standards and requirements, describe LLNL's environmental protection and remediation programs, and present the results of environmental monitoring at the two LLNL sites--the Livermore site and Site 300. The report is prepared for the U.S. Department of Energy (DOE) by LLNL's Environmental Protection Department. Submittal of the report satisfies requirements under DOE Order 231.1A, Environmental Safety and Health Reporting, and DOE Order 5400.5, Radiation Protection of the Public and Environment. The report is distributed electronically and is available at https://saer.lln.gov/, the website for the LLNL annual environmental report. Previous LLNL annual environmental reports beginning in 1994 are also on the website. Some references in the electronic report text are underlined, which indicates that they are clickable links. Clicking on one of these links will open the related document, data workbook, or website that it refers to. The report begins with an executive summary, which provides the purpose of the report and an overview of LLNL's compliance and monitoring results. The first three chapters provide background information: Chapter 1 is an overview of the location, meteorology, and hydrogeology of the two LLNL sites; Chapter 2 is a summary of LLNL's compliance with environmental regulations; and Chapter 3 is a description of LLNL's environmental programs with an emphasis on the Environmental Management System including pollution prevention. 
The majority of the report covers LLNL's environmental monitoring programs and monitoring data for 2007: effluent and ambient air (Chapter 4); waters, including wastewater, storm water runoff, surface water, rain, and groundwater (Chapter 5); and terrestrial, including soil, sediment, vegetation, foodstuff, ambient radiation, and special status wildlife and plants (Chapter 6). Complete monitoring data, which are summarized in the body of the report, are provided in Appendix A. The remaining three chapters discuss the radiological impact on the public from LLNL operations (Chapter 7), LLNL's groundwater remediation program (Chapter 8), and quality assurance for the environmental monitoring programs (Chapter 9). The report uses Systeme International units, consistent with the federal Metric Conversion Act of 1975 and Executive Order 12770, Metric Usage in Federal Government Programs (1991). For ease of comparison to environmental reports issued prior to 1991, dose values and many radiological measurements are given in both metric and U.S. customary units. A conversion table is provided in the glossary.« less

  12. Supplement analysis for continued operation of Lawrence Livermore National Laboratory and Sandia National Laboratories, Livermore. Volume 2: Comment response document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1999-03-01

    The US Department of Energy (DOE) prepared a draft Supplement Analysis (SA) for Continued Operation of Lawrence Livermore National Laboratory (LLNL) and Sandia National Laboratories, Livermore (SNL-L), in accordance with DOE's requirements for implementation of the National Environmental Policy Act of 1969 (NEPA) (10 Code of Federal Regulations [CFR] Part 1021.314). It considers whether the Final Environmental Impact Statement and Environmental Impact Report for Continued Operation of Lawrence Livermore National Laboratory and Sandia National Laboratories, Livermore (1992 EIS/EIR) should be supplemented, whether a new environmental impact statement (EIS) should be prepared, or whether no further NEPA documentation is required. The SA examines the current project and program plans and proposals for LLNL and SNL-L operations to identify new or modified projects or operations, or new information, for the period from 1998 to 2002 that was not considered in the 1992 EIS/EIR. When such changes, modifications, and information are identified, they are examined to determine whether they could be considered substantial or significant in reference to the 1992 proposed action and the 1993 Record of Decision (ROD). DOE released the draft SA to the public to obtain stakeholder comments and to consider those comments in the preparation of the final SA. DOE distributed copies of the draft SA to those who were known to have an interest in LLNL or SNL-L activities in addition to those who requested a copy. In response to comments received, DOE prepared this Comment Response Document.

  13. Particle-in-cell modeling for MJ scale dense plasma focus with varied anode shape

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Link, A., E-mail: link6@llnl.gov; Halvorson, C., E-mail: link6@llnl.gov; Schmidt, A.

    2014-12-15

    Megajoule scale dense plasma focus (DPF) Z-pinches with deuterium gas fill are compact devices capable of producing 10^12 neutrons per shot, but past predictive models of large-scale DPFs have not included kinetic effects such as ion beam formation or anomalous resistivity. We report on progress in developing a predictive DPF model by extending our 2D axisymmetric collisional kinetic particle-in-cell (PIC) simulations from the 4 kJ, 200 kA LLNL DPF to the 1 MJ, 2 MA Gemini DPF using the PIC code LSP. These new simulations incorporate electrodes and an external pulsed-power driver circuit, and model the plasma from insulator lift-off through the pinch phase. To accommodate the vast range of relevant spatial and temporal scales involved in the Gemini DPF within the available computational resources, the simulations were performed using a new hybrid fluid-to-kinetic model. This new approach allows single simulations to begin in an electron/ion fluid mode from insulator lift-off through the 5-6 μs run-down of the 50+ cm anode, then transition to a fully kinetic PIC description during the run-in phase, when the current sheath is 2-3 mm from the central axis of the anode. Simulations are advanced through the final pinch phase using an adaptive variable time-step to capture the fs and sub-mm scales of the kinetic instabilities involved in ion beam formation and neutron production. Validation assessments are being performed using a variety of different anode shapes, comparing against experimental measurements of neutron yield, neutron anisotropy, and ion beam production.
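    The adaptive variable time-step described above can be sketched as a simple limiter: advance with the more restrictive of a CFL-like bound on the fastest particle and a fixed fraction of the inverse plasma frequency. The criteria and constants below are illustrative assumptions, not LSP's actual step-control logic.

    ```python
    def adaptive_dt(dx, v_max, omega_p, cfl=0.5, wp_frac=0.1):
        """Return a PIC time-step limited by two common constraints.

        dx      : cell size (m), v_max : fastest particle speed (m/s),
        omega_p : plasma frequency (rad/s). The safety factors cfl and
        wp_frac are hypothetical defaults for illustration.
        """
        dt_cfl = cfl * dx / v_max        # particle crosses < cfl cells per step
        dt_wp = wp_frac / omega_p        # resolve plasma oscillations
        return min(dt_cfl, dt_wp)
    ```

    During run-down the fluid phase can take large steps, while in the pinch phase v_max and omega_p grow sharply, so this kind of limiter automatically drops dt to the fs scales the abstract mentions.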

  14. Toward validation of a 3-D plasma turbulence model using LAPD data

    NASA Astrophysics Data System (ADS)

    Umansky, M. V.

    2010-11-01

    Detailed results from a 3-D fluid simulation of plasma turbulence are compared with experimental data from the Large Plasma Device (LAPD) at UCLA. LAPD is a magnetized plasma column experiment with a high repetition rate, allowing detailed time- and space-resolved probe data on plasma turbulence and transport. The large amount of data allows a thorough comparison with the simulation results. For the observed drift-type modes, LAPD plasmas are strongly collisional (ω*/νei ≪ 1 and λei/L ≪ 1), providing justification for a fluid treatment. Accordingly, the model is based on reduced Braginskii equations and is implemented in the framework of the BOUT code, originally developed at LLNL for tokamak edge plasmas. Analysis of linear plasma instabilities shows that resistive drift modes, rotation-driven interchange modes, and Kelvin-Helmholtz modes can all be important in LAPD and have comparable frequencies and growth rates. In nonlinear simulations using measured LAPD density profiles, evolution of instabilities and self-generated zonal flows results in a saturated turbulent state. Comparisons of these simulations with measurements in LAPD plasmas reveal good agreement, in particular in the frequency spectrum, spatial correlation, and amplitude probability distribution function of density fluctuations. Also, consistent with the experiment, the simulations indicate a great deal of similarity between plasma turbulence in LAPD and some features of tokamak edge turbulence. Similar to tokamak edge plasmas, density transport appears to be predominantly carried by large particle-flux events. Despite the intermittent character of the calculated turbulence, as indicated by fluctuation statistics, the turbulent particle flux is consistent with a diffusive model with diffusion coefficient close to the Bohm value.
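    For reference, the Bohm value against which the turbulent diffusion coefficient is compared is D_B = k_B·T_e/(16·e·B). With the electron temperature in eV, k_B·T_e equals e·T_e, so the charge cancels and the formula reduces to T_e/(16·B). The sketch below uses made-up example numbers, not LAPD parameters.

    ```python
    def bohm_diffusion(T_e_eV: float, B_T: float) -> float:
        """Bohm diffusion coefficient D_B = k_B*T_e/(16*e*B) in m^2/s.

        T_e_eV is the electron temperature in eV and B_T the magnetic
        field in tesla; in these units D_B = T_e_eV / (16 * B).
        """
        return T_e_eV / (16.0 * B_T)
    ```

    For example, a 6 eV plasma in a 0.04 T field gives D_B = 6/(16·0.04) = 9.375 m²/s.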

  15. Evaluation of LLNL BSL-3 Maximum Credible Event Potential Consequence to the General Population and Surrounding Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, M.

    2010-08-16

    The purpose of this evaluation is to establish reproducibility of the analysis and consequence results to the general population and surrounding environment in the LLNL Biosafety Level 3 Facility Environmental Assessment (LLNL 2008).

  16. Natural Language Processing as a Discipline at LLNL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Firpo, M A

    The field of Natural Language Processing (NLP) is described as it applies to the needs of LLNL in handling free-text. The state of the practice is outlined with the emphasis placed on two specific aspects of NLP: Information Extraction and Discourse Integration. A brief description is included of the NLP applications currently being used at LLNL. A gap analysis provides a look at where the technology needs work in order to meet the needs of LLNL. Finally, recommendations are made to meet these needs.

  17. Branson: A Mini-App for Studying Parallel IMC, Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, Alex

    This code solves the gray thermal radiative transfer (TRT) equations in parallel using simple opacities and Cartesian meshes. Although Branson solves the TRT equations, it is not designed to model realistic radiation transport problems: Branson contains simple physics, does not have a multigroup treatment, and cannot use physical material data. The opacities are simple polynomials in temperature, and there is only a limited ability to specify complex geometries and sources. Branson was designed only to capture the computational demands of production IMC codes, especially in large parallel runs. It was also intended to foster collaboration with vendors, universities, and other DOE partners. Branson is similar in character to the neutron transport proxy-app Quicksilver from LLNL, which was recently open-sourced.
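    A polynomial-in-temperature opacity of the kind Branson uses in place of tabulated material data can be sketched in a few lines. The coefficients below are made up for illustration and are not Branson's defaults.

    ```python
    def opacity(T: float, coeffs=(1.0, 0.5, 0.01)) -> float:
        """Evaluate sigma(T) = c0 + c1*T + c2*T**2 via Horner's rule.

        coeffs lists the polynomial coefficients in increasing order of
        power; the values here are illustrative, not physical.
        """
        result = 0.0
        for c in reversed(coeffs):
            result = result * T + c
        return result
    ```

    Keeping the opacity this simple is deliberate: it preserves the memory-access and communication patterns of a production IMC code while removing any dependence on material data libraries.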

  18. Modeling and measurements of XRD spectra of extended solids under high pressure

    NASA Astrophysics Data System (ADS)

    Batyrev, I. G.; Coleman, S. P.; Stavrou, E.; Zaug, J. M.; Ciezak-Jenkins, J. A.

    2017-06-01

    We present results of evolutionary simulations based on density functional calculations of various extended solids, N-Si and N-H, using the variable- and fixed-concentration methods of USPEX. Structures predicted from the evolutionary simulations were analyzed in terms of thermodynamic stability and agreement with experimental X-ray diffraction spectra. Stability of the predicted systems was estimated from convex-hull plots. X-ray diffraction spectra were calculated using a virtual diffraction algorithm, which computes kinematic diffraction intensity in three-dimensional reciprocal space before reducing it to a two-theta line profile. Calculations of thousands of XRD spectra were used to search for the structures of extended solids at given pressures that best fit the experimental data according to experimental XRD peak position, peak intensity, and theoretically calculated enthalpy. Comparison of Raman and IR spectra calculated for the best-fit structures with available experimental data shows reasonable agreement for certain vibration modes. Part of this work was performed by LLNL, Contract DE-AC52-07NA27344. We thank the Joint DoD / DOE Munitions Technology Development Program, the HE C-II research program at LLNL and Advanced Light Source, supported by BES DOE, Contract No. DE-AC02-05CH112.
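    The reduction to a two-theta line profile ultimately rests on Bragg's law, 2·d·sin(θ) = λ: each reciprocal-space intensity contributes at an angular position set by its d-spacing. A minimal sketch of that mapping is below; the Cu K-alpha default wavelength is an illustrative assumption, not necessarily the wavelength used in the experiments.

    ```python
    import math

    def two_theta_deg(d_angstrom: float, wavelength_angstrom: float = 1.5406) -> float:
        """Return the diffraction angle 2*theta (degrees) for spacing d.

        Solves Bragg's law 2*d*sin(theta) = lambda for first-order
        reflection; raises ValueError if lambda > 2*d (no solution).
        """
        ratio = wavelength_angstrom / (2.0 * d_angstrom)
        if ratio > 1.0:
            raise ValueError("wavelength exceeds 2*d; no Bragg reflection")
        return 2.0 * math.degrees(math.asin(ratio))
    ```

    For example, a spacing equal to the wavelength gives sin(θ) = 0.5, i.e. 2θ = 60 degrees.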

  19. Interactive Photochemistry in Earth System Models to Assess Uncertainty in Ozone and Greenhouse Gases. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prather, Michael J.; Hsu, Juno; Nicolau, Alex

    Atmospheric chemistry controls the abundances, and hence climate forcing, of important greenhouse gases including N2O, CH4, HFCs, CFCs, and O3. Attributing climate change to human activities requires, at a minimum, accurate models of the chemistry and circulation of the atmosphere that relate emissions to abundances. This DOE-funded research provided realistic, yet computationally optimized and affordable, photochemical modules to the Community Earth System Model (CESM) that augment the CESM capability to explore the uncertainty in future stratospheric-tropospheric ozone, stratospheric circulation, and thus the lifetimes of chemically controlled greenhouse gases from climate simulations. To this end, we successfully implemented the Fast-J (radiation algorithm determining key chemical photolysis rates) and Linoz v3.0 (linearized photochemistry for interactive O3, N2O, NOy, and CH4) packages in LLNL-CESM and for the first time demonstrated how a change in the O2 photolysis rate within its uncertainty range can significantly impact the stratospheric climate and ozone abundances. On the UCI side, this proposal also helped LLNL develop a CAM-Superfast Chemistry model that was implemented for the IPCC AR5 and contributed chemical-climate simulations to CMIP5.

  20. Biological and Chemical Security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitch, P J

    2002-12-19

    The LLNL Chemical & Biological National Security Program (CBNP) provides science, technology, and integrated systems for chemical and biological security. Our approach is to develop and field advanced strategies that dramatically improve the nation's capabilities to prevent, prepare for, detect, and respond to terrorist use of chemical or biological weapons. Recent events show the importance of civilian defense against terrorism. The 1995 nerve gas attack in Tokyo's subway served to catalyze and focus the early LLNL program on civilian counterterrorism. In the same year, LLNL began CBNP using Laboratory-Directed R&D investments and a focus on biodetection. The Nunn-Lugar-Domenici Defense Against Weapons of Mass Destruction Act, passed in 1996, initiated a number of U.S. nonproliferation and counterterrorism programs including the DOE (now NNSA) Chemical and Biological Nonproliferation Program (also known as CBNP). In 2002, the Department of Homeland Security was formed. The NNSA CBNP and many of the LLNL CBNP activities are being transferred as the new Department becomes operational. LLNL has a long history in national security, including nonproliferation of weapons of mass destruction. In biology, LLNL had a key role in starting and implementing the Human Genome Project and, more recently, the Microbial Genome Program. LLNL has over 1,000 scientists and engineers with relevant expertise in biology, chemistry, decontamination, instrumentation, microtechnologies, atmospheric modeling, and field experimentation. Over 150 LLNL scientists and engineers work full time on chemical and biological national security projects.

  1. Status of BOUT fluid turbulence code: improvements and verification

    NASA Astrophysics Data System (ADS)

    Umansky, M. V.; Lodestro, L. L.; Xu, X. Q.

    2006-10-01

    BOUT is an electromagnetic fluid turbulence code for tokamak edge plasma [1]. BOUT performs time integration of reduced Braginskii plasma fluid equations, using spatial discretization in realistic geometry and employing the standard ODE integration package PVODE. BOUT has been applied to several tokamak experiments, and in some cases the calculated spectra of turbulent fluctuations compared favorably to experimental data. On the other hand, the desire to better understand the code results and to gain more confidence in it motivated investing effort in rigorous verification of BOUT. In parallel with the testing, the code underwent substantial modification, mainly to improve its readability and the tractability of physical terms, with some algorithmic improvements as well. In the verification process, a series of linear and nonlinear test problems was applied to BOUT, targeting different subgroups of physical terms. The tests include reproducing basic electrostatic and electromagnetic plasma modes in simplified geometry, axisymmetric benchmarks against the 2D edge code UEDGE in real divertor geometry, and neutral fluid benchmarks against the hydrodynamic code LCPFCT. After completion of the testing, the new version of the code is being applied to actual tokamak edge turbulence problems, and the results will be presented. [1] X. Q. Xu et al., Contr. Plas. Phys., 36, 158 (1998). *Work performed for USDOE by Univ. Calif. LLNL under contract W-7405-ENG-48.

  2. Thin Shell Model for NIF capsule stagnation studies

    NASA Astrophysics Data System (ADS)

    Hammer, J. H.; Buchoff, M.; Brandon, S.; Field, J. E.; Gaffney, J.; Kritcher, A.; Nora, R. C.; Peterson, J. L.; Spears, B.; Springer, P. T.

    2015-11-01

    We adapt the thin shell model of Ott et al. to asymmetric ICF capsule implosions on NIF. Through much of an implosion, the shell aspect ratio is large, so the thin shell approximation is well satisfied. Asymmetric pressure drive is applied using an analytic form for ablation pressure as a function of the x-ray flux, as well as time-dependent 3D drive asymmetry from hohlraum calculations. Since deviations from a sphere are small through peak velocity, we linearize the equations, decompose them by spherical harmonics, and solve ODEs for the coefficients. The model gives the shell position, velocity, and areal mass variations at the time of peak velocity, near 250 microns radius. The variables are used to initialize 3D rad-hydro calculations with the HYDRA and ARES codes. At link time the cold fuel shell and ablator are each characterized by a density, adiabat, and mass. The thickness, position, and velocity of each point are taken from the thin shell model. The interior of the shell is filled with a uniform gas density and temperature consistent with the (3/2)PV energy found from 1D rad-hydro calculations. 3D linked simulations compare favorably with integrated simulations of the entire implosion. By generating synthetic diagnostic data, the model offers a method for quickly testing hypothetical sources of asymmetry and comparing with experiment. Prepared by LLNL under Contract DE-AC52-07NA27344.
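    The linearize-and-decompose step reduces each spherical-harmonic coefficient of the shell perturbation to its own ODE driven by the corresponding pressure-asymmetry coefficient. As a toy illustration only (a constant drive term, not the Ott thin-shell equations), a mode amplitude accelerated by a fixed pressure asymmetry p_l acting on areal mass rho_d can be time-stepped exactly:

    ```python
    def integrate_mode(p_l, rho_d, t_end, dt=1e-3):
        """Integrate a'' = p_l / rho_d from rest; return a(t_end).

        The kinematic update below is exact for a piecewise-constant
        acceleration, so the result matches 0.5*(p_l/rho_d)*t_end**2.
        All quantities are illustrative, dimensionless stand-ins.
        """
        a, v, t = 0.0, 0.0, 0.0
        accel = p_l / rho_d  # constant acceleration in this toy model
        while t < t_end - 1e-12:
            a += v * dt + 0.5 * accel * dt * dt
            v += accel * dt
            t += dt
        return a
    ```

    In the real model the drive and restoring terms vary in time, so a standard ODE integrator replaces this closed-form stepping, but the per-mode structure is the same.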

  3. IGPP-LLNL 1998 annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryerson, F J; Cook, K H; Tweed, J

    1999-11-19

    The Institute of Geophysics and Planetary Physics (IGPP) is a Multicampus Research Unit of the University of California (UC). IGPP was founded in 1946 at UC Los Angeles with a charter to further research in the earth and planetary sciences and related fields. The Institute now has branches at UC campuses in Los Angeles, San Diego, and Riverside, and at Los Alamos and Lawrence Livermore national laboratories. The University-wide IGPP has played an important role in establishing interdisciplinary research in the earth and planetary sciences. For example, IGPP was instrumental in founding the fields of physical oceanography and space physics,more » which at the time fell between the cracks of established university departments. Because of its multicampus orientation, IGPP has sponsored important interinstitutional consortia in the earth and planetary sciences. Each of the five branches has a somewhat different intellectual emphasis as a result of the interplay between strengths of campus departments and Laboratory programs. The IGPP branch at Lawrence Livermore National Laboratory (LLNL) was approved by the Regents of the University of California in 1982. IGPP-LLNL emphasizes research in tectonics, geochemistry, and astrophysics. It provides a venue for studying the fundamental aspects of these fields, thereby complementing LLNL programs that pursue applications of these disciplines in national security and energy research. IGPP-LLNL is directed by Charles Alcock and was originally organized into three centers: Geosciences, stressing seismology; High-Pressure Physics, stressing experiments using the two-stage light-gas gun at LLNL; and Astrophysics, stressing theoretical and computational astrophysics. In 1994, the activities of the Center for High-Pressure Physics were merged with those of the Center for Geosciences. The Center for Geosciences, headed by Frederick Ryerson, focuses on research in geophysics and geochemistry. 
The Astrophysics Research Center, headed by Kem Cook, provides a home for theoretical and observational astrophysics and serves as an interface with the Physics Directorate's astrophysics efforts. The IGPP branch at LLNL (as well as the branch at Los Alamos) also facilitates scientific collaborations between researchers at the UC campuses and those at the national laboratories in areas related to earth science, planetary science, and astrophysics. It does this by sponsoring the University Collaborative Research Program (UCRP), which provides funds to UC campus scientists for joint research projects with LLNL. Additional information regarding IGPP-LLNL projects and people may be found at http://wwwigpp.llnl.gov/. The goals of the UCRP are to enrich research opportunities for UC campus scientists by making available to them some of LLNL's unique facilities and expertise, and to broaden the scientific program at LLNL through collaborative or interdisciplinary work with UC campus researchers. UCRP funds (provided jointly by the Regents of the University of California and by the Director of LLNL) are awarded annually on the basis of brief proposals, which are reviewed by a committee of scientists from UC campuses, LLNL programs, and external universities and research organizations. Typical annual funding for a collaborative research project ranges from $5,000 to $30,000. Funds are used for a variety of purposes, such as salary support for UC graduate students, postdoctoral fellows, and faculty; and costs for experimental facilities. A statistical overview of IGPP-LLNL's UCRP (colloquially known as the mini-grant program) is presented in Figures 1 and 2. Figure 1 shows the distribution of UCRP awards among the UC campuses, by total amount awarded and by number of proposals funded. Figure 2 shows the distribution of awards by center.« less

  4. Computational mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raboin, P J

    1998-01-01

    The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for ''Springback Predictability'' and with the Federal Aviation Administration (FAA) for the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  5. ICP-MS Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carman, April J.; Eiden, Gregory C.

    2014-11-01

    This short document describes the materials that will be transmitted to LLNL and DNN HQ regarding the ICP-MS Workshop held at PNNL June 17-19. Its goal is to pass on to LLNL information about the planning and preparations for the workshop at PNNL, in preparation for the SIMS workshop at LLNL.

  6. Solutions for Digital Video Transmission Technology Final Report CRADA No. TC02068.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, A. T.; Rivers, W.

    This project aimed at developing software for seismic data processing based on the Geotool code developed by the American company Multimax, Inc. The Geotool code was written in the early 1990s for the UNIX platform. Under Project #2821, functions of the old Geotool code were transferred into a commercial version for the Microsoft XP and Vista platforms, with the addition of new capabilities for visualization and data processing. The new version, Geotool+, was implemented using the up-to-date tool Microsoft Visual Studio 2005 and uses capabilities of the .NET platform. C++ was selected as the main programming language for Geotool+. The two-year project was extended by six months, and funding increased from $600,000 to $670,000. All tasks were successfully completed and all deliverables were met, even though both the industrial partner and the LLNL principal investigator left the project before its final report.

  7. Unified EDGE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2007-06-18

    UEDGE is an interactive suite of physics packages using the Python or BASIS scripting systems. The plasma is described by time-dependent 2D plasma fluid equations that include equations for density, velocity, ion temperature, electron temperature, electrostatic potential, and gas density in the edge region of a magnetic fusion energy confinement device. Slab, cylindrical, and toroidal geometries are allowed, and closed and open magnetic field-line regions are included. Classical transport is assumed along magnetic field lines, and anomalous transport is assumed across field lines. Multi-charge-state impurities can be included with the corresponding line-radiation energy loss. Although UEDGE is written in Fortran, for efficient execution and analysis of results it utilizes either Python or BASIS scripting shells. Python is easily available for many platforms (http://www.Python.org/). The features and availability of BASIS are described in "Basis Manual Set" by P.F. Dubois, Z.C. Motteler, et al., Lawrence Livermore National Laboratory report UCRL-MA-118541, June 2002, and http://basis.llnl.gov. BASIS has been reviewed and released by LLNL for unlimited distribution. The Python version utilizes PYBASIS scripts developed by D.P. Grote, LLNL. The Python version also uses MPPL code and a MAC Perl script, available from the public-domain BASIS source above. The Forthon version of UEDGE uses the same source files, but utilizes Forthon to produce a Python-compatible source. Forthon has been developed by D.P. Grote at LBL (see http://hifweb.lbl.gov/Forthon/ and Grote et al. in the references below), and it is freely available. The graphics can be performed by any package importable to Python, such as PYGIST.
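    The parallel/perpendicular transport split described above (classical along field lines, anomalous across them) amounts to an anisotropic diffusion tensor aligned with the magnetic field. A minimal 2D sketch follows; in practice the two coefficients come from the classical and anomalous transport models, not the numbers used here.

    ```python
    def diffusion_tensor(bx, by, D_par, D_perp):
        """2x2 diffusion tensor D = D_perp*I + (D_par - D_perp)*b b^T.

        (bx, by) gives the magnetic field direction (normalized here);
        D_par and D_perp are the parallel and perpendicular diffusivities.
        """
        norm = (bx * bx + by * by) ** 0.5
        bx, by = bx / norm, by / norm
        d = D_par - D_perp
        return [
            [D_perp + d * bx * bx, d * bx * by],
            [d * bx * by, D_perp + d * by * by],
        ]
    ```

    With the field along x, for example, the tensor is diagonal with D_par in the parallel direction and D_perp across it.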

  8. Review of LLNL Mixed Waste Streams for the Application of Potential Waste Reduction Controls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belue, A; Fischer, R P

    2007-01-08

    In July 2004, LLNL adopted the International Standard ISO 14001 as a Work Smart Standard in lieu of DOE Order 450.1. In support of this new requirement, the Director issued a new environmental policy that was documented in Section 3.0 of Document 1.2, ''ES&H Policies of LLNL'', in the ES&H Manual. In recent years the Environmental Management System (EMS) process has become formalized as LLNL adopted ISO 14001 as part of the contract under which the laboratory is operated for the Department of Energy (DOE). On May 9, 2005, LLNL revised its Integrated Safety Management System Description to enhance existing environmental requirements to meet ISO 14001. Effective October 1, 2005, each new project or activity is required to be evaluated from an environmental aspect, particularly if a potential exists for significant environmental impacts. Authorizing organizations are required to consider the management of all environmental aspects, the applicable regulatory requirements, and reasonable actions that can be taken to reduce negative environmental impacts. During 2006, LLNL worked to implement the corrective actions addressing the deficiencies identified in the DOE/LSO audit. LLNL has begun to update the present EMS to meet the requirements of ISO 14001:2004. The EMS commits LLNL--and each employee--to responsible stewardship of all the environmental resources in our care. The generation of mixed radioactive waste was identified as a significant environmental aspect. Mixed waste for the purposes of this report is defined as waste materials containing both hazardous chemical and radioactive constituents. Significant environmental aspects require that an Environmental Management Plan (EMP) be developed. The objective of the EMP developed for mixed waste (EMP-005) is to evaluate options for reducing the amount of mixed waste generated. This document presents the findings of the evaluation of mixed waste generated at LLNL and a proposed plan for reduction.

  9. Criticality Safety Evaluation of the LLNL Inherently Safe Subcritical Assembly (ISSA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Percher, Catherine

    2012-06-19

    The LLNL Nuclear Criticality Safety Division has developed a training center to illustrate criticality safety and reactor physics concepts through hands-on experimental training. The experimental assembly, the Inherently Safe Subcritical Assembly (ISSA), uses surplus highly enriched research reactor fuel configured in a water tank. The training activities will be conducted by LLNL following the requirements of an Integration Work Sheet (IWS) and associated Safety Plan. Students will be allowed to handle the fissile material under the supervision of LLNL instructors. This report provides the technical criticality safety basis for instructional operations with the ISSA experimental assembly.

  10. LSST Operations Simulator

    NASA Astrophysics Data System (ADS)

    Cook, K. H.; Delgado, F.; Miller, M.; Saha, A.; Allsman, R.; Pinto, P.; Gee, P. A.

    2005-12-01

    We have developed an operations simulator for LSST and used it to explore the design and operations parameter space for this large-etendue telescope and its ten-year survey mission. The design is modular, with separate science programs coded in separate modules. There is a sophisticated telescope module with all motions parametrized for ease of testing different telescope capabilities, e.g., the effect of the acceleration capabilities of various motors on science output. Sky brightness is calculated as a function of moon phase and separation. A sophisticated exposure time calculator has been developed for LSST and is being incorporated into the simulator to allow specification of S/N requirements. All important parameters for the telescope, the site, and the science programs are easily accessible in configuration files. Seeing and cloud data from the three candidate LSST sites are used for our simulations. The simulator has two broad categories of science proposals: sky coverage and transient events. Sky-coverage proposals base their observing priorities on a required number of observations for each field in a particular filter under specified conditions (maximum seeing, sky brightness, etc.); one such proposal is used for a weak-lensing investigation. Transient proposals are highly configurable: a transient proposal can require sequential, multiple exposures in various filters with a specified sequence of filters, and require a particular cadence for multiple revisits to complete an observation sequence. Each science proposal ranks potential observations based upon the internal logic of that proposal. We present the results of a variety of mixed science-program observing simulations, showing how varied programs can be carried out simultaneously, with many observations serving multiple science goals. The simulator has shown that LSST can carry out its multiple missions under a variety of conditions. KHC's work was performed under the auspices of the US DOE, NNSA by the Univ. of California, LLNL under contract No. W-7405-Eng-48.
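    The proposal-ranking idea described in this abstract can be sketched in a few lines: each science proposal scores candidate observations by its own internal logic, and the scheduler greedily selects the highest-ranked one. This is a hypothetical illustration, not the simulator's actual code; the class names, ranking formula, and selection rule are all invented.

    ```python
    # Toy sketch of proposal-based observation ranking (all names and
    # ranking rules here are illustrative assumptions, not LSST code).
    from dataclasses import dataclass

    @dataclass
    class Field:
        field_id: int
        needed: int          # observations still required in this filter
        seeing: float        # current seeing at this field (arcsec)

    @dataclass
    class SkyCoverageProposal:
        max_seeing: float    # configured observing-condition limit

        def rank(self, f: Field) -> float:
            # Fields that still need visits and meet conditions rank highest.
            if f.seeing > self.max_seeing or f.needed == 0:
                return 0.0
            return f.needed / (1.0 + f.seeing)

    def pick_next(proposals, fields):
        """Greedy selection: best (rank, field) pair across all proposals."""
        best = max(
            ((p.rank(f), f) for p in proposals for f in fields),
            key=lambda t: t[0],
        )
        return best if best[0] > 0 else None

    fields = [Field(1, needed=3, seeing=0.7),
              Field(2, needed=5, seeing=1.4),
              Field(3, needed=0, seeing=0.5)]
    prop = SkyCoverageProposal(max_seeing=1.2)
    rank, chosen = pick_next([prop], fields)
    print(chosen.field_id)  # field 1: field 2 fails the seeing cut, field 3 needs nothing
    ```

    In the real simulator, many proposals rank the same candidate simultaneously, so one exposure can serve several science goals at once; the toy `max` over all (proposal, field) pairs stands in for that merged ranking.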

  11. The Numerical Electromagnetics Code (NEC) - A Brief History

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burke, G J; Miller, E K; Poggio, A J

    The Numerical Electromagnetics Code, NEC as it is commonly known, continues to be one of the more widely used antenna modeling codes in existence. With several versions in use that reflect different levels of capability and availability, there are now 450 copies of NEC4 and 250 copies of NEC3 that have been distributed by Lawrence Livermore National Laboratory to a limited class of qualified recipients, and several hundred copies of NEC2 that had a recorded distribution by LLNL. These numbers do not account for numerous copies (perhaps thousands) that were acquired through other means, capitalizing on the open source code, the absence of distribution controls prior to NEC3, and the availability of versions on the Internet. In this paper we briefly review the history of the code, which is concisely displayed in Figure 1. We show how it capitalized on the research of prominent contributors in the early days of computational electromagnetics, how a combination of events led to the tri-service-supported code development program that ultimately produced NEC, and how it evolved into the present-day product. The authors apologize that space limitations do not allow us to provide a list of references or to acknowledge the numerous contributors to the code, both of which can be found in the code documents.

  12. ARC-2010-ACD10-0020-034

    NASA Image and Video Library

    2010-02-10

    Lawrence Livermore National Laboratory (LLNL), Navistar, and the Department of Energy conduct tests in the NASA Ames National Full-Scale Aerodynamics Complex 80- by 120-foot wind tunnel. The LLNL project is aimed at aerodynamic truck and trailer devices that can reduce fuel consumption at highway speed by 10 percent. LLNL's test piece is being installed on the truck.

  13. Update On the Status of the FLUKA Monte Carlo Transport Code*

    NASA Technical Reports Server (NTRS)

    Ferrari, A.; Lorenzo-Sentis, M.; Roesler, S.; Smirnov, G.; Sommerer, F.; Theis, C.; Vlachoudis, V.; Carboni, M.; Mostacci, A.; Pelliccioni, M.

    2006-01-01

    The FLUKA Monte Carlo transport code is a well-known simulation tool in High Energy Physics. FLUKA is a dynamic tool in the sense that it is being continually updated and improved by the authors. We review the progress achieved since the last CHEP Conference on the physics models, some technical improvements to the code, and some recent applications. From the point of view of the physics, improvements have been made with the extension of PEANUT to higher energies for p, n, pi, pbar/nbar and for nbars down to the lowest energies; the addition of the online capability to evolve radioactive products and obtain subsequent dose rates; and an upgraded treatment of EM interactions that eliminates the need to separately prepare preprocessed files. A new coherent photon scattering model, an updated treatment of the photoelectric effect, an improved pair production model, and new photon cross sections from the LLNL Cullen database have been implemented. In the field of nucleus-nucleus interactions, the electromagnetic dissociation of heavy ions has been added, along with the extension of the interaction models for some nuclide pairs to energies below 100 MeV/A using the BME approach, as well as the development of an improved QMD model for intermediate energies. Both DPMJET 2.53 and 3 remain available, along with rQMD 2.4, for heavy-ion interactions above 100 MeV/A. Technical improvements include the ability to use parentheses in setting up the combinatorial geometry, the introduction of pre-processor directives in the input stream, a new random number generator with full 64-bit randomness, and new routines for mathematical special functions (adapted from SLATEC). Finally, work is progressing on the deployment of a user-friendly GUI input interface as well as a CAD-like geometry creation and visualization tool.
On the application front, FLUKA has been used to extensively evaluate the potential space radiation effects on astronauts for future deep space missions, the activation dose for beam target areas, dose calculations for radiation therapy as well as being adapted for use in the simulation of events in the ALICE detector at the LHC.

  14. Alternate Operating Scenarios for NDCX-II

    NASA Astrophysics Data System (ADS)

    Sharp, W. M.; Friedman, A.; Grote, D. P.; Cohen, R. H.; Lund, S. M.; Vay, J.-L.; Waldron, W. L.; Yeun, A.

    2011-10-01

    NDCX-II is an accelerator facility being built at LBNL to study ion-heated warm dense matter and aspects of ion-driven targets for inertial-fusion energy. The baseline design calls for using twelve induction cells to accelerate 40 nC of Li+ ions to 1.2 MeV. During commissioning, though, we plan to extend the source lifetime by extracting less total charge. For operational flexibility, the option of using a helium plasma source is also being investigated. Over time, we expect that NDCX-II will be upgraded to substantially higher energies, necessitating the use of heavier ions to keep a suitable deposition range in targets. Each of these options requires development of an alternate acceleration schedule and the associated transverse focusing. The schedules here are first worked out with ASP, a fast-running 1-D particle-in-cell code; 2-D and 3-D Warp simulations are then used to verify the 1-D results and to design the transverse focusing. Work performed under the auspices of the U.S. Department of Energy by LLNL under Contract DE-AC52-07NA27344 and by LBNL under Contract DE-AC03-76SF00098.

  15. Emergency Response Capability Baseline Needs Assessment - Requirements Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharry, John A.

    This document was prepared by John A. Sharry, LLNL Fire Marshal and LLNL Division Leader for Fire Protection and reviewed by LLNL Emergency Management Department Head James Colson. The document follows and expands upon the format and contents of the DOE Model Fire Protection Baseline Capabilities Assessment document contained on the DOE Fire Protection Web Site, but only addresses emergency response.

  16. Lawrence Livermore National Laboratory Environmental Report 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosene, Crystal

    The purposes of the Environmental Report 2016 are to record LLNL’s compliance with environmental standards and requirements, describe LLNL’s environmental protection and remediation programs, and present the results of environmental monitoring. Specifically, the report discusses LLNL’s EMS; describes significant accomplishments in pollution prevention; presents the results of air, water, vegetation, and foodstuff monitoring; reports radiological doses from LLNL operations; summarizes LLNL’s activities involving special status wildlife, plants, and habitats; and describes the progress LLNL has made in remediating groundwater contamination. Environmental monitoring at LLNL, including analysis of samples and data, is conducted according to documented standard operating procedures. Duplicate samples are collected and analytical results are reviewed and compared to internal acceptance standards. This report is prepared for DOE by LLNL’s Environmental Functional Area (EFA). Submittal of the report satisfies requirements under DOE Order 231.1B, “Environment, Safety and Health Reporting,” and DOE Order 458.1, “Radiation Protection of the Public and Environment.” The report is distributed in electronic form and is available to the public at https://saer.llnl.gov/, the website for the LLNL annual environmental report. Previous LLNL annual environmental reports beginning with 1994 are also on the website.

  17. Numerical Simulation of Quarry Blast Sources

    DTIC Science & Technology

    1994-01-01

    [Abstract not available. The record contains only report-form fragments, including references to Numerical Simulation of Quarry Blasts, LLNL Report UCRL-JC-109245, and to Wuster, J. (1993), Discrimination of Chemical Explosions and Earthquakes in Central Europe - A Case Study, Bull. Seism.]

  18. Detectability of Wellbore CO2 Leakage using the Magnetotelluric Method

    NASA Astrophysics Data System (ADS)

    Yang, X.; Buscheck, T. A.; Mansoor, K.; Carroll, S.

    2016-12-01

    We assessed the effectiveness of the magnetotelluric (MT) method in detecting CO2 and brine leakage through a wellbore, which penetrates a CO2 storage reservoir, into overlying aquifers, 0 to 1720 m in depth, in support of the USDOE National Risk Assessment Partnership (NRAP) monitoring program. Synthetic datasets based on the Kimberlina site in the southern San Joaquin Basin, California were created using CO2 storage reservoir models, wellbore leakage models, and groundwater/geochemical models of the overlying aquifers. The species concentrations simulated with the groundwater/geochemical models were converted into bulk electrical conductivity (EC) distributions as the MT model input. Brine and CO2 leakage into the overlying aquifers increases ion concentrations, and thus results in an EC increase, which may be detected by the MT method. Our objective was to estimate and maximize the probability of leakage detection using the MT method. The MT method is an electromagnetic geophysical technique that images the subsurface EC distribution by measuring natural electric and magnetic fields in the frequency range from 0.01 Hz to 1 kHz with sensors on the ground surface. The ModEM software was used to predict electromagnetic responses from brine and CO2 leakage and to invert synthetic MT data for recovery of subsurface conductivity distribution. We are in the process of building 1000 simulations for ranges of permeability, leakage flux, and hydraulic gradient to study leakage detectability and to develop an optimization method to answer when, where and how an MT monitoring system should be deployed to maximize the probability of leakage detection. This work was sponsored by the USDOE Fossil Energy, National Energy Technology Laboratory, managed by Traci Rodosta and Andrea McNemar. This work was performed under the auspices of the USDOE by LLNL under contract DE-AC52-07NA27344. LLNL IM release number is LLNL-ABS-699276.
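    The ensemble approach described above (many leakage scenarios, each checked against a detection criterion, with detection probability taken as the detected fraction) can be sketched as a toy Monte Carlo. Everything here is an invented stand-in: the forward model, noise level, threshold, and flux values are illustrative assumptions, not values from the NRAP study or ModEM.

    ```python
    # Toy ensemble-based leakage detectability estimate: a scenario counts
    # as "detected" when its simulated EC anomaly exceeds a noise-based
    # threshold. All numbers are illustrative, not from the Kimberlina models.
    import random

    random.seed(42)

    def ec_anomaly(leak_flux):
        # Stand-in for a full groundwater/geochemical + MT forward model:
        # a larger brine/CO2 flux produces a larger bulk-EC increase, plus
        # measurement noise.
        return 0.05 * leak_flux + random.gauss(0.0, 0.01)

    def detection_probability(fluxes, threshold=0.02, trials=1000):
        detected = 0
        for _ in range(trials):
            flux = random.choice(fluxes)     # sample a leakage scenario
            if ec_anomaly(flux) > threshold:
                detected += 1
        return detected / trials

    # Hypothetical ensemble of leakage fluxes (arbitrary units).
    p = detection_probability([0.1, 0.5, 1.0, 2.0])
    print(f"estimated detection probability: {p:.2f}")
    ```

    Optimizing the monitoring design then amounts to repeating this estimate while varying survey parameters (station placement, timing) and keeping the configuration that maximizes the detected fraction.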

  19. Report on the Threatened Valley Elderberry Longhorn Beetle and its Elderberry Food Plant at the Lawrence Livermore National Laboratory--Site 300

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnold, Ph.D., R A; Woollett, J

    2004-11-16

    This report describes the results of an entomological survey in 2002 to determine the presence of the federally listed, threatened Valley Elderberry Longhorn Beetle or ''VELB'' (Desmocerus californicus dimorphus: Coleoptera, Cerambycidae) and its elderberry food plant (Sambucus mexicana: Caprifoliaceae) on the Lawrence Livermore National Laboratory's (LLNL) Experimental Test Site, known as Site 300. In addition, an area located immediately southeast of Site 300, which is owned and managed by the California Department of Fish and Game (CDFG) but secured by LLNL, was also included in this survey. This report refers to the survey areas as the LLNL-Site 300 and the CDFG site. The 2002 survey included mapping the locations of elderberry plants that were observed, using a global positioning system (GPS) to obtain positional coordinates for every elderberry plant at Site 300. In addition, observations of VELB adults and signs of their infestation on elderberry plants were also mapped using GPS technology. LLNL requested information on the VELB and its elderberry food plants to update earlier information that had been collected in 1991 (Arnold 1991) as part of the 1992 EIS/EIR for continued operation of LLNL. No VELB adults were observed as part of this prior survey. The findings of the 2002 survey reported herein will be used by LLNL as it updates the expected 2004 Environmental Impact Statement for ongoing operations at LLNL, including Site 300.

  20. Hyperspectral Sensors Final Report CRADA No. TC02173.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Priest, R. E.; Sauvageau, J. E.

    This was a collaborative effort between Lawrence Livermore National Security, LLC, as manager and operator of Lawrence Livermore National Laboratory (LLNL), and Science Applications International Corporation (SAIC), National Security Space Operations/SRBU, to develop longwave infrared (LWIR) hyperspectral imaging (HSI) sensors for airborne, and potentially ground and space, platforms. LLNL has designed and developed LWIR HSI sensors since 1995. The current generation of these sensors has applications for users within the U.S. Department of Defense and the Intelligence Community. Users need multiple copies provided by commercial industry. To gain the most benefit from the U.S. Government’s prior investments in LWIR HSI sensors developed at LLNL, transfer of technology and know-how from LLNL HSI experts to commercial industry was needed. The overarching purpose of the CRADA project was to facilitate the transfer of the necessary technology from LLNL to SAIC, thereby allowing the U.S. Government to procure LWIR HSI sensors from this company.

  1. Lawrence Livermore National Laboratory environmental report for 1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sims, J.M.; Surano, K.A.; Lamson, K.C.

    1990-01-01

    This report documents the results of the Environmental Monitoring Program at the Lawrence Livermore National Laboratory (LLNL) and presents summary information about environmental compliance for 1990. To evaluate the effect of LLNL operations on the local environment, measurements of direct radiation and of a variety of radionuclides and chemical compounds in ambient air, soil, sewage effluent, surface water, groundwater, vegetation, and foodstuff were made at both the Livermore site and at Site 300 nearby. LLNL's compliance with all applicable guides, standards, and limits for radiological and nonradiological emissions to the environment was evaluated. Aside from an August 13 observation of silver concentrations slightly above guidelines for discharges to the sanitary sewer, all the monitoring data demonstrated LLNL compliance with environmental laws and regulations governing emission and discharge of materials to the environment. In addition, the monitoring data demonstrated that the environmental impacts of LLNL are minimal and pose no threat to the public or to the environment. 114 refs., 46 figs., 79 tabs.

  2. Feasibility of Wide-Area Decontamination of Bacillus anthracis Spores Using a Germination-Lysis Approach

    DTIC Science & Technology

    2011-11-16

    [Slide fragments from a presentation at the 2011 CBD S&T Conference, November 16, 2011 (LLNL-PRES-508394, Lawrence Livermore National Laboratory). Recoverable content notes historical decontamination cases, Gruinard Island (5% formaldehyde) and the Sverdlovsk release (method unknown, but washing, chloramines, and soil disposal believed to have been used), and cites disinfectants achieving a >6-log reduction on materials (EPA, 2010a,b; Wood et al., 2011).]

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murray, Robert C.

    Following the January 1980 earthquake that was felt at Lawrence Livermore National Laboratory (LLNL), a network of strong-motion accelerographs was installed at LLNL. Prior to the 1980 earthquake, there were no accelerographs installed. The ground motion from the 1980 earthquake was estimated from USGS instruments around the Laboratory to be between 0.2 and 0.3 g horizontal peak ground acceleration. These instruments were located at the Veterans Hospital, 5 miles southwest of LLNL, and in San Ramon, about 12 miles west of LLNL. In 2011, the Department of Energy (DOE) requested to know the status of our seismic instruments. We conducted a survey of our instrumentation systems and responded to DOE in a letter. During this survey, it was found that the recorders in Buildings 111 and 332 were not operational. The instruments on Nova had been removed, and only three of the 10 NIF instruments installed in 2005 were operational (two were damaged and five had been removed from operation at the request of the program). After the survey, it was clear that the site seismic instrumentation had degraded substantially and would benefit from an overhaul and more attention to ongoing maintenance. LLNL management decided to update the LLNL seismic instrumentation system. The updated system is documented in this report.

  4. 2004 Environmental Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Althouse, P E; Bertoldo, N A; Brown, R A

    2005-09-28

    The Lawrence Livermore National Laboratory (LLNL) annual Environmental Report, prepared for the Department of Energy (DOE) and made available to the public, presents summary environmental data that characterize site environmental management performance, summarizes environmental occurrences and responses reported during the calendar year, confirms compliance with environmental standards and requirements, and highlights significant programs and efforts. By explaining the results of effluent and environmental monitoring, describing environmental performance indicators and performance measure programs, and assessing the impact of Laboratory operations on the environment and the public, the report also demonstrates LLNL's continuing commitment to minimize any potentially adverse impact of its operations. The combination of environmental and effluent monitoring, source characterization, and dose assessment showed that radiological doses to the public caused by LLNL operations in 2004 were less than 0.26% of regulatory standards and more than 11,000 times smaller than the dose from natural background. Analytical results and evaluations generally showed continuing low levels of most contaminants; remediation efforts further reduced the concentrations of contaminants of concern in groundwater and soil vapor. In addition, LLNL's extensive environmental compliance activities related to water, air, endangered species, waste, wastewater, and waste reduction controlled or reduced LLNL's effects on the environment. LLNL's environmental program clearly demonstrates a commitment to protecting the environment from operational impacts.

  5. Dynamics of Exploding Plasma Within a Magnetized Plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dimonte, G; Dipeso, G; Hewett, D

    2002-02-01

    This memo describes several possible laboratory experiments on the dynamics of an exploding plasma in a background magnetized plasma. These are interesting scientifically, and the results are applicable to energetic explosions in the earth's ionosphere (DOE Campaign 7 at LLNL). These proposed experiments are difficult and can only be performed in the new LAPD device at UCLA. The purpose of these experiments would be to test numerical simulations, theory, and reduced models for systems performance codes. The experiments are designed to investigate the effect of the background plasma on (1) the maximum diamagnetic bubble radius given by Eq. 9; and (2) the Alfven wave radiation efficiency produced by the induced current J_A (Eqs. 10-12). These experiments involve measuring the bubble radius using a fast gated optical imager as in Ref. [1], and the Alfven wave profile and intensity as in Ref. [2], for different values of the exploding plasma energy, background plasma density and temperature, and background magnetic field. These experiments extend the previously successful experiments [2] on Alfven wave coupling. We anticipate that the proposed experiments would require 1-2 weeks of time on the LAPD. We would perform PIC simulations in support of these experiments in order to validate the codes. Once validated, the PIC simulations could then be extended to realistic ionospheric conditions with various explosion sizes and altitudes. In addition to the Alfven wave coupling, we are interested in the magnetic containment and transport of the exploding ''debris'' plasma, to see if the shorting of the radial electric field in the magnetic bubble would allow the ions to propagate further. This has important implications for an ionospheric explosion because it defines the satellite damage region.
    In these experiments, we would field fast gated optical cameras to obtain images of the plasma expansion, which could then be correlated with magnetic probe measurements. In this regard, it would be most helpful to have a more powerful laser (more than 10 J) in order to increase the extent of the magnetic bubble.

  6. Development of a Dynamic Time Sharing Scheduled Environment Final Report CRADA No. TC-824-94E

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jette, M.; Caliga, D.

    Massively parallel computers, such as the Cray T3D, have historically supported resource sharing solely through space sharing, in which multiple problems are solved by executing them on distinct processors. This project developed a dynamic time- and space-sharing scheduler to achieve greater interactivity and throughput than could be achieved with space sharing alone. CRI and LLNL worked together on the design, testing, and review aspects of this project. There were separate software deliverables: CRI implemented a general-purpose scheduling system per the design specifications, and LLNL ported its local gang scheduler software to the LLNL Cray T3D. In this approach, processors are allocated simultaneously to all components of a parallel program (in a “gang”). Program execution is preempted as needed to provide for interactivity. Programs are also relocated to different processors as needed to efficiently pack the computer's torus of processors. In phase one, CRI developed an interface specification, after discussions with LLNL, for system-level software supporting a time- and space-sharing environment on the LLNL T3D. The two parties also discussed interface specifications for external control tools (such as scheduling policy tools and system administration tools) and application programs. CRI assumed responsibility for writing and implementing all the necessary system software in this phase. In phase two, CRI implemented job-rolling on the Cray T3D, a mechanism for preempting a program, saving its state to disk, and later restoring its state to memory for continued execution. LLNL ported its gang scheduler to the LLNL T3D utilizing the CRI interface implemented in phases one and two. During phase three, the functionality and effectiveness of the LLNL gang scheduler were assessed to provide input to CRI's time- and space-sharing efforts.
    CRI will utilize this information in the development of general schedulers suitable for other sites and future architectures.
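    A minimal sketch of the gang-scheduling and job-rolling behavior described above: all tasks of a parallel job hold the processors at once, a higher-priority job preempts the running one, and the preempted job's state is "rolled" out and restored later. The class and method names are invented for illustration; the actual T3D scheduler and CRI interface are not reproduced here.

    ```python
    # Toy model of gang scheduling with job-rolling (illustrative only).
    from collections import deque

    class Job:
        def __init__(self, name, procs, priority):
            self.name, self.procs, self.priority = name, procs, priority

    class GangScheduler:
        def __init__(self, total_procs):
            self.total_procs = total_procs
            self.running = None
            self.rolled = deque()      # preempted jobs, state saved "to disk"

        def submit(self, job):
            if job.procs > self.total_procs:
                raise ValueError("job needs more processors than the machine has")
            if self.running is None:
                self.running = job
            elif job.priority > self.running.priority:
                # Preempt: roll the running job's state out, run the new one.
                self.rolled.append(self.running)
                self.running = job
            else:
                self.rolled.append(job)

        def finish_running(self):
            # Restore the earliest rolled job, if any, from its saved state.
            self.running = self.rolled.popleft() if self.rolled else None

    sched = GangScheduler(total_procs=256)
    sched.submit(Job("batch", 256, priority=1))
    sched.submit(Job("interactive", 64, priority=5))  # preempts the batch job
    print(sched.running.name)   # interactive
    sched.finish_running()
    print(sched.running.name)   # batch resumes from its saved state
    ```

    The preempt-then-restore cycle is what gave the T3D interactivity on top of space sharing: short interactive jobs run immediately, while long batch jobs transparently resume where they left off.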

  7. 10 CFR 850 Implementation of Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, S

    2012-01-05

    10 CFR 850 defines a contractor as any entity, including affiliated entities, such as a parent corporation, under contract with DOE, including a subcontractor at any tier, with responsibility for performing work at a DOE site in furtherance of a DOE mission. The Chronic Beryllium Disease Prevention Program (CBDPP) applies to beryllium-related activities that are performed at the Lawrence Livermore National Laboratory (LLNL). The CBDPP, or Beryllium Safety Program, is integrated into the LLNL Worker Safety and Health Program; thus, implementation documents and responsibilities are integrated into various documents and organizational structures. Program development and management of the CBDPP is delegated to the Environment, Safety and Health (ES&H) Directorate, Worker Safety and Health Functional Area. Per 10 CFR 850, Lawrence Livermore National Security, LLC (LLNS) periodically submits a CBDPP to the National Nuclear Security Administration/Livermore Site Office (NNSA/LSO). The requirements of this plan are communicated to LLNS workers through ES&H Manual Document 14.4, 'Working Safely with Beryllium.' 10 CFR 850 is implemented by the LLNL CBDPP, which integrates the safety and health standards required by the regulation with components of the LLNL Integrated Safety Management System (ISMS) and incorporates other components of the LLNL ES&H Program. As described in the regulation, and to fully comply with the regulation, specific portions of existing programs and additional requirements are identified in the CBDPP. The CBDPP is implemented by documents that interface with the workers, principally ES&H Manual Document 14.4. This document contains information on how the management practices prescribed by the LLNL ISMS are implemented, how beryllium hazards that are associated with LLNL work activities are controlled, and who is responsible for implementing the controls.
    Adherence to the requirements and processes described in the ES&H Manual ensures that ES&H practices across LLNL are developed in a consistent manner. Other implementing documents, such as the ES&H Manual, are integral to effectively implementing 10 CFR 850.

  8. Non LTE Effects in Laser Plasmas

    NASA Astrophysics Data System (ADS)

    Klapisch, Marcel

    1997-11-01

    Laser-produced plasmas are not in Local Thermodynamic Equilibrium (LTE) because of the strong gradients and the escaping radiation. Departure from LTE changes the average charge state Z^*, and through it the electron temperature and other thermodynamic variables. Hydrodynamic simulations using LTE and non-LTE models show that in some cases the temperatures can change by an order of magnitude. Several rad/hydro models have solved the approximate atomic rate equations in-line within the average-atom model (W. A. Lokke and W. H. Grasburger, LLNL, Report UCRL-52276 (1977); G. Pollack, LANL, Report LA-UR-90-2423 (1990)), or with global rates (M. Busquet, J. P. Raucourt and J. C. Gauthier, J. Quant. Spectrosc. Radiat. Transfer, 54, 81 (1995)). A new technique developed by Busquet, the Radiation Dependent Ionization Model (RADIOM) (M. Busquet, Phys. Fluids B, 5, 4191 (1993)), has been implemented in the NRL hydro-code. It uses an ionization temperature Tz to obtain the opacities and EOS in table look-ups. Very elaborate LTE atomic physics, such as the STA code (A. Bar-Shalom and J. Oreg, Phys. Rev. E, 54, 1850 (1996), and refs. therein) or OPAL, can then be used off-line to generate the tables. The algorithm for Tz is very simple and quick. RADIOM has recently been benchmarked against a new detailed collisional-radiative model, SCROLL (A. Bar-Shalom, J. Oreg and M. Klapisch, Phys. Rev. E, to appear in July (1997)), over a range of temperatures, densities, and atomic numbers. RADIOM has been surprisingly successful in calculations of non-LTE opacities.
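    The table look-up step that RADIOM relies on can be illustrated with a simple bilinear interpolation over a precomputed (Tz, density) grid. The grid, opacity values, and function names below are invented purely for illustration and do not come from STA, OPAL, or the NRL hydro-code.

    ```python
    # Illustrative bilinear look-up of an opacity table indexed by an
    # ionization temperature Tz and a density. All numbers are invented.
    import bisect

    def bilinear(xgrid, ygrid, table, x, y):
        """Bilinear interpolation on a regular (x, y) grid."""
        i = max(0, min(bisect.bisect_right(xgrid, x) - 1, len(xgrid) - 2))
        j = max(0, min(bisect.bisect_right(ygrid, y) - 1, len(ygrid) - 2))
        tx = (x - xgrid[i]) / (xgrid[i + 1] - xgrid[i])
        ty = (y - ygrid[j]) / (ygrid[j + 1] - ygrid[j])
        return (table[i][j]         * (1 - tx) * (1 - ty)
              + table[i + 1][j]     * tx       * (1 - ty)
              + table[i][j + 1]     * (1 - tx) * ty
              + table[i + 1][j + 1] * tx       * ty)

    # Invented opacity table: rows indexed by Tz (eV), columns by density (g/cc).
    tz_grid  = [10.0, 100.0, 1000.0]
    rho_grid = [0.01, 0.1, 1.0]
    opacity  = [[5.0, 8.0, 12.0],
                [2.0, 4.0,  6.0],
                [0.5, 1.0,  2.0]]

    kappa = bilinear(tz_grid, rho_grid, opacity, 55.0, 0.055)
    print(round(kappa, 3))  # midpoint of the four corner values: 4.75
    ```

    The appeal of the scheme is visible even in this toy: the expensive LTE atomic physics lives entirely in the precomputed table, and the in-line cost per hydro step is just the cheap interpolation once Tz is known. (A production table would be interpolated in log space over many decades of temperature and density.)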

  9. X-ray spectroscopy of E2 and M3 transitions in Ni-like W

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clementson, J.; Beiersdorfer, P.; Gu, M. F.

    2010-01-15

    The electric quadrupole (E2) and magnetic octupole (M3) ground-state transitions in Ni-like W{sup 46+} have been measured using high-resolution crystal spectroscopy at the LLNL electron-beam ion trap facility. The lines fall in the soft x-ray region near 7.93 Å and were originally observed as an unresolved feature in tokamak plasmas. Using flat ammonium dihydrogen phosphate and quartz crystals, the wavelengths, intensities, and polarizations of the two lines have been measured for various electron-beam energies and compared to intensity and polarization calculations performed using the Flexible Atomic Code (FAC).

  10. Update of the Nuclear Criticality Slide Rule for the Emergency Response to a Nuclear Criticality Accident

    NASA Astrophysics Data System (ADS)

    Duluc, Matthieu; Bardelay, Aurélie; Celik, Cihangir; Heinrichs, Dave; Hopper, Calvin; Jones, Richard; Kim, Soon; Miller, Thomas; Troisne, Marc; Wilson, Chris

    2017-09-01

    AWE (UK), IRSN (France), LLNL (USA), and ORNL (USA) began a long-term collaboration in 2015 to update the nuclear criticality Slide Rule for the emergency response to a nuclear criticality accident. This document, published almost 20 years ago, gives order-of-magnitude estimates of key parameters, such as the number of fissions and the neutron and gamma doses, useful for emergency response teams and public authorities. This paper presents, first, the motivation and long-term objectives for this update, then an overview of the initial configurations for the updated calculations and preliminary results obtained with modern 3D codes.

  11. Final Project Report "Advanced Concept Exploration For Fast Ignition Science Program"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    STEPHENS, Richard B.; McLEAN, Harry M.; THEOBALD, Wolfgang

    The Fast Ignition (FI) Concept for Inertial Confinement Fusion has the potential to provide a significant advance in the technical attractiveness of Inertial Fusion Energy (IFE) reactors. FI differs from conventional “central hot spot” (CHS) target ignition by decoupling compression from heating: the laser (or heavy ion beam or Z pinch) drive pulse (tens of ns) creates a dense fuel, and a second, much shorter (~10 ps) high-intensity pulse ignites a small region of it. There are two major physics issues concerning this concept: controlling the laser-induced generation of large electron currents and their propagation through high-density plasmas. This project has addressed these two significant scientific issues in Relativistic High Energy Density (RHED) physics. Learning to control relativistic laser-matter interaction (and its limits and potential) will enable a wide range of applications. While these physics issues are of specific interest to inertial fusion energy science, they are also important for a wide range of other HED phenomena, including high-energy ion beam generation, isochoric heating of materials, and the development of high-brightness x-ray sources. Generating, controlling, and understanding the extreme conditions needed to advance this science has proved to be challenging: our studies have pushed the boundaries of physics understanding and are at the very limits of experimental, diagnostic, and simulation capabilities in high energy density laboratory physics (HEDLP). Our research strategy has been based on pursuing the fundamental physics underlying the Fast Ignition (FI) concept. We have performed a comprehensive study of electron generation and transport in fast-ignition targets with experiments, theory, and numerical modeling. A major issue is that the electrons produced in these experiments cannot be measured directly; only the effects of their transport can be observed.
We focused mainly on x-ray continuum photons from bremsstrahlung and x-ray line radiation from K-shell fluorescence. Integrated experiments, which combine target compression with short-pulse laser heating, yield additional information on target heating efficiency. This indirect way of studying the underlying behavior of the electrons must be validated with computational modeling to understand the physics and improve the design. Executing this program required a large, well-organized team; it was managed by a joint collaboration among General Atomics (GA), Lawrence Livermore National Laboratory (LLNL), and the Laboratory for Laser Energetics (LLE). The collaboration was formed 8 years ago to understand the physics issues of the Fast Ignition concept, building on the strengths of each partner. GA fulfills its responsibilities jointly with the University of California, San Diego (UCSD), The Ohio State University (OSU), and the University of Nevada at Reno (UNR). Since RHED physics is pursued vigorously in many countries, international researchers have been an important part of our efforts to make progress. The division of responsibility was as follows: (1) LLE had primary leadership for channeling studies and the integrated energy transfer, (2) LLNL led the development of measurement methods, analysis, and deployment of diagnostics, and (3) GA, together with UCSD, OSU, and UNR, studied the detailed energy-transfer physics. The experimental program was carried out using the Titan laser at the Jupiter Laser Facility at LLNL, the OMEGA and OMEGA EP lasers at LLE, and the Texas Petawatt laser (TPW) at UT Austin. Modeling has been pursued on large computing facilities at LLNL, OSU, and UCSD using codes developed (by us and others) within the HEDLP program, commercial codes, and existing supercomputer codes developed by the NNSA ICF program.
This consortium brought together all the components (resources, facilities, and personnel) necessary to accomplish its aggressive goals. The ACE Program has been strongly collaborative, taking advantage of the expertise of the participating institutions to provide a research effort that is far greater than the sum of its parts. The results of this work have firmly strengthened the scientific foundation from which the viability of FI and other applications can be evaluated. Program execution has also led to improved diagnostics for probing dense, hot plasmas, detailed understanding of high-current, relativistic electron energy generation and transport across boundaries and into dense plasmas, and greatly enhanced predictive modeling capabilities. One important aspect of this program was the involvement and training of young scientists, including postdoctoral fellows and graduate students. During the entire 8-year FI and ACE project period since 2005, more than fifteen graduate students completed their doctoral dissertations, including three from OSU and two from UCSD in the last three years. This project generated forty articles in high-quality journals, including nine in Physical Review Letters (two of them under review) during the last funding period since 2011.

  12. International Safeguards Technology and Policy Education and Training Pilot Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dreicer, M; Anzelon, G A; Essner, J T

    2009-06-16

    A major focus of the National Nuclear Security Administration-led Next Generation Safeguards Initiative (NGSI) is the development of human capital to meet present and future challenges to the safeguards regime. An effective university-level education in safeguards and related disciplines is an essential element in a layered strategy to rebuild the safeguards human resource capacity. NNSA launched two pilot programs in 2008 to develop university-level courses and internships in association with the James Martin Center for Nonproliferation Studies (CNS) at the Monterey Institute of International Studies (MIIS) and Texas A&M University (TAMU). These pilot efforts involved 44 students in total and were closely linked to hands-on internships at Los Alamos National Laboratory (LANL) and Lawrence Livermore National Laboratory (LLNL). The Safeguards and Nuclear Material Management pilot program was a collaboration between TAMU, LANL, and LLNL. The LANL-based coursework was shared with the students undertaking internships at LLNL via video teleconferencing. A weeklong hands-on exercise was also conducted at LANL. A second pilot effort, the International Nuclear Safeguards Policy and Information Analysis pilot program, was implemented at MIIS in cooperation with LLNL. Speakers from MIIS, LLNL, and other U.S. national laboratories (LANL, BNL) delivered lectures for the audience of 16 students. The majority of students were senior classmen or new master's degree graduates from MIIS specializing in nonproliferation policy studies. The two pilot programs concluded with an NGSI Summer Student Symposium, held at LLNL, where 20 students participated in LLNL facility tours and poster sessions. The value of bringing together the students from the technical and policy pilots was notable and will factor into the planning for the continued refinement of the two programs in the coming years.

  13. IGPP 1999-2000 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryerson, F J; Cook, K; Hitchcock, B

    2003-01-27

    The Institute of Geophysics and Planetary Physics (IGPP) is a Multicampus Research Unit of the University of California (UC). IGPP was founded in 1946 at UC Los Angeles with a charter to further research in the earth and planetary sciences and related fields. The Institute now has branches at UC campuses in Irvine, Los Angeles, San Diego, Santa Cruz, and Riverside, and at Los Alamos National Laboratory and Lawrence Livermore National Laboratory. The University-wide IGPP has played an important role in establishing interdisciplinary research in the earth and planetary sciences. For example, IGPP was instrumental in founding the fields of physical oceanography and space physics, which at the time fell between the cracks of established university departments. Because of its multicampus orientation, IGPP has sponsored important inter-institutional consortia in the earth and planetary sciences. Each of the seven branches has a somewhat different intellectual emphasis as a result of the interplay between the strengths of campus departments and Laboratory programs. The IGPP branch at Lawrence Livermore National Laboratory (LLNL) was approved by the Regents of the University of California in 1982. IGPP-LLNL emphasizes research in tectonics, geochemistry, and astrophysics. It provides a venue for studying the fundamental aspects of these fields, thereby complementing LLNL programs that pursue applications of these disciplines in national security and energy research. IGPP-LLNL was directed by Charles Alcock during this period and was originally organized into three centers: Geosciences, stressing seismology; High-Pressure Physics, stressing experiments using the two-stage light-gas gun at LLNL; and Astrophysics, stressing theoretical and computational astrophysics. In 1994, the activities of the Center for High-Pressure Physics were merged with those of the Center for Geosciences.
The Center for Geosciences, headed by Frederick Ryerson, focuses on research in geophysics and geochemistry. The Astrophysics Research Center, headed by Kem Cook, provides a home for theoretical and observational astrophysics and serves as an interface with the Physics Directorate's astrophysics efforts. At the end of the period covered by this report, Alcock left for the University of Pennsylvania. Cook became Acting Director of IGPP; the Physics Directorate merged with portions of the old Lasers Directorate to become Physics and Advanced Technologies; and the Energy Programs and Earth and Environmental Sciences Directorates became the Energy and Environment Sciences Directorate. The IGPP branch at LLNL (as well as the branch at Los Alamos) also facilitates scientific collaborations between researchers at the UC campuses and those at the national laboratories in areas related to earth science, planetary science, and astrophysics. It does this by sponsoring the University Collaborative Research Program (UCRP), which provides funds to UC campus scientists for joint research projects with LLNL. Additional information regarding IGPP-LLNL projects and people may be found at http://wwwigpp.llnl.gov/. The goals of the UCRP are to enrich research opportunities for UC campus scientists by making available to them some of LLNL's unique facilities and expertise, and to broaden the scientific program at LLNL through collaborative or interdisciplinary work with UC campus researchers. UCRP funds (provided jointly by the Regents of the University of California and by the Director of LLNL) are awarded annually on the basis of brief proposals, which are reviewed by a committee of scientists from UC campuses, LLNL programs, and external universities and research organizations. Typical annual funding for a collaborative research project ranges from $5,000 to $30,000.
Funds are used for a variety of purposes, such as salary support for UC graduate students and postdoctoral fellows, and costs for experimental facilities. A statistical overview of IGPP-LLNL's UCRP (colloquially known as the mini-grant program) is presented in Figures 1 and 2. Figure 1 shows the distribution of UCRP awards among the UC campuses, by total amount awarded and by number of proposals funded. Figure 2 shows the distribution of awards by center. Although the permanent LLNL staff assigned to IGPP is relatively small (presently about 8 full-time equivalents), IGPP's research centers have become vital research organizations. This growth has been possible because of IGPP support for a substantial group of resident postdoctoral fellows; because of the 20 or more UCRP projects funded each year; and because IGPP hosts a variety of visitors, guests, and faculty members (from both UC and other institutions). To focus attention on areas of topical interest in the geosciences and astrophysics, IGPP-LLNL hosts conferences and workshops and also organizes seminars in astrophysics and geosciences.

  14. Advanced Design Concepts for Dense Plasma Focus Devices at LLNL

    NASA Astrophysics Data System (ADS)

    Povilus, Alexander; Podpaly, Yuri; Cooper, Christopher; Shaw, Brian; Chapman, Steve; Mitrani, James; Anderson, Michael; Pearson, Aric; Anaya, Enrique; Koh, Ed; Falabella, Steve; Link, Tony; Schmidt, Andrea

    2017-10-01

    The dense plasma focus (DPF) is a z-pinch device in which a plasma sheath is accelerated down a coaxial railgun and ends in a radial implosion, the pinch phase. During the pinch phase, the plasma generates intense, transient electric fields through physical mechanisms similar to beam instabilities, which can accelerate ions in the plasma sheath to MeV-scale energies on millimeter length scales. Using kinetic modeling techniques developed at LLNL, we have gained insight into the formation of these accelerating fields and are using these observations to optimize the behavior of the generated ion beam for producing neutrons via beam-target interactions for kilojoule- to megajoule-scale devices. Using a set of DPFs, both in operation and in development at LLNL, we have explored critical aspects of these devices, including plasma sheath formation behavior, power delivery to the plasma, and instability seeding during the implosion, in order to improve the absolute yield and stability of the device. Prepared by LLNL under Contract DE-AC52-07NA27344. Computing support for this work came from the LLNL Institutional Computing Grand Challenge program.

  15. SPH modeling of the Stickney impact at Phobos

    NASA Astrophysics Data System (ADS)

    Bruck Syal, Megan; Rovny, Jared; Owen, J. Michael; Miller, Paul L.

    2016-10-01

    Stickney crater stretches across nearly half the diameter of ~22-km Phobos, the larger of the two Martian moons. The Stickney-forming impact would have had global consequences for Phobos, causing extensive damage to the satellite's interior and initiating large-scale resurfacing through ejecta blanket emplacement. Further, much of the ejected material that initially escaped the moon's tiny gravity (escape velocity of ~11 m/s) would likely have reimpacted on subsequent orbits. Modeling of the impact event is necessary to understand the conditions that allowed this "megacrater" to form without disrupting the entire satellite. Impact simulation results also provide a means to test several different hypotheses for how the mysterious families of parallel grooves may have formed at Phobos. We report on adaptive SPH simulations that successfully generate Stickney while avoiding catastrophic fragmentation of Phobos. Inclusion of target porosity and use of sufficient numerical resolution in fully 3-D simulations are key for avoiding over-estimation of target damage. Cratering efficiency follows gravity-dominated scaling laws over a wide range of velocities (6-20 km/s) for the appropriate material constants. While the adaptive SPH results are used to constrain crater volume and fracture patterns within the target, additional questions about the fate of ejecta and final crater morphology within an unusual gravity environment can be addressed with complementary numerical methods. Results from the end of the hydrodynamics-controlled phase (tens of seconds after impact) are linked to a Discrete Element Method code, which can explore these processes over longer time scales (see Schwartz et al., this meeting). This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LLNL-ABS-695442.
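    The gravity-dominated scaling laws referred to above are conventionally written in pi-group (point-source) form, with crater size controlled by the gravity-scaled impactor size pi_2 = g*a/U^2 and cratering efficiency pi_V = K1 * pi_2^(-3*mu/(2+mu)). A minimal sketch with illustrative values of K1 and mu (the material constants calibrated in the actual study are not reproduced here):

```python
import math

def crater_volume_gravity_regime(a_imp, v_imp, g, rho_t, rho_i, K1=0.6, mu=0.41):
    """Gravity-regime pi-group crater scaling.

    a_imp : impactor radius (m), v_imp : impact speed (m/s),
    g : surface gravity (m/s^2), rho_t/rho_i : target/impactor density (kg/m^3).
    K1 and mu are illustrative scaling constants (rock-like values),
    not the calibrated parameters from the study.
    Returns an estimated crater volume (m^3)."""
    m_imp = rho_i * (4.0 / 3.0) * math.pi * a_imp ** 3  # impactor mass
    pi2 = g * a_imp / v_imp ** 2                        # gravity-scaled size
    piV = K1 * pi2 ** (-3.0 * mu / (2.0 + mu))          # cratering efficiency
    return piV * m_imp / rho_t
```

    The form makes the velocity trend explicit: higher impact speed lowers pi_2, raising the cratering efficiency and the predicted volume.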

  16. Advanced Simulation and Computing: A Summary Report to the Director's Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCoy, M G; Peck, T

    2003-06-01

    It has now been three years since the Advanced Simulation and Computing Program (ASCI), as managed by the Defense and Nuclear Technologies (DNT) Directorate, was reviewed by this Director's Review Committee (DRC). Since that time, there has been considerable progress for all components of the ASCI Program, and these developments will be highlighted in this document and in the presentations planned for June 9 and 10, 2003. There have also been some name changes. Today, the Program is called "Advanced Simulation and Computing." Although it retains the familiar acronym ASCI, the initiative nature of the effort has given way to sustained services as an integral part of the Stockpile Stewardship Program (SSP). All computing efforts at LLNL and the other two Defense Program (DP) laboratories are funded and managed under ASCI. This includes the so-called legacy codes, which remain essential tools in stockpile stewardship. The contract between the Department of Energy (DOE) and the University of California (UC) specifies an independent appraisal of Directorate technical work and programmatic management; that appraisal is the work of this DNT Review Committee. Beginning this year, the Laboratory is implementing a new review system. This process was negotiated between UC, the National Nuclear Security Administration (NNSA), and the Laboratory Directors. Central to this approach are eight performance objectives that focus on key programmatic and administrative goals. Associated with each of these objectives are a number of performance measures to more clearly characterize the attainment of the objectives. Each performance measure has a lead directorate and one or more contributing directorates. Each measure has an evaluation plan and identifies the documentation expected to be included in the "Assessment File".

  17. Void Growth and Coalescence in Dynamic Fracture of FCC and BCC Metals - Molecular Dynamics Study

    NASA Astrophysics Data System (ADS)

    Seppälä, Eira

    2004-03-01

    In dynamic fracture of ductile metals, the state of tension causes the nucleation of voids, typically from inclusions or grain boundary junctions, which grow and ultimately coalesce to form the fracture surface. Significant plastic deformation occurs in the process, including dislocations emitted to accommodate the growing voids. We have studied the growth and coalescence processes of voids, with concomitant dislocation formation, at the atomistic scale. Classical molecular dynamics (MD) simulations of one and two pre-existing spherical voids, initially a few nanometers in radius, have been performed in single-crystal face-centered-cubic (FCC) and body-centered-cubic (BCC) lattices under dilational strain at high strain rates. Million-atom simulations of single-void growth have been done to study the effect of stress triaxiality,^1 along with the strain-rate and lattice-structure dependence. An interesting prolate-to-oblate transition in the void shape in uniaxial expansion has been observed and quantitatively analyzed. The simulations also confirm that the plastic strain results directly from the void growth. Interaction and coalescence between two voids have been studied utilizing a parallel MD code in a seven-million-atom system. In particular, we study the movement of the void centers, the linking of the voids, and the shape changes each void induces in the vicinity of the other. We have also searched for the critical intervoid ligament distance beyond which the voids can be treated as independent. ^1 E. T. Seppälä, J. Belak, and R. E. Rudd, cond-mat/0310541, submitted to Phys. Rev. B. Acknowledgment: This work was done in collaboration with Dr. James Belak and Dr. Robert E. Rudd, LLNL. It was performed under the auspices of the US Dept. of Energy at the Univ. of Cal./Lawrence Livermore National Laboratory under contract no. W-7405-Eng-48.
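    The intervoid ligament distance mentioned above is a simple geometric quantity: the surface-to-surface separation of the two voids. A minimal sketch, with a purely hypothetical interaction criterion for illustration (the critical distance actually extracted from the MD simulations is not reproduced here):

```python
import math

def ligament_distance(c1, c2, r1, r2):
    """Intervoid ligament: surface-to-surface distance between two
    spherical voids with centers c1, c2 and radii r1, r2."""
    return math.dist(c1, c2) - (r1 + r2)

def voids_interacting(c1, c2, r1, r2, k=1.0):
    """Hypothetical onset-of-interaction test: treat the voids as coupled
    once the ligament shrinks below k times the mean void radius.
    (Illustrative criterion only; the MD study determines the real one.)"""
    return ligament_distance(c1, c2, r1, r2) < k * 0.5 * (r1 + r2)
```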

  18. Application of global kinetic models to HMX beta-delta transition and cookoff processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wemhoff, A P; Burnham, A K; Nichols III, A L

    The reduction of the number of reactions in kinetic models for both the HMX beta-delta phase transition and thermal cookoff provides an attractive alternative to traditional multi-stage kinetic models due to reduced calibration effort requirements. In this study, we use the LLNL code ALE3D to provide calibrated kinetic parameters for a two-reaction bidirectional beta-delta HMX phase transition model based on Sandia Instrumented Thermal Ignition (SITI) and Scaled Thermal Explosion (STEX) temperature history curves, and a Prout-Tompkins cookoff model based on One-Dimensional Time to Explosion (ODTX) data. Results show that the two-reaction bidirectional beta-delta transition model presented here agrees as well with STEX and SITI temperature history curves as a reversible four-reaction Arrhenius model, yet requires an order of magnitude less computational effort. In addition, a single-reaction Prout-Tompkins model calibrated to ODTX data provides better agreement with ODTX data than a traditional multi-step Arrhenius model, and can require up to 90% fewer chemistry-limited time steps for low-temperature ODTX simulations. Manual calibration methods for the Prout-Tompkins kinetics provide much better agreement with ODTX experimental data than parameters derived from Differential Scanning Calorimetry (DSC) measurements at atmospheric pressure. The predicted surface temperature at explosion for STEX cookoff simulations is a weak function of the cookoff model used, and a reduction of up to 15% in chemistry-limited time steps can be achieved by neglecting the beta-delta transition for this type of simulation. Finally, the inclusion of the beta-delta transition model in the overall kinetics model can affect the predicted time to explosion by 1% for the traditional multi-step Arrhenius approach, and by up to 11% when using a Prout-Tompkins cookoff model.

  19. Application of global kinetic models to HMX beta-delta transition and cookoff processes.

    PubMed

    Wemhoff, Aaron P; Burnham, Alan K; Nichols, Albert L

    2007-03-08

    The reduction of the number of reactions in kinetic models for both the HMX (octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine) beta-delta phase transition and thermal cookoff provides an attractive alternative to traditional multi-stage kinetic models due to reduced calibration effort requirements. In this study, we use the LLNL code ALE3D to provide calibrated kinetic parameters for a two-reaction bidirectional beta-delta HMX phase transition model based on Sandia instrumented thermal ignition (SITI) and scaled thermal explosion (STEX) temperature history curves, and a Prout-Tompkins cookoff model based on one-dimensional time to explosion (ODTX) data. Results show that the two-reaction bidirectional beta-delta transition model presented here agrees as well with STEX and SITI temperature history curves as a reversible four-reaction Arrhenius model yet requires an order of magnitude less computational effort. In addition, a single-reaction Prout-Tompkins model calibrated to ODTX data provides better agreement with ODTX data than a traditional multistep Arrhenius model and can contain up to 90% fewer chemistry-limited time steps for low-temperature ODTX simulations. Manual calibration methods for the Prout-Tompkins kinetics provide much better agreement with ODTX experimental data than parameters derived from differential scanning calorimetry (DSC) measurements at atmospheric pressure. The predicted surface temperature at explosion for STEX cookoff simulations is a weak function of the cookoff model used, and a reduction of up to 15% of chemistry-limited time steps can be achieved by neglecting the beta-delta transition for this type of simulation. Finally, the inclusion of the beta-delta transition model in the overall kinetics model can affect the predicted time to explosion by 1% for the traditional multistep Arrhenius approach, and up to 11% using a Prout-Tompkins cookoff model.
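    The Prout-Tompkins rate law named above is an autocatalytic form; one common extended variant is d(alpha)/dt = k(T)*(1 - alpha)*(alpha + q), with an Arrhenius rate constant k(T) and a small initiation parameter q that lets the reaction leave alpha = 0. A minimal constant-temperature sketch with illustrative parameter values (not the calibrated ALE3D parameters):

```python
import math

def arrhenius(T, A, Ea, R=8.314):
    """Arrhenius rate constant k(T) = A * exp(-Ea / (R*T)); Ea in J/mol."""
    return A * math.exp(-Ea / (R * T))

def time_to_conversion(T, A, Ea, q=0.01, dt=1e-3, alpha_end=0.99, t_max=1e6):
    """Integrate the extended Prout-Tompkins law
    d(alpha)/dt = k(T)*(1 - alpha)*(alpha + q) at constant temperature T
    with forward Euler, returning the time (s) to reach alpha_end.
    All parameter values here are illustrative, not calibrated."""
    k = arrhenius(T, A, Ea)
    alpha, t = 0.0, 0.0
    while alpha < alpha_end and t < t_max:
        alpha += k * (1.0 - alpha) * (alpha + q) * dt
        t += dt
    return t
```

    The autocatalytic (1 - alpha)*(alpha + q) factor is what gives the characteristic induction period followed by rapid acceleration, the behavior a single first-order Arrhenius reaction cannot reproduce.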

  20. Theoretical studies of defect formation and target heating by intense pulsed ion beams

    NASA Astrophysics Data System (ADS)

    Barnard, J. J.; Schenkel, T.; Persaud, A.; Seidl, P. A.; Friedman, A.; Grote, D. P.; Davidson, R. C.; Gilson, E. P.; Kaganovich, I.

    2015-11-01

    We present results of three studies related to experiments on NDCX-II, the Neutralized Drift Compression Experiment, a short-pulse (~1 ns), high-current (~70 A) linear accelerator for 1.2 MeV ions at LBNL. These include: (a) Coupled transverse and longitudinal envelope calculations of the final non-neutral ion beam transport, followed by neutralized drift and final focus, for a number of focus and drift lengths and with a series of ion species (Z = 1-19). Predicted target fluences were obtained and target temperatures in the 1 eV range estimated. (b) HYDRA simulations of the target response for Li and He ions and for Al and Au targets at various ion fluences (up to 10^12 ions/pulse/mm^2) and pulse durations, benchmarking temperature estimates from the envelope calculations. (c) Crystal-Trim simulations of ion channeling through single-crystal lattices, with comparisons to ion transmission data as a function of orientation angle of the crystal foil and for different ion intensities and ion species. This work was performed under the auspices of the U.S. DOE under contracts DE-AC52-07NA27344 (LLNL), DE-AC02-05CH11231 (LBNL) and DE-AC02-76CH0307 (PPPL) and was supported by the US DOE Office of Science, Fusion Energy Sciences. LLNL-ABS-67521.

  1. FY 2008 Next Generation Safeguards Initiative International Safeguards Education and Training Pilot Programs Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dreicer, M; Anzelon, G; Essner, J

    2008-10-17

    A key component of the Next Generation Safeguards Initiative (NGSI) launched by the National Nuclear Security Administration is the development of human capital to meet present and future challenges to the safeguards regime. An effective university-level education in safeguards and related disciplines is an essential element in a layered strategy to rebuild the safeguards human resource capacity. Two university-level pilot programs, involving 44 students, were initiated and implemented in spring-summer 2008 and linked to hands-on internships at LANL or LLNL. During the internships, students worked on specific safeguards-related projects with a designated Laboratory Mentor to provide broader exposure to nuclear materials management and information analytical techniques. The Safeguards and Nuclear Material Management pilot program was a collaboration between Texas A&M University (TAMU), Los Alamos National Laboratory (LANL), and Lawrence Livermore National Laboratory (LLNL). It included a 16-lecture course held during a summer internship program. The instructors for the course were from LANL, together with TAMU faculty and LLNL experts. The LANL-based course was shared with the students spending their internship at LLNL via video conference. A week-long table-top (or hands-on) exercise was also conducted at LANL. The student population was a mix of 28 students from 12 universities participating in a variety of summer internship programs held at LANL and LLNL. A large portion of the students were TAMU students participating in the NGSI pilot. The International Nuclear Safeguards Policy and Information Analysis pilot program was implemented at the Monterey Institute of International Studies (MIIS) in cooperation with LLNL. It included a two-week intensive course consisting of 20 lectures and two exercises. Speakers from MIIS, LLNL, and other U.S. national laboratories (LANL, BNL) delivered lectures for the audience of 16 students.
The majority of students were senior classmen or new master's degree graduates from MIIS specializing in nonproliferation policy studies. Other universities and organizations represented included the University of California, Los Angeles, Stanford University, and the IAEA. Four of the students who completed this intensive course participated in a 2-month internship at LLNL. The two pilot courses and internships concluded with an NGSI Summer Student Symposium, held at LLNL, where 20 students participated in LLNL facility tours and poster sessions. The poster sessions were designed to provide a forum for sharing the results of the students' summer projects and to give them experience in presenting their work to a varied audience of students, faculty, and laboratory staff. The success of bringing together the students from the technical and policy pilots was notable and will factor into the planning for the continued refinement of the two pilot efforts in the coming years.

  2. Carbon Nanotube Membranes for Water Purification

    NASA Astrophysics Data System (ADS)

    Bakajin, Olgica

    2009-03-01

    Carbon nanotubes are an excellent platform for the fundamental studies of transport through channels commensurate with molecular size. Water transport through carbon nanotubes is also believed to be similar to transport in biological channels such as aquaporins. I will discuss the transport of gas, water and ions through microfabricated membranes with sub-2 nanometer aligned carbon nanotubes as ideal atomically-smooth pores. The measured gas flow through carbon nanotubes exceeded predictions of the Knudsen diffusion model by more than an order of magnitude. The measured water flow exceeded values calculated from continuum hydrodynamics models by more than three orders of magnitude and is comparable to flow rates extrapolated from molecular dynamics simulations and measured for aquaporins. More recent reverse osmosis experiments reveal ion rejection by our membranes. Based on our experimental findings, the current understanding of the fundamentals of water and gas transport and of ion rejection will be discussed. The potential application space that exploits these unique nanofluidic phenomena will be explored. The extremely high permeabilities of these membranes, combined with their small pore size will enable energy efficient filtration and eventually decrease the cost of water purification. In collaboration with Francesco Fornasiero, Biosciences and Biotechnology Division, PLS, LLNL, Livermore, CA 94550; Sangil Kim, NSF Center for Biophotonics Science & Technology, University of California at Davis, Sacramento CA 95817; Jung Bin In, Mechanical Engineering Department, UC Berkeley, Berkeley CA 94720; Hyung Gyu Park, Jason K Holt, and Michael Stadermann, Biosciences and Biotechnology Division, PLS, LLNL; Costas P. Grigoropoulos, Mechanical Engineering Department, UC Berkeley; Aleksandr Noy, Biosciences and Biotechnology Division, PLS, LLNL and School of Natural Sciences, University of California at Merced.
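    The Knudsen diffusion model used as the baseline for the gas-flow comparison above predicts, for a cylindrical pore of diameter d, a diffusivity D_K = (d/3)*sqrt(8*R*T/(pi*M)). A minimal sketch of the corresponding flux estimate (pore geometry and conditions here are illustrative, not the membrane parameters from the talk):

```python
import math

def knudsen_diffusivity(d_pore, T, M, R=8.314):
    """Knudsen diffusion coefficient (m^2/s) for a cylindrical pore:
    D_K = (d/3) * sqrt(8*R*T / (pi*M)), with pore diameter d_pore (m),
    temperature T (K), and molar mass M (kg/mol)."""
    return (d_pore / 3.0) * math.sqrt(8.0 * R * T / (math.pi * M))

def knudsen_flux(d_pore, T, M, dp, L, R=8.314):
    """Molar flux (mol m^-2 s^-1) predicted by the Knudsen model for a
    pressure drop dp (Pa) across a pore of length L (m)."""
    return knudsen_diffusivity(d_pore, T, M) * dp / (R * T * L)
```

    The reported enhancement is the ratio of the measured flux to this prediction; flows exceeding it by more than an order of magnitude indicate near-specular, low-friction transport along the atomically smooth nanotube walls.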

  3. Corporate Functional Management Evaluation of the LLNL Radiation Safety Organization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sygitowicz, L S

    2008-03-20

    A Corporate Assess, Improve, and Modernize review was conducted at Lawrence Livermore National Laboratory (LLNL) to evaluate the LLNL Radiation Safety Program and recommend actions to address the conditions identified in the Internal Assessment conducted July 23-25, 2007. This review confirms the findings of the Internal Assessment of the Institutional Radiation Safety Program (RSP), including the noted deficiencies and vulnerabilities, to be valid. The actions recommended are a result of interviews with about 35 individuals representing senior management through the technician level. The deficiencies identified in the LLNL Internal Assessment of the Institutional Radiation Safety Program were discussed with Radiation Safety personnel and team leads, customers of the Radiation Safety Program, the DOE Livermore Site Office, and senior ES&H management. There are significant issues with the RSP. The LLNL RSP is not an integrated, cohesive, consistently implemented program with a single authority that has the clear role, responsibility, and authority to assure that radiological operations at LLNL are conducted in a safe and compliant manner. There is no institutional commitment to address the deficiencies identified in the internal assessment. Some of these deficiencies have been previously identified, and corrective actions have not been taken or have been ineffective in addressing the issues. Serious funding and staffing issues have prevented addressing previously identified issues in the Radiation Calibration Laboratory, Internal Dosimetry, the Bioassay Laboratory, and the Whole Body Counter. There is a lack of technical basis documentation for the Radiation Calibration Laboratory and an inadequate QA plan that does not specify standards of work. The Radiation Safety Program lacks rigor and consistency across all supported programs. The implementation of DOE Standard 1098-99, Radiological Control, can be used as a tool to establish this consistency across LLNL.
The establishment of a site wide ALARA Committee and administrative control levels would focus attention on improved processes. Currently LLNL issues dosimeters to a large number of employees and visitors that do not enter areas requiring dosimetry. This includes 25,000 visitor TLDs per year. Dosimeters should be issued to only those personnel who enter areas where dosimetry is required.« less

  4. LLNL compiled first pages ordered by ascending B&R code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, G; Kumar, M; Tobin, J

    We aim to develop a fundamental understanding of materials dynamics (from µs to ns) in systems where the required combination of spatial and temporal resolution can only be reached by the dynamic transmission electron microscope (DTEM). In this regime, the DTEM can study complex transient phenomena with a time-resolution advantage of several orders of magnitude over any existing in-situ TEM. Using the unique in situ capabilities and the nanosecond time resolution of the DTEM, we seek to study complex transient phenomena associated with rapid processes in materials, such as active sites on nanoscale catalysts and the atomic-level mechanisms and microstructural features for nucleation and growth associated with phase transformations, specifically martensite formation and crystallization reactions from the amorphous phase. We will also study transient phase evolution in rapid solid-state reactions, such as those occurring in reactive multilayer foils (RMLF). Program Impact: The LLNL DTEM possesses unique capabilities for capturing time-resolved images and diffraction patterns of rapidly evolving materials microstructure under strongly driven conditions. No other instrument in the world can capture images with <10 nm spatial resolution and 15 ns time resolution of irreversible materials processes such as phase transformations, plasticity, or morphology changes. The development of this innovative capability requires the continuing collaboration of laser scientists, electron microscopists, and materials scientists experienced in time-resolved observations, a combination of backgrounds particularly well represented at LLNL. The research team has made observations of materials processes possible by no other method, such as the rapid crystallization of thin-film NiTi, which identified a change of mechanism at high heating rates compared to isothermal anneals through changes in the nucleation and growth rates of the crystalline phase. The project is designed to reveal the fundamental processes and mechanisms of rapid microstructure evolution that form a foundation of understanding integral to the DOE-BES mission.

  5. Thermal conductivity measurements of proton-heated warm dense aluminum

    NASA Astrophysics Data System (ADS)

    McKelvey, A.; Kemp, G.; Sterne, P.; Fernandez, A.; Shepherd, R.; Marinak, M.; Link, A.; Collins, G.; Sio, H.; King, J.; Freeman, R.; Hua, R.; McGuffey, C.; Kim, J.; Beg, F.; Ping, Y.

    2017-10-01

    We present the first thermal conductivity measurements of warm dense aluminum at 0.5-2.7 g/cc and 2-10 eV, using a recently developed platform of differential heating. A temperature gradient is induced in a Au/Al dual-layer target by proton heating, and the subsequent heat flow from the hotter Au to the Al rear surface is detected by two simultaneous time-resolved diagnostics. A systematic data set allows for constraining both thermal conductivity and equation-of-state models. Simulations using the Purgatorio model or Sesame S27314 for Al thermal conductivity and LEOS for the Au/Al release equation of state show good agreement with data after 15 ps. Predictions by other models, such as Lee-More and Sesame 27311 and 29373, are outside of the experimental error bars. A discrepancy still exists at early times (0-15 ps), likely due to non-equilibrium conditions. (Y. Ping et al., Phys. Plasmas, 2015; A. McKelvey et al., Sci. Reports, 2017). This work was performed under the auspices of the DOE by LLNL under contract DE-AC52-07NA27344 with support from the DOE OFES Early Career program and the LLNL LDRD program.

  6. Summary of Environmental Data Analysis and Work Performed by Lawrence Livermore National Laboratory (LLNL) in Support of the Navajo Nation Abandoned Mine Lands Project at Tse Tah, Arizona

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taffet, Michael J.; Esser, Bradley K.; Madrid, Victor M.

    This report summarizes work performed by Lawrence Livermore National Laboratory (LLNL) under Navajo Nation Services Contract CO9729 in support of the Navajo Abandoned Mine Lands Reclamation Program (NAMLRP). Due to restrictions on access to uranium mine waste sites at Tse Tah, Arizona that developed during the term of the contract, not all of the work scope could be performed. LLNL was able to interpret environmental monitoring data provided by NAMLRP. Summaries of these data evaluation activities are provided in this report. Additionally, during the contract period, LLNL provided technical guidance, instructional meetings, and review of relevant work performed by NAMLRP and its contractors that was not contained in the contract work scope.

  7. Modelling Dynamic Behaviour and Spall Failure of Aluminium Alloy AA7010

    NASA Astrophysics Data System (ADS)

    Ma'at, N.; Nor, M. K. Mohd; Ismail, A. E.; Kamarudin, K. A.; Jamian, S.; Ibrahim, M. N.; Awang, M. K.

    2017-10-01

    A finite-strain constitutive model to predict the dynamic deformation behaviour of Aluminium Alloy 7010, including shockwaves and spall failure, is developed in this work. The important feature of this new hyperelastic-plastic constitutive formulation is a new Mandel stress tensor formulated using a new generalized orthotropic pressure. This tensor is combined with a shock equation of state (EOS) and the Grady spall-failure model. Hill's yield criterion is adopted to characterize plastic orthotropy by means of evolving structural tensors defined in the isoclinic configuration. The material model is decomposed into elastic and plastic parts. Elastic anisotropy is taken into account through the new stress-tensor decomposition of a generalized orthotropic pressure. Plastic anisotropy is considered through the yield surface and an isotropic hardening defined in a unique alignment of the deviatoric plane within the stress space. To test its ability to describe shockwave propagation and spall failure, the new material model was implemented into UTHM's version of the LLNL-DYNA3D code. The capability of the new constitutive model was compared against published experimental data from plate-impact tests at 234 m/s, 450 m/s, and 895 m/s impact velocities. Good agreement between experiment and simulation is obtained in each test.
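    The Hill yield criterion adopted above has a standard quadratic form; as a minimal numerical sketch (with invented, isotropic-limit coefficients rather than any AA7010 calibration):

```python
import numpy as np

def hill_yield(stress, F, G, H, L, M, N):
    """Hill (1948) orthotropic yield function, evaluated on a 3x3 Cauchy
    stress expressed in the material (orthotropy) frame. Yielding is
    predicted when the returned value reaches 1."""
    s = np.asarray(stress, dtype=float)
    s11, s22, s33 = s[0, 0], s[1, 1], s[2, 2]
    s23, s31, s12 = s[1, 2], s[2, 0], s[0, 1]
    return (F * (s22 - s33) ** 2 + G * (s33 - s11) ** 2
            + H * (s11 - s22) ** 2
            + 2.0 * L * s23 ** 2 + 2.0 * M * s31 ** 2 + 2.0 * N * s12 ** 2)

# Isotropic (von Mises) limit for a hypothetical yield stress Y:
Y = 400.0                  # MPa, illustrative only
F = G = H = 0.5 / Y ** 2
L = M = N = 1.5 / Y ** 2
print(hill_yield(np.diag([Y, 0.0, 0.0]), F, G, H, L, M, N))  # ≈ 1.0: on the yield surface
```

    Anisotropy enters by choosing unequal F…N coefficients fitted to directional yield stresses in the orthotropy frame.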

  8. Special Analysis for the Disposal of the Lawrence Livermore National Laboratory EnergyX Macroencapsulated Waste Stream at the Area 5 Radioactive Waste Management Site, Nevada National Security Site, Nye County, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shott, Gregory J.

    This special analysis (SA) evaluates whether the Lawrence Livermore National Laboratory (LLNL) EnergyX Macroencapsulated waste stream (B LAMACRONCAP, Revision 1) is suitable for disposal by shallow land burial (SLB) at the Area 5 Radioactive Waste Management Site (RWMS) at the Nevada National Security Site (NNSS). The LLNL EnergyX Macroencapsulated waste stream is macroencapsulated mixed waste generated during research laboratory operations and maintenance (LLNL 2015). The LLNL EnergyX Macroencapsulated waste stream required a special analysis due to tritium (3H), cobalt-60 (60Co), cesium-137 (137Cs), and radium-226 (226Ra) exceeding the NNSS Waste Acceptance Criteria (WAC) Action Levels (U.S. Department of Energy, National Nuclear Security Administration Nevada Field Office [NNSA/NFO] 2015). The results indicate that all performance objectives can be met with disposal of the waste stream in a SLB trench. Addition of the LLNL EnergyX Macroencapsulated inventory slightly increases multiple performance assessment results, with the largest relative increase occurring for the all-pathways annual total effective dose (TED). The maximum mean and 95th percentile 222Rn flux density remain less than the performance objective throughout the compliance period. The LLNL EnergyX Macroencapsulated waste stream is suitable for disposal by SLB at the Area 5 RWMS. The waste stream is recommended for approval without conditions.

  9. Computation Directorate Annual Report 2003

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, D L; McGraw, J R; Ashby, S F

    Big computers are icons: symbols of the culture, and of the larger computing infrastructure that exists at Lawrence Livermore. Through the collective effort of Laboratory personnel, they enable scientific discovery and engineering development on an unprecedented scale. For more than three decades, the Computation Directorate has supplied the big computers that enable the science necessary for Laboratory missions and programs. Livermore supercomputing is uniquely mission driven. The high-fidelity weapon simulation capabilities essential to the Stockpile Stewardship Program compel major advances in weapons codes and science, compute power, and computational infrastructure. Computation's activities align with this vital mission of the Department of Energy. Increasingly, non-weapons Laboratory programs also rely on computer simulation. World-class achievements have been accomplished by LLNL specialists working in multi-disciplinary research and development teams. In these teams, Computation personnel employ a wide array of skills, from desktop support expertise, to complex applications development, to advanced research. Computation's skilled professionals make the Directorate the success that it has become. These individuals know the importance of the work they do and the many ways it contributes to Laboratory missions. They make appropriate and timely decisions that move the entire organization forward. They make Computation a leader in helping LLNL achieve its programmatic milestones. I dedicate this inaugural Annual Report to the people of Computation in recognition of their continuing contributions. I am proud that we perform our work securely and safely. Despite increased cyber attacks on our computing infrastructure from the Internet, advanced cyber security practices ensure that our computing environment remains secure. Through Integrated Safety Management (ISM) and diligent oversight, we address safety issues promptly and aggressively. The safety of our employees, whether at work or at home, is a paramount concern. Even as the Directorate meets today's supercomputing requirements, we are preparing for the future. We are investigating open-source cluster technology, the basis of our highly successful Multiprogrammatic Capability Resource (MCR). Several breakthrough discoveries have resulted from MCR calculations coupled with theory and experiment, prompting Laboratory scientists to demand ever-greater capacity and capability. This demand is being met by a new 23-TF system, Thunder, with architecture modeled on MCR. In preparation for the ''after-next'' computer, we are researching technology even farther out on the horizon--cell-based computers. Assuming that the funding and the technology hold, we will acquire the cell-based machine BlueGene/L within the next 12 months.

  10. 2020 Foresight Forging the Future of Lawrence Livermore National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chrzanowski, P.

    2000-01-01

    The Lawrence Livermore National Laboratory (LLNL) of 2020 will look much different from the LLNL of today and vastly different from how it looked twenty years ago. We, the members of the Long-Range Strategy Project, envision a Laboratory not defined by one program--nuclear weapons research--but by several core programs related to or synergistic with LLNL's national security mission. We expect the Laboratory to be fully engaged with sponsors and the local community and closely partnering with other research and development (R&D) organizations and academia. Unclassified work will be a vital part of the Laboratory of 2020 and will visibly demonstrate LLNL's international science and technology strengths. We firmly believe that there will be a critical and continuing role for the Laboratory. As a dynamic and versatile multipurpose laboratory with a national security focus, LLNL will be applying its capabilities in science and technology to meet the needs of the nation in the 21st century. With strategic investments in science, outstanding technical capabilities, and effective relationships, the Laboratory will, we believe, continue to play a key role in securing the nation's future.

  11. Conventional Weapons Underwater Explosions

    DTIC Science & Technology

    1988-12-01

    "Nitromethane," UCRL-52903, December 1980; B. M. Dobratz, LLNL Explosives Handbook - Properties of Explosives and Explosive Simulants, UCRL-52997, March 1981.

  12. SPE-5 Ground-Motion Prediction at Far-Field Geophone and Accelerometer Array Sites and SPE-5 Moment and Corner-Frequency Prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Xiaoning; Patton, Howard John; Chen, Ting

    2016-03-25

    This report offers ground-motion predictions for the SPE-5 far-field geophone and accelerometer array sites. These predictions comprise the waveform and spectral amplitude at certain geophone sites using the Denny & Johnson source model and a source model derived from SPE data; the waveform, peak velocity, and peak acceleration at accelerometer sites using the SPE source model and finite-difference simulation with the LLNL 3D velocity model; and the SPE-5 moment and corner frequency.

  13. A microwave FEL (free electron laser) code using waveguide modes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byers, J.A.; Cohen, R.H.

    1987-08-01

    A free electron laser code, GFEL, is being developed for application to the LLNL tokamak current drive experiment, MTX. This single-frequency code solves for the slowly varying complex field amplitude using the usual wiggler-averaged equations of existing codes, in particular FRED, except that it describes the fields by a 2D expansion in the rectangular waveguide modes, using coupling coefficients similar to those developed by Wurtele, which include effects of spatial variations in the fields seen by the wiggler motion of the particles. Our coefficients differ from those of Wurtele in two respects. First, we have found a missing √(2γ)/a_w factor in his C_z; when corrected, this increases the effect of the E_z field component, which in turn reduces the amplitude of the TM mode. Second, we have consistently retained all terms of second order in the wiggle amplitude. Both corrections are necessary for accurate computation. GFEL has the capability of following the TE_0n and TE(M)_m1 modes simultaneously. GFEL produces results nearly identical to those from FRED if the coupling coefficients are adjusted to equal those implied by the algorithm in FRED. Normally, the two codes produce results that are similar but differ in detail due to the different treatment of modes higher than TE_01. 5 refs., 2 figs., 1 tab.

  14. Control System for the LLNL Kicker Pulse Generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watson, J A; Anaya, R M; Cook, E G

    2002-06-18

    A solid-state high voltage pulse generator with multi-pulse burst capability, very fast rise and fall times, pulse width agility, and amplitude modulation capability for use with high speed electron beam kickers has been designed and tested at LLNL. A control system calculates a desired waveform to be applied to the kicker based on measured electron beam displacement and then adjusts the pulse generators to provide the desired waveform. This paper presents the design of the control system and measured performance data from operation on the ETA-II accelerator at LLNL.

  15. The National Ignition Facility (NIF) and High Energy Density Science Research at LLNL (Briefing Charts)

    DTIC Science & Technology

    2013-06-21

    Briefing-chart excerpts: neutron activation detectors (FNADS); detector locations; average ρR ~ 1 g/cm² with ~50% variations, motivating new 2D backlit imaging of the implosion and Compton radiography for stagnated fuel shape; ρR map from neutron activation detectors (90Zr(n,2n)89Zr); high-energy cosmic rays (Oxford Univ./LLNL); novel phases of compressed diamond; synthesis of elements heavier than iron; neutron flux.

  16. Demonstration of Laser Plasma X-Ray Source with X-Ray Collimator Final Report CRADA No. TC-1564-99

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lane, S. M.; Forber, R. A.

    2017-09-28

    This collaborative effort between the University of California, Lawrence Livermore National Laboratory (LLNL) and JMAR Research, Inc. (JRI), was to demonstrate that LLNL x-ray collimators can effectively increase the wafer throughput of JRI's laser based x-ray lithography systems. The technical objectives were expected to be achieved by completion of the following tasks, which are separated into two task lists by funding source. The organization (LLNL or JMAR) having primary responsibility is given parenthetically for each task.

  17. Institute of Geophysics and Planetary Physics (IGPP), Lawrence Livermore National Laboratory (LLNL): Quinquennial report, November 14-15, 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tweed, J.

    1996-10-01

    This Quinquennial Review Report of the Lawrence Livermore National Laboratory (LLNL) branch of the Institute for Geophysics and Planetary Physics (IGPP) provides an overview of IGPP-LLNL, its mission, and research highlights of current scientific activities. This report also presents an overview of the University Collaborative Research Program (UCRP), a summary of the UCRP Fiscal Year 1997 proposal process and the project selection list, a funding summary for 1993-1996, seminars presented, and scientific publications. 2 figs., 3 tabs.

  18. Three-Dimensional Modeling of Low-Mode Asymmetries in OMEGA Cryogenic Implosions

    NASA Astrophysics Data System (ADS)

    Anderson, K. S.; McKenty, P. W.; Shvydky, A.; Collins, T. J. B.; Forrest, C. J.; Knauer, J. P.; Marozas, J. A.; Marshall, F. J.; Radha, P. B.; Sefkow, A. B.; Marinak, M. M.

    2017-10-01

    In direct-drive inertial confinement fusion implosions, long-wavelength asymmetries resulting from target offset, laser power imbalance, beam mispointing, etc. can be highly detrimental to target performance. Characterizing the effects of these asymmetry sources requires 3-D simulations performed in full-sphere geometry to accurately capture the evolution of shell perturbations and hot-spot flow. This paper will present 3-D HYDRA simulations characterizing the impact of these perturbation sources on yield and shell modulation. Various simulated observables are generated, and trends are analyzed and compared with experimental data. This material is based on work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944 and performed under the auspices of LLNL under Contract No. DE-AC52-07NA27344.

  19. Mechanical Engineering Department engineering research: Annual report, FY 1986

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denney, R.M.; Essary, K.L.; Genin, M.S.

    1986-12-01

    This report provides information on the five areas of research interest in LLNL's Mechanical Engineering Department. In Computer Code Development, a solid geometric modeling program is described. In Dynamic Systems and Control, structure control and structure dynamics are discussed. Fabrication technology involves machine cutting, interferometry, and automated optical component manufacturing. Materials engineering reports on composite material research and measurement of molten metal surface properties. In Nondestructive Evaluation, NMR, CAT, and ultrasound machines are applied to manufacturing processes. A model for underground collapse is developed. Finally, an alternative heat exchanger is investigated for use in a fusion power plant. Separate abstracts were prepared for each of the 13 reports in this publication. (JDH)

  20. Analysis and Design of a Fiber-optic Probe for DNA Sensors Final Report CRADA No. TSB-1147-95

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molau, Nicole; Vail, Curtis

    In 1995, a challenge in the field of genetics was the development of efficient DNA sequencing techniques for reading the 3 billion base pairs that comprise the human genome. AccuPhotonics, Inc. proposed to develop and manufacture a state-of-the-art near-field scanning optical microscopy (NSOM) fiber-optic probe that was expected to increase probe efficiency by two orders of magnitude over the existing state of the art and to improve resolution to 10 Å. The detailed design calculation and optimization of the electrical properties of the fiber-optic probe tip geometry would be performed at LLNL, using existing finite-difference time-domain (FDTD) electromagnetic (EM) codes.
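    The FDTD method referenced above advances electric and magnetic fields on a staggered (Yee) grid in leapfrog fashion. A minimal 1D sketch in normalized units (c = Δx = 1, Courant number 0.5), not the LLNL production codes:

```python
import numpy as np

nx, nt, dt = 200, 300, 0.5
Ez = np.zeros(nx)        # electric field at integer grid points
Hy = np.zeros(nx - 1)    # magnetic field at half-integer points

for n in range(nt):
    Hy += dt * (Ez[1:] - Ez[:-1])            # update H from the curl of E
    Ez[1:-1] += dt * (Hy[1:] - Hy[:-1])      # update E from the curl of H
    Ez[nx // 4] += np.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source

print(np.max(np.abs(Ez)))  # pulse amplitude stays finite and bounded
```

    A probe-tip model adds the remaining dimensions, a permittivity map for the tip geometry, and absorbing boundaries, but the leapfrog update pattern is the same.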

  1. Not ''just'' pump and treat

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Angleberger, K; Bainer, R W

    2000-12-12

    The Lawrence Livermore National Laboratory (LLNL) has been consistently improving the site cleanup methods by adopting new philosophies, strategies and technologies to address constrained or declining budgets, lack of useable space due to a highly industrialized site, and significant technical challenges. As identified in the ROD, the preferred remedy at the LLNL Livermore Site is pump and treat, although LLNL has improved this strategy to bring the remediation of the ground water to closure as soon as possible. LLNL took the logical progression from a pump and treat system to the philosophy of ''Smart Pump and Treat'' coupled with the concepts of ''Hydrostratigraphic Unit Analysis,'' ''Engineered Plume Collapse,'' and ''Phased Source Remediation,'' which led to the development of new, more cost-effective technologies which have accelerated the attainment of cleanup goals significantly. Modeling is also incorporated to constantly develop new, cost-effective methodologies to accelerate cleanup and communicate the progress of cleanup to stakeholders. In addition, LLNL improved on the efficiency and flexibility of ground water treatment facilities. Ground water cleanup has traditionally relied on costly and obtrusive fixed treatment facilities. LLNL has designed and implemented various portable ground water treatment units to replace the fixed facilities; the application of each type of facility is determined by the amount of ground water flow and contaminant concentrations. These treatment units have allowed for aggressive ground water cleanup, increased cleanup flexibility, and reduced capital and electrical costs. After a treatment unit has completed ground water cleanup at one location, it can easily be moved to another location for additional ground water cleanup.

  2. Addressing Transportation Energy and Environmental Impacts: Technical and Policy Research Directions

    DOT National Transportation Integrated Search

    1995-08-10

    The Lawrence Livermore National Laboratory (LLNL) is establishing a local chapter of the University of California Energy Institute (UCEI). In order to most effectively contribute to the Institute, LLNL sponsored a workshop on energy and environmental...

  3. Environmental Report 1993-1996

    DOT National Transportation Integrated Search

    2002-08-16

    These reports are prepared for the U.S. Department of Energy (DOE), as required by DOE Order 5400.1 and DOE Order 231.1, by the Environmental Protection Department (EPD) at the Lawrence Livermore National Laboratory (LLNL). The results of LLNL's envi...

  4. Status of parallel Python-based implementation of UEDGE

    NASA Astrophysics Data System (ADS)

    Umansky, M. V.; Pankin, A. Y.; Rognlien, T. D.; Dimits, A. M.; Friedman, A.; Joseph, I.

    2017-10-01

    The tokamak edge transport code UEDGE has long used the code-development and run-time framework Basis. However, with support for Basis expected to end in the coming years, and with the advent of the modern numerical language Python, it has become desirable to move UEDGE to Python to ensure its long-term viability. Our new Python-based UEDGE implementation takes advantage of the portable build system developed for FACETS. The new implementation gives access to Python's graphical libraries and numerical packages for pre- and post-processing, and its support for HDF5 simplifies data exchange. The older serial version of UEDGE used the Newton-Krylov solver NKSOL for time-stepping. The renovated implementation uses backward Euler discretization with nonlinear solvers from PETSc, which promises to significantly improve UEDGE's parallel performance. We will report on an assessment of some of the extended UEDGE capabilities emerging in the new implementation and discuss future directions. Work performed for U.S. DOE by LLNL under contract DE-AC52-07NA27344.
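    The backward Euler scheme with a nonlinear solve per step, as described above, can be illustrated on a toy stiff ODE; the hand-rolled scalar Newton iteration below is only a stand-in for the PETSc solvers and the UEDGE transport equations:

```python
def backward_euler_step(f, dfdu, u_n, dt, tol=1e-12, max_iter=50):
    """One backward Euler step u_{n+1} = u_n + dt*f(u_{n+1}), with the
    implicit equation solved by Newton's method (a scalar stand-in for
    a PETSc SNES solve)."""
    u = u_n  # initial Newton guess: previous state
    for _ in range(max_iter):
        r = u - u_n - dt * f(u)           # residual of the implicit step
        if abs(r) < tol:
            break
        u -= r / (1.0 - dt * dfdu(u))     # Newton update with the Jacobian
    return u

# Toy stiff nonlinear problem du/dt = -u^3, u(0) = 1.
f = lambda u: -u ** 3
dfdu = lambda u: -3.0 * u ** 2
u, dt = 1.0, 0.1
for _ in range(100):                      # integrate to t = 10
    u = backward_euler_step(f, dfdu, u, dt)
print(u)  # approaches the exact solution 1/sqrt(1 + 2t) as dt -> 0
```

    Backward Euler remains stable even for step sizes far larger than the fastest decay timescale, which is why implicit schemes suit stiff edge-transport systems.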

  5. Neutron transport-burnup code MCORGS and its application in fusion fission hybrid blanket conceptual research

    NASA Astrophysics Data System (ADS)

    Shi, Xue-Ming; Peng, Xian-Jue

    2016-09-01

    Fusion science and technology has made progress in the last decades. However, commercialization of fusion reactors still faces challenges relating to higher fusion energy gain, irradiation-resistant materials, and tritium self-sufficiency. Fusion Fission Hybrid Reactors (FFHR) can be introduced to accelerate the early application of fusion energy. Traditionally, FFHRs have been classified as either breeders or transmuters. Both need partition of plutonium from spent fuel, which poses nuclear proliferation risks. A conceptual design of a Fusion Fission Hybrid Reactor for Energy (FFHR-E), which can make full use of natural uranium with lower nuclear proliferation risk, is presented. The fusion core parameters are similar to those of the International Thermonuclear Experimental Reactor. An alloy of natural uranium and zirconium is adopted in the fission blanket, which is cooled by light water. In order to model blanket burnup problems, a linkage code, MCORGS, which couples MCNP4B and ORIGEN-S, is developed and validated through several typical benchmarks. The average blanket energy multiplication and tritium breeding ratio can be maintained at 10 and 1.15, respectively, over tens of years of continuous irradiation. If simple reprocessing without separation of plutonium from uranium is adopted every few years, FFHR-E can achieve better neutronic performance. MCORGS has also been used to analyze the ultra-deep burnup model of Laser Inertial Confinement Fusion Fission Energy (LIFE) from LLNL, and a new blanket design that uses Pb instead of Be as the neutron multiplier is proposed. In addition, MCORGS has been used to simulate the fluid transmuter model of the In-Zinerater from Sandia. A brief comparison of LIFE, In-Zinerater, and FFHR-E will be given.
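    A transport-depletion linkage of the MCORGS kind alternates a flux calculation with a depletion solve over each burnup step. A schematic sketch with an invented two-nuclide chain and invented rates (the stand-in functions below are not MCNP4B or ORIGEN-S):

```python
import numpy as np
from scipy.linalg import expm

# Toy chain: a parent nuclide captures neutrons (rate sigma*phi) to form
# a daughter that decays (rate lam). All values are illustrative only.
sigma = 1.0e-24   # capture cross section, cm^2
lam = 1.0e-8      # daughter decay constant, 1/s
N = np.array([1.0e24, 0.0])  # atom inventory of [parent, daughter]

def transport_step(N):
    """Stand-in for a Monte Carlo flux calculation: here the flux simply
    scales with the remaining parent inventory."""
    return 1.0e14 * N[0] / 1.0e24   # n/cm^2/s

def depletion_step(N, phi, dt):
    """Stand-in for a depletion solve: dN/dt = A N over the step,
    integrated exactly with a matrix exponential."""
    A = np.array([[-sigma * phi, 0.0],
                  [ sigma * phi, -lam]])
    return expm(A * dt) @ N

dt = 3.15e7  # ~1 year per coupling step
for _ in range(10):
    phi = transport_step(N)          # flux from the current composition
    N = depletion_step(N, phi, dt)   # burn the composition at that flux
print(N)  # parent slowly depleted; daughter built up
```

    The real linkage replaces the two stand-ins with full transport and point-depletion solves, but the alternating operator-split loop is the same.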

  6. Velocimetry Overview for visitors from the DOD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briggs, Matthew E.; Holtkamp, David Bruce

    2016-08-19

    We are in the midst of a transformative period in which technological advances are making fundamental changes in the measurement techniques that form the backbone of nuclear weapon certification. Optical velocimetry has replaced electrical shorting pins in “Hydrotests,” which measure the dynamic implosion process. This advance has revolutionized nuclear weapons certification during the last 5 years. We can now measure the implosion process that drives a nuclear detonation with many orders of magnitude more resolution in both space and time than was possible just 10 years ago. It has been compared to going from Morse Code to HDTV, resulting in a dozen or more improvements in models of these weapons. These Hydrotests are carried out at LANL, LLNL and the NNSS, with the latter holding the important role of allowing us to test with nuclear materials in sub-critical configurations (i.e., no yield). Each of these institutions has largely replaced pins with hundreds of channels of optical velocimetry. Velocimetry is non-contact and is used simultaneously with the X-ray capability of these facilities. The U1-a facility at NNSS pioneered this approach in the Gemini series in 2012, and continues to lead, both in channel count and technological advances. Close cooperation among LANL, LLNL and NSTec in these advances serves the complex by leveraging capabilities across sites and accelerating the pace of technical improvements.

  7. Target design for materials processing very far from equilibrium

    NASA Astrophysics Data System (ADS)

    Barnard, John J.; Schenkel, Thomas

    2016-10-01

    Local heating and electronic excitations can trigger phase transitions or novel material states that can be stabilized by rapid quenching. An example on the few-nanometer scale is the phase transitions induced by the passage of swift heavy ions in solids, where nitrogen-vacancy color centers form locally in diamond when ions heat the diamond matrix to warm-dense-matter conditions at 0.5 eV. We optimize mask geometries for target materials such as silicon and diamond to induce phase transitions by intense ion pulses (e.g., from NDCX-II or from laser-plasma acceleration). The goal is to rapidly heat a solid target volumetrically and to trigger a phase transition or local lattice reconstruction followed by rapid cooling. The stabilized phase can then be studied ex situ. We performed HYDRA simulations that calculate peak temperatures for a series of excitation conditions and cooling rates of crystal targets with micro-structured masks. A simple analytical model that includes ion heating and radial, diffusive cooling was developed and agrees closely with the HYDRA simulations. The model gives scaling laws that can guide the design of targets over a wide range of parameters, including those for NDCX-II and the proposed BELLA-i. This work was performed under the auspices of the U.S. DOE under contracts DE-AC52-07NA27344 (LLNL) and DE-AC02-05CH11231 (LBNL) and was supported by the US DOE Office of Science, Fusion Energy Sciences. LLNL-ABS-697271.
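    A radial, diffusive cooling model of the kind described above reduces to a characteristic timescale: a heated filament of radius r in a medium of thermal diffusivity D cools in roughly τ ≈ r²/(4D). A sketch using a rough room-temperature diffusivity for diamond (illustrative only; warm-dense-matter conditions differ, and this is not the paper's fitted model):

```python
def cooling_time(radius_m, diffusivity_m2_s):
    """Characteristic radial diffusion cooling time, tau ~ r^2 / (4 D)."""
    return radius_m ** 2 / (4.0 * diffusivity_m2_s)

D_diamond = 1.1e-3   # m^2/s, approximate room-temperature value for diamond
for r in (10e-9, 100e-9, 1e-6):   # 10 nm, 100 nm, 1 um heated radii
    print(f"r = {r:.0e} m  ->  tau ~ {cooling_time(r, D_diamond):.2e} s")
```

    The quadratic scaling shows why nanometer-scale heated tracks quench in picoseconds while micron-scale volumetrically heated regions take hundreds of picoseconds, which is what makes mask geometry a useful design parameter.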

  8. LLNL Center of Excellence Work Items for Q9-Q10 period

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neely, J. R.

    This work plan encompasses a slice of the effort going on within the ASC program and, for projects utilizing COE vendor resources, describes work that will be performed collaboratively by LLNL staff and COE vendor staff.

  9. LLNL Results from CALIBAN-PROSPERO Nuclear Accident Dosimetry Experiments in September 2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lobaugh, M. L.; Hickman, D. P.; Wong, C. W.

    2015-05-21

    Lawrence Livermore National Laboratory (LLNL) uses thin neutron activation foils, sulfur, and threshold energy shielding to determine neutron component doses and the total neutron dose in the event of a nuclear criticality accident. The dosimeter also uses a DOELAP-accredited Panasonic UD-810 (Panasonic Industrial Devices Sales Company of America, 2 Riverfront Plaza, Newark, NJ 07102, U.S.A.) thermoluminescent dosimetry system (TLD) for determining the gamma component of the total dose. LLNL has participated in three international intercomparisons of nuclear accident dosimeters. In October 2009, LLNL participated in an exercise at the French Commissariat à l’énergie atomique et aux énergies alternatives (Alternative Energies and Atomic Energy Commission, CEA) Research Center at Valduc utilizing the SILENE reactor (Hickman et al. 2010). In September 2010, LLNL participated in a second intercomparison at CEA Valduc, this time with exposures at the CALIBAN reactor (Hickman et al. 2011). This paper discusses LLNL’s results from a third intercomparison, hosted by the French Institut de Radioprotection et de Sûreté Nucléaire (Institute for Radiation Protection and Nuclear Safety, IRSN), with exposures at two CEA Valduc reactors (CALIBAN and PROSPERO) in September 2014. Comparison results between the three participating facilities are presented elsewhere (Chevallier 2015; Duluc 2015).
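    The activation-foil technique above rests on a simple relation: for a brief pulse, activated atoms ≈ N_atoms·σ·Φ, and the measured activity (decay-corrected to the pulse time) is λ times the activated atoms. A one-function sketch with invented foil parameters (not LLNL calibration data):

```python
import math

def fluence_from_activity(activity_bq, n_atoms, sigma_cm2, half_life_s):
    """Infer neutron fluence (n/cm^2) from the measured activity of a thin
    foil after a short, pulse-like exposure."""
    lam = math.log(2.0) / half_life_s         # decay constant, 1/s
    activated = activity_bq / lam             # atoms activated by the pulse
    return activated / (n_atoms * sigma_cm2)  # fluence, n/cm^2

# Hypothetical foil: 1e21 atoms, 0.1 barn effective cross section,
# 15 h product half-life; a measured 1 kBq then implies:
phi = fluence_from_activity(1.0e3, 1.0e21, 1.0e-25, 15 * 3600.0)
print(f"inferred fluence ~ {phi:.2e} n/cm^2")
```

    Stacking foils behind different threshold shields, as the dosimeter does, yields fluences in several energy bands, from which the neutron component doses are reconstructed.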

  10. FY16 LLNL Omega Experimental Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heeter, R. F.; Ali, S. J.; Benstead, J.

    In FY16, LLNL’s High-Energy-Density Physics (HED) and Indirect Drive Inertial Confinement Fusion (ICF-ID) programs conducted several campaigns on the OMEGA laser system and on the EP laser system, as well as campaigns that used the OMEGA and EP beams jointly. Overall, these LLNL programs led 430 target shots in FY16, with 304 shots using just the OMEGA laser system, and 126 shots using just the EP laser system. Approximately 21% of the total number of shots (77 OMEGA shots and 14 EP shots) supported the Indirect Drive Inertial Confinement Fusion Campaign (ICF-ID). The remaining 79% (227 OMEGA shots and 112 EP shots) were dedicated to experiments for High-Energy-Density Physics (HED). Highlights of the various HED and ICF campaigns are summarized in the following reports. In addition to these experiments, LLNL Principal Investigators led a variety of Laboratory Basic Science campaigns using OMEGA and EP, including 81 target shots using just OMEGA and 42 shots using just EP. The highlights of these are also summarized, following the ICF and HED campaigns. Overall, LLNL PIs led a total of 553 shots at LLE in FY 2016. In addition, LLNL PIs also supported 57 NLUF shots on OMEGA and 31 NLUF shots on EP, in collaboration with the academic community.
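
    The shot bookkeeping quoted above can be cross-checked with a few lines of arithmetic (the figures below are simply the numbers from the abstract):

```python
# Cross-check of the FY16 shot totals quoted in the abstract above.
omega_icf, ep_icf = 77, 14     # ICF-ID shots on OMEGA and EP
omega_hed, ep_hed = 227, 112   # HED shots on OMEGA and EP
omega_lbs, ep_lbs = 81, 42     # Laboratory Basic Science shots

program_total = omega_icf + ep_icf + omega_hed + ep_hed
icf_share = (omega_icf + ep_icf) / program_total
grand_total = program_total + omega_lbs + ep_lbs

print(program_total)        # 430 ICF-ID + HED shots
print(f"{icf_share:.0%}")   # 21% supported ICF-ID
print(grand_total)          # 553 total LLNL-led shots
```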

  11. Computation Directorate and Science & Technology Review: Computational Science and Research Featured in 2002

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alchorn, A L

    Thank you for your interest in the activities of the Lawrence Livermore National Laboratory Computation Directorate. This collection of articles from the Laboratory's Science & Technology Review highlights the most significant computational projects, achievements, and contributions during 2002. In 2002, LLNL marked the 50th anniversary of its founding. Scientific advancement in support of our national security mission has always been the core of the Laboratory. So that researchers could better understand and predict complex physical phenomena, the Laboratory has pushed the limits of the largest, fastest, most powerful computers in the world. In the late 1950s, Edward Teller--one of the LLNL founders--proposed that the Laboratory commission a Livermore Advanced Research Computer (LARC) built to Livermore's specifications. He tells the story of being in Washington, DC, when John Von Neumann asked to talk about the LARC. Von Neumann thought Teller wanted too much memory in the machine. (The specifications called for 20-30,000 words.) Teller was too smart to argue with him. Later Teller invited Von Neumann to the Laboratory and showed him one of the design codes being prepared for the LARC. He asked Von Neumann for suggestions on fitting the code into 10,000 words of memory, and flattered him that ''Labbies'' were not smart enough to figure it out. Von Neumann dropped his objections, and the LARC arrived with 30,000 words of memory. Memory, and how close memory is to the processor, is still of interest to us today. Livermore's first supercomputer was the Remington-Rand Univac-1. It had 5600 vacuum tubes and was 2 meters wide by 4 meters long. This machine was commonly referred to as a 1 KFlop (E+3) machine. Skip ahead 50 years. The ASCI White machine at the Laboratory today, produced by IBM, is rated at a peak performance of 12.3 TFlops, or E+13.
We've improved computer processing power by 10 orders of magnitude in 50 years, and I do not believe there's any reason to think we won't improve another 10 orders of magnitude in the next 50 years. For years I have heard talk of hitting the physical limits of Moore's Law, but new technologies such as 3-D chips, molecular computing, and quantum computing will take us into the next phase of computer processing power. Big computers are icons or symbols of the culture and larger infrastructure that exists at LLNL to guide scientific discovery and engineering development. We have dealt with balance issues for 50 years and will continue to do so in our quest for a digital proxy of the properties of matter at extremely high temperatures and pressures. I believe that the next big computational win will be the merger of high-performance computing with information management. We already create terabytes--soon to be petabytes--of data. Efficiently storing, finding, visualizing, and extracting data, and turning those data into knowledge that aids decision-making and scientific discovery, is an exciting challenge. In the meantime, please enjoy this retrospective on computational physics, computer science, advanced software technologies, and applied mathematics performed by programs and researchers at LLNL during 2002. It offers a glimpse into the stimulating world of computational science in support of the national missions and homeland defense.
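
    The ten-orders-of-magnitude claim above is easy to verify from the two peak-performance figures quoted (a back-of-the-envelope check; both numbers are taken directly from the text):

```python
import math

univac_1_flops = 1e3        # Univac-1: ~1 KFlop (E+3)
asci_white_flops = 12.3e12  # ASCI White: 12.3 TFlops (~E+13)

factor = asci_white_flops / univac_1_flops
orders = math.log10(factor)

print(f"{factor:.3g}")   # 1.23e+10
print(f"{orders:.1f}")   # 10.1 orders of magnitude in ~50 years
```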

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herrnstein, Aaron R.

    An ocean model with adaptive mesh refinement (AMR) capability is presented for simulating ocean circulation on decade time scales. The model closely resembles the LLNL ocean general circulation model, with some components incorporated from other well-known ocean models where appropriate. Spatial components are discretized using finite differences on a staggered grid where tracer and pressure variables are defined at cell centers and velocities at cell vertices (B-grid). Horizontal motion is modeled explicitly with leapfrog and Euler forward-backward time integration, and vertical motion is modeled semi-implicitly. New AMR strategies are presented for horizontal refinement on a B-grid, leapfrog time integration, and time integration of coupled systems with unequal time steps. These AMR capabilities are added to the LLNL software package SAMRAI (Structured Adaptive Mesh Refinement Application Infrastructure) and validated with standard benchmark tests. The ocean model is built on top of the amended SAMRAI library. The resulting model has the capability to dynamically increase resolution in localized areas of the domain. Limited basin tests are conducted using various refinement criteria and produce convergence trends in the model solution as refinement is increased. Carbon sequestration simulations are performed on decade time scales in domains the size of the North Atlantic and the global ocean. A suggestion is given for refinement criteria in such simulations. AMR predicts maximum pH changes and increases in CO2 concentration near the injection sites that are virtually unattainable with a uniform high resolution due to extremely long run times. Fine-scale details near the injection sites are achieved by AMR with shorter run times than the finest uniform resolution tested, despite the need for enhanced parallel performance. The North Atlantic simulations show a reduction in passive tracer errors when AMR is applied instead of a uniform coarse resolution.
No dramatic or persistent signs of error growth in the passive tracer outgassing or the ocean circulation are observed to result from AMR.
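
    As a toy illustration of the explicit leapfrog stepping named above, the sketch below applies leapfrog time integration (with a single Euler forward step to start, since leapfrog needs two time levels) to 1D linear advection on a periodic grid. This is a minimal sketch of the time-integration idea only, not the LLNL/SAMRAI ocean model; the grid, wave speed, and step sizes are arbitrary illustrative choices.

```python
import numpy as np

def leapfrog_advection(u0, c=1.0, dx=0.1, dt=0.05, nsteps=100):
    """Leapfrog time integration for u_t + c u_x = 0 on a periodic grid."""
    u_prev = u0.copy()
    # First step is Euler forward: leapfrog needs two time levels to start.
    u_curr = u_prev - c * dt / (2 * dx) * (np.roll(u_prev, -1) - np.roll(u_prev, 1))
    for _ in range(nsteps - 1):
        # u^{n+1}_i = u^{n-1}_i - (c*dt/dx) * (u^n_{i+1} - u^n_{i-1})
        u_next = u_prev - c * dt / dx * (np.roll(u_curr, -1) - np.roll(u_curr, 1))
        u_prev, u_curr = u_curr, u_next
    return u_curr

x = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
u = leapfrog_advection(np.sin(x))
print(u.shape)  # (64,)
```

With centered differences on a periodic grid, the scheme conserves the domain sum of u exactly, which makes a handy sanity check.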

  13. Infrared Imaging Camera Final Report CRADA No. TC02061.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roos, E. V.; Nebeker, S.

    This was a collaborative effort between the University of California, Lawrence Livermore National Laboratory (LLNL), and the Cordin Company (Cordin) to enhance the U.S. ability to develop a commercial infrared camera capable of capturing high-resolution images in a 100-nanosecond (ns) time frame. The Department of Energy (DOE), under an Initiative for Proliferation Prevention (IPP) project, funded the Russian Federation Nuclear Center All-Russian Scientific Institute of Experimental Physics (RFNC-VNIIEF) in Sarov. VNIIEF was funded to develop a prototype commercial infrared (IR) framing camera and to deliver a prototype IR camera to LLNL. LLNL and Cordin were partners with VNIIEF on this project. A prototype IR camera was delivered by VNIIEF to LLNL in December 2006. In June of 2007, LLNL and Cordin evaluated the camera, and the test results revealed that it exceeded presently available commercial IR cameras. Cordin believes that the camera can be sold on the international market. The camera is currently being used as a scientific tool within Russian nuclear centers. This project was originally designated as a two-year project. The project was not started on time due to changes in the IPP project funding conditions; the project funding was re-directed through the International Science and Technology Center (ISTC), which delayed the project start by over one year. The project was not completed on schedule due to changes within Russian government export regulations, directed by Export Control regulations on the export of high-technology items that can be used to develop military weapons. The IR camera was on the list of items subject to these export controls. After negotiations, the ISTC and the Russian government allowed delivery of the camera to LLNL. There were no significant technical or business changes to the original project.

  14. Atmospheric Test Models and Numerical Experiments for the Simulation of the Global Distributions of Weather Data Transponders III. Horizontal Distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molenkamp, C.R.; Grossman, A.

    1999-12-20

    A network of small balloon-borne transponders that gather very high resolution wind and temperature data for use by modern numerical weather prediction models has been proposed to improve the reliability of long-range weather forecasts. The global distribution of an array of such transponders is simulated using LLNL's atmospheric parcel transport model (GRANTOUR) with winds supplied by two different general circulation models. An initial study used winds from CCM3 with a horizontal resolution of about 3 degrees in latitude and longitude, and a second study used winds from NOGAPS with a 0.75-degree horizontal resolution. Results from both simulations show that reasonable global coverage can be attained by releasing balloons from an appropriate set of launch sites.
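
    The parcel-transport idea above can be caricatured in a few lines: advect point parcels in a prescribed wind field and watch how coverage develops. The sketch below uses a purely zonal, uniform-speed wind; the wind speed and all parameters are illustrative assumptions, not GRANTOUR or the CCM3/NOGAPS wind fields.

```python
import numpy as np

def advect_parcels(lon, lat, u_ms, dt_s, nsteps, radius=6.371e6):
    """Advance parcel longitudes (deg) under a purely zonal wind u_ms (m/s)."""
    lon = np.asarray(lon, dtype=float).copy()
    lat = np.asarray(lat, dtype=float)
    for _ in range(nsteps):
        # Convert zonal speed to degrees/s at each parcel's latitude.
        dlon = np.degrees(u_ms * dt_s / (radius * np.cos(np.radians(lat))))
        lon = (lon + dlon) % 360.0
    return lon

lon0 = np.array([0.0, 0.0])
lat = np.array([0.0, 60.0])
lon = advect_parcels(lon0, lat, u_ms=10.0, dt_s=3600.0, nsteps=24)
print(lon)  # the 60-degree parcel drifts twice as far east as the equatorial one
```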

  15. NLTE atomic kinetics modeling in ICF target simulations

    NASA Astrophysics Data System (ADS)

    Patel, Mehul V.; Mauche, Christopher W.; Scott, Howard A.; Jones, Ogden S.; Shields, Benjamin T.

    2017-10-01

    Radiation hydrodynamics (HYDRA) simulations using recently developed 1D spherical and 2D cylindrical hohlraum models have enabled a reassessment of the accuracy of energetics modeling across a range of NIF target configurations. Higher-resolution hohlraum calculations generally find that the X-ray drive discrepancies are greater than previously reported. We identify important physics sensitivities in the modeling of the NLTE wall plasma and highlight sensitivity variations between different hohlraum configurations (e.g., hohlraum gas fill). Additionally, 1D capsule-only simulations show the importance of applying a similar level of rigor to NLTE capsule ablator modeling. Taken together, these results show how improved target performance predictions can be achieved by performing inline atomic kinetics using more complete models for the underlying atomic structure and transitions. Prepared by LLNL under Contract DE-AC52-07NA27344.

  16. Advancing Your Career at LLNL: Meet NIF’s Radiation Control Technicians

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zarco, Judy; Gutierrez, Myrna; Beale, Richard

    2017-04-26

    Myrna Gutierrez and Judy Zarco took advantage of LLNL's legacy of encouraging continuing education to get the necessary degrees and training to advance their careers at the Lab. As Radiation Control Technicians, they help maintain safety at the National Ignition Facility.

  17. Predictive Model and Methodology for Heat Treatment Distortion Final Report CRADA No. TC-298-92

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nikkel, D. J.; McCabe, J.

    This project was a multi-lab, multi-partner CRADA involving LLNL, Los Alamos National Laboratory, Sandia National Laboratories, Oak Ridge National Laboratory, Martin Marietta Energy Systems, and the industrial partner, the National Center for Manufacturing Sciences (NCMS). A number of member companies of NCMS participated, including General Motors Corporation, Ford Motor Company, The Torrington Company, Gear Research, the Illinois Institute of Technology Research Institute, and Deformation Control Technology. LLNL was the lead laboratory for the metrology technology used for validation of the computational tool/methodology. LLNL was also the lead laboratory for the development of the software user interface for the computational tool. This report focuses on the participation of LLNL and NCMS. The purpose of the project was to develop a computational tool/methodology that engineers would use to predict the effects of heat treatment on the size and shape of industrial parts made of quench-hardenable alloys. Initially, the target application of the tool was gears for automotive power trains.

  18. Comprehensive Angular Response Study of LLNL Panasonic Dosimeter Configurations and Artificial Intelligence Algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stone, D. K.

    In April of 2016, the Lawrence Livermore National Laboratory External Dosimetry Program underwent a Department of Energy Laboratory Accreditation Program (DOELAP) on-site assessment. The assessment raised a concern that the 2013 study "Angular Dependence Study: Panasonic UD-802 and UD-810 Dosimeters, LLNL Artificial Intelligence Algorithm" was incomplete: only the responses at 0° and ±60° were evaluated, and independent dosimeter data were not used to evaluate the algorithm. Additionally, other configurations of LLNL dosimeters were not considered in that study, including nuclear accident dosimeters (NADs), which are placed in the wells surrounding the TLD in the dosimeter holder.

  19. Michael M. May: Working toward solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    May, M.M.

    1993-07-01

    As part of LLNL's 40th anniversary celebration held during 1992, the six former Directors were asked to participate in a lecture series. Each of these men contributed in important ways toward making the Lawrence Livermore National Laboratory (LLNL) what it has become today. Each was asked to comment on some of the Laboratory's accomplishments, his career here, his view of the changing world, and where he sees the Laboratory going in the future. Michael M. May, LLNL's fifth Director and now a Director Emeritus, comments on a broad range of issues including arms control, nonproliferation, cooperative security, and the future role of the Laboratory.

  20. Optics & Materials Science & Technology (OMST) Organization at LLNL

    ScienceCinema

    Suratwala, Tayyab; Nguyen, Hoang; Bude, Jeff; Dylla-Spears, Rebecca

    2018-06-13

    The Optics and Materials Science & Technology (OMST) organization at Lawrence Livermore National Laboratory (LLNL) supplies optics, recycles optics, and performs the materials science and technology to advance optics and optical materials for high-power and high-energy lasers for a variety of missions. The organization is a core capability at LLNL. We have a strong partnership with many optical fabricators, universities and national laboratories to accomplish our goals. The organization has a long history of performing fundamental optical materials science, developing them into useful technologies, and transferring them into production both on-site and off-site. We are successfully continuing this same strategy today.

  1. Optics & Materials Science & Technology (OMST) Organization at LLNL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suratwala, Tayyab; Nguyen, Hoang

    The Optics and Materials Science & Technology (OMST) organization at Lawrence Livermore National Laboratory (LLNL) supplies optics, recycles optics, and performs the materials science and technology to advance optics and optical materials for high-power and high-energy lasers for a variety of missions. The organization is a core capability at LLNL. We have a strong partnership with many optical fabricators, universities and national laboratories to accomplish our goals. The organization has a long history of performing fundamental optical materials science, developing them into useful technologies, and transferring them into production both on-site and off-site. We are successfully continuing this same strategy today.

  2. Technical Report: Benchmarking for Quasispecies Abundance Inference with Confidence Intervals from Metagenomic Sequence Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McLoughlin, K.

    2016-01-22

    The software application “MetaQuant” was developed by our group at Lawrence Livermore National Laboratory (LLNL). It is designed to profile microbial populations in a sample using data from whole-genome shotgun (WGS) metagenomic DNA sequencing. Several other metagenomic profiling applications have been described in the literature. We ran a series of benchmark tests to compare the performance of MetaQuant against that of a few existing profiling tools, using real and simulated sequence datasets. This report describes our benchmarking procedure and results.

  3. Image Matrix Processor for Volumetric Computations Final Report CRADA No. TSB-1148-95

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberson, G. Patrick; Browne, Jolyon

    The development of an Image Matrix Processor (IMP) was proposed that would provide an economical means to perform rapid ray-tracing processes on volume "Giga Voxel" data sets. This was a multi-phased project. The objective of the first phase of the IMP project was to evaluate the practicality of implementing a workstation-based Image Matrix Processor for use in volumetric reconstruction and rendering using hardware simulation techniques. Additionally, ARACOR and LLNL worked together to identify and pursue further funding sources to complete a second phase of this project.

  4. Multiple Experimental Platform Consistency at NIF

    NASA Astrophysics Data System (ADS)

    Benedetti, L. R.; Barrios, M. A.; Bradley, D. K.; Eder, D. C.; Khan, S. F.; Izumi, N.; Jones, O. S.; Ma, T.; Nagel, S. R.; Peterson, J. L.; Rygg, J. R.; Spears, B. K.; Town, R. P.

    2013-10-01

    ICF experiments at NIF utilize several platforms to assess different metrics of implosion quality. In addition to the point design (a target capsule of DT ice inside a thin plastic ablator), notable platforms include: (i) Symmetry Capsules (SymCaps), mass-adjusted CH capsules filled with DT gas for similar hydrodynamic performance without the need for a DT crystal; (ii) D:3He-filled SymCaps, designed for low neutron yield implosions to accommodate a variety of x-ray and optical diagnostics; and (iii) Convergent Ablators, SymCaps coupled with x-radiography to assess in-flight velocity and symmetry of the implosion over ~1 ns before stagnation and burn. These platforms are expected to be good surrogates for one another, and their hohlraum and implosion performance variations have been simulated in detail. By comparing results of similar experiments, we isolate platform-specific variations. We focus on the symmetry, convergence, and timing of x-ray emission as observed in each platform, as this can be used to infer stagnation pressure and temperature. This work performed under the auspices of the U.S. Dept. of Energy by LLNL under Contract DE-AC52-07NA27344. LLNL-ABS-640865.

  5. Modelling of Deflagration to Detonation Transition in Porous PETN of Density 1.4 g/cc with HERMES

    NASA Astrophysics Data System (ADS)

    Reaugh, John; Curtis, John; Maheswaran, Mary-Ann

    2017-06-01

    The modelling of Deflagration to Detonation Transition in explosives is a severe challenge for reactive burn models because of the complexity of the physics; there is mechanical and thermal interaction of the gaseous burn products with the burning porous matrix, with resulting compaction, shock formation and subsequent detonation. Experiments on the explosive PETN show a strong dependence of run distance to detonation on porosity. The minimum run distance appears to occur when the density is approximately 1.4 g/cc. Recent research on the High Explosive Response to Mechanical Stimulation (HERMES) model for High Explosive Violent Reaction has included the development of a model for PETN at 1.4 g/cc, which allows the prediction of the run distance in the experiments for PETN at this density. Detonation and retonation waves as seen in the experiment are evident. The HERMES simulations are analysed to help illuminate the physics occurring in the experiments. JER's work was performed under the auspices of the US DOE by LLNL under Contract DE-AC52-07NA27344 and partially funded by the Joint US DoD/DOE Munitions Technology Development Program. LLNL-ABS-723537.

  6. July 1999 working group meeting on heavy vehicle aerodynamic drag: presentations and summary of comments and conclusions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brady, M; Browand, F; Flowers, D

    A Working Group Meeting on Heavy Vehicle Aerodynamic Drag was held at the University of Southern California, Los Angeles, California on July 30, 1999. The purpose of the meeting was to present technical details on the experimental and computational plans and approaches and to provide an update on progress in obtaining experimental results, model developments, and simulations. The focus of the meeting was a review of the University of Southern California's (USC) experimental plans and results and the computational results from Lawrence Livermore National Laboratory (LLNL) and Sandia National Laboratories (SNL) for the integrated tractor-trailer benchmark geometry called the Sandia Model. Much of the meeting discussion involved the NASA Ames 7 ft x 10 ft wind tunnel tests and the need for documentation of the results. The present and projected budget and funding situation was also discussed. Presentations were given by representatives from the Department of Energy (DOE) Office of Transportation Technology Office of Heavy Vehicle Technology (OHVT), LLNL, SNL, USC, and the California Institute of Technology (Caltech). This report contains the technical presentations (viewgraphs) delivered at the meeting, briefly summarizes the comments and conclusions, and outlines future action items.

  7. Educational Revolution on the Reservation: A Working Model.

    ERIC Educational Resources Information Center

    Murphy, Pete

    1993-01-01

    Since 1986, Navajo Community College (NCC) and Lawrence Livermore National Laboratory (LLNL) have collaborated to improve science and technical education on the Navajo Reservation through equipment loans, faculty exchanges, summer student work at LLNL, scholarships for NCC students, summer workshops for elementary science teachers, and classroom…

  8. Fusion/Astrophysics Teacher Research Academy

    NASA Astrophysics Data System (ADS)

    Correll, Donald

    2005-10-01

    In order to engage California high school science teachers in plasma physics and fusion research, LLNL's Fusion Energy Program has partnered with the UC Davis Edward Teller Education Center, ETEC (http://etec.ucdavis.edu), the Stanford University Solar Center (http://solar-center.stanford.edu), and LLNL's Science/Technology Education Program, STEP (http://education.llnl.gov). A four-level ``Fusion & Astrophysics Research Academy'' has been designed to give teachers experience in conducting spectroscopy research with their students. Spectroscopy, and its relationship to atomic physics and electromagnetism, provides an ideal plasma `bridge' to the CA Science Education Standards (http://www.cde.ca.gov/be/st/ss/scphysics.asp). Teachers attend multiple-day professional development workshops to explore new research activities for use in the high school science classroom. The Level I three-day program consists of two days in which teachers learn how plasma researchers use spectrometers, followed by instruction in using a research-grade spectrometer for their own investigations. The third day includes a tour of LLNL's SSPX (http://www.mfescience.org/sspx/) facility to see spectrometry being used to measure plasma properties. Spectrometry classroom kits are available for loan to participating teachers. Level I workshop results (http://education.llnl.gov/fusion&_slash;astro/) will be presented, along with plans being developed for Level II (one-week advanced SKAs), Level III (pre-internship), and Level IV (summer internship) research academies.

  9. Polar tent for reduced perturbation of NIF ignition capsules

    NASA Astrophysics Data System (ADS)

    Hammel, B. A.; Pickworth, L.; Stadermann, M.; Field, J.; Robey, H.; Scott, H. A.; Smalyuk, V.

    2016-10-01

    In simulations, a tent that contacts the capsule near the poles and departs tangential to the capsule surface greatly reduces the capsule perturbation, and the resulting mass injected into the hot-spot, compared to current capsule support methods. Target fabrication appears feasible with a layered tent (43-nm polyimide + 8-nm C) for increased stiffness. We are planning quantitative measurements of the resulting shell- ρR perturbation near peak implosion velocity (PV) using enhanced self-emission backlighting, achieved by adding 1% Ar to the capsule fill in Symcaps (4He + H). Layered DT implosions are also planned for an integrated test of capsule performance. We will describe the design and simulation predictions. Prepared by LLNL under Contract DE-AC52-07NA27344.

  10. ARC-2010-ACD10-0020-073

    NASA Image and Video Library

    2010-02-10

    Lawrence Livermore National Labs (LLNL), Navistar and the Department of Energy conduct tests in the NASA Ames National Full-scale Aerodynamic Complex 80x120-foot wind tunnel. The LLNL project is aimed at aerodynamic truck and trailer devices that can reduce fuel consumption at highway speed by 10 percent. Smoke test demo.

  11. ARC-2010-ACD10-0020-065

    NASA Image and Video Library

    2010-02-10

    Lawrence Livermore National Labs (LLNL), Navistar and the Department of Energy conduct tests in the NASA Ames National Full-scale Aerodynamic Complex 80x120-foot wind tunnel. The LLNL project is aimed at aerodynamic truck and trailer devices that can reduce fuel consumption at highway speed by 10 percent. Smoke test demo.

  12. Wide Area Recovery and Resiliency Program (WARRP) Knowledge Enhancement Events: CBR Workshop After Action Report

    DTIC Science & Technology

    2012-01-01

    Laboratories Walker Ray Walker Engineering Solutions, LLC Williams Patricia Denver Office of Emergency Management Wood-Zika Annmarie Lawrence Livermore...llnl.gov AnnMarie Wood-Zika woodzika1@llnl.gov Pacific Northwest National Laboratory Ann Lesperance ann.lesperance@pnnl.gov Jessica Sandusky

  13. Fast Steering Mirror systems for the U-AVLIS program at LLNL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watson, J.; Avicola, K.; Payne, A.

    1994-07-01

    We have successfully deployed several fast steering mirror systems in the Uranium Atomic Vapor Isotope Separation (U-AVLIS) facility at LLNL. These systems employ 2 mm to 150 mm optics and piezoelectric actuators to achieve microradian pointing accuracy with disturbance rejection bandwidths to a few hundred hertz.

  14. Critical Homeland Infrastructure Protection

    DTIC Science & Technology

    2007-01-01

    talent. Examples include: * Detection of surveillance activities; * Stand-off detection of chemical, biological, nuclear, radiation and explosive ...Manager Guardian DARPA Overview Mr. Roger Gibbs DARPA LLNL Technologies in Support of Infrastructure Mr. Don Prosnitz LLNL Protection Sandia National...FP Antiterrorism/Force Protection CBRNE Chemical Biological Radiological Nuclear Explosive CERT Commuter Emergency Response Team CIA Central

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glascoe, Lee; Gowardhan, Akshay; Lennox, Kristin

    In the interest of promoting the international exchange of technical expertise, the US Department of Energy’s Office of Emergency Operations (NA-40) and the French Commissariat à l'Energie Atomique et aux énergies alternatives (CEA) requested that the National Atmospheric Release Advisory Center (NARAC) of Lawrence Livermore National Laboratory (LLNL) in Livermore, California host a joint tabletop exercise with experts in emergency management and atmospheric transport modeling. In this tabletop exercise, LLNL and CEA compared each other’s flow and dispersion models. The goal of the comparison is to facilitate the exchange of knowledge, capabilities, and practices, and to demonstrate the utility of modeling dispersal at different levels of computational fidelity. Two modeling approaches were examined: a regional-scale modeling approach, appropriate for simple terrain and/or very large releases, and an urban-scale modeling approach, appropriate for small releases in a city environment. This report is a summary of LLNL and CEA modeling efforts from this exercise. Two different types of LLNL and CEA models were employed in the analysis: urban-scale models (Aeolus CFD at LLNL/NARAC and Parallel-Micro-SWIFT-SPRAY, PMSS, at CEA) for analysis of a 5,000 Ci radiological release, and Lagrangian particle dispersion models (LODI at LLNL/NARAC and PSPRAY at CEA) for analysis of a much larger (500,000 Ci) regional radiological release. Two densely populated urban locations were chosen: Chicago, with its high-rise skyline and gridded street network, and Paris, with its more consistent, lower building height and complex unaligned street network. Each location was considered under early-summer daytime and nighttime conditions.
Different levels of fidelity were chosen for each scale: (1) a lower-fidelity mass-consistent diagnostic model, intermediate-fidelity Navier-Stokes RANS models, and higher-fidelity Navier-Stokes LES for urban-scale analysis, and (2) lower-fidelity single-profile meteorology versus a higher-fidelity three-dimensional gridded weather forecast for regional-scale analysis. Tradeoffs between computation time and the fidelity of the results are discussed for both scales. LES, for example, requires nearly 100 times more processor time than the mass-consistent diagnostic model or the RANS model, and seems better able to capture flow entrainment behind tall buildings. As anticipated, results obtained by LLNL and CEA at regional scale around Chicago and Paris look very similar in terms of both atmospheric dispersion of the radiological release and total effective dose. Both LLNL and CEA used the same meteorological data, Lagrangian particle dispersion models, and dose coefficients. LLNL and CEA urban-scale modeling results show consistent phenomenological behavior and predict similar impacted areas even though the detailed 3D flow patterns differ, particularly for the Chicago cases, where differences in vertical entrainment behind tall buildings are particularly notable. Although RANS and LES (LLNL) models incorporate more detailed physics than do mass-consistent diagnostic flow models (CEA), it is not possible to reach definite conclusions about the prediction fidelity of the various models, as experimental measurements were not available for comparison. Stronger conclusions about the relative performances of the models involved, and evaluation of the tradeoffs involved in model simplification, could be made with a systematic benchmarking of urban-scale modeling. This could be the purpose of a future US/French collaborative exercise.

  16. Preliminary theoretical acoustic and rf sounding calculations for MILL RACE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warshaw, S.I.; Dubois, P.F.

    1981-11-02

    As a participant in DOE/ISA's Ionospheric Monitoring Program, LLNL has the responsibility of providing theoretical understanding and calculational support for experimental activities carried out by Los Alamos National Laboratory in using ionospheric sounders to remotely detect violent atmospheric phenomena. We have developed a system of interconnected computer codes which simulate the entire range of atmospheric and ionospheric processes involved in this remote detection procedure. We are able to model the acoustic pulse shape from an atmospheric explosion, the subsequent nonlinear transport of this energy to all parts of the immediate atmosphere including the ionosphere, and the propagation of high-frequency radio waves through the acoustically perturbed ionosphere. Los Alamos' coverage of DNA's MILL RACE event provided an excellent opportunity to assess the credibility of the calculational system to correctly predict how ionospheric sounders would respond to a surface-based chemical explosion. In this experiment, 600 tons of high explosive were detonated at White Sands Missile Range at 12:35:40 local time on 16 September 1981. Vertical incidence rf phase sounders and bistatic oblique incidence rf sounders fielded by Los Alamos and SRI International throughout New Mexico and southern Colorado detected the ionospheric perturbation that ensued. A brief account of preliminary calculations of the acoustic disturbance and the predicted ionospheric sounder signatures for MILL RACE is presented. (WHK)
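
One small piece of the calculation chain this record describes, the time for a ground-level acoustic disturbance to reach ionospheric heights, can be sketched by integrating dz/c(z) through a layered atmosphere. This is an illustrative toy, not the LLNL code system; the two-layer sound-speed profile and its coefficients are invented for the example.

```python
# Vertical acoustic travel time through a layered atmosphere: integrate
# dz / c(z) from the ground to altitude z_top with the midpoint rule.

def travel_time(c_of_z, z_top, dz=100.0):
    """Travel time (s) of a vertically propagating pulse to altitude z_top (m)."""
    n = int(z_top / dz)
    return sum(dz / c_of_z((i + 0.5) * dz) for i in range(n))

def sound_speed(z):
    """Crude assumed profile: ~340 m/s below 100 km, rising linearly above."""
    return 340.0 if z < 100e3 else 340.0 + 0.004 * (z - 100e3)

# Time for the disturbance to reach F-region altitudes (~250 km).
t = travel_time(sound_speed, 250e3)
print(f"~{t / 60:.0f} minutes to 250 km altitude")
```

The order-of-ten-minutes delay this gives is why the sounder signatures in such experiments appear well after the surface detonation.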

  17. A Multifluid Numerical Algorithm for Interpenetrating Plasma Dynamics

    NASA Astrophysics Data System (ADS)

    Ghosh, Debojyoti; Kavouklis, Christos; Berger, Richard; Chapman, Thomas; Hittinger, Jeffrey

    2017-10-01

    Interpenetrating plasmas occur in situations including inertial confinement fusion experiments, where plasmas ablate off the hohlraum and capsule surfaces and interact with each other, and in high-energy-density physics experiments that involve the collision of plasma streams ablating off discs irradiated by laser beams. Single-fluid, multi-species hydrodynamic models are not well suited to study this interaction because they cannot support more than a single fluid velocity; this results in unphysical solutions. Though kinetic models yield accurate solutions for multi-fluid interactions, they are prohibitively expensive for at-scale three-dimensional (3D) simulations. In this study, we propose a multifluid approach where the compressible fluid equations are solved for each ion species and the electrons. Electrostatic forces and inter-species friction and thermal equilibration couple the species. A high-order finite-volume algorithm with explicit time integration is used to solve these equations on a 3D Cartesian domain, and a high-order Poisson solver is used to compute the electrostatic potential. We present preliminary results for the interpenetration of two plasma streams in vacuum and in the presence of a gas fill. This work was performed under the auspices of the U.S. DOE by Lawrence Livermore National Laboratory under Contract No. DE-AC52-07NA27344 and funded by the LDRD Program at LLNL under project tracking code 17-ERD-081.
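
The inter-species coupling the abstract mentions can be illustrated with a rough sketch (not the authors' algorithm): two fluids relax toward a common velocity through momentum-conserving friction and toward a common temperature through energy-conserving equilibration. The rate coefficients nu_f and nu_T and the unit heat capacities are assumptions made for the example.

```python
# Toy explicit update of the friction / thermal-equilibration coupling
# between two interpenetrating fluids of masses m1, m2.

def relax_step(u1, u2, T1, T2, m1, m2, nu_f=1.0, nu_T=1.0, dt=0.01):
    """One explicit coupling step: friction conserves total momentum,
    thermal relaxation conserves total thermal energy (unit heat capacities)."""
    du = u2 - u1
    dT = T2 - T1
    # Momentum-conserving friction: equal and opposite accelerations.
    u1_new = u1 + dt * nu_f * m2 * du / (m1 + m2)
    u2_new = u2 - dt * nu_f * m1 * du / (m1 + m2)
    # Energy-conserving thermal relaxation.
    T1_new = T1 + dt * nu_T * dT / 2.0
    T2_new = T2 - dt * nu_T * dT / 2.0
    return u1_new, u2_new, T1_new, T2_new

# Two counter-streaming fluids equilibrate while total momentum is preserved.
u1, u2, T1, T2 = -1.0, 1.0, 1.0, 3.0
m1, m2 = 1.0, 2.0
p0 = m1 * u1 + m2 * u2  # initial total momentum
for _ in range(2000):
    u1, u2, T1, T2 = relax_step(u1, u2, T1, T2, m1, m2)
assert abs(m1 * u1 + m2 * u2 - p0) < 1e-9      # momentum conserved
assert abs(u1 - u2) < 1e-6 and abs(T1 - T2) < 1e-6  # equilibrated
```

In the actual scheme these source terms sit alongside the finite-volume flux update for each species; here they are isolated to show the conservation properties.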

  18. Measurements of plasma mirror reflectivity and focal spot quality for tens of picosecond laser pulses

    NASA Astrophysics Data System (ADS)

    Forestier-Colleoni, Pierre; Williams, Jackson; Scott, Graeme; Mariscal, Derek A.; McGuffey, Christopher; Beg, Farhat N.; Chen, Hui; Neely, David; Ma, Tammy

    2017-10-01

    The Advanced Radiographic Capability (ARC) laser at the NIF (LLNL) is high-energy (~4 kJ) with a pulse length of 30 ps, and is capable of focusing to an intensity of 10^18 W/cm^2 with a 100 μm focal spot. The ARC laser is at an intensity which can be used to produce proton beams. However, for applications such as radiography and warm dense matter creation, a higher laser intensity may be desired to generate more energetic proton beams. One possibility to increase the intensity is to decrease the focused spot size by employing a smaller f-number optic. But it is difficult to implement such an optic or to bring the final focusing parabola closer to the target within the complicated NIF chamber geometry. A proposal is to use ellipsoidal plasma mirrors (PM) for fast focusing of the ARC laser light, thereby increasing the peak intensity. There is uncertainty, however, in the survivability and reflectivity of PM at such long pulse durations. Here, we show experimental results from the Titan laser to study the reflectivity of flat PM as a function of laser pulse length. A calorimeter was used to measure the PM reflectivity. We also observed degradation of the far- and near-field energy distribution of the laser after reflection by the PM for pulse lengths beyond 10 ps. Contract DE-AC52-07NA27344. Funded by the LLNL LDRD program: tracking code 17-ERD-039.

  19. Visualising Earth's Mantle based on Global Adjoint Tomography

    NASA Astrophysics Data System (ADS)

    Bozdag, E.; Pugmire, D.; Lefebvre, M. P.; Hill, J.; Komatitsch, D.; Peter, D. B.; Podhorszki, N.; Tromp, J.

    2017-12-01

    Recent advances in 3D wave propagation solvers and high-performance computing have enabled regional and global full-waveform inversions. Interpretation of tomographic models is often done visually. Robust and efficient visualization tools are necessary to thoroughly investigate large model files, particularly at the global scale. In collaboration with Oak Ridge National Laboratory (ORNL), we have developed effective visualization tools and used them for visualization of our first-generation global model, GLAD-M15 (Bozdag et al. 2016). VisIt (https://wci.llnl.gov/simulation/computer-codes/visit/) is used for initial exploration of the models and for extraction of seismological features. The broad capability of VisIt, and its demonstrated scalability, proved valuable for experimenting with different visualization techniques and in the creation of timely results. Utilizing VisIt's plugin architecture, a data reader plugin was developed which reads the ADIOS (https://www.olcf.ornl.gov/center-projects/adios/) format of our model files. Blender (https://www.blender.org) is used for the setup of lighting, materials, camera paths and rendering of geometry. Python scripting was used to control the orchestration of different geometries, as well as camera animation for 3D movies. While we continue producing 3D contour plots and movies for various seismic parameters to better visualize plume- and slab-like features as well as anisotropy throughout the mantle, our aim is to make visualization an integral part of our global adjoint tomography workflow to routinely produce various 2D cross-sections to facilitate examination of our models after each iteration. This will ultimately form the basis for use of pattern recognition techniques in our investigations. Simulations for global adjoint tomography are performed on ORNL's Titan system and visualization is done in parallel on ORNL's post-processing cluster Rhea.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verce, M. F.; Schwartz, L. I.

    This was a collaborative effort between LLNL and STE to investigate the use of vaporized hydrogen peroxide (VHP®) to decontaminate spore-contaminated heating, ventilation, and cooling (HVAC) systems in a trailer-sized room. LLNL's effort under this CRADA was funded by DOE's Chemical and Biological National Security Program (CBNP), which later became part of the Department of Homeland Security in 2004.

  1. The Next Linear Collider Program

    Science.gov Websites


  2. ARC-2010-ACD10-0020-013

    NASA Image and Video Library

    2010-01-14

    Lawrence Livermore National Labs (LLNL), Navistar and the Department of Energy conduct tests in the NASA Ames National Full-scale Aerodynamic Complex 80x120-foot wind tunnel. The LLNL project is aimed at aerodynamic truck and trailer devices that can reduce fuel consumption at highway speed by 10 percent. Cab being lifted into the tunnel.

  3. ARC-2010-ACD10-0020-023

    NASA Image and Video Library

    2010-02-03

    Lawrence Livermore National Labs (LLNL), Navistar and the Department of Energy conduct tests in the NASA Ames National Full-scale Aerodynamic Complex 80x120-foot wind tunnel. The LLNL project is aimed at aerodynamic truck and trailer devices that can reduce fuel consumption at highway speed by 10 percent. Trailer being lifted into the tunnel.

  4. Automated System for Aneuploidy Detection in Sperm Final Report CRADA No. TC-1364-96: Phase I SBIR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wyrobek, A. J.; Dunlay, R. T.

    This project was a relationship between Lawrence Livermore National Laboratory (LLNL) and Biological Detection, Inc. (now known as Cellomics, Inc.) It was funded as a Phase I SBIR from the National Institutes of Health (NIH) awarded to Cellomics, Inc. with a subcontract to LLNL.

  5. ARC-2010-ACD10-0020-082

    NASA Image and Video Library

    2010-02-10

    Lawrence Livermore National Labs (LLNL), Navistar and the Department of Energy conduct tests in the NASA Ames National Full-scale Aerodynamic Complex 80x120-foot wind tunnel. The LLNL project is aimed at aerodynamic truck and trailer devices that can reduce fuel consumption at highway speed by 10 percent. Smoke test demo with Ron Schoon, Navistar.

  6. ARC-2010-ACD10-0020-079

    NASA Image and Video Library

    2010-02-10

    Lawrence Livermore National Labs (LLNL), Navistar and the Department of Energy conduct tests in the NASA Ames National Full-scale Aerodynamic Complex 80x120-foot wind tunnel. The LLNL project is aimed at aerodynamic truck and trailer devices that can reduce fuel consumption at highway speed by 10 percent. Smoke test demo with Ron Schoon, Navistar.

  7. Trip Report United Arab Emirates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakanishi, K; Rodgers, A

    2004-10-06

    Keith Nakanishi and Arthur Rodgers traveled to the United Arab Emirates in February 2004 to continue an on-going technical collaboration with UAE University and to service the two temporary LLNL seismic stations. Nakanishi and Rodgers then participated in the Gulf Seismic Forum, which was organized by LLNL and sponsored by the University of Sharjah.

  8. Measurement of the Shock Velocity and Symmetry History in Decaying Shock Pulses

    NASA Astrophysics Data System (ADS)

    Baker, Kevin; Milovich, Jose; Jones, Oggie; Robey, Harry; Smalyuk, Vladimir; Casey, Daniel; Celliers, Peter; Clark, Dan; Giraldez, Emilio; Haan, Steve; Hamza, Alex; Berzak-Hopkins, Laura; Jancaitis, Ken; Kroll, Jeremy; Lafortune, Kai; MacGowan, Brian; Macphee, Andrew; Moody, John; Nikroo, Abbas; Peterson, Luc; Raman, Kumar; Weber, Chris; Widmayer, Clay

    2014-10-01

    Decaying first-shock pulses are predicted in simulations to provide more stable implosions and still achieve a low adiabat in the fuel, enabling a higher fuel compression similar to "low foot" laser pulses. The first step in testing these predictions was to measure the shock velocity for both a three-shock and a four-shock adiabat-shaped pulse in a keyhole experimental platform. We present measurements of the shock velocity history, including the decaying shock velocity inside the ablator, and compare it with simulations, as well as with previous low and high foot pulses. Using the measured pulse shape, the predicted adiabat from simulations is presented and compared with the calculated adiabat from low and high foot laser pulse shapes. This work was performed under the auspices of the U.S. Department of Energy by LLNL under Contract DE-AC52-07NA27344.

  9. Deflagration to Detonation Transition (DDT) Simulations of HMX Powder Using the HERMES Model

    NASA Astrophysics Data System (ADS)

    White, Bradley; Reaugh, John; Tringe, Joseph

    2017-06-01

    We performed computer simulations of DDT experiments with Class I HMX powder using the HERMES model (High Explosive Response to MEchanical Stimulus) in ALE3D. Parameters for the model were fitted to the limited available mechanical property data of the low-density powder, and to the Shock to Detonation Transition (SDT) test results. The DDT tests were carried out in steel-capped polycarbonate tubes. This arrangement permits direct observation of the event using both flash X-ray radiography and high speed camera imaging, and provides a stringent test of the model. We found the calculated detonation transition to be qualitatively similar to experiment. Through simulation we also explored the effects of confinement strength, the HMX particle size distribution and porosity on the computed detonation transition location. This work was performed under the auspices of the US DOE by LLNL under Contract DE-AC52-07NA27344.

  10. Development of Diagnostics for the Livermore DPF Devices

    NASA Astrophysics Data System (ADS)

    Mitrani, James; Prasad, Rahul R.; Podpaly, Yuri A.; Cooper, Christopher M.; Chapman, Steven F.; Shaw, Brian H.; Povilus, Alexander P.; Schmidt, Andrea

    2017-10-01

    LLNL is commissioning several new diagnostics to understand and optimize ion and neutron production in their dense plasma focus (DPF) systems. Gas fills used in DPF devices at LLNL are deuterium (D2) and He accelerated onto a Be target, for production of neutrons. Neutron yields are currently measured with helium-3 tubes, and development of yttrium-based activation detectors is currently underway. Neutron time-of-flight (nTOF) signals from prompt neutrons will be measured with gadolinium-doped liquid scintillators. An ion energy analyzer will be used to diagnose the energy distribution of D+ and He2+ ions. Additionally, a fast-frame ICCD camera has been applied to image the plasma sheath during the rundown and pinch phases. Sheath velocity will be measured with an array of discrete photodiodes with ns time responses. A discussion of our results will be presented. Prepared by LLNL under Contract DE-AC52-07NA27344, and supported by the Laboratory Directed Research and Development Program (15-ERD-034) at LLNL and the Office of Defense Nuclear Nonproliferation Research and Development within the U.S. Department of Energy.

  11. Computer Security Awareness Guide for Department of Energy Laboratories, Government Agencies, and others for use with Lawrence Livermore National Laboratory's (LLNL): Computer security short subjects videos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL), and Gale Warshawsky, the Coordinator for Computer Security Education & Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced, which ranged from 1-3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices. Leaders may incorporate the Short Subjects into presentations. After talking about a subject area, one of the Short Subjects may be shown to highlight that subject matter. Another method for sharing them could be to show a Short Subject first and then lead a discussion about its topic. The cast of characters, and a bit of information about their personalities in the LLNL Computer Security Short Subjects, is included in this report.

  12. Screening Program Reduced Melanoma Mortality at the Lawrence Livermore National Laboratory, 1984-1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schneider, MD, J S; II, PhD, D; MD, PhD, M

    Worldwide incidence of cutaneous malignant melanoma has increased substantially, and no screening program has yet demonstrated a reduction in mortality. We evaluated the education, self-examination and targeted screening campaign at the Lawrence Livermore National Laboratory (LLNL) from its beginning in July 1984 through 1996. The thickness and crude incidence of melanoma from the years before the campaign were compared to those obtained during the 13 years of screening. Melanoma mortality during the 13-year period was based on a National Death Index search. Expected yearly deaths from melanoma among LLNL employees were calculated by using California mortality data matched by age, sex, and race/ethnicity and adjusted to exclude deaths from melanoma diagnosed before the program began or before employment at LLNL. After the program began, crude incidence of melanoma thicker than 0.75 mm decreased from 18 to 4 cases per 100,000 person-years (p = 0.02), while melanoma less than 0.75 mm remained stable and in situ melanoma increased substantially. No eligible melanoma deaths occurred among LLNL employees during the screening period compared with a calculated 3.39 expected deaths (p = 0.034). Education, self-examination and selective screening for melanoma at LLNL significantly decreased the incidence of melanoma thicker than 0.75 mm and reduced the melanoma-related mortality rate to zero. This significant decrease in mortality rate persisted for at least 3 yr after employees retired or otherwise left the laboratory.

  13. Training and qualification of health and safety technicians at a national laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Egbert, W.F.; Trinoskey, P.A.

    1994-10-01

    Over the last 30 years, Lawrence Livermore National Laboratory (LLNL) has successfully implemented the concept of a multi-disciplined technician. LLNL Health and Safety Technicians have responsibilities in industrial hygiene, industrial safety, and health physics, as well as fire, explosive, and criticality safety. One of the major benefits of this approach is the cost-effective use of workers who display an ownership of health and safety issues which is sometimes lacking when responsibilities are divided. Although LLNL has always promoted the concept of a multi-discipline technician, this concept is gaining interest within the Department of Energy (DOE) community. In November 1992, individuals from Oak Ridge Institute of Science and Education (ORISE) and RUST Geotech, joined by LLNL, established a committee to address the issues of Health and Safety Technicians. In 1993, the DOE Office of Environment, Safety and Health, in response to Defense Nuclear Facility Safety Board Recommendation 91-6, stated that DOE projects, particularly environmental restoration, typically present hazards other than radiation, such as chemicals, explosives, complex construction activities, etc., which require additional expertise by Radiological Control Technicians. They followed with a commitment that a training guide would be issued. The trend in the last two decades has been toward greater specialization in the areas of health and safety. In contrast, LLNL has moved toward a generalist approach, integrating the once separate functions of the industrial hygiene and health physics technician into one function.

  14. Historic Context and Building Assessments for the Lawrence Livermore National Laboratory Built Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ullrich, R. A.; Sullivan, M. A.

    2007-09-14

    This document was prepared to support U.S. Department of Energy / National Nuclear Security Administration (DOE/NNSA) compliance with Sections 106 and 110 of the National Historic Preservation Act (NHPA). Lawrence Livermore National Laboratory (LLNL) is a DOE/NNSA laboratory and is engaged in determining the historic status of its properties at both its main site in Livermore, California, and Site 300, its test site located eleven miles from the main site. LLNL contracted with the authors via Sandia National Laboratories (SNL) to prepare a historic context statement for properties at both sites and to provide assessments of those properties of potential historic interest. The report contains an extensive historic context statement and the assessments of individual properties and groups of properties determined, via criteria established in the context statement, to be of potential interest. The historic context statement addresses the four contexts within which LLNL falls: Local History, World War II (WWII) History, Cold War History, and Post-Cold War History. Appropriate historic preservation themes relevant to LLNL's history are delineated within each context. In addition, thresholds are identified for historic significance within each of the contexts based on the explication and understanding of the Secretary of the Interior's Guidelines for determining eligibility for the National Register of Historic Places. The report identifies specific research areas and events in LLNL's history that are of interest and the portions of the built environment in which they occurred. Based on that discussion, properties of potential interest are identified and assessments of them are provided. Twenty individual buildings and three areas of potential historic interest were assessed. The final recommendation is that, of these, LLNL has five individual historic buildings, two sets of historic objects, and two historic districts eligible for the National Register.
All are eligible within the Cold War History context. They are listed in the table below, along with the Cold War preservation theme, period of significance, and criterion under which they are eligible.

  15. LLNL-G3Dv3: Global P wave tomography model for improved regional and teleseismic travel time prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simmons, N. A.; Myers, S. C.; Johannesson, G.

    We develop a global-scale P wave velocity model (LLNL-G3Dv3) designed to accurately predict seismic travel times at regional and teleseismic distances simultaneously. The model provides a new image of Earth's interior, but the underlying practical purpose of the model is to provide enhanced seismic event location capabilities. The LLNL-G3Dv3 model is based on ~2.8 million P and Pn arrivals that are re-processed using our global multiple-event locator called Bayesloc. We construct LLNL-G3Dv3 within a spherical-tessellation-based framework, allowing for explicit representation of undulating and discontinuous layers including the crust and transition zone layers. Using a multiscale inversion technique, regional trends as well as fine details are captured where the data allow. LLNL-G3Dv3 exhibits large-scale structures including cratons and superplumes as well as numerous complex details in the upper mantle, including within the transition zone. In particular, the model reveals new details of a vast network of subducted slabs trapped within the transition zone beneath much of Eurasia, including beneath the Tibetan Plateau. We demonstrate the impact of Bayesloc multiple-event location on the resulting tomographic images through comparison with images produced without the benefit of multiple-event constraints (single-event locations). We find that the multiple-event locations allow for better reconciliation of the large set of direct P phases recorded at 0–97° distance and yield a smoother and more continuous image relative to the single-event locations. Travel times predicted from a 3-D model are also found to be strongly influenced by the initial locations of the input data, even when an iterative inversion/relocation technique is employed.
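
The linearized travel-time inversion underlying models of this kind can be illustrated with a toy least-squares solve: observed times t satisfy t = G s, where G holds ray path lengths through model cells and s the cell slownesses. The 3-cell model, ray geometry, and slowness values below are hypothetical, not LLNL-G3Dv3 data.

```python
# Toy travel-time tomography: recover cell slownesses from path-length
# matrix G and travel times t by solving the normal equations G^T G s = G^T t.

def solve_normal_equations(G, t):
    """Least-squares solve of G s = t via Gaussian elimination with pivoting."""
    n = len(G[0])
    A = [[sum(row[i] * row[j] for row in G) for j in range(n)] for i in range(n)]
    b = [sum(row[i] * ti for row, ti in zip(G, t)) for i in range(n)]
    for k in range(n):  # forward elimination with partial pivoting
        p = max(range(k, n), key=lambda r: abs(A[r][k]))
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        for r in range(k + 1, n):
            f = A[r][k] / A[k][k]
            for c in range(k, n):
                A[r][c] -= f * A[k][c]
            b[r] -= f * b[k]
    s = [0.0] * n
    for k in range(n - 1, -1, -1):  # back substitution
        s[k] = (b[k] - sum(A[k][c] * s[c] for c in range(k + 1, n))) / A[k][k]
    return s

# Four rays crossing three cells: path lengths in km.
G = [[10.0, 5.0, 0.0],
     [0.0, 8.0, 6.0],
     [4.0, 4.0, 4.0],
     [7.0, 0.0, 9.0]]
s_true = [0.125, 0.100, 0.200]  # slownesses in s/km (8, 10, 5 km/s)
t = [sum(g * s for g, s in zip(row, s_true)) for row in G]  # noise-free times
s_est = solve_normal_equations(G, t)
assert all(abs(a - b) < 1e-9 for a, b in zip(s_est, s_true))
```

Real inversions involve millions of arrivals, regularization, and joint relocation (the Bayesloc step above); the linear-algebra core, however, has this shape.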

  16. Non-Invasive Pneumothorax Detector Final Report CRADA No. TC02110.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, J. T.; Purcell, R.

    This was a collaborative effort between Lawrence Livermore National Security, LLC, as manager and operator of Lawrence Livermore National Laboratory (LLNL), and ElectroSonics Medical Inc. (formerly known as BIOMEC, Inc.), to develop a non-invasive pneumothorax detector based upon the micropower impulse radar technology invented at LLNL. Under a Work for Others subcontract (L-9248), LLNL and ElectroSonics successfully demonstrated the feasibility of a novel device for non-invasive detection of pneumothorax for emergency and long-term monitoring. The device is based on Micropower Impulse Radar (MIR) Ultra Wideband (UWB) technology. Phase I experimental results were promising, showing that a pneumothorax volume even as small as 30 ml was clearly detectable from the MIR signals. Phase I results contributed to the award of a National Institutes of Health (NIH) SBIR Phase II grant to support further research and development. The Phase II award led to the establishment of an LLNL/ElectroSonics CRADA related to Case No. TC02045.0. Under the subsequent CRADA, LLNL and ElectroSonics successfully demonstrated the feasibility of pneumothorax detection in human subject research trials. Under this current CRADA TC02110.0, also referred to as Phase II Type II, the project scope consisted of seven tasks in Project Year 1, five tasks in Project Year 2, and four tasks in Project Year 3. Year 1 tasks were aimed toward the delivery of the pneumothorax detector design package for the pre-production of the miniaturized CompactFlash dockable version of the system. The tasks in Project Years 2 and 3 critically depended upon the accomplishments of Task 1. Since LLNL's task was to provide subject matter expertise and performance verification, much of the timeline of engagement by the LLNL staff depended upon the overall project milestones as determined by the lead organization, ElectroSonics.
The scope of effort was subsequently adjusted to be commensurate with funding availability.

  17. Fully coupled simulation of cosmic reionization. I. numerical methods and tests

    DOE PAGES

    Norman, Michael L.; Reynolds, Daniel R.; So, Geoffrey C.; ...

    2015-01-09

    Here, we describe an extension of the Enzo code to enable fully coupled radiation hydrodynamical simulation of inhomogeneous reionization in large, ~(100 Mpc)^3 cosmological volumes with thousands to millions of point sources. We solve all dynamical, radiative transfer, thermal, and ionization processes self-consistently on the same mesh, as opposed to a postprocessing approach which coarse-grains the radiative transfer, but we employ a simple subgrid model for star formation which we calibrate to observations. The numerical method presented is a modification of an earlier method presented in Reynolds et al., differing principally in the operator splitting algorithm we use to advance the system of equations. Radiation transport is done in the gray flux-limited diffusion (FLD) approximation, which is solved by implicit time integration split off from the gas energy and ionization equations, which are solved separately. This results in a faster and more robust scheme for cosmological applications compared to the earlier method. The FLD equation is solved using the hypre optimally scalable geometric multigrid solver from LLNL. By treating the ionizing radiation as a grid field as opposed to rays, our method is scalable with respect to the number of ionizing sources, limited only by the parallel scaling properties of the radiation solver. We test the speed and accuracy of our approach on a number of standard verification and validation tests. We show by direct comparison with Enzo's adaptive ray tracing method Moray that the well-known inability of FLD to cast a shadow behind opaque clouds has a minor effect on the evolution of ionized volume and mass fractions in a reionization simulation validation test. Finally, we illustrate an application of our method to the problem of inhomogeneous reionization in an 80 Mpc comoving box resolved with 3200^3 Eulerian grid cells and dark matter particles.
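
The operator-split implicit radiation step the abstract describes can be sketched in one dimension. This is not Enzo's implementation: a Thomas-algorithm tridiagonal solve stands in for the hypre multigrid solve used in production, and the grid size, diffusion coefficient, and zero-flux boundaries are illustrative assumptions. The point is the property the split buys: the implicit step stays stable and conservative at time steps far beyond the explicit diffusion limit.

```python
# Backward-Euler step of dE/dt = D d2E/dx2 with zero-flux boundaries:
# solve (I - dt*D*L) E_new = E_old, L the discrete Laplacian, via the
# Thomas algorithm for tridiagonal systems.

def implicit_diffusion_step(E, D, dx, dt):
    """One implicit diffusion step; conserves sum(E) with reflecting walls."""
    n = len(E)
    r = dt * D / dx**2
    a = [-r] * n            # sub-diagonal
    b = [1 + 2 * r] * n     # diagonal
    c = [-r] * n            # super-diagonal
    b[0] = b[-1] = 1 + r    # zero-flux (reflecting) boundary rows
    cp = [0.0] * n          # forward sweep of the Thomas algorithm
    dp = [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = E[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (E[i] - a[i] * dp[i - 1]) / m
    out = [0.0] * n         # back substitution
    out[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        out[i] = dp[i] - cp[i] * out[i + 1]
    return out

# A radiation-energy spike spreads; total energy is conserved even with
# dt twenty times the explicit stability limit (dt_explicit = 0.5*dx^2/D).
E = [0.0] * 32
E[16] = 1.0
for _ in range(50):
    E = implicit_diffusion_step(E, D=1.0, dx=1.0, dt=10.0)
assert abs(sum(E) - 1.0) < 1e-9  # conservative
assert max(E) < 0.1              # the spike has diffused away
```

In 3-D the same linear system is large and sparse, which is where a scalable multigrid solver such as hypre's becomes essential.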

  18. Discrete fracture modeling of multiphase flow and hydrocarbon production in fractured shale or low permeability reservoirs

    NASA Astrophysics Data System (ADS)

    Hao, Y.; Settgast, R. R.; Fu, P.; Tompson, A. F. B.; Morris, J.; Ryerson, F. J.

    2016-12-01

    It has long been recognized that multiphase flow and transport in fractured porous media is very important for various subsurface applications. Hydrocarbon fluid flow and production from hydraulically fractured shale reservoirs is an important and complicated example of multiphase flow in fractured formations. The combination of horizontal drilling and hydraulic fracturing is able to create extensive fracture networks in low permeability shale rocks, leading to increased formation permeability and enhanced hydrocarbon production. However, unconventional wells experience a much faster production decline than conventional hydrocarbon recovery. Maintaining sustainable and economically viable shale gas/oil production requires additional wells and re-fracturing. Excessive fracturing fluid loss during hydraulic fracturing operations may also drive up operation costs and raise potential environmental concerns. Understanding and modeling processes that contribute to decreasing productivity and fracturing fluid loss represent a critical component of unconventional hydrocarbon recovery analysis. Toward this end we develop a discrete fracture model (DFM) in GEOS (the LLNL multi-physics computational code) to simulate multiphase flow and transport in hydraulically fractured reservoirs. The DFM model is able to explicitly account for both individual fractures and their surrounding rocks, therefore allowing for an accurate prediction of the impacts of fracture-matrix interactions on hydrocarbon production. We apply the DFM model to simulate three-phase (water, oil, and gas) flow behaviors in fractured shale rocks as a result of different hydraulic stimulation scenarios. Numerical results show that multiphase flow behaviors at the fracture-matrix interface play a major role in controlling both hydrocarbon production and fracturing fluid recovery rates.
The DFM model developed in this study will be coupled with the existing hydro-fracture model to provide a fully integrated geomechanical and reservoir simulation capability for an accurate prediction and assessment of hydrocarbon production and hydraulic fracturing performance. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
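
One ingredient any such multiphase model needs at the fracture-matrix interface is a relative-permeability / fractional-flow relation. The sketch below is a hedged, two-phase (water/oil) simplification using Corey-type relative permeabilities; the exponents and viscosities are illustrative assumptions, not values from the GEOS code.

```python
# Water fractional flow f_w(Sw) with Corey relative permeabilities:
# f_w = lam_w / (lam_w + lam_o), where lam = kr / mu is a phase mobility.

def frac_flow(Sw, mu_w=1.0, mu_o=5.0, n_w=2.0, n_o=2.0):
    """Fraction of total flux carried by water at water saturation Sw."""
    krw = Sw ** n_w            # water relative permeability (Corey form)
    kro = (1.0 - Sw) ** n_o    # oil relative permeability (Corey form)
    lam_w = krw / mu_w         # water mobility
    lam_o = kro / mu_o         # oil mobility
    return lam_w / (lam_w + lam_o)

# f_w rises monotonically from 0 (only oil flows) to 1 (only water flows);
# its S-shape is what produces the Buckley-Leverett displacement front.
samples = [i / 20.0 for i in range(21)]
fw = [frac_flow(s) for s in samples]
assert fw[0] == 0.0 and fw[-1] == 1.0
assert all(b >= a for a, b in zip(fw, fw[1:]))
```

In a three-phase DFM these curves generalize to saturation-dependent mobilities for water, oil, and gas in both the fracture and matrix continua, with the contrast between the two controlling fluid exchange at the interface.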

  19. Lawrence Livermore National Laboratory Environmental Report 2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Henry E.; Armstrong, Dave; Blake, Rick G.

    Lawrence Livermore National Laboratory (LLNL) is a premier research laboratory that is part of the National Nuclear Security Administration (NNSA) within the U.S. Department of Energy (DOE). As a national security laboratory, LLNL is responsible for ensuring that the nation’s nuclear weapons remain safe, secure, and reliable. The Laboratory also meets other pressing national security needs, including countering the proliferation of weapons of mass destruction and strengthening homeland security, and conducting major research in atmospheric, earth, and energy sciences; bioscience and biotechnology; and engineering, basic science, and advanced technology. The Laboratory is managed and operated by Lawrence Livermore National Security,more » LLC (LLNS), and serves as a scientific resource to the U.S. government and a partner to industry and academia. LLNL operations have the potential to release a variety of constituents into the environment via atmospheric, surface water, and groundwater pathways. Some of the constituents, such as particles from diesel engines, are common at many types of facilities while others, such as radionuclides, are unique to research facilities like LLNL. All releases are highly regulated and carefully monitored. LLNL strives to maintain a safe, secure and efficient operational environment for its employees and neighboring communities. Experts in environment, safety and health (ES&H) support all Laboratory activities. LLNL’s radiological control program ensures that radiological exposures and releases are reduced to as low as reasonably achievable to protect the health and safety of its employees, contractors, the public, and the environment. LLNL is committed to enhancing its environmental stewardship and managing the impacts its operations may have on the environment through a formal Environmental Management System. 
The Laboratory encourages the public to participate in matters related to the Laboratory’s environmental impact on the community by soliciting citizens’ input on matters of significant public interest and through various communications. The Laboratory also provides public access to information on its ES&H activities. LLNL consists of two sites—an urban site in Livermore, California, referred to as the “Livermore Site,” which occupies 1.3 square miles; and a rural Experimental Test Site, referred to as “Site 300,” near Tracy, California, which occupies 10.9 square miles. In 2012 the Laboratory had a staff of approximately 7,000.

  20. Lawrence Livermore National Laboratory Environmental Report 2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, H. E.; Bertoldo, N. A.; Blake, R. G.

    Lawrence Livermore National Laboratory (LLNL) is a premier research laboratory that is part of the National Nuclear Security Administration (NNSA) within the U.S. Department of Energy (DOE). As a national security laboratory, LLNL is responsible for ensuring that the nation’s nuclear weapons remain safe, secure, and reliable. The Laboratory also meets other pressing national security needs, including countering the proliferation of weapons of mass destruction and strengthening homeland security, and conducting major research in atmospheric, earth, and energy sciences; bioscience and biotechnology; and engineering, basic science, and advanced technology. The Laboratory is managed and operated by Lawrence Livermore National Security, LLC (LLNS), and serves as a scientific resource to the U.S. government and a partner to industry and academia. LLNL operations have the potential to release a variety of constituents into the environment via atmospheric, surface water, and groundwater pathways. Some of the constituents, such as particles from diesel engines, are common at many types of facilities while others, such as radionuclides, are unique to research facilities like LLNL. All releases are highly regulated and carefully monitored. LLNL strives to maintain a safe, secure and efficient operational environment for its employees and neighboring communities. Experts in environment, safety and health (ES&H) support all Laboratory activities. LLNL’s radiological control program ensures that radiological exposures and releases are reduced to as low as reasonably achievable to protect the health and safety of its employees, contractors, the public, and the environment. LLNL is committed to enhancing its environmental stewardship and managing the impacts its operations may have on the environment through a formal Environmental Management System. 
The Laboratory encourages the public to participate in matters related to the Laboratory’s environmental impact on the community by soliciting citizens’ input on matters of significant public interest and through various communications. The Laboratory also provides public access to information on its ES&H activities. LLNL consists of two sites—an urban site in Livermore, California, referred to as the “Livermore Site,” which occupies 1.3 square miles; and a rural Experimental Test Site, referred to as “Site 300,” near Tracy, California, which occupies 10.9 square miles. In 2013 the Laboratory had a staff of approximately 6,300.

  1. Livermore Site Spill Prevention, Control, and Countermeasures (SPCC) Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bellah, W.; Griffin, D.; Mertesdorf, E.

    This Spill Prevention, Control, and Countermeasure (SPCC) Plan describes the measures that are taken at Lawrence Livermore National Laboratory’s (LLNL) Livermore Site in Livermore, California, to prevent, control, and handle potential spills from aboveground containers that can contain 55 gallons or more of oil. This SPCC Plan complies with the Oil Pollution Prevention regulation in Title 40 of the Code of Federal Regulations (40 CFR), Part 112 (40 CFR 112) and with 40 CFR 761.65(b) and (c), which regulates the temporary storage of polychlorinated biphenyls (PCBs). This Plan has also been prepared in accordance with Division 20, Chapter 6.67 of the California Health and Safety Code (HSC 6.67) requirements for oil pollution prevention (referred to as the Aboveground Petroleum Storage Act [APSA]), and the United States Department of Energy (DOE) Order No. 436.1. This SPCC Plan establishes procedures, methods, equipment, and other requirements to prevent the discharge of oil into or upon the navigable waters of the United States or adjoining shorelines for aboveground oil storage and use at the Livermore Site.

  2. FY06 LDRD Final Report: Broadband Radiation and Scattering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madsen, N; Fasenfest, B; White, D

    2007-03-08

    This is the final report for LDRD 01-ERD-005. The Principal Investigator was Robert Sharpe. Collaborators included Niel Madsen, Benjamin Fasenfest, and John D. Rockway of the Defense Sciences Engineering Division (DSED); Vikram Jandhyala and James Pingenot from the University of Washington; and Mark Stowell of the Center for Applications Development and Software Engineering (CADSE). It should be noted that Benjamin Fasenfest and Mark Stowell were partially supported under other funding. The purpose of this LDRD effort was to enhance LLNL's computational electromagnetics capability in the area of broadband radiation and scattering. For radiation and scattering problems our transient EM codes are limited by the approximate Radiation Boundary Conditions (RBCs) used to model the radiation into an infinite space. Improved RBCs were researched, developed, and incorporated into the existing EMSolve finite-element code to provide a 10-100x improvement in the accuracy of the boundary conditions. Section I provides an introduction to the project and the project goals. Section II provides a summary of the project's research and accomplishments as presented in the attached papers.

  3. Selected results from LLNL-Hughes RAR for West Coast Scotland Experiment 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehman, S.K.; Johnston, B.; Twogood, R.

    1993-01-05

    The joint US-UK 1992 West Coast Scotland Experiment (WCSEX) was held in the Sound of Sleat from June 6 to 25. The LLNL-Hughes team fielded a fully polarimetric X-band hill-side real aperture radar to collect internal wave wake data. We present here a sample data set of the best radar runs.

  4. Development of Operational Free-Space-Optical (FSO) Laser Communication Systems Final Report CRADA No. TC02093.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruggiero, A.; Orgren, A.

    This project was a collaborative effort between Lawrence Livermore National Security, LLC (formerly The Regents of the University of California)/Lawrence Livermore National Laboratory (LLNL) and LGS Innovations, LLC (formerly Lucent Technologies, Inc.), to develop long-range and mobile operational free-space optical (FSO) laser communication systems for specialized government applications. LLNL and LGS Innovations (formerly Lucent Bell Laboratories Government Communications Systems) performed this work for a United States Government (USG) Intelligence Work for Others (I-WFO) customer, also referred to as the "Government Customer," "Customer," or "Government Sponsor." The CRADA was a critical and required part of the LLNL technology transfer plan for the customer.

  5. Lawrence Livermore National Laboratory Campus Capability Plan for 2018-2028

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, C.; Arsenlis, T.; Bailey, A.

    Lawrence Livermore National Laboratory (LLNL) is one of three national laboratories that are part of the National Nuclear Security Administration. LLNL provides critical expertise to strengthen U.S. security through development and application of world-class science and technology that: ensures the safety, reliability, and performance of the U.S. nuclear weapons stockpile; promotes international nuclear safety and nonproliferation; reduces global danger from weapons of mass destruction; and supports U.S. leadership in science and technology. Essential to the execution and continued advancement of these mission areas are responsive infrastructure capabilities. This report showcases each LLNL capability area and describes the mission, science, and technology efforts enabled by LLNL infrastructure, as well as future infrastructure plans.

  6. Atmospheric Dispersion Modeling of the February 2014 Waste Isolation Pilot Plant Release

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nasstrom, John; Piggott, Tom; Simpson, Matthew

    2015-07-22

    This report presents the results of a simulation of the atmospheric dispersion and deposition of radioactivity released from the Waste Isolation Pilot Plant (WIPP) site in New Mexico in February 2014. These simulations were made by the National Atmospheric Release Advisory Center (NARAC) at Lawrence Livermore National Laboratory (LLNL), and supersede NARAC simulation results published in a previous WIPP report (WIPP, 2014). The results presented in this report use additional, more detailed data from WIPP on the specific radionuclides released, radioactivity release amounts and release times. Compared to the previous NARAC simulations, the new simulation results in this report are based on more detailed modeling of the winds, turbulence, and particle dry deposition. In addition, the initial plume rise from the exhaust vent was considered in the new simulations, but not in the previous NARAC simulations. The new model results show some small differences compared to previous results, but do not change the conclusions in the WIPP (2014) report. Presented are the data and assumptions used in these model simulations, as well as the model-predicted dose and deposition on and near the WIPP site. A comparison of predicted and measured radionuclide-specific air concentrations is also presented.

  7. September 2002 Working Group Meeting on Heavy Vehicle Aerodynamic Drag: Presentations and Summary of Comments and Conclusions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCallen, R

    2002-09-01

    A Working Group Meeting on Heavy Vehicle Aerodynamic Drag was held at NASA Ames Research Center on September 23, 2002. The purpose of the meeting was to present and discuss technical details on the experimental and computational work in progress and future project plans. Representatives from the Department of Energy (DOE)/Office of Energy Efficiency and Renewable Energy/Office of FreedomCAR & Vehicle Technologies, Lawrence Livermore National Laboratory (LLNL), Sandia National Laboratories (SNL), NASA Ames Research Center (NASA), University of Southern California (USC), California Institute of Technology (Caltech), Georgia Tech Research Institute (GTRI), Argonne National Laboratory (ANL), Freightliner, and Portland State University participated in the meeting. This report contains the technical presentations (viewgraphs) delivered at the Meeting, briefly summarizes the comments and conclusions, and outlines the future action items. The meeting began with an introduction by the Project Lead, Rose McCallen of LLNL, who emphasized that world energy consumption is predicted to exceed the available resources (i.e., fossil, hydro, non-breeder fission) relatively soon. This shortfall is predicted to begin around the year 2050. Minimizing vehicle aerodynamic drag will significantly reduce our Nation's dependence on foreign oil resources and help with our world-wide fuel shortage. Rose also mentioned that educating the populace and researchers as to our world energy issues is important and that our upcoming United Engineering Foundation (UEF) Conference on ''The Aerodynamics of Heavy Vehicles: Trucks, Busses, and Trains'' was one way our DOE Consortium was doing this. Mentioned were the efforts of Fred Browand from USC in organizing and attracting internationally recognized speakers to the Conference. Rose followed with an overview of the DOE project goals, deliverables, and FY03 activities. The viewgraphs are attached at the end of this report. 
Sid Diamond of DOE discussed the reorganization of the Office of Energy Efficiency and Renewable Energy and noted that the Office of Heavy Vehicle Technology is now part of the Office of FreedomCAR & Vehicle Technologies. Sid reviewed the FY03 budget and provided information on some plans for FY04. The soon-to-be-posted DOE request for proposals from industry for projects related to parasitic energy losses was discussed. A minimum of 50% cost share by industry will be required and the proposal must be submitted by industry. Collaborative efforts in aerodynamic drag with members of the DOE consortium are encouraged. Sid also mentioned interest in the aerodynamic drag contribution due to wheel wells and underbody flow. He also mentioned his continued interest in the application of our computational and experimental expertise to the area of locomotive and railcar aerodynamics for the reduction of drag effects and thus the reduction of fuel consumption by trains. In summary, the technical presentations at the meeting included a review of experimental results and plans by GTRI, USC, and NASA Ames; the computational results from LLNL and SNL for the integrated tractor-trailer benchmark geometry called the Ground Transportation System (GTS) model, and by LLNL for the tractor-trailer gap and trailer wake flow; and turbulence model development and benchmark simulations being investigated by Caltech. USC is also investigating an acoustic drag reduction device that has been named ''Mozart'', GTRI continues their investigation of a blowing device, and LLNL presented their ideas for two new base drag reduction devices. ANL presented their plans for a DOE-supported Cooperative Research and Development Agreement (CRADA) with Paccar Truck Company utilizing commercial software tools to simulate the flow and drag for an actual tractor and showed the results of some preliminary gridding attempts. 
The attendees also had the opportunity to tour the 12-ft pressure wind tunnel and the machine shop where the Generic Conventional Model (GCM, a.k.a. SLRT) was being readied for the scheduled November experiments. Much of the discussion involved wind tunnel testing plans, analysis of existing experimental data, investigations of drag reduction devices, simulation results, and needed modeling improvements. Further details are provided in the attached viewgraphs.

  8. Investigations of the Rayleigh-Taylor and Richtmyer-Meshkov Instabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riccardo Bonazza; Mark Anderson; Jason Oakley

    2008-03-14

    The present program is centered on the experimental study of shock-induced interfacial fluid instabilities. Both 2-D (near-sinusoids) and 3-D (spheres) initial conditions are studied in a large, vertical square shock tube facility. The evolution of the interface shape, its distortion, the modal growth rates and the mixing of the fluids at the interface are all objectives of the investigation. In parallel to the experiments, calculations are performed using the Raptor code, on platforms made available by LLNL. These flows are of great relevance to both ICF and stockpile stewardship. The involvement of four graduate students is in line with the national laboratories' interest in the education of scientists and engineers in disciplines and technologies consistent with the labs' missions and activities.

  9. Investigation of the Richtmyer-Meshkov instability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riccardo Bonazza; Mark Anderson; Jason Oakley

    2008-12-22

    The present program is centered on the experimental study of shock-induced interfacial fluid instabilities. Both 2-D (near-sinusoids) and 3-D (spheres) initial conditions are studied in a large, vertical square shock tube facility. The evolution of the interface shape, its distortion, the modal growth rates and the mixing of the fluids at the interface are all objectives of the investigation. In parallel to the experiments, calculations are performed using the Raptor code, on platforms made available by LLNL. These flows are of great relevance to both ICF and stockpile stewardship. The involvement of three graduate students is in line with the national laboratories' interest in the education of scientists and engineers in disciplines and technologies consistent with the labs' missions and activities.

  10. LDRD Final Report for "Tactical Laser Weapons for Defense" SI (Tracking Code 01-SI-011)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beach, R; Zapata, L

    The focus of this project was a convincing demonstration of two new technological approaches to high-beam-quality, high-average-power solid-state laser systems that would be of interest for tactical laser weapon applications. Two pathways had been identified to such systems that built on existing thin-disk and fiber laser technologies. This SI was used as seed funding to further develop and vet these ideas. Significantly, the LLNL-specific enhancements to these proposed technology paths were specifically addressed for devising systems scalable to the 100 kW average power level. In the course of performing this work we have established an intellectual property base that protects and distinguishes us from other competitive approaches to the same end.

  11. 3D unstructured-mesh radiation transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morel, J.

    1997-12-31

    Three unstructured-mesh radiation transport codes are currently being developed at Los Alamos National Laboratory. The first code is ATTILA, which uses an unstructured tetrahedral mesh in conjunction with standard $S_n$ (discrete-ordinates) angular discretization, standard multigroup energy discretization, and linear-discontinuous spatial differencing. ATTILA solves the standard first-order form of the transport equation using source iteration in conjunction with diffusion-synthetic acceleration of the within-group source iterations. ATTILA is designed to run primarily on workstations. The second code is DANTE, which uses a hybrid finite-element mesh consisting of arbitrary combinations of hexahedra, wedges, pyramids, and tetrahedra. DANTE solves several second-order self-adjoint forms of the transport equation, including the even-parity equation, the odd-parity equation, and a new equation called the self-adjoint angular flux equation. DANTE also offers three angular discretization options: $S_n$ (discrete-ordinates), $P_n$ (spherical harmonics), and $SP_n$ (simplified spherical harmonics). DANTE is designed to run primarily on massively parallel message-passing machines, such as the ASCI-Blue machines at LANL and LLNL. The third code is PERICLES, which uses the same hybrid finite-element mesh as DANTE, but solves the standard first-order form of the transport equation rather than a second-order self-adjoint form. PERICLES uses a standard $S_n$ discretization in angle in conjunction with trilinear-discontinuous spatial differencing, and diffusion-synthetic acceleration of the within-group source iterations. PERICLES was initially designed to run on workstations, but a version for massively parallel message-passing machines will be built. The three codes are described in detail and computational results are presented.
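The within-group source iteration that ATTILA and PERICLES accelerate with diffusion-synthetic acceleration can be illustrated in miniature. The sketch below is not taken from any of the three codes: it solves a hypothetical one-group, 1-D slab problem with step (upwind) differencing and plain, unaccelerated source iteration, and all function names and parameter values are illustrative.

```python
import numpy as np

def sn_source_iteration(nx=50, length=10.0, sigma_t=1.0, sigma_s=0.5,
                        q=1.0, n_ang=8, tol=1e-8, max_it=500):
    """One-group, 1-D slab S_N transport solve with step (upwind)
    differencing, vacuum boundaries, and plain source iteration."""
    dx = length / nx
    mu, w = np.polynomial.legendre.leggauss(n_ang)  # S_N angular quadrature
    phi = np.zeros(nx)                              # scalar-flux iterate
    for it in range(max_it):
        phi_new = np.zeros(nx)
        src = 0.5 * (sigma_s * phi + q)             # isotropic source per angle
        for m in range(n_ang):
            psi_in = 0.0                            # vacuum inflow boundary
            cells = range(nx) if mu[m] > 0 else range(nx - 1, -1, -1)
            for i in cells:
                # upwind discretization of mu dpsi/dx + sigma_t psi = src
                psi = (src[i] * dx + abs(mu[m]) * psi_in) / (sigma_t * dx + abs(mu[m]))
                phi_new[i] += w[m] * psi            # quadrature sum for phi
                psi_in = psi
        if np.max(np.abs(phi_new - phi)) < tol * np.max(np.abs(phi_new)):
            return phi_new, it + 1
        phi = phi_new
    return phi, max_it
```

Far from the boundaries the iterate approaches the infinite-medium value q/(sigma_t - sigma_s); the diffusion-synthetic acceleration used in ATTILA and PERICLES exists because this plain iteration converges slowly as the scattering ratio sigma_s/sigma_t approaches one.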

  12. Thermal and thermomechanical calculations of deep-rock nuclear waste disposal with the enhanced SANGRE code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heuze, F.E.

    1983-03-01

    An attempt to model the complex thermal and mechanical phenomena occurring in the disposal of high-level nuclear wastes in rock at high power loading is described. Such processes include melting of the rock, convection of the molten material, and very high stressing of the rock mass, leading to new fracturing. Because of the phase changes and the wide temperature ranges considered, realistic models must provide for coupling of the thermal and mechanical calculations, for large deformations, and for steady-state temperature-dependent creep of the rock mass. Explicit representation of convection would be desirable, as would the ability to show fracture development and migration of fluids in cracks. Enhancements to SANGRE consisted of: array modifications to accommodate complex variations of thermal and mechanical properties with temperature; introduction of the ability to calculate thermally induced stresses; improved management of the minimum time step and minimum temperature step to increase code efficiency; introduction of a variable heat-generation algorithm to accommodate heat decay of the nuclear materials; streamlining of the code by general editing and extensive deletion of coding used in mesh generation; and updating of the program users' manual. The enhanced LLNL version of the code was renamed LSANGRE. Phase changes were handled by introducing sharp variations in the specific heat of the rock in a narrow range about the melting point. The accuracy of this procedure was tested successfully on a melting slab problem. LSANGRE replicated the results of both the analytical solution and calculations with the finite-difference TRUMP code. Following enhancement and verification, a purely thermal calculation was carried to 10^5 years. It went beyond the extent of maximum melt and into the beginning of the cooling phase.
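The sharp-specific-heat treatment of melting described above is the classic apparent (effective) heat capacity method: the latent heat is folded into c_p over a narrow temperature window, so a purely thermal solver handles the phase change automatically. A minimal sketch, with placeholder material numbers that are not taken from the SANGRE/LSANGRE work:

```python
import numpy as np

def apparent_heat_capacity(T, cp=1000.0, latent=3.0e5,
                           T_melt=1400.0, half_width=50.0):
    """Effective specific heat (J/kg-K) that smears the latent heat of
    melting (J/kg) over the window T_melt +/- half_width. Outside the
    window the ordinary cp is returned; inside, a constant bump whose
    integral over temperature equals the latent heat is added."""
    cp_eff = np.full_like(T, cp, dtype=float)
    in_window = np.abs(T - T_melt) < half_width
    cp_eff[in_window] += latent / (2.0 * half_width)  # spread L over 2*half_width
    return cp_eff
```

Integrating cp_eff - cp across the window recovers the latent heat, which is the energy-conservation check behind the melting-slab verification problem mentioned in the abstract.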

  13. Design options for improved performance with high-density carbon ablators and low-gas fill hohlraum targets

    NASA Astrophysics Data System (ADS)

    Berzak Hopkins, L.; Divol, L.; Lepape, S.; Meezan, N. B.; Dewald, E.; Ho, D.; Khan, S.; Pak, A.; Ralph, J.; Ross, J. S.

    2016-10-01

    Recent simulation-based and experimental work using high-density carbon ablators in unlined uranium hohlraums with 0.3 mg/cc helium fill have demonstrated round implosions with minimal evolution of Legendre moment P2 during burn. To extend this promising work, design studies have been performed to explore potential performance improvements with larger capsules, while maintaining similar case-to-capsule target ratios. We present here the results of these design studies, which will motivate a series of upcoming experiments at the National Ignition Facility. Prepared by LLNL under Contract DE-AC52-07NA27344.
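The P2 shape metric referred to above comes from expanding the implosion contour in Legendre polynomials. A minimal illustration of that decomposition follows; this is not NIF analysis code, and the function name and test contour are hypothetical.

```python
import numpy as np

def legendre_moments(theta, r, nmax=4):
    """Expand a polar contour r(theta), theta on [0, pi], as
    r(theta) ~ sum_n a_n P_n(cos theta). Orthogonality of the
    Legendre polynomials gives a_n = (2n+1)/2 * int r P_n(mu) dmu."""
    mu = np.cos(theta)
    order = np.argsort(mu)              # integrate over ascending mu
    mu_s, r_s = mu[order], r[order]
    coeffs = []
    for n in range(nmax + 1):
        f = r_s * np.polynomial.legendre.Legendre.basis(n)(mu_s)
        # trapezoidal rule over mu in [-1, 1]
        integral = np.sum(0.5 * (f[:-1] + f[1:]) * np.diff(mu_s))
        coeffs.append((2 * n + 1) / 2.0 * integral)
    return coeffs
```

For a round implosion the a_2 coefficient (the P2 moment) stays small relative to a_0; "minimal evolution of Legendre moment P2 during burn" means this ratio remains small in time.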

  14. Thrust Area Report, Engineering Research, Development and Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langland, R. T.

    1997-02-01

    The mission of the Engineering Research, Development, and Technology Program at Lawrence Livermore National Laboratory (LLNL) is to develop the knowledge base, process technologies, specialized equipment, tools and facilities to support current and future LLNL programs. Engineering's efforts are guided by a strategy that results in dual benefit: first, in support of Department of Energy missions, such as national security through nuclear deterrence; and second, in enhancing the nation's economic competitiveness through our collaboration with U.S. industry in pursuit of the most cost-effective engineering solutions to LLNL programs. To accomplish this mission, the Engineering Research, Development, and Technology Program has two important goals: (1) identify key technologies relevant to LLNL programs where we can establish unique competencies, and (2) conduct high-quality research and development to enhance our capabilities and establish ourselves as the world leaders in these technologies. To focus Engineering's efforts, technology thrust areas are identified and technical leaders are selected for each area. The thrust areas are comprised of integrated engineering activities, staffed by personnel from the nine electronics and mechanical engineering divisions, and from other LLNL organizations. This annual report, organized by thrust area, describes Engineering's activities for fiscal year 1996. The report provides timely summaries of objectives, methods, and key results from eight thrust areas: Computational Electronics and Electromagnetics; Computational Mechanics; Microtechnology; Manufacturing Technology; Materials Science and Engineering; Power Conversion Technologies; Nondestructive Evaluation; and Information Engineering. Readers desiring more information are encouraged to contact the individual thrust area leaders or authors. 198 refs., 206 figs., 16 tabs.

  15. Report on the B-Fields at NIF Workshop Held at LLNL October 12-13, 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fournier, K. B.; Moody, J. D.

    2015-12-13

    A national ICF laboratory workshop on requirements for a magnetized target capability on NIF was held by NIF at LLNL on October 12 and 13, attended by experts from LLNL, SNL, LLE, LANL, GA, and NRL. Advocates for indirect drive (LLNL), magnetic (Z) drive (SNL), polar direct drive (LLE), and basic science needing applied B fields (many institutions) presented and discussed requirements for the magnetized target capabilities they would like to see. A 30 T capability was the most frequently requested. A phased operation, increasing the field in steps experimentally, can be envisioned. The NIF management will take the inputs from the scientific community represented at the workshop and recommend pulse-powered magnet parameters for NIF that best meet the collective user requests. In parallel, LLNL will continue investigating magnets for future generations that might be powered by compact laser-B-field generators (Moody, Fujioka, Santos, Woolsey, Pollock). The NIF facility engineers will start to analyze compatibility of the recommended pulsed magnet parameters (size, field, rise time, materials) with NIF chamber constraints, diagnostic access, and final optics protection against debris in FY16. The objective of this assessment will be to develop a schedule for achieving an initial B-field capability. Based on an initial assessment, room-temperature magnetized gas capsules will be fielded on NIF first. Magnetized cryo-ice-layered targets will take longer (more compatibility issues). Magnetized wetted-foam DT targets (Olson) may have somewhat fewer compatibility issues, making them a more likely choice for the first cryo-ice-layered target fielded with applied Bz.

  16. The ARAC-RODOS-WSPEEDI Information Exchange Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, T J

    1999-09-01

    Under the auspices of a US DOE-Japan Memorandum of Understanding, JAERI and LLNL agreed to develop and evaluate a prototype information exchange protocol for nuclear accident emergency situations. This project received some interest from the US DOS and FEMA, as it fits nicely under the umbrella of the G-7's GEMINI (Global Emergency Management Information Network Initiative) project. Because of LLNL/ARAC and JAERI/WSPEEDI interest in nuclear accident consequence assessment and hazard prediction on all scales, including global, we were happy to participate. Subsequent to the Spring 1997 RODOS-ARAC Workshop, a Memorandum of Agreement was developed to enhance mutual collaboration on matters of emergency systems development. In the summer of 1998 the project leaders of RODOS, WSPEEDI, and ARAC met at FZK and agreed to join in a triangular collaboration on the development and demonstration of an emergency information exchange protocol. JAERI and FZK are engaged in developing a formal cooperation agreement. The purpose of this project is to evaluate the prototype information protocol application for technical feasibility and mutual benefit through simulated (real) events: quick exchange of atmospheric modeling products and environmental data during emergencies, distribution of predicted results to other countries having no prediction capabilities, and utilization of the link for collaborative studies.

  17. Nova Upgrade: A proposed ICF facility to demonstrate ignition and gain, revision 1

    NASA Astrophysics Data System (ADS)

    1992-07-01

    The present objective of the national Inertial Confinement Fusion (ICF) Program is to determine the scientific feasibility of compressing and heating a small mass of mixed deuterium and tritium (DT) to conditions at which fusion occurs and significant energy is released. The potential applications of ICF will be determined by the resulting fusion energy yield (amount of energy produced) and gain (ratio of energy released to energy required to heat and compress the DT fuel). Important defense and civilian applications, including weapons physics, weapons effects simulation, and ultimately the generation of electric power will become possible if yields of 100 to 1,000 MJ and gains exceeding approximately 50 can be achieved. Once ignition and propagating burn producing modest gain (2 to 10) at moderate drive energy (1 to 2 MJ) have been achieved, the extension to high gain (greater than 50) is straightforward. Therefore, the demonstration of ignition and modest gain is the final step in establishing the scientific feasibility of ICF. Lawrence Livermore National Laboratory (LLNL) proposes the Nova Upgrade Facility to achieve this demonstration by the end of the decade. This facility would be constructed within the existing Nova building at LLNL for a total cost of approximately $400 M over the proposed FY 1995-1999 construction period. This report discusses this facility.

  18. Deploying Solid Targets in Dense Plasma Focus Devices for Improved Neutron Yields

    NASA Astrophysics Data System (ADS)

    Podpaly, Y. A.; Chapman, S.; Povilus, A.; Falabella, S.; Link, A.; Shaw, B. H.; Cooper, C. M.; Higginson, D.; Holod, I.; Sipe, N.; Gall, B.; Schmidt, A. E.

    2017-10-01

    We report on recent progress in using solid targets in dense plasma focus (DPF) devices. DPFs have been observed to generate energetic ion beams during the pinch phase; these beams interact with the dense plasma in the pinch region as well as the background gas and are believed to be the primary neutron generation mechanism for a D2 gas fill. Targets can be placed in the beam path to enhance neutron yield and to shorten the neutron pulse if desired. In this work, we measure yields from placing titanium deuteride foils, deuterated polyethylene, and non-deuterated control targets in deuterium filled DPFs at both megajoule and kilojoule scales. Furthermore, we have deployed beryllium targets in a helium gas-filled, kilojoule scale DPF for use as a potential AmBe radiological source replacement. Neutron yield, neutron time of flight, and optical images are used to diagnose the effectiveness of target deployments relative to particle-in-cell simulation predictions. A discussion of target holder engineering for material compatibility and damage control will be shown as well. Prepared by LLNL under Contract DE-AC52-07NA27344. Supported by the Office of Defense Nuclear Nonproliferation Research and Development within U.S. DOE's National Nuclear Security Administration and the LLNL Institutional Computing Grand Challenge program.

  19. Comparison of resistive MHD simulations and experimental CHI discharges in NSTX

    NASA Astrophysics Data System (ADS)

    Hooper, E. B.; Sovinec, C. R.; Raman, R.; Fatima, F.

    2013-10-01

    Resistive MHD simulations with the NIMROD code model CHI discharges for NSTX startup plasmas. Quantitative comparison with experiment ensures that the simulation physics includes the minimal physics set needed to extend the simulations to new experiments, e.g., NSTX-U. Important are time-varying vacuum magnetic fields, ohmic heating, thermal transport, impurity radiation, and spatially varying plasma parameters including density. Equilibria are compared with experimental injector currents, voltages, and parameters including toroidal current, photographs of emitted light, and measurements of midplane temperature profiles, radiation, and surface heating. Initial results demonstrate that adjusting impurity radiation and cross-field transport yields temperatures and injected-current channel widths similar to experiment. These determine the plasma resistance, feeding back to the impedance on the injector power supply. Work performed under the auspices of the U.S. Department of Energy under contracts DE-AC52-07NA27344 at LLNL and DE-AC02-09CH11466 at PPPL, and grants DE-FC02-05ER54813 at PSI Center (U. Wisc.) and DOE-FG02-12ER55115 (at Princeton U.).

  20. LINC Modeling of August 19, 2004 Queen City Barrel Company Fire In Cincinnati, OH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillon, M B; Nasstrom, J S; Baskett, R L

    This report details the information received, assumptions made, actions taken, and products delivered by the Lawrence Livermore National Laboratory (LLNL) during the August 19, 2004 fire at the Queen City Barrel Company (QCB) in Cincinnati, OH. During the course of the event, LLNL provided four sets of plume model products to various Cincinnati emergency response organizations.

  1. Gas Atomization Equipment Statement of Work and Specification for Engineering design, Fabrication, Testing, and Installation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boutaleb, T.; Pluschkell, T. P.

The Gas Atomization Equipment will be used to fabricate metallic powder suitable for Powder Bed Fusion Additive Manufacturing in support of Lawrence Livermore National Laboratory (LLNL) research and development. The project will modernize our capabilities to develop spherical reactive, refractory, and radioactive powders in the 10-75 μm diameter size range at LLNL.

  2. Silicon microelectronic field-emissive devices for advanced display technology

    NASA Astrophysics Data System (ADS)

    Morse, J. D.

    1993-03-01

Field-emission displays (FEDs) offer the potential advantages of high luminous efficiency, low power consumption, and low cost compared to AMLCD or CRT technologies. An LLNL team has developed silicon-point field emitters for vacuum triode structures and has also used thin-film processing techniques to demonstrate planar edge-emitter configurations. LLNL is interested in contributing its experience in this and other FED-related technologies to collaborations for commercial FED development. At LLNL, FED development is supported by computational capabilities in charge transport and surface/interface modeling in order to develop smaller, low-work-function field emitters using a variety of materials and coatings. Thin-film processing, microfabrication, and diagnostic/test labs permit experimental exploration of emitter and resistor structures. High-field standoff technology is an area of long-standing expertise that guides development of low-cost spacers for FEDs. Vacuum sealing facilities are available to complete the FED production engineering process. Drivers constitute a significant fraction of the cost of any flat-panel display. LLNL has an advanced packaging group that can provide chip-on-glass technologies and three-dimensional interconnect generation, permitting driver placement on either the front or the back of the display substrate.

  3. Advanced Analog Signal Processing for Fuzing Final Report CRADA No. TC-1306-96

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, C. Y.; Spencer, D.

The purpose of this CRADA between LLNL and Kaman Aerospace/Raymond Engineering Operations (Raymond) was to demonstrate the feasibility of using Analog/Digital Neural Network (ANN) technology for advanced signal processing, fuzing, and other applications. This cooperation sought to leverage the expertise and capabilities of both parties: Raymond to develop the signature recognition hardware system, drawing on Raymond's extensive experience in system development and its knowledge of military applications, and LLNL to apply ANN and related technologies to an area of significant interest to the United States government. This CRADA effort was anticipated to be a three-year project consisting of three phases: Phase I, Proof-of-Principle Demonstration; Phase II, Proof-of-Design, involving the development of a form-factored integrated sensor and ANN technology processor; and Phase III, Final Design and Release of the integrated sensor and ANN fabrication process. Under Phase I, to be conducted during calendar year 1996, Raymond was to deliver to LLNL an architecture (design) for an ANN chip. LLNL was to translate the design into a stepper mask and to produce and test a prototype chip from the Raymond design.

  4. 2013 R&D 100 Award: New tech could mean more power for fiber lasers

    ScienceCinema

    Dawson, Jay

    2018-01-16

An LLNL team of six physicists has developed a new technology that is a stepping stone toward overcoming some of the limitations on high-power fiber lasers. Their technology, dubbed "Efficient Mode-Converters for High-Power Fiber Amplifiers," allows the power of fiber lasers to be increased while maintaining high beam quality. Currently, fiber lasers are used in machining, on factory floors, and in a number of defense applications, and can produce tens of kilowatts of power. The conventional fiber laser design features a circular core and has fundamental limitations that make it impractical to allow higher laser power unless the core area is increased. LLNL researchers have pioneered a design to increase the laser's core area along the axis of the ribbon fiber. Their design makes it difficult to use a conventional laser beam, so the LLNL team converted the beam into a profile that propagates into the ribbon fiber and is converted back once it is amplified. The use of this LLNL technology will permit the construction of higher power lasers at lower cost and increase the power of fiber lasers from tens of kilowatts to about 100 kilowatts and potentially even higher.

  5. Applying Science and Technology to Combat WMD Terrorism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wuest, C R; Werne, R W; Colston, B W

    2006-05-04

Lawrence Livermore National Laboratory (LLNL) is developing and fielding advanced strategies that dramatically improve the nation's capabilities to prevent, prepare for, detect, and respond to terrorist use of chemical, biological, radiological, nuclear, and explosive (CBRNE) weapons. The science, technology, and integrated systems we provide are informed by and developed with key partners and end users. LLNL's long-standing role as one of the two principal U.S. nuclear weapons design laboratories has led to significant resident expertise in the health effects of exposure to radiation, radiation detection technologies, characterization of radioisotopes, and assessment and response capabilities for terrorist nuclear weapons use. This paper provides brief overviews of a number of technologies developed at LLNL that are being used to address national security needs to confront the growing threats of CBRNE terrorism.

  6. Applying science and technology to combat WMD terrorism

    NASA Astrophysics Data System (ADS)

    Wuest, Craig R.; Werne, Roger W.; Colston, Billy W.; Hartmann-Siantar, Christine L.

    2006-05-01

Lawrence Livermore National Laboratory (LLNL) is developing and fielding advanced strategies that dramatically improve the nation's capabilities to prevent, prepare for, detect, and respond to terrorist use of chemical, biological, radiological, nuclear, and explosive (CBRNE) weapons. The science, technology, and integrated systems we provide are informed by and developed with key partners and end users. LLNL's long-standing role as one of the two principal U.S. nuclear weapons design laboratories has led to significant resident expertise in the health effects of exposure to radiation, radiation detection technologies, characterization of radioisotopes, and assessment and response capabilities for terrorist nuclear weapons use. This paper provides brief overviews of a number of technologies developed at LLNL that are being used to address national security needs to confront the growing threats of CBRNE terrorism.

  7. Development of Plastic Substrate Technology for Active Matrix Liquid Crystal Displays Final Report CRADA No. TC-761-93

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carey, P.; Kamath, H.

Raychem Corporation (RYC) and the Lawrence Livermore National Laboratory (LLNL) conducted a development program with the goal of making rugged, low-cost, high-resolution flat panel displays based on RYC's proprietary Nematic Curvilinear Aligned Phase (NCAP) liquid crystal and LLNL's patented processes for the formation and doping of polycrystalline silicon on low-temperature, flexible, plastic substrates.

  8. Fixatives Application for Risk Mitigation Following Contamination with a Biological Agent

    DTIC Science & Technology

    2011-11-02

    PRES-  Gruinard Island 5% formaldehyde  Sverdlosk Release UNKNOWN: but washing, chloramines , soil disposal believed to have been used...507816 Lawrence Livermore National Laboratory LLNL-PRES- 4 Disinfectant >6 Log Reduction on Materials (EPA, 2010a,b; Wood et al., 2011...LL L-PRES-507816 Lawrence Livermore National Laboratory LLNL-PRES-  High disinfectant concentrations increase operational costs and risk

  9. Tech Transfer Webinar: Energy Absorbing Materials

    ScienceCinema

    Duoss, Eric

    2018-01-16

A new material has been designed and manufactured at LLNL that can absorb mechanical energy, acting as a cushion, while also providing protection against shearing. This ordered cellular material is 3D printed using direct ink writing techniques under development at LLNL. It is expected to find utility in application spaces that currently use unordered foams, such as sporting and consumer goods as well as defense and aerospace.

  10. Astronomy Applications of Adaptive Optics at Lawrence Livermore National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bauman, B J; Gavel, D T

    2003-04-23

Astronomical applications of adaptive optics at Lawrence Livermore National Laboratory (LLNL) have a history extending back to 1984. The program started with the Lick Observatory adaptive optics system and has progressed through the years to ever-larger telescopes: Keck, and now the proposed CELT (California Extremely Large Telescope) 30-m telescope. LLNL AO continues to be at the forefront of AO development and science.

  11. Report of the Defense Science Board Task Force on Critical Homeland Infrastructure Protection

    DTIC Science & Technology

    2007-01-01

Nuclear, radiation, and explosive hazards; monitoring "people of interest" while protecting civil liberties; detection of hostile intent; detect... Guardian DARPA Overview, Mr. Roger Gibbs, DARPA; LLNL Technologies in Support of Infrastructure Protection, Mr. Don Prosnitz, LLNL; Sandia National... Mechanical Engineers; AT/FP, Antiterrorism/Force Protection; CBRNE, Chemical Biological Radiological Nuclear Explosive; CERT, Commuter Emergency Response Team.

  12. Uncertainty quantification of US Southwest climate from IPCC projections.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boslough, Mark Bruce Elrick

    2011-01-01

The Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) made extensive use of coordinated simulations by 18 international modeling groups using a variety of coupled general circulation models (GCMs) with different numerics, algorithms, resolutions, physics models, and parameterizations. These simulations span the 20th century and provide forecasts for various carbon emissions scenarios in the 21st century. All the output from this panoply of models is made available to researchers in an archive maintained by the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at LLNL. I have downloaded these data and completed the first steps toward a statistical analysis of these ensembles for the US Southwest. This constitutes the final report for a late-start LDRD project. Complete analysis will be the subject of a forthcoming report.
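The kind of ensemble statistics described in this record might, in outline, look like the following sketch. The ensemble array, model count, trend, and significance threshold are all hypothetical placeholders standing in for data extracted from the PCMDI archive, not the author's actual analysis:

```python
import numpy as np

# Hypothetical ensemble: 18 models x 100 years of area-averaged
# US-Southwest surface temperature anomalies (deg C), standing in
# for fields extracted from downloaded GCM archive files.
rng = np.random.default_rng(0)
n_models, n_years = 18, 100
ensemble = rng.normal(loc=np.linspace(0.0, 3.0, n_years),
                      scale=0.5, size=(n_models, n_years))

# Multi-model mean and inter-model spread per year: a first step
# toward uncertainty quantification of the projections.
mm_mean = ensemble.mean(axis=0)
mm_std = ensemble.std(axis=0, ddof=1)

# Approximate 90% range across models, assuming rough normality
# of the inter-model distribution.
lo_band = mm_mean - 1.645 * mm_std
hi_band = mm_mean + 1.645 * mm_std
```

The inter-model spread here treats each GCM as an independent sample, which overstates independence between related models; a fuller analysis would weight or cluster the models.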

  13. Conservation Laws for Gyrokinetic Equations for Large Perturbations and Flows

    NASA Astrophysics Data System (ADS)

    Dimits, Andris

    2017-10-01

Gyrokinetic theory has proved to be very useful for the understanding of magnetized plasmas, both to simplify analytical treatments and as a basis for efficient numerical simulations. Gyrokinetic theories were previously developed in two extended orderings that are applicable to large fluctuations and flows as may arise in the tokamak edge and scrape-off layer. In the present work, we cast the resulting equations in a field-theoretical variational form, and derive, up to second order in the respective orderings, the associated global and local energy and (linear and toroidal) momentum conservation relations that result from Noether's theorem. The consequences of these for the various possible choices of numerical discretization used in gyrokinetic simulations are considered. Prepared for US DOE by LLNL under Contract DE-AC52-07NA27344 and supported by the U.S. DOE, OFES.
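As a reminder of the machinery this record invokes, Noether's theorem associates the time-translation invariance of a field Lagrangian with a conserved energy; in generic notation (the symbols below are illustrative, not those of the cited gyrokinetic formulation):

```latex
% Generic Noether energy for a field theory with Lagrangian density
% \mathcal{L}(\psi, \partial_t\psi, \nabla\psi):
E = \int \mathrm{d}^3x \left( \frac{\partial \mathcal{L}}{\partial(\partial_t \psi)}\,\partial_t \psi - \mathcal{L} \right),
\qquad
\frac{\mathrm{d}E}{\mathrm{d}t} = 0
\quad \text{when } \mathcal{L} \text{ has no explicit time dependence.}
```

A discretization that preserves the variational structure inherits a discrete analogue of this conservation law, which is why the choice of discretization matters for the conservation relations discussed in the abstract.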

  14. Thermal conduction study of warm dense aluminum by proton differential heating

    NASA Astrophysics Data System (ADS)

    Ping, Y.; Kemp, G.; McKelvey, A.; Fernandez-Panella, A.; Shepherd, R.; Collins, G.; Sio, H.; King, J.; Freeman, R.; Hua, R.; McGuffey, C.; Kim, J.; Beg, F.

    2016-10-01

A differential heating platform has been developed for thermal conduction studies (Ping et al., PoP 2015), in which a temperature gradient is induced and the subsequent heat flow is probed by time-resolved diagnostics. An experiment using proton differential heating has been carried out at the Titan laser for Au/Al targets. Two single-shot time-resolved diagnostics are employed: SOP (streaked optical pyrometry) for surface temperature and FDI (Fourier domain interferometry) for surface expansion. Hydrodynamic simulations show that after 15 ps, absorption in underdense plasma needs to be taken into account to correctly interpret the SOP data. A comparison between simulations with different thermal conductivity models and a set of data with varying target thickness will be presented. This work was performed under DOE contract DE-AC52-07NA27344 with support from the OFES Early Career program and the LLNL LDRD program.

  15. Fabrication Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blaedel, K.L.

    1993-03-01

The mission of the Fabrication Technology thrust area is to have an adequate base of manufacturing technology, not necessarily resident at Lawrence Livermore National Laboratory (LLNL), to conduct the future business of LLNL. The specific goals continue to be to (1) develop an understanding of fundamental fabrication processes; (2) construct general-purpose process models that will have wide applicability; (3) document findings and models in journals; (4) transfer technology to LLNL programs, industry, and colleagues; and (5) develop continuing relationships with the industrial and academic communities to advance the collective understanding of fabrication processes. The strategy to ensure success is changing. For technologies in which they are expert and which will continue to be of future importance to LLNL, they can often attract outside resources both to maintain their expertise by applying it to a specific problem and to help fund further development. A popular vehicle to fund such work is the Cooperative Research and Development Agreement with industry. For technologies needing development because of their future critical importance and in which they are not expert, they use internal funding sources. These latter are the topics of the thrust area. Three FY-92 funded projects are discussed in this section. Each project clearly moves the Fabrication Technology thrust area towards the goals outlined above. They have also continued their membership in the North Carolina State University Precision Engineering Center, a multidisciplinary research and graduate program established to provide the new technologies needed by high-technology institutions in the US. As members, they have access to and use of the results of its research projects, many of which parallel the precision engineering efforts at LLNL.

  16. Fabrication technology

    NASA Astrophysics Data System (ADS)

    Blaedel, K. L.

    1993-03-01

    The mission of the Fabrication Technology thrust area is to have an adequate base of manufacturing technology, not necessarily resident at Lawrence Livermore National Laboratory (LLNL), to conduct the future business of LLNL. The specific goals continue to be to do the following: (1) develop an understanding of fundamental fabrication processes; (2) construct general purpose process models that will have wide applicability; (3) document findings and models in journals; (4) transfer technology to LLNL programs, industry, and colleagues; and (5) develop continuing relationships with the industrial and academic communities to advance the collective understanding of fabrication processes. The strategy to ensure success is changing. For technologies in which they are expert and which will continue to be of future importance to LLNL, they can often attract outside resources both to maintain their expertise by applying it to a specific problem and to help fund further development. A popular vehicle to fund such work is the Cooperative Research and Development Agreement with industry. For technologies needing development because of their future critical importance and in which they are not expert, they use internal funding sources. These latter are the topics of the thrust area. Three FY-92 funded projects are discussed in this section. Each project clearly moves the Fabrication Technology thrust area towards the goals outlined above. They have also continued their membership in the North Carolina State University Precision Engineering Center, a multidisciplinary research and graduate program established to provide the new technologies needed by high-technology institutions in the U.S. As members, they have access to and use of the results of their research projects, many of which parallel the precision engineering efforts at LLNL.

  17. Criteria evaluation for cleanliness testing phase 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meltzer, Michael; Koester, Carolyn; Stefanni, Chris

    1999-02-04

The Boeing Company (Boeing) contracted with Lawrence Livermore National Laboratory (LLNL) to develop criteria for evaluating the efficacy of its parts cleaning processes. In particular, LLNL and Boeing are attempting to identify levels of contamination that lead to part failures. Sufficient contamination to cause impairment of anodizing, alodining, painting, or welding operations is considered a "part failure." In the "Phase 0" part of the project that was recently completed, preliminary analyses of aluminum substrates were performed as a first step in determining suitable cleanliness criteria for actual Boeing parts made from this material. A wide spread of contamination levels was specified for the Phase 0 test coupons, in the hope of finding a range in which an appropriate cleanliness specification might lie. It was planned that, based on the results of the Phase 0 testing, further and more detailed analyses ("Phase 1 testing") would be performed in order to more accurately identify the most appropriate criteria. For the Phase 0 testing, Boeing supplied LLNL with 3" x 6" and 3" x 10" aluminum test panels, which LLNL contaminated with measured amounts of typical hydrocarbon substances encountered in Boeing's fabrication operations. The panels were then subjected by Boeing to normal cleaning procedures, after which they went through one of the following sets of operations: (1) anodizing and primer painting; (2) alodining (chromating) and primer painting; or (3) welding. The coatings or welds were then examined by both Boeing and LLNL to determine whether any of the operations were impaired, and whether there was a correlation between contamination level and damage to the parts. The experimental approach and results are described in detail.

  18. Special Analysis for the Disposal of the Lawrence Livermore National Laboratory Low Activity Beta/Gamma Sources Waste Stream at the Area 5 Radioactive Waste Management Site, Nevada National Security Site, Nye County, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shott, Gregory J.

This special analysis (SA) evaluates whether the Lawrence Livermore National Laboratory (LLNL) Low Activity Beta/Gamma Sources waste stream (BCLALADOEOSRP, Revision 0) is suitable for disposal by shallow land burial (SLB) at the Area 5 Radioactive Waste Management Site (RWMS) at the Nevada National Security Site (NNSS). The LLNL Low Activity Beta/Gamma Sources waste stream consists of sealed sources that are no longer needed. The waste stream required a special analysis because cobalt-60 (60Co), strontium-90 (90Sr), cesium-137 (137Cs), and radium-226 (226Ra) exceeded the NNSS Waste Acceptance Criteria (WAC) Action Levels (U.S. Department of Energy, National Nuclear Security Administration Nevada Field Office [NNSA/NFO] 2015). The results indicate that all performance objectives can be met with disposal of the LLNL Low Activity Beta/Gamma Sources in a SLB trench. The LLNL Low Activity Beta/Gamma Sources waste stream is therefore suitable for disposal by SLB at the Area 5 RWMS. However, the activity concentration of 226Ra listed on the waste profile sheet significantly exceeds the action level. Approval of the waste profile sheet could potentially allow the disposal of high-activity 226Ra sources. To ensure that the generator does not include large 226Ra sources in this waste stream without additional evaluation, a control is needed on the maximum 226Ra inventory. A limit based on the generator's estimate of the total 226Ra inventory is recommended. The waste stream is recommended for approval with the control that the total 226Ra inventory disposed shall not exceed 5.5E10 Bq (1.5 Ci).

  19. Proceedings of the 14th International Conference on the Numerical Simulation of Plasmas

    NASA Astrophysics Data System (ADS)

    Partial Contents are as follows: Numerical Simulations of the Vlasov-Maxwell Equations by Coupled Particle-Finite Element Methods on Unstructured Meshes; Electromagnetic PIC Simulations Using Finite Elements on Unstructured Grids; Modelling Travelling Wave Output Structures with the Particle-in-Cell Code CONDOR; SST--A Single-Slice Particle Simulation Code; Graphical Display and Animation of Data Produced by Electromagnetic, Particle-in-Cell Codes; A Post-Processor for the PEST Code; Gray Scale Rendering of Beam Profile Data; A 2D Electromagnetic PIC Code for Distributed Memory Parallel Computers; 3-D Electromagnetic PIC Simulation on the NRL Connection Machine; Plasma PIC Simulations on MIMD Computers; Vlasov-Maxwell Algorithm for Electromagnetic Plasma Simulation on Distributed Architectures; MHD Boundary Layer Calculation Using the Vortex Method; and Eulerian Codes for Plasma Simulations.

  20. CERT TST December 2015 Visit Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Little, Robert Currier; Bailey, Teresa S.; Gamblin, G. Todd

    2016-01-25

The annual PSAAP II TST visit to Texas A&M's CERT Center was held on December 1-3, 2015. The agenda for the visit is attached. Non-TAMU attendees were: TST members: Teresa Bailey (LLNL), Todd Gamblin (LLNL), Bob Little (LANL, Chair), Chad Olinger (LANL), Shawn Pautz (SNL), Alan Williams (SNL); other Lab staff: Skip Kahler (LANL), Ana Kupresanin (LLNL), and Rob Lowrie (LANL); AST members: Nelson Hoffman (LANL) and Bob Voigt (Leidos). The TST wishes to express our appreciation to all involved with CERT for the high-quality posters and presentations and for the attention to logistics that enabled a successful visit. We have broken our comments into four sections: (1) Kudos, (2) Recommendations, (3) Feedback on Priorities for April Review, and (4) Follow-Up Activities with Labs.

  1. Effect of Energetic Electrons Produced by Raman Scattering on Hohlraum Dynamics

    NASA Astrophysics Data System (ADS)

    Strozzi, D. J.; Bailey, D. S.; Doeppner, T.; Divol, L.; Harte, J. A.; Michel, P.; Thomas, C. A.

    2016-10-01

A reduced model of laser-plasma interactions, namely crossed-beam energy transfer and stimulated Raman scattering (SRS), has recently been implemented in a self-consistent or "inline" way in radiation-hydrodynamics codes. We extend this work to treat the energetic electrons produced by Langmuir waves (LWs) from SRS by a suprathermal, multigroup diffusion model. This gives less spatially localized heating than depositing the LW energy into the local electron fluid. We compare the resulting hard x-ray production to imaging data on the National Ignition Facility, which indicate significant emission around the laser entrance hole. We assess the effects of energetic electrons, as well as background electron heat flow, on hohlraum dynamics and capsule implosion symmetry. Work performed under the auspices of the U.S. D.O.E. by LLNL under Contract No. DE-AC52-07NA27344.

  2. Reform of the National Security Science and Technology Enterprise

    DTIC Science & Technology

    2008-10-01

still attract the very best S&E talent. Table 1, National Academy Membership (source: National Academies website): ANL, BNL, JPL, LANL, LL, LLNL, IBM... Articles published: ANL 1023, BNL 761, JPL 705, LANL 1526, LLNL 1038, NIH 4305, NIST 350, NRL 957. Government S&E Workforce, Tomorrow: with the significant exception... Argonne National Laboratory (ANL), Brookhaven National Laboratory (BNL), Jet Propulsion Laboratory (JPL), Lincoln Laboratory (LL), Los Alamos National Laboratory (LANL...

  3. Multilayer deposition and EUV reflectance characterization of 131 Å flight mirrors for AIA at LLNL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soufli, R; Robinson, J C; Spiller, E

    2006-02-22

Mo/Si multilayer coatings reflecting at 131 Å were deposited successfully on the AIA primary and secondary flight mirrors and on two coating-witness Si wafers on November 16, 2005, at LLNL. All coatings were characterized by means of EUV reflectance measurements at beamline 6.3.2 of the Advanced Light Source (ALS) synchrotron at LBNL and were found to be well within specifications.

  4. 2013 R&D 100 Award: New tech could mean more power for fiber lasers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dawson, Jay

    2014-04-03

An LLNL team of six physicists has developed a new technology that is a stepping stone toward overcoming some of the limitations on high-power fiber lasers. Their technology, dubbed "Efficient Mode-Converters for High-Power Fiber Amplifiers," allows the power of fiber lasers to be increased while maintaining high beam quality. Currently, fiber lasers are used in machining, on factory floors, and in a number of defense applications, and can produce tens of kilowatts of power. The conventional fiber laser design features a circular core and has fundamental limitations that make it impractical to allow higher laser power unless the core area is increased. LLNL researchers have pioneered a design to increase the laser's core area along the axis of the ribbon fiber. Their design makes it difficult to use a conventional laser beam, so the LLNL team converted the beam into a profile that propagates into the ribbon fiber and is converted back once it is amplified. The use of this LLNL technology will permit the construction of higher power lasers at lower cost and increase the power of fiber lasers from tens of kilowatts to about 100 kilowatts and potentially even higher.

  5. Lawrence Livermore National Laboratory Safeguards and Security quarterly progress report to the US Department of Energy: Quarter ending December 31, 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, G.; Mansur, D.L.; Ruhter, W.D.

    1994-01-01

The Lawrence Livermore National Laboratory (LLNL) carries out safeguards and security activities for the Department of Energy (DOE), Office of Safeguards and Security (OSS), as well as other organizations, both within and outside the DOE. This document summarizes the activities conducted for the OSS during the first quarter of fiscal year 1994 (October through December 1993). The nature and scope of the activities carried out for OSS at LLNL require a broad base of technical expertise. To assure projects are staffed and executed effectively, projects are conducted by the organization at LLNL best able to supply the needed technical expertise. These projects are developed and managed by senior program managers. Institutional oversight and coordination is provided through the LLNL Deputy Director's office. At present, the Laboratory is supporting OSS in five areas: (1) Safeguards Technology, (2) Safeguards and Decision Support, (3) Computer Security, (4) DOE Automated Physical Security, and (5) DOE Automated Visitor Access Control System. This report describes the activities in each of these five areas. The information provided includes an introduction that briefly describes each activity, a summary of major accomplishments, task descriptions with quarterly progress, summaries of milestones and deliverables, and publications published this quarter.

  6. Lawrence Livermore National Laboratory safeguards and security quarterly progress report to the U.S. Department of Energy. Quarter ending December 31, 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, G.; Mansur, D.L.; Ruhter, W.D.

The Lawrence Livermore National Laboratory (LLNL) carries out safeguards and security activities for the Department of Energy (DOE), Office of Safeguards and Security (OSS), as well as other organizations, both within and outside the DOE. This document summarizes the activities conducted for the OSS during the first quarter of fiscal year 1997 (October through December 1996). The nature and scope of the activities carried out for OSS at LLNL require a broad base of technical expertise. To assure projects are staffed and executed effectively, projects are conducted by the organization at LLNL best able to supply the needed technical expertise. These projects are developed and managed by senior program managers. Institutional oversight and coordination is provided through the LLNL Deputy Director's office. At present, the Laboratory is supporting OSS in four areas: (1) safeguards technology; (2) safeguards and material accountability; (3) computer security for distributed systems; and (4) physical and personnel security support. The remainder of this report describes the activities in each of these four areas. The information provided includes an introduction that briefly describes each activity, a summary of major accomplishments, task descriptions with quarterly progress, summaries of milestones and deliverables, and publications published this quarter.

  7. Benchmarking atomic physics models for magnetically confined fusion plasma physics experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    May, M.J.; Finkenthal, M.; Soukhanovskii, V.

In present magnetically confined fusion devices, high- and intermediate-Z impurities are either puffed into the plasma for divertor radiative cooling experiments or are sputtered from the high-Z plasma-facing armor. The beneficial cooling of the edge as well as the detrimental radiative losses from the core of these impurities can be properly understood only if the atomic physics used in the modeling of the cooling curves is very accurate. To this end, a comprehensive experimental and theoretical analysis of some relevant impurities is undertaken. Gases (Ne, Ar, Kr, and Xe) are puffed and nongases are introduced through laser ablation into the FTU tokamak plasma. The charge state distributions and total density of these impurities are determined from spatial scans of several photometrically calibrated vacuum ultraviolet and x-ray spectrographs (3-1600 Å), the multiple ionization state transport code (MIST), and a collisional radiative model. The radiative power losses are measured with bolometry, and the emissivity profiles are measured by a visible bremsstrahlung array. The ionization balance, excitation physics, and radiative cooling curves are computed from the Hebrew University Lawrence Livermore Atomic Code (HULLAC) and are benchmarked by these experiments. (Supported by U.S. DOE Grant No. DE-FG02-86ER53214 at JHU and Contract No. W-7405-ENG-48 at LLNL.) © 1999 American Institute of Physics.

  8. Toward a standard reference database for computer-aided mammography

    NASA Astrophysics Data System (ADS)

    Oliveira, Júlia E. E.; Gueld, Mark O.; de A. Araújo, Arnaldo; Ott, Bastian; Deserno, Thomas M.

    2008-03-01

    Because mammography databases with a large number of codified images and identified characteristics such as pathology, type of breast tissue, and abnormality are lacking, the development of robust systems for computer-aided diagnosis is difficult. Integrated into the Image Retrieval in Medical Applications (IRMA) project, we present an available mammography database developed from the union of: the Mammographic Image Analysis Society Digital Mammogram Database (MIAS), the Digital Database for Screening Mammography (DDSM), the Lawrence Livermore National Laboratory (LLNL) database, and routine images from the Rheinisch-Westfälische Technische Hochschule (RWTH) Aachen. Using the IRMA code, standardized coding of tissue type, tumor staging, and lesion description was developed according to the American College of Radiology (ACR) tissue codes and the ACR Breast Imaging Reporting and Data System (BI-RADS). The import was done automatically using scripts for image download, file format conversion, file naming, and web page and information file browsing. Disregarding resolution, this resulted in a total of 10,509 reference images, of which 6,767 are associated with an IRMA contour information feature file. In accordance with the respective license agreements, the database will be made freely available for research purposes and may be used for image-based evaluation campaigns such as the Cross Language Evaluation Forum (CLEF). We have also shown that it can be extended easily with further cases imported from a picture archiving and communication system (PACS).
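
    The automated import described above (download, format conversion, renaming under a standardized code) can be sketched as a small script. The code format and metadata field names below are invented placeholders for illustration, not the actual IRMA coding scheme.

```python
import urllib.request
from pathlib import Path

# Sketch of an import pipeline in the spirit of the one described: download
# each image and store it under a normalized, code-bearing file name.
# The stem format and metadata keys are hypothetical, not the IRMA scheme.
def irma_style_stem(source: str, case_id: int, tissue: int, birads: int) -> str:
    """Build a normalized file stem from per-image metadata."""
    return f"{source.lower()}_{case_id:05d}_t{tissue}_b{birads}"

def import_images(records, dest="mammo_db"):
    """records: iterable of dicts with keys url, source, case_id, tissue, birads."""
    Path(dest).mkdir(exist_ok=True)
    for r in records:
        stem = irma_style_stem(r["source"], r["case_id"], r["tissue"], r["birads"])
        target = Path(dest) / (stem + ".png")
        if not target.exists():                    # makes the import resumable
            urllib.request.urlretrieve(r["url"], target)

# e.g. irma_style_stem("MIAS", 12, 2, 3) yields a stem encoding source,
# case number, tissue code, and BI-RADS category in one sortable name
```

    Keeping all metadata in the file name, as sketched here, is one simple way to make a merged multi-source collection self-describing; the real project records richer metadata in separate feature files.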

  9. System Modeling of kJ-class Petawatt Lasers at LLNL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shverdin, M Y; Rushford, M; Henesian, M A

    2010-04-14

    The Advanced Radiographic Capability (ARC) project at the National Ignition Facility (NIF) is designed to produce energetic, ultrafast x-rays in the range of 70-100 keV for backlighting NIF targets. The chirped pulse amplification (CPA) laser system will deliver kilojoule pulses at an adjustable pulse duration from 1 ps to 50 ps. The system's complexity requires sophisticated simulation and modeling tools for design, performance prediction, and interpretation of experimental results. We provide a brief overview of ARC, present our main modeling tools, and describe important performance predictions. The laser system (Fig. 1) consists of an all-fiber front end, including chirped fiber Bragg grating (CFBG) stretchers. The beam after the final fiber amplifier is split into two apertures and spatially shaped. The split beam first seeds a regenerative amplifier and is then amplified in a multi-pass Nd:glass amplifier. Next, the preamplified chirped pulse is split in time into four identical replicas and injected into one NIF quad. At the output of the NIF beamline, each of the eight amplified pulses is compressed in an individual, folded, four-grating compressor. Compressor grating pairs have slightly different groove densities to enable a compact folding geometry and eliminate adjacent-beam cross-talk. Pulse duration is adjustable with a small, rack-mounted compressor in the front end. We use the non-sequential ray-tracing software FRED for design and layout of the optical system. Currently, our FRED model includes all of the optical components from the output of the fiber front end to the target center (Fig. 2). CAD-designed opto-mechanical components are imported into our FRED model to provide a complete system description. In addition to incoherent ray tracing and scattering analysis, FRED uses Gaussian beam decomposition to model coherent beam propagation.
    Neglecting nonlinear effects, we can obtain a nearly complete frequency-domain description of the ARC beam at different stages in the system. We employ the 3D Fourier-based propagation codes MIRO, Virtual Beamline (VBL), and PROP for time-domain pulse analysis. These codes simulate nonlinear effects, calculate near- and far-field beam profiles, and account for amplifier gain. Verifying correct system setup is a major difficulty in using these codes. VBL and PROP predictions have been extensively benchmarked against NIF experiments, and the verified descriptions of specific NIF beamlines are used for ARC. MIRO has the added capability of treating bandwidth-specific effects of CPA. A sample MIRO model of the NIF beamline is shown in Fig. 3. MIRO models are benchmarked against VBL and PROP in the narrow-bandwidth mode. Developing a variety of simulation tools allows us to cross-check predictions of different models and gain confidence in their fidelity. Preliminary experiments, currently in progress, are allowing us to validate and refine our models and will help guide future experimental campaigns.
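
    As a toy illustration of the Fourier-based beam transport such codes perform (not the actual algorithms or interfaces of MIRO, VBL, or PROP), the sketch below propagates a Gaussian beam through free space with the paraxial Fresnel transfer function and checks the result against the analytic beam-expansion law. All grid sizes and beam parameters here are invented for the example.

```python
import numpy as np

# Angular-spectrum (Fourier) free-space propagation of a Gaussian beam:
# FFT the field, multiply by the Fresnel transfer function, inverse FFT.
wavelength = 1.053e-6            # Nd:glass wavelength [m]
N, L = 512, 10e-3                # grid points, grid width [m]
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
X, Y = np.meshgrid(x, x)
w0 = 0.5e-3                      # beam waist [m]
field = np.exp(-(X**2 + Y**2) / w0**2)        # Gaussian beam at its waist

fx = np.fft.fftfreq(N, d=L / N)               # spatial frequencies [1/m]
FX, FY = np.meshgrid(fx, fx)
z = 0.5                                       # propagation distance [m]
H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))  # Fresnel kernel
out = np.fft.ifft2(np.fft.fft2(field) * H)

# Analytic check: w(z) = w0 * sqrt(1 + (z/zR)^2), zR = pi w0^2 / lambda
zR = np.pi * w0**2 / wavelength
w_expected = w0 * np.sqrt(1 + (z / zR) ** 2)
I = np.abs(out) ** 2
w_num = 2 * np.sqrt(np.sum(I * X**2) / np.sum(I))  # radius from 2nd moment
```

    The pure-phase transfer function conserves energy, so the summed intensity before and after propagation should match to round-off, and the numerical beam radius should track the analytic expansion law to well under a percent on this grid.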

  10. Multi-level Monte Carlo Methods for Efficient Simulation of Coulomb Collisions

    NASA Astrophysics Data System (ADS)

    Ricketson, Lee

    2013-10-01

    We discuss the use of multi-level Monte Carlo (MLMC) schemes, originally introduced by Giles for financial applications, for the efficient simulation of Coulomb collisions in the Fokker-Planck limit. The scheme is based on a Langevin treatment of collisions, and reduces the computational cost of achieving an RMS error scaling as ε from O(ε⁻³), for standard Langevin methods and binary collision algorithms, to the theoretically optimal scaling O(ε⁻²) for the Milstein discretization, and to O(ε⁻²(log ε)²) with the simpler Euler-Maruyama discretization. In practice, this speeds up simulation by factors of up to 100. We summarize standard MLMC schemes, describe some tricks for achieving the optimal scaling, present results from a test problem, and discuss the method's range of applicability. This work was performed under the auspices of the U.S. DOE by the University of California, Los Angeles, under grant DE-FG02-05ER25710, and by LLNL under contract DE-AC52-07NA27344.
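
    The MLMC idea can be sketched in a few lines. The example below applies a Giles-style estimator with Euler-Maruyama steps to a toy Ornstein-Uhlenbeck SDE, a stand-in for, not the paper's, Coulomb-collision Langevin model; all parameters are illustrative.

```python
import numpy as np

# Multi-level Monte Carlo for E[X(T)] with dX = -X dt + sigma dW.
# Level l uses 2^l Euler-Maruyama steps; fine and coarse paths share
# Brownian increments, so level corrections have small variance.
rng = np.random.default_rng(0)

def level_correction(level, n, T=1.0, x0=1.0, sigma=0.5):
    """Monte Carlo mean of P_l - P_{l-1} (or of P_0 itself at level 0)."""
    n_f = 2 ** level
    dt = T / n_f
    x_f = np.full(n, x0)
    if level == 0:
        dW = rng.normal(0.0, np.sqrt(dt), n)
        return np.mean(x_f + (-x_f) * dt + sigma * dW)
    x_c = np.full(n, x0)
    for _ in range(n_f // 2):
        dW1 = rng.normal(0.0, np.sqrt(dt), n)
        dW2 = rng.normal(0.0, np.sqrt(dt), n)
        x_f += -x_f * dt + sigma * dW1                 # two fine steps ...
        x_f += -x_f * dt + sigma * dW2
        x_c += -x_c * (2 * dt) + sigma * (dW1 + dW2)   # ... one coupled coarse step
    return np.mean(x_f - x_c)

# Telescoping sum: E[P_L] = E[P_0] + sum_{l=1}^{L} E[P_l - P_{l-1}]
estimate = sum(level_correction(l, 20000) for l in range(6))
# exact answer is x0 * exp(-T) ≈ 0.368, up to Euler bias and MC noise
```

    In a production MLMC code the sample count per level is chosen adaptively from the measured level variances, which is where the O(ε⁻²) cost scaling comes from; here a fixed count keeps the sketch short.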

  11. Simulations of the Richtmyer-Meshkov Instability with experimentally measured volumetric initial conditions

    NASA Astrophysics Data System (ADS)

    Ferguson, Kevin; Sewell, Everest; Krivets, Vitaliy; Greenough, Jeffrey; Jacobs, Jeffrey

    2016-11-01

    Initial conditions for the Richtmyer-Meshkov instability (RMI) are measured in three dimensions in the University of Arizona Vertical Shock Tube using a moving-magnet galvanometer system. The resulting volumetric data are used as initial conditions for simulations of the RMI with the ARES code at Lawrence Livermore National Laboratory (LLNL). The heavy gas is sulfur hexafluoride (SF6), and the light gas is air. The perturbations are generated by harmonically oscillating the gases vertically using two loudspeakers mounted to the shock tube, which causes Faraday resonance and produces a random short-wavelength perturbation on the interface. Planar Mie scattering is used to illuminate the flow field through the addition of propylene glycol particles seeded in the heavy gas. An M = 1.2 shock impulsively accelerates the interface, initiating instability growth. Images of the initial condition and instability growth are captured at a rate of 6 kHz using high-speed cameras. Comparisons between experimental and simulation results, mixing diagnostics, and mixing-zone growth are presented.

  12. Real-world applications of artificial neural networks to cardiac monitoring using radar and recent theoretical developments

    NASA Astrophysics Data System (ADS)

    Padgett, Mary Lou; Johnson, John L.; Vemuri, V. Rao

    1997-04-01

    This paper focuses on the use of a new image filtering technique, pulse-coupled neural network factoring, to enhance both the analysis and visual interpretation of noisy sinusoidal time signals, such as those produced by LLNL's Micropower Impulse Radar motion sensor. Separation of a slower carrier wave from faster, finer-detailed signals and from scattered noise is illustrated. The resulting images clearly show the changes over time of simulated heart motion patterns. Such images could potentially assist a field medic in assessing the extent of combat injuries. These images can also be transmitted, or stored and retrieved for later analysis.

  13. An assessment of multibody simulation tools for articulated spacecraft

    NASA Technical Reports Server (NTRS)

    Man, Guy K.; Sirlin, Samuel W.

    1989-01-01

    A survey of multibody simulation codes was conducted in the spring of 1988 to obtain an assessment, from the codes' users, of the state of the art in multibody simulation. This survey covers the most often used articulated multibody simulation codes in the spacecraft and robotics community. There was no attempt to survey all available multibody codes in all disciplines. Furthermore, this is not an exhaustive evaluation even of robotics and spacecraft multibody simulation codes, as the survey was designed to capture feedback on the issues most important to users of simulation codes. It must be kept in mind that the information received was limited and the technical backgrounds of the respondents varied greatly; therefore, only the most often cited observations from the questionnaire are reported here. The survey found that no single code was both widely used and free of reported limitations. The first section reports on multibody code applications. Following applications is a discussion of execution time, the most troublesome issue for flexible multibody codes. The representation of component flexible bodies, which affects both simulation setup time and execution time, is presented next. Following component data preparation, two sections address the accessibility or usability of a code, evaluated by considering its user interface design and examining the overall integrated simulation environment. A summary of user efforts at code verification is reported, followed by a tabular summary of the questionnaire responses. Finally, some conclusions are drawn.

  14. Final closure plan for the high-explosives open burn treatment facility at Lawrence Livermore National Laboratory Experimental Test Site 300

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mathews, S.

    This document addresses the interim status closure of the HE Open Burn Treatment Facility, as detailed by Title 22, Division 4.5, Chapter 15, Article 7 of the California Code of Regulations (CCR) and by Title 40, Code of Federal Regulations (CFR) Part 265, Subpart G, "Closure and Post-Closure." The Closure Plan (Chapter 1) and the Post-Closure Plan (Chapter 2) address the concept of long-term hazard elimination. The Closure Plan provides for capping and grading the HE Open Burn Treatment Facility and revegetating the immediate area in accordance with applicable requirements. The Closure Plan also reflects careful consideration of site location and topography, geologic and hydrologic factors, climate, cover characteristics, type and amount of wastes, and the potential for contaminant migration. The Post-Closure Plan is designed to allow LLNL to monitor the movement, if any, of pollutants from the treatment area. In addition, quarterly inspections will ensure that all surfaces of the closed facility, including the cover and diversion ditches, remain in good repair, thus precluding the potential for contaminant migration.

  15. The U. S. Department of Energy SARP review training program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mauck, C.J.

    1988-01-01

    In support of its radioactive material packaging certification program, the U.S. Department of Energy (DOE) has established a special training workshop. The purpose of the two-week workshop is to develop skills in reviewing Safety Analysis Reports for Packagings (SARPs) and performing confirmatory analyses. The workshop, conducted by the Lawrence Livermore National Laboratory (LLNL) for DOE, is divided into two parts: methods of review and methods of analysis. The sessions covering methods of review are based on the DOE document "Packaging Review Guide for Reviewing Safety Analysis Reports for Packagings" (PRG). These sessions cover relevant DOE Orders and all areas of review in the applicable Nuclear Regulatory Commission (NRC) Regulatory Guides. The technical areas addressed include structural and thermal behavior, materials, shielding, criticality, and containment. The course sessions on methods of analysis provide hands-on experience in the use of calculational methods and codes for reviewing SARPs. Analytical techniques and computer codes are discussed, and sample problems are worked. Homework is assigned each night and over the included weekend; at the conclusion, a comprehensive take-home examination is given, requiring six to ten hours to complete.

  16. 2013 R&D 100 Award: DNATrax could revolutionize air quality detection and tracking

    ScienceCinema

    Farquar, George

    2018-01-16

    A team of LLNL scientists and engineers has developed a safe and versatile material, known as DNA Tagged Reagents for Aerosol Experiments (DNATrax), that can be used to reliably and rapidly diagnose airflow patterns and problems in both indoor and outdoor venues. Until DNATrax particles were developed, no rapid or safe way existed to validate air transport models with realistic particles in the 1-10 micron range. Successful DNATrax testing was conducted at the Pentagon in November 2012 in conjunction with the Pentagon Force Protection Agency. This study enhanced the team's understanding of the indoor ventilation environments created by heating, ventilation and air conditioning (HVAC) systems. DNATrax particles are composed of sugar and synthetic DNA that serves as a bar code for each particle. The potential for creating unique bar-coded particles is virtually unlimited, allowing simultaneous and repeated releases, which dramatically reduces the costs of testing for contaminants. Applications for the new material include indoor air quality detection for homes, offices, ships and airplanes; urban particulate tracking for subway stations, train stations, and convention centers; environmental release tracking; and oil and gas uses, including fracking, to better track fluid flow.

  17. SABrE User's Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, S.A.

    In a computing landscape with a plethora of hardware architectures and supporting software systems, ranging from compilers to operating systems, there is an obvious and strong need for a philosophy of software development that lends itself to the design and construction of portable code systems. The current efforts to standardize software bear witness to this need. SABrE is an effort to implement a software development environment that is itself portable and promotes the design and construction of portable applications. SABrE does not include such important tools as editors and compilers; well-built tools of that kind are readily available across virtually all computer platforms. The areas that SABrE addresses are at a higher level, involving issues such as data portability, portable inter-process communication, and graphics. These blocks of functionality have particular significance to the kind of code development done at LLNL, which is partly why the general computing community has not supplied us with these tools already. This is a key feature of software development environments that we must recognize: the general computing community cannot and should not be expected to produce all of the tools we require.

  19. Spectral and Atomic Physics Analysis of Xenon L-Shell Emission From High Energy Laser Produced Plasmas

    NASA Astrophysics Data System (ADS)

    Thorn, Daniel; Kemp, G. E.; Widmann, K.; Benjamin, R. D.; May, M. J.; Colvin, J. D.; Barrios, M. A.; Fournier, K. B.; Liedahl, D.; Moore, A. S.; Blue, B. E.

    2016-10-01

    The spectrum of the L-shell (n = 2) radiation in mid- to high-Z ions is useful for probing plasma conditions in the multi-keV temperature range. Xenon in particular, with its L-shell radiation centered around 4.5 keV, is copiously produced from plasmas with electron temperatures in the 5-10 keV range. We report on a series of time-resolved L-shell Xe spectra measured with the NIF X-ray Spectrometer (NXS) in high-energy, long-pulse (>10 ns) laser-produced plasmas at the National Ignition Facility. The resolving power of the NXS is sufficiently high (E/ΔE > 100) in the 4-5 keV spectral band that emission from different charge states is resolved. An analysis of the time-resolved L-shell spectrum of Xe is presented, along with spectral modeling by detailed radiation transport and atomic physics from the SCRAM code and comparison with predictions from HYDRA, a radiation-hydrodynamics code with inline atomic physics from CRETIN. This work was performed under the auspices of the U.S. Department of Energy by LLNL under Contract DE-AC52-07NA27344.

  1. Code Samples Used for Complexity and Control

    NASA Astrophysics Data System (ADS)

    Ivancevic, Vladimir G.; Reid, Darryn J.

    2015-11-01

    The following sections are included: * Mathematica® Code * Generic Chaotic Simulator * Vector Differential Operators * NLS Explorer * C++ Code * C++ Lambda Functions for Real Calculus * Accelerometer Data Processor * Simple Predictor-Corrector Integrator * Solving the BVP with the Shooting Method * Linear Hyperbolic PDE Solver * Linear Elliptic PDE Solver * Method of Lines for a Set of the NLS Equations * C# Code * Iterative Equation Solver * Simulated Annealing: A Function Minimum * Simple Nonlinear Dynamics * Nonlinear Pendulum Simulator * Lagrangian Dynamics Simulator * Complex-Valued Crowd Attractor Dynamics * Freeform Fortran Code * Lorenz Attractor Simulator * Complex Lorenz Attractor * Simple SGE Soliton * Complex Signal Presentation * Gaussian Wave Packet * Hermitian Matrices * Euclidean L2-Norm * Vector/Matrix Operations * Plain C Code: Levenberg-Marquardt Optimizer * FreeBASIC Code: 2D Crowd Dynamics with 3000 Agents
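
    To give the flavor of one entry in the list ("Nonlinear Pendulum Simulator"), here is a minimal re-sketch in Python rather than the book's own languages; the RK4 integrator and parameters are generic illustrations, not the book's code.

```python
import math

# Nonlinear pendulum, theta'' = -(g/L) sin(theta), integrated with classical
# RK4. Energy conservation is the usual sanity check for the integrator.
g, L = 9.81, 1.0

def deriv(state):
    theta, omega = state
    return (omega, -(g / L) * math.sin(theta))

def rk4_step(state, dt):
    k1 = deriv(state)
    k2 = deriv((state[0] + 0.5 * dt * k1[0], state[1] + 0.5 * dt * k1[1]))
    k3 = deriv((state[0] + 0.5 * dt * k2[0], state[1] + 0.5 * dt * k2[1]))
    k4 = deriv((state[0] + dt * k3[0], state[1] + dt * k3[1]))
    return (state[0] + dt * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6,
            state[1] + dt * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6)

def energy(state):
    theta, omega = state
    return 0.5 * (L * omega) ** 2 + g * L * (1 - math.cos(theta))

state, dt = (1.0, 0.0), 1e-3       # release from 1 rad, at rest
for _ in range(10000):             # 10 s of motion
    state = rk4_step(state, dt)
e_drift = abs(energy(state) - energy((1.0, 0.0)))
```

    With a fourth-order method at this step size the energy drift stays far below the initial energy, which is what distinguishes a usable simulator from a naive Euler loop.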

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sparks, Sandy; Miller, Russell B.

    This project evaluated the information security posture of QUALCOMM's Internet connections. It also enhanced and refined LLNL's ability to perform such evaluations and added to its body of knowledge concerning Internet threats, vulnerabilities, and countermeasures. The evaluations required a high degree of trust and cooperation between the assessors (LLNL) and the target organization (QUALCOMM). Without this high level of cooperation, the activity could easily have become an adversarial, audit-type situation, counterproductive to all parties.

  3. Electron-Beam Vapor Deposition of Mold Inserts Final Report CRADA No. TSB-777-94

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shepp, T.; Feeley, T.

    Lawrence Livermore National Laboratory and H.G.G. Laser Fare, Inc. studied the application of electron-beam vapor deposition technology to the production of mold inserts for use in an injection molding machine at Laser Fare. Laser Fare provided LLNL with the requirements for the mold inserts as well as sample inserts. LLNL replicated the mold insert(s) and delivered them to Laser Fare for testing.

  4. Selected results from LLNL-Hughes RAR for West Coast Scotland Experiment 1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehman, S K; Johnston, B; Twogood, R

    1993-01-05

    The joint US-UK 1991 West Coast Scotland Experiment (WCSEX) was held in two locations: from July 5 to 12, 1991, in Upper Loch Linnhe, and from July 18 to 26, 1991, in the Sound of Sleat. The LLNL-Hughes team fielded a fully polarimetric X-band hillside real aperture radar to collect internal wave wake data. We present here a sample data set from the best radar runs.

  5. LLNL's Big Science Capabilities Help Spur Over $796 Billion in U.S. Economic Activity Sequencing the Human Genome

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Jeffrey S.

    LLNL's successful history of taking on big science projects extends beyond national security and has helped create billions of dollars per year in new economic activity. One example is LLNL's role in helping sequence the human genome. Over $796 billion in new economic activity across more than half a dozen fields has been documented since LLNL successfully completed this Grand Challenge.

  6. Fe L-shell Excitation Cross Section Measurements on EBIT-I

    NASA Astrophysics Data System (ADS)

    Chen, Hui; Beiersdorfer, P.; Brown, G.; Boyce, K.; Kelley, R.; Kilbourne, C.; Porter, F.; Gu, M. F.; Kahn, S.

    2006-09-01

    We report measurements of electron impact excitation cross sections for the strong iron L-shell 3-2 lines of Fe XVII to Fe XXIV, made at the LLNL EBIT-I electron beam ion trap using a crystal spectrometer and NASA Goddard Space Flight Center's 6x6 pixel array microcalorimeter. The cross sections were determined by direct normalization to the well-established cross sections for radiative electron capture. Our results include excitation cross sections for over 50 lines at multiple electron energies. Although we found that for the 3C line in Fe XVII the measured cross sections differ significantly from theory, in most cases measurement and theory agree within 20%. This work was performed under the auspices of the U.S. DOE by LLNL under contract No. W-7405-Eng-48 and supported by NASA APRA grants to LLNL, GSFC, and Stanford University.

  7. Emergency Response Capability Baseline Needs Assessment - Compliance Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharry, John A.

    This document was prepared by John A. Sharry, LLNL Fire Marshal and Division Leader for Fire Protection, and was reviewed by LLNL Emergency Management Department Head James Colson. This document is the second part of a two-part analysis of the emergency response capabilities of Lawrence Livermore National Laboratory. The first part, the 2016 Baseline Needs Assessment Requirements Document, established the minimum performance criteria necessary to meet mandatory requirements. This second part analyzes the performance of the Lawrence Livermore National Laboratory Emergency Management Department against the contents of the Requirements Document. The document was prepared based on an extensive review of information contained in the 2016 BNA, a review of Emergency Planning Hazards Assessments, and reviews of building construction, occupancy, fire protection features, dispatch records, LLNL alarm system records, fire department training records, and fire department policies and procedures. The 2013 BNA was approved by NNSA's Livermore Field Office on January 22, 2014.

  8. Estimated use of explosives in the mining industries of Algeria, Iran, Iraq, and Libya

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilburn, D.R.; Russell, J.A.; Bleiwas, D.I.

    1995-09-01

    This work was performed under Memorandum of Agreement B291534 between the Lawrence Livermore National Laboratory (LLNL) and the United States Bureau of Mines. The Bureau of Mines authors are members of the Minerals Availability Field Office (MAFO) in Denver, CO, which uses an extensive network of information sources to develop and maintain the Minerals Availability database on mining and minerals properties worldwide. This study was initiated and directed by F. Heuze at LLNL. A previous study on the same subject had been commissioned by LLNL from Mining Journal Research Services (MJRS) in London, UK; its results were integrated into this report, and MJRS is shown as one of the numerous sources used for this work. All sources are listed in the report. This document is arranged in four sections, one for each country, in alphabetical order. The outline is the same for each country.

  9. Hazardous-waste analysis plan for LLNL operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, R.S.

    The Lawrence Livermore National Laboratory is involved in many facets of research, ranging from nuclear weapons research to advanced biomedical studies. Approximately 80% of all programs at LLNL generate hazardous waste in one form or another. Aside from the waste produced by industrial-type operations (oils, solvents, bottom sludges, etc.), many unique and toxic wastes are generated, such as phosgene, dioxin (TCDD), radioactive wastes, and high explosives. Any successful waste management program must address the following: proper identification of the waste, safe handling procedures, and proper storage containers and areas. This section of the Waste Management Plan addresses the methodologies used for the analysis of hazardous waste. In addition to the wastes defined in 40 CFR 261, LLNL and Site 300 also generate radioactive waste not specifically covered by RCRA; however, for completeness, the Waste Analysis Plan addresses all hazardous waste.

  10. Lawrence Livermore National Laboratory and Sandia National Laboratory Nuclear Accident Dosimetry Support of IER 252 and the Dose Characterization of the Flattop Reactor at the DAF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hickman, D. P.; Jeffers, K. L.; Radev, R. P.

    In support of IER 252, “Characterization of the Flattop Reactor at the NCERC,” LLNL performed ROSPEC measurements of the neutron spectrum and deployed 129 Personnel Nuclear Accident Dosimeters (PNADs) to establish the need for height corrections and to verify the neutron spectrum evaluation of the fluences and dose. Only a very limited number of heights (typically one or two) can be measured using neutron spectrometers; it was therefore important to determine whether any height correction would be needed in future intercomparisons and studies. Specific measurement positions around the Flattop reactor are provided in Figure 1. Table 1 provides run and position information for the LLNL measurements. The LLNL ROSPEC (R2) was used for run numbers 1-7. PNADs were positioned on trees during run numbers 9, 11, and 13.

  11. CHEETAH: A fast thermochemical code for detonation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fried, L.E.

    1993-11-01

    For more than 20 years, TIGER has been the benchmark thermochemical code in the energetic materials community. TIGER has been widely used because it gives good detonation parameters in a very short period of time. Despite its success, TIGER is beginning to show its age. The program's chemical equilibrium solver frequently crashes, especially when dealing with many chemical species. It often fails to find the C-J point. Finally, there are many inconveniences for the user stemming from the program's roots in pre-modern FORTRAN. These inconveniences often lead to mistakes in preparing input files and thus to erroneous results. We are producing a modern version of TIGER, which combines the best features of the old program with new capabilities, better computational algorithms, and improved packaging. The new code, which will evolve out of TIGER in the next few years, will be called "CHEETAH." Many of the capabilities that will be put into CHEETAH are inspired by the thermochemical code CHEQ. The new capabilities of CHEETAH are: calculation of trace levels of chemical compounds for environmental analysis; a kinetics capability, with which CHEETAH will predict chemical compositions as a function of time given individual chemical reaction rates (initial application: carbon condensation); and the incorporation of partial reactions. CHEETAH will be based on computer-optimized JCZ3 and BKW parameters, fit to over 20 years of data collected at LLNL; we will run CHEETAH thousands of times to determine the best possible parameter sets. CHEETAH will also fit C-J data to JWLs and predict full-wall and half-wall cylinder velocities.

  12. Computation Directorate 2008 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, D L

    2009-03-25

    Whether a computer is simulating the aging and performance of a nuclear weapon, the folding of a protein, or the probability of rainfall over a particular mountain range, the necessary calculations can be enormous. Our computers help researchers answer these and other complex problems, and each new generation of system hardware and software widens the realm of possibilities. Building on Livermore's historical excellence and leadership in high-performance computing, Computation added more than 331 trillion floating-point operations per second (teraFLOPS) of power to LLNL's computer room floors in 2008. In addition, Livermore's next big supercomputer, Sequoia, advanced ever closer to its 2011-2012 delivery date, as architecture plans and the procurement contract were finalized. Hyperion, an advanced-technology cluster test bed that teams Livermore with 10 industry leaders, made a big splash when it was announced during Michael Dell's keynote speech at the 2008 Supercomputing Conference. The Wall Street Journal touted Hyperion as a 'bright spot amid turmoil' in the computer industry. Computation continues to measure and improve the costs of operating LLNL's high-performance computing systems by moving hardware support in-house, by measuring causes of outages so that resources can be applied asymmetrically, and by automating most of the account and access authorization and management processes. These improvements enable more dollars to go toward fielding the best supercomputers for science, while operating them at less cost and with greater responsiveness to customers.

  13. Comparison of photon attenuation coefficients (2-150 keV) for diagnostic imaging simulations

    NASA Astrophysics Data System (ADS)

    Dodge, Charles W., III; Flynn, Michael J.

    2004-05-01

    The Radiology Research Laboratory at the Henry Ford Hospital has been involved in modeling x-ray units in order to predict image quality. A critical part of that modeling process is the accurate choice of interaction coefficients. This paper serves as a review and comparison of existing interaction models. Our objective was to obtain accurate and easily calculated interaction coefficients at diagnostically relevant energies. We obtained data from the McMaster tables, Lawrence Berkeley Laboratory (LBL) data, the XCOM and FFAST databases from NIST, and the EPDL-97 database via LLNL. Our studies involve low-energy photons; therefore, comparisons were limited to coherent (Rayleigh), incoherent (Compton), and photoelectric effects, which were summed to determine a total interaction cross section. Without measured data, it becomes difficult to definitively choose the most accurate method. However, known limitations in the McMaster data and smoothing of photo-edge transitions can be used as a guide to establish more valid approaches. The methods were compared with one another graphically and at individual points. We found that agreement among all methods was excellent away from photo-edges. Near photo-edges and at low energies, most methods were less accurate. Only the Chantler (FFAST) data consistently and accurately predicted the placement of edges (through the M shell) while minimizing smoothing errors. The EPDL-97 data from LLNL was the best overall method for predicting coherent and incoherent cross sections.
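The comparison above sums partial interaction coefficients into a total cross section and relies on tabulated values between energies; away from absorption edges, log-log interpolation is the usual scheme. A minimal sketch of both operations (any numeric values used with it would be placeholders, not values from the databases compared):

```python
import math

def total_coefficient(coherent, incoherent, photoelectric):
    """Total interaction coefficient as the sum of the partial
    (Rayleigh + Compton + photoelectric) coefficients, as in the comparison."""
    return coherent + incoherent + photoelectric

def loglog_interp(e, e_lo, mu_lo, e_hi, mu_hi):
    """Log-log interpolation of a coefficient between two tabulated energy
    points; adequate away from photo-edges, where cross sections are smooth."""
    t = (math.log(e) - math.log(e_lo)) / (math.log(e_hi) - math.log(e_lo))
    return math.exp((1.0 - t) * math.log(mu_lo) + t * math.log(mu_hi))
```

Log-log interpolation fails near photo-edges, which is exactly why edge placement and smoothing behavior distinguish the databases in the study above.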

  14. Modeling the Blast Load Simulator Airblast Environment using First Principles Codes. Report 1, Blast Load Simulator Environment

    DTIC Science & Technology

    2016-11-01

    ERDC/GSL TR-16-31. Modeling the Blast Load Simulator Airblast Environment Using First Principles Codes, Report 1: Blast Load Simulator Environment. Gregory C. Bessette, James L. O'Daniel. The objective was to evaluate several first principles codes (FPCs) for modeling airblast environments typical of those encountered in the BLS. The FPCs considered were ...

  15. An ARM data-oriented diagnostics package to evaluate the climate model simulation

    NASA Astrophysics Data System (ADS)

    Zhang, C.; Xie, S.

    2016-12-01

    A set of diagnostics that utilizes long-term, high-frequency measurements from the DOE Atmospheric Radiation Measurement (ARM) program is developed for evaluating the regional simulation of clouds, radiation, and precipitation in climate models. The diagnostic results are computed and visualized automatically in a Python-based package that aims to serve as an easy entry point for evaluating climate simulations using the ARM data, as well as the CMIP5 multi-model simulations. Basic performance metrics are computed to measure the accuracy of the mean state and variability of the simulated regional climate. The evaluated physical quantities include vertical profiles of clouds, temperature, relative humidity, cloud liquid water path, total column water vapor, precipitation, sensible and latent heat fluxes, radiative fluxes, and aerosol and cloud microphysical properties. Process-oriented diagnostics focusing on individual cloud- and precipitation-related phenomena are developed for the evaluation and development of specific model physical parameterizations. Application of the ARM diagnostics package will be presented in the AGU session. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. IM release number: LLNL-ABS-698645.
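The "basic performance metrics" mentioned above typically reduce to bias and root-mean-square error between simulated and observed climatologies. A minimal sketch of such metrics (illustrative only, not code from the actual ARM package):

```python
import math

def mean_bias(model, obs):
    """Mean bias of a simulated field against co-located observations."""
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    """Root-mean-square error, a basic measure of mean-state accuracy."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))
```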

  16. An Improved Signal Model for Axion Dark Matter Searches

    NASA Astrophysics Data System (ADS)

    Lentz, Erik; ADMX Collaboration

    2017-01-01

    To date, most direct detection searches for axion dark matter, such as the Axion Dark Matter eXperiment (ADMX) microwave cavity search, have assumed a signal shape based on an isothermal spherical model of the Milky Way halo. Such a model cannot capture contributions from realistic infall or from a baryonic disk. Modern N-body simulations of structure formation can produce realistic Milky Way-like halos that include the effects of baryons, infall, and environment. This talk presents an analysis of the Romulus25 N-body simulation in the context of direct axion dark matter searches. An improved signal shape and an account of the relevant halo dynamics are given. Supported by DOE Grants DE-SC0010280, DE-FG02-96ER40956, DE-AC52-07NA27344, DE-AC03-76SF00098, the Heising-Simons Foundation, and the LLNL, FNAL and PNNL LDRD programs.
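For reference, the isothermal-halo assumption being improved upon corresponds to a Maxwell-Boltzmann speed distribution, which fixes the signal's shape in frequency once the axion rest-mass frequency is chosen. A minimal sketch of that baseline lineshape (un-normalized; lab-frame motion neglected; the dispersion scale V0 is an illustrative value, not a Romulus25 result):

```python
import math

C = 2.998e8   # speed of light, m/s
V0 = 220e3    # illustrative halo velocity-dispersion scale, m/s

def isothermal_lineshape(nu, nu_a):
    """Un-normalized isothermal (Maxwellian) axion signal shape vs. frequency.
    The photon frequency exceeds the rest-mass frequency nu_a by the axion
    kinetic energy: nu = nu_a * (1 + v^2 / (2 c^2))."""
    if nu <= nu_a:
        return 0.0
    v_sq = 2.0 * C * C * (nu / nu_a - 1.0)   # recover v^2 from the frequency
    # f(v) dv ~ v^2 exp(-v^2/V0^2) dv maps to f(nu) ~ v exp(-v^2/V0^2)
    return math.sqrt(v_sq) * math.exp(-v_sq / (V0 * V0))
```

The shape is zero below the rest-mass frequency, rises to a peak over a fractional width of order (V0/C)^2, and decays in a Maxwellian tail; N-body-informed models modify this shape.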

  17. Demonstration of Regional Discrimination of Eurasian Seismic Events Using Observations at Soviet IRIS and CDSN Stations

    DTIC Science & Technology

    1992-03-01

    Propagation of Lg Waves Across Eastern Europe and Asia, Lawrence Livermore National Laboratory Report UCRL-52494. Press, F., and M. Ewing, ... the Nuclear Testing Ground in Eastern Kazakhstan, Lawrence Livermore National Laboratory Report UCRL-52856. Ruzaikin, A., I. Nersesov, ...

  18. Hydrologic Resources Management Program and Underground Tests Area Project FY 2003 Progress Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J., B C; F., E G; K., E B

    This report describes FY 2003 technical studies conducted by the Chemical Biology and Nuclear Science Division (CBND) at Lawrence Livermore National Laboratory (LLNL) in support of the Hydrologic Resources Management Program (HRMP) and the Underground Test Area (UGTA) Project. These programs are administered by the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office (NNSA/NSO) through the Defense Programs and Environmental Restoration Divisions, respectively. HRMP-sponsored work is directed toward the responsible management of the natural resources at the Nevada Test Site (NTS), enabling its continued use as a staging area for strategic operations in support of national security. UGTA-funded work emphasizes the development of an integrated set of groundwater flow and contaminant transport models to predict the extent of radionuclide migration from underground nuclear testing areas at the NTS. The present report is organized on a topical basis and contains five chapters that reflect the range of technical work performed by LLNL-CBND during FY 2003. Although we have emphasized investigations that were led by CBND, we also participated in a variety of collaborative studies with other UGTA and HRMP contract organizations including the Energy and Environment Directorate at LLNL (LLNL-E&E), Los Alamos National Laboratory (LANL), the Desert Research Institute (DRI), the U.S. Geological Survey (USGS), Stoller-Navarro Joint Venture (SNJV), and Bechtel Nevada (BN).

  19. Newberry Seismic Deployment Fieldwork Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, J; Templeton, D C

    2012-03-21

    This report summarizes the seismic deployment of Lawrence Livermore National Laboratory (LLNL) Geotech GS-13 short-period seismometers at the Newberry Enhanced Geothermal System (EGS) Demonstration site located in Central Oregon. This Department of Energy (DOE) demonstration project is managed by AltaRock Energy Inc. AltaRock Energy had previously deployed Geospace GS-11D geophones at the Newberry EGS Demonstration site; however, the quality of the seismic data was somewhat low. The purpose of the LLNL deployment was to install more sensitive sensors that would record higher-quality seismic data for use in future seismic studies, such as ambient noise correlation, matched field processing earthquake detection studies, and general EGS microearthquake studies. For the LLNL deployment, seven three-component seismic stations were installed around the proposed AltaRock Energy stimulation well. The LLNL seismic sensors were connected to AltaRock Energy Güralp CMG-DM24 digitizers, which are powered by AltaRock Energy solar panels and batteries. The deployment took four days in two phases. In phase I, the sites were identified, a cavity approximately 3 feet deep was dug, and a flat concrete pad oriented to true north was made at each site. In phase II, we installed three single-component GS-13 seismometers at each site, quality-controlled the data to ensure that each station was recording properly, and filled in each cavity with native soil.

  20. California Tribal Nations Technical Water Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ben, C; Coty, J

    2005-08-15

    This research focused on identifying the key technical water issues of federally recognized California Native American tribes, the context within which these water issues arise for the tribes, and an appropriate format for potentially opening further dialogue on water research issues between the tribes and Lawrence Livermore National Laboratory (LLNL) scientists. At LLNL, a Water Quality and Resource Management Issues Workshop held in January of 2003 resulted in multiple recommendations, one proposing an LLNL dialogue with California tribes to further inform LLNL's prioritization of water issues based on identified needs across national sectors. The focus of this Water Quality and Resource Management Issues Workshop was to identify national and international priority water research issues with which LLNL may align its research efforts and contribute to resolving these needs. LLNL staff researched various sectors to delineate the key water issues associated with each. This preliminary water issue research included diverse entities such as international water agencies, federal and state agencies, industry, non-governmental agencies, and private organizations. The key (identified) water issues across these sectors were presented to workshop attendees and used during workshop debates and sessions. However, the key water issues of federally recognized Native American tribes remained less understood, resulting in a workshop proposal for additional research and for LLNL potentially hosting a dialogue with representatives of these tribes. Federally recognized Native American tribes have a unique government-to-government relationship with the United States (U.S.) government, in contrast to other sectors researched for the workshop. Within the U.S., the number of federally recognized tribes currently stands at 562 and, in addition to this large number of tribes, much diversity across these tribes exists. 
For the purposes of this preliminary research and report, it was necessary to confine the analysis to a smaller geographic area, yet still represent the diversity of tribes and the context within which tribal water issues arise. The state of California provides this opportunity. California has 106 federally recognized tribes. California is diverse in its geography, environment, demographics, and economic bases; California tribes demonstrate similar diversity. Additionally, no central repository of national or state tribal water issues exists, and information must be aggregated, in general, tribe by tribe. This presents research challenges and, for this report, these were overcome by developing a method to essentially "sub-sample" the 106 federally recognized tribes in the state, while making every effort to maintain a sub-sample that broadly represents all 106 tribes. In an effort to develop an equitable and appropriate method with which to identify this set of representative tribes, multiple entities were contacted for guidance. Consultation with the Bureau of Indian Affairs (BIA), Environmental Protection Agency (EPA), Indian Health Services (IHS), U.S. Department of Agriculture (USDA), Bureau of Reclamation (BOR), and Tribal Environmental Directors provided key information and recommendations to guide the research process. It is hoped that an appropriate representation of the diversity of tribes across the state has been achieved; this includes an adequate representation of similarities and differences between Californian tribes on key water research issues (and the same between regions). This research occurred over a limited time period (three months) and, given a general concern that this may not be sufficient, any information and conclusions in this report should be viewed with this in mind. 
Finally, it is hoped that this research allows for an enhanced, informed capacity to better propose further dialogue between tribes and LLNL, to continue to exchange water research perspectives, and to define potential research collaborations.

  1. A comparison between implicit and hybrid methods for the calculation of steady and unsteady inlet flows

    NASA Technical Reports Server (NTRS)

    Coakley, T. J.; Hsieh, T.

    1985-01-01

    Numerical simulations of steady and unsteady transonic diffuser flows using two different computer codes are discussed and compared with experimental data. The codes solve the Reynolds-averaged, compressible Navier-Stokes equations using various turbulence models. One of the codes has been applied extensively to diffuser flows and uses the hybrid method of MacCormack; this code is relatively inefficient numerically. The second code, which was developed more recently, is fully implicit and relatively efficient numerically. Simulations of steady flows using the implicit code are shown to be in good agreement with simulations using the hybrid code, and both are in good agreement with experimental results. Simulations of unsteady flows using the two codes are in good qualitative agreement with each other, although the quantitative agreement is not as good as in the steady-flow cases. The implicit code is shown to be eight times faster than the hybrid code for unsteady flow calculations and up to 32 times faster for steady flow calculations. Results of calculations using alternative turbulence models are also discussed.

  2. ASC Tri-lab Co-design Level 2 Milestone Report 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hornung, Rich; Jones, Holger; Keasler, Jeff

    2015-09-23

    In 2015, the three Department of Energy (DOE) national laboratories that make up the Advanced Simulation and Computing (ASC) Program (Sandia, Lawrence Livermore, and Los Alamos) collaboratively explored performance-portability programming environments in the context of several ASC co-design proxy applications, as part of a tri-lab L2 milestone executed by the co-design teams at each laboratory. The programming environments studied included Kokkos (developed at Sandia), RAJA (LLNL), and Legion (Stanford University). The proxy apps studied included miniAero, LULESH, CoMD, Kripke, and SNAP. These programming models and proxy apps are described herein. Each lab focused on a particular combination of abstractions and proxy apps, with the goal of assessing performance portability using those. Performance portability was determined by: a) the ability to run a single application source code on multiple advanced architectures, and b) comparing runtime performance between …

  3. Weapon Physicist Declassifies Rescued Nuclear Test Films

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spriggs, Greg; Moye, Jim

    2017-03-15

    The U.S. conducted 210 atmospheric nuclear tests between 1945 and 1962, with multiple cameras capturing each event at around 2,400 frames per second. But in the decades since, around 10,000 of these films sat idle, scattered across the country in high-security vaults. Not only were they gathering dust, the film material itself was slowly decomposing, bringing the data they contained to the brink of being lost forever. For the past five years, Lawrence Livermore National Laboratory (LLNL) weapon physicist Greg Spriggs and a crack team of film experts, archivists and software developers have been on a mission to hunt down, scan, reanalyze and declassify these decomposing films. The goals are to preserve the films’ content before it’s lost forever, and provide better data to the post-testing-era scientists who use computer codes to help certify that the aging U.S. nuclear deterrent remains safe, secure and effective.

  4. Validation of the Electromagnetic Code FACETS for Numerical Simulation of Radar Target Images

    DTIC Science & Technology

    2009-12-01

    Validation of the electromagnetic code FACETS for numerical simulation of radar target images. S. Wong, DRDC Ottawa. ... for simulating radar images of a target is obtained through direct simulation-to-measurement comparisons. A 3-dimensional computer-aided design ...

  5. ARM Data-Oriented Metrics and Diagnostics Package for Climate Model Evaluation Value-Added Product

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Chengzhu; Xie, Shaocheng

    A Python-based metrics and diagnostics package is currently being developed by the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Infrastructure Team at Lawrence Livermore National Laboratory (LLNL) to facilitate the use of long-term, high-frequency measurements from the ARM Facility in evaluating the regional climate simulation of clouds, radiation, and precipitation. This metrics and diagnostics package computes climatological means of a targeted climate model simulation and generates tables and plots for comparing the model simulation with ARM observational data. The Coupled Model Intercomparison Project (CMIP) model data sets are also included in the package to enable model intercomparison, as demonstrated in Zhang et al. (2017). The mean of the CMIP models can serve as a reference for individual models. Basic performance metrics are computed to measure the accuracy of the mean state and variability of climate models. The evaluated physical quantities include cloud fraction, temperature, relative humidity, cloud liquid water path, total column water vapor, precipitation, sensible and latent heat fluxes, and radiative fluxes, with a plan to extend to more fields, such as aerosol and microphysics properties. Process-oriented diagnostics focusing on individual cloud- and precipitation-related phenomena are also being developed for the evaluation and development of specific model physical parameterizations. The version 1.0 package is designed based on data collected at ARM’s Southern Great Plains (SGP) Research Facility, with a plan to extend to other ARM sites. The metrics and diagnostics package is currently built upon standard Python libraries and additional Python packages developed by DOE (such as CDMS and CDAT). The ARM metrics and diagnostics package is available publicly with the hope that it can serve as an easy entry point for climate modelers to compare their models with ARM data. 
In this report, we first present the input data, which constitutes the core content of the metrics and diagnostics package, in section 2, followed by a user's guide documenting the workflow and structure of the version 1.0 code, including step-by-step instructions for running the package, in section 3.

  6. Public Key-Based Need-to-Know Authorization Engine Final Report CRADA No. TSB-1553-98

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mark, R.; Williams, R.

    The goals of this project were to develop a public key-based authentication service plug-in based on LLNL's requirements; to integrate the public key-based authentication with the IntraVerse authorization service and the LLNL NTK server by developing a full-featured version of the prototyped IntraVerse need-to-know plug-in; and to test the authorization and need-to-know plug-in in a secured extranet prototype among selected national laboratories.

  7. Centers for Disease Control and Prevention (CDC) Radiation Hazard Scale Data Product Review Feedback Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Askin, A.; Buddemeier, B.; Alai, M.

    In support of the Department of Energy (DOE) National Nuclear Security Administration (NNSA) and the Centers for Disease Control and Prevention (CDC), Lawrence Livermore National Laboratory (LLNL) assisted in the development of new data templates for disseminating and communicating FRMAC (Federal Radiological Monitoring and Assessment Center) data products using the CDC Radiation Hazard Scale communication tool. To ensure these data products will be useful to stakeholders during a radiological emergency, LLNL facilitated opportunities for product socialization and review.

  8. Rapidly Deployable Security System Final Report CRADA No. TC-2030-01

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kohlhepp, V.; Whiteman, B.; McKibben, M. T.

    The ultimate objective of the LEADER and LLNL strategic partnership was to develop and commercialize a security-based system product and platform for use in protecting the substantial physical and economic assets of the government and commerce of the United States. The primary goal of this project was to integrate video surveillance hardware developed by LLNL with a security software backbone developed by LEADER. Upon completion of the project, a highly scalable prototype hardware/software security system was to be demonstrated.

  9. UI Review Results and NARAC Response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisher, J.; Eme, B.; Kim, S.

    2017-03-08

    This report describes the results of an inter-program design review completed February 16th, 2017, during the second year of a FY16-FY18 NA-84 Technology Integration (TI) project to modernize the core software system used in DOE/NNSA's National Atmospheric Release Advisory Center (NARAC, narac.llnl.gov). This review focused on graphical user interface (GUI) frameworks. Reviewers (described in Appendix 2) were selected from multiple areas of the LLNL Computation directorate, based on their expertise in GUI and Web technologies.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myers, S; Larsen, S; Wagoner, J

    Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design, utilization of full three-dimensional (3D) finite difference modeling, as well as statistical characterization of geological heterogeneity. Such capabilities coupled with a rapid field analysis methodology based on matched field processing are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project, in support of LLNL's national-security mission, benefits the U.S. military and intelligence community. Fiscal year (FY) 2003 was the final year of this project. In the 2.5 years this project has been active, numerous and varied developments and milestones have been accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking and based on a field calibration to characterize geological heterogeneity was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. 
A three-seismic-array vehicle tracking testbed was installed on site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capabilities to detect and locate in-tunnel explosions for mine safety and other applications. In FY03 specifically, a large and complex simulation experiment was conducted that tested the full modeling-based approach to geological characterization using E2D, the K-L statistical methodology, and matched field processing applied to tunnel detection with surface seismic sensors. The simulation validated the full methodology and the need for geological heterogeneity to be accounted for in the overall approach. The Lake Lynn site area was geologically modeled using the code Earthvision to produce a 32 million node 3D model grid for E3D. Model linking issues were resolved and a number of full 3D model runs were accomplished using shot locations that matched the data. E3D-generated wavefield movies showed the reflection signal would be too small to be observed in the data due to trapped and attenuated energy in the weathered layer. An analysis of the few sensors coupled to bedrock did not improve the reflection signal strength sufficiently because the shots, though buried, were within the surface layer and hence attenuated. 
The ability to model a complex 3D geological structure and calculate synthetic seismograms that are in good agreement with actual data (especially for surface waves and below the complex weathered layer) was demonstrated. We conclude that E3D is a powerful tool for assessing the conditions under which a tunnel could be detected in a specific geological setting. Finally, the Lake Lynn tunnel explosion data were analyzed using standard array processing techniques. The results showed that single detonations could be detected and located, but simultaneous detonations would require strategic placement of arrays.
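The matched field processing at the core of the tracking approach compares the observed array data against modeled "replica" wavefields for a grid of candidate source locations and picks the best match. A minimal Bartlett-processor sketch (the locations and replica vectors here are hypothetical; a real implementation works per frequency bin with E3D-derived or field-calibrated replicas):

```python
def bartlett_power(data, replica):
    """Bartlett matched-field power |w^H d|^2 for one frequency bin,
    with the replica normalized to unit energy."""
    norm_sq = sum(abs(r) ** 2 for r in replica)
    inner = sum(r.conjugate() * d for r, d in zip(replica, data))
    return abs(inner) ** 2 / norm_sq

def locate(data, replicas):
    """Grid search: return the candidate source location whose modeled
    replica best matches the observed array snapshot."""
    return max(replicas, key=lambda loc: bartlett_power(data, replicas[loc]))
```

Accounting for geological heterogeneity, as the project does with the K-L methodology, matters precisely because mismatched replicas degrade this correlation.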

  11. Laser Materials Processing Final Report CRADA No. TC-1526-98

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crane, J.; Lehane, C. J.

    2017-09-08

    This CRADA project was a joint effort between Lawrence Livermore National Laboratory (LLNL) and United Technologies Corporation (UTC)/Pratt & Whitney (P&W) to demonstrate process capability for drilling holes in turbine airfoils using LLNL-developed femtosecond laser machining technology. The basis for this development was the ability of femtosecond lasers to drill precision holes in a variety of materials with little or no collateral damage. The ultimate objective was to develop a laser machine tool consisting of an extremely advanced femtosecond laser subsystem, to be developed by LLNL on a best-effort basis, and a drilling station for turbine blades and vanes, to be developed by P&W. In addition, P&W was responsible for commercializing the system. The goal of the so-called Advanced Laser Drilling (ALD) system was to drill specified complex hole shapes in turbine blades and vanes with a high degree of precision and repeatability, while also being capable of very high-speed processing.

  12. Dielectronic Satellite Spectra of Na-like Mo Ions Benchmarked by LLNL EBIT with Application to HED Plasmas

    NASA Astrophysics Data System (ADS)

    Stafford, A.; Safronova, A. S.; Kantsyrev, V. L.; Safronova, U. I.; Petkov, E. E.; Shlyaptseva, V. V.; Childers, R.; Shrestha, I.; Beiersdorfer, P.; Hell, H.; Brown, G. V.

    2017-10-01

    Dielectronic recombination (DR) is an important process for astrophysical and laboratory high energy density (HED) plasmas, and the associated satellite lines are frequently used for plasma diagnostics. In particular, K-shell DR satellite lines have been studied in detail in low-Z plasmas. The L-shell Na-like spectral features from Mo X-pinches considered here represent a blend of DR and inner-shell satellites and motivated the detailed study of DR at the EBIT-1 electron beam ion trap at LLNL. In these experiments the beam energy was swept between 0.6 and 2.4 keV to produce resonances at certain electron beam energies. The advantages of using an electron beam ion trap to better understand atomic processes with highly ionized ions in HED Mo plasma are highlighted. This work was supported by NNSA under DOE Grant DE-NA0002954. Work at LLNL was performed under the auspices of the U.S. DOE under Contract No. DE-AC52-07NA27344.

  13. ANNarchy: a code generation approach to neural simulations on parallel hardware

    PubMed Central

    Vitay, Julien; Dinkelbach, Helge Ü.; Hamker, Fred H.

    2015-01-01

    Many modern neural simulators focus on the simulation of networks of spiking neurons on parallel hardware. Another important framework in computational neuroscience, rate-coded neural networks, is mostly difficult or impossible to implement using these simulators. We present here the ANNarchy (Artificial Neural Networks architect) neural simulator, which makes it easy to define and simulate rate-coded and spiking networks, as well as combinations of both. The Python interface has been designed to be close to the PyNN interface, while the definition of neuron and synapse models can be specified using an equation-oriented mathematical description similar to that of the Brian neural simulator. This information is used to generate C++ code that efficiently performs the simulation on the chosen parallel hardware (multi-core system or graphics processing unit). Several numerical methods are available to transform ordinary differential equations into efficient C++ code. We compare the parallel performance of the simulator to existing solutions. PMID:26283957
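The equation-oriented descriptions ANNarchy compiles down to C++ are, for the rate-coded case, ordinary differential equations for firing rates. A minimal hand-written explicit-Euler step for one such unit (illustrative only; ANNarchy generates this kind of update loop automatically from the symbolic model definition):

```python
def rate_step(r, pre_rates, weights, tau=10.0, dt=1.0):
    """One explicit-Euler step of a leaky rate-coded unit obeying
        tau * dr/dt = -r + sum_j w_j * r_j,
    the kind of equation an ANNarchy neuron definition expresses symbolically."""
    drive = sum(w * x for w, x in zip(weights, pre_rates))
    return r + (dt / tau) * (drive - r)
```

Repeated steps with fixed inputs converge to the fixed point r = sum_j w_j r_j, the steady-state rate of the unit.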

  14. Error coding simulations in C

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1994-01-01

    When data is transmitted through a noisy channel, errors are produced within the data, rendering it indecipherable. Through the use of error control coding techniques, the bit error rate can be reduced to any desired level without sacrificing the transmission data rate. The Astrionics Laboratory at Marshall Space Flight Center has decided to use a modular, end-to-end telemetry data simulator to simulate the transmission of data from flight to ground and various methods of error control. The simulator includes modules for random data generation, data compression, Consultative Committee for Space Data Systems (CCSDS) transfer frame formation, error correction/detection, error generation, and error statistics. The simulator utilizes a concatenated coding scheme that includes the CCSDS-standard (255,223) Reed-Solomon (RS) code over GF(2(exp 8)) with interleave depth 5 as the outermost code, a (7, 1/2) convolutional code as an inner code, and the CCSDS-recommended (n, n-16) cyclic redundancy check (CRC) code as the innermost code, where n is the number of information bits plus 16 parity bits. The required received signal-to-noise ratio for a desired bit error rate is greatly reduced through the use of forward error correction techniques, and even greater coding gain is provided by the concatenated scheme. Interleaving/deinterleaving is necessary to randomize burst errors that may appear at the input of the RS decoder; the burst-correction capability is increased in proportion to the interleave depth. The modular nature of the simulator allows for inclusion or exclusion of modules as needed. This paper describes the development and operation of the simulator, the verification of a C-language Reed-Solomon code, and the possibility of using Comdisco SPW(tm) as a tool for determining optimal error control schemes.
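The interleaving step described above can be sketched independently of the codec details: symbols are written into a matrix row by row and read out column by column, so a channel burst is dispersed into isolated symbol errors at the Reed-Solomon decoder input. A minimal block interleaver (in Python rather than the simulator's C, for brevity):

```python
def interleave(symbols, depth=5):
    """Write `depth` rows, read columns. len(symbols) must be a
    multiple of depth (one RS codeword per row, conceptually)."""
    n = len(symbols) // depth
    rows = [symbols[i * n:(i + 1) * n] for i in range(depth)]
    return [rows[r][c] for c in range(n) for r in range(depth)]

def deinterleave(symbols, depth=5):
    """Inverse permutation: regroup the transmitted columns back into rows."""
    n = len(symbols) // depth
    cols = [symbols[c * depth:(c + 1) * depth] for c in range(n)]
    return [cols[c][r] for r in range(depth) for c in range(n)]
```

After deinterleaving, a burst of up to `depth` consecutive channel errors touches each codeword at most once, which is why burst-correction capability grows in proportion to the interleave depth.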

  15. Generalized Nuclear Data: A New Structure (with Supporting Infrastructure) for Handling Nuclear Data

    NASA Astrophysics Data System (ADS)

    Mattoon, C. M.; Beck, B. R.; Patel, N. R.; Summers, N. C.; Hedstrom, G. W.; Brown, D. A.

    2012-12-01

    The Evaluated Nuclear Data File (ENDF) format was designed in the 1960s to accommodate neutron reaction data to support nuclear engineering applications in power, national security and criticality safety. Over the years, the scope of the format has been extended to handle many other kinds of data including charged particle, decay, atomic, photo-nuclear and thermal neutron scattering. Although ENDF has wide acceptance and support for many data types, its limited support for correlated particle emission, limited numeric precision, and general lack of extensibility mean that the nuclear data community cannot take advantage of many emerging opportunities. More generally, the ENDF format provides an unfriendly environment that makes it difficult for new data evaluators and users to create and access nuclear data. The Cross Section Evaluation Working Group (CSEWG) has begun the design of a new Generalized Nuclear Data (or 'GND') structure, meant to replace older formats with a hierarchy that mirrors the underlying physics, and is aligned with modern coding and database practices. In support of this new structure, Lawrence Livermore National Laboratory (LLNL) has updated its nuclear data/reactions management package Fudge to handle GND structured nuclear data. Fudge provides tools for converting both the latest ENDF format (ENDF-6) and the LLNL Evaluated Nuclear Data Library (ENDL) format to and from GND, as well as for visualizing, modifying and processing (i.e., converting evaluated nuclear data into a form more suitable to transport codes) GND structured nuclear data. GND defines the structure needed for storing nuclear data evaluations and the type of data that needs to be stored. But unlike ENDF and ENDL, GND does not define how the data are to be stored in a file. Currently, Fudge writes the structured GND data to a file using the eXtensible Markup Language (XML), as it is ASCII based and can be viewed with any text editor. 
XML is a meta-language, meaning that it has a primitive set of definitions for representing hierarchical data/text in a file. Other meta-languages, like HDF5 which stores the data in binary form, can also be used to store GND in a file. In this paper, we will present an overview of the new GND data structures along with associated tools in Fudge.
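To illustrate the idea of a physics-mirroring hierarchy serialized as XML, here is a toy Python sketch; the element and attribute names are hypothetical stand-ins, not the actual GND specification:

```python
import xml.etree.ElementTree as ET

# Hypothetical hierarchy only: a suite of reactions for one target,
# each holding its own cross-section data, mirroring the physics.
suite = ET.Element("reactionSuite", projectile="n", target="Fe56")
reaction = ET.SubElement(suite, "reaction", label="n + Fe56 elastic")
xs = ET.SubElement(reaction, "crossSection", unit="b")
xs.text = "1.0e-5 2.9  2.0e7 3.1"  # illustrative energy/value pairs

doc = ET.tostring(suite, encoding="unicode")
print(doc)
```

Because the data are plain hierarchical elements, the same tree could equally be written to HDF5 or another meta-language, as the abstract notes.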

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harben, P E; Harris, D; Myers, S

    Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design, utilization and in full 3D finite difference modeling as well as statistical characterization of geological heterogeneity. Such capabilities coupled with a rapid field analysis methodology based on matched field processing are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project benefits the U.S. military and intelligence community in support of LLNL's national-security mission. FY03 was the final year of this project. In the 2.5 years this project has been active, numerous and varied developments and milestones have been accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking and based on a field calibration to characterize geological heterogeneity was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. 
    A 3-seismic-array vehicle tracking testbed was installed on-site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Test Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in the experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capability to detect and locate in-tunnel explosions for mine safety and other applications.

  17. Engineering Research and Development and Technology thrust area report FY92

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langland, R.T.; Minichino, C.

    1993-03-01

    The mission of the Engineering Research, Development, and Technology Program at Lawrence Livermore National Laboratory (LLNL) is to develop the technical staff and the technology needed to support current and future LLNL programs. To accomplish this mission, the Engineering Research, Development, and Technology Program has two important goals: (1) to identify key technologies and (2) to conduct high-quality work to enhance our capabilities in these key technologies. To help focus our efforts, we identify technology thrust areas and select technical leaders for each area. The thrust areas are integrated engineering activities and, rather than being based on individual disciplines, they are staffed by personnel from Electronics Engineering, Mechanical Engineering, and other LLNL organizations, as appropriate. The thrust area leaders are expected to establish strong links to LLNL program leaders and to industry; to use outside and inside experts to review the quality and direction of the work; to use university contacts to supplement and complement their efforts; and to be certain that we are not duplicating the work of others. This annual report, organized by thrust area, describes activities conducted within the Program for the fiscal year 1992. Its intent is to provide timely summaries of objectives, theories, methods, and results. The nine thrust areas for this fiscal year are: Computational Electronics and Electromagnetics; Computational Mechanics; Diagnostics and Microelectronics; Emerging Technologies; Fabrication Technology; Materials Science and Engineering; Microwave and Pulsed Power; Nondestructive Evaluation; and Remote Sensing and Imaging, and Signal Engineering.

  18. Program Code Generator for Cardiac Electrophysiology Simulation with Automatic PDE Boundary Condition Handling

    PubMed Central

    Punzalan, Florencio Rusty; Kunieda, Yoshitoshi; Amano, Akira

    2015-01-01

    Clinical and experimental studies involving human hearts can have certain limitations. Methods such as computer simulations can be an important alternative or supplemental tool. Physiological simulation at the tissue or organ level typically involves the handling of partial differential equations (PDEs). Boundary conditions and distributed parameters, such as those used in pharmacokinetics simulation, add to the complexity of the PDE solution. These factors can tailor PDE solutions and their corresponding program code to specific problems. Boundary condition and parameter changes in the customized code are usually error-prone and time-consuming. We propose a general approach for handling PDEs and boundary conditions in computational models using a replacement scheme for discretization. This study is an extension of a program generator that we introduced in a previous publication. The program generator can generate code for multi-cell simulations of cardiac electrophysiology. Improvements to the system allow it to handle simultaneous equations in the biological function model as well as implicit PDE numerical schemes. The replacement scheme involves substituting all partial differential terms with numerical solution equations. Once the model and boundary equations are discretized with the numerical solution scheme, instances of the equations are generated to undergo dependency analysis. The result of the dependency analysis is then used to generate the program code. The resulting program code is in the Java or C programming language. To validate the automatic handling of boundary conditions in the program code generator, we generated simulation code using the FHN, Luo-Rudy 1, and Hund-Rudy cell models and ran cell-to-cell coupling and action potential propagation simulations. One of the simulations is based on a published experiment and the simulation results are compared with the experimental data. 
We conclude that the proposed program code generator can be used to generate code for physiological simulations and provides a tool for studying cardiac electrophysiology. PMID:26356082
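The replacement idea can be illustrated on a 1-D FitzHugh-Nagumo (FHN) cable: partial-derivative terms are substituted by finite-difference expressions, and a no-flux boundary condition by mirrored ghost values. The scheme and parameters below are an illustrative hand-written sketch, not the generator's output:

```python
def step(v, w, dt=0.05, dx=1.0, D=1.0, a=0.7, b=0.8, eps=0.08, I=0.5):
    # The Laplacian term is replaced by its finite-difference stencil;
    # the no-flux (Neumann) boundary is replaced by a mirrored ghost value.
    n = len(v)
    v_new = v[:]
    for i in range(n):
        left = v[i - 1] if i > 0 else v[1]          # mirror at left edge
        right = v[i + 1] if i < n - 1 else v[n - 2]  # mirror at right edge
        lap = (left - 2.0 * v[i] + right) / dx**2
        v_new[i] = v[i] + dt * (v[i] - v[i]**3 / 3.0 - w[i] + I + D * lap)
    w_new = [w[i] + dt * eps * (v[i] + a - b * w[i]) for i in range(n)]
    return v_new, w_new

v, w = [0.0] * 50, [0.0] * 50
for _ in range(200):
    v, w = step(v, w)
# A spatially uniform state stays uniform under no-flux boundaries.
print(round(v[0], 3), round(v[-1], 3))
```

A code generator automates exactly this substitution step, so changing the boundary condition changes only the replacement rule rather than hand-edited loops.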

  19. Tristan code and its application

    NASA Astrophysics Data System (ADS)

    Nishikawa, K.-I.

    Since TRISTAN: The 3-D Electromagnetic Particle Code was introduced in 1990, it has been used for many applications, including simulations of global solar wind-magnetosphere interaction. The most essential ingredients of this code have been published in the ISSS-4 book. In this abstract we describe some of the issues and an application of this code for the study of global solar wind-magnetosphere interaction, including a substorm study. The basic code (tristan.f) for the global simulation and a local simulation of reconnection with a Harris model (issrec2.f) are available at http:/www.physics.rutger.edu/˜kenichi. For beginners, the code (isssrc2.f) with simpler boundary conditions is a suitable starting point for running simulations. The future of global particle simulations for a global geospace general circulation (GGCM) model with predictive capability (for the Space Weather Program) is discussed.
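Electromagnetic particle codes of the TRISTAN family typically advance particle velocities with the Boris scheme; this generic Python sketch (not TRISTAN's actual Fortran) shows the half-kick/rotate/half-kick structure and its exact conservation of speed in a pure magnetic field:

```python
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]]

def boris_push(v, E, B, qm=1.0, dt=0.01):
    # Half electric kick, magnetic rotation, half electric kick.
    vm = [v[i] + 0.5 * qm * dt * E[i] for i in range(3)]
    t = [0.5 * qm * dt * B[i] for i in range(3)]          # rotation vector
    s = [2.0 * t[i] / (1.0 + sum(c * c for c in t)) for i in range(3)]
    vprime = [vm[i] + c for i, c in enumerate(cross(vm, t))]
    vplus = [vm[i] + c for i, c in enumerate(cross(vprime, s))]
    return [vplus[i] + 0.5 * qm * dt * E[i] for i in range(3)]

v = [1.0, 0.0, 0.0]
for _ in range(1000):  # gyration in a uniform B field, no E field
    v = boris_push(v, E=[0.0, 0.0, 0.0], B=[0.0, 0.0, 1.0])
# The rotation step conserves |v| exactly, up to roundoff.
print(round(sum(c * c for c in v), 9))
```

This speed-preserving rotation is one reason the Boris push remains the workhorse of explicit particle-in-cell codes.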

  20. Simulations and Experiments of Dynamic Granular Compaction in Non-ideal Geometries

    NASA Astrophysics Data System (ADS)

    Homel, Michael; Herbold, Eric; Lind, John; Crum, Ryan; Hurley, Ryan; Akin, Minta; Pagan, Darren; LLNL Team

    2017-06-01

    Accurately describing the dynamic compaction of granular materials is a persistent challenge in computational mechanics. Using a synchrotron x-ray source we have obtained detailed imaging of the evolving compaction front in synthetic olivine powder impacted at 300-600 m/s. To facilitate imaging, a non-traditional sample geometry is used, producing multiple load paths within the sample. We demonstrate that (i) commonly used models for porous compaction may produce inaccurate results for complex loading, even if the 1-D, uniaxial-strain compaction response is reasonable, and (ii) the experimental results can be used along with simulations to determine parameters for sophisticated constitutive models that more accurately describe the strength, softening, bulking, and poroelastic response. Effects of experimental geometry and alternative configurations are discussed. Our understanding of the material response is further enhanced using mesoscale simulations that allow us to relate the mechanisms of grain fracture, contact, and comminution to the macroscale continuum response. Numerical considerations in both continuum and mesoscale simulations are described. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LDRD#16-ERD-010. LLNL-ABS-725113.
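A common baseline among the "commonly used models for porous compaction" mentioned above is a P-alpha crush curve. This sketch uses Herrmann's quadratic form with illustrative parameters, not the paper's calibrated constitutive model:

```python
def distension(P, alpha0=1.3, Pe=0.1e9, Ps=1.0e9):
    # P-alpha closure: distension alpha (porous/solid volume ratio)
    # follows a quadratic crush curve between the elastic limit Pe
    # and the full-compaction pressure Ps. Values are illustrative.
    if P <= Pe:
        return alpha0                      # no crush below the elastic limit
    if P >= Ps:
        return 1.0                         # fully compacted
    return 1.0 + (alpha0 - 1.0) * ((Ps - P) / (Ps - Pe)) ** 2

print(distension(0.0), distension(2.0e9))  # uncompacted vs. fully crushed
```

Because alpha here depends on pressure alone, the model cannot distinguish the multiple load paths the non-traditional geometry produces, which is the inaccuracy point (i) of the abstract highlights.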

  1. Predicting Atmospheric Releases from the September 3, 2017 North Korean Event

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Simpson, M. D.; Glascoe, L. G.

    2017-12-01

    Underground nuclear explosions produce radionuclides that can be vented to the atmosphere and transported to International Monitoring System (IMS) measurement stations. Although a positive atmospheric detection from North Korea's declared test on September 3, 2017 has not been reported at any IMS station through early October, atmospheric transport models can predict when and where detections may arise and provide valuable information to optimize air collection strategies. We present predictive atmospheric transport simulations initiated in the early days after the event. Wind fields were simulated with the Weather Research and Forecast model and used to transport air tracers from an ensemble of releases in the FLEXPART dispersion model. If early venting had occurred, the simulations suggested that detections were possible at the IMS station in Takasaki, Japan. On-going and future research efforts associated with nuclear testing are focused on quantifying meteorological uncertainty, simulating releases in complex terrain, and developing new statistical methods for source attribution. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and is released as LLNL-ABS-740341.
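In spirit only, Lagrangian dispersion models such as FLEXPART advect a particle ensemble with the resolved wind and add stochastic turbulent displacements; this toy 1-D sketch uses invented numbers, not the WRF/FLEXPART configuration of the study:

```python
import random

def disperse(n=2000, steps=100, u=5.0, dt=60.0, sigma=50.0, seed=1):
    # Each tracer particle drifts with the mean wind u and takes a
    # Gaussian random-walk step per time step (toy turbulence closure).
    rng = random.Random(seed)
    xs = []
    for _ in range(n):
        x = 0.0
        for _ in range(steps):
            x += u * dt + rng.gauss(0.0, sigma)
        xs.append(x)
    return sum(xs) / n

# The ensemble mean tracks the advective distance u * dt * steps.
print(round(disperse()))
```

An operational run replaces the constant wind with gridded WRF fields and the toy random walk with a turbulence parameterization, but the ensemble-of-particles structure is the same.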

  2. Cooperation on Improved Isotopic Identification and Analysis Software for Portable, Electrically Cooled High-Resolution Gamma Spectrometry Systems Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dreyer, Jonathan G.; Wang, Tzu-Fang; Vo, Duc T.

    Under a 2006 agreement between the Department of Energy (DOE) of the United States of America and the Institut de Radioprotection et de Sûreté Nucléaire (IRSN) of France, the National Nuclear Security Administration (NNSA) within DOE and IRSN initiated a collaboration to improve isotopic identification and analysis of nuclear material [i.e., plutonium (Pu) and uranium (U)]. The specific aim of the collaborative project was to develop new versions of two types of isotopic identification and analysis software: (1) the fixed-energy response-function analysis for multiple energies (FRAM) codes and (2) multi-group analysis (MGA) codes. The project is entitled Action Sheet 4 – Cooperation on Improved Isotopic Identification and Analysis Software for Portable, Electrically Cooled, High-Resolution Gamma Spectrometry Systems (Action Sheet 4). FRAM and MGA/U235HI are software codes used to analyze isotopic ratios of U and Pu. FRAM is an application that uses parameter sets for the analysis of U or Pu. MGA and U235HI are two separate applications that analyze Pu or U, respectively. They have traditionally been used by safeguards practitioners to analyze gamma spectra acquired with high-resolution gamma spectrometry (HRGS) systems that are cooled by liquid nitrogen. However, it was discovered that these analysis programs were not as accurate when used on spectra acquired with a newer generation of more portable, electrically cooled HRGS (ECHRGS) systems. In response to this need, DOE/NNSA and IRSN collaborated to update the FRAM and U235HI codes to improve their performance with newer ECHRGS systems. Lawrence Livermore National Laboratory (LLNL) and Los Alamos National Laboratory (LANL) performed this work for DOE/NNSA.

  3. Final Report for "Implementation and Evaluation of Multigrid Linear Solvers into Extended Magnetohydrodynamic Codes for Petascale Computing"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Srinath Vadlamani; Scott Kruger; Travis Austin

    Extended magnetohydrodynamic (MHD) codes are used to model the large, slow-growing instabilities that are projected to limit the performance of the International Thermonuclear Experimental Reactor (ITER). The multiscale nature of the extended MHD equations requires an implicit approach. The current linear solvers needed for the implicit algorithm scale poorly because the resultant matrices are so ill-conditioned. A new solver is needed, especially one that scales to the petascale. The most successful scalable parallel processor solvers to date are multigrid solvers. Applying multigrid techniques to a set of equations whose fundamental modes are dispersive waves is a promising solution to CEMM problems. For Phase 1, we implemented multigrid preconditioners from the HYPRE project of the Center for Applied Scientific Computing at LLNL, via PETSc of the DOE SciDAC TOPS project, for the real matrix systems of the extended MHD code NIMROD, which is one of the primary modeling codes of the OFES-funded Center for Extended Magnetohydrodynamic Modeling (CEMM) SciDAC. We successfully implemented the multigrid solvers on the fusion test problem that allows for real matrix systems, and in the process learned about the details of NIMROD data structures and the difficulties of inverting NIMROD operators. The further success of this project will allow for efficient usage of future petascale computers at the National Leadership Facilities: Oak Ridge National Laboratory, Argonne National Laboratory, and the National Energy Research Scientific Computing Center. The project will be a collaborative effort between computational plasma physicists and applied mathematicians at Tech-X Corporation, applied mathematicians at Front Range Scientific Computations, Inc. (who are collaborators on the HYPRE project), and other computational plasma physicists involved with the CEMM project.
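The multigrid idea the abstract applies through HYPRE/PETSc can be shown on the smallest possible example, a geometric two-grid cycle for 1-D Poisson; this is an illustrative sketch, unrelated to NIMROD's actual ill-conditioned matrices:

```python
def relax(u, f, h, sweeps):
    # Gauss-Seidel smoothing for the discrete 1-D problem -u'' = f.
    for _ in range(sweeps):
        for i in range(1, len(u) - 1):
            u[i] = 0.5 * (u[i-1] + u[i+1] + h * h * f[i])
    return u

def two_grid_cycle(u, f, h):
    nc = (len(u) + 1) // 2                 # coarse grid size
    u = relax(u, f, h, 3)                  # pre-smoothing
    r = [0.0] * len(u)                     # fine-grid residual r = f + u''
    for i in range(1, len(u) - 1):
        r[i] = f[i] + (u[i-1] - 2.0 * u[i] + u[i+1]) / (h * h)
    rc = [0.0] * nc                        # full-weighting restriction
    for j in range(1, nc - 1):
        rc[j] = 0.25 * r[2*j-1] + 0.5 * r[2*j] + 0.25 * r[2*j+1]
    ec = relax([0.0] * nc, rc, 2.0 * h, 200)  # (nearly) solve the error eq.
    for i in range(1, len(u) - 1):         # linear prolongation + correction
        u[i] += ec[i//2] if i % 2 == 0 else 0.5 * (ec[i//2] + ec[i//2 + 1])
    return relax(u, f, h, 3)               # post-smoothing

n, h = 65, 1.0 / 64
u, f = [0.0] * n, [1.0] * n
for _ in range(30):
    u = two_grid_cycle(u, f, h)
# -u'' = 1 with u(0) = u(1) = 0 has exact solution x(1-x)/2, so u(1/2) = 0.125.
print(round(u[n // 2], 4))
```

Production libraries recurse this two-grid pattern to many levels and build the coarse operators algebraically, which is what makes them scale to the petascale systems the report targets.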

  4. Performance Analysis: Work Control Events Identified January - August 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Grange, C E; Freeman, J W; Kerr, C E

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events that have been reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences have been reported as "management concerns" or "near misses." This rate indicates that LLNL is now reporting fewer "management concern" and "near miss" occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective to improve worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded elements of an institutional work planning and control system. 
By the end of that year this system was documented and implementation had begun. In 2009, training of the workforce began and as of the time of this report more than 50% of authorized Integration Work Sheets (IWS) use the activity-based planning process. In 2010, LSO independently reviewed the work planning and control process and confirmed to the Laboratory that the Integrated Safety Management (ISM) System was implemented. LLNL conducted a cross-directorate management self-assessment of work planning and control and is developing actions to respond to the issues identified. Ongoing efforts to strengthen the work planning and control process and to improve the quality of LLNL work packages are in progress: completion of remaining actions in response to the 2009 DOE Office of Health, Safety, and Security (HSS) evaluation of LLNL's ISM System; scheduling more than 14 work planning and control self-assessments in FY11; continuing to align subcontractor work control with the Institutional work planning and control system; and continuing to maintain the electronic IWS application. The 24 events included in this analysis were caused by errors in the first four of the five ISMS functions. The most frequent cause was errors in analyzing the hazards (Function 2). The second most frequent cause was errors occurring when defining the work (Function 1), followed by errors during the performance of work (Function 4). Interestingly, very few errors in developing controls (Function 3) resulted in events. This leads one to conclude that if improvements are made to defining the scope of work and analyzing the potential hazards, LLNL may reduce the frequency or severity of events. Analysis of the 24 events resulted in the identification of ten common causes. Some events had multiple causes, resulting in the mention of 39 causes being identified for the 24 events. 
    The most frequent cause was workers, supervisors, or experts believing they understood the work and the hazards but their understanding was incomplete. The second most frequent cause was unclear, incomplete or confusing documents directing the work. Together, these two causes were mentioned 17 times and contributed to 13 of the events. All of the events with the cause of "workers, supervisors, or experts believing they understood the work and the hazards but their understanding was incomplete" had this error in the first two ISMS functions: define the work and analyze the hazard. This means that these causes result in the scope of work being ill-defined or the hazard(s) improperly analyzed. Incomplete implementation of these functional steps leads to the hazards not being controlled. The causes are then manifested in events when the work is conducted. The process to operate safely relies on accurately defining the scope of work. This review has identified a number of examples of latent organizational weakness in the execution of work control processes.

  5. Java Performance for Scientific Applications on LLNL Computer Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kapfer, C; Wissink, A

    2002-05-10

    Languages in use for high performance computing at the laboratory--Fortran (f77 and f90), C, and C++--have many years of development behind them and are generally considered the fastest available. However, Fortran and C do not readily extend to object-oriented programming models, limiting their capability for very complex simulation software. C++ facilitates object-oriented programming but is a very complex and error-prone language. Java offers a number of capabilities that these other languages do not. For instance it implements cleaner (i.e., easier to use and less prone to errors) object-oriented models than C++. It also offers networking and security as part of the language standard, and cross-platform executables that make it architecture neutral, to name a few. These features have made Java very popular for industrial computing applications. The aim of this paper is to explain the trade-offs in using Java for large-scale scientific applications at LLNL. Despite its advantages, the computational science community has been reluctant to write large-scale computationally intensive applications in Java due to concerns over its poor performance. However, considerable progress has been made over the last several years. The Java Grande Forum [1] has been promoting the use of Java for large-scale computing. Members have introduced efficient array libraries, developed fast just-in-time (JIT) compilers, and built links to existing packages used in high performance parallel computing.

  6. A QM/MM Metadynamics Study of the Direct Decarboxylation Mechanism for Orotidine-5'-monophosphate Decarboxylase using Two Different QM Regions: Acceleration too Small to Explain Rate of Enzyme Catalysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stanton, Courtney; Kuo, I-F W.; Mundy, Christopher J.

    2007-11-01

    Despite decades of study, the mechanism of orotidine-5'-monophosphate decarboxylase (ODCase) remains unresolved. A computational investigation of the direct decarboxylation mechanism has been performed using mixed quantum mechanical/molecular mechanical (QM/MM) dynamics simulations. The study was performed with the program CP2K that integrates classical dynamics and ab initio dynamics based on the Born-Oppenheimer approach. Two different QM regions were explored. It was found that the size of the QM region has a dramatic effect on the calculated reaction barrier. The free energy barriers for decarboxylation of orotidine-5'-monophosphate (OMP) in solution and in the enzyme were determined with the metadynamics method to be 40 kcal/mol and 33 kcal/mol, respectively. The calculated change in activation free energy (ΔΔG‡) on going from solution to the enzyme is therefore -7 kcal/mol, far less than the experimental change of -23 kcal/mol (for kcat/kuncat; Radzicka, A.; Wolfenden, R., Science 1995, 267, 90-92). These results do not support the direct decarboxylation mechanism in the enzyme. Funding was provided by the University of California Lawrence Livermore National Laboratory (LLNL) and the National Institutes of Health (NIH). Part of this work was performed under the auspices of the U.S. Department of Energy by LLNL under contract No. W-7405-Eng-48. Computer resources were provided by Livermore Computing.
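The gap quoted above follows from transition-state-theory arithmetic: a barrier reduction ΔΔG corresponds to a rate factor exp(ΔΔG/RT), so 23 kcal/mol matches ODCase's famous ~10^17 rate enhancement while 7 kcal/mol yields only ~10^5:

```python
import math

def rate_enhancement(ddg_kcal, T=298.15):
    # Transition-state theory: k_cat / k_uncat = exp(ddG / RT),
    # with the barrier reduction ddG in kcal/mol.
    R = 1.987e-3  # gas constant, kcal/(mol K)
    return math.exp(ddg_kcal / (R * T))

# Experimental 23 kcal/mol vs. the computed 7 kcal/mol from the abstract.
print(f"{rate_enhancement(23):.1e}", f"{rate_enhancement(7):.1e}")
```

The twelve orders of magnitude between the two factors is why the authors conclude the direct decarboxylation mechanism cannot account for the observed catalysis.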

  7. Critical Issues on Materials for Gen-IV Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caro, M; Marian, J; Martinez, E

    2009-02-27

    Within the LDRD on 'Critical Issues on Materials for Gen-IV Reactors' basic thermodynamics of the Fe-Cr alloy and accurate atomistic modeling were used to help develop the capability to predict hardening, swelling and embrittlement using the paradigm of Multiscale Materials Modeling. Approaches at atomistic and mesoscale levels were linked to build up the first steps in an integrated modeling platform that seeks, in a near-term effort, to relate dislocation dynamics to polycrystal plasticity. The requirements originated in the reactor systems under consideration today for future sources of nuclear energy. These requirements are beyond the present day performance of nuclear materials and call for the development of new, high temperature, radiation resistant materials. Fe-Cr alloys with 9-12% Cr content are the base matrix of advanced ferritic/martensitic (FM) steels envisaged as fuel cladding and structural components of Gen-IV reactors. Predictive tools are needed to calculate structural and mechanical properties of these steels. This project represents a contribution in that direction. The synergy between the continuous progress of parallel computing and the spectacular advances in the theoretical framework that describes materials has led to a significant advance in our comprehension of materials properties and their mechanical behavior. We took this progress to our advantage and within this LDRD were able to provide a detailed physical understanding of iron-chromium alloys' microstructural behavior. By combining ab-initio simulations, many-body interatomic potential development, and mesoscale dislocation dynamics we were able to describe their microstructure evolution. For the first time in the case of Fe-Cr alloys, atomistic and mesoscale were merged and the first steps taken towards incorporating ordering and precipitation effects into dislocation dynamics (DD) simulations. 
Molecular dynamics (MD) studies of the transport of self-interstitial, vacancy and point defect clusters in concentrated Fe-Cr alloys were performed for future diffusion data calculations. A recently developed parallel MC code with displacement allowed us to predict the evolution of the defect microstructures, local chemistry changes, grain boundary segregation and precipitation resulting from radiation enhanced diffusion. We showed that grain boundaries, dislocations and free surfaces are not preferential sites for alpha-prime precipitation, and explained experimental observations of short-range order (SRO) in Fe-rich FeCr alloys. Our atomistic studies of dislocation hardening allowed us to obtain dislocation mobility functions for BCC pure iron and Fe-Cr and determine for FCC metals the dislocation interaction with precipitates, with a description to be used in dislocation dynamics (DD) codes. A synchronous parallel kinetic Monte Carlo code was developed and tested, which promises to expand the range of applicability of kMC simulations. This LDRD furthered the limits of the available science on the thermodynamic and mechanical behavior of metallic alloys and extended the application of physically-based multiscale materials modeling to cases of severe temperature and neutron fluence conditions in advanced future nuclear reactors. The report is organized as follows: after a brief introduction, we present the research activities and results obtained. We give recommendations on future LLNL activities that may contribute to the progress in this area, together with examples of possible research lines to be supported.
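A synchronous-parallel kMC code builds on the basic serial residence-time (BKL/Gillespie) step, which can be sketched as follows; this is the generic textbook algorithm, not the LLNL implementation:

```python
import math
import random

def kmc_trajectory(rates, steps=10000, seed=2):
    # Residence-time kMC: pick an event with probability proportional
    # to its rate, then advance the clock by an exponentially
    # distributed waiting time with mean 1 / (total rate).
    rng = random.Random(seed)
    t, counts = 0.0, [0] * len(rates)
    total = sum(rates)
    for _ in range(steps):
        r = rng.uniform(0.0, total)
        k = 0
        while r > rates[k]:
            r -= rates[k]
            k += 1
        counts[k] += 1
        t += -math.log(rng.random()) / total
    return t, counts

# Two competing events: the faster one should fire about 3x as often.
t, counts = kmc_trajectory([1.0, 3.0])
print(round(t), counts)
```

The parallelization challenge the abstract alludes to is that this clock update is inherently sequential, which is what synchronous schemes relax.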

  8. Auto Code Generation for Simulink-Based Attitude Determination Control System

    NASA Technical Reports Server (NTRS)

    MolinaFraticelli, Jose Carlos

    2012-01-01

    This paper details the work done to auto generate C code from a Simulink-Based Attitude Determination Control System (ADCS) to be used in target platforms. NASA Marshall Engineers have developed an ADCS Simulink simulation to be used as a component for the flight software of a satellite. This generated code can be used to carry out hardware-in-the-loop testing of satellite components in a convenient manner with easily tunable parameters. Due to the nature of embedded hardware components such as microcontrollers, this simulation code cannot be used directly, as is, on the target platform and must first be converted into C code; this process is known as auto code generation. In order to generate C code from this simulation, it must be modified to follow specific standards set in place by the auto code generation process. Some of these modifications include changing certain simulation models into their atomic representations, which can bring new complications into the simulation. The execution order of these models can change based on these modifications. Great care must be taken in order to maintain a working simulation that can also be used for auto code generation. After modifying the ADCS simulation for the auto code generation process, it is shown that the difference between the output data of the former and that of the latter is within acceptable bounds. Thus, it can be said that the process is a success since all the output requirements are met. Based on these results, it can be argued that this generated C code can be effectively used by any desired platform as long as it follows the specific memory requirements established in the Simulink Model.

  9. Assessment and Application of the ROSE Code for Reactor Outage Thermal-Hydraulic and Safety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Liang, Thomas K.S.; Ko, F.-K.; Dai, L.-C.

The currently available tools, such as RELAP5, RETRAN, and others, cannot easily and correctly perform the task of analyzing the system behavior during plant outages. Therefore, a medium-sized program aiming at reactor outage simulation and evaluation, such as midloop operation (MLO) with loss of residual heat removal (RHR), has been developed. Important thermal-hydraulic processes involved during MLO with loss of RHR can be properly simulated by the newly developed reactor outage simulation and evaluation (ROSE) code. The two-region approach with a modified two-fluid model has been adopted to be the theoretical basis of the ROSE code. To verify the analytical model in the first step, posttest calculations against the integral midloop experiments with loss of RHR have been performed. The excellent simulation capacity of the ROSE code against the Institute of Nuclear Energy Research Integral System Test Facility test data is demonstrated. To further mature the ROSE code in simulating a full-sized pressurized water reactor, assessment against the WGOTHIC code and the Maanshan momentary-loss-of-RHR event has been undertaken. The successfully assessed ROSE code is then applied to evaluate the abnormal operation procedure (AOP) with loss of RHR during MLO (AOP 537.4) for the Maanshan plant. The ROSE code also has been successfully transplanted into the Maanshan training simulator to support operator training. How the simulator was upgraded by the ROSE code for MLO will be presented in the future.

  10. Comparison of DAC and MONACO DSMC Codes with Flat Plate Simulation

    NASA Technical Reports Server (NTRS)

    Padilla, Jose F.

    2010-01-01

    Various implementations of the direct simulation Monte Carlo (DSMC) method exist in academia, government and industry. By comparing implementations, deficiencies and merits of each can be discovered. This document reports comparisons between DSMC Analysis Code (DAC) and MONACO. DAC is NASA's standard DSMC production code and MONACO is a research DSMC code developed in academia. These codes have various differences; in particular, they employ distinct computational grid definitions. In this study, DAC and MONACO are compared by having each simulate a blunted flat plate wind tunnel test, using an identical volume mesh. Simulation expense and DSMC metrics are compared. In addition, flow results are compared with available laboratory data. Overall, this study revealed that both codes, excluding grid adaptation, performed similarly. For parallel processing, DAC was generally more efficient. As expected, code accuracy was mainly dependent on physical models employed.

  11. Sensor Fusion for Nuclear Proliferation Activity Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adel Ghanem, Ph D

    2007-03-30

    The objective of Phase 1 of this STTR project is to demonstrate a Proof-of-Concept (PoC) of the Geo-Rad system that integrates a location-aware SmartTag (made by ZonTrak) and a radiation detector (developed by LLNL). It also includes the ability to transmit the collected radiation data and location information to the ZonTrak server (ZonService). The collected data is further transmitted to a central server at LLNL (the Fusion Server) to be processed in conjunction with overhead imagery to generate location estimates of nuclear proliferation and radiation sources.

  12. Transient plasma estimation: a noise cancelling/identification approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Candy, J.V.; Casper, T.; Kane, R.

    1985-03-01

The application of a noise-cancelling technique to extract energy storage information from sensor signals acquired during fusion reactor experiments on the Tandem Mirror Experiment-Upgrade (TMX-U) at the Lawrence Livermore National Laboratory (LLNL) is examined. We show how this technique can be used to decrease the uncertainty in the corresponding sensor measurements used for diagnostics in both real-time and post-experimental environments. We analyze the performance of the algorithm on the sensor data and discuss the various tradeoffs. The algorithm suggested is designed using SIG, an interactive signal processing package developed at LLNL.
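The noise-cancelling idea can be illustrated with a least-mean-squares (LMS) adaptive filter: a filter on a noise-only reference channel is adapted so its output tracks the noise in the primary channel, and the error signal becomes the cleaned estimate. This is a generic sketch with invented signal and noise data, not the algorithm actually deployed on TMX-U.

```python
import math
import random

def lms_cancel(primary, reference, n_taps=8, mu=0.02):
    """LMS adaptive noise cancelling: adapt filter w on the reference
    channel so that w*reference predicts the noise in the primary channel;
    the error e = primary - prediction is the cleaned signal estimate."""
    w = [0.0] * n_taps
    buf = [0.0] * n_taps
    cleaned = []
    for d, x in zip(primary, reference):
        buf = [x] + buf[:-1]                       # newest sample first
        y = sum(wi * xi for wi, xi in zip(w, buf))  # predicted noise
        e = d - y                                   # cleaned estimate
        w = [wi + mu * e * xi for wi, xi in zip(w, buf)]
        cleaned.append(e)
    return cleaned

# Illustrative data: a slow "diagnostic" sinusoid buried in correlated noise.
rng = random.Random(0)
n = 4000
noise = [rng.gauss(0.0, 1.0) for _ in range(n)]
signal = [math.sin(2 * math.pi * 0.002 * k) for k in range(n)]
primary = [s + 0.9 * nz for s, nz in zip(signal, noise)]
cleaned = lms_cancel(primary, noise)
# After adaptation, the residual error against the true signal is far below
# the original noise power (0.81).
tail_err = sum((c - s) ** 2
               for c, s in zip(cleaned[2000:], signal[2000:])) / 2000
```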

  13. 2007 SB14 Source Reduction Plan/Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, L

    2007-07-24

Aqueous solutions (mixed waste) generated from various LLNL operations, such as debris washing, sample preparation and analysis, and equipment maintenance and cleanout, were combined for storage in the B695 tank farm. Prior to combination the individual waste streams had different codes depending on the particular generating process and waste characteristics. The largest streams were CWC 132, 791, 134, and 792; several smaller waste streams were also included. This combined waste stream was treated at LLNL's waste treatment facility using a vacuum filtration and cool vapor evaporation process in preparation for discharge to the sanitary sewer. Prior to discharge, the treated waste stream was sampled and the results were reviewed by LLNL's water monitoring specialists. The treated solution was discharged following confirmation that it met the discharge criteria. A major source, accounting for 50% of this waste stream, is metal machining, cutting and grinding operations in the engineering machine shops in B321/B131. An additional 7% was from similar operations in B131 and B132S. This waste stream primarily contains metal cuttings from machined parts, machining coolant and water, with small amounts of tramp oil from the machining and grinding equipment. Several waste reduction measures for the B321 machine shop have been taken, including the use of a small point-of-use filtering/tramp-oil coalescing/UV-sterilization coolant recycling unit and improved management techniques (testing and replenishing) for coolants. The recycling unit had some operational problems during 2006; the machine shop is planning to have it repaired in the near future. Quarterly waste generation data prepared by the Environmental Protection Department's P2 Team are regularly provided to engineering shops as well as other facilities so that generators can track the effectiveness of their waste minimization efforts.

  14. Efficient Coupling of Fluid-Plasma and Monte-Carlo-Neutrals Models for Edge Plasma Transport

    NASA Astrophysics Data System (ADS)

    Dimits, A. M.; Cohen, B. I.; Friedman, A.; Joseph, I.; Lodestro, L. L.; Rensink, M. E.; Rognlien, T. D.; Sjogreen, B.; Stotler, D. P.; Umansky, M. V.

    2017-10-01

UEDGE has been valuable for modeling transport in the tokamak edge and scrape-off layer due in part to its efficient fully implicit solution of coupled fluid neutrals and plasma models. We are developing an implicit coupling of the kinetic Monte-Carlo (MC) code DEGAS-2, as the neutrals model component, to the UEDGE plasma component, based on an extension of the Jacobian-free Newton-Krylov (JFNK) method to MC residuals. The coupling components build on the methods and coding already present in UEDGE. For the linear Krylov iterations, a procedure has been developed to "extract" a good preconditioner from that of UEDGE. This preconditioner may also be used to greatly accelerate the convergence rate of a relaxed fixed-point iteration, which may provide a useful "intermediate" algorithm. The JFNK method also requires calculation of Jacobian-vector products, for which any finite-difference procedure is inaccurate when a MC component is present. A semi-analytical procedure that retains the standard MC accuracy and fully kinetic neutrals physics is therefore being developed. Prepared for US DOE by LLNL under Contract DE-AC52-07NA27344 and LDRD project 15-ERD-059, by PPPL under Contract DE-AC02-09CH11466, and supported in part by the U.S. DOE, OFES.
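The Jacobian-free idea is that Newton's method never needs the Jacobian matrix itself, only its action on vectors, which a finite difference of the residual supplies (and which, as the abstract notes, becomes inaccurate when the residual is noisy Monte Carlo output). A minimal sketch on an invented two-variable residual standing in for the coupled plasma-neutrals system; a production JFNK solver would hand these products to GMRES rather than reassembling the matrix as done here for brevity.

```python
def F(u):
    # Toy coupled residual standing in for plasma+neutrals; root at (2, 1).
    return [u[0] + u[1] - 3.0, u[0] * u[1] - 2.0]

def jv(u, v, eps=1e-7):
    # Jacobian-vector product by finite differences: the only access to the
    # Jacobian that a Jacobian-free Newton-Krylov method requires.
    fp = F([u[0] + eps * v[0], u[1] + eps * v[1]])
    f0 = F(u)
    return [(a - b) / eps for a, b in zip(fp, f0)]

u = [2.5, 0.5]
for _ in range(30):
    r = F(u)
    if max(abs(x) for x in r) < 1e-10:
        break
    # Recover the 2x2 Jacobian action using only J*v products, then solve
    # J du = -r by Cramer's rule (GMRES would replace this step at scale).
    c0 = jv(u, [1.0, 0.0])
    c1 = jv(u, [0.0, 1.0])
    det = c0[0] * c1[1] - c1[0] * c0[1]
    du0 = (-r[0] * c1[1] + r[1] * c1[0]) / det
    du1 = (-r[1] * c0[0] + r[0] * c0[1]) / det
    u = [u[0] + du0, u[1] + du1]
```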

  15. Detection and Attribution of Regional Climate Change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bala, G; Mirin, A

    2007-01-19

We developed a high-resolution global coupled modeling capability to perform breakthrough studies of regional climate change. The atmospheric component in our simulation uses a 1° latitude × 1.25° longitude grid, the finest resolution ever used for the NCAR coupled climate model CCSM3. Substantial testing and slight retuning were required to obtain an acceptable control simulation. The major accomplishment is the validation of this new high-resolution configuration of CCSM3. There are major improvements in our simulation of the surface wind stress and sea ice thickness distribution in the Arctic. Surface wind stress and ocean circulation in the Antarctic Circumpolar Current are also improved. Our results demonstrate that the FV version of the CCSM coupled model is a state-of-the-art climate model whose simulation capabilities are in the class of those used for IPCC assessments. We have also provided 1000 years of model data to the Scripps Institution of Oceanography to estimate the natural variability of stream flow in California. In the future, our global model simulations will provide boundary data to a high-resolution mesoscale model that will be used at LLNL. The mesoscale model will dynamically downscale the GCM climate to regional scale on climate time scales.

  16. The Particle Accelerator Simulation Code PyORBIT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorlov, Timofey V; Holmes, Jeffrey A; Cousineau, Sarah M

    2015-01-01

The particle accelerator simulation code PyORBIT is presented. The structure, implementation, history, parallel and simulation capabilities, and future development of the code are discussed. The PyORBIT code is a new implementation and extension of algorithms of the original ORBIT code that was developed for the Spallation Neutron Source accelerator at Oak Ridge National Laboratory. The PyORBIT code has a two-level structure. The upper level uses the Python programming language to control the flow of intensive calculations performed by the lower-level code implemented in C++. The parallel capabilities are based on MPI communications. PyORBIT is an open-source code accessible to the public through the Google Open Source Projects Hosting service.
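The two-level structure can be sketched as follows; the class and node names are illustrative stand-ins (not PyORBIT's actual API), with plain Python playing the role that compiled C++ plays in the real code.

```python
class DriftNodeStandIn:
    """Stand-in for a compiled tracking node: in PyORBIT proper the heavy
    per-particle arithmetic lives in C++ and is exposed to Python; here
    plain Python plays that role for illustration."""
    def __init__(self, length):
        self.length = length

    def track(self, bunch):
        # x' stays fixed in a drift; x advances by length * x'.
        for p in bunch:
            p["x"] += self.length * p["xp"]

class Lattice:
    """Upper level: Python owns the flow of the calculation -- the sequence
    of nodes a bunch passes through -- while each node's track() call is
    where control would drop into compiled code."""
    def __init__(self, nodes):
        self.nodes = nodes

    def track(self, bunch):
        for node in self.nodes:
            node.track(bunch)

bunch = [{"x": 0.0, "xp": 0.001}, {"x": 0.002, "xp": -0.001}]
ring = Lattice([DriftNodeStandIn(1.5), DriftNodeStandIn(2.5)])
ring.track(bunch)   # x advances by 4.0 * xp over the two drifts
```

The design point is that the slow, flexible layer (lattice setup, scripting, diagnostics) and the fast, rigid layer (per-particle arithmetic) meet at a narrow `track()` interface, so each can evolve independently.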

  17. Lawrence Livermore National Laboratory Site Seismic Safety Program: Summary of Findings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savy, J B; Foxall, W

The Lawrence Livermore National Laboratory (LLNL) Site Seismic Safety Program was conceived in 1979 during the preparation of the site Draft Environmental Impact Statement. The impetus for the program came from the development of new methodologies and geologic data that affect assessments of geologic hazards at the LLNL site; it was designed to develop a new assessment of the seismic hazard to the LLNL site and LLNL employees. Secondarily, the program was also intended to provide the technical information needed to make ongoing decisions about design criteria for future construction at LLNL and about the adequacy of existing facilities. This assessment was intended to be of the highest technical quality and to make use of the most recent and accepted hazard assessment methodologies. The basic purposes and objectives of the current revision are similar to those of the previous studies. Although all the data and experience assembled in the previous studies were utilized to their fullest, the large quantity of new information and new methodologies led to the formation of a new team that includes LLNL staff and outside consultants from academia and private consulting firms. A peer-review panel composed of individuals from academia (A. Cornell, Stanford University), the Department of Energy (DOE; Jeff Kimball), and consulting (Kevin Coppersmith) provided review and guidance. This panel was involved from the beginning of the project in a "participatory" type of review. The Senior Seismic Hazard Analysis Committee (SSHAC, a committee sponsored by the U.S. Nuclear Regulatory Commission, DOE, and the Electric Power Research Institute) strongly recommends the use of participatory reviews, in which the reviewers follow the progress of a project from the beginning rather than waiting until the end to provide comments (Budnitz et al., 1997).
Following the requirements for probabilistic seismic hazard analysis (PSHA) stipulated in the DOE standard DOE-STD-1023-95, a special effort was made to identify and quantify all types of uncertainties. The final seismic hazard estimates were de-aggregated to determine the contribution of all the seismic sources as well as the relative contributions of potential future earthquakes in terms of their magnitudes and distances from the site. It was found that, in agreement with previous studies, the Greenville Fault system contributes the most to the estimate of the seismic hazard expressed in terms of the probability of exceedance of the peak ground acceleration (PGA) at the center of the LLNL site (i.e., at high frequencies). It is followed closely by the Calaveras and Corral Hollow faults. The Mount Diablo thrust and the Springtown and Livermore faults were not considered in the hazard calculations in the 1991 study. In this study they contributed together approximately as much as the Greenville fault. At lower frequencies, more distant faults such as the Hayward and San Andreas faults begin to appear as substantial contributors to the total hazard. The results of this revision are presented in Figures 1 and 2. Figure 1 shows the estimated mean hazard curve in terms of the annual probability of exceedance of the peak ground acceleration (average of the two horizontal orthogonal components) at the LLNL site, assuming that the local site conditions are similar to those of a generic soil. Figure 2 shows the results in terms of the uniform hazard spectra (pseudo-spectral accelerations for 5% damping) for five return periods. Although this latest revision is based on a completely independent and in many respects very different set of data and methodology from the previous one, it gives essentially the same results for the prediction of the peak ground acceleration (PGA), albeit with a reduced uncertainty. 
Because the Greenville fault is a dominant contributor to the hazard, a field investigation was performed to better characterize the probability distribution of the rate of slip on the fault. Samples were collected from a trench located on the northern segment of the Greenville fault and are in the process of being dated, using carbon-14, at the LLNL Center for Accelerator Mass Spectrometry (CAMS). Preliminary results from the dating corroborate the range of values used in the hazard calculations. A final update after completion and quality-assurance qualification of the dating measurements, in the near future, will finalize the distribution of this important parameter, probably using Bayesian updating.
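The Poisson hazard arithmetic underlying such results (per-source annual exceedance rates summed to a total, converted to a probability of exceedance over an exposure time, then de-aggregated into per-source contributions) can be sketched as follows; the fault names echo the study, but the rates are invented for illustration and are not values from the analysis.

```python
import math

# Hypothetical annual rates of exceeding some PGA threshold at the site.
rates = {"Greenville": 4.0e-4, "Calaveras": 2.5e-4,
         "Corral Hollow": 2.0e-4, "Mount Diablo thrust": 1.5e-4}

total_rate = sum(rates.values())

def prob_exceed(rate, years):
    """Poisson model: probability of at least one exceedance in `years`."""
    return 1.0 - math.exp(-rate * years)

# Probability of exceedance over a 50-year exposure time.
p50 = prob_exceed(total_rate, 50.0)

# De-aggregation: fractional contribution of each source to the hazard.
contrib = {name: r / total_rate for name, r in rates.items()}
```

For small `rate * years` the probability is close to the rate-time product, which is why annual rates and annual probabilities are often used interchangeably in hazard summaries.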

  18. Combustor Simulation

    NASA Technical Reports Server (NTRS)

    Norris, Andrew

    2003-01-01

The goal was to perform a 3D simulation of the GE90 combustor as part of a full turbofan engine simulation. The requirements of high fidelity and fast turn-around time call for a massively parallel code. The National Combustion Code (NCC) was chosen for this task, as it supports up to 999 processors and includes state-of-the-art combustion models. Also required is the ability to take inlet conditions from the compressor code and give exit conditions to the turbine code.

  19. An approach for coupled-code multiphysics core simulations from a common input

    DOE PAGES

    Schmidt, Rodney; Belcourt, Kenneth; Hooper, Russell; ...

    2014-12-10

This study describes an approach for coupled-code multiphysics reactor core simulations that is being developed by the Virtual Environment for Reactor Applications (VERA) project in the Consortium for Advanced Simulation of Light-Water Reactors (CASL). In this approach a user creates a single problem description, called the “VERAIn” common input file, to define and setup the desired coupled-code reactor core simulation. A preprocessing step accepts the VERAIn file and generates a set of fully consistent input files for the different physics codes being coupled. The problem is then solved using a single-executable coupled-code simulation tool applicable to the problem, which is built using VERA infrastructure software tools and the set of physics codes required for the problem of interest. The approach is demonstrated by performing an eigenvalue and power distribution calculation of a typical three-dimensional 17 × 17 assembly with thermal–hydraulic and fuel temperature feedback. All neutronics aspects of the problem (cross-section calculation, neutron transport, power release) are solved using the Insilico code suite and are fully coupled to a thermal–hydraulic analysis calculated by the Cobra-TF (CTF) code. The single-executable coupled-code (Insilico-CTF) simulation tool is created using several VERA tools, including LIME (Lightweight Integrating Multiphysics Environment for coupling codes), DTK (Data Transfer Kit), Trilinos, and TriBITS. Parallel calculations are performed on the Titan supercomputer at Oak Ridge National Laboratory using 1156 cores, and a synopsis of the solution results and code performance is presented. Finally, ongoing development of this approach is also briefly described.
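The single-common-input idea can be sketched as a preprocessor that emits one deck per physics code from a single problem description, so shared quantities can never disagree between codes; the schema, key names, and values below are invented for illustration and are not the actual VERAIn format.

```python
import json

# Hypothetical single "common input" describing the coupled problem.
common = {
    "assembly": "17x17",
    "power_MW": 17.7,
    "inlet_temp_K": 565.0,
    "axial_nodes": 49,
}

def write_neutronics_input(c):
    """Generate the (hypothetical) neutronics deck from the common input."""
    return json.dumps({"geom": c["assembly"], "nz": c["axial_nodes"],
                       "power": c["power_MW"]})

def write_thermal_hydraulic_input(c):
    """Generate the (hypothetical) thermal-hydraulic deck from the same
    single source of truth."""
    return json.dumps({"channels": c["assembly"], "nz": c["axial_nodes"],
                       "t_inlet": c["inlet_temp_K"], "power": c["power_MW"]})

neut = json.loads(write_neutronics_input(common))
th = json.loads(write_thermal_hydraulic_input(common))
# Consistency by construction: both decks carry the same node count and
# power because each was derived from the one common description.
```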

  20. Investigation on the Capability of a Non Linear CFD Code to Simulate Wave Propagation

    DTIC Science & Technology

    2003-02-01

Pedro de la Calzada; Pablo Quintana; Manuel Antonio Burgos (ITP, S.A., Parque Empresarial Fernando avenida...) ...mechanisms presented above, simulation of unsteady aerodynamics with linear and nonlinear CFD codes is an ongoing activity within the turbomachinery industry

  1. Software quality and process improvement in scientific simulation codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ambrosiano, J.; Webster, R.

    1997-11-01

This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. The study is based on the experience of the authors and on interviews with ten subjects chosen from simulation code development teams at LANL. The study is descriptive rather than scientific.

  2. Laboratory directed research and development fy1999 annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Ayat, R A

    2000-04-11

The Lawrence Livermore National Laboratory (LLNL) was founded in 1952 and has been managed since its inception by the University of California (UC) for the U.S. Department of Energy (DOE). Because of this long association with UC, the Laboratory has been able to recruit a world-class workforce, establish an atmosphere of intellectual freedom and innovation, and achieve recognition in relevant fields of knowledge as a scientific and technological leader. This environment and reputation are essential for sustained scientific and technical excellence. As a DOE national laboratory with about 7,000 employees, LLNL has an essential and compelling primary mission to ensure that the nation's nuclear weapons remain safe, secure, and reliable and to prevent the spread and use of nuclear weapons worldwide. The Laboratory receives funding from the DOE Assistant Secretary for Defense Programs, whose focus is stewardship of our nuclear weapons stockpile. Funding is also provided by the Deputy Administrator for Defense Nuclear Nonproliferation, many Department of Defense sponsors, other federal agencies, and the private sector. As a multidisciplinary laboratory, LLNL has applied its considerable skills in high-performance computing, advanced engineering, and the management of large research and development projects to become the science and technology leader in those areas of its mission responsibility. The Laboratory Directed Research and Development (LDRD) Program was authorized by the U.S. Congress in 1984. The Program allows the Director of each DOE laboratory to fund advanced, creative, and innovative research and development (R&D) activities that will ensure scientific and technical vitality in the continually evolving mission areas at DOE and the Laboratory. In addition, the LDRD Program provides LLNL with the flexibility to nurture and enrich essential scientific and technical competencies, which attract the most qualified scientists and engineers.
The LDRD Program also enables many collaborations with the scientific community in academia, national and international laboratories, and industry. The projects in the FY1999 LDRD portfolio were carefully selected to continue vigorous support of the strategic vision and the long-term goals of DOE and the Laboratory. Projects chosen for LDRD funding undergo stringent selection processes, which look for high-potential scientific return, emphasize strategic relevance, and feature technical peer reviews by external and internal experts. The FY1999 projects described in this annual report focus on supporting the Laboratory's national security needs: stewardship of the U.S. nuclear weapons stockpile, responsibility for the counter- and nonproliferation of weapons of mass destruction, development of high-performance computing, and support of DOE environmental research and waste management programs. In the past, LDRD investments have significantly enhanced LLNL scientific capabilities and greatly contributed to the Laboratory's ability to meet its national security programmatic requirements. Examples of past investments include technical precursors to the Accelerated Strategic Computing Initiative (ASCI), special-materials processing and characterization, and biodefense. Our analysis of the FY1999 portfolio shows that it strongly supports the Laboratory's national security mission. About 95% of the LDRD dollars directly supported LLNL's national security activities in FY1999, far exceeding the 63% of LLNL's overall budget supported by National Security Programs in FY1999.

  3. Production code control system for hydrodynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slone, D.M.

    1997-08-18

We describe how the Production Code Control System (PCCS), written in Perl, has been used to control and monitor the execution of a large hydrodynamics simulation code in a production environment. We have been able to integrate new, disparate, and often independent applications into the PCCS framework without the need to modify any of our existing application codes. Both users and code developers see a consistent interface to the simulation code and associated applications regardless of the physical platform, whether an MPP, SMP, server, or desktop workstation. We also describe our use of Perl to develop a configuration management system for the simulation code, as well as a code usage database and report generator. We used Perl to write a backplane that allows us to plug in preprocessors, the hydrocode, postprocessors, visualization tools, persistent storage requests, and other codes. We need only teach PCCS a minimal amount about any new tool or code to essentially plug it in and make it usable to the hydrocode. PCCS has made it easier to link together disparate codes, since using Perl has removed the need to learn the idiosyncrasies of system or RPC programming. The text handling in Perl makes it easy to teach PCCS about new codes, or changes to existing codes.
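The backplane pattern (PCCS itself is written in Perl) can be sketched in Python as a registry of launch recipes chained into a pipeline: the backplane needs to know only a name and a callable for each tool, so new tools plug in without modifying existing ones. The tool names and stages below are generic placeholders, not the actual PCCS interface.

```python
class Backplane:
    """Minimal plug-in backplane: tools register a launch recipe under a
    name; a pipeline is just an ordered list of names whose recipes are
    invoked in turn, each passing its result to the next."""
    def __init__(self):
        self.tools = {}

    def register(self, name, run):
        self.tools[name] = run

    def pipeline(self, names, data):
        for name in names:
            data = self.tools[name](data)
        return data

bp = Backplane()
# Placeholder stages standing in for preprocessor, hydrocode, postprocessor.
bp.register("preprocess", lambda log: log + ["mesh built"])
bp.register("hydrocode", lambda log: log + ["cycles 0..100 run"])
bp.register("postprocess", lambda log: log + ["plots written"])

log = bp.pipeline(["preprocess", "hydrocode", "postprocess"], [])
```

Adding a visualization tool later means one more `register` call; the pipeline definition changes, but no existing tool does.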

  4. Simulation of spacecraft attitude dynamics using TREETOPS and model-specific computer Codes

    NASA Technical Reports Server (NTRS)

    Cochran, John E.; No, T. S.; Fitz-Coy, Norman G.

    1989-01-01

    The simulation of spacecraft attitude dynamics and control using the generic, multi-body code called TREETOPS and other codes written especially to simulate particular systems is discussed. Differences in the methods used to derive equations of motion--Kane's method for TREETOPS and the Lagrangian and Newton-Euler methods, respectively, for the other two codes--are considered. Simulation results from the TREETOPS code are compared with those from the other two codes for two example systems. One system is a chain of rigid bodies; the other consists of two rigid bodies attached to a flexible base body. Since the computer codes were developed independently, consistent results serve as a verification of the correctness of all the programs. Differences in the results are discussed. Results for the two-rigid-body, one-flexible-body system are useful also as information on multi-body, flexible, pointing payload dynamics.

  5. Environmental Report 2009

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallegos, Gretchen M.; Bertoldo, Nicholas A.; Campbell, Christopher G.

The purposes of the Lawrence Livermore National Laboratory Environmental Report 2009 are to record Lawrence Livermore National Laboratory’s (LLNL’s) compliance with environmental standards and requirements, describe LLNL’s environmental protection and remediation programs, and present the results of environmental monitoring at the two LLNL sites—the Livermore site and Site 300. The report is prepared for the U.S. Department of Energy (DOE) by LLNL’s Environmental Protection Department. Submittal of the report satisfies requirements under DOE Order 231.1A, Environmental Safety and Health Reporting, and DOE Order 5400.5, Radiation Protection of the Public and Environment. The report is distributed electronically and is available at https://saer.lln.gov/, the website for the LLNL annual environmental report. Previous LLNL annual environmental reports beginning in 1994 are also on the website. Some references in the electronic report text are underlined, which indicates that they are clickable links. Clicking on one of these links will open the related document, data workbook, or website that it refers to. The report begins with an executive summary, which provides the purpose of the report and an overview of LLNL’s compliance and monitoring results. The first three chapters provide background information: Chapter 1 is an overview of the location, meteorology, and hydrogeology of the two LLNL sites; Chapter 2 is a summary of LLNL’s compliance with environmental regulations; and Chapter 3 is a description of LLNL’s environmental programs with an emphasis on the Environmental Management System including pollution prevention.
The majority of the report covers LLNL’s environmental monitoring programs and monitoring data for 2009: effluent and ambient air (Chapter 4); waters, including wastewater, storm water runoff, surface water, rain, and groundwater (Chapter 5); and terrestrial, including soil, sediment, vegetation, foodstuff, ambient radiation, and special status wildlife and plants (Chapter 6). Complete monitoring data, which are summarized in the body of the report, are provided in Appendix A. The remaining three chapters discuss the radiological impact on the public from LLNL operations (Chapter 7), LLNL’s groundwater remediation program (Chapter 8), and quality assurance for the environmental monitoring programs (Chapter 9).

  6. Study of laser-generated debris free x-ray sources produced in a high-density linear Ar, Kr, Xe, Kr/Ar and Xe/Kr/Ar mixtures gas jets by 2 ω, sub-ps LLNL Titan laser

    NASA Astrophysics Data System (ADS)

    Kantsyrev, V. L.; Schultz, K. A.; Shlyaptseva, V. V.; Safronova, A. S.; Cooper, M. C.; Shrestha, I. K.; Petkov, E. E.; Stafford, A.; Moschella, J. J.; Schmidt-Petersen, M. T.; Butcher, C. J.; Kemp, G. E.; Andrews, S. D.; Fournier, K. B.

    2016-10-01

The study of laser-generated debris-free x-ray sources in an underdense plasma produced in a high-density linear gas-puff jet was carried out at the LLNL Titan laser (2ω, 45 J, sub-ps) with an intensity of 7 × 10^19 W/cm^2 in the 10 μm focal spot. A linear nozzle with a fast valve was used for the generation of a cluster/gas jet. X-ray diagnostics for the spectral region of 0.7-9 keV included two spectrometers and pinhole cameras, and three groups of fast filtered detectors. Electron beams were measured with the EPPS magnetic spectrometer (>1 MeV) and Faraday cups (>72 keV). Spectralon/spectrometer devices were also used to measure absorption of laser radiation in the jets. New results were obtained on: anisotropic generation of x-rays (the laser-to-x-ray conversion coefficient was >1%) and characteristics of laser-generated electron beams; the evolution of x-ray generation with the location of the laser focus in a cluster/gas jet; and observations of a strong x-ray flash in some focusing regimes. Non-LTE kinetic modeling was used to estimate plasma parameters. UNR work supported by the DTRA Basic Research Award # HDTRA1-13-1-0033. Work at LLNL was performed under the auspices of the U.S. DOE by LLNL under Contract DE-AC52-07NA27344.

  7. Electron Beam Production and Characterization for the PLEIADES Thomson X-ray Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, W J; Hartemann, F V; Tremaine, A M

    2002-10-14

We report on the performance of an S-band RF photocathode electron gun and accelerator for operation with the PLEIADES Thomson x-ray source at LLNL. Simulations of beam production, transport, and focus are presented. It is shown that a 1 ps, 500 pC electron bunch with a normalized emittance of less than 5 {pi}mm-mrad can be delivered to the interaction point. Initial electron measurements are presented. Calculations of expected x-ray flux are also performed, demonstrating an expected peak spectral brightness of 10{sup 20} photons/s/mm{sup 2}/mrad{sup 2}/0.1% bandwidth. Effects of RF phase jitter are also presented, and planned phase measurements and control methods are discussed.

  8. Megajoule Dense Plasma Focus Solid Target Experiments

    NASA Astrophysics Data System (ADS)

    Podpaly, Y. A.; Falabella, S.; Link, A.; Povilus, A.; Higginson, D. P.; Shaw, B. H.; Cooper, C. M.; Chapman, S.; Bennett, N.; Sipe, N.; Olson, R.; Schmidt, A. E.

    2016-10-01

    Dense plasma focus (DPF) devices are plasma sources that can produce significant neutron yields from beam into gas interactions. Yield increases, up to approximately a factor of five, have been observed previously on DPFs using solid targets, such as CD2 and D2O ice. In this work, we report on deuterium solid-target experiments at the Gemini DPF. A rotatable target holder and baffle arrangement were installed in the Gemini device which allowed four targets to be deployed sequentially without breaking vacuum. Solid targets of titanium deuteride were installed and systematically studied at a variety of fill pressures, bias voltages, and target positions. Target holder design, experimental results, and comparison to simulations will be presented. Prepared by LLNL under Contract DE-AC52-07NA27344.

  9. Scalable Entity-Based Modeling of Population-Based Systems, Final LDRD Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cleary, A J; Smith, S G; Vassilevska, T K

    2005-01-27

    The goal of this project has been to develop tools, capabilities and expertise in the modeling of complex population-based systems via scalable entity-based modeling (EBM). Our initial focal application domain has been the dynamics of large populations exposed to disease-causing agents, a topic of interest to the Department of Homeland Security in the context of bioterrorism. In the academic community, discrete simulation technology based on individual entities has shown initial success, but the technology has not been scaled to the problem sizes or computational resources of LLNL. Our developmental emphasis has been on the extension of this technology to parallel computers and maturation of the technology from an academic to a lab setting.

  10. NIMROD Simulations of Spheromak Formation, Magnetic Reconnection and Energy Confinement in SSPX

    NASA Astrophysics Data System (ADS)

    Hooper, E. B.; Sovinec, C. R.

    2005-10-01

    The SSPX spheromak is formed and driven by a coaxial electrostatic gun that injects current and magnetic flux. Magnetic fluctuations are associated with the conversion of toroidal to poloidal magnetic flux during formation. After formation, fluctuations that break axisymmetry degrade magnetic surfaces, and are anti-correlated with the core temperature and energy confinement time. We report NIMROD simulations that extend earlier work^1 supporting the SSPX experiment through performance predictions and physical insight. The simulations are in fairly good agreement with features observed in SSPX and underscore the importance of current profile control in mitigating magnetic fluctuation amplitudes and improving confinement. The simulations yield insight into magnetic reconnection and the relationship of fluctuations to field-line stochasticity. We have added external circuit equations for the new 32-module capacitor bank in SSPX that will add flexibility in shaping the injector current pulses and substantially increase the injected currents and the magnetic energy. New NIMROD simulations of SSPX lead to higher temperature plasmas than in previous simulations. *Work supported by U.S. DOE, under Contr. No. W-7405-ENG-48 at U. Cal. LLNL and under grant FG02-01ER54661 at U. Wisc. Madison. ^1C. R. Sovinec, B. I. Cohen, et al., Phys. Rev. Lett. 94, 035003 (2005); B. I. Cohen, E. B. Hooper, et al., Phys. Plasmas 12, 056106 (2005).

  11. Edge simulations in ELMy H-mode discharges of EAST tokamak

    NASA Astrophysics Data System (ADS)

    Xia, T. Y.; Huang, Y. Q.; Xu, X. Q.; Wu, Y. B.; Wang, L.; Zheng, Z.; Liu, J. B.; Zang, Q.; Li, Y. Y.; Zhao, D.

    2017-10-01

    Simulations of an ELM crash followed by a coherent mode, leading to transient divertor heat flux on EAST, are achieved with the six-field two-fluid model in BOUT++. Three EAST ELMy H-mode discharges with different pedestal structure, geometry and plasma current Ip are studied. The ELM-driven crash of the pedestal profiles is reproduced, and the footprints of ELM filaments on the targets are comparable with measurements from divertor probes. A coherent mode is also found in the edge region in all the simulations after the ELM crash. The frequency and poloidal wave number are in the range of the edge coherent mode (ECM) on EAST. The magnetic fluctuations of the mode are smaller than the electric field fluctuations. Detailed comparisons between simulated mode structures and measurements will be reported. Statistical analysis of the simulated turbulent fluctuations shows that both turbulent and blobby electron anomalous transport can pump the pedestal energy out into the SOL, which then flows to the divertors. A similar trend of the heat flux width with Ip is obtained in the simulations. The effects of the SOL current driven by LHW on ELMs will be discussed in this paper. This work was performed under the auspices of the US DOE by LLNL under contract DE-AC52-07NA27344. It was supported by the China NSF 11405215 and 11675217.

  12. Main steam line break accident simulation of APR1400 using the model of ATLAS facility

    NASA Astrophysics Data System (ADS)

    Ekariansyah, A. S.; Deswandri; Sunaryo, Geni R.

    2018-02-01

    A main steam line break simulation for APR1400, an advanced PWR design, has been performed using the RELAP5 code. The simulation was conducted on a model of the thermal-hydraulic test facility ATLAS, which represents a scaled-down facility of the APR1400 design. The main steam line break event is described in an open-access safety report document, whose initial conditions and assumptions for the analysis were utilized in performing the simulation and analysis of the selected parameters. The objective of this work was to conduct a benchmark activity by comparing the simulation results of the CESEC-III code, a conservative-approach code, with the results of RELAP5 as a best-estimate code. Based on the simulation results, a general similarity in the behavior of the selected parameters was observed between the two codes. However, the degree of accuracy still needs further research and analysis through comparison with other best-estimate codes. Uncertainties arising from the ATLAS model should be minimized by taking into account more specific data in developing the APR1400 model.

  13. Installation of hybrid ion source on the 1-MV LLNL BioAMS spectrometer

    PubMed Central

    Ognibene, T. J.; Salazar, G. A.

    2012-01-01

    A second ion source was recently installed onto the LLNL 1-MV AMS spectrometer, which is dedicated to the quantification of 14C and 3H within biochemical samples. This source is unique among LLNL cesium sputter ion sources in that it can ionize both gaseous and solid samples. In addition, the injection beam line has been designed to directly measure 14C/12C isotope ratios without the need for electrostatic bouncing. Preliminary tests show that this source can ionize transient CO2 gas pulses containing less than 1 ug of carbon with approximately 1.5% efficiency. We demonstrate that the measured 14C/12C isotope ratio is largely unaffected by small drifts in the argon stripper gas density. We also determine that a tandem accelerating voltage of 670 kV enables the highest 14C transmission through the system. Finally, we describe a series of performance tests using solid graphite targets spanning nearly three orders of magnitude in dynamic range and compare the results to those from our other ion source. PMID:23467295

  14. Environmental safety & health requirements for a federal facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, G.; Wong, J.

    1995-09-01

    I would like to take this opportunity to discuss the challenges that face an environmental, safety, and health (ES&H) manager at a federal facility situated in California. The challenges are, in many aspects, similar to those facing ES&H professionals all over this country: dwindling resources and increasing regulatory demands. The Laboratory (LLNL) is under closer scrutiny than other R&D facilities located in California because some of its research activities involve nuclear weapon design. Today I would like to talk about two actions we, the ES&H management at LLNL, have taken to decrease the impact of dwindling resources and increasing regulatory demands: (1) Institution of a performance-based contract, which the University of California negotiated with the Department of Energy (DOE) to reduce the impact of special mandates required of federal facilities. Under this contract, ES&H performance is measured by results rather than by process; (2) Redesign of the LLNL Hazards Control Department to a flat organization that incorporates employee empowerment and Self-Managed Work Teams (SMWTs).

  15. Toward the virtual classroom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pihlman, M.; Dirks, D.H.

    1990-01-03

    The Lawrence Livermore National Laboratory (LLNL) encourages its employees to remotely attend classes given by Stanford University, the University of California at Davis, and the National Technological University (NTU). To improve the quality of education for LLNL employees, we are cooperating with Stanford University in upgrading the Stanford Instructional Television Network (SITN). A dedicated high-speed communication link (T1) between Stanford and LLNL will be used for enhanced services such as videoconferencing, real-time classnotes distribution, and electronic distribution of homework assignments. The new network will also allow students to take classes from their offices with the ability to ask the professor questions via an automatically dialed telephone call. As part of this upgrade, we have also proposed a new videoconferencing-based classroom environment where students taking remote classes would feel as though they are attending the live class. All paperwork would be available in near real time, and students may converse normally with, and see, other remote students as though they were all in the same physical location. We call this the "Virtual Classroom." 1 ref., 6 figs.

  16. 2009 Annual Health Physics Report for the HEU Transparency Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radev, R

    2010-04-14

    During the 2009 calendar year, Lawrence Livermore National Laboratory (LLNL) provided health physics support for the Highly Enriched Uranium (HEU) Transparency Program for external and internal radiation protection. LLNL also provided technical expertise related to BDMS radioactive sources and Russian radiation safety regulatory compliance. For the calendar year 2009, there were 159 person-trips that required dose monitoring of the U.S. monitors. Of the 159 person-trips, 149 person-trips were SMVs and 10 person-trips were Transparency Monitoring Office (TMO) trips. There were 4 monitoring visits by TMO monitors to facilities other than UEIE and 10 to UEIE itself. LLNL's Hazard Control Department laboratories provided the dosimetry services for the HEU Transparency monitors. In 2009, the HEU Transparency activities in Russia were conducted in a radiologically safe manner for the HEU Transparency monitors in accordance with the expectations of the HEU Transparency staff, NNSA and DOE. The HEU Transparency Program now has over fifteen years of successful experience in developing and providing health and safety support in meeting its technical objectives.

  17. Characterizing Hohlraum Plasma Conditions at the National Ignition Facility (NIF) Using X-ray Spectroscopy

    NASA Astrophysics Data System (ADS)

    Barrios, Maria Alejandra

    2015-11-01

    Improved hohlraums will have a significant impact on increasing the likelihood of indirect drive ignition at the NIF. In indirect-drive Inertial Confinement Fusion (ICF), a high-Z hohlraum converts laser power into a tailored x-ray flux that drives the implosion of a spherical capsule filled with D-T fuel. The x-radiation drive to capsule coupling sets the velocity, adiabat, and symmetry of the implosion. Previous experiments in gas-filled hohlraums determined that the laser-hohlraum energy coupling is 20-25% less than modeled; therefore, identifying energy loss mechanisms that reduce the efficacy of the hohlraum drive is central to improving implosion performance. Characterizing the plasma conditions, particularly the plasma electron temperature (Te), is critical to understanding mechanisms that affect the energy coupling, such as laser-plasma interactions (LPI), hohlraum x-ray conversion efficiency, and dynamic drive symmetry. The first Te measurements inside a NIF hohlraum, presented here, were achieved using K-shell x-ray spectroscopy of an Mn-Co tracer dot. The dot is deposited on a thin-walled CH capsule, centered on the hohlraum symmetry axis below the laser entrance hole (LEH) of a bottom-truncated hohlraum. The hohlraum x-ray drive ablates the dot and causes it to flow upward, toward the LEH, entering the hot laser deposition region. An absolutely calibrated streaked spectrometer with a line of sight into the LEH records the temporal history of the Mn and Co x-ray emission. The measured (interstage) Lyα/Heα line ratios for Co and Mn and the Mn-Heα/Co-Heα isoelectronic line ratio are used to infer the local plasma Te with the atomic physics code SCRAM. Time-resolved x-ray images perpendicular to the hohlraum axis record the dot expansion and trajectory into the LEH region. The temporal evolution of the measured Te and dot trajectory are compared with simulations from radiation-hydrodynamic codes.
This work was performed under the auspices of the U.S. Department of Energy by LLNL under Contract DE-AC52-07NA27344.
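    The record does not spell out the line-ratio inversion, which in the experiment relies on SCRAM's collisional-radiative modeling. As a much cruder, purely illustrative sketch (the prefactor c0 and transition-energy gap dE below are invented placeholders, not Mn/Co atomic data), a Boltzmann-type two-line ratio can be inverted for Te as follows:

```python
import math

def ratio_model(te_ev, de_ev=700.0, c0=2.0):
    """Hypothetical line-ratio model R(Te) = c0 * exp(-dE/Te).
    Both c0 and dE are placeholders, not SCRAM atomic data."""
    return c0 * math.exp(-de_ev / te_ev)

def invert_te(r_measured, de_ev=700.0, c0=2.0):
    """Invert R(Te) for the electron temperature in eV."""
    return -de_ev / math.log(r_measured / c0)

# round-trip check: a 1000 eV plasma should be recovered
te = invert_te(ratio_model(1000.0))
print(f"inferred Te = {te:.1f} eV")
```

In the experiment the measured Lyα/Heα ratio plays the role of r_measured, and the SCRAM code supplies the full R(Te) curve in place of this analytic stand-in.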

  18. Modeling of gun barrel surface erosion: Historic perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buckingham, A.C.

    1996-08-01

    Results and interpretations of numerical simulations of some dominant processes influencing gun barrel propellant combustion and flow-induced erosion are presented. Results include modeled influences of erosion reduction techniques such as solid additives, vapor phase chemical modifications, and alteration of surface solid composition through use of thin coatings. Precedents and historical perspective are provided with predictions from traditional interior ballistics compared to computer simulations. Accelerating reactive combustion flow, multiphase and multicomponent transport, flow-to-surface thermal/momentum/phase change/gas-surface chemical exchanges, surface and micro-depth subsurface heating/stress/composition evolution and their roles in inducing surface cracking, spall, ablation, melting, and vaporization are considered. Recognition is given to cyclic effects of previous firing history on material preconditioning. The current perspective and outlook for the future are based on results of a US Army-LLNL erosion research program covering seven years in the late 1970s, supplemented by more recent research on hypervelocity electromagnetic projectile launchers.

  19. Numerical Simulations of Near-Field Blast Effects using Kinetic Plates

    NASA Astrophysics Data System (ADS)

    Neuscamman, Stephanie; Manner, Virginia; Brown, Geoffrey; Glascoe, Lee

    2013-06-01

    Numerical simulations using two hydrocodes were compared to near-field measurements of blast impulse associated with ideal and non-ideal explosives to gain insight into testing results and to predict untested configurations. The recently developed kinetic plate test was designed to measure blast impulse in the near field by firing spherical charges at close range from steel plates and probing plate acceleration using laser velocimetry. Plate velocities for ideal, non-ideal and aluminized explosives tests were modeled using a three-dimensional hydrocode. The effects of inert additives in the explosive formulation were modeled using a 1-D hydrocode with multiphase flow capability using Lagrangian particles. The relative effect of particle impact on the plate compared to the blast wave impulse is determined, and the modeling is compared to free-field pressure results. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. This is abstract LLNL-ABS-622152.

  20. Modelling controlled VDEs and ramp-down scenarios in ITER

    NASA Astrophysics Data System (ADS)

    Lodestro, L. L.; Kolesnikov, R. A.; Meyer, W. H.; Pearlstein, L. D.; Humphreys, D. A.; Walker, M. L.

    2011-10-01

    Following the design reviews of recent years, the ITER poloidal-field coil-set design, including in-vessel coils (VS3), and the divertor configuration have settled down. The divertor and its material composition (the latter has not been finalized) affect the development of fiducial equilibria and scenarios together with the coils through constraints on strike-point locations and limits on the PF and control systems. Previously we have reported on our studies simulating controlled vertical events in ITER with the JCT 2001 controller to which we added a PID VS3 circuit. In this paper we report and compare controlled VDE results using an optimized integrated VS and shape controller in the updated configuration. We also present our recent simulations of alternate ramp-down scenarios, looking at the effects of ramp-down time and shape strategies, using these controllers. This work performed under the auspices of the U.S. Department of Energy by LLNL under Contract DE-AC52-07NA27344.

  1. Massively Parallel Real-Time TDDFT Simulations of Electronic Stopping Processes

    NASA Astrophysics Data System (ADS)

    Yost, Dillon; Lee, Cheng-Wei; Draeger, Erik; Correa, Alfredo; Schleife, Andre; Kanai, Yosuke

    Electronic stopping describes the transfer of kinetic energy from fast-moving charged particles to electrons, producing massive electronic excitations in condensed matter. Understanding this phenomenon for ion irradiation has implications in modern technologies, ranging from nuclear reactors, to semiconductor devices for aerospace missions, to proton-based cancer therapy. Recent advances in high-performance computing allow us to achieve an accurate, parameter-free description of these phenomena through numerical simulations. Here we discuss results from our recently developed large-scale real-time TDDFT implementation for electronic stopping processes in important example materials such as metals, semiconductors, liquid water, and DNA. We illustrate important insights into the physics underlying electronic stopping and discuss current limitations of our approach regarding both physical and numerical approximations. This work is supported by the DOE through the INCITE awards and by the NSF. Part of this work was performed under the auspices of the U.S. DOE by LLNL under Contract DE-AC52-07NA27344.

  2. Python Radiative Transfer Emission code (PyRaTE): non-LTE spectral lines simulations

    NASA Astrophysics Data System (ADS)

    Tritsis, A.; Yorke, H.; Tassis, K.

    2018-05-01

    We describe PyRaTE, a new non-local thermodynamic equilibrium (non-LTE) line radiative transfer code developed specifically for post-processing astrochemical simulations. Population densities are estimated using the escape probability method. When computing the escape probability, the optical depth is calculated along all directions, with density, molecular abundance, temperature and velocity variations all taken into account. A very easy-to-use interface, capable of importing data from simulation outputs produced by all major astrophysical codes, has also been developed. The code is written in PYTHON using an "embarrassingly parallel" strategy and can handle all geometries and projection angles. We benchmark the code by comparing our results with those from RADEX (van der Tak et al. 2007) and against analytical solutions, and present case studies using hydrochemical simulations. The code will be released for public use.
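    The escape probability method the abstract refers to can be sketched for a two-level molecule. All rate coefficients and the uniform-slab beta below are invented illustration values, not PyRaTE internals; the point is only the fixed-point coupling between populations and optical depth:

```python
import math

# Invented two-level "molecule" for illustration (not PyRaTE data):
A_UL = 1.0e-7            # Einstein A coefficient [1/s]
C_UL = 1.0e-7            # downward collision rate [1/s]
G_U, G_L = 3.0, 1.0      # statistical weights
DE_OVER_K = 5.0          # transition energy / k_B [K]
T_KIN = 20.0             # kinetic temperature [K]
TAU0 = 10.0              # optical-depth scale of the cloud

def beta(tau):
    """Escape probability for a uniform slab: beta = (1 - exp(-tau)) / tau."""
    if abs(tau) < 1e-8:
        return 1.0
    return (1.0 - math.exp(-tau)) / tau

def solve_two_level(n_iter=200):
    """Fixed-point iteration for the upper-level population fraction x_u.

    Statistical equilibrium for two levels with photon trapping:
        x_l * C_lu = x_u * (C_ul + beta(tau) * A_ul),
    where tau itself depends on the populations, hence the iteration.
    """
    c_lu = C_UL * (G_U / G_L) * math.exp(-DE_OVER_K / T_KIN)
    x_u = 0.5
    for _ in range(n_iter):
        tau = TAU0 * (1.0 - x_u)      # depth scales with lower-level population
        ratio = c_lu / (C_UL + beta(tau) * A_UL)   # = x_u / x_l
        x_u = ratio / (1.0 + ratio)
    return x_u

x_u = solve_two_level()
print(f"upper-level fraction: {x_u:.3f}")
```

In the full code the same balance is solved with the optical depth evaluated along many directions through the simulation grid rather than a single slab scale.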

  3. The Use of a Code-generating System for the Derivation of the Equations for Wind Turbine Dynamics

    NASA Astrophysics Data System (ADS)

    Ganander, Hans

    2003-10-01

    For many reasons, the size of wind turbines on the rapidly growing wind energy market is increasing. Relations between the aeroelastic properties of these new large turbines change. Modifications of turbine designs and control concepts are also influenced by growing size. All these trends require the development of computer codes for design and certification. Moreover, there is a strong desire for design optimization procedures, which require fast codes. General codes, e.g. finite element codes, normally allow such modifications and improvements of existing wind turbine models relatively easily. However, the calculation times of such codes are unfavourably long, certainly for optimization use. The use of an automatic code-generating system is an alternative that addresses both key issues: the simulation code itself and design optimization. This technique can be used for rapid generation of codes for particular wind turbine simulation models. These ideas have been followed in the development of new versions of the wind turbine simulation code VIDYN. The equations of the simulation model were derived according to the Lagrange equation using Mathematica®, which was directed to output the results in Fortran code format. In this way the simulation code is automatically adapted to an actual turbine model, in terms of subroutines containing the equations of motion, definitions of parameters and degrees of freedom. Since the start in 1997, these methods, constituting a systematic way of working, have been used to develop specific, efficient calculation codes. The experience with this technique has been very encouraging, inspiring the continued development of new versions of the simulation code as the need has arisen, and interest in design optimization is growing.
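    The Mathematica-to-Fortran pipeline described above can be mimicked in miniature. Here the equation of motion of a planar pendulum, thddot = -(g/l) sin(th), is derived by hand from its Lagrangian and emitted as Fortran text; the subroutine name and template are hypothetical, not VIDYN's actual output:

```python
# Hand-derived from the Lagrangian of a planar pendulum,
#   L = (1/2) m l^2 thdot^2 + m g l cos(th),
# the Lagrange equation d/dt(dL/dthdot) - dL/dth = 0 gives
#   thddot = -(g/l) sin(th).
FORTRAN_TEMPLATE = """\
subroutine eom(th, thdot, thddot)
  implicit none
  double precision, intent(in)  :: th, thdot
  double precision, intent(out) :: thddot
  thddot = -({g_over_l:.6f}d0) * sin(th)
end subroutine eom
"""

def generate_eom_code(g=9.81, l=1.0):
    """Emit Fortran source for the pendulum equation of motion,
    specialized to the model parameters g and l."""
    return FORTRAN_TEMPLATE.format(g_over_l=g / l)

print(generate_eom_code())
```

The real system performs the symbolic Lagrange derivation automatically; the sketch only shows the final step of specializing derived equations into compilable Fortran subroutines.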

  4. MOCCA code for star cluster simulation: comparison with optical observations using COCOA

    NASA Astrophysics Data System (ADS)

    Askar, Abbas; Giersz, Mirek; Pych, Wojciech; Olech, Arkadiusz; Hypki, Arkadiusz

    2016-02-01

    We introduce and present preliminary results from the COCOA (Cluster simulatiOn Comparison with ObservAtions) code for a star cluster after 12 Gyr of evolution simulated using the MOCCA code. The COCOA code is being developed to quickly compare results of numerical simulations of star clusters with observational data. We use COCOA to obtain parameters of the projected cluster model. For comparison, a FITS file of the projected cluster was provided to observers so that they could use their observational methods and techniques to obtain cluster parameters. The results show that the similarity of cluster parameters obtained through numerical simulations and observations depends significantly on the quality of observational data and photometric accuracy.

  5. VERAIn

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simunovic, Srdjan

    2015-02-16

    CASL's modeling and simulation technology, the Virtual Environment for Reactor Applications (VERA), incorporates coupled physics and science-based models, state-of-the-art numerical methods, modern computational science, integrated uncertainty quantification (UQ) and validation against data from operating pressurized water reactors (PWRs), single-effect experiments, and integral tests. The computational simulation component of VERA is the VERA Core Simulator (VERA-CS). The core simulator is the specific collection of multi-physics computer codes used to model and deplete a LWR core over multiple cycles. The core simulator has a single common input file that drives all of the different physics codes. The parser code, VERAIn, converts VERA input into an XML file that is used as input to different VERA codes.
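    A miniature, hypothetical analogue of what such a parser does is shown below; the deck syntax and tag names are invented for illustration, not the actual VERA input format or XML schema:

```python
# Turn a block-structured ASCII deck into XML for downstream codes,
# in the spirit of a common-input parser (invented syntax throughout).
import xml.etree.ElementTree as ET

deck = """\
[CORE]
  rated_power 3411.0
[ASSEMBLY]
  npin 17
"""

root = ET.Element("ParameterList")
block = None
for line in deck.splitlines():
    line = line.strip()
    if not line:
        continue
    if line.startswith("["):
        # a [SECTION] header opens a new parameter block
        block = ET.SubElement(root, "ParameterList",
                              name=line.strip("[]").lower())
    else:
        # a "key value" pair becomes a Parameter element
        key, value = line.split(maxsplit=1)
        ET.SubElement(block, "Parameter", name=key, value=value)

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

Each physics code can then read the one XML tree instead of implementing its own deck parser, which is the design motivation stated in the record.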

  6. Mean Line Pump Flow Model in Rocket Engine System Simulation

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.; Lavelle, Thomas M.

    2000-01-01

    A mean line pump flow modeling method has been developed to provide a fast capability for modeling turbopumps of rocket engines. Based on this method, a mean line pump flow code PUMPA has been written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The pump code can model axial flow inducers, mixed-flow and centrifugal pumps. The code can model multistage pumps in series. The code features rapid input setup and computer run time, and is an effective analysis and conceptual design tool. The map generation capability of the code provides the map information needed for interfacing with a rocket engine system modeling code. The off-design and multistage modeling capabilities of the code permit parametric design space exploration of candidate pump configurations and provide pump performance data for engine system evaluation. The PUMPA code has been integrated with the Numerical Propulsion System Simulation (NPSS) code and an expander rocket engine system has been simulated. The mean line pump flow code runs as an integral part of the NPSS rocket engine system simulation and provides key pump performance information directly to the system model at all operating conditions.
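    The record does not give PUMPA's internal equations; a mean line pump analysis conventionally rests on the Euler pump equation with a slip-factor correction, which can be sketched as follows (the stage numbers are illustrative, not from the code):

```python
import math

def wiesner_slip(beta2b_deg, z_blades):
    """Wiesner slip-factor correlation,
    sigma = 1 - sqrt(cos(beta2b)) / Z**0.7,
    with beta2b the blade backsweep angle measured from radial."""
    return 1.0 - math.sqrt(math.cos(math.radians(beta2b_deg))) / z_blades ** 0.7

def euler_head(u2, c_theta2, u1=0.0, c_theta1=0.0, g=9.81):
    """Ideal head rise [m] from the Euler pump equation:
    H = (U2*Ctheta2 - U1*Ctheta1) / g."""
    return (u2 * c_theta2 - u1 * c_theta1) / g

# Illustrative centrifugal stage: 200 m/s tip speed, 30 m/s meridional
# velocity, 30 deg backsweep, 6 blades, no inlet swirl.
u2, cm2 = 200.0, 30.0
sigma = wiesner_slip(30.0, 6)
c_theta2 = sigma * u2 - cm2 * math.tan(math.radians(30.0))  # exit velocity triangle
H = euler_head(u2, c_theta2)
print(f"slip factor {sigma:.3f}, ideal head {H:.0f} m")
```

An off-design model like the one described would then subtract loss correlations (incidence, diffusion, friction) from this ideal head to build the performance map handed to the engine system code.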

  7. LLNL Partners with IBM on Brain-Like Computing Chip

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Essen, Brian

    Lawrence Livermore National Laboratory (LLNL) will receive a first-of-a-kind brain-inspired supercomputing platform for deep learning developed by IBM Research. Based on a breakthrough neurosynaptic computer chip called IBM TrueNorth, the scalable platform will process the equivalent of 16 million neurons and 4 billion synapses and consume the energy equivalent of a hearing aid battery – a mere 2.5 watts of power. The brain-like, neural network design of the IBM Neuromorphic System is able to infer complex cognitive tasks such as pattern recognition and integrated sensory processing far more efficiently than conventional chips.

  8. LLNL Partners with IBM on Brain-Like Computing Chip

    ScienceCinema

    Van Essen, Brian

    2018-06-25

    Lawrence Livermore National Laboratory (LLNL) will receive a first-of-a-kind brain-inspired supercomputing platform for deep learning developed by IBM Research. Based on a breakthrough neurosynaptic computer chip called IBM TrueNorth, the scalable platform will process the equivalent of 16 million neurons and 4 billion synapses and consume the energy equivalent of a hearing aid battery – a mere 2.5 watts of power. The brain-like, neural network design of the IBM Neuromorphic System is able to infer complex cognitive tasks such as pattern recognition and integrated sensory processing far more efficiently than conventional chips.

  9. Final Report A Multi-Language Environment For Programmable Code Optimization and Empirical Tuning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yi, Qing; Whaley, Richard Clint; Qasem, Apan

    This report summarizes our effort and results in building an integrated optimization environment to effectively combine the programmable control and the empirical tuning of source-to-source compiler optimizations within the framework of multiple existing languages, specifically C, C++, and Fortran. The environment contains two main components: the ROSE analysis engine, which is based on the ROSE C/C++/Fortran2003 source-to-source compiler developed by Co-PI Dr. Quinlan et al. at DOE/LLNL, and the POET transformation engine, which is based on an interpreted program transformation language developed by Dr. Yi at the University of Texas at San Antonio (UTSA). The ROSE analysis engine performs advanced compiler analysis, identifies profitable code transformations, and then produces output in POET, a language designed to provide programmable control of compiler optimizations to application developers and to support the parameterization of architecture-sensitive optimizations so that their configurations can be empirically tuned later. This POET output can then be ported to different machines together with the user application, where a POET-based search engine empirically reconfigures the parameterized optimizations until satisfactory performance is found. Computational specialists can write POET scripts to directly control the optimization of their code. Application developers can interact with ROSE to obtain optimization feedback as well as provide domain-specific knowledge and high-level optimization strategies. The optimization environment is expected to support different levels of automation and programmer intervention, from fully automated tuning to semi-automated development and to manual programmable control.
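    The empirical-tuning loop described above (parameterize, measure, reconfigure) can be illustrated with a toy autotuner. The real system transforms C/C++/Fortran source through POET; this sketch merely times a blocking parameter of a Python matrix transpose, so the parameter and candidates are purely illustrative:

```python
import random
import time

def transpose_blocked(a, n, bs):
    """Cache-blocked transpose; bs is the tunable blocking parameter."""
    out = [[0.0] * n for _ in range(n)]
    for ii in range(0, n, bs):
        for jj in range(0, n, bs):
            for i in range(ii, min(ii + bs, n)):
                for j in range(jj, min(jj + bs, n)):
                    out[j][i] = a[i][j]
    return out

def autotune(n=128, candidates=(8, 16, 32, 64)):
    """Time each candidate parameter empirically and keep the fastest,
    mirroring the search-engine step of the tuning pipeline."""
    a = [[random.random() for _ in range(n)] for _ in range(n)]
    timings = {}
    for bs in candidates:
        t0 = time.perf_counter()
        transpose_blocked(a, n, bs)
        timings[bs] = time.perf_counter() - t0
    return min(timings, key=timings.get)

best = autotune()
print(f"best block size: {best}")
```

The winning parameter depends on the machine, which is exactly why the environment re-runs this search after porting the parameterized code.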

  10. Three-dimensional Monte-Carlo simulation of gamma-ray scattering and production in the atmosphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, D.J.

    1989-05-15

    Monte Carlo codes have been developed to simulate gamma-ray scattering and production in the atmosphere. The scattering code simulates interactions of low-energy gamma rays (20 to several hundred keV) from an astronomical point source in the atmosphere; a modified code also simulates scattering in a spacecraft. Four incident spectra, typical of gamma-ray bursts, solar flares, and the Crab pulsar, and 511 keV line radiation have been studied. These simulations are consistent with observations of solar flare radiation scattered from the atmosphere. The production code simulates the interactions of cosmic rays which produce high-energy (above 10 MeV) photons and electrons. It has been used to calculate gamma-ray and electron albedo intensities at Palestine, Texas and at the equator; the results agree with observations in most respects. With minor modifications this code can be used to calculate intensities of other high-energy particles. Both codes are fully three-dimensional, incorporating a curved atmosphere; the production code also incorporates the variation with both zenith and azimuth of the incident cosmic-ray intensity due to geomagnetic effects. These effects are clearly reflected in the calculated albedo by intensity contrasts between the horizon and nadir, and between the east and west horizons.
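    As a heavily simplified analogue of such a scattering code, a Monte Carlo random walk through a uniform, isotropically scattering slab shows the basic sample-path/scatter/score loop; real Compton physics, the curved atmosphere, and geomagnetic effects are all omitted, and every parameter below is illustrative:

```python
import math
import random

random.seed(1)

def slab_albedo(tau_total=2.0, omega=0.9, n=20000):
    """Fraction of photons reflected from a slab of optical depth tau_total.

    omega is the single-scattering albedo (scatter probability per
    interaction); scattering is isotropic, standing in for the real
    Compton kernel of the atmospheric code.
    """
    reflected = 0
    for _ in range(n):
        t, mu = 0.0, 1.0                     # enter at the top, heading down
        while True:
            t += mu * -math.log(1.0 - random.random())  # exponential free path
            if t <= 0.0:
                reflected += 1               # escaped back out of the top
                break
            if t >= tau_total:
                break                        # transmitted out of the bottom
            if random.random() > omega:
                break                        # absorbed
            mu = 2.0 * random.random() - 1.0  # isotropic scattering direction
    return reflected / n

albedo = slab_albedo()
print(f"reflected fraction: {albedo:.3f}")
```

The production code tallies in the same way but with energy-dependent cross sections and a three-dimensional, curved geometry.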

  11. Development of MCNPX-ESUT computer code for simulation of neutron/gamma pulse height distribution

    NASA Astrophysics Data System (ADS)

    Abolfazl Hosseini, Seyed; Vosoughi, Naser; Zangian, Mehdi

    2015-05-01

    In this paper, the development of the MCNPX-ESUT (MCNPX-Energy Engineering of Sharif University of Technology) computer code for simulation of neutron/gamma pulse height distributions is reported. Since liquid organic scintillators like NE-213 are well suited and routinely used for spectrometry in mixed neutron/gamma fields, this type of detector was selected for simulation in the present study. The proposed simulation algorithm includes four main steps. The first step is the modeling of neutron/gamma particle transport and interactions with the materials in the environment and detector volume. In the second step, the number of scintillation photons produced by charged particles such as electrons, alphas, protons and carbon nuclei in the scintillator material is calculated. In the third step, the transport of scintillation photons in the scintillator and light guide is simulated. Finally, the experimental resolution is applied in the last step of the simulation. Unlike similar computer codes such as SCINFUL, NRESP7 and PHRESP, the developed code is applicable to both neutron and gamma sources; hence, discrimination of neutrons and gammas in mixed fields may be performed using MCNPX-ESUT. The main feature of the MCNPX-ESUT computer code is that the pulse height simulation may be performed without any post-processing. In the present study, the pulse height distributions due to monoenergetic neutron/gamma sources in an NE-213 detector are simulated using MCNPX-ESUT. The simulated neutron pulse height distributions are validated by comparison with experimental data (Gohil et al., Nucl. Instrum. Methods Phys. Res. A 664 (2012) 304-309) and with results from similar computer codes such as SCINFUL, NRESP7 and Geant4. The simulated gamma pulse height distribution for a 137Cs source is also compared with experimental data.
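    Steps two and four of such an algorithm (light generation and resolution broadening) can be sketched as follows; the proton light-output curve and resolution coefficients are invented placeholders, not NE-213 data from the code:

```python
import math
import random

random.seed(0)

def light_output(e_mev, particle):
    """Scintillation light in MeVee: linear for electrons, quenched for
    heavier particles. The proton curve is an invented placeholder."""
    if particle == "electron":
        return e_mev
    return 0.215 * e_mev ** 1.5   # hypothetical power-law quenching

def broaden(light, a=0.10, b=0.11, c=0.01):
    """Apply Gaussian resolution with FWHM/L = sqrt(a^2 + b^2/L + c^2/L^2),
    a common parameterization; the coefficients here are placeholders."""
    fwhm = light * math.sqrt(a * a + b * b / light + c * c / (light * light))
    return random.gauss(light, fwhm / 2.355)   # sigma = FWHM / 2.355

# pulse heights from 10,000 simulated 1 MeV electron events
events = [broaden(light_output(1.0, "electron")) for _ in range(10000)]
mean = sum(events) / len(events)
print(f"mean pulse height: {mean:.3f} MeVee")
```

Histogramming the broadened events yields the simulated pulse height distribution that is then compared against measurement.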

  12. Object-oriented approach for gas turbine engine simulation

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.; Felder, James L.

    1995-01-01

    An object-oriented gas turbine engine simulation program was developed. This program is a prototype for a more complete, commercial-grade engine performance program now being proposed as part of the Numerical Propulsion System Simulator (NPSS). This report discusses architectural issues of this complex software system and the lessons learned from developing the prototype code. The prototype code is a fully functional, general-purpose engine simulation program; however, only the component models necessary to model a transient compressor test rig have been written. The production system will be capable of steady-state and transient modeling of almost any turbine engine configuration. Chief among the architectural considerations for this code was the framework in which the various software modules interact. These modules include the equation solver, simulation code, data model, event handler, and user interface. Also documented in this report are the component-based design of the simulation module and the inter-component communication paradigm. Object class hierarchies for some of the code modules are given.
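
    The component-based design described above can be sketched as a shared base interface plus a driver that chains components along the flow path. The class and method names here are hypothetical illustrations, not the NPSS prototype's actual API.

    ```python
    # Illustrative component-based engine simulation: every component implements
    # a common evaluate() interface and the driver passes flow state through them.

    class Component:
        """Base interface shared by all engine components."""
        def evaluate(self, state):
            raise NotImplementedError

    class Compressor(Component):
        def __init__(self, pressure_ratio):
            self.pressure_ratio = pressure_ratio
        def evaluate(self, state):
            state["p"] *= self.pressure_ratio   # toy compression model
            return state

    class Duct(Component):
        def __init__(self, loss):
            self.loss = loss
        def evaluate(self, state):
            state["p"] *= (1.0 - self.loss)     # toy pressure loss
            return state

    # Chain components the way the flow path connects them.
    engine = [Compressor(8.0), Duct(0.02)]
    state = {"p": 101.3}  # inlet pressure, kPa
    for component in engine:
        state = component.evaluate(state)
    assert abs(state["p"] - 101.3 * 8.0 * 0.98) < 1e-9
    ```

    The benefit of this structure, as the report argues, is that new component models can be added without touching the solver or the driver loop.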

  13. Two-dimensional implosion simulations with a kinetic particle code [2D implosion simulations with a kinetic particle code]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sagert, Irina; Even, Wesley Paul; Strother, Terrance Timothy

    Here, we perform two-dimensional implosion simulations using a Monte Carlo kinetic particle code. The application of a kinetic transport code is motivated, in part, by the occurrence of nonequilibrium effects in inertial confinement fusion capsule implosions, which cannot be fully captured by hydrodynamic simulations. Kinetic methods, on the other hand, are able to describe both continuum and rarefied flows. We perform simple two-dimensional disk implosion simulations using one particle species and compare the results to simulations with the hydrodynamics code RAGE. The impact of the particle mean free path on the implosion is also explored. In a second study, we focus on the formation of fluid instabilities from induced perturbations. We find good agreement with hydrodynamic studies regarding the location of the shock and the implosion dynamics. Differences are found in the evolution of fluid instabilities, originating from the higher resolution of RAGE and statistical noise in the kinetic studies.
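
    The kinetic approach in this record transports particles in straight free flights whose lengths are drawn from an exponential distribution set by the local mean free path. The snippet below shows this generic Monte Carlo sampling step; it is a textbook illustration, not code from the paper.

    ```python
    import math, random

    # Generic free-flight sampling for Monte Carlo kinetic transport: the
    # distance to the next collision is exponentially distributed with mean
    # equal to the mean free path.

    def free_flight(mean_free_path, rng=random.random):
        """Sample the distance to the next collision (same units as the input)."""
        # 1 - rng() lies in (0, 1], so the log is always finite.
        return -mean_free_path * math.log(1.0 - rng())

    random.seed(1)
    n = 100000
    mean = sum(free_flight(0.1) for _ in range(n)) / n
    # The sample mean converges to the mean free path.
    assert abs(mean - 0.1) < 0.01
    ```

    Shrinking the mean free path drives the simulation toward the continuum (hydrodynamic) limit, which is how such a code can probe both regimes mentioned in the abstract.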

  14. Two-dimensional implosion simulations with a kinetic particle code [2D implosion simulations with a kinetic particle code]

    DOE PAGES

    Sagert, Irina; Even, Wesley Paul; Strother, Terrance Timothy

    2017-05-17

    Here, we perform two-dimensional implosion simulations using a Monte Carlo kinetic particle code. The application of a kinetic transport code is motivated, in part, by the occurrence of nonequilibrium effects in inertial confinement fusion capsule implosions, which cannot be fully captured by hydrodynamic simulations. Kinetic methods, on the other hand, are able to describe both continuum and rarefied flows. We perform simple two-dimensional disk implosion simulations using one particle species and compare the results to simulations with the hydrodynamics code RAGE. The impact of the particle mean free path on the implosion is also explored. In a second study, we focus on the formation of fluid instabilities from induced perturbations. We find good agreement with hydrodynamic studies regarding the location of the shock and the implosion dynamics. Differences are found in the evolution of fluid instabilities, originating from the higher resolution of RAGE and statistical noise in the kinetic studies.

  15. Toward a first-principles integrated simulation of tokamak edge plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, C S; Klasky, Scott A; Cummings, Julian

    2008-01-01

    Performance of ITER is anticipated to be highly sensitive to the edge plasma condition. The edge pedestal in ITER needs to be predicted from an integrated simulation of the necessary first-principles, multi-scale physics codes. The mission of the SciDAC Fusion Simulation Project (FSP) Prototype Center for Plasma Edge Simulation (CPES) is to deliver such a code integration framework by (1) building new kinetic codes XGC0 and XGC1, which can simulate the edge pedestal buildup; (2) using and improving the existing MHD codes ELITE, M3D-OMP, M3D-MPP and NIMROD for study of large-scale edge instabilities called Edge Localized Modes (ELMs); and (3) integrating the codes into a framework using cutting-edge computer science technology. Collaborative effort among physics, computer science, and applied mathematics within CPES has created the first working version of the End-to-end Framework for Fusion Integrated Simulation (EFFIS), which can be used to study the pedestal-ELM cycles.

  16. OSIRIS - an object-oriented parallel 3D PIC code for modeling laser and particle beam-plasma interaction

    NASA Astrophysics Data System (ADS)

    Hemker, Roy

    1999-11-01

    Advances in computational speed now make it possible to do full 3D PIC simulations of laser-plasma and beam-plasma interactions, but at the same time the increased complexity of these problems makes it necessary to apply modern approaches like object-oriented programming to the development of simulation codes. We report here on our progress in developing an object-oriented parallel 3D PIC code using Fortran 90. In its current state the code contains algorithms for 1D, 2D, and 3D simulations in Cartesian coordinates and for 2D cylindrically symmetric geometry. For all of these algorithms the code allows for a moving simulation window and arbitrary domain decomposition for any number of dimensions. Recent 3D simulation results on the propagation of intense laser and electron beams through plasmas will be presented.

  17. 5D Tempest simulations of kinetic edge turbulence

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.; Xiong, Z.; Cohen, B. I.; Cohen, R. H.; Dorr, M. R.; Hittinger, J. A.; Kerbel, G. D.; Nevins, W. M.; Rognlien, T. D.; Umansky, M. V.; Qin, H.

    2006-10-01

    Results are presented from the development and application of TEMPEST, a nonlinear five-dimensional (3d2v) gyrokinetic continuum code. The simulation results and theoretical analysis include studies of H-mode edge plasma neoclassical transport and turbulence in real divertor geometry and its relationship to plasma flow generation with zero external momentum input, including the important orbit-squeezing effect due to the large electric field flow-shear in the edge. In order to extend the code to 5D, we have formulated a set of fully nonlinear electrostatic gyrokinetic equations and a fully nonlinear gyrokinetic Poisson's equation which is valid for both neoclassical and turbulence simulations. Our 5D gyrokinetic code is built on the 4D version of the Tempest neoclassical code, extended to a fifth dimension in the binormal direction. The code is able to simulate either a full torus or a toroidal segment. Progress on performing 5D turbulence simulations will be reported.

  18. Quality improvement utilizing in-situ simulation for a dual-hospital pediatric code response team.

    PubMed

    Yager, Phoebe; Collins, Corey; Blais, Carlene; O'Connor, Kathy; Donovan, Patricia; Martinez, Maureen; Cummings, Brian; Hartnick, Christopher; Noviski, Natan

    2016-09-01

    Given the rarity of in-hospital pediatric emergency events, identification of gaps and inefficiencies in the code response can be difficult. In-situ, simulation-based medical education programs can identify unrecognized systems-based challenges. We hypothesized that developing an in-situ, simulation-based pediatric emergency response program would identify latent inefficiencies in a complex, dual-hospital pediatric code response system and allow rapid intervention testing to improve performance before implementation at an institutional level. Pediatric leadership from two hospitals with a shared pediatric code response team employed the Institute for Healthcare Improvement's (IHI) Breakthrough Model for Collaborative Improvement to design a program consisting of Plan-Do-Study-Act cycles occurring in a simulated environment. The objectives of the program were to 1) identify inefficiencies in our pediatric code response; 2) correlate to current workflow; 3) employ an iterative process to test quality improvement interventions in a safe environment; and 4) measure performance before actual implementation at the institutional level. Twelve dual-hospital, in-situ, simulated, pediatric emergencies occurred over one year. The initial simulated event allowed identification of inefficiencies including delayed provider response, delayed initiation of cardiopulmonary resuscitation (CPR), and delayed vascular access. These gaps were linked to process issues including unreliable code pager activation, slow elevator response, and lack of responder familiarity with layout and contents of code cart. From first to last simulation with multiple simulated process improvements, code response time for secondary providers coming from the second hospital decreased from 29 to 7 min, time to CPR initiation decreased from 90 to 15 s, and vascular access obtainment decreased from 15 to 3 min. 
Some of these simulated process improvements were adopted into the institutional response while others continue to be trended over time for evidence that observed changes represent a true new state of control. Utilizing the IHI's Breakthrough Model, we developed a simulation-based program to 1) successfully identify gaps and inefficiencies in a complex, dual-hospital, pediatric code response system and 2) provide an environment in which to safely test quality improvement interventions before institutional dissemination. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  19. COCOA code for creating mock observations of star cluster models

    NASA Astrophysics Data System (ADS)

    Askar, Abbas; Giersz, Mirek; Pych, Wojciech; Dalessandro, Emanuele

    2018-04-01

    We introduce and present results from the COCOA (Cluster simulatiOn Comparison with ObservAtions) code that has been developed to create idealized mock photometric observations using results from numerical simulations of star cluster evolution. COCOA is able to present the output of realistic numerical simulations of star clusters carried out using Monte Carlo or N-body codes in a way that is useful for direct comparison with photometric observations. In this paper, we describe the COCOA code and demonstrate its different applications by utilizing globular cluster (GC) models simulated with the MOCCA (MOnte Carlo Cluster simulAtor) code. COCOA is used to synthetically observe these different GC models with optical telescopes, perform point spread function photometry, and subsequently produce observed colour-magnitude diagrams. We also use COCOA to compare the results from synthetic observations of a cluster model that has the same age and metallicity as the Galactic GC NGC 2808 with observations of the same cluster carried out with a 2.2 m optical telescope. We find that COCOA can effectively simulate realistic observations and recover photometric data. COCOA has numerous scientific applications that may be helpful for both theoreticians and observers who work on star clusters. Plans for further improving and developing the code are also discussed in this paper.
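
    The core of a mock-observation step like COCOA's is projecting simulated stars onto a pixel grid and spreading each star's flux with a point-spread function. The toy version below uses a Gaussian PSF; the grid size, PSF width, and magnitude zero point are arbitrary assumptions, not COCOA parameters.

    ```python
    import math

    # Toy mock photometric image: convert magnitudes to fluxes and deposit each
    # star on a pixel grid through a normalized Gaussian PSF.

    def mock_image(stars, size=32, fwhm=2.0, zeropoint=25.0):
        """stars: list of (x, y, magnitude) tuples; returns a size x size image."""
        sigma = fwhm / 2.3548  # FWHM -> Gaussian sigma
        image = [[0.0] * size for _ in range(size)]
        for x0, y0, mag in stars:
            flux = 10 ** (-0.4 * (mag - zeropoint))  # counts from magnitude
            for y in range(size):
                for x in range(size):
                    r2 = (x - x0) ** 2 + (y - y0) ** 2
                    image[y][x] += flux * math.exp(-r2 / (2 * sigma ** 2)) \
                                   / (2 * math.pi * sigma ** 2)
        return image

    img = mock_image([(16.0, 16.0, 20.0), (8.0, 8.0, 22.0)])
    # The brighter (lower-magnitude) star dominates its central pixel.
    assert img[16][16] > img[8][8] > 0.0
    ```

    A real pipeline would add sky background, noise, and detector sampling before running PSF photometry on the result, as the abstract describes.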

  20. Simulated meteorite impacts and volcanic explosions: Ejecta analyses and planetary implications

    NASA Technical Reports Server (NTRS)

    Gratz, A. J.; Nellis, W. J.

    1992-01-01

    Past cratering studies have focused primarily on crater morphology. However, important questions remain about the nature of crater deposits. Phenomena that need to be studied include the distribution of shock effects in crater deposits and crater walls; the origin of mono- and polymict breccia; differences between local and distal ejecta; deformation induced by explosive volcanism; and the production of unshocked, high-speed ejecta that could form the lunar and martian meteorites found on the Earth. To study these phenomena, one must characterize ejecta and crater wall materials from impacts produced under controlled conditions. New efforts at LLNL simulate impacts and volcanism and study resultant deformation. All experiments use the two-stage light-gas gun facility at LLNL to accelerate projectiles to velocities of 0.2 to 4.3 km/s, inducing shock pressures of 0.9 to 50 GPa. We use granite targets and novel experimental geometries to unravel cratering processes in crystalline rocks. We have thus far conducted three types of simulations: soft recovery of ejecta, 'frozen crater' experiments, and an 'artificial volcano'. Our ejecta recovery experiments produced a useful separation of impactites. Material originally below the projectile remained trapped there, embedded in the soft metal of the flyer plate. In contrast, material directly adjacent to the projectile was jetted away from the impact, producing an ejecta cone that was trapped in the foam recovery fixture. We find that a significant component of crater ejecta shows no signs of strong shock; this material comes from the near-surface 'interference zone' surrounding the impact site. This phenomenon explains the existence on Earth of unshocked meteorites of lunar and martian origin. Impact of a large bolide on neighboring planets will produce high-speed, weakly shocked ejecta, which may be trapped by the Earth's gravitational field. 
'Frozen crater' experiments show that the interference zone is highly localized; indeed, disaggregation does not extend beyond approx. 1.5 crater radii. A cone-shaped region extending downward from the impact site is completely disaggregated, including powdered rock that escaped into the projectile tube. Petrographic analysis of crater ejecta and wall material will be presented. Finally, study of ejecta from 0.9- and 1.3-GPa simulations of volcanic explosions reveals a complete lack of shock metamorphism. The ejecta shows no evidence of PDFs, amorphization, high-pressure phases, or mosaicism. Instead, all deformation was brittle, with irregular (not planar) and mostly intergranular fractures. The extent of fracturing was remarkable, with the entire sample reduced to fragments of gravel size and smaller.

  1. Simulated meteorite impacts and volcanic explosions: Ejecta analyses and planetary implications

    NASA Astrophysics Data System (ADS)

    Gratz, A. J.; Nellis, W. J.

    1992-09-01

    Past cratering studies have focused primarily on crater morphology. However, important questions remain about the nature of crater deposits. Phenomena that need to be studied include the distribution of shock effects in crater deposits and crater walls; the origin of mono- and polymict breccia; differences between local and distal ejecta; deformation induced by explosive volcanism; and the production of unshocked, high-speed ejecta that could form the lunar and martian meteorites found on the Earth. To study these phenomena, one must characterize ejecta and crater wall materials from impacts produced under controlled conditions. New efforts at LLNL simulate impacts and volcanism and study resultant deformation. All experiments use the two-stage light-gas gun facility at LLNL to accelerate projectiles to velocities of 0.2 to 4.3 km/s, inducing shock pressures of 0.9 to 50 GPa. We use granite targets and novel experimental geometries to unravel cratering processes in crystalline rocks. We have thus far conducted three types of simulations: soft recovery of ejecta, 'frozen crater' experiments, and an 'artificial volcano'. Our ejecta recovery experiments produced a useful separation of impactites. Material originally below the projectile remained trapped there, embedded in the soft metal of the flyer plate. In contrast, material directly adjacent to the projectile was jetted away from the impact, producing an ejecta cone that was trapped in the foam recovery fixture. We find that a significant component of crater ejecta shows no signs of strong shock; this material comes from the near-surface 'interference zone' surrounding the impact site. This phenomenon explains the existence on Earth of unshocked meteorites of lunar and martian origin. Impact of a large bolide on neighboring planets will produce high-speed, weakly shocked ejecta, which may be trapped by the Earth's gravitational field. 
'Frozen crater' experiments show that the interference zone is highly localized; indeed, disaggregation does not extend beyond approx. 1.5 crater radii. A cone-shaped region extending downward from the impact site is completely disaggregated, including powdered rock that escaped into the projectile tube. Petrographic analysis of crater ejecta and wall material will be presented. Finally, study of ejecta from 0.9- and 1.3-GPa simulations of volcanic explosions reveals a complete lack of shock metamorphism. The ejecta shows no evidence of PDFs, amorphization, high-pressure phases, or mosaicism.

  2. Simulation of Weld Mechanical Behavior to Include Welding-Induced Residual Stress and Distortion: Coupling of SYSWELD and Abaqus Codes

    DTIC Science & Technology

    2015-11-01

    A method for translating welding-induced residual stresses and distortions from weld simulations in the SYSWELD software code into structural Finite Element Analysis (FEA) simulations performed in the Abaqus FEA code is presented. The translation of these results is accomplished using a newly developed Python script.

  3. Rollback of an intraoceanic subduction system and termination against a continental margin

    NASA Astrophysics Data System (ADS)

    Campbell, S. M.; Simmons, N. A.; Moucha, R.

    2017-12-01

    The Southeast Indian Slab (SEIS) seismic anomaly has been suggested to represent a Tethyan intraoceanic subduction system which operated during the Jurassic until its termination at or near the margin of East Gondwana (Simmons et al., 2015). As plate reconstructions suggest the downgoing plate remained coupled to the continental margin, this long-lived system likely experienced a significant amount of slab rollback and trench migration (up to 6000 km). Using a 2D thermomechanical numerical code that includes the effects of phase transitions, we test this interpretation by modeling the long-term subduction, transition zone stagnation, and rollback of an intraoceanic subduction system in which the downgoing plate remains coupled to a continental margin. In addition, we also investigate the termination style of such a system, with a particular focus on the potential for some continental subduction beneath an overriding oceanic plate. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LLNL-ABS-735738

  4. SCANS (Shipping Cask ANalysis System) a microcomputer-based analysis system for shipping cask design review: User's manual to Version 3a. Volume 1, Revision 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mok, G.C.; Thomas, G.R.; Gerhard, M.A.

    SCANS (Shipping Cask ANalysis System) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for evaluating safety analysis reports on spent fuel shipping casks. SCANS is an easy-to-use system that calculates the global response to impact loads, pressure loads and thermal conditions, providing reviewers with an independent check on analyses submitted by licensees. SCANS is based on microcomputers compatible with the IBM-PC family of computers. The system is composed of a series of menus, input programs, cask analysis programs, and output display programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests. Analysis options are based on regulatory cases described in the Code of Federal Regulations 10 CFR 71 and Regulatory Guides published by the US Nuclear Regulatory Commission in 1977 and 1978.

  5. Divertor extreme ultraviolet (EUV) survey spectroscopy in DIII-D

    NASA Astrophysics Data System (ADS)

    McLean, Adam; Allen, Steve; Ellis, Ron; Jarvinen, Aaro; Soukhanovskii, Vlad; Boivin, Rejean; Gonzales, Eduardo; Holmes, Ian; Kulchar, James; Leonard, Anthony; Williams, Bob; Taussig, Doug; Thomas, Dan; Marcy, Grant

    2017-10-01

    An extreme ultraviolet spectrograph measuring resonant emissions of D and C in the lower divertor has been added to DIII-D to help resolve a 2X discrepancy between bolometrically measured radiated power and that predicted by boundary codes for DIII-D, JET and ASDEX-U. With 290 and 450 gr/mm gratings, the DivSPRED spectrometer, a 0.3 m flat-field McPherson model 251, measures ground-state transitions for D (the Lyman series) and C (e.g., C IV, 155 nm) which account for >75% of radiated power in the divertor. Combined with Thomson scattering and imaging in the DIII-D divertor, measurements of position, temperature and fractional power emission from plasma components are made and compared to UEDGE/SOLPS-ITER. Mechanical, optical, electrical, vacuum, and shielding aspects of DivSPRED are presented. Work supported under USDOE Cooperative Agreement DE-FC02-04ER54698 and DE-AC52-07NA27344, and by the LLNL Laboratory Directed R&D Program, project #17-ERD-020.

  6. Numerical simulation of experiments in the Giant Planet Facility

    NASA Technical Reports Server (NTRS)

    Green, M. J.; Davy, W. C.

    1979-01-01

    Utilizing a series of existing computer codes, ablation experiments in the Giant Planet Facility are numerically simulated. Of primary importance is the simulation of the low Mach number shock layer that envelops the test model. The RASLE shock-layer code, used in the Jupiter entry probe heat-shield design, is adapted to the experimental conditions. RASLE predictions for radiative and convective heat fluxes are in good agreement with calorimeter measurements. In simulating carbonaceous ablation experiments, the RASLE code is coupled directly with the CMA material response code. For the graphite models, predicted and measured recessions agree very well. Predicted recession for the carbon phenolic models is 50% higher than that measured. This is the first time codes used for the Jupiter probe design have been compared with experiments.

  7. Institute of Geophyics and Planetary Physics. Annual report for FY 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryerson, F.J.

    1995-09-29

    The Institute of Geophysics and Planetary Physics (IGPP) is a Multicampus Research Unit of the University of California (UC). IGPP was founded in 1946 at UC Los Angeles with a charter to further research in the earth and planetary sciences and in related fields. The Institute now has branches at UC campuses in Los Angeles, San Diego, Riverside, and Irvine and at Los Alamos and Lawrence Livermore national laboratories. The University-wide IGPP has played an important role in establishing interdisciplinary research in the earth and planetary sciences. For example, IGPP was instrumental in founding the fields of physical oceanography and space physics, which at the time fell between the cracks of established university departments. Because of its multicampus orientation, IGPP has sponsored important interinstitutional consortia in the earth and planetary sciences. Each of the six branches has a somewhat different intellectual emphasis as a result of the interplay between strengths of campus departments and Laboratory programs. The IGPP branch at Lawrence Livermore National Laboratory (LLNL) was approved by the Regents of the University of California in 1982. IGPP-LLNL emphasizes research in seismology, geochemistry, cosmochemistry, high-pressure sciences, and astrophysics. It provides a venue for studying the fundamental aspects of these fields, thereby complementing LLNL programs that pursue applications of these disciplines in national security and energy research. IGPP-LLNL is directed by Charles Alcock and is structured around three research centers. The Center for Geosciences, headed by George Zandt and Frederick Ryerson, focuses on research in geophysics and geochemistry. The Center for High-Pressure Sciences, headed by William Nellis, sponsors research on the properties of planetary materials and on the synthesis and preparation of new materials using high-pressure processing.

  8. Classical and quantum simulations of warm dense carbon

    NASA Astrophysics Data System (ADS)

    Whitley, Heather; Sanchez, David; Hamel, Sebastien; Correa, Alfredo; Benedict, Lorin

    We have applied classical and DFT-based molecular dynamics (MD) simulations to study the equation of state of carbon in the warm dense matter regime (ρ = 3.7 g/cc, 0.86 eV

  9. Proton deflectometry characterization of Biermann-Battery field advection

    NASA Astrophysics Data System (ADS)

    Pollock, Bradley; Moore, Alastair; Meezan, Nathan; Eder, Dave; Kane, Jave; Strozzi, David; Wilks, Scott; Rinderknecht, Hans; Zylstra, Alex; Fujioka, Shinsuke; Kemp, Gregory; Moody, John

    2017-10-01

    Laser-foil interactions are well known to produce azimuthal magnetic fields around the laser spot due to the orthogonal density and temperature gradients that develop near the foil surface (the Biermann-Battery effect). Simulations show that these fields are produced inside hohlraums used for indirect drive experiments at the National Ignition Facility (NIF); however, modeling these fields and their advection is very computationally expensive on the temporal and spatial scales relevant for typical NIF hohlraum experiments (~10 ns, a few mm). The hohlraum geometry also makes directly probing the fields somewhat challenging, limiting the available experimental data on these fields under NIF conditions. In particular, the relative contributions of frozen-in and Nernst advection of the field away from the hohlraum wall are not currently well understood. We have developed a new target platform for direct measurements of the field topology in a NIF-relevant configuration. Using a single cone of NIF, a 2.5 mm long, 5.4 mm diameter Au ring is illuminated with a beam geometry similar to that of one ring of beams in a full-scale hohlraum experiment. The ring target has no end caps, providing a clear line of sight for probing through the ring. A D3He-filled exploding pusher placed 5 cm below the ring is illuminated by an additional 60 beams of NIF to produce protons, some of which propagate through the ring. Work was performed under the auspices of the U.S. Department of Energy by LLNL under Contract DE-AC52-07NA27344 and under LDRD support from LLNL.

  10. Accurate Wavelength Measurements and Modeling of Fe XV to Fe XIX Spectra Recorded in High-Density Plasmas between 13.5 and 17 Å

    NASA Astrophysics Data System (ADS)

    May, M. J.; Beiersdorfer, P.; Dunn, J.; Jordan, N.; Hansen, S. B.; Osterheld, A. L.; Faenov, A. Ya.; Pikuz, T. A.; Skobelev, I. Yu.; Flora, F.; Bollanti, S.; Di Lazzaro, P.; Murra, D.; Reale, A.; Reale, L.; Tomassetti, G.; Ritucci, A.; Francucci, M.; Martellucci, S.; Petrocelli, G.

    2005-06-01

    Iron spectra have been recorded from plasmas created at three different laser plasma facilities: the Tor Vergata University laser in Rome (Italy), the Hercules laser at ENEA in Frascati (Italy), and the Compact Multipulse Terawatt (COMET) laser at LLNL in California (USA). The measurements provide a means of identifying dielectronic satellite lines from Fe XVI and Fe XV in the vicinity of the strong 2p-->3d transitions of Fe XVII. About 80 Δn>=1 lines of Fe XV (Mg-like) to Fe XIX (O-like) were recorded between 13.8 and 17.1 Å with a high spectral resolution (λ/Δλ~4000); about 30 of these lines are from Fe XVI and Fe XV. The laser-produced plasmas had electron temperatures between 100 and 500 eV and electron densities between 10^20 and 10^22 cm^-3. The Hebrew University Lawrence Livermore Atomic Code (HULLAC) was used to calculate the atomic structure and atomic rates for Fe XV-XIX. HULLAC was used to calculate synthetic line intensities at Te=200 eV and ne=10^21 cm^-3 for three different conditions to illustrate the role of opacity: optically thin plasmas with no excitation-autoionization/dielectronic recombination (EA/DR) contributions to the line intensities, optically thin plasmas that included EA/DR contributions to the line intensities, and optically thick plasmas (optical depth ~200 μm) that included EA/DR contributions to the line intensities. The optically thick simulation best reproduced the recorded spectrum from the Hercules laser. However, some discrepancies between the modeling and the recorded spectra remain.

  11. Accurate wavelength measurements and modeling of FeXV to FeXIX spectra recorded in high density plasmas between 13.5 to 17 A.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    May, M; Beiersdorfer, P; Dunn, J

    Iron spectra have been recorded from plasmas created at three different laser plasma facilities, the Tor Vergata University laser in Rome (Italy), the Hercules laser at ENEA in Frascati (Italy), and the Compact Multipulse Terawatt (COMET) laser at LLNL in California (USA). The measurements provide a means of identifying dielectronic satellite lines from FeXVI and FeXV in the vicinity of the strong 2p {yields} 3d transitions of FeXVII. About 80 {Delta}n {ge} 1 lines of FeXV (Mg-like) to FeXIX (O-like) were recorded between 13.8 to 17.1 {angstrom} with a high spectral resolution ({lambda}/{Delta}{lambda} {approx} 4000); about thirty of these lines are from FeXVI and FeXV. The laser produced plasmas had electron temperatures between 100 to 500 eV and electron densities between 10{sup 20} to 10{sup 22} cm{sup -3}. The Hebrew University Lawrence Livermore Atomic Code (HULLAC) was used to calculate the atomic structure and atomic rates for FeXV to FeXIX. HULLAC was used to calculate synthetic line intensities at T{sub e} = 200 eV and n{sub e} = 10{sup 21}cm{sup -3} for three different conditions to illustrate the role of opacity: optically thin plasmas with no excitation-autoionization/dielectronic recombination (EA/DR) contributions to the line intensities, optically thin plasmas that included EA/DR contributions to the line intensities, and optically thick plasmas (optical depth {approx} 200 {micro}m) that included EA/DR contributions to the line intensities. The optically thick simulation best reproduced the recorded spectrum from the Hercules laser. However, some discrepancies between the modeling and the recorded spectra remain.
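
    The opacity effect these records model can be illustrated with a generic radiative-transfer estimate: for a homogeneous slab, an emitted line is attenuated roughly by the escape factor (1 - exp(-τ))/τ. This is a standard textbook approximation, not the HULLAC treatment itself.

    ```python
    import math

    # Escape factor for a homogeneous emitting slab of optical depth tau:
    # the fraction of emitted line photons that leave the plasma.

    def escape_factor(tau):
        if tau < 1e-8:
            return 1.0  # limit of (1 - exp(-tau))/tau as tau -> 0
        return (1.0 - math.exp(-tau)) / tau

    # Optically thin lines escape almost unattenuated...
    assert abs(escape_factor(0.01) - 1.0) < 0.01
    # ...while strong resonance lines (tau ~ 10) are heavily suppressed.
    assert escape_factor(10.0) < 0.1
    ```

    This is why the strong Fe XVII resonance lines are suppressed relative to the satellite lines in the optically thick synthetic spectra, improving the match to the Hercules data.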

  12. Computer Simulation of the VASIMR Engine

    NASA Technical Reports Server (NTRS)

    Garrison, David

    2005-01-01

    The goal of this project is to develop a magneto-hydrodynamic (MHD) computer code for simulation of the VASIMR engine. This code is designed to be easy to modify and use. We achieve this using the Cactus framework, a system originally developed for research in numerical relativity. Since its release, Cactus has become an extremely powerful and flexible open source framework. The development of the code will be done in stages, starting with a basic fluid dynamic simulation and working towards a more complex MHD code. Once developed, this code can be used by students and researchers in order to further test and improve the VASIMR engine.

  13. Aerodynamic Analysis of the M33 Projectile Using the CFX Code

    DTIC Science & Technology

    2011-12-01

    The M33 projectile has been analyzed using the ANSYS CFX code, which is based on the numerical solution of the full Navier-Stokes equations. Simulation data were obtained using the CFX code. ANSYS CFX is a commercial CFD program used to simulate fluid flow in a variety of applications, such as gas turbines.

  14. Muon simulation codes MUSIC and MUSUN for underground physics

    NASA Astrophysics Data System (ADS)

    Kudryavtsev, V. A.

    2009-03-01

    The paper describes two Monte Carlo codes dedicated to muon simulations: MUSIC (MUon SImulation Code) and MUSUN (MUon Simulations UNderground). MUSIC is a package for muon transport through matter. It is particularly useful for propagating muons through large thicknesses of rock or water, for instance from the surface down to an underground/underwater laboratory. MUSUN is designed to use the results of muon transport through rock/water to generate muons in or around an underground laboratory, taking into account their energy spectrum and angular distribution.
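
    The average energy loss underlying such muon transport is often summarized by -dE/dX = a + bE (ionization plus radiative losses), which has a closed-form solution in the continuous-slowing-down approximation. The sketch below uses typical textbook values for standard rock, not the detailed cross-section tables that MUSIC actually samples stochastically:

```python
import math

# Typical average energy-loss parameters for muons in standard rock
# (assumed round numbers, not MUSIC's internal tables):
A = 2.0e-3   # GeV cm^2/g, ionization
B = 4.0e-6   # cm^2/g, radiative (bremsstrahlung, pair production, photonuclear)

def mean_energy_after(E0_gev, depth_g_cm2, a=A, b=B):
    """CSDA solution of -dE/dX = a + b*E:
    E(X) = (E0 + a/b) * exp(-b*X) - a/b, clipped at zero (muon stops)."""
    e = (E0_gev + a / b) * math.exp(-b * depth_g_cm2) - a / b
    return max(e, 0.0)

# 1 km of standard rock (density ~2.65 g/cm^3) is ~2.65e5 g/cm^2:
print(mean_energy_after(1000.0, 2.65e5))   # surviving energy of a 1 TeV muon
```

MUSIC improves on this mean-value picture by simulating the stochastic, catastrophic nature of the radiative losses, which dominates the shape of the deep-underground muon spectrum.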

  15. Simulation Studies for Inspection of the Benchmark Test with PATRASH

    NASA Astrophysics Data System (ADS)

    Shimosaki, Y.; Igarashi, S.; Machida, S.; Shirakata, M.; Takayama, K.; Noda, F.; Shigaki, K.

    2002-12-01

    In order to delineate the halo-formation mechanisms in a typical FODO lattice, a 2-D simulation code, PATRASH (PArticle TRAcking in a Synchrotron for Halo analysis), has been developed. The electric field originating from the space charge is calculated with the Hybrid Tree code method. Benchmark tests using three simulation codes, ACCSIM, PATRASH, and SIMPSONS, were carried out; the results were found to be in fair agreement with each other. Details of the PATRASH simulation are discussed with some examples.
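
    The space-charge field that PATRASH approximates with the Hybrid Tree method can be written, at reference accuracy, as an O(N²) direct sum over particle pairs. The sketch below shows that brute-force version (the thing a tree code is built to accelerate to roughly O(N log N)); it is illustrative only, not the PATRASH implementation:

```python
import numpy as np

def direct_space_charge_field(pos, q=1.0, eps=1e-3):
    """O(N^2) direct-sum 2D space-charge field, E ~ q * r / |r|^2 per
    particle (constants dropped). The softening length eps avoids the
    singular self-force of nearly coincident particles."""
    dx = pos[:, None, 0] - pos[None, :, 0]
    dy = pos[:, None, 1] - pos[None, :, 1]
    r2 = dx**2 + dy**2 + eps**2
    np.fill_diagonal(r2, np.inf)        # drop self-interaction
    ex = q * (dx / r2).sum(axis=1)
    ey = q * (dy / r2).sum(axis=1)
    return np.column_stack([ex, ey])

rng = np.random.default_rng(0)
beam = rng.normal(size=(200, 2))        # toy Gaussian beam cross-section
field = direct_space_charge_field(beam)
print(field.shape)
```

A useful sanity check for any faster approximation is Newton's third law: the pairwise kicks cancel, so the net force on the whole beam must vanish.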

  16. Validation: Codes to compare simulation data to various observations

    NASA Astrophysics Data System (ADS)

    Cohn, J. D.

    2017-02-01

    Validation provides codes to compare several observations to simulated data with stellar mass and star formation rate: the simulated stellar mass function against observed stellar mass functions from PRIMUS or SDSS-GALEX in several redshift bins from 0.01 to 1.0, and the simulated B-band luminosity function against observations. It also creates plots for various attributes, including stellar mass functions and the stellar mass to halo mass relation. These codes can plot model predictions (in some cases alongside observational data) to test other mock catalogs.
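
    A stellar mass function comparison of the kind described reduces to a binned number density per dex of stellar mass. The sketch below computes that quantity for a synthetic mock catalog; the catalog, survey volume, and binning are all invented for illustration and are not Validation's actual inputs:

```python
import numpy as np

def stellar_mass_function(log_mstar, volume_mpc3, bins):
    """Number density per dex, Phi = N / (V * dlog10(M*)), the quantity
    one would overplot against a PRIMUS or SDSS-GALEX mass function.
    Minimal illustrative version, not the Validation code itself."""
    counts, edges = np.histogram(log_mstar, bins=bins)
    dlogm = np.diff(edges)
    phi = counts / (volume_mpc3 * dlogm)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, phi

rng = np.random.default_rng(1)
logm = rng.normal(10.5, 0.5, size=5000)   # toy mock catalog, log10(M*/Msun)
bins = np.arange(9.0, 12.1, 0.25)         # hypothetical 0.25 dex bins
m, phi = stellar_mass_function(logm, volume_mpc3=1e6, bins=bins)
print(m[phi.argmax()])                    # mass bin where Phi peaks
```

Repeating this per redshift bin, for both mock and observed catalogs, gives exactly the kind of side-by-side curves the package plots.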

  17. NASA One-Dimensional Combustor Simulation--User Manual for S1D_ML

    NASA Technical Reports Server (NTRS)

    Stueber, Thomas J.; Paxson, Daniel E.

    2014-01-01

    The work presented in this paper supports research toward a closed-loop control system to actively suppress thermo-acoustic instabilities. To serve as a model for such a closed-loop control system, a one-dimensional combustor simulation has been written using MATLAB software tools. This MATLAB-based simulation is similar to a precursor one-dimensional combustor simulation written as FORTRAN 77 source code. The previous process required modifying the FORTRAN 77 source code, compiling, and linking to create each new combustor simulation executable; the MATLAB-based simulation requires no changes to the source code, no recompiling, and no linking. Furthermore, the MATLAB-based simulation can be run from script files within the MATLAB environment, or as a compiled copy of the executable in the Command Prompt window without requiring a licensed copy of MATLAB. This report presents a general simulation overview and details on how to set up and initiate a simulation. Finally, the post-processing section describes the two types of files created while running the simulation and includes results for a default simulation included with the source code.

  18. Examining the effects of microstructure and loading on the shock initiation of HMX with mesoscale simulations

    NASA Astrophysics Data System (ADS)

    Springer, H. Keo; Tarver, Craig; Bastea, Sorin

    2015-06-01

    We perform reactive mesoscale simulations to study shock initiation in HMX over a range of pore morphologies and sizes, porosities, and loading conditions in order to improve our understanding of structure-performance relationships. These relationships are important because they guide the development of advanced macroscale models incorporating hot-spot mechanisms and the optimization of novel energetic material microstructures. Mesoscale simulations are performed using the multiphysics hydrocode ALE3D. Spherical, elliptical, polygonal, and crack-like pore geometries with sizes of 0.1, 1, 10, and 100 microns and porosities of 2, 5, 10, and 14% are explored. Loading conditions span shock pressures of 6, 10, 20, 38, and 50 GPa. A Cheetah-based tabular model, including temperature-dependent heat capacity, is used for the unreacted and product equations of state. In-line Cheetah is also used to probe chemical species evolution. The influence of microstructure and shock loading on shock-to-detonation-transition run distance, reaction rate, and product gas species evolution is discussed. This work was performed under the auspices of the U.S. DOE by LLNL under Contract DE-AC52-07NA27344 and is funded by the Joint DoD-DOE Munitions Program.
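
    Shock-to-detonation-transition run distances of the kind extracted from such simulations are conventionally summarized on a Pop plot, where log run distance is roughly linear in log shock pressure, x_run ≈ A·P^(-n). The sketch below fits that power law to hypothetical run distances (the pressures match the abstract, but the distances are invented, not LLNL data):

```python
import numpy as np

# Shock pressures from the study (GPa) and *hypothetical* run-to-detonation
# distances (mm), invented purely to illustrate the Pop-plot fit:
P = np.array([6.0, 10.0, 20.0, 38.0, 50.0])
x_run = np.array([5.0, 2.2, 0.8, 0.3, 0.2])

# Fit log10(x_run) = n * log10(P) + log10(A); the slope n is negative,
# so -n is the usual positive Pop-plot exponent.
n, logA = np.polyfit(np.log10(P), np.log10(x_run), 1)
print(-n)   # Pop-plot exponent for this toy data set
```

Changes in microstructure (pore shape, size, porosity) shift the fitted A and n, which is one compact way such parameter sweeps feed macroscale hot-spot models.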

  19. Simulations of the impact of localized defects on ICF implosions

    NASA Astrophysics Data System (ADS)

    Milovich, Jose; Robey, Harry; Weber, Christopher; Sepke, Scott; Clark, Daniel; Koning, Joe; Smalyuk, Vladimir; Martinez, David

    2016-10-01

    Recent experiments have identified the tent membranes that support the capsule as a source of a large azimuthal perturbation at the point of departure from the surface. Highly resolved 2D simulations have shown that vorticity generated by the interaction of the ablated capsule material with the tent allows cold ablator material to penetrate into the burning hot spot, likely cooling the central burning plasma. These observations have motivated the search for alternative support methods. One technique being considered uses the existing fill-tube (needed to deliver the cryogenic fuel), supported against gravity by a thin rod (cantilever) spanning the hohlraum diameter. Recent experiments have assessed the perturbation induced on the target with the rod positioned along the fill-tube at different distances from the capsule surface, and found optical-depth modulations oriented along the cantilever direction, possibly caused by laser spot shadowing or hydrodynamic coupling. To fully understand the data, we have undertaken an extensive study of highly resolved 2D integrated simulations able to resolve the 12 µm diameter cantilever. Results of our computations and comparisons with the experiments will be presented. Prepared by LLNL under Contract DE-AC52-07NA27344.

  20. 2D and 3D Simulations of Exploding Pusher Capsules

    NASA Astrophysics Data System (ADS)

    Pino, Jesse; Smith, Andrew; Miles, Aaron

    2011-10-01

    A research campaign is underway at the National Ignition Facility (NIF) at LLNL to study rapidly evolving, non-LTE, inertial fusion plasmas. The goal is to field thin-shelled, gas-filled ``Exploding Pusher'' capsules in a Polar Direct Drive (PDD) configuration. Ion temperatures of > 15 keV and electron temperatures of > 5 keV are reached. A small convergence ratio and a rapidly ablated shell reduce susceptibility to hydrodynamic instabilities. Using 1D simulations, the most favorable configurations were found to be thin SiO2 or Be shells containing 10 atm of D2-He3 in a 2:1 ratio. This poster describes the 2D and 3D ARES radiation hydrodynamics simulations of these capsules. 2D simulations are essential because the PDD configuration requires that each beam be ``repointed'' away from its nominal angle. Each beam can also have a separate power profile and focal length. Large ensembles of simulations were run to probe the parameter space and find the pointing that yields the most spherical implosions. Response surfaces were constructed to ascertain susceptibility to shot-time fluctuations. We also discuss resolution convergence and present preliminary results of 3D modeling. This work was performed under the auspices of the U.S. DoE by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
