Sample records for alamos accelerator code

  1. The Los Alamos Laser Acceleration of Particles Workshop and beginning of the advanced accelerator concepts field

    NASA Astrophysics Data System (ADS)

    Joshi, C.

    2012-12-01

    The first Advanced Acceleration of Particles (AAC) Workshop (actually named the Laser Acceleration of Particles Workshop) was held at Los Alamos in January 1982. The workshop lasted a week and divided all the acceleration techniques into four categories: near field, far field, media, and vacuum. Basic theorems of particle acceleration were postulated (later proven) and specific experiments based on the four categories were formulated. This landmark workshop led to the formation of the advanced accelerator R&D program in the HEP office of the DOE that supports advanced accelerator research to this day. Two major new user facilities at Argonne and Brookhaven and several more directed experimental efforts were built to explore the advanced particle acceleration schemes. It is not an exaggeration to say that the intellectual breadth and excitement provided by the many groups who entered this new field provided the needed vitality to the then recently formed APS Division of Beams and the new online journal Physical Review Special Topics-Accelerators and Beams. On this 30th anniversary of the AAC Workshops, it is worthwhile to look back at the legacy of the first Workshop at Los Alamos and the fine groundwork it laid for the field of advanced accelerator concepts that continues to flourish to this day.

  2. Los Alamos radiation transport code system on desktop computing platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss hardware systems on which the codes run and present code performance comparisons for various machines.

  3. Electrical Engineering in Los Alamos Neutron Science Center Accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva, Michael James

    The field of electrical engineering plays a significant role in particle accelerator design and operations. Los Alamos National Laboratory's LANSCE facility utilizes the electrical energy concepts of power distribution, plasma generation, radio frequency energy, electrostatic acceleration, signals and diagnostics. The culmination of these fields produces a machine of incredible potential, with uses such as isotope production, neutron spallation, neutron imaging and particle analysis. The key isotope produced in the LANSCE Isotope Production Facility is strontium-82, which is utilized for medical applications such as cancer treatment and positron emission tomography (PET) scans. Neutron spallation is one of the very few methods used to produce neutrons for scientific research; the other methods are the natural decay of transuranic elements and nuclear reactors. Accelerators produce neutrons by accelerating charged particles into neutron-dense elements such as tungsten, imparting kinetic energy to neutral particles; this has the benefit of producing a large number of neutrons while minimizing the waste generated. Utilizing the accelerator, scientists can gain an understanding of how various particles behave and interact with matter, to better understand the natural laws of physics and the universe around us.

  4. Encoded physics knowledge in checking codes for nuclear cross section libraries at Los Alamos

    NASA Astrophysics Data System (ADS)

    Parsons, D. Kent

    2017-09-01

    Checking procedures for processed nuclear data at Los Alamos are described. Both continuous-energy and multigroup nuclear data are verified by locally developed checking codes which use basic physics knowledge and common-sense rules. A list of nuclear data problems which have been identified with the help of these checking codes is also given.
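
    As a concrete illustration of the kind of common-sense rule such a checking code can encode, the sketch below applies a few basic checks to a hypothetical table of (energy, cross section) pairs. The table format, the threshold values, and the function names are invented for the example; this is not the Los Alamos checking codes.

```python
# Minimal sketch (not the Los Alamos checking codes): common-sense rules
# applied to a hypothetical table of (energy_eV, cross_section_barns) pairs.

def check_cross_section_table(table):
    """Return a list of human-readable problems found in the table."""
    problems = []
    energies = [e for e, _ in table]
    sigmas = [s for _, s in table]

    # Rule 1: the energy grid must be strictly increasing.
    if any(e2 <= e1 for e1, e2 in zip(energies, energies[1:])):
        problems.append("energy grid is not strictly increasing")

    # Rule 2: cross sections must be non-negative.
    if any(s < 0.0 for s in sigmas):
        problems.append("negative cross section found")

    # Rule 3 (physics sanity): flag implausibly large values.
    if any(s > 1.0e5 for s in sigmas):
        problems.append("cross section exceeds 1e5 barns (suspicious)")

    return problems


if __name__ == "__main__":
    demo = [(1.0e-5, 2.3), (0.0253, 5.1), (1.0e6, -0.2)]  # made-up numbers
    for msg in check_cross_section_table(demo):
        print("WARNING:", msg)
```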

  5. The Los Alamos suite of relativistic atomic physics codes

    DOE PAGES

    Fontes, C. J.; Zhang, H. L.; Abdallah, J., Jr.; ...

    2015-05-28

    The Los Alamos SuitE of Relativistic (LASER) atomic physics codes is a robust, mature platform that has been used to model highly charged ions in a variety of ways. The suite includes capabilities for calculating data related to fundamental atomic structure, as well as the processes of photoexcitation, electron-impact excitation and ionization, photoionization and autoionization within a consistent framework. These data can be of a basic nature, such as cross sections and collision strengths, which are useful in making predictions that can be compared with experiments to test fundamental theories of highly charged ions, such as quantum electrodynamics. The suite can also be used to generate detailed models of energy levels and rate coefficients, and to apply them in the collisional-radiative modeling of plasmas over a wide range of conditions. Such modeling is useful, for example, in the interpretation of spectra generated by a variety of plasmas. In this work, we provide a brief overview of the capabilities within the Los Alamos relativistic suite along with some examples of its application to the modeling of highly charged ions.

  6. Development of the Los Alamos continuous high average-power microsecond pulser ion accelerator

    NASA Astrophysics Data System (ADS)

    Bitteker, L. J.; Wood, B. P.; Davis, H. A.; Waganaar, W. J.; Boyd, I. D.; Lovberg, R. H.

    2000-10-01

    The continuous high average-power microsecond pulser (CHAMP) ion accelerator is being constructed at Los Alamos National Laboratory. Progress on the testing of the CHAMP diode is discussed. A direct simulation Monte Carlo computer code is used to investigate the puffed gas fill of the CHAMP anode. High plenum pressures and low plenum volumes are found to be desirable for effective gas puffs. The typical gas fill time is 150-180 μs from initiation of valve operation to end of fill. Results of anode plasma production at three stages of development are discussed. Plasma properties are monitored with electric and magnetic field probes. From this data, the near coil plasma density under nominal conditions is found to be on the order of 1×10^16 cm^-3. Large error is associated with this calculation due to inconsistencies between tests and the limitations of the instrumentation used. The diode insulating magnetic field is observed to result in lower density plasma with a more diffuse structure than for the cases when the insulating field is not applied. The importance of these differences in plasma quality on the beam production is yet to be determined.

  7. Airport-Noise Levels and Annoyance Model (ALAMO) system's reference manual

    NASA Technical Reports Server (NTRS)

    Deloach, R.; Donaldson, J. L.; Johnson, M. J.

    1986-01-01

    The Airport-Noise Levels and Annoyance Model (ALAMO) is described in terms of its constituent modules, the execution of the ALAMO procedure files necessary for system execution, and the source code documentation associated with code development at Langley Research Center. The modules constituting ALAMO are presented both in flow graph form and through a description of the subroutines and functions that comprise them.

  8. The Particle Accelerator Simulation Code PyORBIT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorlov, Timofey V; Holmes, Jeffrey A; Cousineau, Sarah M

    2015-01-01

    The particle accelerator simulation code PyORBIT is presented. The structure, implementation, history, parallel and simulation capabilities, and future development of the code are discussed. The PyORBIT code is a new implementation and extension of algorithms of the original ORBIT code that was developed for the Spallation Neutron Source accelerator at the Oak Ridge National Laboratory. The PyORBIT code has a two level structure. The upper level uses the Python programming language to control the flow of intensive calculations performed by the lower level code implemented in the C++ language. The parallel capabilities are based on MPI communications. PyORBIT is an open source code accessible to the public through the Google Open Source Projects Hosting service.

  9. Utilizing GPUs to Accelerate Turbomachinery CFD Codes

    NASA Technical Reports Server (NTRS)

    MacCalla, Weylin; Kulkarni, Sameer

    2016-01-01

    GPU computing has established itself as a way to accelerate parallel codes in the high performance computing world. This work focuses on speeding up APNASA, a legacy CFD code used at NASA Glenn Research Center, while also drawing conclusions about the nature of GPU computing and the requirements to make GPGPU worthwhile on legacy codes. Rewriting and restructuring of the source code was avoided to limit the introduction of new bugs. The code was profiled and investigated for parallelization potential, then OpenACC directives were used to indicate parallel parts of the code. The use of OpenACC directives was not able to reduce the runtime of APNASA on either the NVIDIA Tesla discrete graphics card, or the AMD accelerated processing unit. Additionally, it was found that in order to justify the use of GPGPU, the amount of parallel work being done within a kernel would have to greatly exceed the work being done by any one portion of the APNASA code. It was determined that in order for an application like APNASA to be accelerated on the GPU, it should not be modular in nature, and the parallel portions of the code must contain a large portion of the code's computation time.
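
    The conclusion that the parallel work within a kernel must greatly exceed the serial portions is essentially Amdahl's law. The short sketch below, with made-up profiling numbers rather than APNASA measurements, shows how quickly the achievable overall speedup saturates when only part of the runtime is offloaded to an accelerator.

```python
# Illustrative only: Amdahl's-law estimate of the best-case speedup when a
# fraction p of a code's runtime is offloaded and accelerated. The numbers
# below are made up, not APNASA profiling data.

def amdahl_speedup(parallel_fraction, accel_factor):
    """Overall speedup if 'parallel_fraction' of runtime is sped up by 'accel_factor'."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / accel_factor)

for p in (0.50, 0.80, 0.95, 0.99):
    print(f"offloaded fraction {p:.2f} -> overall speedup "
          f"{amdahl_speedup(p, accel_factor=20.0):.2f}x")
```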

  10. Los Alamos high-power proton linac designs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawrence, G.P.

    1995-10-01

    Medium-energy high-power proton linear accelerators have been studied at Los Alamos as drivers for spallation neutron applications requiring large amounts of beam power. Reference designs for such accelerators are presented, important design factors are reviewed, and issues and concerns specific to this unprecedented power regime are discussed.

  11. Los Alamos and Lawrence Livermore National Laboratories Code-to-Code Comparison of Inter Lab Test Problem 1 for Asteroid Impact Hazard Mitigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weaver, Robert P.; Miller, Paul; Howley, Kirsten

    The NNSA Laboratories have entered into an interagency collaboration with the National Aeronautics and Space Administration (NASA) to explore strategies for prevention of Earth impacts by asteroids. Assessment of such strategies relies upon use of sophisticated multi-physics simulation codes. This document describes the task of verifying and cross-validating, between Lawrence Livermore National Laboratory (LLNL) and Los Alamos National Laboratory (LANL), modeling capabilities and methods to be employed as part of the NNSA-NASA collaboration. The approach has been to develop a set of test problems and then to compare and contrast results obtained by use of a suite of codes, including MCNP, RAGE, Mercury, Ares, and Spheral. This document provides a short description of the codes, an overview of the idealized test problems, and discussion of the results for deflection by kinetic impactors and stand-off nuclear explosions.

  12. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, L.M.; Hochstedler, R.D.

    1997-02-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
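
    One of the techniques named above, replacing a linear search with a binary version, is illustrated by the sketch below; the energy grid and function names are hypothetical, not ITS internals.

```python
# Sketch of one of the named techniques: replacing a linear scan with a
# binary search when locating the bin of a tabulated, ascending grid.
# The grid below is hypothetical, not an ITS data structure.
import bisect

energy_grid = [0.01, 0.1, 1.0, 5.0, 10.0, 50.0]   # MeV, ascending

def find_bin_linear(grid, e):
    # O(n): scan every interval until the energy fits
    for i in range(len(grid) - 1):
        if grid[i] <= e < grid[i + 1]:
            return i
    raise ValueError("energy outside grid")

def find_bin_binary(grid, e):
    # O(log n): bisect the sorted grid instead of scanning it
    i = bisect.bisect_right(grid, e) - 1
    if 0 <= i < len(grid) - 1:
        return i
    raise ValueError("energy outside grid")

assert find_bin_linear(energy_grid, 2.5) == find_bin_binary(energy_grid, 2.5) == 2
```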

  13. FPGA acceleration of rigid-molecule docking codes

    PubMed Central

    Sukhwani, B.; Herbordt, M.C.

    2011-01-01

    Modelling the interactions of biological molecules, or docking, is critical both to understanding basic life processes and to designing new drugs. The field programmable gate array (FPGA) based acceleration of a recently developed, complex, production docking code is described. The authors found that it is necessary to extend their previous three-dimensional (3D) correlation structure in several ways, most significantly to support simultaneous computation of several correlation functions. The result for small-molecule docking is a 100-fold speed-up of a section of the code that represents over 95% of the original run-time. An additional 2% is accelerated through a previously described method, yielding a total acceleration of 36× over a single core and 10× over a quad-core. This approach is found to be an ideal complement to graphics processing unit (GPU) based docking, which excels in the protein–protein domain. PMID:21857870
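
    The correlation scoring that the FPGA accelerates can be illustrated in software with an FFT-based cross-correlation over a 3D grid; the NumPy sketch below uses toy grids and a single correlation function, and is only a conceptual stand-in for the production docking code.

```python
# Conceptual sketch of FFT-based rigid-docking correlation scoring (the same
# mathematical kernel the FPGA work accelerates), using NumPy rather than
# hardware. Grids and values here are toy data.
import numpy as np

def correlation_scores(receptor_grid, ligand_grid):
    """Score every relative translation of ligand vs. receptor via 3D FFT."""
    R = np.fft.fftn(receptor_grid)
    L = np.fft.fftn(ligand_grid, s=receptor_grid.shape)   # zero-pad ligand
    # Cross-correlation theorem: corr = IFFT( R * conj(L) )
    return np.real(np.fft.ifftn(R * np.conj(L)))

rng = np.random.default_rng(0)
receptor = rng.random((32, 32, 32))
ligand = rng.random((8, 8, 8))
scores = correlation_scores(receptor, ligand)
best = np.unravel_index(np.argmax(scores), scores.shape)
print("best translation (grid units):", best)
```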

  14. COLAcode: COmoving Lagrangian Acceleration code

    NASA Astrophysics Data System (ADS)

    Tassev, Svetlin V.

    2016-02-01

    COLAcode is a serial particle-mesh-based N-body code illustrating the COLA (COmoving Lagrangian Acceleration) method; it solves for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). It differs from standard N-body codes by trading accuracy at small scales to gain computational speed without sacrificing accuracy at large scales. This is useful for generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing; such catalogs are needed to perform detailed error analysis for ongoing and future surveys of LSS.

  15. The Los Alamos Supernova Light Curve Project: Current Projects and Future Directions

    NASA Astrophysics Data System (ADS)

    Wiggins, Brandon Kerry; Los Alamos Supernovae Research Group

    2015-01-01

    The Los Alamos Supernova Light Curve Project models supernovae in the ancient and modern universe to determine the luminosities and observability of certain supernova events and to explore the physics of supernovae in the local universe. The project utilizes RAGE, Los Alamos' radiation hydrodynamics code, to evolve the explosions of progenitors prepared in well-established stellar evolution codes. RAGE allows us to capture events such as shock breakout and collisions of ejecta with shells of material which cannot be modeled well in other codes. RAGE's dumps are then ported to LANL's SPECTRUM code, which uses LANL's OPLIB opacities database to calculate light curves and spectra. In this paper, we summarize our recent work in modeling supernovae.

  16. Production Level CFD Code Acceleration for Hybrid Many-Core Architectures

    NASA Technical Reports Server (NTRS)

    Duffy, Austen C.; Hammond, Dana P.; Nielsen, Eric J.

    2012-01-01

    In this work, a novel graphics processing unit (GPU) distributed sharing model for hybrid many-core architectures is introduced and employed in the acceleration of a production-level computational fluid dynamics (CFD) code. The latest generation graphics hardware allows multiple processor cores to simultaneously share a single GPU through concurrent kernel execution. This feature has allowed the NASA FUN3D code to be accelerated in parallel with up to four processor cores sharing a single GPU. For codes to scale and fully use resources on these and the next generation machines, codes will need to employ some type of GPU sharing model, as presented in this work. Findings include the effects of GPU sharing on overall performance. A discussion of the inherent challenges that parallel unstructured CFD codes face in accelerator-based computing environments is included, with considerations for future generation architectures. This work was completed by the author in August 2010, and reflects the analysis and results of the time.

  17. Corkscrew Motion of an Electron Beam due to Coherent Variations in Accelerating Potentials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekdahl, Carl August

    2016-09-13

    Corkscrew motion results from the interaction of fluctuations of beam electron energy with accidental magnetic dipoles caused by misalignment of the beam transport solenoids. Corkscrew is a serious concern for high-current linear induction accelerators (LIA). A simple scaling law for corkscrew amplitude derived from a theory based on a constant-energy beam coasting through a uniform magnetic field has often been used to assess LIA vulnerability to this effect. We use a beam dynamics code to verify that this scaling also holds for an accelerated beam in a non-uniform magnetic field, as in a real accelerator. Results of simulations with this code are strikingly similar to measurements on one of the LIAs at Los Alamos National Laboratory.

  18. New Generation of Los Alamos Opacity Tables

    NASA Astrophysics Data System (ADS)

    Colgan, James; Kilcrease, D. P.; Magee, N. H.; Sherrill, M. E.; Abdallah, J.; Hakel, P.; Fontes, C. J.; Guzik, J. A.; Mussack, K. A.

    2016-05-01

    We present a new generation of Los Alamos OPLIB opacity tables that have been computed using the ATOMIC code. Our tables have been calculated for all 30 elements from hydrogen through zinc and are publicly available through our website. In this poster we discuss the details of the calculations that underpin the new opacity tables. We also show several recent applications of the use of our opacity tables to solar modeling and other astrophysical applications. In particular, we demonstrate that use of the new opacities improves the agreement between solar models and helioseismology, but does not fully resolve the long-standing `solar abundance' problem. The Los Alamos National Laboratory is operated by Los Alamos National Security, LLC for the National Nuclear Security Administration of the U.S. Department of Energy under Contract No. DE-AC52-06NA25396.

  19. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  20. Inner Radiation Belt Representation of the Energetic Electron Environment: Model and Data Synthesis Using the Salammbo Radiation Belt Transport Code and Los Alamos Geosynchronous and GPS Energetic Particle Data

    NASA Technical Reports Server (NTRS)

    Friedel, R. H. W.; Bourdarie, S.; Fennell, J.; Kanekal, S.; Cayton, T. E.

    2004-01-01

    The highly energetic electron environment in the inner magnetosphere (GEO inward) has received a lot of research attention in resent years, as the dynamics of relativistic electron acceleration and transport are not yet fully understood. These electrons can cause deep dielectric charging in any space hardware in the MEO to GEO region. We use a new and novel approach to obtain a global representation of the inner magnetospheric energetic electron environment, which can reproduce the absolute environment (flux) for any spacecraft orbit in that region to within a factor of 2 for the energy range of 100 KeV to 5 MeV electrons, for any levels of magnetospheric activity. We combine the extensive set of inner magnetospheric energetic electron observations available at Los Alamos with the physics based Salammbo transport code, using the data assimilation technique of "nudging". This in effect input in-situ data into the code and allows the diffusion mechanisms in the code to interpolate the data into regions and times of no data availability. We present here details of the methods used, both in the data assimilation process and in the necessary inter-calibration of the input data used. We will present sample runs of the model/data code and compare the results to test spacecraft data not used in the data assimilation process.
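
    The "nudging" technique can be illustrated with a toy 1D radial-diffusion model that is relaxed toward observations at the grid points where data exist. Everything in the sketch below (the diffusion coefficient, the observation values, the relaxation strength) is invented for illustration and is not the Salammbo code.

```python
# Toy illustration of "nudging" (not the Salammbo code): a 1D radial-diffusion
# model of electron phase-space density f(L) is relaxed toward observations at
# the L-shells where data exist. Diffusion coefficient and data are invented.
import numpy as np

L = np.linspace(2.0, 7.0, 51)            # dimensionless L-shell grid
f = np.ones_like(L) * 0.1                # model state
D = 1e-3 * (L / 6.6) ** 6                # crude radial diffusion coefficient
obs_idx = {25: 0.8, 45: 1.5}             # hypothetical data at two L-shells
dt, dL, alpha = 1.0, L[1] - L[0], 0.05   # alpha = nudging strength

for step in range(1000):
    # explicit radial-diffusion update (interior points only)
    f[1:-1] += dt * D[1:-1] * (f[2:] - 2 * f[1:-1] + f[:-2]) / dL**2
    # nudge the model toward observations where they exist
    for i, f_obs in obs_idx.items():
        f[i] += alpha * (f_obs - f[i])

print("final model values at observation points:",
      [round(f[i], 3) for i in obs_idx])
```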

  1. A beamline systems model for Accelerator-Driven Transmutation Technology (ADTT) facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Todd, A.M.M.; Paulson, C.C.; Peacock, M.A.

    1995-10-01

    A beamline systems code that is being developed for Accelerator-Driven Transmutation Technology (ADTT) facility trade studies is described. The overall program is a joint Grumman, G.H. Gillespie Associates (GHGA) and Los Alamos National Laboratory effort. The GHGA Accelerator Systems Model (ASM) has been adopted as the framework on which this effort is based. Relevant accelerator and beam transport models from earlier Grumman systems codes are being adapted to this framework. Preliminary physics and engineering models for each ADTT beamline component have been constructed. Examples noted include a Bridge Coupled Drift Tube Linac (BCDTL) and the accelerator thermal system. A decision has been made to confine the ASM framework principally to beamline modeling, while detailed target/blanket, balance-of-plant and facility costing analysis will be performed externally. An interfacing external balance-of-plant and facility costing model, which will permit the performance of iterative facility trade studies, is under separate development. An ABC (Accelerator Based Conversion) example is used to highlight the present models and capabilities.

  2. A beamline systems model for Accelerator-Driven Transmutation Technology (ADTT) facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Todd, Alan M. M.; Paulson, C. C.; Peacock, M. A.

    1995-09-15

    A beamline systems code that is being developed for Accelerator-Driven Transmutation Technology (ADTT) facility trade studies is described. The overall program is a joint Grumman, G. H. Gillespie Associates (GHGA) and Los Alamos National Laboratory effort. The GHGA Accelerator Systems Model (ASM) has been adopted as the framework on which this effort is based. Relevant accelerator and beam transport models from earlier Grumman systems codes are being adapted to this framework. Preliminary physics and engineering models for each ADTT beamline component have been constructed. Examples noted include a Bridge Coupled Drift Tube Linac (BCDTL) and the accelerator thermal system. A decision has been made to confine the ASM framework principally to beamline modeling, while detailed target/blanket, balance-of-plant and facility costing analysis will be performed externally. An interfacing external balance-of-plant and facility costing model, which will permit the performance of iterative facility trade studies, is under separate development. An ABC (Accelerator Based Conversion) example is used to highlight the present models and capabilities.

  3. A New Generation of Los Alamos Opacity Tables

    DOE PAGES

    Colgan, James Patrick; Kilcrease, David Parker; Magee, Jr., Norman H.; ...

    2016-01-26

    We present a new, publicly available, set of Los Alamos OPLIB opacity tables for the elements hydrogen through zinc. Our tables are computed using the Los Alamos ATOMIC opacity and plasma modeling code, and make use of atomic structure calculations that use fine-structure detail for all the elements considered. Our equation-of-state (EOS) model, known as ChemEOS, is based on the minimization of free energy in a chemical picture and appears to be a reasonable and robust approach to determining atomic state populations over a wide range of temperatures and densities. In this paper we discuss in detail the calculations that we have performed for the 30 elements considered, and present some comparisons of our monochromatic opacities with measurements and other opacity codes. We also use our new opacity tables in solar modeling calculations and compare and contrast such modeling with previous work.
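
    For readers unfamiliar with how monochromatic opacities are condensed into a single table entry, the sketch below forms a Rosseland mean from a toy kappa(u) profile. The profile and numbers are invented, and the calculation is only a generic illustration, not code or data from ATOMIC.

```python
# Worked illustration (not the ATOMIC code): forming a Rosseland mean opacity
# from a tabulated monochromatic opacity kappa(u), u = h*nu/(k*T). The toy
# kappa below mimics free-free absorption plus a constant scattering floor.
import numpy as np

u = np.linspace(0.05, 30.0, 4000)                 # h*nu / kT, uniform grid
kappa_u = 0.2 + 5.0 * (1.0 - np.exp(-u)) / u**3   # made-up cm^2/g profile

# Rosseland weight dB/dT expressed in u (constants cancel in the ratio):
w = u**4 * np.exp(u) / np.expm1(u) ** 2

# 1/kappa_R = integral(w/kappa) / integral(w); uniform spacing cancels.
kappa_R = w.sum() / (w / kappa_u).sum()
print(f"Rosseland mean opacity ~ {kappa_R:.3f} cm^2/g (toy numbers)")
```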

  4. Beam breakup in an advanced linear induction accelerator

    DOE PAGES

    Ekdahl, Carl August; Coleman, Joshua Eugene; McCuistian, Brian Trent

    2016-07-01

    Two linear induction accelerators (LIAs) have been in operation for a number of years at the Los Alamos Dual Axis Radiographic Hydrodynamic Test (DARHT) facility. A new multipulse LIA is being developed. We have computationally investigated the beam breakup (BBU) instability in this advanced LIA. In particular, we have explored the consequences of the choice of beam injector energy and the grouping of LIA cells. We find that within the limited range of options presently under consideration for the LIA architecture, there is little adverse effect on the BBU growth. The computational tool that we used for this investigation was the beam dynamics code linear accelerator model for DARHT (LAMDA). To confirm that LAMDA was appropriate for this task, we first validated it through comparisons with the experimental BBU data acquired on the DARHT accelerators.

  5. LEGO: A modular accelerator design code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Y.; Donald, M.; Irwin, J.

    1997-08-01

    An object-oriented accelerator design code has been designed and implemented in a simple and modular fashion. It contains all major features of its predecessors: TRACY and DESPOT. All physics of single-particle dynamics is implemented based on the Hamiltonian in the local frame of the component. Components can be moved arbitrarily in the three dimensional space. Several symplectic integrators are used to approximate the integration of the Hamiltonian. A differential algebra class is introduced to extract a Taylor map up to arbitrary order. Analysis of optics is done in the same way both for the linear and nonlinear case. Currently, the code is used to design and simulate the lattices of the PEP-II. It will also be used for the commissioning.

  6. Convergence acceleration of the Proteus computer code with multigrid methods

    NASA Technical Reports Server (NTRS)

    Demuren, A. O.; Ibraheem, S. O.

    1992-01-01

    Presented here is the first part of a study to implement convergence acceleration techniques based on the multigrid concept in the Proteus computer code. A review is given of previous studies on the implementation of multigrid methods in computer codes for compressible flow analysis. Also presented is a detailed stability analysis of upwind and central-difference based numerical schemes for solving the Euler and Navier-Stokes equations. Results are given of a convergence study of the Proteus code on computational grids of different sizes. The results presented here form the foundation for the implementation of multigrid methods in the Proteus code.
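
    The multigrid idea under study can be illustrated with a minimal two-grid correction cycle for the 1D Poisson equation. The sketch below is a generic textbook example (weighted Jacobi smoothing, injection restriction, linear-interpolation prolongation), not Proteus code.

```python
# Minimal two-grid correction cycle for the 1D Poisson problem -u'' = f,
# illustrating the multigrid idea (generic sketch, not Proteus code).
import numpy as np

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / h**2
    return r

def jacobi(u, f, h, sweeps, omega=2.0 / 3.0):
    for _ in range(sweeps):
        u_new = u.copy()
        u_new[1:-1] = (1 - omega) * u[1:-1] + omega * 0.5 * (u[:-2] + u[2:] + h**2 * f[1:-1])
        u = u_new
    return u

def two_grid(u, f, h):
    u = jacobi(u, f, h, sweeps=3)                     # pre-smooth
    r = residual(u, f, h)
    r_c = r[::2].copy()                               # restrict (injection)
    # solve the coarse-grid error equation -e'' = r directly
    n_c = len(r_c)
    A = (np.diag(2 * np.ones(n_c - 2)) - np.diag(np.ones(n_c - 3), 1)
         - np.diag(np.ones(n_c - 3), -1)) / (2 * h) ** 2
    e_c = np.zeros(n_c)
    e_c[1:-1] = np.linalg.solve(A, r_c[1:-1])
    e = np.interp(np.arange(len(u)), np.arange(0, len(u), 2), e_c)  # prolong
    return jacobi(u + e, f, h, sweeps=3)              # post-smooth

n = 65                                  # fine-grid points (with boundaries)
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
f = np.pi**2 * np.sin(np.pi * x)        # exact solution is sin(pi x)
u = np.zeros(n)
for cycle in range(10):
    u = two_grid(u, f, h)
print("max error vs. exact solution:", np.max(np.abs(u - np.sin(np.pi * x))))
```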

  7. Study of an External Neutron Source for an Accelerator-Driven System using the PHITS Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sugawara, Takanori; Iwasaki, Tomohiko; Chiba, Takashi

    A code system for the Accelerator Driven System (ADS) has been under development for analyzing dynamic behaviors of a subcritical core coupled with an accelerator. This code system, named DSE (Dynamics calculation code system for a Subcritical system with an External neutron source), consists of an accelerator part and a reactor part. The accelerator part employs a database, calculated by using PHITS, for investigating effects related to the accelerator such as changes of beam energy, beam diameter, void generation, and target level. This analysis method using the database may introduce some errors into dynamics calculations, since the neutron source data derived from the database have some errors from the fitting or interpolating procedures. In this study, the effects of various events are investigated to confirm that the method based on the database is appropriate.

  8. Los Alamos Science Facilities

    Science.gov Websites

  9. Living in Los Alamos

    Science.gov Websites

  10. Code comparison for accelerator design and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parsa, Z.

    1988-01-01

    We present a comparison between results obtained from standard accelerator physics codes used for the design and analysis of synchrotrons and storage rings, with the programs SYNCH, MAD, HARMON, PATRICIA, PATPET, BETA, DIMAD, MARYLIE and RACETRACK. In our analysis we have considered 5 lattices of various sizes with large and small angles, including the AGS Booster (10° bend), RHIC (2.24°), SXLS, XLS (XUV ring with 45° bend) and X-RAY rings. The differences in the integration methods used and the treatment of the fringe fields in these codes could lead to different results. The inclusion of nonlinear (e.g., dipole) terms may be necessary in these calculations, especially for a small ring. 12 refs., 6 figs., 10 tabs.

  11. Further Studies of the NRL Collective Particle Accelerator VIA Numerical Modeling with the MAGIC Code.

    DTIC Science & Technology

    1984-08-01

    Further Studies of the NRL Collective Particle Accelerator via Numerical Modeling with the MAGIC Code. Robert J. Barker. August 1984. Final report for the period 1 April 1984 - 30 September 1984. Performing organization report number: MRC/WDC-R.

  12. Transform coding for hardware-accelerated volume rendering.

    PubMed

    Fout, Nathaniel; Ma, Kwan-Liu

    2007-01-01

    Hardware-accelerated volume rendering using the GPU is now the standard approach for real-time volume rendering, although limited graphics memory can present a problem when rendering large volume data sets. Volumetric compression in which the decompression is coupled to rendering has been shown to be an effective solution to this problem; however, most existing techniques were developed in the context of software volume rendering, and all but the simplest approaches are prohibitive in a real-time hardware-accelerated volume rendering context. In this paper we present a novel block-based transform coding scheme designed specifically with real-time volume rendering in mind, such that the decompression is fast without sacrificing compression quality. This is made possible by consolidating the inverse transform with dequantization in such a way as to allow most of the reprojection to be precomputed. Furthermore, we take advantage of the freedom afforded by off-line compression in order to optimize the encoding as much as possible while hiding this complexity from the decoder. In this context we develop a new block classification scheme which allows us to preserve perceptually important features in the compression. The result of this work is an asymmetric transform coding scheme that allows very large volumes to be compressed and then decompressed in real-time while rendering on the GPU.
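
    The encode/decode asymmetry described above can be sketched with a plain block transform plus uniform quantization: the expensive work sits on the encode side, while decoding is a dequantize-and-inverse-transform step. The orthonormal DCT, block size, and quantization step below are arbitrary stand-ins for the paper's custom scheme.

```python
# Sketch of block-based transform coding: each block is transformed and
# quantized for storage, then dequantized and inverse-transformed at decode
# time. A plain orthonormal DCT-II in NumPy stands in for the custom
# transform described above; block size and step size are arbitrary.
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix of size n x n."""
    k = np.arange(n)[:, None]
    x = np.arange(n)[None, :]
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * x + 1) * k / (2.0 * n))
    C[0, :] /= np.sqrt(2.0)
    return C

def encode_block(block, C, q):
    coeffs = C @ block @ C.T          # forward 2D transform
    return np.round(coeffs / q)       # uniform quantization -> integers

def decode_block(qcoeffs, C, q):
    return C.T @ (qcoeffs * q) @ C    # dequantize + inverse transform

n, q = 8, 4.0
C = dct_matrix(n)
rng = np.random.default_rng(1)
block = rng.normal(100.0, 10.0, (n, n))       # toy "volume" block
recon = decode_block(encode_block(block, C, q), C, q)
print("max reconstruction error:", np.abs(recon - block).max())
```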

  13. Stockpile Stewardship: Los Alamos

    ScienceCinema

    McMillan, Charlie; Morgan, Nathanial; Goorley, Tom; Merrill, Frank; Funk, Dave; Korzekwa, Deniece; Laintz, Ken

    2018-01-16

    "Heritage of Science" is a short video that highlights the Stockpile Stewardship program at Los Alamos National Laboratory. Stockpile Stewardship was conceived in the early 1990s as a national science-based program that could assure the safety, security, and effectiveness of the U.S. nuclear deterrent without the need for full-scale underground nuclear testing. This video was produced by Los Alamos National Laboratory for screening at the Lab's Bradbury Science Museum in Los Alamos, NM and is narrated by science correspondent Miles O'Brien.

  14. New Mexico: Los Alamos

    Atmospheric Science Data Center

    2014-05-15

    Los Alamos, New Mexico. Multi-angle views of the fire in Los Alamos, New Mexico, May 9, 2000. These true-color images covering north-central New Mexico ...

  15. Electron-beam dynamics for an advanced flash-radiography accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekdahl, Carl August Jr.

    2015-06-22

    Beam dynamics issues were assessed for a new linear induction electron accelerator. Special attention was paid to equilibrium beam transport, possible emittance growth, and beam stability. Especially problematic would be high-frequency beam instabilities that could blur individual radiographic source spots, low-frequency beam motion that could cause pulse-to-pulse spot displacement, and emittance growth that could enlarge the source spots. Beam physics issues were examined through theoretical analysis and computer simulations, including particle-in-cell (PIC) codes. Beam instabilities investigated included beam breakup (BBU), image displacement, diocotron, parametric envelope, ion hose, and the resistive wall instability. Beam corkscrew motion and emittance growth from beam mismatch were also studied. It was concluded that a beam with radiographic quality equivalent to the present accelerators at Los Alamos will result if the same engineering standards and construction details are upheld.

  16. 3D unstructured-mesh radiation transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morel, J.

    1997-12-31

    Three unstructured-mesh radiation transport codes are currently being developed at Los Alamos National Laboratory. The first code is ATTILA, which uses an unstructured tetrahedral mesh in conjunction with standard Sn (discrete-ordinates) angular discretization, standard multigroup energy discretization, and linear-discontinuous spatial differencing. ATTILA solves the standard first-order form of the transport equation using source iteration in conjunction with diffusion-synthetic acceleration of the within-group source iterations, and is designed to run primarily on workstations. The second code is DANTE, which uses a hybrid finite-element mesh consisting of arbitrary combinations of hexahedra, wedges, pyramids, and tetrahedra. DANTE solves several second-order self-adjoint forms of the transport equation including the even-parity equation, the odd-parity equation, and a new equation called the self-adjoint angular flux equation. DANTE also offers three angular discretization options: S_n (discrete ordinates), P_n (spherical harmonics), and SP_n (simplified spherical harmonics). DANTE is designed to run primarily on massively parallel message-passing machines, such as the ASCI-Blue machines at LANL and LLNL. The third code is PERICLES, which uses the same hybrid finite-element mesh as DANTE, but solves the standard first-order form of the transport equation rather than a second-order self-adjoint form. PERICLES uses a standard S_n discretization in angle in conjunction with trilinear-discontinuous spatial differencing and diffusion-synthetic acceleration of the within-group source iterations. PERICLES was initially designed to run on workstations, but a version for massively parallel message-passing machines will be built. The three codes will be described in detail and computational results will be presented.
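
    For orientation, the sketch below shows the plain source iteration that diffusion-synthetic acceleration is designed to speed up, for one-group S_N transport in a 1D slab with diamond-difference sweeps. It is a textbook illustration with made-up cross sections, not code from ATTILA, DANTE, or PERICLES.

```python
# Bare-bones source iteration for one-group S_N transport in a 1D slab with
# isotropic scattering and diamond-difference sweeps (no DSA). A generic
# textbook sketch, not ATTILA, DANTE, or PERICLES.
import numpy as np

nx, width = 50, 5.0                      # cells, slab width (cm)
dx = width / nx
sigma_t, sigma_s, Q = 1.0, 0.8, 1.0      # cross sections (1/cm), flat source
mu, w = np.polynomial.legendre.leggauss(8)   # S_8 angles and weights

phi = np.zeros(nx)
for it in range(200):
    phi_new = np.zeros(nx)
    S = 0.5 * (sigma_s * phi + Q)        # isotropic emission density
    for m in range(len(mu)):
        psi_in = 0.0                     # vacuum boundaries
        cells = range(nx) if mu[m] > 0 else range(nx - 1, -1, -1)
        a = abs(mu[m]) / dx
        for i in cells:                  # diamond-difference sweep
            psi_out = (S[i] + psi_in * (a - 0.5 * sigma_t)) / (a + 0.5 * sigma_t)
            phi_new[i] += w[m] * 0.5 * (psi_in + psi_out)
            psi_in = psi_out
    if np.max(np.abs(phi_new - phi)) < 1e-6 * np.max(np.abs(phi_new)):
        phi = phi_new
        break
    phi = phi_new

print(f"converged after {it + 1} source iterations; midplane flux = {phi[nx // 2]:.4f}")
```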

  17. Tested by Fire - How two recent Wildfires affected Accelerator Operations at LANL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spickermann, Thomas

    2012-08-01

    In a little more than a decade, two large wildfires threatened Los Alamos and impacted accelerator operations at LANL. In 2000 the Cerro Grande Fire destroyed hundreds of homes, as well as structures and equipment at the DARHT facility; the DARHT accelerators themselves were safe in a fire-proof building. In 2011 the Las Conchas Fire burned about 630 square kilometers (250 square miles) and came dangerously close to Los Alamos/LANL. LANSCE accelerator operations lessons learned during the Las Conchas Fire: (1) develop a plan to efficiently shut down the accelerator on short notice; (2) establish clear lines of communication in emergency situations; and (3) plan recovery and keep squirrels out.

  18. MAPA: an interactive accelerator design code with GUI

    NASA Astrophysics Data System (ADS)

    Bruhwiler, David L.; Cary, John R.; Shasharina, Svetlana G.

    1999-06-01

    The MAPA code is an interactive accelerator modeling and design tool with an X/Motif GUI. MAPA has been developed in C++ and makes full use of object-oriented features. We present an overview of its features and describe how users can independently extend the capabilities of the entire application, including the GUI. For example, a user can define a new model for a focusing or accelerating element. If the appropriate form is followed, and the new element is "registered" with a single line in the specified file, then the GUI will fully support this user-defined element type after it has been compiled and then linked to the existing application. In particular, the GUI will bring up windows for modifying any relevant parameters of the new element type. At present, one can use the GUI for phase space tracking, finding fixed points and generating line plots for the Twiss parameters, the dispersion and the accelerator geometry. The user can define new types of simulations which the GUI will automatically support by providing a menu option to execute the simulation and subsequently rendering line plots of the resulting data.

  19. GAPD: a GPU-accelerated atom-based polychromatic diffraction simulation code.

    PubMed

    E, J C; Wang, L; Chen, S; Zhang, Y Y; Luo, S N

    2018-03-01

    GAPD, a graphics-processing-unit (GPU)-accelerated atom-based polychromatic diffraction simulation code for direct, kinematics-based, simulations of X-ray/electron diffraction of large-scale atomic systems with mono-/polychromatic beams and arbitrary plane detector geometries, is presented. This code implements GPU parallel computation via both real- and reciprocal-space decompositions. With GAPD, direct simulations are performed of the reciprocal lattice node of ultralarge systems (∼5 billion atoms) and diffraction patterns of single-crystal and polycrystalline configurations with mono- and polychromatic X-ray beams (including synchrotron undulator sources), and validation, benchmark and application cases are presented.
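
    The kinematical sum that GAPD evaluates in parallel is simple to state: the scattered intensity at wavevector transfer q is I(q) = |Σ_j f_j exp(i q·r_j)|². The serial NumPy sketch below evaluates it for a toy cubic lattice; the lattice, thermal jitter, and constant form factor are placeholders, not GAPD inputs.

```python
# Conceptual kernel of atom-based kinematical diffraction, written serially
# in NumPy: I(q) = |sum_j f_j exp(i q . r_j)|^2. Positions and the constant
# form factor are toy values.
import numpy as np

rng = np.random.default_rng(2)
a = 4.05                                        # lattice constant, angstrom
cells = 8
grid = np.arange(cells) * a
positions = np.array([(x, y, z) for x in grid for y in grid for z in grid])
positions += rng.normal(0.0, 0.02, positions.shape)   # small thermal jitter
f_atom = 1.0                                    # constant form factor (toy)

def intensity(q):
    phases = positions @ q                      # q . r_j for every atom
    amp = f_atom * np.exp(1j * phases).sum()
    return (amp * np.conj(amp)).real

q_200 = np.array([2 * np.pi * 2 / a, 0.0, 0.0])        # (200) reflection
q_off = q_200 * 1.03                                   # slightly off Bragg
print("I(200) =", intensity(q_200))
print("I(off) =", intensity(q_off))
```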

  20. Deploying electromagnetic particle-in-cell (EM-PIC) codes on Xeon Phi accelerators boards

    NASA Astrophysics Data System (ADS)

    Fonseca, Ricardo

    2014-10-01

    The complexity of the phenomena involved in several relevant plasma physics scenarios, where highly nonlinear and kinetic processes dominate, makes purely theoretical descriptions impossible. Further understanding of these scenarios requires detailed numerical modeling, but fully relativistic particle-in-cell codes such as OSIRIS are computationally intensive. The quest towards Exaflop computer systems has led to the development of HPC systems based on add-on accelerator cards, such as GPGPUs and more recently the Xeon Phi accelerators that power the current number 1 system in the world. These cards, also referred to as Intel Many Integrated Core Architecture (MIC), offer peak theoretical performances of >1 TFlop/s for general purpose calculations in a single board, and are receiving significant attention as an attractive alternative to CPUs for plasma modeling. In this work we report on our efforts towards the deployment of an EM-PIC code on a Xeon Phi architecture system. We will focus on the parallelization and vectorization strategies followed, and present a detailed evaluation of code performance in comparison with the CPU code.

  1. Convergence acceleration of the Proteus computer code with multigrid methods

    NASA Technical Reports Server (NTRS)

    Demuren, A. O.; Ibraheem, S. O.

    1995-01-01

    This report presents the results of a study to implement convergence acceleration techniques based on the multigrid concept in the two-dimensional and three-dimensional versions of the Proteus computer code. The first section presents a review of the relevant literature on the implementation of the multigrid methods in computer codes for compressible flow analysis. The next two sections present detailed stability analysis of numerical schemes for solving the Euler and Navier-Stokes equations, based on conventional von Neumann analysis and the bi-grid analysis, respectively. The next section presents details of the computational method used in the Proteus computer code. Finally, the multigrid implementation and applications to several two-dimensional and three-dimensional test problems are presented. The results of the present study show that the multigrid method always leads to a reduction in the number of iterations (or time steps) required for convergence. However, there is an overhead associated with the use of multigrid acceleration. The overhead is higher in 2-D problems than in 3-D problems, thus overall multigrid savings in CPU time are in general better in the latter. Savings of about 40-50 percent are typical in 3-D problems, but they are about 20-30 percent in large 2-D problems. The present multigrid method is applicable to steady-state problems and is therefore ineffective in problems with inherently unstable solutions.

  2. Science and Innovation at Los Alamos

    Science.gov Websites

  3. Electron-Beam Dynamics for an Advanced Flash-Radiography Accelerator

    DOE PAGES

    Ekdahl, Carl

    2015-11-17

    Beam dynamics issues were assessed for a new linear induction electron accelerator being designed for multipulse flash radiography of large explosively driven hydrodynamic experiments. Special attention was paid to equilibrium beam transport, possible emittance growth, and beam stability. Especially problematic would be high-frequency beam instabilities that could blur individual radiographic source spots, low-frequency beam motion that could cause pulse-to-pulse spot displacement, and emittance growth that could enlarge the source spots. Furthermore, beam physics issues were examined through theoretical analysis and computer simulations, including particle-in-cell codes. Beam instabilities investigated included beam breakup, image displacement, diocotron, parametric envelope, ion hose, and the resistive wall instability. The beam corkscrew motion and emittance growth from beam mismatch were also studied. It was concluded that a beam with radiographic quality equivalent to the present accelerators at Los Alamos National Laboratory will result if the same engineering standards and construction details are upheld.

  4. Particle-in-cell/accelerator code for space-charge dominated beam simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-05-08

    Warp is a multidimensional discrete-particle beam simulation program designed to be applicable where the beam space-charge is non-negligible or dominant. It is being developed in a collaboration among LLNL, LBNL and the University of Maryland. It was originally designed and optimized for heavy ion fusion accelerator physics studies, but has received use in a broader range of applications, including for example laser wakefield accelerators, e-cloud studies in high energy accelerators, particle traps and other areas. At present it incorporates 3-D, axisymmetric (r,z), planar (x-z) and transverse slice (x,y) descriptions, with both electrostatic and electromagnetic fields, and a beam envelope model. The code is built atop the Python interpreter language.
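
    The basic deposit-solve-gather-push cycle of an electrostatic particle-in-cell step can be sketched in a few lines. The example below is a generic 1D periodic illustration in normalized units; it is not Warp itself or any of its geometry packages, and all parameters are made up.

```python
# A minimal 1D electrostatic PIC step in the same spirit (deposit charge,
# solve for the field, gather, push), in normalized units. Generic
# illustration only, not Warp.
import numpy as np

ng, np_part, L, dt = 64, 10000, 2 * np.pi, 0.1      # grid, particles, box, step
dx = L / ng
rng = np.random.default_rng(3)
x = rng.uniform(0.0, L, np_part)
v = rng.normal(0.0, 0.05, np_part) + 0.2 * np.sin(2 * np.pi * x / L)  # seed a wave
qm, weight = -1.0, L / np_part                       # charge/mass, particle weight

def field(x):
    # cloud-in-cell charge deposition onto a periodic grid
    g = x / dx
    i = np.floor(g).astype(int) % ng
    frac = g - np.floor(g)
    rho = np.zeros(ng)
    np.add.at(rho, i, (1.0 - frac) * weight / dx)
    np.add.at(rho, (i + 1) % ng, frac * weight / dx)
    rho = -(rho - rho.mean())                        # electrons on a neutral background
    # Poisson solve via FFT: dE/dx = rho
    k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
    rho_k = np.fft.fft(rho)
    E_k = np.zeros_like(rho_k)
    E_k[1:] = -1j * rho_k[1:] / k[1:]
    E = np.real(np.fft.ifft(E_k))
    # gather E back to the particles with the same CIC weights
    return (1.0 - frac) * E[i] + frac * E[(i + 1) % ng]

for step in range(100):                              # simple leapfrog-style push
    v += qm * field(x) * dt
    x = (x + v * dt) % L
print("mean kinetic energy:", 0.5 * np.mean(v**2))
```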

  5. Status report on the 'Merging' of the Electron-Cloud Code POSINST with the 3-D Accelerator PIC CODE WARP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vay, J.-L.; Furman, M.A.; Azevedo, A.W.

    2004-04-19

    We have integrated the electron-cloud code POSINST [1] with WARP [2]--a 3-D parallel Particle-In-Cell accelerator code developed for Heavy Ion Inertial Fusion--so that the two can interoperate. Both codes are run in the same process, communicate through a Python interpreter (already used in WARP), and share certain key arrays (so far, particle positions and velocities). Currently, POSINST provides primary and secondary sources of electrons, beam bunch kicks, a particle mover, and diagnostics. WARP provides the field solvers and diagnostics. Secondary emission routines are provided by the Tech-X package CMEE.

  6. Los Alamos Climatology 2016 Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruggeman, David Alan

    The Los Alamos National Laboratory (LANL or the Laboratory) operates a meteorology monitoring network to support LANL emergency response, engineering designs, environmental compliance, environmental assessments, safety evaluations, weather forecasting, environmental monitoring, research programs, and environmental restoration. Weather data has been collected in Los Alamos since 1910. Bowen (1990) provided climate statistics (temperature and precipitation) for the 1961–1990 averaging period, and included other analyses (e.g., wind and relative humidity) based on the available station locations and time periods. This report provides an update to the 1990 publication Los Alamos Climatology (Bowen 1990).

  7. GeNN: a code generation framework for accelerated brain simulations

    NASA Astrophysics Data System (ADS)

    Yavuz, Esin; Turner, James; Nowotny, Thomas

    2016-01-01

    Large-scale numerical simulations of detailed brain circuit models are important for identifying hypotheses on brain functions and testing their consistency and plausibility. An ongoing challenge for simulating realistic models is, however, computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs, through a flexible and extensible interface, which does not require in-depth technical knowledge from the users. We present performance benchmarks showing that 200-fold speedup compared to a single core of a CPU can be achieved for a network of one million conductance based Hodgkin-Huxley neurons but that for other models the speedup can differ. GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials, Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/.
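
    The core idea here, generating GPU source code from a declarative model description, can be sketched as follows. The model dictionary, the placeholder syntax, and the generated kernel are invented for illustration and do not reproduce GeNN's actual API or templates.

```python
# Conceptual sketch of the code-generation idea (not GeNN's actual API):
# a neuron model is described as data, and CUDA C source for a per-neuron
# update kernel is generated from it as a string.
LIF_MODEL = {
    "name": "lif",
    "params": {"tau": "20.0f", "Vrest": "-65.0f", "Vthresh": "-50.0f"},
    "update": "$(V) += (DT / tau) * (Vrest - $(V) + Isyn[id]);\n"
              "    if ($(V) > Vthresh) { $(V) = Vrest; spiked[id] = 1; }",
}

def generate_kernel(model):
    """Emit CUDA C source for one update kernel from the model description."""
    params = "".join(f"    const float {k} = {v};\n"
                     for k, v in model["params"].items())
    body = model["update"].replace("$(V)", "V[id]")
    return (
        f"__global__ void update_{model['name']}(int n, float DT, float *V,\n"
        f"        const float *Isyn, int *spiked)\n"
        f"{{\n"
        f"    int id = blockIdx.x * blockDim.x + threadIdx.x;\n"
        f"    if (id >= n) return;\n"
        f"{params}"
        f"    {body}\n"
        f"}}\n"
    )

print(generate_kernel(LIF_MODEL))
```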

  8. GeNN: a code generation framework for accelerated brain simulations.

    PubMed

    Yavuz, Esin; Turner, James; Nowotny, Thomas

    2016-01-07

    Large-scale numerical simulations of detailed brain circuit models are important for identifying hypotheses on brain functions and testing their consistency and plausibility. An ongoing challenge for simulating realistic models is, however, computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs, through a flexible and extensible interface, which does not require in-depth technical knowledge from the users. We present performance benchmarks showing that 200-fold speedup compared to a single core of a CPU can be achieved for a network of one million conductance based Hodgkin-Huxley neurons but that for other models the speedup can differ. GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials, Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/.

  9. GeNN: a code generation framework for accelerated brain simulations

    PubMed Central

    Yavuz, Esin; Turner, James; Nowotny, Thomas

    2016-01-01

    Large-scale numerical simulations of detailed brain circuit models are important for identifying hypotheses on brain functions and testing their consistency and plausibility. An ongoing challenge for simulating realistic models is, however, computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs, through a flexible and extensible interface, which does not require in-depth technical knowledge from the users. We present performance benchmarks showing that 200-fold speedup compared to a single core of a CPU can be achieved for a network of one million conductance based Hodgkin-Huxley neurons but that for other models the speedup can differ. GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials, Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/. PMID:26740369

  10. Los Alamos offers Fellowships

    NASA Astrophysics Data System (ADS)

    Los Alamos National Laboratory in New Mexico is calling for applications for postdoctoral appointments and research fellowships. The positions are available in geoscience as well as other scientific disciplines. The laboratory, which is operated by the University of California for the Department of Energy, awards J. Robert Oppenheimer Research Fellowships to scientists who either have or will soon complete doctoral degrees. The appointments are for two years, are renewable for a third year, and carry a stipend of $51,865 per year. Potential applicants should send a resume or employment application and a statement of research goals to Carol M. Rich, Div. 89, Human Resources Development Division, MS P290, Los Alamos National Laboratory, Los Alamos, New Mexico 87545 by mid-November.

  11. Reliability enhancement of Navier-Stokes codes through convergence acceleration

    NASA Technical Reports Server (NTRS)

    Merkle, Charles L.; Dulikravich, George S.

    1995-01-01

    Methods for enhancing the reliability of Navier-Stokes computer codes through improving convergence characteristics are presented. The improving of these characteristics decreases the likelihood of code unreliability and user interventions in a design environment. The problem referred to as a 'stiffness' in the governing equations for propulsion-related flowfields is investigated, particularly in regard to common sources of equation stiffness that lead to convergence degradation of CFD algorithms. Von Neumann stability theory is employed as a tool to study the convergence difficulties involved. Based on the stability results, improved algorithms are devised to ensure efficient convergence in different situations. A number of test cases are considered to confirm a correlation between stability theory and numerical convergence. The examples of turbulent and reacting flow are presented, and a generalized form of the preconditioning matrix is derived to handle these problems, i.e., the problems involving additional differential equations for describing the transport of turbulent kinetic energy, dissipation rate and chemical species. Algorithms for unsteady computations are considered. The extension of the preconditioning techniques and algorithms derived for Navier-Stokes computations to three-dimensional flow problems is discussed. New methods to accelerate the convergence of iterative schemes for the numerical integration of systems of partial differential equations are developed, with a special emphasis on the acceleration of convergence on highly clustered grids.
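
    The von Neumann analysis used here can be illustrated on the standard model problem u_t + a u_x = 0 with the first-order upwind scheme, whose amplification factor is G(theta) = 1 - c(1 - exp(-i theta)) for Courant number c. The sketch below simply evaluates |G| over the resolved Fourier angles; it is a generic example, not the preconditioned Navier-Stokes system studied in the paper.

```python
# Worked von Neumann stability check for first-order upwind advection,
# u_t + a u_x = 0: amplification factor G(theta) = 1 - c*(1 - exp(-i*theta)),
# with Courant number c = a*dt/dx. Stable when max |G| <= 1.
import numpy as np

theta = np.linspace(0.0, np.pi, 200)          # resolved Fourier angles
for c in (0.5, 1.0, 1.2):
    G = 1.0 - c * (1.0 - np.exp(-1j * theta))
    g_max = np.abs(G).max()
    print(f"c = {c:3.1f}: max |G| = {g_max:.3f} "
          f"({'stable' if g_max <= 1.0 + 1e-12 else 'unstable'})")
```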

  12. Los Alamos National Lab: National Security Science

    Science.gov Websites

  13. GOTHIC: Gravitational oct-tree code accelerated by hierarchical time step controlling

    NASA Astrophysics Data System (ADS)

    Miki, Yohei; Umemura, Masayuki

    2017-04-01

    The tree method is a widely implemented algorithm for collisionless N-body simulations in astrophysics well suited for GPU(s). Adopting hierarchical time stepping can accelerate N-body simulations; however, it is infrequently implemented and its potential remains untested in GPU implementations. We have developed a Gravitational Oct-Tree code accelerated by HIerarchical time step Controlling named GOTHIC, which adopts both the tree method and the hierarchical time step. The code adopts some adaptive optimizations by monitoring the execution time of each function on-the-fly and minimizes the time-to-solution by balancing the measured time of multiple functions. Results of performance measurements with realistic particle distribution performed on NVIDIA Tesla M2090, K20X, and GeForce GTX TITAN X, which are representative GPUs of the Fermi, Kepler, and Maxwell generation of GPUs, show that the hierarchical time step achieves a speedup by a factor of around 3-5 times compared to the shared time step. The measured elapsed time per step of GOTHIC is 0.30 s or 0.44 s on GTX TITAN X when the particle distribution represents the Andromeda galaxy or the NFW sphere, respectively, with 2^24 = 16,777,216 particles. The averaged performance of the code corresponds to 10-30% of the theoretical single precision peak performance of the GPU.
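
    The hierarchical (block) time-step idea can be sketched independently of the gravity kernel: particles are binned into power-of-two fractions of the largest step and only the "due" bins are advanced on each substep. The level-assignment rule, the fake accelerations, and the counts below are invented for illustration and are not GOTHIC's scheduling.

```python
# Sketch of hierarchical (block) time steps: each particle gets a
# power-of-two fraction of the maximum step from a per-particle criterion,
# and only particles whose level is "due" are updated on a given substep.
# Accelerations here are stand-ins, not a tree-code gravity kernel.
import numpy as np

rng = np.random.default_rng(4)
n, dt_max, eta, levels = 1000, 1.0, 0.25, 5
accel_mag = rng.lognormal(0.0, 1.5, n)            # fake |acceleration| values

# choose level L so that dt_max / 2**L <= eta / sqrt(|a|)  (hypothetical rule)
want = eta / np.sqrt(accel_mag)
level = np.clip(np.ceil(np.log2(dt_max / want)), 0, levels - 1).astype(int)

updates = 0
n_sub = 2 ** (levels - 1)                          # substeps per big step
for sub in range(n_sub):
    # a particle at level L is due every 2**(levels-1-L) substeps
    due = (sub % (2 ** (levels - 1 - level))) == 0
    updates += int(due.sum())

print("particles per level:", np.bincount(level, minlength=levels))
print(f"kicks per big step: {updates} (a shared smallest step would need "
      f"{n * n_sub})")
```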

  14. Proceedings of the 1995 Particle Accelerator Conference and international Conference on High-Energy Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1996-01-01

    Papers from the sixteenth biennial Particle Accelerator Conference, an international forum on accelerator science and technology held May 1–5, 1995, in Dallas, Texas, organized by Los Alamos National Laboratory (LANL) and Stanford Linear Accelerator Center (SLAC), jointly sponsored by the Institute of Electrical and Electronics Engineers (IEEE) Nuclear and Plasma Sciences Society (NPSS), the American Physical Society (APS) Division of Particles and Beams (DPB), and the International Union of Pure and Applied Physics (IUPAP), and conducted with support from the US Department of Energy, the National Science Foundation, and the Office of Naval Research.

  15. Comparisons of time explicit hybrid kinetic-fluid code Architect for Plasma Wakefield Acceleration with a full PIC code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Massimo, F. (francesco.massimo@ensta-paristech.fr), Dipartimento SBAI, Università di Roma “La Sapienza”, Via A. Scarpa 14, 00161 Roma; Atzeni, S.

    Architect, a time explicit hybrid code designed to perform quick simulations for electron driven plasma wakefield acceleration, is described. In order to obtain beam quality acceptable for applications, control of the beam-plasma dynamics is necessary. Particle in Cell (PIC) codes represent the state-of-the-art technique to investigate the underlying physics and possible experimental scenarios; however, PIC codes demand heavy computational resources. The Architect code substantially reduces the need for computational resources by using a hybrid approach: relativistic electron bunches are treated kinetically as in a PIC code and the background plasma as a fluid. Cylindrical symmetry is assumed for the solution of the electromagnetic fields and fluid equations. In this paper, both the underlying algorithms and a comparison with a fully three-dimensional particle in cell code are reported. The comparison highlights the good agreement between the two models up to the weakly non-linear regimes. In highly non-linear regimes the two models only disagree in a localized region, where the plasma electrons expelled by the bunch close up at the end of the first plasma oscillation.
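
    Architect's algorithm is not reproduced here; as a much simpler illustration of the fluid picture of a plasma wake driven by a rigid electron bunch, the sketch below integrates the textbook linearized cold-fluid response equation in the co-moving coordinate. The plasma wavenumber, bunch density, and bunch length are arbitrary assumptions in normalized units.

```python
# Linearized cold-fluid wake behind a rigid Gaussian electron driver
# (textbook-style linear response; NOT the Architect algorithm).
# (d^2/dxi^2 + kp^2) X = -kp^2 nb(xi)/n0,  with X = dn/n0 and xi the co-moving coordinate.
import numpy as np

kp = 1.0                        # plasma wavenumber (normalized units, assumed)
nb_peak, sigma_xi = 0.05, 0.5   # bunch density / n0 and RMS length (assumed)

def nb(xi):
    return nb_peak * np.exp(-0.5 * (xi / sigma_xi) ** 2)

xi = np.linspace(-5.0, 20.0, 5001)   # head of the bunch near xi = 0
h = xi[1] - xi[0]
X = np.zeros_like(xi)                # density perturbation dn/n0
V = np.zeros_like(xi)                # dX/dxi

for i in range(len(xi) - 1):         # semi-implicit Euler integration in xi
    a = -kp**2 * (X[i] + nb(xi[i]))
    V[i + 1] = V[i] + h * a
    X[i + 1] = X[i] + h * V[i + 1]

print("max |dn/n0| in the wake:", np.abs(X).max())
```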

  16. The Los Alamos Neutron Science Center Spallation Neutron Sources

    NASA Astrophysics Data System (ADS)

    Nowicki, Suzanne F.; Wender, Stephen A.; Mocko, Michael

    The Los Alamos Neutron Science Center (LANSCE) provides the scientific community with intense sources of neutrons, which can be used to perform experiments supporting civilian and national security research. These measurements include nuclear physics experiments for the defense program, basic science, and the radiation effects programs. This paper focuses on the radiation effects program, which involves mostly accelerated testing of semiconductor parts. When cosmic rays strike the earth's atmosphere, they cause nuclear reactions with elements in the air and produce a wide range of energetic particles. Because neutrons are uncharged, they can reach aircraft altitudes and sea level. These neutrons are thought to be the most important threat to semiconductor devices and integrated circuits. The best way to determine the failure rate due to these neutrons is to measure the failure rate in a neutron source that has the same spectrum as those produced by cosmic rays. Los Alamos has a high-energy and a low-energy neutron source for semiconductor testing. Both are driven by the 800-MeV proton beam from the LANSCE accelerator. The high-energy neutron source at the Weapons Neutron Research (WNR) facility uses a bare target that is designed to produce fast neutrons with energies from 100 keV to almost 800 MeV. The measured neutron energy distribution from WNR is very similar to that of the cosmic-ray-induced neutrons in the atmosphere. However, the flux provided at the WNR facility is typically 5×10⁷ times more intense than the flux of the cosmic-ray-induced neutrons. This intense neutron flux allows testing at greatly accelerated rates. An irradiation test of less than an hour is equivalent to many years of neutron exposure due to cosmic-ray neutrons. The low-energy neutron source is located at the Lujan Neutron Scattering Center. It is based on a moderated source that provides useful neutrons from subthermal energies to ∼100 keV. The characteristics of these sources, and
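
    As a worked check of the statement above, only the 5×10⁷ intensity ratio is taken from the abstract; the rest is arithmetic showing why an hour-long irradiation corresponds to "many years" of natural exposure.

```python
# Accelerated-testing arithmetic based on the flux ratio quoted in the abstract.
# Only the 5e7 intensity ratio comes from the text; everything else is arithmetic.
acceleration_factor = 5e7          # WNR flux / atmospheric cosmic-ray neutron flux
test_hours = 1.0                   # a one-hour irradiation test

equivalent_hours = test_hours * acceleration_factor
equivalent_years = equivalent_hours / (24 * 365.25)
print(f"1 hour at WNR ~ {equivalent_years:,.0f} years of natural neutron exposure")
# -> roughly 5,700 years of exposure per device, consistent with "many years"
```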

  17. The Los Alamos Neutron Science Center Spallation Neutron Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nowicki, Suzanne F.; Wender, Stephen A.; Mocko, Michael

    The Los Alamos Neutron Science Center (LANSCE) provides the scientific community with intense sources of neutrons, which can be used to perform experiments supporting civilian and national security research. These measurements include nuclear physics experiments for the defense program, basic science, and the radiation effects programs. This paper focuses on the radiation effects program, which involves mostly accelerated testing of semiconductor parts. When cosmic rays strike the earth's atmosphere, they cause nuclear reactions with elements in the air and produce a wide range of energetic particles. Because neutrons are uncharged, they can reach aircraft altitudes and sea level. These neutrons are thought to be the most important threat to semiconductor devices and integrated circuits. The best way to determine the failure rate due to these neutrons is to measure the failure rate in a neutron source that has the same spectrum as those produced by cosmic rays. Los Alamos has a high-energy and a low-energy neutron source for semiconductor testing. Both are driven by the 800-MeV proton beam from the LANSCE accelerator. The high-energy neutron source at the Weapons Neutron Research (WNR) facility uses a bare target that is designed to produce fast neutrons with energies from 100 keV to almost 800 MeV. The measured neutron energy distribution from WNR is very similar to that of the cosmic-ray-induced neutrons in the atmosphere. However, the flux provided at the WNR facility is typically 5×10⁷ times more intense than the flux of the cosmic-ray-induced neutrons. This intense neutron flux allows testing at greatly accelerated rates. An irradiation test of less than an hour is equivalent to many years of neutron exposure due to cosmic-ray neutrons. The low-energy neutron source is located at the Lujan Neutron Scattering Center. It is based on a moderated source that provides useful neutrons from subthermal energies to ~100 keV. The characteristics of these sources

  18. The Los Alamos Neutron Science Center Spallation Neutron Sources

    DOE PAGES

    Nowicki, Suzanne F.; Wender, Stephen A.; Mocko, Michael

    2017-10-26

    The Los Alamos Neutron Science Center (LANSCE) provides the scientific community with intense sources of neutrons, which can be used to perform experiments supporting civilian and national security research. These measurements include nuclear physics experiments for the defense program, basic science, and the radiation effects programs. This paper focuses on the radiation effects program, which involves mostly accelerated testing of semiconductor parts. When cosmic rays strike the earth's atmosphere, they cause nuclear reactions with elements in the air and produce a wide range of energetic particles. Because neutrons are uncharged, they can reach aircraft altitudes and sea level. These neutrons are thought to be the most important threat to semiconductor devices and integrated circuits. The best way to determine the failure rate due to these neutrons is to measure the failure rate in a neutron source that has the same spectrum as those produced by cosmic rays. Los Alamos has a high-energy and a low-energy neutron source for semiconductor testing. Both are driven by the 800-MeV proton beam from the LANSCE accelerator. The high-energy neutron source at the Weapons Neutron Research (WNR) facility uses a bare target that is designed to produce fast neutrons with energies from 100 keV to almost 800 MeV. The measured neutron energy distribution from WNR is very similar to that of the cosmic-ray-induced neutrons in the atmosphere. However, the flux provided at the WNR facility is typically 5×10⁷ times more intense than the flux of the cosmic-ray-induced neutrons. This intense neutron flux allows testing at greatly accelerated rates. An irradiation test of less than an hour is equivalent to many years of neutron exposure due to cosmic-ray neutrons. The low-energy neutron source is located at the Lujan Neutron Scattering Center. It is based on a moderated source that provides useful neutrons from subthermal energies to ~100 keV. The characteristics of these sources

  19. Theoretical modeling of laser-induced plasmas using the ATOMIC code

    NASA Astrophysics Data System (ADS)

    Colgan, James; Johns, Heather; Kilcrease, David; Judge, Elizabeth; Barefield, James, II; Clegg, Samuel; Hartig, Kyle

    2014-10-01

    We report on efforts to model the emission spectra generated from laser-induced breakdown spectroscopy (LIBS). LIBS is a popular and powerful method of quickly and accurately characterizing unknown samples in a remote manner. In particular, LIBS is utilized by the ChemCam instrument on the Mars Science Laboratory. We model the LIBS plasma using the Los Alamos suite of atomic physics codes. Since LIBS plasmas generally have temperatures between 3000 K and 12000 K, the emission spectra typically result from the neutral and singly ionized stages of the target atoms. We use the Los Alamos atomic structure and collision codes to generate sets of atomic data and use the plasma kinetics code ATOMIC to perform LTE or non-LTE calculations that generate level populations and an emission spectrum for the element of interest. In this presentation we compare the emission spectrum from ATOMIC with an Fe LIBS laboratory-generated plasma as well as spectra from the ChemCam instrument. We also discuss various physics aspects of the modeling of LIBS plasmas that are necessary for accurate characterization of the plasma, such as multi-element target composition effects, radiation transport effects, and accurate line shape treatments. The Los Alamos National Laboratory is operated by Los Alamos National Security, LLC for the National Nuclear Security Administration of the U.S. Department of Energy under Contract No. DE-AC52-06NA25396.
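
    The ATOMIC code itself performs full LTE and non-LTE kinetics with real atomic data; purely as an illustration of the LTE step mentioned above, the sketch below computes Boltzmann level populations at LIBS-like temperatures and the relative intensity of two emission lines. The level energies, degeneracies, and transition rates are invented for illustration.

```python
# LTE level populations and a relative line-intensity ratio (illustrative only;
# not ATOMIC and not real atomic data).
import numpy as np

k_B = 8.617333262e-5        # Boltzmann constant, eV/K

# hypothetical excited levels of a neutral atom: (excitation energy eV, weight g)
levels = {
    "upper_1": (3.65, 9),
    "upper_2": (4.10, 7),
}
A_ul = {"upper_1": 1.0e8, "upper_2": 5.0e7}    # assumed transition rates, 1/s
E_photon = {"upper_1": 3.65, "upper_2": 4.10}  # resonance lines to the ground state, eV

def boltzmann_population(E, g, T):
    """Relative LTE population ~ g * exp(-E / kT) (unnormalized)."""
    return g * np.exp(-E / (k_B * T))

for T in (5000.0, 10000.0):                    # LIBS-like temperatures, K
    pops = {n: boltzmann_population(E, g, T) for n, (E, g) in levels.items()}
    # emissivity of an optically thin line ~ N_upper * A_ul * E_photon
    emiss = {n: pops[n] * A_ul[n] * E_photon[n] for n in levels}
    ratio = emiss["upper_1"] / emiss["upper_2"]
    print(f"T = {T:7.0f} K: line intensity ratio (line 1 / line 2) = {ratio:6.2f}")
```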

  20. New facility for ion beam materials characterization and modification at Los Alamos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tesmer, J.R.; Maggiore, C.J.; Parkin, D.M.

    1988-01-01

    The Ion Beam Materials Laboratory (IBML) is a new Los Alamos laboratory devoted to the characterization and modification of the near surfaces of materials. The primary instruments of the IBML are a tandem electrostatic accelerator, a National Electrostatics Corp. Model 9SDH, coupled with a Varian CF-3000 ion implanter. The unique organizational structure of the IBML as well as the operational characteristics of the 9SDH (after approximately 3000 h of operation) and the laboratory's research capabilities will be discussed. Examples of current research results will also be presented. 5 refs., 2 figs.

  1. Water Supply at Los Alamos 1998-2001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard J. Koch; David B. Rogers

    2003-03-01

    For the period 1998 through 2001, the total water used at Los Alamos from all sources ranged from 1325 million gallons (Mg) in 1999 to 1515 Mg in 2000. Groundwater production ranged from 1323 Mg in 1999 to 1506 Mg in 2000 from the Guaje, Pajarito, and Otowi fields. Nonpotable surface water used from Los Alamos reservoir ranged from zero gallons in 2001 to 9.3 Mg in 2000. For years 1998 through 2001, over 99% of all water used at Los Alamos was groundwater. Water use by Los Alamos National Laboratory (LANL) between 1998 and 2001 ranged from 379 Mg in 2000 to 461 Mg in 1998. The LANL water use in 2001 was 393 Mg or 27% of the total water use at Los Alamos. Water use by Los Alamos County ranged from 872 Mg in 1999 to 1137 Mg in 2000, and averaged 1006 Mg/yr. Four new replacement wells in the Guaje field (G-2A, G-3A, G-4A, and G-5A) were drilled in 1998 and began production in 1999; with existing well G-1A, the Guaje field currently has five producing wells. Five of the old Guaje wells (G-1, G-2, G-4, G-5, and G-6) were plugged and abandoned in 1999, and one well (G-3) was abandoned but remains as an observation well for the Guaje field. The long-term water level observations in production and observation (test) wells at Los Alamos are consistent with the formation of a cone of depression in response to water production. The water level decline is gradual and at most has been about 0.7 to 2 ft per year for production wells and from 0.4 to 0.9 ft/yr for observation (test) wells. The largest water level declines have been in the Guaje field where nonpumping water levels were about 91 ft lower in 2001 than in 1951. The initial water levels of the Guaje replacement wells were 32 to 57 ft lower than the initial water levels of adjacent original Guaje wells. When production wells are taken off-line for pump replacement or repair, water levels have returned to within about 25 ft of initial static levels within 6 to 12 months. Thus, the water-level trends suggest no

  2. Code Modernization of VPIC

    NASA Astrophysics Data System (ADS)

    Bird, Robert; Nystrom, David; Albright, Brian

    2017-10-01

    The ability of scientific simulations to effectively deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code for modern hardware generations, but despite the breakthroughs in the areas of mini-app development, portable performance, and cache-oblivious algorithms, the problem still remains largely unsolved. In this work we demonstrate how a focus on platform-agnostic modern code development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsics-based vectorisation with compiler-generated auto-vectorization to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU-capable OpenMP variant of VPIC. Finally, we include lessons learned. Work performed under the auspices of the U.S. Dept. of Energy by the Los Alamos National Security, LLC Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.

  3. Investigation and Prediction of RF Window Performance in APT Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humphries, S. Jr.

    1997-05-01

    The work described in this report was performed between November 1996 and May 1997 in support of the APT (Accelerator Production of Tritium) Program at Los Alamos National Laboratory. The goal was to write and to test computer programs for charged particle orbits in RF fields. The well-documented programs were written in portable form and compiled for standard personal computers for easy distribution to LANL researchers. They will be used in several APT applications including the following. Minimization of multipactor effects in the moderate-β superconducting linac cavities under design for the APT accelerator. Investigation of suppression techniques for electron multipactoring in high-power RF feedthroughs. Modeling of the response of electron detectors for the protection of high-power RF vacuum windows. In the contract period two new codes, Trak_RF and WaveSim, were completed and several critical benchmark tests were carried out. Trak_RF numerically tracks charged particle orbits in combined electrostatic, magnetostatic and electromagnetic fields. WaveSim determines frequency-domain RF field solutions and provides a key input to Trak_RF. The two-dimensional programs handle planar or cylindrical geometries. They have several unique characteristics.

  4. Los Alamos Fires From Landsat 7

    NASA Technical Reports Server (NTRS)

    2002-01-01

    On May 9, 2000, the Landsat 7 satellite acquired an image of the area around Los Alamos, New Mexico. The Landsat 7 satellite acquired this image from 427 miles in space through its sensor called the Enhanced Thematic Mapper Plus (ETM+). Evident within the imagery is a view of the ongoing Cerro Grande fire near the town of Los Alamos and the Los Alamos National Laboratory. Combining the high-resolution (30 meters per pixel in this scene) imaging capacity of ETM+ with its multi-spectral capabilities allows scientists to penetrate the smoke plume and see the structure of the fire on the surface. Notice the high level of detail in the infrared image (bottom), in which burn scars are clearly distinguished from the hotter smoldering and flaming parts of the fire. Within this image pair several features are clearly visible, including the Cerro Grande fire and smoke plume, the town of Los Alamos, the Los Alamos National Laboratory and associated property, and Cerro Grande peak. Combining ETM+ channels 7, 4, and 2 (one visible and two infrared channels) results in a false color image where vegetation appears as bright to dark green (bottom image). Forested areas are generally dark green while herbaceous vegetation is light green. Rangeland or more open areas appear pink to light purple. Areas with extensive pavement or urban development appear light blue or white to purple. Less densely-developed residential areas appear light green and golf courses are very bright green. The areas recently burned appear black. Dark red to bright red patches, or linear features within the burned area, are the hottest and possibly actively burning areas of the fire. The fire is spreading downslope and the front of the fire is readily detectable about 2 kilometers to the west and south of Los Alamos. Combining ETM+ channels 3, 2, and 1 provides a true-color image of the greater Los Alamos region (top image). Vegetation is generally dark to medium green. Forested areas are very dark green

  5. Liquid rocket combustor computer code development

    NASA Technical Reports Server (NTRS)

    Liang, P. Y.

    1985-01-01

    The Advanced Rocket Injector/Combustor Code (ARICC), developed to model the complete chemical/fluid/thermal processes occurring inside rocket combustion chambers, is highlighted. The code, derived from the CONCHAS-SPRAY code originally developed at Los Alamos National Laboratory, incorporates powerful features such as the ability to model complex injector combustion chamber geometries, Lagrangian tracking of droplets, full chemical equilibrium and kinetic reactions for multiple species, a fractional volume of fluid (VOF) description of liquid jet injection in addition to the gaseous phase fluid dynamics, and turbulent mass, energy, and momentum transport. Atomization and droplet dynamic models from earlier generation codes are transplanted into the present code. Currently, ARICC is specialized for liquid oxygen/hydrogen propellants, although other fuel/oxidizer pairs can be easily substituted.

  6. Internship at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunham, Ryan Q.

    2012-07-11

    Los Alamos National Laboratory (LANL) is located in Los Alamos, New Mexico. It provides support for our country's nuclear weapon stockpile as well as many other scientific research projects. I am an Undergraduate Student Intern in the Systems Design and Analysis group within the Nuclear Nonproliferation division of the Global Security directorate at LANL. I have been tasked with data analysis and modeling of particles in a fluidized bed system for the capture of carbon dioxide from power plant flue gas.

  7. Publications of Los Alamos Research, 1983

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheridan, C.J.; McClary, W.J.; Rich, J.A.

    1984-10-01

    This bibliography is a compilation of unclassified publications of work done at the Los Alamos National Laboratory for 1983. Papers published in 1982 are included regardless of when they were actually written. Publications received too late for inclusion in earlier compilations have also been listed. Declassification of previously classified reports is considered to constitute publication. All classified issuances are omitted - even those papers, themselves unclassified, which were published only as part of a classified document. If a paper was published more than once, all places of publication are included. The bibliography includes Los Alamos National Laboratory reports, papers released as non-Laboratory reports, journal articles, books, chapters of books, conference papers either published separately or as part of conference proceedings issued as books or reports, papers published in congressional hearings, theses, and US patents. Publications by Los Alamos authors that are not records of Laboratory-sponsored work are included when the Library becomes aware of them.

  8. A Sailor in the Los Alamos Navy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judd, D. L.; Meade, Roger Allen

    As part of the War Department’s Manhattan Engineer District (MED), Los Alamos was an Army installation during World War II, complete with a base commander and a brace of MPs. But it was a unique Army installation, having more civilian than military personnel. Even more unusual was the work performed by the civilian population, work that required highly educated scientists and engineers. As the breadth, scope, and complexity of the Laboratory’s work increased, more and more technically educated and trained personnel were needed. But the manpower needs of the nation’s war economy had created a shortage of such people. To meet its manpower needs, the MED scoured the ranks of the Army for anyone who had technical training and reassigned these men to its laboratories, including Los Alamos, as part of its Special Engineer Detachment (SED). Among the SEDs assigned to Los Alamos was Val Fitch, who was awarded the Nobel Prize in Physics in 1980. Another was Al Van Vessem, who helped stack the TNT for the 100 ton test, bolted together the Trinity device, and rode shotgun with the bomb as it was driven from Los Alamos to ground zero.

  9. Environmental analysis of Lower Pueblo/Lower Los Alamos Canyon, Los Alamos, New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferenbaugh, R.W.; Buhl, T.E.; Stoker, A.K.

    1994-12-01

    The radiological survey of the former radioactive waste treatment plant site (TA-45), Acid Canyon, Pueblo Canyon, and Los Alamos Canyon found residual contamination at the site itself and in the channel and banks of Acid, Pueblo, and lower Los Alamos Canyons all the way to the Rio Grande. The largest reservoir of residual radioactivity is in lower Pueblo Canyon, which is on DOE property. However, residual radioactivity does not exceed proposed cleanup criteria in either lower Pueblo or lower Los Alamos Canyons. The three alternatives proposed are (1) to take no action, (2) to construct a sediment trap in lower Pueblo Canyon to prevent further transport of residual radioactivity onto San Ildefonso Indian Pueblo land, and (3) to clean the residual radioactivity from the canyon system. Alternative 3, to clean up the canyon system, is rejected as a viable alternative. Thousands of truckloads of sediment would have to be removed and disposed of, and this effort is unwarranted by the low levels of contamination present. Residual radioactivity levels, under either present conditions or projected future conditions, will not result in significant radiation doses to persons exposed. Modeling efforts show that future transport activity will not result in any residual radioactivity concentrations higher than those already existing. Thus, although construction of a sediment trap in lower Pueblo Canyon is a viable alternative, this effort also is unwarranted, and the no-action alternative is the preferred alternative.

  10. Los Alamos Neutron Science Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kippen, Karen Elizabeth

    For more than 30 years the Los Alamos Neutron Science Center (LANSCE) has provided the scientific underpinnings in nuclear physics and material science needed to ensure the safety and surety of the nuclear stockpile into the future. In addition to national security research, the LANSCE User Facility has a vibrant research program in fundamental science, providing the scientific community with intense sources of neutrons and protons to perform experiments supporting civilian research and the production of medical and research isotopes. Five major experimental facilities operate simultaneously. These facilities contribute to the stockpile stewardship program, produce radionuclides for medical testing, and provide a venue for industrial users to irradiate and test electronics. In addition, they perform fundamental research in nuclear physics, nuclear astrophysics, materials science, and many other areas. The LANSCE User Program plays a key role in training the next generation of top scientists and in attracting the best graduate students, postdoctoral researchers, and early-career scientists. The U.S. Department of Energy (DOE), National Nuclear Security Administration (NNSA), the principal sponsor of LANSCE, works with the Office of Science and the Office of Nuclear Energy, which have synergistic long-term needs for the linear accelerator and the neutron science that is the heart of LANSCE.

  11. Theoretical atomic physics code development I: CATS: Cowan Atomic Structure Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdallah, J. Jr.; Clark, R.E.H.; Cowan, R.D.

    An adaptation of R.D. Cowan's Atomic Structure program, CATS, has been developed as part of the Theoretical Atomic Physics (TAPS) code development effort at Los Alamos. CATS has been designed to be easy to run and to produce data files that can interface with other programs easily. The CATS produced data files currently include wave functions, energy levels, oscillator strengths, plane-wave-Born electron-ion collision strengths, photoionization cross sections, and a variety of other quantities. This paper describes the use of CATS. 10 refs.

  12. Los Alamos National Laboratory Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neu, Mary

    Mary Neu, Associate Director for Chemistry, Life and Earth Sciences at Los Alamos National Laboratory, delivers opening remarks at the "Sequencing, Finishing, Analysis in the Future" meeting in Santa Fe, NM.

  13. The MARS15-based FermiCORD code system for calculation of the accelerator-induced residual dose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grebe, A.; Leveling, A.; Lu, T.

    The FermiCORD code system, a set of codes based on MARS15 that calculates the accelerator-induced residual doses at experimental facilities of arbitrary configurations, has been developed. FermiCORD is written in C++ as an add-on to Fortran-based MARS15. The FermiCORD algorithm consists of two stages: 1) simulation of residual doses on contact with the surfaces surrounding the studied location and of radionuclide inventories in the structures surrounding those locations using MARS15, and 2) simulation of the emission of the nuclear decay gamma-quanta by the residuals in the activated structures and scoring the prompt doses of these gamma-quanta at arbitrary distances from those structures. The FermiCORD code system has been benchmarked against similar algorithms based on other code systems and showed a good agreement. The code system has been applied for calculation of the residual dose of the target station for the Mu2e experiment and the results have been compared to approximate dosimetric approaches.

  14. The MARS15-based FermiCORD code system for calculation of the accelerator-induced residual dose

    NASA Astrophysics Data System (ADS)

    Grebe, A.; Leveling, A.; Lu, T.; Mokhov, N.; Pronskikh, V.

    2018-01-01

    The FermiCORD code system, a set of codes based on MARS15 that calculates the accelerator-induced residual doses at experimental facilities of arbitrary configurations, has been developed. FermiCORD is written in C++ as an add-on to Fortran-based MARS15. The FermiCORD algorithm consists of two stages: 1) simulation of residual doses on contact with the surfaces surrounding the studied location and of radionuclide inventories in the structures surrounding those locations using MARS15, and 2) simulation of the emission of the nuclear decay γ-quanta by the residuals in the activated structures and scoring the prompt doses of these γ-quanta at arbitrary distances from those structures. The FermiCORD code system has been benchmarked against similar algorithms based on other code systems and against experimental data from the CERF facility at CERN, and FermiCORD showed reasonable agreement with these. The code system has been applied for calculation of the residual dose of the target station for the Mu2e experiment and the results have been compared to approximate dosimetric approaches.
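
    FermiCORD couples MARS15 radionuclide inventories to a gamma transport step; the sketch below only illustrates the general two-stage idea with a single hypothetical radionuclide: first build up an activity during irradiation and let it decay through a cool-down, then score an unshielded point-source gamma dose rate at a distance. Every number (production rate, half-life, gamma yield, distance) is invented for illustration.

```python
# Two-stage residual-dose illustration (NOT FermiCORD/MARS15): (1) build up and
# decay the activity of one hypothetical radionuclide, (2) score the unshielded
# point-source gamma dose rate at a distance. All numbers are invented.
import math

# --- stage 1: activation during irradiation, then cool-down ---
production_rate = 1.0e9          # atoms/s while the beam is on (assumed)
half_life_s = 15.0 * 3600.0      # 15 h half-life (assumed)
lam = math.log(2.0) / half_life_s
t_irr = 30.0 * 24 * 3600.0       # 30 days of irradiation (assumed)
t_cool = 8.0 * 3600.0            # 8 h cool-down before access (assumed)
activity_Bq = production_rate * (1.0 - math.exp(-lam * t_irr)) * math.exp(-lam * t_cool)

# --- stage 2: gamma dose rate from the decays (point source, no shielding) ---
gamma_yield = 0.9                # gammas per decay (assumed)
r_cm = 200.0                     # 2 m from the activated structure (assumed)
flux = activity_Bq * gamma_yield / (4.0 * math.pi * r_cm**2)   # gammas / cm^2 / s
# fluence-to-dose factor near 1 MeV: ~4.5e-12 Sv per gamma/cm^2 (rule of thumb)
dose_rate_uSv_h = flux * 4.5e-12 * 3600.0 * 1.0e6

print(f"activity after cool-down: {activity_Bq:.3e} Bq")
print(f"unshielded dose rate at {r_cm / 100:.0f} m: {dose_rate_uSv_h:.1f} uSv/h")
```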

  15. SEDs at Los Alamos: A Personal Memoir

    NASA Astrophysics Data System (ADS)

    Bederson, Benjamin

    2001-03-01

    I have written this personal memoir approximately 55 years after the events I describe. It is based almost exclusively on memory, since apart from the diary I kept while on Tinian, I have few documents concerning it. It covers my service in the U.S. Army's Special Engineering Detachment (SED) in Oak Ridge and Los Alamos in 1944-45, on Tinian island, the launching pad for the bombing raids on Japan, in the summer and fall of 1945, and my return to Los Alamos until my discharge in January 1946.

  16. GPU acceleration of the Locally Selfconsistent Multiple Scattering code for first principles calculation of the ground state and statistical physics of materials

    NASA Astrophysics Data System (ADS)

    Eisenbach, Markus; Larkin, Jeff; Lutjens, Justin; Rennich, Steven; Rogers, James H.

    2017-02-01

    The Locally Self-consistent Multiple Scattering (LSMS) code solves the first principles Density Functional theory Kohn-Sham equation for a wide range of materials with a special focus on metals, alloys and metallic nano-structures. It has traditionally exhibited near perfect scalability on massively parallel high performance computer architectures. We present our efforts to exploit GPUs to accelerate the LSMS code to enable first principles calculations of O(100,000) atoms and statistical physics sampling of finite temperature properties. We reimplement the scattering matrix calculation for GPUs with a block matrix inversion algorithm that only uses accelerator memory. Using the Cray XK7 system Titan at the Oak Ridge Leadership Computing Facility we achieve a sustained performance of 14.5 PFlop/s and a speedup of 8.6 compared to the CPU only code.
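
    The LSMS GPU kernel itself is not reproduced here; as a small CPU-side illustration of the block matrix inversion idea, the sketch below inverts a matrix blockwise via the Schur complement and checks the result against a direct inverse. The matrix size and block split are arbitrary test values.

```python
# Blockwise matrix inversion via the Schur complement (illustrative; the LSMS GPU
# kernel is a tuned, accelerator-memory implementation of this kind of idea).
import numpy as np

rng = np.random.default_rng(1)
n, k = 8, 3                                   # total size and leading-block size (assumed)
M = rng.normal(size=(n, n)) + n * np.eye(n)   # well-conditioned test matrix

A, B = M[:k, :k], M[:k, k:]
C, D = M[k:, :k], M[k:, k:]

Ainv = np.linalg.inv(A)
S = D - C @ Ainv @ B                          # Schur complement of A in M
Sinv = np.linalg.inv(S)

Minv = np.block([
    [Ainv + Ainv @ B @ Sinv @ C @ Ainv, -Ainv @ B @ Sinv],
    [-Sinv @ C @ Ainv,                   Sinv],
])

print("max deviation from direct inverse:",
      np.abs(Minv - np.linalg.inv(M)).max())
```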

  17. Limited-scope probabilistic safety analysis for the Los Alamos Meson Physics Facility (LAMPF)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharirli, M.; Rand, J.L.; Sasser, M.K.

    1992-01-01

    The reliability of instrumentation and safety systems is a major issue in the operation of accelerator facilities. A probabilistic safety analysis was performed on the key safety and instrumentation systems at the Los Alamos Meson Physics Facility (LAMPF). In Phase I of this unique study, the Personnel Safety System (PSS) and the Current Limiters (XLs) were analyzed through the use of fault tree analysis, failure modes and effects analysis, and criticality analysis. Phase II of the program was done to update and reevaluate the safety systems after the Phase I recommendations were implemented. This paper provides a brief review of the studies involved in Phases I and II of the program.

  18. Limited-scope probabilistic safety analysis for the Los Alamos Meson Physics Facility (LAMPF)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharirli, M.; Rand, J.L.; Sasser, M.K.

    1992-12-01

    The reliability of instrumentation and safety systems is a major issue in the operation of accelerator facilities. A probabilistic safety analysis was performed on the key safety and instrumentation systems at the Los Alamos Meson Physics Facility (LAMPF). In Phase I of this unique study, the Personnel Safety System (PSS) and the Current Limiters (XLs) were analyzed through the use of fault tree analysis, failure modes and effects analysis, and criticality analysis. Phase II of the program was done to update and reevaluate the safety systems after the Phase I recommendations were implemented. This paper provides a brief review of the studies involved in Phases I and II of the program.

  19. High Gradient Accelerator Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Temkin, Richard

    The goal of the MIT program of research on high gradient acceleration is the development of advanced acceleration concepts that lead to a practical and affordable next generation linear collider at the TeV energy level. Other applications, which are more near-term, include accelerators for materials processing; medicine; defense; mining; security; and inspection. The specific goals of the MIT program are: • Pioneering theoretical research on advanced structures for high gradient acceleration, including photonic structures and metamaterial structures; evaluation of the wakefields in these advanced structures • Experimental research to demonstrate the properties of advanced structures both in low-power microwave cold test and high-power, high-gradient test at megawatt power levels • Experimental research on microwave breakdown at high gradient including studies of breakdown phenomena induced by RF electric fields and RF magnetic fields; development of new diagnostics of the breakdown process • Theoretical research on the physics and engineering features of RF vacuum breakdown • Maintaining and improving the Haimson / MIT 17 GHz accelerator, the highest frequency operational accelerator in the world, a unique facility for accelerator research • Providing the Haimson / MIT 17 GHz accelerator facility as a facility for outside users • Active participation in the US DOE program of High Gradient Collaboration, including joint work with SLAC and with Los Alamos National Laboratory; participation of MIT students in research at the national laboratories • Training the next generation of Ph.D. students in the field of accelerator physics.

  20. Automated System Calibration and Verification of the Position Measurements for the Los Alamos Isotope Production Facility and the Switchyard Kicker Facilities

    NASA Astrophysics Data System (ADS)

    Barr, D.; Gilpatrick, J. D.; Martinez, D.; Shurter, R. B.

    2004-11-01

    The Los Alamos Neutron Science Center (LANSCE) facility at Los Alamos National Laboratory has constructed both an Isotope Production Facility (IPF) and a Switchyard Kicker (XDK) as additions to the H+ and H- accelerator. These additions contain eleven Beam Position Monitors (BPMs) that measure the beam's position throughout the transport. The analog electronics within each processing module determines the beam position using the log-ratio technique. For system reliability, calibrations compensate for various temperature drifts and other imperfections in the processing electronics components. Additionally, verifications are periodically implemented by a PC running a National Instruments LabVIEW virtual instrument (VI) to verify continued system and cable integrity. The VI communicates with the processor cards via a PCI/MXI-3 VXI-crate communication module. Previously, accelerator operators performed BPM system calibrations typically once per day while beam was explicitly turned off. One of this new measurement system's unique achievements is its automated calibration and verification capability. Taking advantage of the pulsed nature of the LANSCE-facility beams, the integrated electronics hardware and VI perform calibration and verification operations between beam pulses without interrupting production beam delivery. The design, construction, and performance results of the automated calibration and verification portion of this position measurement system will be the topic of this paper.
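
    As a minimal illustration of the log-ratio technique mentioned above (not the LANSCE processing electronics), the sketch below estimates a beam position from the signals on two opposing pickup electrodes; the sensitivity scale and the toy signal model are assumptions for illustration.

```python
# Log-ratio beam-position estimate from two opposing BPM electrodes
# (toy signal model; not the LANSCE processing electronics).
import numpy as np

S_mm = 8.0      # assumed sensitivity scale, mm per unit of ln(A/B)

def electrode_signals(x_true_mm):
    """Toy pickup model: signals vary exponentially with beam displacement."""
    A = np.exp(+x_true_mm / S_mm)   # electrode on the +x side
    B = np.exp(-x_true_mm / S_mm)   # electrode on the -x side
    return A, B

def log_ratio_position(A, B):
    """Position from the log of the signal ratio, x = (S/2) * ln(A/B)."""
    return 0.5 * S_mm * np.log(A / B)

for x_true in (-5.0, 0.0, 3.5):
    A, B = electrode_signals(x_true)
    x_est = log_ratio_position(A, B)
    print(f"true x = {x_true:+5.1f} mm -> log-ratio estimate = {x_est:+5.1f} mm")
```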

  1. Testing of a Plasmadynamic Hypervelocity Dust Accelerator

    NASA Astrophysics Data System (ADS)

    Ticos, Catalin M.; Wang, Zhehui; Dorf, Leonid A.; Wurden, G. A.

    2006-10-01

    A plasmadynamic accelerator for microparticles (or dust grains) has been designed, built and tested at Los Alamos National Laboratory. The dust grains are expected to be accelerated to hypervelocities on the order of 1-30 km/s, depending on their size. The key components of the plasmadynamic accelerator are a coaxial plasma gun operated at 10 kV, a dust dispenser activated by a piezoelectric transducer, and power and remote-control systems. The coaxial plasma gun produces a high density (10^18 cm^-3) and low temperature (~1 eV) plasma in deuterium ejected by J × B forces, which provides drag on the dust particles in its path. Carbon dust particles will be used, with diameters from 1 to 50 μm. The plasma parameters produced in the coaxial gun are presented and their implication to dust acceleration is discussed. High speed dust will be injected in the National Spherical Torus Experiment to measure the pitch angle of magnetic field lines.
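
    A rough feel for plasma-drag acceleration of a dust grain (not the actual coaxial-gun physics) can be had from a simple drag model: the grain gains speed from the fast plasma flow until it approaches the flow velocity. The grain size, flow speed, pulse duration, and drag coefficient below are all order-of-magnitude assumptions; only the 10^18 cm^-3 density is taken from the abstract.

```python
# Toy plasma-drag acceleration of a dust grain (parameters assumed; this only
# shows the scaling, not the actual coaxial-gun physics).
import math

# dust grain (carbon, assumed)
radius = 2.5e-6                      # m (5 um diameter)
rho_dust = 2000.0                    # kg/m^3
mass = rho_dust * 4.0 / 3.0 * math.pi * radius**3
area = math.pi * radius**2

# plasma slug (deuterium)
n_plasma = 1.0e24                    # m^-3  (10^18 cm^-3, as quoted in the abstract)
m_deuteron = 3.34e-27                # kg
rho_plasma = n_plasma * m_deuteron
u_flow = 5.0e4                       # m/s, assumed plasma flow speed
pulse = 100e-6                       # s, assumed drag duration
C_d = 1.0                            # order-unity drag coefficient (assumed)

v, t, dt = 0.0, 0.0, 1e-7
while t < pulse:
    drag = 0.5 * C_d * rho_plasma * (u_flow - v) * abs(u_flow - v) * area
    v += drag / mass * dt
    t += dt

print(f"grain speed after the pulse: {v / 1000.0:.1f} km/s "
      f"(flow speed {u_flow / 1000.0:.0f} km/s)")
```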

  2. Fifty-one years of Los Alamos Spacecraft

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fenimore, Edward E.

    2014-09-04

    From 1963 to 2014, the Los Alamos National Laboratory was involved in at least 233 spacecraft. There are probably only one or two institutions in the world that have been involved in so many spacecraft. Los Alamos space exploration started with the Vela satellites for nuclear test detection, but soon expanded to ionospheric research (mostly barium releases), radioisotope thermoelectric generators, solar physics, solar wind, magnetospheres, astrophysics, national security, planetary physics, earth resources, radio propagation in the ionosphere, and cubesats. Here, we present a list of the spacecraft, their purpose, and their launch dates for use during RocketFest.

  3. GPU acceleration of the Locally Selfconsistent Multiple Scattering code for first principles calculation of the ground state and statistical physics of materials

    DOE PAGES

    Eisenbach, Markus; Larkin, Jeff; Lutjens, Justin; ...

    2016-07-12

    The Locally Self-consistent Multiple Scattering (LSMS) code solves the first principles Density Functional theory Kohn–Sham equation for a wide range of materials with a special focus on metals, alloys and metallic nano-structures. It has traditionally exhibited near perfect scalability on massively parallel high performance computer architectures. In this paper, we present our efforts to exploit GPUs to accelerate the LSMS code to enable first principles calculations of O(100,000) atoms and statistical physics sampling of finite temperature properties. We reimplement the scattering matrix calculation for GPUs with a block matrix inversion algorithm that only uses accelerator memory. Finally, using the Cray XK7 system Titan at the Oak Ridge Leadership Computing Facility we achieve a sustained performance of 14.5 PFlop/s and a speedup of 8.6 compared to the CPU only code.

  4. SimTrack: A compact c++ code for particle orbit and spin tracking in accelerators

    DOE PAGES

    Luo, Yun

    2015-08-29

    SimTrack is a compact C++ code for 6-d symplectic element-by-element particle tracking in accelerators, originally designed for head-on beam–beam compensation simulation studies in the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory. It provides 6-d symplectic orbit tracking with the 4th order symplectic integration for magnet elements and the 6-d symplectic synchro-beam map for beam–beam interaction. Since its inception in 2009, SimTrack has been intensively used for dynamic aperture calculations with beam–beam interaction for RHIC. Recently, proton spin tracking and electron energy loss due to synchrotron radiation were added. In this article, I will present the code architecture, physics models, and some selected examples of its applications to RHIC and a future electron-ion collider design, eRHIC.
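
    SimTrack's element maps are not reproduced here; the sketch below only shows the generic idea behind 4th-order symplectic integration, using the Forest-Ruth composition applied to a simple harmonic oscillator and checking that the energy error stays bounded over many periods. The system and step size are illustrative assumptions.

```python
# 4th-order symplectic (Forest-Ruth) integrator for H = p^2/2 + q^2/2
# (illustrative of symplectic element-by-element tracking; not SimTrack itself).
theta = 1.0 / (2.0 - 2.0 ** (1.0 / 3.0))
c = [theta / 2.0, (1.0 - theta) / 2.0, (1.0 - theta) / 2.0, theta / 2.0]
d = [theta, 1.0 - 2.0 * theta, theta, 0.0]

def force(q):                     # harmonic oscillator: F = -dV/dq = -q
    return -q

def fr4_step(q, p, dt):
    for ci, di in zip(c, d):
        q += ci * p * dt          # drift
        p += di * force(q) * dt   # kick
    return q, p

q, p, dt = 1.0, 0.0, 0.1
E0 = 0.5 * (p * p + q * q)
for _ in range(10000):            # 1000 time units, many oscillation periods
    q, p = fr4_step(q, p, dt)
E1 = 0.5 * (p * p + q * q)
print(f"relative energy drift after 10,000 steps: {abs(E1 - E0) / E0:.2e}")
```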

  5. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program. Appendix B: Surface ground motion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weaver, T.A.; Baker, D.F.; Edwards, C.L.

    1993-10-01

    Surface ground motion was recorded for many of the Integrated Verification Experiments using standard 10-, 25- and 100-g accelerometers, force-balanced accelerometers and, for some events, using golf balls and 0.39-cm steel balls as surface inertial gauges (SIGs). This report contains the semi-processed acceleration, velocity, and displacement data for the accelerometers fielded and the individual observations for the SIG experiments. Most acceleration, velocity, and displacement records have had calibrations applied and have been deramped, offset corrected, and deglitched but are otherwise unfiltered or processed from their original records. Digital data for all of these records are stored at Los Alamos National Laboratory.

  6. Using the Internet in Middle Schools: A Model for Success. A Collaborative Effort between Los Alamos National Laboratory (LANL) and Los Alamos Middle School (LAMS).

    ERIC Educational Resources Information Center

    Addessio, Barbara K.; And Others

    Los Alamos National Laboratory (LANL) developed a model for school networking using Los Alamos Middle School as a testbed. The project was a collaborative effort between the school and the laboratory. The school secured administrative funding for hardware and software; and LANL provided the network architecture, installation, consulting, and…

  7. Development of Safety Analysis Code System of Beam Transport and Core for Accelerator Driven System

    NASA Astrophysics Data System (ADS)

    Aizawa, Naoto; Iwasaki, Tomohiko

    2014-06-01

    A safety analysis code system for the beam transport and core of an accelerator driven system (ADS) is developed for the analysis of beam transients such as changes in the shape and position of the incident beam. The code system consists of a beam transport analysis part and a core analysis part. TRACE 3-D is employed in the beam transport analysis part to calculate the shape and incident position of the beam at the target. In the core analysis part, neutronics, thermal-hydraulics, and cladding failure analyses are performed with the ADS dynamic calculation code ADSE, on the basis of an external source database calculated by PHITS and a cross section database calculated by SRAC, together with programs for thermoelastic and creep analysis of cladding failure. Using the code system, beam transient analyses are performed for the ADS proposed by the Japan Atomic Energy Agency. As a result, the cladding temperature rises rapidly and plastic deformation occurs within several seconds. In addition, the cladding is evaluated to fail by creep within a hundred seconds. These results show that such beam transients can cause cladding failure.
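
    The coupled neutronics and thermal-hydraulics chain in the code system above is far more detailed; as a rough illustration of why a beam transient heats cladding on a seconds time scale, the sketch below integrates a lumped heat balance for a cladding node whose local power suddenly increases. Every parameter is an invented, order-of-magnitude assumption, not a value from the paper.

```python
# Lumped cladding heat balance under a sudden local power increase
# (order-of-magnitude toy model; the real analysis couples neutronics,
# thermal-hydraulics, and creep as described in the abstract).
mc  = 46.0      # J/(m*K)   cladding heat capacity per unit length (assumed)
hA  = 360.0     # W/(m*K)   coolant heat-transfer conductance per unit length (assumed)
T_cool = 600.0  # K         coolant temperature (assumed)
q0  = 20e3      # W/m       nominal linear power deposited in the node (assumed)
boost = 5.0     # beam transient concentrates power by this factor (assumed)

T = T_cool + q0 / hA        # start from the nominal steady state
dt, t_end = 1e-3, 3.0
for _ in range(int(t_end / dt)):
    q = boost * q0          # transient power after t = 0
    dTdt = (q - hA * (T - T_cool)) / mc
    T += dTdt * dt

print(f"cladding temperature after {t_end:.0f} s: {T:.0f} K "
      f"(nominal steady state was {T_cool + q0 / hA:.0f} K)")
```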

  8. A survey of macromycete diversity at Los Alamos National Laboratory, Bandelier National Monument, and Los Alamos County; A preliminary report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarmie, N.; Rogers, F.J.

    The authors have completed a 5-year survey (1991--1995) of macromycetes found in Los Alamos County, Los Alamos National Laboratory, and Bandelier National Monument. The authors have compiled a database of 1,048 collections, their characteristics, and identifications. The database represents 123 (98%) genera and 175 (73%) species reliably identified. Issues of habitat loss, species extinction, and ecological relationships are addressed, and comparisons with other surveys are made. With this baseline information and modeling of this baseline data, one can begin to understand more about the fungal flora of the area.

  9. Los Alamos, Toshiba probing Fukushima with cosmic rays

    ScienceCinema

    Morris, Christopher

    2018-01-16

    Los Alamos National Laboratory has announced an impending partnership with Toshiba Corporation to use a Los Alamos technique called muon tomography to safely peer inside the cores of the Fukushima Daiichi reactors and create high-resolution images of the damaged nuclear material inside without ever breaching the cores themselves. The initiative could reduce the time required to clean up the disabled complex by at least a decade and greatly reduce radiation exposure to personnel working at the plant. Muon radiography (also called cosmic-ray radiography) uses secondary particles generated when cosmic rays collide with upper regions of Earth's atmosphere to create images of the objects that the particles, called muons, penetrate. The process is analogous to an X-ray image, except muons are produced naturally and do not damage the materials they contact. Muon radiography has been used before in imaginative applications such as mapping the interior of the Great Pyramid at Giza, but Los Alamos's muon tomography technique represents a vast improvement over earlier technology.

  10. New Rad Lab for Los Alamos

    ScienceCinema

    None

    2017-12-09

    The topping out ceremony for a key construction stage in the Los Alamos National Laboratory's newest facility, the Radiological Laboratory Utility & Office Building. This is part of the National Nu...  

  11. Notes on Los Alamos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meade, Roger Allen

    In 1954 an unknown author drafted a report, reprinted below, describing the Laboratory and the community as they existed in late 1953. This report, perhaps intended to be crafted into a public relations document, is valuable because it gives us an autobiographical look at Los Alamos during the first half of the 1950s. It has been edited to enhance readability.

  12. Parallel processing a three-dimensional free-lagrange code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandell, D.A.; Trease, H.E.

    1989-01-01

    A three-dimensional, time-dependent free-Lagrange hydrodynamics code has been multitasked and autotasked on a CRAY X-MP/416. The multitasking was done by using the Los Alamos Multitasking Control Library, which is a superset of the CRAY multitasking library. Autotasking is done by using constructs which are only comment cards if the source code is not run through a preprocessor. The three-dimensional algorithm has presented a number of problems that simpler algorithms, such as those for one-dimensional hydrodynamics, did not exhibit. Problems in converting the serial code, originally written for a CRAY-1, to a multitasking code are discussed. Autotasking of a rewritten version of the code is discussed. Timing results for subroutines and hot spots in the serial code are presented and suggestions for additional tools and debugging aids are given. Theoretical speedup results obtained from Amdahl's law and actual speedup results obtained on a dedicated machine are presented. Suggestions for designing large parallel codes are given.
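
    The abstract compares measured speedups with Amdahl's law; a small worked version of that comparison is sketched below. The serial fractions are illustrative assumptions, not the paper's measured values; the four-processor count matches the CRAY X-MP/416 configuration.

```python
# Amdahl's-law speedup estimates (illustrative serial fractions, not the paper's data).
# S(N) = 1 / (f_serial + (1 - f_serial) / N) for N processors.
def amdahl_speedup(f_serial, n_proc):
    return 1.0 / (f_serial + (1.0 - f_serial) / n_proc)

n_proc = 4                      # the CRAY X-MP/416 has four processors
for f_serial in (0.20, 0.10, 0.05, 0.01):
    s = amdahl_speedup(f_serial, n_proc)
    print(f"serial fraction {f_serial:4.0%}: theoretical speedup on {n_proc} CPUs = {s:.2f}")
```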

  13. Calculations of skyshine from an intense portable electron linac

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estes, G.P.; Hughes, H.G.; Fry, D.A.

    1994-12-31

    The MCNP Monte Carlo code has been used at Los Alamos to calculate skyshine and terrain albedo effects from an intense portable electron linear accelerator that is to be used by the Russian Federation to radiograph nuclear weapons that may have been damaged by accidents. Relative dose rate profiles have been calculated. The design of the accelerator, along with a diagram, is presented.

  14. Pre Incident Planning For The Los Alamos National Laboratory

    DTIC Science & Technology

    2017-12-01

    ...Project: the newly established laboratory was asked to design and build the world’s first atomic bomb. The Los Alamos Fire Department (LAFD) provides emergency response services to... lower priority despite its importance to the responders’ scene safety. In a Carolina Fire Rescue EMS Journal article, retired New York City...

  15. Possibility for ultra-bright electron beam acceleration in dielectric wakefield accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simakov, Evgenya I.; Carlsten, Bruce E.; Shchegolkov, Dmitry Yu.

    2012-12-21

    We describe a conceptual proposal to combine the Dielectric Wakefield Accelerator (DWA) with the Emittance Exchanger (EEX) to demonstrate a high-brightness DWA with a gradient of above 100 MV/m and less than 0.1% induced energy spread in the accelerated beam. We currently evaluate the DWA concept as a performance upgrade for the future LANL signature facility MaRIE with the goal of significantly reducing the electron beam energy spread. The preconceptual design for MaRIE is underway at LANL, with the design of the electron linear accelerator being one of the main research goals. Although generally the baseline design needs to be conservative and rely on existing technology, any future upgrade would immediately call for looking into the advanced accelerator concepts capable of boosting the electron beam energy up by a few GeV in a very short distance without degrading the beam's quality. Scoping studies have identified large induced energy spreads as the major cause of beam quality degradation in high-gradient advanced accelerators for free-electron lasers. We describe simulations demonstrating that trapezoidal bunch shapes can be used in a DWA to greatly reduce the induced beam energy spread, and, in doing so, also preserve the beam brightness at levels never previously achieved. This concept has the potential to advance DWA technology to a level that would make it suitable for the upgrades of the proposed Los Alamos MaRIE signature facility.

  16. Enhanced verification test suite for physics simulation codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.

    2008-09-01

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.
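
    As a small illustration of the kind of check a verification problem supports (not one of the tri-lab suite's actual problems), the sketch below estimates the observed order of accuracy of a trapezoidal-rule discretization against an exact solution on successively refined grids.

```python
# Observed order-of-accuracy check against an exact solution (illustrative of
# code-verification practice; not one of the tri-lab suite problems).
import math

def trapezoid(f, a, b, n):
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

exact = 1.0 - math.cos(1.0)          # integral of sin(x) on [0, 1]
ns = [8, 16, 32, 64]
errors = [abs(trapezoid(math.sin, 0.0, 1.0, n) - exact) for n in ns]

for (n1, e1), (n2, e2) in zip(zip(ns, errors), zip(ns[1:], errors[1:])):
    p = math.log(e1 / e2) / math.log(n2 / n1)   # observed convergence order
    print(f"n = {n1:3d} -> {n2:3d}: observed order p = {p:.2f} (expected 2 for trapezoid)")
```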

  17. Los Alamos Programming Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergen, Benjamin Karl

    This is the PDF of a powerpoint presentation from a teleconference on Los Alamos programming models. It starts by listing their assumptions for the programming models and then details a hierarchical programming model at the System Level and Node Level. Then it details how to map this to their internal nomenclature. Finally, a list is given of what they are currently doing in this regard.

  18. Effects of acceleration rate on Rayleigh-Taylor instability in elastic-plastic materials

    NASA Astrophysics Data System (ADS)

    Banerjee, Arindam; Polavarapu, Rinosh

    2016-11-01

    The effect of acceleration rate in the elastic-plastic transition stage of Rayleigh-Taylor instability in an accelerated non-Newtonian material is investigated experimentally using a rotating wheel experiment. A non-Newtonian material (mayonnaise) was accelerated at different rates by varying the angular acceleration of a rotating wheel, and growth patterns of single mode perturbations with different combinations of amplitude and wavelength were analyzed. Experiments were run at two different acceleration rates to compare with experiments presented in prior years at APS DFD meetings, and the peak amplitude responses are captured using a high-speed camera. Similar to the instability acceleration, the elastic-plastic transition acceleration is found to increase with increasing acceleration rate for a given amplitude and wavelength. The experimental results will be compared to various analytical strength models and prior experimental studies using Newtonian fluids. Authors acknowledge funding support from Los Alamos National Lab subcontract (370333) and DOE-SSAA Grant (DE-NA0001975).
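
    For orientation only (the experiment above concerns elastic-plastic materials, where strength modifies the onset and growth), the sketch below evaluates the classical inviscid Rayleigh-Taylor growth rate, gamma = sqrt(A k g), for a few assumed accelerations and wavelengths; none of these values come from the experiment.

```python
# Classical inviscid Rayleigh-Taylor growth rate, gamma = sqrt(A * k * a)
# (orientation only; material strength, as studied above, modifies this picture).
import math

atwood = 1.0                        # dense-material/air interface: Atwood number ~ 1 (assumed)
g0 = 9.81
for accel_g in (10.0, 50.0):        # accelerations in units of g (assumed)
    for wavelength_mm in (10.0, 30.0):
        k = 2.0 * math.pi / (wavelength_mm * 1e-3)   # wavenumber, 1/m
        gamma = math.sqrt(atwood * k * accel_g * g0)
        print(f"a = {accel_g:4.0f} g, lambda = {wavelength_mm:4.0f} mm: "
              f"gamma = {gamma:6.1f} 1/s (e-folding time {1000.0 / gamma:.1f} ms)")
```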

  19. A progress report on UNICOS misuse detection at Los Alamos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, J.L.; Jackson, K.A.; Stallings, C.A.

    An effective method for detecting computer misuse is the automatic monitoring and analysis of on-line user activity. During the past year, Los Alamos enhanced its Network Anomaly Detection and Intrusion Reporter (NADIR) to include analysis of user activity on Los Alamos' UNICOS Crays. In near real-time, NADIR compares user activity to historical profiles and tests activity against expert rules. The expert rules express Los Alamos' security policy and define improper or suspicious behavior. NADIR reports suspicious behavior to security auditors and provides tools to aid in follow-up investigations. This paper describes the implementation to date of the UNICOS component of NADIR, along with the operational experiences and future plans for the system.
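
    NADIR's actual profiles and expert rules are not described in the abstract; purely as an illustration of comparing current activity against a historical profile, the sketch below flags a user whose daily event count deviates by more than an assumed number of standard deviations from their own history. The user names, counts, and threshold are all hypothetical.

```python
# Flagging anomalous user activity against a historical profile (illustrative
# only; NADIR's real profiles and expert rules are more elaborate).
import statistics

history = {                      # hypothetical daily event counts per user
    "user_a": [12, 15, 11, 14, 13, 16, 12],
    "user_b": [3, 4, 2, 5, 3, 4, 3],
}
today = {"user_a": 14, "user_b": 41}   # today's observed counts (hypothetical)
threshold = 3.0                        # assumed z-score threshold

for user, counts in history.items():
    mean = statistics.mean(counts)
    stdev = statistics.stdev(counts) or 1.0   # guard against a zero-variance history
    z = (today[user] - mean) / stdev
    status = "SUSPICIOUS" if abs(z) > threshold else "normal"
    print(f"{user}: today = {today[user]:3d}, z = {z:+6.1f} -> {status}")
```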

  20. Los Alamos, Toshiba probing Fukushima with cosmic rays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, Christopher

    2014-06-16

    Los Alamos National Laboratory has announced an impending partnership with Toshiba Corporation to use a Los Alamos technique called muon tomography to safely peer inside the cores of the Fukushima Daiichi reactors and create high-resolution images of the damaged nuclear material inside without ever breaching the cores themselves. The initiative could reduce the time required to clean up the disabled complex by at least a decade and greatly reduce radiation exposure to personnel working at the plant. Muon radiography (also called cosmic-ray radiography) uses secondary particles generated when cosmic rays collide with upper regions of Earth's atmosphere to create images of the objects that the particles, called muons, penetrate. The process is analogous to an X-ray image, except muons are produced naturally and do not damage the materials they contact. Muon radiography has been used before in imaginative applications such as mapping the interior of the Great Pyramid at Giza, but Los Alamos's muon tomography technique represents a vast improvement over earlier technology.

  1. Biological assessment for the effluent reduction program, Los Alamos National Laboratory, Los Alamos, New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cross, S.P.

    1996-08-01

    This report describes the biological assessment for the effluent reduction program proposed to occur within the boundaries of Los Alamos National Laboratory. Potential effects on wetland plants and on threatened and endangered species are discussed, along with a detailed description of the individual outfalls resulting from the effluent reduction program.

  2. Next-generation acceleration and code optimization for light transport in turbid media using GPUs

    PubMed Central

    Alerstam, Erik; Lo, William Chun Yip; Han, Tianyi David; Rose, Jonathan; Andersson-Engels, Stefan; Lilge, Lothar

    2010-01-01

    A highly optimized Monte Carlo (MC) code package for simulating light transport is developed on the latest graphics processing unit (GPU) built for general-purpose computing from NVIDIA - the Fermi GPU. In biomedical optics, the MC method is the gold standard approach for simulating light transport in biological tissue, both due to its accuracy and its flexibility in modelling realistic, heterogeneous tissue geometry in 3-D. However, the widespread use of MC simulations in inverse problems, such as treatment planning for photodynamic therapy (PDT), is limited by their long computation time. Despite its parallel nature, optimizing MC code on the GPU has been shown to be a challenge, particularly when the sharing of simulation result matrices among many parallel threads demands the frequent use of atomic instructions to access the slow GPU global memory. This paper proposes an optimization scheme that utilizes the fast shared memory to resolve the performance bottleneck caused by atomic access, and discusses numerous other optimization techniques needed to harness the full potential of the GPU. Using these techniques, a widely accepted MC code package in biophotonics, called MCML, was successfully accelerated on a Fermi GPU by approximately 600x compared to a state-of-the-art Intel Core i7 CPU. A skin model consisting of 7 layers was used as the standard simulation geometry. To demonstrate the possibility of GPU cluster computing, the same GPU code was executed on four GPUs, showing a linear improvement in performance with an increasing number of GPUs. The GPU-based MCML code package, named GPU-MCML, is compatible with a wide range of graphics cards and is released as open-source software in two versions: an optimized version tuned for high performance and a simplified version for beginners (http://code.google.com/p/gpumcml). PMID:21258498
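    The shared-memory optimization described above can be sketched as follows using Numba's CUDA bindings (GPU-MCML itself is written in CUDA C; the array names, bin count, and kernel layout here are illustrative assumptions, not the MCML code). Photon-weight deposits into a small, high-contention region of the absorption array go through fast on-chip shared memory and are flushed to global memory once per thread block, so most atomic operations never touch slow global memory.

        import numpy as np
        from numba import cuda, float32

        NBINS = 256    # size of the high-contention region of the absorption array (assumed)

        @cuda.jit
        def deposit(bin_idx, weight, global_hist):
            # Per-block copy of the hot region lives in fast on-chip shared memory.
            local_hist = cuda.shared.array(NBINS, float32)
            t = cuda.threadIdx.x
            for i in range(t, NBINS, cuda.blockDim.x):
                local_hist[i] = 0.0
            cuda.syncthreads()

            g = cuda.grid(1)
            if g < bin_idx.shape[0]:
                b = bin_idx[g]
                if b < NBINS:
                    cuda.atomic.add(local_hist, b, weight[g])    # cheap on-chip atomic
                else:
                    cuda.atomic.add(global_hist, b, weight[g])   # rare off-chip atomic
            cuda.syncthreads()

            # One flush per block into the slow global-memory histogram.
            for i in range(t, NBINS, cuda.blockDim.x):
                cuda.atomic.add(global_hist, i, local_hist[i])

        # Host side, with toy data: which bin each photon packet hits, and its weight.
        n = 1 << 20
        bins = cuda.to_device(np.random.randint(0, 1024, n).astype(np.int32))
        w = cuda.to_device(np.random.rand(n).astype(np.float32))
        hist = cuda.to_device(np.zeros(1024, dtype=np.float32))
        deposit[(n + 255) // 256, 256](bins, w, hist)
        print(hist.copy_to_host()[:5])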

  3. Gadolinium-148 and other spallation production cross section measurements for accelerator target facilities

    NASA Astrophysics Data System (ADS)

    Kelley, Karen Corzine

    At the Los Alamos Neutron Science Center accelerator complex, protons are accelerated to 800 MeV and directed to two tungsten targets, Target 4 at the Weapons Neutron Research facility and the 1L target at the Lujan Center. The Department of Energy requires hazard classification analyses to be performed on these targets and places limits on certain radionuclide inventories in the targets to avoid characterizing the facilities as "nuclear facilities." Gadolinium-148 is a radionuclide created from the spallation of tungsten. Allowed isotopic inventories are particularly low for this isotope because it is an alpha-particle emitter with a 75-year half-life. The activity level of Gadolinium-148 is low, but it encompasses almost two-thirds of the total dose burden for the two tungsten targets based on present yield estimates. From a hazard classification standpoint, this severely limits the lifetime of these tungsten targets. The cross section is not well-established experimentally and this is the motivation for measuring the Gadolinium-148 production cross section from tungsten. In a series of experiments at the Weapons Neutron Research facility, Gadolinium-148 production was measured for 600- and 800-MeV protons on tungsten, tantalum, and gold. These experiments used 3-µm-thick tungsten, tantalum, and gold foils and 10-µm-thick aluminum activation foils. In addition, spallation yields were determined for many short-lived and long-lived spallation products with these foils using gamma and alpha spectroscopy and compared with predictions of the Los Alamos National Laboratory codes CEM2k+GEM2 and MCNPX. The cumulative Gadolinium-148 production cross sections measured from tantalum, tungsten, and gold for incident 600-MeV protons were 15.2 +/- 4.0, 8.31 +/- 0.92, and 0.591 +/- 0.155, respectively. The average production cross sections measured at 800 MeV were 28.6 +/- 3.5, 19.4 +/- 1.8, and 3.69 +/- 0.50 for tantalum, tungsten, and gold, respectively. These cumulative

  4. Los Alamos Using Neutrons to Stop Nuclear Smugglers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Favalli, Andrea; Swinhoe, Martyn; Roark, Kevin

    Los Alamos National Laboratory researchers have successfully demonstrated for the first time that laser-generated neutrons can be enlisted as a useful tool in the War on Terror. The international research team used the short-pulse laser at Los Alamos's TRIDENT facility to generate a neutron beam with novel characteristics that interrogated a closed container to confirm the presence and quantity of nuclear material inside. The successful experiment paves the way for creation of a table-top-sized or truck-mounted neutron generator that could be installed at strategic locations worldwide to thwart smugglers trafficking in nuclear materials.

  5. Los Alamos Using Neutrons to Stop Nuclear Smugglers

    ScienceCinema

    Favalli, Andrea; Swinhoe, Martyn; Roark, Kevin

    2018-02-14

    Los Alamos National Laboratory researchers have successfully demonstrated for the first time that laser-generated neutrons can be enlisted as a useful tool in the War on Terror. The international research team used the short-pulse laser at Los Alamos's TRIDENT facility to generate a neutron beam with novel characteristics that interrogated a closed container to confirm the presence and quantity of nuclear material inside. The successful experiment paves the way for creation of a table-top-sized or truck-mounted neutron generator that could be installed at strategic locations worldwide to thwart smugglers trafficking in nuclear materials.

  6. Electron acceleration in the Solar corona - 3D PiC code simulations of guide field reconnection

    NASA Astrophysics Data System (ADS)

    Alejandro Munoz Sepulveda, Patricio

    2017-04-01

    The efficient electron acceleration in the solar corona detected by means of hard X-ray emission is still not well understood. Magnetic reconnection through current sheets is one of the proposed production mechanisms of non-thermal electrons in solar flares. Previous works in this direction were based mostly on test particle calculations or 2D fully-kinetic PiC simulations. We have now studied the consequences of self-generated current-aligned instabilities on the electron acceleration mechanisms by 3D magnetic reconnection. To this end, we carried out 3D Particle-in-Cell (PiC) code numerical simulations of force-free reconnecting current sheets, appropriate for the description of the solar coronal plasmas. We find an efficient electron energization, evidenced by the formation of a non-thermal power-law tail with a hard spectral index smaller than -2 in the electron energy distribution function. We discuss and compare the influence of the parallel electric field versus the curvature and gradient drifts in the guiding-center approximation on the overall acceleration, and their dependence on different plasma parameters.

  7. Wider pulsation instability regions for β Cephei and SPB stars calculated using new Los Alamos opacities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walczak, Przemysław; Fontes, Christopher John; Colgan, James Patrick

    Here, our goal is to test the newly developed OPLIB opacity tables from Los Alamos National Laboratory and check their influence on the pulsation properties of B-type stars. We calculated models using MESA and Dziembowski codes for stellar evolution and linear, nonadiabatic pulsations, respectively. We derived the instability domains of β Cephei and SPB-types for different opacity tables OPLIB, OP, and OPAL. As a result, the new OPLIB opacities have the highest Rosseland mean opacity coefficient near the so-called Z-bump. Therefore, the OPLIB instability domains are wider than in the case of OP and OPAL data.

  8. Wider pulsation instability regions for β Cephei and SPB stars calculated using new Los Alamos opacities

    DOE PAGES

    Walczak, Przemysław; Fontes, Christopher John; Colgan, James Patrick; ...

    2015-08-13

    Here, our goal is to test the newly developed OPLIB opacity tables from Los Alamos National Laboratory and check their influence on the pulsation properties of B-type stars. We calculated models using MESA and Dziembowski codes for stellar evolution and linear, nonadiabatic pulsations, respectively. We derived the instability domains of β Cephei and SPB-types for different opacity tables OPLIB, OP, and OPAL. As a result, the new OPLIB opacities have the highest Rosseland mean opacity coefficient near the so-called Z-bump. Therefore, the OPLIB instability domains are wider than in the case of OP and OPAL data.

  9. Parallel processing a real code: A case history

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandell, D.A.; Trease, H.E.

    1988-01-01

    A three-dimensional, time-dependent Free-Lagrange hydrodynamics code has been multitasked and autotasked on a Cray X-MP/416. The multitasking was done by using the Los Alamos Multitasking Control Library, which is a superset of the Cray multitasking library. Autotasking is done by using constructs which are only comment cards if the source code is not run through a preprocessor. The 3-D algorithm has presented a number of problems that simpler algorithms, such as 1-D hydrodynamics, did not exhibit. Problems in converting the serial code, originally written for a Cray 1, to a multitasking code are discussed, and autotasking of a rewritten version of the code is discussed. Timing results for subroutines and hot spots in the serial code are presented and suggestions for additional tools and debugging aids are given. Theoretical speedup results obtained from Amdahl's law and actual speedup results obtained on a dedicated machine are presented. Suggestions for designing large parallel codes are given. 8 refs., 13 figs.
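    The abstract compares measured speedups with Amdahl's law; the short Python sketch below restates that textbook estimate (the 0.95 parallel fraction is an illustrative value, not a number from the report).

        def amdahl_speedup(parallel_fraction, n_processors):
            """S(N) = 1 / ((1 - p) + p / N)."""
            return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_processors)

        for n in (1, 2, 4):    # the Cray X-MP/416 has four processors
            print(n, round(amdahl_speedup(0.95, n), 2))
        # prints: 1 1.0 / 2 1.9 / 4 3.48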

  10. Flaws found in Los Alamos safety procedures

    NASA Astrophysics Data System (ADS)

    Gwynne, Peter

    2017-12-01

    A US government panel on nuclear safety has discovered a series of safety issues at the Los Alamos National Laboratory, concluding that government oversight of the lab's emergency preparation has been ineffective.

  11. 75 FR 72829 - Los Alamos Historical Document Retrieval and Assessment (LAHDRA) Project

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-26

    ... Historical Document Retrieval and Assessment (LAHDRA) Project The Centers for Disease Control and Prevention... release of the Final Report of the Los Alamos Historical Document Retrieval and Assessment (LAHDRA)Project... information about historical chemical or radionuclide releases from facilities at the Los Alamos National...

  12. A physicist's guide to The Los Alamos Primer

    NASA Astrophysics Data System (ADS)

    Reed, B. Cameron

    2016-11-01

    In April 1943, a group of scientists at the newly established Los Alamos Laboratory were given a series of lectures by Robert Serber on what was then known of the physics and engineering issues involved in developing fission bombs. Serber's lectures were recorded in a 24-page report titled The Los Alamos Primer, which was subsequently declassified and published in book form. This paper describes the background to the Primer and analyzes the physics contained in its 22 sections. The motivation for this paper is to provide a firm foundation of the background and contents of the Primer for physicists interested in the Manhattan Project and nuclear weapons.

  13. Final safety analysis report for the Ground Test Accelerator (GTA), Phase 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-10-01

    This document is the third volume of a 3 volume safety analysis report on the Ground Test Accelerator (GTA). The GTA program at the Los Alamos National Laboratory (LANL) is the major element of the national Neutral Particle Beam (NPB) program, which is supported by the Strategic Defense Initiative Office (SDIO). A principal goal of the national NPB program is to assess the feasibility of using hydrogen and deuterium neutral particle beams outside the Earth's atmosphere. The main effort of the NPB program at Los Alamos concentrates on developing the GTA. The GTA is classified as a low-hazard facility, except for the cryogenic-cooling system, which is classified as a moderate-hazard facility. This volume consists of appendices C through U of the report.

  14. Environmental surveillance at Los Alamos during 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-07-01

    This report describes environmental monitoring activities at Los Alamos National Laboratory for 1994. Data were collected to assess external penetrating radiation, airborne emissions, liquid effluents, radioactivity of environmental materials and food stuffs, and environmental compliance.

  15. Accelerating Convolutional Sparse Coding for Curvilinear Structures Segmentation by Refining SCIRD-TS Filter Banks.

    PubMed

    Annunziata, Roberto; Trucco, Emanuele

    2016-11-01

    Deep learning has shown great potential for curvilinear structure (e.g., retinal blood vessels and neurites) segmentation as demonstrated by a recent auto-context regression architecture based on filter banks learned by convolutional sparse coding. However, learning such filter banks is very time-consuming, thus limiting the amount of filters employed and the adaptation to other data sets (i.e., slow re-training). We address this limitation by proposing a novel acceleration strategy to speed-up convolutional sparse coding filter learning for curvilinear structure segmentation. Our approach is based on a novel initialisation strategy (warm start), and therefore it is different from recent methods improving the optimisation itself. Our warm-start strategy is based on carefully designed hand-crafted filters (SCIRD-TS), modelling appearance properties of curvilinear structures which are then refined by convolutional sparse coding. Experiments on four diverse data sets, including retinal blood vessels and neurites, suggest that the proposed method significantly reduces the time taken to learn convolutional filter banks (by up to 82%) compared to conventional initialisation strategies. Remarkably, this speed-up does not worsen performance; in fact, filters learned with the proposed strategy often achieve a much lower reconstruction error and match or exceed the segmentation performance of random and DCT-based initialisation, when used as input to a random forest classifier.
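    A hedged sketch of the warm-start idea follows: seed dictionary learning with hand-crafted ridge filters instead of random atoms. The oriented second-derivative-of-Gaussian filters below are a generic stand-in for SCIRD-TS, and the patch-based scikit-learn dictionary learner is a stand-in for the convolutional sparse coding used in the paper.

        import numpy as np
        from sklearn.decomposition import MiniBatchDictionaryLearning

        def ridge_filter(size=11, sigma_x=1.0, sigma_y=3.0, theta=0.0):
            """Oriented second-derivative-of-Gaussian 'ridge' filter (hypothetical design)."""
            r = (size - 1) // 2
            y, x = np.mgrid[-r:r + 1, -r:r + 1]
            xr = x * np.cos(theta) + y * np.sin(theta)     # coordinate across the ridge
            yr = -x * np.sin(theta) + y * np.cos(theta)    # coordinate along the ridge
            g = np.exp(-(xr**2 / (2 * sigma_x**2) + yr**2 / (2 * sigma_y**2)))
            f = (xr**2 / sigma_x**4 - 1.0 / sigma_x**2) * g
            f -= f.mean()
            return f / np.linalg.norm(f)

        angles = np.linspace(0, np.pi, 8, endpoint=False)
        dict_init = np.stack([ridge_filter(theta=a).ravel() for a in angles])   # (8, 121)

        X = np.random.rand(500, 121)    # stand-in for flattened 11x11 image patches
        learner = MiniBatchDictionaryLearning(n_components=len(angles),
                                              dict_init=dict_init, random_state=0)
        learner.fit(X)    # refines the hand-crafted atoms on the data (the "warm start")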

  16. Particle acceleration and transport at a 2D CME-driven shock using the HAFv3 and PATH Code

    NASA Astrophysics Data System (ADS)

    Li, G.; Ao, X.; Fry, C. D.; Verkhoglyadova, O. P.; Zank, G. P.

    2012-12-01

    We study particle acceleration at a 2D CME-driven shock and the subsequent transport in the inner heliosphere (up to 2 AU) by coupling the kinematic Hakamada-Akasofu-Fry version 3 (HAFv3) solar wind model (Hakamada and Akasofu, 1982, Fry et al. 2003) with the Particle Acceleration and Transport in the Heliosphere (PATH) model (Zank et al., 2000, Li et al., 2003, 2005, Verkhoglyadova et al. 2009). The HAFv3 provides the evolution of a two-dimensional shock geometry and other plasma parameters, which are fed into the PATH model to investigate the effect of a varying shock geometry on particle acceleration and transport. The transport module of the PATH model is parallelized and utilizes the state-of-the-art GPU computation technique to achieve a rapid physics-based numerical description of the interplanetary energetic particles. Together with a fast execution of the HAFv3 model, the coupled code makes it possible to nowcast/forecast the interplanetary radiation environment.

  17. Lattice modeling and application of independent component analysis to high power, long bunch beams in the Los Alamos Proton Storage Ring

    NASA Astrophysics Data System (ADS)

    Kolski, Jeffrey

    The linear lattice properties of the Proton Storage Ring (PSR) at the Los Alamos Neutron Science Center (LANSCE) in Los Alamos, NM were measured and applied to determine a better linear accelerator model. We found that the initial model was deficient in predicting the vertical focusing strength. The additional vertical focusing was located through fundamental understanding of experiment and statistically rigorous analysis. An improved model was constructed and compared against the initial model and measurement at operation set points and set points far away from nominal and was shown to indeed be an enhanced model. Independent component analysis (ICA) is a tool for data mining in many fields of science. Traditionally, ICA is applied to turn-by-turn beam position data as a means to measure the lattice functions of the real machine. Due to the diagnostic setup for the PSR, this method is not applicable. A new application method for ICA is derived, ICA applied along the length of the bunch. The ICA modes represent motions within the beam pulse. Several of the dominant ICA modes are experimentally identified.
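    The following Python sketch illustrates the general idea of applying ICA to signals sampled along the bunch (synthetic data and scikit-learn's FastICA stand in for the actual PSR diagnostics and analysis chain, which are described in the thesis).

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        n_samples, n_signals = 2000, 12    # samples along the bunch x detector channels
        t = np.linspace(0.0, 1.0, n_samples)

        # Two hidden "motions within the beam pulse" mixed into every channel.
        sources = np.column_stack([np.sin(2 * np.pi * 35 * t),
                                   np.sign(np.sin(2 * np.pi * 7 * t))])
        mixing = rng.normal(size=(2, n_signals))
        X = sources @ mixing + 0.05 * rng.normal(size=(n_samples, n_signals))

        ica = FastICA(n_components=2, random_state=0)
        modes = ica.fit_transform(X)              # recovered independent modes
        print(modes.shape, ica.mixing_.shape)     # (2000, 2) (12, 2)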

  18. Recent Infrasound Calibration Activity at Los Alamos

    NASA Astrophysics Data System (ADS)

    Whitaker, R. W.; Marcillo, O. E.

    2014-12-01

    Absolute infrasound sensor calibration is necessary for estimating source sizes from measured waveforms. This can be an important function in treaty monitoring. The Los Alamos infrasound calibration chamber is capable of absolute calibration. Early in 2014 the Los Alamos infrasound calibration chamber resumed operations in its new location after an unplanned move two years earlier. The chamber has two sources of calibration signals. The first is the original mechanical piston, and the second is a CLD Dynamics Model 316 electro-mechanical unit that can be digitally controlled and provide a richer set of calibration options. During 2008-2010 a number of upgrades were incorporated for improved operation and recording. In this poster we give an overview of recent chamber work on sensor calibrations, calibration with the CLD unit, some measurements with different porous hoses and work with impulse sources.

  19. Reconnaissance assessment of erosion and sedimentation in the Canada de los Alamos basin, Los Angeles and Ventura Counties, California

    USGS Publications Warehouse

    Knott, J.M.

    1980-01-01

    An assessment of present erosion and sedimentation conditions in the Cañada de los Alamos basin was made to aid in estimating the impact of off-road-vehicle use on the sediment yield of the basin. Impacts of off-road vehicles were evaluated by reconnaissance techniques and by comparing the study area with other off-road-vehicle sites in California. Major-storm sediment yields for the basin were estimated using empirical equations developed for the Transverse Ranges and measurements of gully erosion in a representative off-road-vehicle basin. Normal major-storm yields of 73,200 cubic yards would have to be increased to about 98,000 cubic yards to account for the existing level of accelerated erosion caused by off-road vehicles. Long-term sediment yield of the Cañada de los Alamos basin upstream from its confluence with Gorman Creek, under present conditions of off-road-vehicle use, is approximately 420 cubic yards per square mile per year--a rate that is considerably lower than a previous estimate of 1,270 cubic yards per square mile per year for the total catchment area above Pyramid Lake.

  20. Geothermal investigation of spring and well waters of the Los Alamos Region, New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goff, F.E.; Sayer, S.

    1980-04-01

    The chemical and isotopic characters of 20 springs and wells in the Los Alamos area were investigated for indications of geothermal potential. These waters were compared with known hot and mineral springs from adjacent Valles Caldera and San Ysidro. All waters in the Los Alamos area are composed of meteoric water. Isotopic data show that the two primary aquifers beneath the Los Alamos region have different recharge areas. Relatively high concentrations of lithium, arsenic, chlorine, boron, and fluorine in some of the Los Alamos wells suggest these waters may contain a small fraction of thermal/mineral water of deep origin. Thermal water probably rises up high-angle faults associated with a graben of the Rio Grande rift now buried by the Pajarito Plateau.

  1. Penetrating radiation: applications at Los Alamos National Laboratory

    NASA Astrophysics Data System (ADS)

    Watson, Scott; Hunter, James; Morris, Christopher

    2013-09-01

    Los Alamos has used penetrating radiography extensively throughout its history dating back to the Manhattan Project, where imaging dense, imploding objects was the subject of intense interest. This interest continues today as major facilities like DARHT have become the mainstay of the US Stockpile Stewardship Program and the cornerstone of nuclear weapons certification. Meanwhile, emerging threats to national security from cargo containers and improvised explosive devices (IEDs) have invigorated inspection efforts using muon tomography and compact x-ray radiography. Additionally, unusual environmental threats, like those from underwater oil spills and nuclear power plant accidents, have caused renewed interest in fielding radiography in severe operating conditions. We review the history of penetrating radiography at Los Alamos and survey technologies as presently applied to these important problems.

  2. Final safety analysis report for the Ground Test Accelerator (GTA), Phase 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-10-01

    This document is the first volume of a 3 volume safety analysis report on the Ground Test Accelerator (GTA). The GTA program at the Los Alamos National Laboratory (LANL) is the major element of the national Neutral Particle Beam (NPB) program, which is supported by the Strategic Defense Initiative Office (SDIO). A principal goal of the national NPB program is to assess the feasibility of using hydrogen and deuterium neutral particle beams outside the Earth's atmosphere. The main effort of the NPB program at Los Alamos concentrates on developing the GTA. The GTA is classified as a low-hazard facility, except for the cryogenic-cooling system, which is classified as a moderate-hazard facility. This volume consists of an introduction, summary/conclusion, site description and assessment, description of facility, and description of operation.

  3. Los Alamos National Laboratory Human and Intellectual Capital for Sustaining Nuclear Deterrence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McAlpine, Bradley

    2015-04-01

    This paper provides an overview of the current human and intellectual capital at Los Alamos National Laboratory, through specific research into the statistics and demographics as well as numerous personal interviews at all levels of personnel. Based on this information, a series of recommendations are provided to assist Los Alamos National Laboratory in ensuring the future of the human and intellectual capital for the nuclear deterrence mission. While the current human and intellectual capital is strong, it stands on a precipice, and action must be taken to ensure Los Alamos National Laboratory maintains leadership in developing and sustaining national nuclear capabilities. These recommendations may be applicable to other areas of the nuclear enterprise, including the Air Force, after further research and study.

  4. Los Alamos Before and After the Fire

    NASA Technical Reports Server (NTRS)

    2002-01-01

    On May 4, 2000, a prescribed fire was set at Bandelier National Monument, New Mexico, to clear brush and dead and dying undergrowth to prevent a larger, subsequent wildfire. Unfortunately, due to high winds and extremely dry conditions in the surrounding area, the prescribed fire quickly raged out of control and, by May 10, the blaze had spread into the nearby town of Los Alamos. In all, more than 20,000 people were evacuated from their homes and more than 200 houses were destroyed as the flames consumed about 48,000 acres in and around the Los Alamos area. The pair of images above was acquired by the Enhanced Thematic Mapper Plus (ETM+) sensor, flying aboard NASA's Landsat 7 satellite, shortly before the Los Alamos fire (top image, acquired April 14) and shortly after the fire was extinguished (lower image, June 17). The images reveal the extent of the damage caused by the fire. Combining ETM+ channels 7, 4, and 2 (one visible and two infrared channels) results in a false-color image where vegetation appears as bright to dark green. Forested areas are generally dark green while herbaceous vegetation is light green. Rangeland or more open areas appear pink to light purple. Areas with extensive pavement or urban development appear light blue or white to purple. Less densely-developed residential areas appear light green and golf courses are very bright green. In the lower image, the areas recently burned appear bright red. Landsat 7 data courtesy United States Geological Survey EROS Data Center. Images by Robert Simmon, NASA GSFC.
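    The band combination described above can be sketched in a few lines of Python; the contrast stretch is an assumption, and random arrays stand in for the actual ETM+ band rasters.

        import numpy as np

        def false_color_742(band7, band4, band2):
            """Stack three co-registered band arrays into an RGB composite scaled to [0, 1]."""
            rgb = np.dstack([band7, band4, band2]).astype(float)
            lo, hi = np.percentile(rgb, (2, 98))     # simple global contrast stretch
            return np.clip((rgb - lo) / (hi - lo), 0.0, 1.0)

        # Random rasters stand in for the real scene's band 7, band 4, and band 2 arrays.
        b7, b4, b2 = (np.random.randint(0, 255, (512, 512)) for _ in range(3))
        composite = false_color_742(b7, b4, b2)    # vegetation renders green, burn scars red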

  5. Critical partnerships: Los Alamos, universities, and industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berger, C.L.

    1997-04-01

    Los Alamos National Laboratory, situated 35 miles northwest of Santa Fe, NM, is one of the Department of Energy's three Defense Programs laboratories. It encompasses 43 square miles, employs approximately 10,000 people, and has a budget of approximately $1.1B in FY97. Los Alamos has a strong post-cold war mission, that of reducing the nuclear danger. But even with that key role in maintaining the nation's security, Los Alamos views partnerships with universities and industry as critical to its future well being. Why is that? As the federal budget for R&D comes under continued scrutiny and certain reduction, we believe that the triad of science and technology contributors to the national system of R&D must rely on and leverage each other's capabilities. For us this means that we will rely on these partners to help us in 5 key ways: We expect that partnerships will help us maintain and enhance our core competencies. In doing so, we will be able to attract the best scientists and engineers. To keep on the cutting edge of research and development, we have found that partnerships maintain the excellence of staff through new and exciting challenges. Additionally, we find that from our university and corporate partners we often learn and incorporate "best practices" in organizational management and operations. Finally, we believe that a strong national system of R&D will ensure and enhance our ability to generate revenues.

  6. Los Alamos National Laboratory Facility Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Ronald Owen

    2015-06-05

    This series of slides depicts the Los Alamos Neutron Science Center (LANSCE). The Center's 800-MeV linac produces H+ and H- beams as well as beams of moderated (cold to 1 MeV) and unmoderated (0.1 to 600 MeV) neutrons. Experimental facilities and their capabilities and characteristics are outlined. Among these are LENZ, SPIDER, and DANCE.

  7. Computational Accelerator Physics. Proceedings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bisognano, J.J.; Mondelli, A.A.

    1997-04-01

    The sixty-two papers appearing in this volume were presented at CAP96, the Computational Accelerator Physics Conference held in Williamsburg, Virginia from September 24-27, 1996. Science Applications International Corporation (SAIC) and the Thomas Jefferson National Accelerator Facility (Jefferson Lab) jointly hosted CAP96, with financial support from the U.S. Department of Energy's Office of Energy Research and the Office of Naval Research. Topics ranged from descriptions of specific codes to advanced computing techniques and numerical methods. Update talks were presented on nearly all of the accelerator community's major electromagnetic and particle tracking codes. Thirty of the papers are abstracted for the Energy Science and Technology database. (AIP)

  8. Los Alamos National Laboratory Prepares for Fire Season

    ScienceCinema

    L’Esperance, Manny

    2018-01-16

    Through the establishment of a Wildland Fire Program Office, and the Interagency Fire Base located on Laboratory property, Los Alamos National Laboratory is continuing and improving a program to prepare for wildland fire.

  9. Los Alamos National Laboratory Prepares for Fire Season

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    L’Esperance, Manny

    Through the establishment of a Wildland Fire Program Office, and the Interagency Fire Base located on Laboratory property, Los Alamos National Laboratory is continuing and improving a program to prepare for wildland fire.

  10. Enabling cost-effective high-current burst-mode operation in superconducting accelerators

    DOE PAGES

    Sheffield, Richard L.

    2015-06-01

    Superconducting (SC) accelerators are very efficient for CW or long-pulse operation, and normal conducting (NC) accelerators are cost effective for short-pulse operation. The addition of a short NC linac section to an SC linac can correct for the energy droop that occurs when pulsed high-current operation exceeds the capability of the klystrons to replenish the cavity RF fields (a consequence of the long field fill times of SC structures), or when a requirement to support a broad range of beam currents results in variable beam loading. This paper describes the implementation of this technique to enable microseconds of high beam current, 90 mA or more, in a 12 GeV SC long-pulse accelerator designed for the MaRIE 42-keV XFEL proposed for Los Alamos National Laboratory.

  11. History of Los Alamos Participation in Active Experiments in Space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pongratz, Morris B.

    Beginning with the Teak nuclear test in 1958, Los Alamos has a long history of participation in active experiments in space. The last pertinent nuclear tests were the five explosions as part of the Dominic series in 1962. The Partial Test Ban Treaty signed in August 1963 prohibited all test detonations of nuclear weapons except for those conducted underground. Beginning with the “Apple” thermite barium release in June 1968, Los Alamos has participated in nearly 100 non-nuclear experiments in space, the last being the NASA-sponsored “AA-2” strontium- and europium-doped barium thermite releases in the Arecibo beam in July of 1992. The rationale for these experiments ranged from studying basic plasma processes such as gradient-driven structuring and velocity-space instabilities to illuminating the convection of plasmas in the ionosphere and polar cap to ionospheric depletion experiments to the B.E.A.R. 1-MeV neutral particle beam test in 1989. This report reviews the objectives, techniques and diagnostics of Los Alamos participation in active experiments in space.

  12. Water Supply at Los Alamos during 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. N. Maes; S. G. McLin; W. D. Purtymun

    1998-12-01

    Production of potable municipal water supplies during 1997 totaled about 1,285.9 million gallons from wells in the Guaje, Pajarito, and Otowi well fields. There was no water used from the spring gallery in Water Canyon or from Guaje Reservoir during 1997. About 2.4 million gallons of water from Los Alamos Reservoir was used to irrigate public parks and recreational lands. The total water usage in 1997 was about 1,288.3 million gallons, or about 135 gallons per day per person living in Los Alamos County. Groundwater pumpage was down about 82.2 million gallons in 1997 compared with the pumpage in 1996. Four new replacement wells were drilled and cased in Guaje Canyon between October 1997 and March 1998. These wells are currently being developed and aquifer tests are being performed. A special report summarizing the geological, geophysical, and well construction logs will be issued in the near future for these new wells.

  13. Annual Report on the Activities and Publications of the DHS-DNDO-NTNFC Sponsored Post-doctoral Fellow at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rim, Jung Ho; Tandon, Lav

    This report is a summary of the projects Jung Rim is working on as a DHS postdoctoral fellow at Los Alamos National Laboratory. These research projects are designed to explore different radioanalytical methods to support nuclear forensics applications. The current projects discussed here include development of an alpha spectroscopy method for 240Pu/239Pu isotopic ratio measurement, a non-destructive uranium assay method using gamma spectroscopy, and non-destructive 236U analysis using the FRAM code. This report documents the work that has been performed since the start of the postdoctoral appointment.

  14. Status of MAPA (Modular Accelerator Physics Analysis) and the Tech-X Object-Oriented Accelerator Library

    NASA Astrophysics Data System (ADS)

    Cary, J. R.; Shasharina, S.; Bruhwiler, D. L.

    1998-04-01

    The MAPA code is a fully interactive accelerator modeling and design tool consisting of a GUI and two object-oriented C++ libraries: a general library suitable for treatment of any dynamical system, and an accelerator library including many element types plus an accelerator class. The accelerator library inherits directly from the system library, which uses hash tables to store any relevant parameters or strings. The GUI can access these hash tables in a general way, allowing the user to invoke a window displaying all relevant parameters for a particular element type or for the accelerator class, with the option to change those parameters. The system library can advance an arbitrary number of dynamical variables through an arbitrary mapping. The accelerator class inherits this capability and overloads the relevant functions to advance the phase space variables of a charged particle through a string of elements. Among other things, the GUI makes phase space plots and finds fixed points of the map. We discuss the object hierarchy of the two libraries and use of the code.
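    An illustrative Python analogue of the object hierarchy described above is sketched below (the real MAPA libraries are C++; all class and method names here are hypothetical).

        class DynamicalSystem:
            """General library: parameters in a hash table, advance through an arbitrary map."""
            def __init__(self, **params):
                self.params = dict(params)    # hash-table storage a GUI could browse and edit

            def mapping(self, state):
                return state                  # identity map by default

            def advance(self, state, turns=1):
                for _ in range(turns):
                    state = self.mapping(state)
                return state

        class Drift(DynamicalSystem):
            def mapping(self, state):
                x, xp = state
                return (x + self.params["length"] * xp, xp)

        class Accelerator(DynamicalSystem):
            """Accelerator library: overloads the map to push phase space through elements."""
            def __init__(self, elements, **params):
                super().__init__(**params)
                self.elements = list(elements)

            def mapping(self, state):
                for element in self.elements:
                    state = element.mapping(state)
                return state

        ring = Accelerator([Drift(length=2.0), Drift(length=1.5)], name="toy lattice")
        print(ring.advance((1.0e-3, 2.0e-4), turns=3))    # phase-space coordinates after 3 turns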

  15. Audit Report, "Fire Protection Deficiencies at Los Alamos National Laboratory"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2009-06-01

    The Department of Energy's Los Alamos National Laboratory (Los Alamos) maintains some of the Nation's most important national security assets, including nuclear materials. Many of Los Alamos' facilities are located in close proximity to one another, are occupied by large numbers of contract and Federal employees, and support activities ranging from nuclear weapons design to science-related activities. Safeguarding against fires, regardless of origin, is essential to protecting employees, surrounding communities, and national security assets. On June 1, 2006, Los Alamos National Security, LLC (LANS), became the managing and operating contractor for Los Alamos, under contract with the Department's National Nuclear Security Administration (NNSA). In preparation for assuming its management responsibilities at Los Alamos, LANS conducted walk-downs of the Laboratory's facilities to identify pre-existing deficiencies that could give rise to liability, obligation, loss or damage. The walk-downs, which identified 812 pre-existing fire protection deficiencies, were conducted by subject matter professionals, including fire protection experts. While the Los Alamos Site Office has overall responsibility for the effectiveness of the fire protection program, LANS, as the Laboratory's operating contractor, has a major, day-to-day role in minimizing fire-related risks. The issue of fire protection at Los Alamos is more than theoretical. In May 2000, the 'Cerro Grande' fire burned about 43,000 acres, including 7,700 acres of Laboratory property. Due to the risk posed by fire to the Laboratory's facilities, workforce, and surrounding communities, we initiated this audit to determine whether pre-existing fire protection deficiencies had been addressed. Our review disclosed that LANS had not resolved many of the fire protection deficiencies that had been identified in early 2006: (1) Of the 296 pre-existing deficiencies we selected for audit, 174 (59 percent) had not been

  16. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spentzouris, Panagiotis; /Fermilab; Cary, John

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  17. Los Alamos on Radio Café: Nina Lanza

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lanza, Nina; Domandi, Mary-Charlotte

    2017-04-11

    First up in the new series is Los Alamos National Laboratory’s Nina Lanza from the Space and Remote Sensing group. Lanza is a planetary geologist who has been part of the Mars Curiosity Rover “ChemCam” team since 2012.

  18. Airport-Noise Levels and Annoyance Model (ALAMO) user's guide

    NASA Technical Reports Server (NTRS)

    Deloach, R.; Donaldson, J. L.; Johnson, M. J.

    1986-01-01

    A guide for the use of the Airport-Noise Level and Annoyance MOdel (ALAMO) at the Langley Research Center computer complex is provided. This document is divided into five primary sections: the introduction, the purpose of the model, and an in-depth description of the following subsystems: baseline, noise reduction simulation, and track analysis. For each subsystem, the user is provided with a description of architecture, an explanation of subsystem use, sample results, and a case runner's check list. It is assumed that the user is familiar with the operations at the Langley Research Center (LaRC) computer complex, the Network Operating System (NOS 1.4) and CYBER Control Language. Incorporated within the ALAMO model is a census database system called SITE II.

  19. James L. Tuck Los Alamos ball lightning pioneer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, D.A.

    1999-07-01

    James Tuck was well known for starting the Project Sherwood group at Los Alamos Scientific Laboratory in 1952. This group was formed to study and develop concepts for controlled fusion energy. In his later years, after retiring from the Controlled Fusion Division, he continued research at Los Alamos on the topic of ball lightning. He traveled widely giving lectures on both observations of others and his own experimental efforts. He collected anecdotal observations obtained from those in his lecture audiences during his travels and from responses to newspaper articles in which he asked for specific information from ball lightning observers. He finally cut off this collection of data when the number of responses became overwhelming. Tuck's primary publication on ball lightning was a short laboratory report. He planned on publishing a book on the subject, but this was never completed before his death. Tuck focused his experimental effort on attempting to duplicate the production of plasma balls claimed to be observed in US Navy submarines when a switch was opened under overload conditions with battery power. During lunch breaks he made use of a Los Alamos N-division battery bank facility to mock up a submarine power pack and switch gear. This non-funded effort was abruptly terminated when an explosion occurred in the facility. An overview of Tuck's research and views will be given. The flavor of Jim's personality, as well as a ball produced with his experimental apparatus, will be shown using video clips.

  20. Upgrades and Enclosure of Building 15 at Technical Area 40: Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plimpton, Kathryn D; Garcia, Kari L. M; Brunette, Jeremy Christopher

    The U.S. Department of Energy, National Nuclear Security Administration, Los Alamos Field Office (Field Office) proposes to upgrade and enclose Building 15 at Technical Area (TA) 40, Los Alamos National Laboratory. Building TA-40-15, a Cold War-era firing site, was determined eligible for listing in the National Register of Historic Places (Register) in DX Division’s Facility Strategic Plan: Consolidation and Revitalization at Technical Areas 6, 8, 9, 14, 15, 22, 36, 39, 40, 60, and 69 (McGehee et al. 2005). Building TA-40-15 was constructed in 1950 to support detonator testing. The firing site will be enclosed by a steel building to create a new indoor facility that will allow for year-round mission capability. Enclosing TA-40-15 will adversely affect the building by altering the characteristics that make it eligible for the Register. In compliance with Section 106 of the National Historic Preservation Act of 1966, as amended, the Field Office is initiating consultation for this proposed undertaking. The Field Office is also requesting concurrence with the use of standard practices to resolve adverse effects as defined in the Programmatic Agreement among the U.S. Department of Energy, National Nuclear Security Administration, Los Alamos Field Office, the New Mexico State Historic Preservation Office and the Advisory Council on Historic Preservation Concerning Management of the Historic Properties at Los Alamos National Laboratory, Los Alamos, New Mexico.

  1. Beam-dynamics codes used at DARHT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekdahl, Jr., Carl August

    Several beam simulation codes are used to help gain a better understanding of beam dynamics in the DARHT LIAs. The most notable of these fall into the following categories: for beam production, the Tricomp Trak orbit tracking code and the LSP particle-in-cell (PIC) code; for beam transport and acceleration, the XTR static envelope and centroid code, the LAMDA time-resolved envelope and centroid code, and the LSP-Slice PIC code; and for coasting-beam transport to the target, the LAMDA time-resolved envelope code and the LSP-Slice PIC code. These codes are also being used to inform the design of Scorpius.

  2. Los Alamos on Radio Café: Ludmil Alexandrov

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Domandi, Mary-Charlotte; Alexandrov, Ludmil

    In a creative breakthrough in cancer research, Ludmil Alexandrov, the J. Robert Oppenheimer Distinguished Postdoctoral Fellow at Los Alamos National Laboratory, combines Big Data, supercomputing and machine-learning to identify the telltale mutations of cancer. Knowing these mutational signatures can help researchers develop new methods of prevention.

  3. Dynamic Monte Carlo simulations of radiatively accelerated GRB fireballs

    NASA Astrophysics Data System (ADS)

    Chhotray, Atul; Lazzati, Davide

    2018-05-01

    We present a novel Dynamic Monte Carlo code (DynaMo code) that self-consistently simulates the Compton-scattering-driven dynamic evolution of a plasma. We use the DynaMo code to investigate the time-dependent expansion and acceleration of dissipationless gamma-ray burst fireballs by varying their initial opacities and baryonic content. We study the opacity and energy density evolution of an initially optically thick, radiation-dominated fireball across its entire phase space - in particular during the R_ph < R_sat regime. Our results reveal new phases of fireball evolution: a transition phase with a radial extent of several orders of magnitude - the fireball transitions from Γ ∝ R to Γ ∝ R^0, a post-photospheric acceleration phase - where fireballs accelerate beyond the photosphere, and a Thomson-dominated acceleration phase - characterized by slow acceleration of optically thick, matter-dominated fireballs due to Thomson scattering. We quantify the new phases by providing analytical expressions of Lorentz factor evolution, which will be useful for deriving jet parameters.
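    For orientation, the classical dissipationless-fireball scalings behind the Γ ∝ R and Γ ∝ R^0 notation can be summarized as follows (a textbook sketch in LaTeX, where R_0 is the initial radius and η the energy-to-baryon-rest-mass ratio; the transition and post-photospheric phases reported above refine this picture and are not reproduced here):

        \Gamma(R) \simeq
        \begin{cases}
          R/R_0, & R_0 \lesssim R \lesssim R_{\mathrm{sat}} \quad \text{(radiation-dominated acceleration, } \Gamma \propto R\text{)} \\
          \eta \equiv E_0/(M c^2), & R \gtrsim R_{\mathrm{sat}} \simeq \eta R_0 \quad \text{(coasting, } \Gamma \propto R^0\text{)}
        \end{cases}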

  4. A Wildfire Behavior Modeling System at Los Alamos National Laboratory for Operational Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S.W. Koch; R.G.Balice

    2004-11-01

    To support efforts to protect facilities and property at Los Alamos National Laboratory from damages caused by wildfire, we completed a multiyear project to develop a system for modeling the behavior of wildfires in the Los Alamos region. This was accomplished by parameterizing the FARSITE wildfire behavior model with locally gathered data representing topography, fuels, and weather conditions from throughout the Los Alamos region. Detailed parameterization was made possible by an extensive monitoring network of permanent plots, weather towers, and other data collection facilities. We also incorporated a database of lightning strikes that can be used individually as repeatable ignition points or can be used as a group in Monte Carlo simulation exercises and in other randomization procedures. The assembled modeling system was subjected to sensitivity analyses and was validated against documented fires, including the Cerro Grande Fire. The resulting modeling system is a valuable tool for research and management. It also complements knowledge based on professional expertise and information gathered from other modeling technologies. However, the modeling system requires frequent updates of the input data layers to produce currently valid results, to adapt to changes in environmental conditions within the Los Alamos region, and to allow for the quick production of model outputs during emergency operations.

  5. Induction Inserts at the Los Alamos PSR

    NASA Astrophysics Data System (ADS)

    Ng, K. Y.

    2002-12-01

    Ferrite-loaded induction tuners installed in the Los Alamos Proton Storage Ring have been successful in compensating space-charge effects. However, the resistive part of the ferrite introduces unacceptable microwave instability and severe bunch lengthening. An effective cure was found by heating the ferrite cores up to ~130°C. An understanding of the instability and cure is presented.

  6. Los Alamos Team Demonstrates Bottle Scanner Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Espy, Michelle; Schultz, Larry

    2014-05-06

    Los Alamos scientists are demonstrating a Nuclear Magnetic Resonance Imaging (NMR) technology that may provide a breakthrough for screening liquids at airport security. By adding low-power X-ray data to the NMR mix, scientists believe they have unlocked a new detection technology. Funded in part by the Department of Homeland Security's Science and Technology Directorate, the new technology is called MagRay.

  7. Los Alamos Team Demonstrates Bottle Scanner Technology

    ScienceCinema

    Espy, Michelle; Schultz, Larry

    2018-02-13

    Los Alamos scientists are demonstrating a Nuclear Magnetic Resonance Imaging (NMR) technology that may provide a breakthrough for screening liquids at airport security. By adding low-power X-ray data to the NMR mix, scientists believe they have unlocked a new detection technology. Funded in part by the Department of Homeland Security's Science and Technology Directorate, the new technology is called MagRay.

  8. Dissemination and support of ARGUS for accelerator applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The ARGUS code is a three-dimensional code system for simulating interactions between charged particles, electric and magnetic fields, and complex structures. It is a system of modules that share common utilities for grid and structure input, data handling, memory management, diagnostics, and other specialized functions. The code includes the fields due to the space charge and current density of the particles to achieve a self-consistent treatment of the particle dynamics. The physics modules in ARGUS include three-dimensional field solvers for electrostatics and electromagnetics, a three-dimensional electromagnetic frequency-domain module, a full particle-in-cell (PIC) simulation module, and a steady-state PIC model. These are described in the Appendix to this report. This project has a primary mission of developing the capabilities of ARGUS in accelerator modeling for release to the accelerator design community. Five major activities are being pursued in parallel during the first year of the project: to improve the code and/or add new modules that provide capabilities needed for accelerator design; to produce a User's Guide that documents the use of the code for all users; to release the code and the User's Guide to accelerator laboratories for their own use, and to obtain feedback from them; to build an interactive user interface for setting up ARGUS calculations; and to explore the use of ARGUS on high-power workstation platforms.

  9. Convergence Acceleration and Documentation of CFD Codes for Turbomachinery Applications

    NASA Technical Reports Server (NTRS)

    Marquart, Jed E.

    2005-01-01

    The development and analysis of turbomachinery components for industrial and aerospace applications has been greatly enhanced in recent years through the advent of computational fluid dynamics (CFD) codes and techniques. Although the use of this technology has greatly reduced the time required to perform analysis and design, there still remains much room for improvement in the process. In particular, there is a steep learning curve associated with most turbomachinery CFD codes, and the computation times need to be reduced in order to facilitate their integration into standard work processes. Two turbomachinery codes have recently been developed by Dr. Daniel Dorney (MSFC) and Dr. Douglas Sondak (Boston University). These codes are entitled Aardvark (for 2-D and quasi 3-D simulations) and Phantom (for 3-D simulations). The codes utilize the General Equation Set (GES), structured grid methodology, and overset O- and H-grids. The codes have been used with success by Drs. Dorney and Sondak, as well as others within the turbomachinery community, to analyze engine components and other geometries. One of the primary objectives of this study was to establish a set of parametric input values which will enhance convergence rates for steady state simulations, as well as reduce the runtime required for unsteady cases. The goal is to reduce the turnaround time for CFD simulations, thus permitting more design parametrics to be run within a given time period. In addition, other code enhancements to reduce runtimes were investigated and implemented. The other primary goal of the study was to develop enhanced user's manuals for Aardvark and Phantom. These manuals are intended to answer most questions for new users, as well as provide valuable detailed information for the experienced user. The existence of detailed user's manuals will enable new users to become proficient with the codes, as well as reducing the dependency of new users on the code authors. In order to achieve the

  10. Fast Acceleration of 2D Wave Propagation Simulations Using Modern Computational Accelerators

    PubMed Central

    Wang, Wei; Xu, Lifan; Cavazos, John; Huang, Howie H.; Kay, Matthew

    2014-01-01

    Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case-study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved more than speedup above the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation provided speedups on GPUs of at least faster than the sequential implementation and faster than a parallelized OpenMP implementation. An implementation of OpenMP on Intel MIC coprocessor provided speedups of with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other computational models of wave propagation in
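    As a rough analogue of the loop-level parallelization described above, the sketch below uses Numba's parallel range in place of OpenACC/OpenMP pragmas (the paper's codes are C-based; this simple two-variable reaction-diffusion update is a generic stand-in for the cardiac action potential model, not the authors' code).

        import numpy as np
        from numba import njit, prange

        @njit(parallel=True)
        def step(u, v, dt=0.05, dx=1.0, D=1.0, a=0.1, eps=0.02):
            n, m = u.shape
            u_new = u.copy()
            v_new = v.copy()
            for i in prange(1, n - 1):    # outer loop parallelized across threads
                for j in range(1, m - 1):
                    lap = (u[i - 1, j] + u[i + 1, j] + u[i, j - 1] + u[i, j + 1]
                           - 4.0 * u[i, j]) / dx**2
                    u_new[i, j] = u[i, j] + dt * (D * lap
                                  + u[i, j] * (1.0 - u[i, j]) * (u[i, j] - a) - v[i, j])
                    v_new[i, j] = v[i, j] + dt * eps * (u[i, j] - v[i, j])
            return u_new, v_new

        u = np.zeros((256, 256))
        u[100:130, 100:130] = 1.0    # initial stimulus
        v = np.zeros_like(u)
        for _ in range(100):
            u, v = step(u, v)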

  11. 75 FR 1793 - Study Team for the Los Alamos Historical Document Retrieval and Assessment (LAHDRA) Project

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-13

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention Study Team for the Los Alamos Historical Document Retrieval and Assessment (LAHDRA) Project The Centers for Disease... the following meeting. Name: Public Meeting of the Study Team for the Los Alamos Historical Document...

  12. Total electron content (TEC) variability at Los Alamos, New Mexico: A comparative study: FORTE-derived TEC analysis

    NASA Astrophysics Data System (ADS)

    Huang, Zhen; Roussel-Dupré, Robert

    2005-12-01

    Data collected from Fast On-Orbit Recording of Transient Events (FORTE) satellite-received Los Alamos Portable Pulser (LAPP) signals during 1997-2002 are used to derive the total electron content (TEC) at Los Alamos, New Mexico. The LAPP-derived TECs at Los Alamos are analyzed for diurnal, seasonal, interannual, and 27-day solar cycle variations. Several aspects in deriving TEC are analyzed, including slant to vertical TEC conversion, quartic effects on transionospheric signals, and geomagnetic storm effects on the TEC variance superimposed on the averaged TEC values.

  13. Los Alamos Novel Rocket Design Flight Tested

    ScienceCinema

    Tappan, Bryce

    2018-04-16

    Los Alamos National Laboratory scientists recently flight tested a new rocket design that includes a high-energy fuel and a motor design that also delivers a high degree of safety. Researchers will now work to scale-up the design, as well as explore miniaturization of the system, in order to exploit all potential applications that would require high-energy, high-velocity, and correspondingly high safety margins.

  14. Los Alamos Novel Rocket Design Flight Tested

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tappan, Bryce

    Los Alamos National Laboratory scientists recently flight tested a new rocket design that includes a high-energy fuel and a motor design that also delivers a high degree of safety. Researchers will now work to scale-up the design, as well as explore miniaturization of the system, in order to exploit all potential applications that would require high-energy, high-velocity, and correspondingly high safety margins.

  15. Los Alamos Science: The Human Genome Project. Number 20, 1992

    DOE R&D Accomplishments Database

    Cooper, N. G.; Shea, N. eds.

    1992-01-01

    This document provides a broad overview of the Human Genome Project, with particular emphasis on work being done at Los Alamos. It tries to emphasize the scientific aspects of the project, compared to the more speculative information presented in the popular press. There is a brief introduction to modern genetics, including a review of classic work. There is a broad overview of the Genome Project, describing what the project is, what are some of its major five-year goals, what are major technological challenges ahead of the project, and what can the field of biology, as well as society, expect to see as benefits from this project. Specific results on the efforts directed at mapping chromosomes 16 and 5 are discussed. A brief introduction to DNA libraries is presented, bearing in mind that Los Alamos has housed such libraries for many years prior to the Genome Project. Information on efforts to do applied computational work related to the project is discussed, as well as experimental efforts to do rapid DNA sequencing by means of single-molecule detection using applied spectroscopic methods. The article introduces the Los Alamos staff who are working on the Genome Project, and concludes with brief discussions on ethical, legal, and social implications of this work; a brief glimpse of genetics as it may be practiced in the next century; and a glossary of relevant terms.

  16. Mathematical Formulation used by MATLAB Code to Convert FTIR Interferograms to Calibrated Spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armstrong, Derek Elswick

    This report discusses the mathematical procedures used to convert raw interferograms from Fourier transform infrared (FTIR) sensors to calibrated spectra. The work discussed in this report was completed as part of the Helios project at Los Alamos National Laboratory. MATLAB code was developed to convert the raw interferograms to calibrated spectra. The report summarizes the developed MATLAB scripts and functions, along with a description of the mathematical methods used by the code. The first step in working with raw interferograms is to convert them to uncalibrated spectra by applying an apodization function to the raw data and then by performing a Fourier transform. The developed MATLAB code also addresses phase error correction by applying the Mertz method. This report provides documentation for the MATLAB scripts.
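
    The report itself documents MATLAB scripts; as a language-agnostic illustration of the pipeline it describes (apodize, Fourier transform, Mertz-style phase correction), here is a compressed C sketch using FFTW. The function name, the triangular apodization choice, and the use of the full-resolution phase (a true Mertz correction derives the phase from a low-resolution, double-sided portion of the interferogram) are illustrative simplifications, not the Helios implementation.

        /* Sketch of the interferogram-to-spectrum pipeline described above.
         * This is an illustration in C/FFTW, not the Helios MATLAB code. */
        #include <complex.h>
        #include <math.h>
        #include <fftw3.h>

        /* n: number of points; zpd: index of the zero-path-difference position. */
        void interferogram_to_spectrum(const double *ifg, int n, int zpd, double *spectrum)
        {
            double *work = fftw_alloc_real(n);
            fftw_complex *raw = fftw_alloc_complex(n / 2 + 1);

            /* Triangular apodization about the ZPD position (illustrative choice). */
            for (int i = 0; i < n; ++i) {
                double w = 1.0 - fabs((double)(i - zpd)) / (double)(n - zpd);
                work[i] = ifg[i] * (w > 0.0 ? w : 0.0);
            }

            fftw_plan p = fftw_plan_dft_r2c_1d(n, work, raw, FFTW_ESTIMATE);
            fftw_execute(p);

            /* Mertz-style correction: rotate each bin by its phase so the corrected
             * spectrum is Re{S * exp(-i*phi)}.  Using the full-resolution phase here
             * is a simplification of the Mertz method mentioned in the report. */
            for (int k = 0; k < n / 2 + 1; ++k) {
                double phi = carg(raw[k]);
                spectrum[k] = creal(raw[k] * cexp(-I * phi));
            }

            fftw_destroy_plan(p);
            fftw_free(work);
            fftw_free(raw);
        }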

  17. Optical velocimetry at the Los Alamos Proton Radiography Facility

    NASA Astrophysics Data System (ADS)

    Tupa, Dale; Tainter, Amy; Neukirch, Levi; Hollander, Brian; Buttler, William; Holtkamp, David; The Los Alamos Proton Radiography Team

    2016-05-01

    The Los Alamos Proton Radiography Facility (pRad) employs a high-energy proton beam to image the properties and behavior of materials driven by high explosives. We will discuss features of pRad and describe some recent experiments, highlighting optical diagnostics for surface velocity measurements.

  18. Aqueous Nitrate Recovery Line at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finstad, Casey Charles

    2016-06-15

    This PowerPoint presentation is part of the ADPSM Plutonium Engineering Lecture Series, which is an opportunity for new hires at LANL to get an overview of work done at TA55. It goes into detail about the aqueous nitrate recovery line at Los Alamos National Laboratory.

  19. Shielding calculations for industrial 5/7.5MeV electron accelerators using the MCNP Monte Carlo Code

    NASA Astrophysics Data System (ADS)

    Peri, Eyal; Orion, Itzhak

    2017-09-01

    High-energy X-rays from accelerators are used to irradiate food ingredients to prevent the growth and development of unwanted biological organisms in food and thereby extend the shelf life of the products. The X-rays are produced by accelerating 5 MeV electrons and stopping them in a heavy (high-Z) target. Since 2004, the FDA has approved the use of 7.5 MeV electrons, providing higher production rates at lower treatment costs. In this study we calculated the essential data needed for a straightforward concrete shielding design of typical food-accelerator rooms. The following evaluations were done using the MCNP Monte Carlo code system: (1) angular dependence (0-180°) of the photon dose rate for 5 MeV and 7.5 MeV electron beams bombarding iron, aluminum, gold, tantalum, and tungsten targets; (2) angular dependence (0-180°) of the bremsstrahlung spectral distribution for gold, tantalum, and tungsten targets bombarded by 5 MeV and 7.5 MeV electron beams; (3) concrete attenuation calculations at several photon emission angles for 5 MeV and 7.5 MeV electron beams bombarding a tantalum target. Based on the simulations, we calculated the expected increase in dose rate for facilities intending to increase the energy from 5 MeV to 7.5 MeV, and the additional concrete thickness needed to keep the existing dose rate unchanged.
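
    A simple way to see how such simulation results feed into shielding design is the tenth-value-layer (TVL) form of the attenuation law. The dose-rate ratio and TVL value in the sketch below are hypothetical placeholders to be replaced with MCNP-derived numbers; they are not results from this paper.

        /* Tenth-value-layer bookkeeping for a concrete wall: D(t) = D0 * 10^(-t/TVL).
         * The numbers below are placeholders, not values from the MCNP study. */
        #include <math.h>
        #include <stdio.h>

        int main(void)
        {
            double tvl_cm     = 45.0;  /* hypothetical broad-beam TVL of concrete at 7.5 MeV */
            double dose_ratio = 1.8;   /* hypothetical unshielded dose-rate increase, 5 -> 7.5 MeV */

            /* Extra concrete so the transmitted dose rate is unchanged:
             * 10^(-dt/TVL) = 1/dose_ratio  =>  dt = TVL * log10(dose_ratio). */
            double extra_cm = tvl_cm * log10(dose_ratio);
            printf("additional concrete thickness: %.1f cm\n", extra_cm);
            return 0;
        }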

  20. Final safety analysis report for the Ground Test Accelerator (GTA), Phase 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-10-01

    This document is the second volume of a 3-volume safety analysis report on the Ground Test Accelerator (GTA). The GTA program at the Los Alamos National Laboratory (LANL) is the major element of the national Neutral Particle Beam (NPB) program, which is supported by the Strategic Defense Initiative Office (SDIO). A principal goal of the national NPB program is to assess the feasibility of using hydrogen and deuterium neutral particle beams outside the Earth's atmosphere. The main effort of the NPB program at Los Alamos concentrates on developing the GTA. The GTA is classified as a low-hazard facility, except for the cryogenic-cooling system, which is classified as a moderate-hazard facility. This volume consists of failure modes and effects analysis; accident analysis; operational safety requirements; quality assurance program; ES&H management program; environmental, safety, and health systems critical to safety; summary of waste-management program; environmental monitoring program; facility expansion, decontamination, and decommissioning; summary of emergency response plan; summary plan for employee training; summary plan for operating procedures; glossary; and appendices A and B.

  1. The physics design of accelerator-driven transmutation systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venneri, F.

    1995-10-01

    Nuclear systems under study in the Los Alamos Accelerator-Driven Transmutation Technology (ADTT) program will allow the destruction of nuclear spent fuel and weapons-return plutonium, as well as the production of nuclear energy from the thorium cycle, without a long-lived radioactive waste stream. The subcritical systems proposed represent a radical departure from traditional nuclear concepts (reactors), yet the actual implementation of ADTT systems is based on modest extrapolations of existing technology. These systems strive to keep the best of what nuclear technology has developed over the years within a sensibly conservative design envelope, and ultimately offer a safe, less expensive, and more environmentally sound approach to nuclear power.

  2. GPU-Accelerated Large-Scale Electronic Structure Theory on Titan with a First-Principles All-Electron Code

    NASA Astrophysics Data System (ADS)

    Huhn, William Paul; Lange, Björn; Yu, Victor; Blum, Volker; Lee, Seyong; Yoon, Mina

    Density-functional theory has been well established as the dominant quantum-mechanical computational method in the materials community. Large accurate simulations become very challenging on small to mid-scale computers and require high-performance compute platforms to succeed. GPU acceleration is one promising approach. In this talk, we present a first implementation of all-electron density-functional theory in the FHI-aims code for massively parallel GPU-based platforms. Special attention is paid to the update of the density and to the integration of the Hamiltonian and overlap matrices, realized in a domain decomposition scheme on non-uniform grids. The initial implementation scales well across nodes on ORNL's Titan Cray XK7 supercomputer (8 to 64 nodes, 16 MPI ranks/node) and shows an overall runtime speedup of 1.4x from utilization of the K20X Tesla GPU on each Titan node, with the charge density update showing a speedup of 2x. Further acceleration opportunities will be discussed. Work supported by the LDRD Program of ORNL managed by UT-Battelle, LLC, for the U.S. DOE and by the Oak Ridge Leadership Computing Facility, which is a DOE Office of Science User Facility supported under Contract DE-AC05-00OR22725.

  3. Evaluation of Macroinvertebrate Communities and Habitat for Selected Stream Reaches at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    L.J. Henne; K.J. Buckley

    2005-08-12

    This is the second aquatic biological monitoring report generated by Los Alamos National Laboratory's (LANL's) Water Quality and Hydrology Group. The study has been conducted to generate impact-based assessments of habitat and water quality for LANL waterways. The monitoring program was designed to allow for the detection of spatial and temporal trends in water and habitat quality through ongoing, biannual monitoring of habitat characteristics and benthic aquatic macroinvertebrate communities at six key sites in Los Alamos, Sandia, Water, Pajarito, and Starmer's Gulch Canyons. Data were collected on aquatic habitat characteristics, channel substrate, and macroinvertebrate communities during 2001 and 2002. Aquatic habitat scores were stable between 2001 and 2002 at all locations except Starmer's Gulch and Pajarito Canyon, which had lower scores in 2002 due to low flow conditions. Channel substrate changes were most evident at the upper Los Alamos and Pajarito study reaches. The macroinvertebrate Stream Condition Index (SCI) indicated moderate to severe impairment at upper Los Alamos Canyon, slight to moderate impairment at upper Sandia Canyon, and little or no impairment at lower Sandia Canyon, Starmer's Gulch, and Pajarito Canyon. Habitat, substrate, and macroinvertebrate data from the site in upper Los Alamos Canyon indicated severe impacts from the Cerro Grande Fire of 2000. Impairment in the macroinvertebrate community at upper Sandia Canyon was probably due to effluent-dominated flow at that site. The minimal impairment SCI scores for the lower Sandia site indicated that water quality improved with distance downstream from the outfall at upper Sandia Canyon.

  4. Los Alamos - A Short History

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meade, Roger A.

    At 5:45 am on the morning of July 16, 1945, the world’s first atomic bomb exploded over a remote section of the southern New Mexican desert known as the Jornada del Muerto, the Journey of Death. Three weeks later, the atomic bombs known as Little Boy and Fat Man brought World War II to an end. Working literally around the clock, scientists at a secret laboratory in the mountains of New Mexico designed and built these first atomic bombs in just thirty months. The laboratory was known by its codename, Project Y, and better known to the world as Los Alamos.

  5. Structural testing of the Los Alamos National Laboratory Heat Source/Radioisotopic Thermoelectric Generator shipping container

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronowski, D.R.; Madsen, M.M.

    The Heat Source/Radioisotopic Thermoelectric Generator shipping container is a Type B packaging design currently under development by Los Alamos National Laboratory. Type B packaging for transporting radioactive material is required to maintain containment and shielding after being exposed to the normal and hypothetical accident environments defined in Title 10 Code of Federal Regulations Part 71. A combination of testing and analysis is used to verify the adequacy of this package design. This report documents the test program portion of the design verification, using several prototype packages. Four types of testing were performed: 30-foot hypothetical accident condition drop tests in three orientations, 40-inch hypothetical accident condition puncture tests in five orientations, a 21 psi external overpressure test, and a normal conditions of transport test consisting of a water spray and a 4 foot drop test. 18 refs., 104 figs., 13 tabs.

  6. Los Alamos Science: The Human Genome Project. Number 20, 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, N G; Shea, N

    1992-01-01

    This article provides a broad overview of the Human Genome Project, with particular emphasis on work being done at Los Alamos. It tries to emphasize the scientific aspects of the project, compared to the more speculative information presented in the popular press. There is a brief introduction to modern genetics, including a review of classic work. There is a broad overview of the Genome Project, describing what the project is, what are some of its major five-year goals, what are major technological challenges ahead of the project, and what can the field of biology, as well as society, expect to see as benefits from this project. Specific results on the efforts directed at mapping chromosomes 16 and 5 are discussed. A brief introduction to DNA libraries is presented, bearing in mind that Los Alamos has housed such libraries for many years prior to the Genome Project. Information on efforts to do applied computational work related to the project is discussed, as well as experimental efforts to do rapid DNA sequencing by means of single-molecule detection using applied spectroscopic methods. The article introduces the Los Alamos staff who are working on the Genome Project, and concludes with brief discussions on ethical, legal, and social implications of this work; a brief glimpse of genetics as it may be practiced in the next century; and a glossary of relevant terms.

  7. An independent evaluation of plutonium body burdens in populations near Los Alamos Laboratory using human autopsy data.

    PubMed

    Gaffney, Shannon H; Donovan, Ellen P; Shonka, Joseph J; Le, Matthew H; Widner, Thomas E

    2013-06-01

    In the mid-1940s, the United States began producing atomic weapon components at the Los Alamos National Laboratory (LANL). In an attempt to better understand historical exposure to nearby residents, this study evaluates plutonium activity in human tissue relative to residential location and length of time at residence. Data on plutonium activity in the lung, vertebrae, and liver of nearby residents were obtained during autopsies as a part of the Los Alamos Tissue Program. Participant residential histories and the distance from each residence to the primary plutonium processing buildings at LANL were evaluated in the analysis. Summary statistics, including Student t-tests and simple regressions, were calculated. Because the biological half-life of plutonium can vary significantly by organ, data were analyzed separately by tissue type (lung, liver, vertebrae). The ratios of plutonium activity (vertebrae:liver; liver:lung) were also analyzed in order to evaluate the importance of timing of exposure. Tissue data were available for 236 participants who lived in a total of 809 locations, of which 677 were verified postal addresses. Residents of Los Alamos were found to have higher plutonium activities in the lung than non-residents. Further, those who moved to Los Alamos before 1955 had higher lung activities than those who moved there later. These trends were not observed with the liver, vertebrae, or vertebrae:liver and liver:lung ratio data, however, and should be interpreted with caution. Although there are many limitations to this study, including the amount of available data and the analytical methods used to analyze the tissue, the overall results indicate that residence (defined as the year that the individual moved to Los Alamos) may have had a strong correlation with plutonium activity in human tissue. This study is the first to present the results of the Los Alamos Autopsy Program in relation to residential status and location in Los Alamos. Copyright © 2012

  8. Carbon isotope chemostratigraphy and precise dating of middle Frasnian (lower Upper Devonian) Alamo Breccia, Nevada, USA

    USGS Publications Warehouse

    Morrow, J.R.; Sandberg, C.A.; Malkowski, K.; Joachimski, M.M.

    2009-01-01

    At Hancock Summit West, Nevada, western USA, uppermost Givetian (upper Middle Devonian) and lower and middle Frasnian (lower Upper Devonian) rocks of the lower Guilmette Formation include, in stratigraphic sequence, carbonate-platform facies of the conodont falsiovalis, transitans, and punctata Zones; the type Alamo Breccia Member of the middle punctata Zone; and slope facies of the punctata and hassi Zones. The catastrophically deposited Alamo Breccia and related phenomena record the ~382 Ma Alamo event, produced by a km-scale bolide impact into a marine setting seaward of an extensive carbonate platform fringing western North America. Re-evaluation of conodonts from the lower Guilmette Formation and Alamo Breccia Member, together with regional sedimentologic and conodont biofacies comparisons, now firmly locates the onset of the Johnson et al. (1985) transgressive-regressive (T-R) cycle IIc, which occurred after the start of the punctata Zone, within a parautochthonous megablock low in the Alamo Breccia. Whole-rock carbon isotope analyses through the lower Guilmette Formation and Alamo Breccia Member reveal two positive δ13Ccarb excursions: (1) a small, 3‰ excursion, which is possibly correlative with the falsiovalis Event previously identified from sections in Western Europe and Australia, occurs below the breccia in the Upper falsiovalis Zone to early part of the transitans Zone; and (2) a large, multi-part excursion, dominated by a 6‰ positive shift, begins above the start of the punctata Zone and onset of T-R cycle IIc and continues above the Alamo Breccia, ending near the punctata-hassi zonal boundary. This large excursion correlates with the punctata Event, a major positive δ13C excursion previously recognized in eastern Laurussia and northern Gondwana. Consistent with previous studies, at Hancock Summit West the punctata Event is apparently not associated with any regional extinctions or ecosystem reorganizations. In the study area, onset of the

  9. Predictive coding accelerates word recognition and learning in the early stages of language development.

    PubMed

    Ylinen, Sari; Bosseler, Alexis; Junttila, Katja; Huotilainen, Minna

    2017-11-01

    The ability to predict future events in the environment and learn from them is a fundamental component of adaptive behavior across species. Here we propose that inferring predictions facilitates speech processing and word learning in the early stages of language development. Twelve- and 24-month olds' electrophysiological brain responses to heard syllables are faster and more robust when the preceding word context predicts the ending of a familiar word. For unfamiliar, novel word forms, however, word-expectancy violation generates a prediction error response, the strength of which significantly correlates with children's vocabulary scores at 12 months. These results suggest that predictive coding may accelerate word recognition and support early learning of novel words, including not only the learning of heard word forms but also their mapping to meanings. Prediction error may mediate learning via attention, since infants' attention allocation to the entire learning situation in natural environments could account for the link between prediction error and the understanding of word meanings. On the whole, the present results on predictive coding support the view that principles of brain function reported across domains in humans and non-human animals apply to language and its development in the infant brain. A video abstract of this article can be viewed at: http://hy.fi/unitube/video/e1cbb495-41d8-462e-8660-0864a1abd02c. [Correction added on 27 January 2017, after first online publication: The video abstract link was added.]. © 2016 John Wiley & Sons Ltd.

  10. CICE, The Los Alamos Sea Ice Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunke, Elizabeth; Lipscomb, William; Jones, Philip

    The Los Alamos sea ice model (CICE) is the result of an effort to develop a computationally efficient sea ice component for a fully coupled atmosphere–land–ocean–ice global climate model. It was originally designed to be compatible with the Parallel Ocean Program (POP), an ocean circulation model developed at Los Alamos National Laboratory for use on massively parallel computers. CICE has several interacting components: a vertical thermodynamic model that computes local growth rates of snow and ice due to vertical conductive, radiative and turbulent fluxes, along with snowfall; an elastic-viscous-plastic model of ice dynamics, which predicts the velocity field of the ice pack based on a model of the material strength of the ice; an incremental remapping transport model that describes horizontal advection of the areal concentration, ice and snow volume and other state variables; and a ridging parameterization that transfers ice among thickness categories based on energetic balances and rates of strain. It also includes a biogeochemical model that describes evolution of the ice ecosystem. The CICE sea ice model is used for climate research as one component of complex global earth system models that include atmosphere, land, ocean and biogeochemistry components. It is also used for operational sea ice forecasting in the polar regions and in numerical weather prediction models.

  11. Estimation of dose delivered to accelerator devices from stripping of 18.5 MeV/n 238U ions using the FLUKA code

    NASA Astrophysics Data System (ADS)

    Oranj, Leila Mokhtari; Lee, Hee-Seock; Leitner, Mario Santana

    2017-12-01

    In Korea, a heavy-ion accelerator facility (RAON) has been designed for the production of rare isotopes. The 90° bending section of this accelerator includes a 1.3-μm carbon stripper followed by two dipole magnets and other devices. The incident beam consists of 18.5 MeV/n 238U33+,34+ ions passing through the carbon stripper at the beginning of the section. The two dipoles are tuned to transport 238U ions with the specific charge states 77+, 78+, 79+, 80+, and 81+; other ions are deflected at the bends and cause beam losses. These beam losses are a concern for the devices of the transport/beam line. The absorbed dose in devices and the prompt dose in the tunnel were calculated using the FLUKA code in order to estimate radiation damage to the devices located in the 90° bending section and for radiation protection. A novel method to transport a multi-charge-state 238U ion beam was applied in the FLUKA code, using the charge-state distribution of 238U ions after the stripper obtained from the LISE++ code. The calculated results showed that the absorbed dose in the devices is influenced by the geometrical arrangement. The maximum dose was observed at the coils of the first, second, fourth, and fifth quadrupoles placed after the first dipole magnet. The integrated doses for 30 years of operation with 9.5 pμA of 238U ions were about 2 MGy for those quadrupoles. In conclusion, protection of the devices, particularly the quadrupoles, would be necessary to reduce the damage. Moreover, the results showed that the prompt radiation penetrated within the first 60-120 cm of concrete.
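
    As a back-of-the-envelope check of how a beam current in particle microamperes relates to an integrated dose, the sketch below converts 9.5 pμA to primaries per second and multiplies by a per-primary dose. The per-primary dose and the operating schedule are illustrative assumptions (the per-primary value is chosen only to land on the MGy scale quoted above); they are not FLUKA results from this paper.

        /* Rough scaling from beam current to integrated dose.  The dose-per-primary
         * and beam-hours values are illustrative assumptions, not FLUKA output. */
        #include <stdio.h>

        #define E_CHARGE 1.602176634e-19   /* elementary charge, C */

        int main(void)
        {
            double current_pua      = 9.5;      /* particle microamperes of 238U (from the abstract) */
            double dose_per_primary = 6.0e-20;  /* Gy per primary at the hot spot (hypothetical) */
            double hours_per_year   = 5000.0;   /* assumed operating schedule */
            double years            = 30.0;

            double primaries_per_s = current_pua * 1e-6 / E_CHARGE;   /* ~5.9e13 ions/s */
            double total_primaries = primaries_per_s * hours_per_year * 3600.0 * years;
            printf("integrated dose ~ %.2e Gy\n", dose_per_primary * total_primaries);
            return 0;
        }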

  12. Radiation Protection Studies for Medical Particle Accelerators using Fluka Monte Carlo Code.

    PubMed

    Infantino, Angelo; Cicoria, Gianfranco; Lucconi, Giulia; Pancaldi, Davide; Vichi, Sara; Zagni, Federico; Mostacci, Domiziano; Marengo, Mario

    2017-04-01

    Radiation protection (RP) in the use of medical cyclotrons involves many aspects both in the routine use and for the decommissioning of a site. Guidelines for site planning and installation, as well as for RP assessment, are given in international documents; however, the latter typically offer analytic methods of calculation of shielding and materials activation, in approximate or idealised geometry set-ups. The availability of Monte Carlo (MC) codes with accurate up-to-date libraries for transport and interaction of neutrons and charged particles at energies below 250 MeV, together with the continuously increasing power of modern computers, makes the systematic use of simulations with realistic geometries possible, yielding equipment and site-specific evaluation of the source terms, shielding requirements and all quantities relevant to RP at the same time. In this work, the well-known FLUKA MC code was used to simulate different aspects of RP in the use of biomedical accelerators, particularly for the production of medical radioisotopes. In the context of the Young Professionals Award, held at the IRPA 14 conference, only a part of the complete work is presented. In particular, the simulation of the GE PETtrace cyclotron (16.5 MeV) installed at S. Orsola-Malpighi University Hospital evaluated the effective dose distribution around the equipment; the effective number of neutrons produced per incident proton and their spectral distribution; the activation of the structure of the cyclotron and the vault walls; the activation of the ambient air, in particular the production of 41Ar. The simulations were validated, in terms of physical and transport parameters to be used at the energy range of interest, through an extensive measurement campaign of the neutron environmental dose equivalent using a rem-counter and TLD dosemeters. The validated model was then used in the design and the licensing request of a new Positron Emission Tomography facility. © The Author 2016
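
    Air activation of the kind mentioned above (production of 41Ar in the vault air) is commonly estimated with the standard buildup formula A(t) = R(1 - e^(-λt)), where R is the production rate. The production rate and irradiation time in the sketch below are hypothetical inputs for illustration, not values from this work.

        /* Buildup of 41Ar activity in vault air: A(t) = R * (1 - exp(-lambda * t)),
         * with R the production rate in atoms/s.  R and t_irr are hypothetical. */
        #include <math.h>
        #include <stdio.h>

        int main(void)
        {
            double half_life_s = 109.6 * 60.0;        /* 41Ar half-life, ~110 min */
            double lambda      = log(2.0) / half_life_s;
            double prod_rate   = 1.0e7;               /* atoms/s produced in the air (hypothetical) */
            double t_irr_s     = 2.0 * 3600.0;        /* 2 h irradiation (hypothetical) */

            double activity_bq = prod_rate * (1.0 - exp(-lambda * t_irr_s));
            printf("41Ar activity at end of irradiation: %.3e Bq\n", activity_bq);
            return 0;
        }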

  13. Critical Infrastructure Protection- Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bofman, Ryan K.

    Los Alamos National Laboratory (LANL) has been a key facet of Critical National Infrastructure since the nuclear bombing of Hiroshima exposed the nature of the Laboratory’s work in 1945. Common knowledge of the nature of sensitive information contained here presents a necessity to protect this critical infrastructure as a matter of national security. This protection occurs in multiple forms beginning with physical security, followed by cybersecurity, safeguarding of classified information, and concluded by the missions of the National Nuclear Security Administration.

  14. Amphibians and Reptiles of Los Alamos County

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teralene S. Foxx; Timothy K. Haarmann; David C. Keller

    Recent studies have shown that amphibians and reptiles are good indicators of environmental health. They live in terrestrial and aquatic environments and are often the first animals to be affected by environmental change. This publication provides baseline information about amphibians and reptiles that are present on the Pajarito Plateau. Ten years of data collection and observations by researchers at Los Alamos National Laboratory, the University of New Mexico, the New Mexico Department of Game and Fish, and hobbyists are represented.

  15. C++ Coding Standards for the AMP Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Thomas M; Clarno, Kevin T

    2009-09-01

    This document provides an initial starting point to define the C++ coding standards used by the AMP nuclear fuel performance integrated code project as part of AMP's software development process. This document draws from the experiences, and documentation [1], of the developers of the Marmot Project at Los Alamos National Laboratory. Much of the software in AMP will be written in C++. The power of C++ can be abused easily, resulting in code that is difficult to understand and maintain. This document gives the practices that should be followed on the AMP project for all new code that is written. The intent is not to be onerous but to ensure that the code can be readily understood by the entire code team and serve as a basis for collectively defining a set of coding standards for use in future development efforts. At the end of the AMP development in fiscal year (FY) 2010, all developers will have experience with the benefits, restrictions, and limitations of the standards described and will collectively define a set of standards for future software development. External libraries that AMP uses do not have to meet these requirements, although we encourage external developers to follow these practices. For any code of which AMP takes ownership, the project will decide on any changes on a case-by-case basis. The practices that we are using in the AMP project have been in use in the Denovo project [2] for several years. The practices build on those given in References [3-5]; the practices given in these references should also be followed. Some of the practices given in this document can also be found in [6].

  16. Portable MRI developed at Los Alamos

    ScienceCinema

    Espy, Michelle

    2018-02-14

    Scientists at Los Alamos National Laboratory are developing an ultra-low-field Magnetic Resonance Imaging (MRI) system that could be low-power and lightweight enough for forward deployment on the battlefield and to field hospitals in the World's poorest regions. "MRI technology is a powerful medical diagnostic tool," said Michelle Espy, the Battlefield MRI (bMRI) project leader, "ideally suited for imaging soft-tissue injury, particularly to the brain." But hospital-based MRI devices are big and expensive, and require considerable infrastructure, such as large quantities of cryogens like liquid nitrogen and helium, and they typically use a large amount of energy. "Standard MRI machines just can't go everywhere," said Espy. "Soldiers wounded in battle usually have to be flown to a large hospital and people in emerging nations just don't have access to MRI at all. We've been in contact with doctors who routinely work in the Third World and report that MRI would be extremely valuable in treating pediatric encephalopathy, and other serious diseases in children." So the Los Alamos team started thinking about a way to make an MRI device that could be relatively easy to transport, set up, and use in an unconventional setting. Conventional MRI machines use very large magnetic fields that align the protons in water molecules to then create magnetic resonance signals, which are detected by the machine and turned into images. The large magnetic fields create exceptionally detailed images, but they are difficult and expensive to make. Espy and her team wanted to see if images of sufficient quality could be made with ultra-low-magnetic fields, similar in strength to the Earth's magnetic field. To achieve images at such low fields they use exquisitely sensitive detectors called Superconducting Quantum Interference Devices, or SQUIDs. SQUIDs are among the most sensitive magnetic field detectors available, so interference with the signal is the primary stumbling block. "SQUIDs are

  17. Portable MRI developed at Los Alamos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Espy, Michelle

    Scientists at Los Alamos National Laboratory are developing an ultra-low-field Magnetic Resonance Imaging (MRI) system that could be low-power and lightweight enough for forward deployment on the battlefield and to field hospitals in the World's poorest regions. "MRI technology is a powerful medical diagnostic tool," said Michelle Espy, the Battlefield MRI (bMRI) project leader, "ideally suited for imaging soft-tissue injury, particularly to the brain." But hospital-based MRI devices are big and expensive, and require considerable infrastructure, such as large quantities of cryogens like liquid nitrogen and helium, and they typically use a large amount of energy. "Standard MRI machines just can't go everywhere," said Espy. "Soldiers wounded in battle usually have to be flown to a large hospital and people in emerging nations just don't have access to MRI at all. We've been in contact with doctors who routinely work in the Third World and report that MRI would be extremely valuable in treating pediatric encephalopathy, and other serious diseases in children." So the Los Alamos team started thinking about a way to make an MRI device that could be relatively easy to transport, set up, and use in an unconventional setting. Conventional MRI machines use very large magnetic fields that align the protons in water molecules to then create magnetic resonance signals, which are detected by the machine and turned into images. The large magnetic fields create exceptionally detailed images, but they are difficult and expensive to make. Espy and her team wanted to see if images of sufficient quality could be made with ultra-low-magnetic fields, similar in strength to the Earth's magnetic field. To achieve images at such low fields they use exquisitely sensitive detectors called Superconducting Quantum Interference Devices, or SQUIDs. SQUIDs are among the most sensitive magnetic field detectors available, so interference with the signal is the primary stumbling block.

  18. GPU Optimizations for a Production Molecular Docking Code*

    PubMed Central

    Landaverde, Raphael; Herbordt, Martin C.

    2015-01-01

    Modeling molecular docking is critical to both understanding life processes and designing new drugs. In previous work we created the first published GPU-accelerated docking code (PIPER) which achieved a roughly 5× speed-up over a contemporaneous 4 core CPU. Advances in GPU architecture and in the CPU code, however, have since reduced this relative performance by a factor of 10. In this paper we describe the upgrade of GPU PIPER. This required an entire rewrite, including algorithm changes and moving most remaining non-accelerated CPU code onto the GPU. The result is a 7× improvement in GPU performance and a 3.3× speedup over the CPU-only code. We find that this difference in time is almost entirely due to the difference in run times of the 3D FFT library functions on CPU (MKL) and GPU (cuFFT), respectively. The GPU code has been integrated into the ClusPro docking server which has over 4000 active users. PMID:26594667

  19. GPU Optimizations for a Production Molecular Docking Code.

    PubMed

    Landaverde, Raphael; Herbordt, Martin C

    2014-09-01

    Modeling molecular docking is critical to both understanding life processes and designing new drugs. In previous work we created the first published GPU-accelerated docking code (PIPER) which achieved a roughly 5× speed-up over a contemporaneous 4 core CPU. Advances in GPU architecture and in the CPU code, however, have since reduced this relative performance by a factor of 10. In this paper we describe the upgrade of GPU PIPER. This required an entire rewrite, including algorithm changes and moving most remaining non-accelerated CPU code onto the GPU. The result is a 7× improvement in GPU performance and a 3.3× speedup over the CPU-only code. We find that this difference in time is almost entirely due to the difference in run times of the 3D FFT library functions on CPU (MKL) and GPU (cuFFT), respectively. The GPU code has been integrated into the ClusPro docking server which has over 4000 active users.

  20. CH-TRU Waste Content Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2008-01-16

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled

  1. New features in the design code Tlie

    NASA Astrophysics Data System (ADS)

    van Zeijts, Johannes

    1993-12-01

    We present features recently installed in the arbitrary-order accelerator design code Tlie. The code uses the MAD input language and implements programmable extensions modeled after the C language that make it a powerful tool in a wide range of applications: from basic beamline design to high-precision, high-order design and even control room applications. The basic quantities important in accelerator design are easily accessible from inside the control language. Entities like parameters in elements (strength, current), transfer maps (either in Taylor series or in Lie algebraic form), lines, and beams (either as sets of particles or as distributions) are among the types of variables available. These variables can be set, used as arguments in subroutines, or just typed out. The code is easily extensible with new datatypes.

  2. Geohydrology and simulation of ground-water flow near Los Alamos, north-central New Mexico

    USGS Publications Warehouse

    Frenzel, P.F.

    1995-01-01

    An existing model was modified in recognition of new geohydrologic interpretations and adjusted to simulate hydrographs in well fields in the Los Alamos area. Hydraulic-head drawdowns at the Buckman well field resulting from two projected ground-water-withdrawal alternatives were estimated with the modified model. The Chaquehui formation (informal usage) is the main new feature of recent hydrologic interpretations for the Los Alamos area. The Chaquehui occupies a 'channel' that was eroded or faulted into the Tesuque Formation, and the Chaquehui is more permeable than the Tesuque. The Chaquehui is a major producing zone in the Pajarito Mesa well field and to a lesser extent in the Guaje well field. Model modification included splitting the four layers of the McAda-Wasiolek model (McAda, D.P., and Wasiolek, Maryann, 1988, Simulation of the regional geohydrology of the Tesuque aquifer system near Santa Fe, New Mexico: U.S. Geological Survey Water- Resources Investigations Report 87-4056, 71 p.) into eight layers to better simulate vertical ground-water movement. Other model modifications were limited as much as possible to the area of interest near Los Alamos and consisted mainly of adjusting hydraulic-conductivity values representing the Tesuque Formation, Chaquehui formation (informal usage), and Puye Formation, and adjusting simulated recharge along the Pajarito Fault Zone west of Los Alamos. Adjustments were based mainly on simulation of fluctuations in measured hydraulic heads near Los Alamos. Two possible alternative plans for replacing Guaje well field production were suggested by Los Alamos National Laboratory. In the first plan (Guaje alternative), the Guaje field would be renewed with four new wells replacing the existing production wells in the Guaje field. In the second plan (Pajarito-Otowi alternative), the Guaje well field would be retired and its former production would be made up by additional withdrawals from the Pajarito Mesa and Otowi well fields. A

  3. Two-dimensional spatiotemporal coding of linear acceleration in vestibular nuclei neurons

    NASA Technical Reports Server (NTRS)

    Angelaki, D. E.; Bush, G. A.; Perachio, A. A.

    1993-01-01

    Response properties of vertical (VC) and horizontal (HC) canal/otolith-convergent vestibular nuclei neurons were studied in decerebrate rats during stimulation with sinusoidal linear accelerations (0.2-1.4 Hz) along different directions in the head horizontal plane. A novel characteristic of the majority of tested neurons was the nonzero response often elicited during stimulation along the "null" direction (i.e., the direction perpendicular to the maximum sensitivity vector, Smax). The tuning ratio (Smin gain/Smax gain), a measure of the two-dimensional spatial sensitivity, depended on stimulus frequency. For most vestibular nuclei neurons, the tuning ratio was small at the lowest stimulus frequencies and progressively increased with frequency. Specifically, HC neurons were characterized by a flat Smax gain and an approximately 10-fold increase of Smin gain per frequency decade. Thus, these neurons encode linear acceleration when stimulated along their maximum sensitivity direction, and the rate of change of linear acceleration (jerk) when stimulated along their minimum sensitivity direction. While the Smax vectors were distributed throughout the horizontal plane, the Smin vectors were concentrated mainly ipsilaterally with respect to head acceleration and clustered around the naso-occipital head axis. The properties of VC neurons were distinctly different from those of HC cells. The majority of VC cells showed decreasing Smax gains and small, relatively flat, Smin gains as a function of frequency. The Smax vectors were distributed ipsilaterally relative to the induced (apparent) head tilt. In type I anterior or posterior VC neurons, Smax vectors were clustered around the projection of the respective ipsilateral canal plane onto the horizontal head plane. These distinct spatial and temporal properties of HC and VC neurons during linear acceleration are compatible with the spatiotemporal organization of the horizontal and the vertical/torsional ocular responses
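
    A short worked check of the frequency dependence reported above: if the minimum-direction (Smin) response component encodes jerk rather than linear acceleration, its gain referenced to acceleration must grow in proportion to stimulus frequency, i.e., roughly tenfold per decade, while an acceleration-coding Smax component remains frequency-flat. In LaTeX form (the symbols A and f are generic, not taken from the paper):

        a(t) = A \sin(2\pi f t), \qquad
        j(t) \equiv \dot{a}(t) = 2\pi f \, A \cos(2\pi f t)
        \;\;\Rightarrow\;\;
        \frac{G_{\mathrm{jerk}}(f)}{G_{\mathrm{accel}}} \propto 2\pi f .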

  4. The Los Alamos universe: Using multimedia to promote laboratory capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kindel, J.

    2000-03-01

    This project consists of a multimedia presentation that explains the technological capabilities of Los Alamos National Laboratory. It takes the form of a human-computer interface built around the metaphor of the universe. The project is intended to promote Laboratory capabilities to a wide audience. Multimedia is simply a means of communicating information through a diverse set of tools--be they text, sound, animation, video, etc. Likewise, Los Alamos National Laboratory is a collection of diverse technologies, projects, and people. Given the ample material available at the Laboratory, there are tangible benefits to be gained by communicating across media. This paper consists of three parts. The first section provides some basic information about the Laboratory, its mission, and its needs. The second section introduces this multimedia presentation and the metaphor it is based on along with some basic concepts of color and user interaction used in the building of this project. The final section covers construction of the project, pitfalls, and future improvements.

  5. The FLUKA Code: An Overview

    NASA Technical Reports Server (NTRS)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.

    2006-01-01

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including, but not limited to, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code's physics models with a particular emphasis on the hadronic and nuclear sector.

  6. A portable platform for accelerated PIC codes and its application to GPUs using OpenACC

    NASA Astrophysics Data System (ADS)

    Hariri, F.; Tran, T. M.; Jocksch, A.; Lanti, E.; Progsch, J.; Messmer, P.; Brunner, S.; Gheller, C.; Villard, L.

    2016-10-01

    We present a portable platform, called PIC_ENGINE, for accelerating Particle-In-Cell (PIC) codes on heterogeneous many-core architectures such as Graphic Processing Units (GPUs). The aim of this development is efficient simulations on future exascale systems by allowing different parallelization strategies depending on the application problem and the specific architecture. To this end, this platform contains the basic steps of the PIC algorithm and has been designed as a test bed for different algorithmic options and data structures. Among the architectures that this engine can explore, particular attention is given here to systems equipped with GPUs. The study demonstrates that our portable PIC implementation based on the OpenACC programming model can achieve performance closely matching theoretical predictions. Using the Cray XC30 system, Piz Daint, at the Swiss National Supercomputing Centre (CSCS), we show that PIC_ENGINE running on an NVIDIA Kepler K20X GPU can outperform the one on an Intel Sandy Bridge 8-core CPU by a factor of 3.4.
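
    To make the "basic steps of the PIC algorithm" concrete, the fragment below shows how a particle push loop can be offloaded with OpenACC directives similar in spirit to those described for PIC_ENGINE. The structure-of-arrays layout and the uniform-field gather are simplified placeholders, not the PIC_ENGINE code itself.

        /* Simplified OpenACC particle push (leapfrog-style) on a structure-of-arrays
         * layout.  The field gather is reduced to uniform Ex/Ey for brevity; the
         * present() clause assumes an enclosing acc data region already placed the
         * particle arrays on the device. */
        void push_particles(int np, double dt, double qm,
                            double *x, double *y, double *vx, double *vy,
                            double ex, double ey)
        {
            #pragma acc parallel loop present(x[0:np], y[0:np], vx[0:np], vy[0:np])
            for (int p = 0; p < np; ++p) {
                vx[p] += qm * ex * dt;   /* accelerate */
                vy[p] += qm * ey * dt;
                x[p]  += vx[p] * dt;     /* drift */
                y[p]  += vy[p] * dt;
            }
        }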

  7. Tritium concentrations in bees and honey at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fresquez, P.R.; Armstrong, D.R.; Salazar, J.G.

    Los Alamos National Laboratory (LANL) has maintained a network of honey bee colonies at LANL, perimeter (Los Alamos townsite and White Rock/Pajarito Acres), and regional (background) areas for over 15 years; the main objective of this honey bee network was to help determine the bioavailability of certain radionuclides in the environment. Of all the radionuclides studied (3H, 57Co, 7Be, 22Na, 54Mn, 83Rb, 137Cs, 238Pu, 239Pu, 90Sr, and total U), tritium was consistently detected in bees and was most readily transferred to the honey. In fact, honey collected from hives located at TA-21, TA-33, TA-50, TA-53, and TA-54 and from White Rock/Pajarito Acres contained significantly higher concentrations of 3H than regional background hives. Based on the average concentration of all radionuclides measured over the years, the effective dose equivalent (EDE) from consuming 5 kg (11 lb) of honey collected from Los Alamos (townsite) and White Rock/Pajarito Acres, after regional background has been subtracted, was 0.0186 (±0.0507) and 0.0016 (±0.0010) mrem/yr, respectively. The highest EDE, based on the mean + 2SD (95% confidence level), was 0.1200 mrem/yr; this was <0.2% of the International Commission on Radiological Protection permissible dose limit of 100 mrem/yr from all pathways.
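
    The dose arithmetic behind EDE values of this kind is a simple ingestion-dose product (concentration times mass consumed times a dose coefficient). The tritium concentration below is a hypothetical placeholder, and the dose coefficient is an approximate value derived from the ICRP tritiated-water ingestion coefficient (~1.8e-11 Sv/Bq); neither is taken from the monitoring network's measurements.

        /* Ingestion dose: EDE = concentration * mass consumed * dose coefficient.
         * Concentration is hypothetical; the dose coefficient is approximate. */
        #include <stdio.h>

        int main(void)
        {
            double conc_pci_per_g  = 1.0;      /* net 3H in honey above background, pCi/g (hypothetical) */
            double honey_g         = 5000.0;   /* 5 kg consumed per year, as in the abstract */
            double mrem_per_pci_3h = 6.7e-8;   /* ~1.8e-11 Sv/Bq * 0.037 Bq/pCi * 1e5 mrem/Sv */

            double ede_mrem = conc_pci_per_g * honey_g * mrem_per_pci_3h;
            printf("EDE ~ %.2e mrem/yr\n", ede_mrem);
            return 0;
        }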

  8. Examination of the home destruction in Los Alamos associated with the Cerro Grande Fire - July 10, 2000

    Treesearch

    Jack D. Cohen

    2000-01-01

    I arrived at Los Alamos on May 14, 2000 to conduct an examination of the home destruction associated with the Cerro Grande Fire. My examination occurred between the afternoon of 5/14 and late afternoon on 5/16. I had contact with the southern command post incident management team, the Los Alamos Fire Department, and the Santa Fe National Forest. The...

  9. GPU Acceleration of the Locally Selfconsistent Multiple Scattering Code for First Principles Calculation of the Ground State and Statistical Physics of Materials

    NASA Astrophysics Data System (ADS)

    Eisenbach, Markus

    The Locally Self-consistent Multiple Scattering (LSMS) code solves the first principles Density Functional theory Kohn-Sham equation for a wide range of materials with a special focus on metals, alloys and metallic nano-structures. It has traditionally exhibited near perfect scalability on massively parallel high performance computer architectures. We present our efforts to exploit GPUs to accelerate the LSMS code to enable first principles calculations of O(100,000) atoms and statistical physics sampling of finite temperature properties. Using the Cray XK7 system Titan at the Oak Ridge Leadership Computing Facility we achieve a sustained performance of 14.5PFlop/s and a speedup of 8.6 compared to the CPU-only code. This work has been sponsored by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Material Sciences and Engineering Division and by the Office of Advanced Scientific Computing. This work used resources of the Oak Ridge Leadership Computing Facility, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725.

  10. ORBIT: A Code for Collective Beam Dynamics in High-Intensity Rings

    NASA Astrophysics Data System (ADS)

    Holmes, J. A.; Danilov, V.; Galambos, J.; Shishlo, A.; Cousineau, S.; Chou, W.; Michelotti, L.; Ostiguy, J.-F.; Wei, J.

    2002-12-01

    We are developing a computer code, ORBIT, specifically for beam dynamics calculations in high-intensity rings. Our approach allows detailed simulation of realistic accelerator problems. ORBIT is a particle-in-cell tracking code that transports bunches of interacting particles through a series of nodes representing elements, effects, or diagnostics that occur in the accelerator lattice. At present, ORBIT contains detailed models for strip-foil injection, including painting and foil scattering; rf focusing and acceleration; transport through various magnetic elements; longitudinal and transverse impedances; longitudinal, transverse, and three-dimensional space charge forces; collimation and limiting apertures; and the calculation of many useful diagnostic quantities. ORBIT is an object-oriented code, written in C++ and utilizing a scripting interface for the convenience of the user. Ongoing improvements include the addition of a library of accelerator maps, BEAMLINE/MXYZPTLK; the introduction of a treatment of magnet errors and fringe fields; the conversion of the scripting interface to the standard scripting language, Python; and the parallelization of the computations using MPI. The ORBIT code is an open source, powerful, and convenient tool for studying beam dynamics in high-intensity rings.
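
    The node-based architecture described above (bunches passed through a chain of element, effect, and diagnostic nodes) can be summarized with a minimal sketch. The types and the node set here are invented for illustration; they are not ORBIT's actual C++ classes or Python-scripted interface.

        /* Minimal sketch of a node-chain tracking loop in the style described above.
         * The types and node implementations are illustrative, not ORBIT's API. */
        typedef struct {
            int     np;
            double *x, *xp, *y, *yp, *z, *dE;   /* particle coordinates */
        } Bunch;

        typedef struct Node {
            const char *name;
            void (*apply)(Bunch *b);            /* element, effect, or diagnostic */
            struct Node *next;
        } Node;

        /* Push a bunch once around the ring by visiting every node in order. */
        void track_turn(Node *lattice, Bunch *bunch)
        {
            for (Node *n = lattice; n != NULL; n = n->next)
                n->apply(bunch);
        }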

  11. GPU accelerated manifold correction method for spinning compact binaries

    NASA Astrophysics Data System (ADS)

    Ran, Chong-xi; Liu, Song; Zhong, Shuang-ying

    2018-04-01

    The graphics processing unit (GPU) acceleration of the manifold correction algorithm, based on the compute unified device architecture (CUDA) technology, is designed to simulate the dynamic evolution of the Post-Newtonian (PN) Hamiltonian formulation of spinning compact binaries. The feasibility and efficiency of parallel computation on the GPU have been confirmed by various numerical experiments. The numerical comparisons show that the accuracy of the manifold correction method executed on the GPU agrees well with that of the code executed solely on the central processing unit (CPU). Performance on the GPU can be increased enormously through the use of shared-memory and register optimization techniques without additional hardware costs; the speedup is nearly 13 times that of the CPU code for a phase-space scan (including 314 × 314 orbits). In addition, the GPU-accelerated manifold correction method is used to numerically study how the dynamics are affected by the spin-induced quadrupole-monopole interaction for black hole binary systems.

  12. 2D Implosion Simulations with a Kinetic Particle Code

    NASA Astrophysics Data System (ADS)

    Sagert, Irina; Even, Wesley; Strother, Terrance

    2017-10-01

    Many problems in laboratory and plasma physics are subject to flows that move between the continuum and the kinetic regime. We discuss two-dimensional (2D) implosion simulations that were performed using a Monte Carlo kinetic particle code. The application of kinetic transport theory is motivated, in part, by the occurrence of non-equilibrium effects in inertial confinement fusion (ICF) capsule implosions, which cannot be fully captured by hydrodynamics simulations. Kinetic methods, on the other hand, are able to describe both continuum and rarefied flows. We perform simple 2D disk implosion simulations using one particle species and compare the results to simulations with the hydrodynamics code RAGE. The impact of the particle mean free path on the implosion is also explored. In a second study, we focus on the formation of fluid instabilities from induced perturbations. I.S. acknowledges support through the Director's fellowship from Los Alamos National Laboratory. This research used resources provided by the LANL Institutional Computing Program.

  13. Numerical predictions of EML (electromagnetic launcher) system performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schnurr, N.M.; Kerrisk, J.F.; Davidson, R.F.

    1987-01-01

    The performance of an electromagnetic launcher (EML) depends on a large number of parameters, including the characteristics of the power supply, rail geometry, rail and insulator material properties, injection velocity, and projectile mass. EML system performance is frequently limited by structural or thermal effects in the launcher (railgun). A series of computer codes has been developed at the Los Alamos National Laboratory to predict EML system performance and to determine the structural and thermal constraints on barrel design. These codes include FLD, a two-dimensional electrostatic code used to calculate the high-frequency inductance gradient and surface current density distribution for the rails; TOPAZRG, a two-dimensional finite-element code that simultaneously analyzes thermal and electromagnetic diffusion in the rails; and LARGE, a code that predicts the performance of the entire EML system. The NIKE2D code, developed at the Lawrence Livermore National Laboratory, is used to perform structural analyses of the rails. These codes have been instrumental in the design of the Lethality Test System (LTS) at Los Alamos, which has an ultimate goal of accelerating a 30-g projectile to a velocity of 15 km/s. The capabilities of the individual codes and the coupling of these codes to perform a comprehensive analysis are discussed in relation to the LTS design. Numerical predictions are compared with experimental data and presented for the LTS prototype tests.
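
    The quantities named above (inductance gradient, rail current, projectile mass) combine in the standard railgun force relation F = ½ L′ I². The sketch below integrates that relation for a constant-current pulse; the inductance gradient and current are illustrative numbers, not LTS design values, and only the 30-g mass and 15 km/s goal are taken from the abstract.

        /* Railgun kinematics from the standard force law F = 0.5 * L' * I^2.
         * L' and I are illustrative; m and the 15 km/s target follow the abstract. */
        #include <stdio.h>

        int main(void)
        {
            double Lprime = 0.45e-6;   /* inductance gradient, H/m (illustrative) */
            double I      = 2.0e6;     /* rail current, A (illustrative constant pulse) */
            double m      = 0.030;     /* projectile mass, kg (30 g) */
            double dt     = 1.0e-6;    /* integration step, s */

            double v = 0.0, s = 0.0;
            while (v < 15.0e3) {                      /* accelerate to 15 km/s */
                double a = 0.5 * Lprime * I * I / m;  /* acceleration = F/m */
                v += a * dt;
                s += v * dt;
            }
            printf("barrel length needed: %.1f m at %.1f km/s\n", s, v / 1e3);
            return 0;
        }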

  14. Multilayer Semiconductor Charged-Particle Spectrometers for Accelerator Experiments

    NASA Astrophysics Data System (ADS)

    Gurov, Yu. B.; Lapushkin, S. V.; Sandukovsky, V. G.; Chernyshev, B. A.

    2018-03-01

    The current state of studies in the field of development of multilayer semiconductor systems (semiconductor detector (SCD) telescopes), which allow the energy to be precisely measured within a large dynamic range (from a few to a few hundred MeV) and the particles to be identified in a wide mass range (from pions to multiply charged nuclear fragments), is presented. The techniques for manufacturing the SCD telescopes from silicon and high-purity germanium are described. The issues of measuring characteristics of the constructed detectors and their impact on the energy resolution of the SCD telescopes and on the quality of the experimental data are considered. Much attention is given to the use of the constructed semiconductor devices in experimental studies at accelerators of PNPI (Gatchina), LANL (Los Alamos) and CELSIUS (Uppsala).

  15. Proton Radiography at Los Alamos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saunders, Alexander

    2017-02-28

    The proton radiography (pRad) facility at Los Alamos National Lab uses high energy protons to acquire multiple frame flash radiographic sequences at megahertz speeds: that is, it can make movies of the inside of explosions as they happen. The facility is primarily used to study the damage to and failure of metals subjected to the shock forces of high explosives as well as to study the detonation of the explosives themselves. Applications include improving our understanding of the underlying physical processes that drive the performance of the nuclear weapons in the United States stockpile and developing novel armor technologies in collaboration with the Army Research Lab. The principle and techniques of pRad will be described, and examples of some recent results will be shown.

  16. Keeping the Momentum and Nuclear Forensics at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steiner, Robert Ernest; Dion, Heather M.; Dry, Donald E.

    LANL has 70 years of experience in nuclear forensics and supports the community through a wide variety of efforts and leveraged capabilities: expanding the understanding of nuclear forensics, providing training on nuclear forensics methods, and developing bilateral relationships to expand our understanding of nuclear forensic science. LANL remains highly supportive of several key organizations tasked with carrying forth the Nuclear Security Summit messages: IAEA, GICNT, and INTERPOL. Analytical chemistry measurements on plutonium and uranium matrices are critical to numerous programs including safeguards accountancy verification measurements. Los Alamos National Laboratory operates capable actinide analytical chemistry and material science laboratories suitable for nuclear material and environmental forensic characterization. Los Alamos National Laboratory uses numerous means to validate and independently verify that measurement data quality objectives are met. Numerous LANL nuclear facilities support the nuclear material handling, preparation, and analysis capabilities necessary to evaluate samples containing nearly any mass of an actinide (attogram to kilogram levels).

  17. Dissolved pesticides in the Alamo River and the Salton Sea, California, 1996-97

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Kuivila, Kathryn; Bergamaschi, Brian A.

    2002-01-01

    Water samples were collected from the Alamo River and the Salton Sea, California, in autumn 1996 and late winter/early spring 1997 and analyzed for dissolved pesticides. The two seasons chosen for sampling were during pesticide application periods in the Imperial Valley. Pesticide concentrations were measured in filtered water samples using solid-phase extraction and analyzed by gas chromatography/mass spectrometry. Generally, the highest concentrations were measured in the Alamo River. The concentrations of carbaryl, chlorpyrifos, cycloate, dacthal, diazinon, and eptam were highest in samples collected in autumn 1996. In contrast, the concentrations of atrazine, carbofuran, and malathion were highest in samples collected in late winter/early spring 1997. The highest concentrations measured of atrazine, carbofuran, dacthal, eptam, and malathion all exceeded 1,000 nanograms per liter.

  18. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spentzouris, P.; /Fermilab; Cary, J.

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single-physics-process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization.

  19. Los Alamos Scientific Laboratory energy-related history, research, managerial reorganization proposals, actions taken, and results. History report, 1945--1979

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammel, E.F.

    1997-03-01

    This report documents the development of major energy-related programs at the Los Alamos Scientific Laboratory between 1945 and 1979. Although the Laboratory's primary mission during that era was the design and development of nuclear weapons and most of the Laboratory's funding came from a single source, a number of factors were at work that led to the development of these other programs. Some of those factors were affected by the Laboratory's internal management structure and organization; others were the result of increasing environmental awareness within the general population and the political consequences of that awareness; still others were related to the increasing demand for energy and the increasing turmoil in the energy-rich Middle East. This report also describes the various activities in Los Alamos, in Washington, and in other areas of the world that contributed to the development of major energy-related programs at Los Alamos. The author has a unique historical perspective because of his involvement as a scientist and manager at the Los Alamos Scientific Laboratory during the time period described within the report. In addition, in numerous footnotes and references, he cites a large body of documents that include the opinions and perspectives of many others who were involved at one time or another in these programs. Finally, the report includes a detailed chronology of geopolitical events that led to the development of energy-related programs at Los Alamos.

  20. EDITORIAL: Laser and plasma accelerators Laser and plasma accelerators

    NASA Astrophysics Data System (ADS)

    Bingham, Robert

    2009-02-01

    as photon deceleration and acceleration and is the result of a modulational instability. Simulations reported by Trines et al using a photon-in-cell code or wave kinetic code agree extremely well with experimental observation. Ion acceleration is actively studied; for example, the papers by Robinson, Macchi, Marita and Tripathi all discuss different acceleration mechanisms, from direct laser acceleration to Coulombic explosion and double layers. Ion acceleration is an exciting development that may have great promise in oncology. A surprising application is in muon acceleration, demonstrated by Peano et al, who show that counterpropagating laser beams with variable frequencies drive a beat structure with variable phase velocity, leading to particle trapping and acceleration with possible application to a future muon collider and neutrino factory. Laser and plasma accelerators remain one of the exciting areas of plasma physics, with applications in many areas of science ranging from laser fusion and novel high-brightness radiation sources to particle physics and medicine. The guest editor would like to thank all authors and referees for their invaluable contributions to this special issue.

  1. An Analysis on the TEC Variability and Ionospheric Scintillation at Los Alamos, New Mexico Derived from FORTE-Received LAPP Signals

    NASA Astrophysics Data System (ADS)

    Huang, Z.; Roussel-Dupre, R.

    2003-12-01

    The total electron content (TEC) of the ionosphere and its electron density irregularities (scintillation) can degrade and disrupt radio signals passed between ground stations and orbiting satellites. With the rapid increase in operational reliance on UHF/VHF satellite communication, it is desirable to understand ionospheric TEC variability and scintillation characteristics in order to improve our ability to predict satellite communication outages. In this work, LAPP (Los Alamos Portable Pulser) signals received by the FORTE satellite during 1998-2002 are used to derive TEC and an ionospheric scintillation index at Los Alamos, New Mexico. To characterize in-situ TEC variability at Los Alamos, the FORTE-LAPP derived TECs are analyzed against diurnal, seasonal, solar activity, magnetic storm, and stratospheric warming effects. The results are also compared with TEC estimates from the Los Alamos ionospheric transfer function (ITF) implemented with the global ionospheric models (IRI, PIM) and with GPS-derived TEC maps. The FORTE-LAPP signals are also analyzed against two important measures of the effect of scintillation on broadband signals, the mean time delay and the time delay jitter. The results are used to examine the coherence frequency bandwidth and are compared with predictions from a global scintillation model (WBMOD). The FORTE-LAPP analyzed and WBMOD predicted scintillation characteristics are used to investigate the temporal and seasonal behavior of scintillation at Los Alamos.

  2. Igniting the Light Elements: The Los Alamos Thermonuclear Weapon Project, 1942-1952

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitzpatrick, Anne C.

    1999-07-01

    The American system of nuclear weapons research and development was conceived and developed not as a result of technological determinism, but by a number of individual architects who promoted the growth of this large technologically-based complex. While some of the technological artifacts of this system, such as the fission weapons used in World War II, have been the subject of many historical studies, their technical successors--fusion (or hydrogen) devices--are representative of the largely unstudied, highly secret realms of nuclear weapons science and engineering. In the postwar period a small number of Los Alamos Scientific Laboratory's staff and affiliates were responsible for theoretical work on fusion weapons, yet the program was subject to both the provisions and constraints of the US Atomic Energy Commission, of which Los Alamos was a part. The Commission leadership's struggle to establish a mission for its network of laboratories, least of all to keep them operating, affected Los Alamos's leaders' decisions as to the course of weapons design and development projects. Adapting Thomas P. Hughes's ''large technological systems'' thesis, I focus on the technical, social, political, and human problems that nuclear weapons scientists faced while pursuing the thermonuclear project, demonstrating why the early American thermonuclear bomb project was an immensely complicated scientific and technological undertaking. I concentrate mainly on Los Alamos Scientific Laboratory's Theoretical, or T, Division, and its members' attempts to complete an accurate mathematical treatment of the ''Super''--the most difficult problem in physics in the postwar period--and other fusion weapon theories. Although tackling a theoretical problem, theoreticians had to address technical and engineering issues as well. I demonstrate the relative value and importance of H-bomb research over time in the postwar era to the scientific, political, and military participants in this project.

  3. Determination of the Shock Properties of Ceramic Corbit 98: 98% Alumina

    DTIC Science & Technology

    2010-06-01

    sapphire or aluminum. A single-stage, three-inch-bore gas gun was used to accelerate the projectile for experiments at NPS; Los Alamos National Lab used a higher-performance gun. The remainder of the DTIC record is an abbreviation list: GPa, gigapascal (one billion pascals of pressure, or force per unit area); HEL, Hugoniot elastic limit; LANL, Los Alamos National Lab; mm, millimeter.

  4. Integrating Safety with Science, Technology and Innovation at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rich, Bethany M

    2012-04-02

    The mission of Los Alamos National Laboratory (LANL) is to develop and apply science, technology and engineering solutions to ensure the safety, security, and reliability of the U.S. nuclear deterrent; reduce global threats; and solve emerging national security challenges. The most important responsibility is to direct and conduct efforts to meet the mission with an emphasis on safety, security, and quality. In this article, LANL Environmental, Safety, and Health (ESH) trainers discuss how their application and use of a kinetic learning module (learn by doing) with a unique fall arrest system is helping to address one of the most common industrial safety challenges: slips and falls. A unique integration of Human Performance Improvement (HPI), Behavior Based Safety (BBS) and elements of the Voluntary Protection Program (VPP) combined with an interactive simulator experience is being used to address slip and fall events at Los Alamos.

  5. Environmental surveillance at Los Alamos during 2005

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2006-09-30

    Environmental Surveillance at Los Alamos reports are prepared annually by the Los Alamos National Laboratory (LANL or the Laboratory) environmental organization, as required by US Department of Energy Order 5400.1, General Environmental Protection Program, and US Department of Energy Order 231.1A, Environment, Safety, and Health Reporting. These annual reports summarize environmental data that are used to determine compliance with applicable federal, state, and local environmental laws and regulations, executive orders, and departmental policies. Additional data, beyond the minimum required, are also gathered and reported as part of the Laboratory's efforts to ensure public safety and to monitor environmental quality at and near the Laboratory. Chapter 1 provides an overview of the Laboratory's major environmental programs. Chapter 2 reports the Laboratory's compliance status for 2005. Chapter 3 provides a summary of the maximum radiological dose the public and biota populations could have potentially received from Laboratory operations. The environmental surveillance and monitoring data are organized by environmental media (Chapter 4, Air; Chapters 5 and 6, Water and Sediments; Chapter 7, Soils; and Chapter 8, Foodstuffs and Biota) in a format to meet the needs of a general and scientific audience. Chapter 9, new for this year, provides a summary of the status of environmental restoration work around LANL. A glossary and a list of acronyms and abbreviations are in the back of the report. Appendix A explains the standards for environmental contaminants, Appendix B explains the units of measurements used in this report, Appendix C describes the Laboratory's technical areas and their associated programs, and Appendix D provides web links to more information.

  6. Surface water data at Los Alamos National Laboratory: 2009 water year

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ortiz, David; McCullough, Betsy

    2010-05-01

    The principal investigators collected and computed surface water discharge data from 73 stream-gage stations that cover most of Los Alamos National Laboratory and one at Bandelier National Monument. Also included are discharge data from three springs--two that flow into Cañon de Valle and one that flows into Water Canyon.

  7. Surface water data at Los Alamos National Laboratory: 2008 water year

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ortiz, David; Cata, Betsy; Kuyumjian, Gregory

    2009-09-01

    The principal investigators collected and computed surface water discharge data from 69 stream-gage stations that cover most of Los Alamos National Laboratory and one at Bandelier National Monument. Also included are discharge data from three springs--two that flow into Cañon de Valle and one that flows into Water Canyon.

  8. Acceleration of Semiempirical QM/MM Methods through Message Passage Interface (MPI), Hybrid MPI/Open Multiprocessing, and Self-Consistent Field Accelerator Implementations.

    PubMed

    Ojeda-May, Pedro; Nam, Kwangho

    2017-08-08

    The strategy and implementation of scalable and efficient semiempirical (SE) QM/MM methods in CHARMM are described. The serial version of the code was first profiled to identify routines that required parallelization. Afterward, the code was parallelized and accelerated with three approaches. The first approach was the parallelization of the entire QM/MM routines, including the Fock matrix diagonalization routines, using the CHARMM message passing interface (MPI) machinery. In the second approach, two different self-consistent field (SCF) energy convergence accelerators were implemented using density and Fock matrices as targets for their extrapolations in the SCF procedure. In the third approach, the entire QM/MM and MM energy routines were accelerated by implementing the hybrid MPI/open multiprocessing (OpenMP) model, in which both task- and loop-level parallelization strategies were adopted to balance loads between different OpenMP threads. The present implementation was tested on two solvated enzyme systems (including <100 QM atoms) and an SN2 symmetric reaction in water. The MPI version outperformed the existing SE QM methods in CHARMM, which include the SCC-DFTB and SQUANTUM methods, by at least 4-fold. The use of SCF convergence accelerators further accelerated the code by ∼12-35%, depending on the size of the QM region and the number of CPU cores used. Although the MPI version displayed good scalability, the performance was diminished for large numbers of MPI processes due to the overhead associated with MPI communications between nodes. This issue was partially overcome by the hybrid MPI/OpenMP approach, which displayed better scalability for a larger number of CPU cores (up to 64 CPUs in the tested systems).
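    A minimal picture of the hybrid MPI/OpenMP pattern described above (not CHARMM source code): MPI ranks divide the outer loop over interaction pairs, OpenMP threads share the work within each rank, and a single MPI_Allreduce combines the partial energies. The pairwise 1/r term below stands in for the real semiempirical energy evaluation; all names are illustrative. It is written as host-only C++ and compiles equally as CUDA C++ host code.

        // Hybrid MPI/OpenMP reduction over a toy pairwise energy (sketch).
        #include <cstdio>
        #include <cmath>
        #include <vector>
        #include <mpi.h>
        #include <omp.h>

        int main(int argc, char** argv)
        {
            MPI_Init(&argc, &argv);
            int rank, nranks;
            MPI_Comm_rank(MPI_COMM_WORLD, &rank);
            MPI_Comm_size(MPI_COMM_WORLD, &nranks);

            const int natoms = 4000;
            std::vector<double> x(natoms);
            for (int i = 0; i < natoms; ++i) x[i] = 0.1 * i;   // toy 1D coordinates

            double local = 0.0;
            // Round-robin distribution of outer-loop indices over MPI ranks,
            // loop-level OpenMP parallelism inside each rank.
            #pragma omp parallel for reduction(+:local) schedule(dynamic)
            for (int i = rank; i < natoms; i += nranks)
                for (int j = i + 1; j < natoms; ++j)
                    local += 1.0 / std::fabs(x[i] - x[j]);

            double total = 0.0;
            MPI_Allreduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);
            if (rank == 0) printf("pair sum = %.6f (ranks=%d, threads=%d)\n",
                                  total, nranks, omp_get_max_threads());
            MPI_Finalize();
            return 0;
        }

    Keeping the thread-level work inside a rank is what reduces the internode MPI traffic that the abstract identifies as the scalability bottleneck of the MPI-only version.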

  9. Fuels Inventories in the Los Alamos National Laboratory Region: 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balice, R.G.; Oswald, B.P.; Martin, C.

    1999-03-01

    Fifty-four sites were surveyed for fuel levels, vegetational structures, and topographic characteristics. Most of the surveyed sites were on Los Alamos National Laboratory property; however, some surveys were also conducted on U.S. Forest Service property. The overall vegetation of these sites ranged from piñon-juniper woodlands to ponderosa pine forests to mixed conifer forests, and the topographic positions included canyons, mesas, and mountains. The results of these surveys indicate that understory fuels are greatest in mixed conifer forests and that overstory fuels are greatest in both mixed conifer forests and ponderosa pine forests on mesas. The geographic distribution of these fuels suggests a most credible wildfire scenario for the Los Alamos region. Three major fires have occurred since 1954, and these fires behaved in a manner consistent with this scenario. The most credible wildfire scenario was also supported by the results of BEHAVE modeling that used the fuels inventory data as inputs. Output from the BEHAVE model suggested that catastrophic wildfires would continue to occur during any season with sufficiently dry, windy weather.

  10. Los Alamos National Laboratory Economic Analysis Capability Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boero, Riccardo; Edwards, Brian Keith; Pasqualini, Donatella

    Los Alamos National Laboratory has developed two types of models to compute the economic impact of infrastructure disruptions. FastEcon is a fast running model that estimates first-order economic impacts of large scale events such as hurricanes and floods and can be used to identify the amount of economic activity that occurs in a specific area. LANL’s Computable General Equilibrium (CGE) model estimates more comprehensive static and dynamic economic impacts of a broader array of events and captures the interactions between sectors and industries when estimating economic impacts.

  11. CFD Code Survey for Thrust Chamber Application

    NASA Technical Reports Server (NTRS)

    Gross, Klaus W.

    1990-01-01

    In the quest to find analytical reference codes, responses from a questionnaire are presented which portray the current computational fluid dynamics (CFD) program status and capability at various organizations for characterizing liquid rocket thrust chamber flow fields. Sample cases are identified to examine the ability, operational conditions, and accuracy of the codes. To select the best suited programs for accelerated improvements, evaluation criteria are being proposed.

  12. Floodplain Assessment for the North Ancho Canyon Aggregate Area Cleanup in Technical Area 39 at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hathcock, Charles Dean

    This floodplain assessment was prepared in accordance with 10 Code of Federal Regulations (CFR) 1022, Compliance with Floodplain and Wetland Environmental Review Requirements, which was promulgated to implement the U.S. Department of Energy (DOE) requirements under Executive Order 11988, Floodplain Management, and Executive Order 11990, Wetlands Protection. According to 10 CFR 1022, a 100-year floodplain is defined as “the lowlands adjoining inland and coastal waters and relatively flat areas and flood prone areas of offshore islands.” In this action, DOE is proposing to collect soil investigation samples and remove contaminated soil within and around selected solid waste management units (SWMUs) near and within the 100-year floodplain (hereafter “floodplain”) in north Ancho Canyon at Los Alamos National Laboratory (LANL). The work is being performed to comply with corrective action requirements under the 2016 Compliance Order on Consent.

  13. Direct Laser Acceleration in Laser Wakefield Accelerators

    NASA Astrophysics Data System (ADS)

    Shaw, J. L.; Froula, D. H.; Marsh, K. A.; Joshi, C.; Lemos, N.

    2017-10-01

    The direct laser acceleration (DLA) of electrons in a laser wakefield accelerator (LWFA) has been investigated. We show that when there is a significant overlap between the drive laser and the trapped electrons in a LWFA cavity, the accelerating electrons can gain energy from the DLA mechanism in addition to LWFA. The properties of the electron beams produced in a LWFA, where the electrons are injected by ionization injection, have been investigated using particle-in-cell (PIC) code simulations. Particle tracking was used to demonstrate the presence of DLA in LWFA. Further PIC simulations comparing LWFA with and without DLA show that the presence of DLA can lead to electron beams that have maximum energies that exceed the estimates given by the theory for the ideal blowout regime. The magnitude of the contribution of DLA to the energy gained by the electron was found to be on the order of the LWFA contribution. The presence of DLA in a LWFA can also lead to enhanced betatron oscillation amplitudes and increased divergence in the direction of the laser polarization. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944.

  14. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2007-08-15

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled

  15. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2007-06-15

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled

  16. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2007-09-20

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled

  17. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2006-06-20

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled

  18. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2006-01-18

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled

  19. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2006-08-15

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled

  20. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2006-12-20

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled

  1. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2007-02-15

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled

  2. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2006-09-15

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled

  3. Empirical evidence for site coefficients in building code provisions

    USGS Publications Warehouse

    Borcherdt, R.D.

    2002-01-01

    Site-response coefficients, Fa and Fv, used in U.S. building code provisions are based on empirical data for motions up to 0.1 g. For larger motions they are based on theoretical and laboratory results. The Northridge earthquake of 17 January 1994 provided a significant new set of empirical data up to 0.5 g. These data, together with recent site characterizations based on shear-wave velocity measurements, provide empirical estimates of the site coefficients at base accelerations up to 0.5 g for Site Classes C and D. These empirical estimates of Fa and Fv, as well as their decrease with increasing base acceleration level, are consistent at the 95 percent confidence level with those in present building code provisions, with the exception of estimates for Fa at levels of 0.1 and 0.2 g, which are less than the lower confidence bound by amounts up to 13 percent. The site-coefficient estimates are consistent at the 95 percent confidence level with those of several other investigators for base accelerations greater than 0.3 g. These consistencies and present code procedures indicate that changes in the site coefficients are not warranted. Empirical results for base accelerations greater than 0.2 g confirm the need for both a short- and a mid- or long-period site coefficient to characterize site response for purposes of estimating site-specific design spectra.

  4. GPU-accelerated atmospheric chemical kinetics in the ECHAM/MESSy (EMAC) Earth system model (version 2.52)

    NASA Astrophysics Data System (ADS)

    Alvanos, Michail; Christoudias, Theodoros

    2017-10-01

    This paper presents an application of GPU accelerators in Earth system modeling. We focus on atmospheric chemical kinetics, one of the most computationally intensive tasks in climate-chemistry model simulations. We developed a software package that automatically generates CUDA kernels to numerically integrate atmospheric chemical kinetics in the global climate model ECHAM/MESSy Atmospheric Chemistry (EMAC), used to study climate change and air quality scenarios. A source-to-source compiler outputs a CUDA-compatible kernel by parsing the FORTRAN code generated by the Kinetic PreProcessor (KPP) general analysis tool. All Rosenbrock methods that are available in the KPP numerical library are supported. Performance evaluation, using Fermi and Pascal CUDA-enabled GPU accelerators, shows achieved speed-ups of 4.5× and 20.4×, respectively, of the kernel execution time. A node-to-node real-world production performance comparison shows a 1.75× speed-up over the non-accelerated application using the KPP three-stage Rosenbrock solver. We provide a detailed description of the code optimizations used to improve the performance, including memory optimizations, control code simplification, and reduction of idle time. The accuracy and correctness of the accelerated implementation are evaluated by comparing to the CPU-only code of the application. The median relative difference is found to be less than 0.000000001% when comparing the output of the accelerated kernel with the CPU-only code. The approach followed, including the computational workload division, and the developed GPU solver code can potentially be used as the basis for hardware acceleration of numerous geoscientific models that rely on KPP for atmospheric chemical kinetics applications.
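    The workload division that makes this mapping attractive is that each model grid cell integrates its own chemistry independently, so one GPU thread can own one cell. The CUDA C++ sketch below shows that division for a toy mechanism, a single stiff reaction A -> B advanced with one linearly implicit (Rosenbrock-style) step; it is not the KPP-generated EMAC kernel, and every name and rate constant is illustrative.

        // One chemistry "box" per thread, linearly implicit step (sketch).
        #include <cstdio>
        #include <cuda_runtime.h>

        __global__ void integrate_cells(double* A, double* B, const double* k,
                                        double dt, int ncell)
        {
            int c = blockIdx.x * blockDim.x + threadIdx.x;
            if (c >= ncell) return;
            double a = A[c], b = B[c], rate = k[c];
            // One linearly implicit step for da/dt = -rate*a (Jacobian J = -rate):
            // da = dt*f(a) / (1 - dt*J) = -dt*rate*a / (1 + dt*rate)
            double da = -dt * rate * a / (1.0 + dt * rate);
            A[c] = a + da;
            B[c] = b - da;              // mass is conserved cell by cell
        }

        int main()
        {
            const int ncell = 1 << 20;
            double *A, *B, *k;
            cudaMallocManaged(&A, ncell * sizeof(double));
            cudaMallocManaged(&B, ncell * sizeof(double));
            cudaMallocManaged(&k, ncell * sizeof(double));
            for (int c = 0; c < ncell; ++c) { A[c] = 1.0; B[c] = 0.0; k[c] = 1.0 + 1e-3 * c; }
            integrate_cells<<<(ncell + 255) / 256, 256>>>(A, B, k, 0.1, ncell);
            cudaDeviceSynchronize();
            printf("cell 0: A=%.6f B=%.6f\n", A[0], B[0]);
            cudaFree(A); cudaFree(B); cudaFree(k);
            return 0;
        }

    A real KPP mechanism replaces the scalar update with a per-cell linear solve over tens to hundreds of species, which is where the memory and control-flow optimizations described in the abstract become decisive.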

  5. Hydrologic transport of depleted uranium associated with open air dynamic range testing at Los Alamos National Laboratory, New Mexico, and Eglin Air Force Base, Florida

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becker, N.M.; Vanta, E.B.

    Hydrologic investigations on depleted uranium fate and transport associated with dynamic testing activities were instituted in the 1980s at Los Alamos National Laboratory and Eglin Air Force Base. At Los Alamos, extensive field watershed investigations of soil, sediment, and especially runoff water were conducted. Eglin conducted field investigations and runoff studies similar to those at Los Alamos at former and active test ranges. Laboratory experiments complemented the field investigations at both installations. Mass balance calculations were performed to quantify the mass of expended uranium which had transported away from firing sites. At Los Alamos, it is estimated that more than 90 percent of the uranium still remains in close proximity to firing sites, which has been corroborated by independent calculations. At Eglin, we estimate that 90 to 95 percent of the uranium remains at test ranges. These data demonstrate that uranium moves slowly via surface water, in both semi-arid (Los Alamos) and humid (Eglin) environments.

  6. Refinements in the Los Alamos model of the prompt fission neutron spectrum

    DOE PAGES

    Madland, D. G.; Kahler, A. C.

    2017-01-01

    This paper presents a number of refinements to the original Los Alamos model of the prompt fission neutron spectrum and average prompt neutron multiplicity as derived in 1982. The refinements are motivated by new measurements of the spectrum and related fission observables, many of which were not available in 1982, and by a number of detailed studies and comparisons of the model with previous and present experimental results, including not only the differential spectrum but also integral cross sections measured in the field of the differential spectrum. The four refinements are (a) separate neutron contributions in binary fission, (b) departure from statistical equilibrium at scission, (c) fission-fragment nuclear level-density models, and (d) center-of-mass anisotropy. With these refinements, for the first time, good agreement has been obtained for both differential and integral measurements using the same Los Alamos model spectrum.
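    For orientation, the simplest closed-form stand-in for a prompt fission neutron spectrum is the Watt form, N(E) ∝ exp(-E/a) sinh(sqrt(bE)), which is often compared against the Los Alamos (Madland-Nix) model; the full model instead integrates over the fission-fragment temperature distribution. The host-only sketch below (plain C++, also valid as CUDA C++ host code) evaluates and normalizes a Watt spectrum numerically; the parameters a and b are commonly quoted thermal-235U-like values used here purely for illustration, not the refined model of the paper.

        // Evaluate and normalize a Watt-form prompt fission neutron spectrum (sketch).
        #include <cstdio>
        #include <cmath>

        double watt(double E, double a, double b)          // E in MeV, unnormalized
        {
            return exp(-E / a) * sinh(sqrt(b * E));
        }

        int main()
        {
            const double a = 0.988, b = 2.249;             // illustrative parameter values
            // Normalize and find the mean energy by simple numerical integration.
            const double Emax = 30.0, dE = 1e-3;
            double norm = 0.0, mean = 0.0;
            for (double E = dE; E < Emax; E += dE) {
                double w = watt(E, a, b) * dE;
                norm += w;
                mean += E * w;
            }
            printf("mean prompt neutron energy ~ %.3f MeV\n", mean / norm);
            return 0;
        }

    Integral quantities such as this mean energy are the kind of observable against which both the differential spectrum and the integral cross sections mentioned in the abstract are checked.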

  7. Surface Water Data at Los Alamos National Laboratory: 2002 Water Year

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D.A. Shaull; D. Ortiz; M.R. Alexander

    2003-03-03

    The principal investigators collected and computed surface water discharge data from 34 stream-gaging stations that cover most of Los Alamos National Laboratory and one at Bandelier National Monument. Also included are discharge data from three springs--two that flow into Cañon de Valle and one that flows into Water Canyon--and peak flow data from 16 stations.

  8. Surface Water Data at Los Alamos National Laboratory 2006 Water Year

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R.P. Romero, D. Ortiz, G. Kuyumjian

    2007-08-01

    The principal investigators collected and computed surface water discharge data from 44 stream-gaging stations that cover most of Los Alamos National Laboratory and one at Bandelier National Monument. Also included are discharge data from three springs--two that flow into Cañon de Valle and one that flows into Water Canyon--and peak flow data for 44 stations.

  9. Simplifying Complexity: Miriam Blake--Los Alamos National Laboratory Research Library, NM

    ERIC Educational Resources Information Center

    Library Journal, 2004

    2004-01-01

    The holy grail for many research librarians is one-stop searching: seamless access to all the library's resources on a topic, regardless of the source. Miriam Blake, Library Without Walls Project Leader at Los Alamos National Laboratory (LANL), is making this vision a reality. Blake is part of a growing cadre of experts: a techie who is becoming a…

  10. AMBER: a PIC slice code for DARHT

    NASA Astrophysics Data System (ADS)

    Vay, Jean-Luc; Fawley, William

    1999-11-01

    The accelerator for the second axis of the Dual Axis Radiographic Hydrodynamic Test (DARHT) facility will produce a 4-kA, 20-MeV, 2-μs output electron beam with a design goal of less than 1000 π mm-mrad normalized transverse emittance and less than 0.5-mm beam centroid motion. In order to study the beam dynamics throughout the accelerator, we have developed a slice Particle-In-Cell (PIC) code named AMBER, in which the beam is modeled as a time-steady flow subject to self-generated as well as external electrostatic and magnetostatic fields. The code follows the evolution of a slice of the beam as it propagates through the DARHT accelerator lattice, modeled as an assembly of pipes, solenoids, and gaps. In particular, we have paid careful attention to non-paraxial phenomena that can contribute to nonlinear forces and possible emittance growth. We will present the model and the numerical techniques implemented, as well as some test cases and some preliminary results obtained when studying emittance growth during the beam propagation.
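    A stripped-down version of the slice picture, useful for fixing ideas: one time slice of the beam is a set of macroparticles in transverse phase space, pushed through a drift / hard-edge solenoid / drift lattice with linear maps in the Larmor frame. The host-only C++ sketch below (no space charge, no gaps, purely paraxial) illustrates that bookkeeping only and is not the AMBER model; the lattice lengths and focusing strength are placeholders.

        // Linear transverse push of one beam slice through a toy lattice (sketch).
        #include <cstdio>
        #include <cmath>
        #include <vector>
        #include <random>

        struct P { double x, xp; };                       // transverse position [m], slope [rad]

        static void drift(std::vector<P>& b, double L)
        {
            for (auto& p : b) p.x += L * p.xp;
        }

        static void solenoid(std::vector<P>& b, double L, double k)  // focusing strength k [m^-2]
        {
            double rk = sqrt(k), c = cos(rk * L), s = sin(rk * L);
            for (auto& p : b) {                           // hard-edge map of x'' + k x = 0
                double x  =  c * p.x + (s / rk) * p.xp;
                double xp = -rk * s * p.x + c * p.xp;
                p.x = x; p.xp = xp;
            }
        }

        static double rms_x(const std::vector<P>& b)
        {
            double s = 0.0;
            for (const auto& p : b) s += p.x * p.x;
            return sqrt(s / b.size());
        }

        int main()
        {
            std::mt19937 gen(1);
            std::normal_distribution<double> dx(0.0, 5e-3), dxp(0.0, 1e-3);
            std::vector<P> beam(100000);
            for (auto& p : beam) p = {dx(gen), dxp(gen)};

            printf("rms x at injection : %.4f mm\n", rms_x(beam) * 1e3);
            drift(beam, 0.5);                 // 0.5 m drift
            solenoid(beam, 0.3, 4.0);         // 0.3 m solenoid, k = 4 m^-2 (placeholder)
            drift(beam, 0.5);
            printf("rms x after lattice: %.4f mm\n", rms_x(beam) * 1e3);
            return 0;
        }

    The code described in the abstract adds the self-consistent electrostatic and magnetostatic fields of the slice and retains non-paraxial terms, which is where the nonlinear forces and emittance growth it studies come from.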

  11. Resource Management Technology: Los Alamos Technical Capabilities for Emergency Management,

    DTIC Science & Technology

    1983-07-18

    synthetic fuels from coal (analogous to the Fischer-Tropsch process), olefin polymerization, and flue-gas desulfurization. In order to successfully...world. It has been a major research effort here for decades. Also, in the area of desulfurization of flue gases, Los Alamos scientists have been...Tectonic and Geochemical Controls on Copper-Molybdenum Porphyry Mineralization in the Southwestern United States (M. J. Aldrich and A. W. Laughlin).

  12. Los Alamos Canyon Ice Rink Parking Flood Plain Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hathcock, Charles Dean; Keller, David Charles

    2015-02-10

    The project location is in Los Alamos Canyon east of the ice rink facility at the intersection of West and Omega roads (Figure 1). Forty-eight parking spaces will be constructed on the north and south sides of Omega Road, and a lighted walking path will be constructed to the ice rink. Some trees will be removed during this action. A guardrail of approximately 400 feet will be constructed along the north side of West Road to prevent unsafe parking in that area.

  13. Environmental surveillance at Los Alamos during 2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fuehne, David; Gallagher, Pat; Hjeresen, Denny

    2009-09-30

    Environmental Surveillance at Los Alamos reports are prepared annually by the Los Alamos National Laboratory (the Laboratory) Environmental Programs Directorate, as required by US Department of Energy Order 450.1, General Environmental Protection Program, and US Department of Energy Order 231.1A, Environment, Safety, and Health Reporting. These annual reports summarize environmental data that are used to determine compliance with applicable federal, state, and local environmental laws and regulations, executive orders, and departmental policies. Additional data, beyond the minimum required, are also gathered and reported as part of the Laboratory’s efforts to ensure public safety and to monitor environmental quality at and near the Laboratory. Chapter 1 provides an overview of the Laboratory’s major environmental programs and explains the risks and the actions taken to reduce risks at the Laboratory from environmental legacies and waste management operations. Chapter 2 reports the Laboratory’s compliance status for 2007. Chapter 3 provides a summary of the maximum radiological dose the public and biota populations could have potentially received from Laboratory operations and discusses chemical exposures. The environmental surveillance and monitoring data are organized by environmental media (Chapter 4, air; Chapters 5 and 6, water and sediments; Chapter 7, soils; and Chapter 8, foodstuffs and biota) in a format to meet the needs of a general and scientific audience. Chapter 9 provides a summary of the status of environmental restoration work around LANL. A glossary and a list of acronyms and abbreviations are in the back of the report. Appendix A explains the standards for environmental contaminants, Appendix B explains the units of measurements used in this report, Appendix C describes the Laboratory’s technical areas and their associated programs, and Appendix D provides web links to more information.

  14. An organizational survey of the Los Alamos Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shurberg, D.A.; Haber, S.B.

    An Organizational Survey (OS) was administered at the Los Alamos Site that queried employees on the subjects of organizational culture, various aspects of communications, employee commitment, work group cohesion, coordination of work, environmental, safety, and health concern, hazardous nature of work, safety and overall job satisfaction. The purpose of the OS is to measure in a quantitative and objective way the notion of "culture"; that is, the values, attitudes, and beliefs of the individuals working within the organization. In addition, through the OS, a broad sample of individuals can be reached that would probably not be interviewed or observed during the course of a typical assessment. The OS also provides a descriptive profile of the organization at one point in time that can then be compared to a profile taken at a different point in time to assess changes in the culture of the organization. While comparisons among groups are made, it is not the purpose of this report to make evaluative statements of which profile may be positive or negative. However, using the data presented in this report in conjunction with other evaluative activities may provide useful insight into the organization. The OS administration at the Los Alamos Site was the ninth to occur at a Department of Energy (DOE) facility. All data from the OS is presented in group summaries, by organization, department or directorate within organization, supervisory level both overall and within organization, and staff classification within organization. Statistically significant differences between groups are identified and discussed. 9 refs., 94 figs., 11 tabs.

  16. Calculated criticality for ²³⁵U/graphite systems using the VIM Monte Carlo code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, P.J.; Grasseschi, G.L.; Olsen, D.N.

    1992-01-01

    Calculations for highly enriched uranium and graphite systems gained renewed interest recently for the new production modular high-temperature gas-cooled reactor (MHTGR). Experiments to validate the physics calculations for these systems are being prepared for the Transient Reactor Test Facility (TREAT) reactor at Argonne National Laboratory (ANL-West) and in the Compact Nuclear Power Source facility at Los Alamos National Laboratory. The continuous-energy Monte Carlo code VIM, or equivalently the MCNP code, can utilize fully detailed models of the MHTGR and serve as benchmarks for the approximate multigroup methods necessary in full reactor calculations. Validation of these codes and their associated nuclear data did not exist for highly enriched ²³⁵U/graphite systems. Experimental data, used in development of more approximate methods, dates back to the 1960s. The authors have selected two independent sets of experiments for calculation with the VIM code. The carbon-to-uranium (C/U) ratios encompass the range of 2,000, representative of the new production MHTGR, to the ratio of 10,000 in the fuel of TREAT. Calculations used the ENDF/B-V data.

  17. Chaotic dynamics in accelerator physics

    NASA Astrophysics Data System (ADS)

    Cary, J. R.

    1992-11-01

    Substantial progress was made in several areas of accelerator dynamics. We have completed a design of an FEL wiggler with adiabatic trapping and detrapping sections to develop an understanding of longitudinal adiabatic dynamics and to create efficiency enhancements for recirculating free-electron lasers. We developed a computer code for analyzing the critical KAM tori that bound the dynamic aperture in circular machines. Studies of modes that arise due to the interaction of coasting beams with a narrow-spectrum impedance have begun. During this research, educational and research ties with the accelerator community at large have been strengthened.

  18. Physics in "Real Life": Accelerator-based Research with Undergraduates

    NASA Astrophysics Data System (ADS)

    Klay, J. L.

    All undergraduates in physics and astronomy should have access to significant research experiences. When given the opportunity to tackle challenging open-ended problems outside the classroom, students build their problem-solving skills in ways that better prepare them for the workplace or future research in graduate school. Accelerator-based research on fundamental nuclear and particle physics can provide a myriad of opportunities for undergraduate involvement in hardware and software development as well as "big data" analysis. The collaborative nature of large experiments exposes students to scientists of every culture and helps them begin to build their professional network even before they graduate. This paper presents an overview of my experiences - the good, the bad, and the ugly - engaging undergraduates in particle and nuclear physics research at the CERN Large Hadron Collider and the Los Alamos Neutron Science Center.

  19. Single-interface Richtmyer-Meshkov turbulent mixing at the Los Alamos Vertical Shock Tube

    DOE PAGES

    Wilson, Brandon Merrill; Mejia Alvarez, Ricardo; Prestridge, Katherine Philomena

    2016-04-12

    We studied Mach number and initial-conditions effects on Richtmyer–Meshkov (RM) mixing at the vertical shock tube (VST) at Los Alamos National Laboratory (LANL). At the VST, a perturbed stable light-to-heavy (air–SF6, A = 0.64) interface is impulsively accelerated with a shock wave to induce RM mixing. We investigate changes to both large and small scales of mixing caused by changing the incident Mach number (Ma = 1.3 and 1.45) and the three-dimensional (3D) perturbations on the interface. Simultaneous density (quantitative planar laser-induced fluorescence, PLIF) and velocity (particle image velocimetry, PIV) measurements are used to characterize the preshock initial conditions and the dynamic shocked interface. Initial conditions and fluid properties are characterized before shock. Using two types of dynamic measurements, time series (N = 5 realizations at ten locations) and statistics (N = 100 realizations at a single location) of the density and velocity fields, we calculate several mixing quantities. Mix width, density-specific-volume correlations, density–vorticity correlations, vorticity, enstrophy, strain, and instantaneous dissipation rate are examined at one downstream location. Results indicate that large-scale mixing, such as the mix width, is strongly dependent on Mach number, whereas small scales are strongly influenced by initial conditions. Lastly, the enstrophy and strain show focused mixing activity in the spike regions.
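
    The vorticity-based diagnostics named above can be illustrated with a short sketch (assumed uniform-grid data layout; this is not the VST analysis code): vorticity, enstrophy, and one common strain-magnitude definition are computed from a 2D velocity field by finite differences.

```python
# Illustrative sketch (assumed data layout, not the VST analysis code):
# vorticity, enstrophy, and strain magnitude from a 2D velocity field on a
# uniform grid, as one would compute them from PIV measurements.
import numpy as np

def mixing_diagnostics(u, v, dx, dy):
    """u, v: 2D arrays of velocity components indexed as [y, x]."""
    dudy, dudx = np.gradient(u, dy, dx)
    dvdy, dvdx = np.gradient(v, dy, dx)
    vorticity = dvdx - dudy                    # omega_z
    enstrophy = 0.5 * vorticity**2
    # |S| = sqrt(2 S_ij S_ij) for the 2D strain-rate tensor
    strain = np.sqrt(2.0 * (dudx**2 + dvdy**2) + (dudy + dvdx)**2)
    return vorticity, enstrophy, strain

# usage with synthetic data
x = np.linspace(0.0, 1.0, 128)
X, Y = np.meshgrid(x, x)
u = np.sin(2.0 * np.pi * Y)
v = np.cos(2.0 * np.pi * X)
_, ens, strn = mixing_diagnostics(u, v, dx=x[1] - x[0], dy=x[1] - x[0])
print("mean enstrophy:", ens.mean(), "mean strain:", strn.mean())
```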

  20. Neptune: An astrophysical smooth particle hydrodynamics code for massively parallel computer architectures

    NASA Astrophysics Data System (ADS)

    Sandalski, Stou

    Smooth particle hydrodynamics is an efficient method for modeling the dynamics of fluids. It is commonly used to simulate astrophysical processes such as binary mergers. We present a newly developed GPU accelerated smooth particle hydrodynamics code for astrophysical simulations. The code is named neptune after the Roman god of water. It is written in OpenMP parallelized C++ and OpenCL and includes octree based hydrodynamic and gravitational acceleration. The design relies on object-oriented methodologies in order to provide a flexible and modular framework that can be easily extended and modified by the user. Several pre-built scenarios for simulating collisions of polytropes and black-hole accretion are provided. The code is released under the MIT Open Source license and publicly available at http://code.google.com/p/neptune-sph/.
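
    For readers unfamiliar with the method, the following sketch shows the core SPH density estimate with the standard cubic-spline kernel. It uses a brute-force neighbor loop in plain Python rather than the octree and OpenCL kernels that neptune itself uses, and all particle data are synthetic.

```python
# Minimal SPH density estimate (illustrative only; neptune itself is C++/OpenCL).
# Uses the standard 3D cubic-spline (M4) kernel and a brute-force neighbor sum.
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 3D cubic-spline kernel with normalization 1/(pi h^3)."""
    sigma = 1.0 / (np.pi * h**3)
    q = r / h
    w = np.zeros_like(q)
    m1 = q < 1.0
    m2 = (q >= 1.0) & (q < 2.0)
    w[m1] = 1.0 - 1.5 * q[m1]**2 + 0.75 * q[m1]**3
    w[m2] = 0.25 * (2.0 - q[m2])**3
    return sigma * w

def sph_density(pos, mass, h):
    """Brute-force density sum; a production code would use a tree for neighbors."""
    rho = np.zeros(len(pos))
    for i in range(len(pos)):
        r = np.linalg.norm(pos - pos[i], axis=1)
        rho[i] = np.sum(mass * cubic_spline_kernel(r, h))
    return rho

pos = np.random.default_rng(1).uniform(0.0, 1.0, size=(500, 3))
rho = sph_density(pos, mass=np.full(500, 1.0 / 500), h=0.1)
print("mean density estimate:", rho.mean())
```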

  1. Computer modeling of test particle acceleration at oblique shocks

    NASA Technical Reports Server (NTRS)

    Decker, Robert B.

    1988-01-01

    The present evaluation of the basic techniques and illustrative results of numerical codes suitable for modeling charged-particle acceleration at oblique, fast-mode collisionless shocks emphasizes the treatment of ions as test particles, calculating particle dynamics through numerical integration along exact phase-space orbits. Attention is given to the acceleration of particles at planar, infinitesimally thin shocks, as well as to plasma simulations in which low-energy ions are injected and accelerated at quasi-perpendicular shocks with internal structure.
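
    A minimal example of the test-particle approach is sketched below: a relativistic Boris push integrates a proton orbit through prescribed electromagnetic fields. The uniform field model and the numerical parameters are placeholders, not the oblique-shock geometry used in the paper.

```python
# Sketch of a relativistic test-particle (Boris) pusher of the kind used to
# integrate ion orbits through prescribed fields; the uniform field model below
# is a placeholder, not an actual oblique-shock configuration.
import numpy as np

Q_OVER_M = 9.58e7      # proton charge-to-mass ratio [C/kg]
C = 2.998e8            # speed of light [m/s]

def em_fields(x, t):
    """Placeholder uniform fields; replace with a shock field model."""
    E = np.array([0.0, 1.0e3, 0.0])       # V/m
    B = np.array([0.0, 0.0, 5.0e-9])      # T
    return E, B

def boris_push(x, u, dt, t):
    """Advance position x and u = gamma*v by one step with the Boris scheme."""
    E, B = em_fields(x, t)
    u_minus = u + 0.5 * Q_OVER_M * E * dt
    gamma = np.sqrt(1.0 + np.dot(u_minus, u_minus) / C**2)
    tvec = 0.5 * Q_OVER_M * B * dt / gamma
    svec = 2.0 * tvec / (1.0 + np.dot(tvec, tvec))
    u_prime = u_minus + np.cross(u_minus, tvec)
    u_plus = u_minus + np.cross(u_prime, svec)
    u_new = u_plus + 0.5 * Q_OVER_M * E * dt
    gamma = np.sqrt(1.0 + np.dot(u_new, u_new) / C**2)
    return x + u_new / gamma * dt, u_new

x, u = np.zeros(3), np.array([4.0e5, 0.0, 0.0])   # ~400 km/s proton
for step in range(1000):
    x, u = boris_push(x, u, dt=1.0e-3, t=step * 1.0e-3)
print("final position:", x, "final u:", u)
```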

  2. New geochronologic and stratigraphic evidence confirms the Paleocene age of the dinosaur-bearing Ojo Alamo Sandstone and Animas Formation in the San Juan Basin, New Mexico and Colorado

    USGS Publications Warehouse

    Fassett, J.E.

    2009-01-01

    Dinosaur fossils are present in the Paleocene Ojo Alamo Sandstone and Animas Formation in the San Juan Basin, New Mexico, and Colorado. Evidence for the Paleocene age of the Ojo Alamo Sandstone includes palynologic and paleomagnetic data. Palynologic data indicate that the entire Ojo Alamo Sandstone, including the lower dinosaur-bearing part, is Paleocene in age. All of the palynomorph-productive rock samples collected from the Ojo Alamo Sandstone at multiple localities lacked Cretaceous index palynomorphs (except for rare, reworked specimens) and produced Paleocene index palynomorphs. Paleocene palynomorphs have been identified stratigraphically below dinosaur fossils at two separate localities in the Ojo Alamo Sandstone in the central and southern parts of the basin. The Animas Formation in the Colorado part of the basin also contains dinosaur fossils, and its Paleocene age has been established based on fossil leaves and palynology. Magnetostratigraphy provides independent evidence for the Paleocene age of the Ojo Alamo Sandstone and its dinosaur-bearing beds. Normal-polarity magnetochron C29n (early Paleocene) has been identified in the Ojo Alamo Sandstone at six localities in the southern part of the San Juan Basin. An assemblage of 34 skeletal elements from a single hadrosaur, found in the Ojo Alamo Sandstone in the southern San Juan Basin, provided conclusive evidence that this assemblage could not have been reworked from underlying Cretaceous strata. In addition, geochemical studies of 15 vertebrate bones from the Paleocene Ojo Alamo Sandstone and 15 bone samples from the underlying Kirtland Formation of Late Cretaceous (Campanian) age show that each sample suite contained distinctly different abundances of uranium and rare-earth elements, indicating that the bones were mineralized in place soon after burial, and that none of the Paleocene dinosaur bones analyzed had been reworked. © U.S. Geological Survey, Public Domain, April 2009.

  3. Validation of the WIMSD4M cross-section generation code with benchmark results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leal, L.C.; Deen, J.R.; Woodruff, W.L.

    1995-02-01

    The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment for Research and Test (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the procedure to generate cross-section libraries for reactor analyses and calculations utilizing the WIMSD4M code. To do so, the results of calculations performed with group cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected critical spheres, the TRX critical experiments, and calculations of a modified Los Alamos highly-enriched heavy-water moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented.

  4. Accelerated GPU based SPECT Monte Carlo simulations.

    PubMed

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-07

    Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: 99mTc, 111In, and 131I, using a low-energy high-resolution (LEHR) collimator, a medium-energy general-purpose (MEGP) collimator, and a high-energy general-purpose (HEGP) collimator, respectively. Point source, uniform source, cylindrical phantom, and anthropomorphic phantom acquisitions were simulated using a model of the GE Infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between the GATE and GGEMS platforms derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency.

  5. Synergia: an accelerator modeling tool with 3-D space charge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amundson, James F.; Spentzouris, P.

    2004-07-01

    High precision modeling of space-charge effects, together with accurate treatment of single-particle dynamics, is essential for designing future accelerators as well as optimizing the performance of existing machines. We describe Synergia, a high-fidelity parallel beam dynamics simulation package with fully three dimensional space-charge capabilities and a higher order optics implementation. We describe the computational techniques, the advanced human interface, and the parallel performance obtained using large numbers of macroparticles. We also perform code benchmarks comparing to semi-analytic results and other codes. Finally, we present initial results on particle tune spread, beam halo creation, and emittance growth in the Fermilab Booster accelerator.

  6. Efficient modeling of laser-plasma accelerator staging experiments using INF&RNO

    NASA Astrophysics Data System (ADS)

    Benedetti, C.; Schroeder, C. B.; Geddes, C. G. R.; Esarey, E.; Leemans, W. P.

    2017-03-01

    The computational framework INF&RNO (INtegrated Fluid & paRticle simulatioN cOde) allows for fast and accurate modeling, in 2D cylindrical geometry, of several aspects of laser-plasma accelerator physics. In this paper, we present some of the new features of the code, including the quasistatic Particle-In-Cell (PIC)/fluid modality, and describe the use of different computational grids and time steps for the laser envelope and the plasma wake. These and other features allow for a speedup of several orders of magnitude compared to standard full 3D PIC simulations while still retaining physical fidelity. INF&RNO is used to support the experimental activity at the BELLA Center, and we present an example of the application of the code to the laser-plasma accelerator staging experiment.
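
    The multi-rate idea mentioned above (separate time steps for different components of the problem, synchronized periodically) can be illustrated schematically. The sketch below is generic Python with placeholder update rules and an arbitrary choice of which component is subcycled; it is not the INF&RNO algorithm.

```python
# Schematic two-rate time stepping (illustrative only; not the INF&RNO scheme):
# one component is advanced on a coarse time step while the other is subcycled
# on a finer step, with the two synchronized at each coarse step. Which
# component gets the finer step is an arbitrary choice in this sketch.
def advance_wake(state, dt):
    state["wake"] += dt * state["envelope"]             # placeholder update rule
    return state

def advance_envelope(state, dt):
    state["envelope"] -= dt * 0.1 * state["envelope"]   # placeholder update rule
    return state

def run(n_coarse=100, dt_coarse=0.1, subcycles=8):
    state = {"wake": 0.0, "envelope": 1.0}
    dt_fine = dt_coarse / subcycles
    for _ in range(n_coarse):
        state = advance_wake(state, dt_coarse)   # coarse-step component
        for _ in range(subcycles):               # subcycled component
            state = advance_envelope(state, dt_fine)
    return state

print(run())
```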

  7. Tiger Team Assessment of the Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-11-01

    The purpose of the safety and health assessment was to determine the effectiveness of representative safety and health programs at the Los Alamos National Laboratory (LANL). Within the safety and health programs at LANL, performance was assessed in the following technical areas: Organization and Administration, Quality Verification, Operations, Maintenance, Training and Certification, Auxiliary Systems, Emergency Preparedness, Technical Support, Packaging and Transportation, Nuclear Criticality Safety, Security/Safety Interface, Experimental Activities, Site/Facility Safety Review, Radiological Protection, Personnel Protection, Worker Safety and Health (OSHA) Compliance, Fire Protection, Aviation Safety, Explosives Safety, Natural Phenomena, and Medical Services.

  8. Validation of the WIMSD4M cross-section generation code with benchmark results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deen, J.R.; Woodruff, W.L.; Leal, L.E.

    1995-01-01

    The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment Research and Test Reactor (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the WIMSD4M cross-section libraries for reactor modeling of fresh water moderated cores. The results of calculations performed with multigroup cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected HEU critical spheres, the TRX LEU critical experiments, and calculations of a modified Los Alamos HEU D{sub 2}O moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented.

  9. Growth promotion and colonization of switchgrass (Panicum virgatum) cv. Alamo by bacterial endophyte Burkholderia phytofirmans strain PsJN

    PubMed Central

    2012-01-01

    Background: Switchgrass is one of the most promising bioenergy crop candidates for the US. It gives relatively high biomass yield and can grow on marginal lands. However, its yields vary from year to year and from location to location. Thus it is imperative to develop a low-input and sustainable switchgrass feedstock production system. One of the most feasible ways to increase biomass yields is to harness benefits of microbial endophytes. Results: We demonstrate that one of the most studied plant growth promoting bacterial endophytes, Burkholderia phytofirmans strain PsJN, is able to colonize and significantly promote growth of switchgrass cv. Alamo under in vitro, growth chamber, and greenhouse conditions. In several in vitro experiments, the average fresh weight of PsJN-inoculated plants was approximately 50% higher than that of non-inoculated plants. When one-month-old seedlings were grown in a growth chamber for 30 days, the PsJN-inoculated Alamo plants had significantly higher shoot and root biomass compared to controls. Biomass yield (dry weight) averaged from five experiments was 54.1% higher in the inoculated treatment compared to the non-inoculated control. Similar results were obtained in greenhouse experiments with transplants grown in 4-gallon pots for two months. The inoculated plants exhibited more early tillers and persistent growth vigor, with 48.6% higher biomass than controls. We also found that PsJN could significantly promote growth of switchgrass cv. Alamo under sub-optimal conditions. However, PsJN-mediated growth promotion in switchgrass is genotype specific. Conclusions: Our results show B. phytofirmans strain PsJN significantly promotes growth of switchgrass cv. Alamo under different conditions, especially in the early growth stages, leading to enhanced production of tillers. This phenomenon may benefit switchgrass establishment in the first year. Moreover, PsJN significantly stimulated growth of switchgrass cv. Alamo under sub-optimal conditions

  10. Design of Linear Accelerator (LINAC) tanks for proton therapy via Particle Swarm Optimization (PSO) and Genetic Algorithm (GA) approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellano, T.; De Palma, L.; Laneve, D.

    2015-07-01

    A homemade computer code for designing a Side-Coupled Linear Accelerator (SCL) is written. It integrates a simplified model of SCL tanks with the Particle Swarm Optimization (PSO) algorithm. The main aim of the computer code is to obtain useful guidelines for the design of Linear Accelerator (LINAC) resonant cavities. The design procedure, assisted by this approach, seems very promising, allowing future improvements towards the optimization of actual accelerating geometries. (authors)
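
    A textbook PSO loop of the general kind referenced here is sketched below. The objective function is a placeholder stand-in for a cavity figure of merit (for example, squared deviation of a few geometric parameters from a target), and all parameter names and bounds are illustrative assumptions rather than the authors' model.

```python
# Minimal particle swarm optimizer (generic textbook PSO, not the authors' code)
# minimizing a placeholder figure of merit for a cavity-like parameter set.
import numpy as np

def figure_of_merit(x):
    """Placeholder objective: distance of a 3-parameter geometry from a target."""
    target = np.array([0.7, 2.5, 0.12])
    return np.sum((x - target)**2)

def pso(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)]
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)]
    return gbest, pbest_val.min()

best, val = pso(figure_of_merit, bounds=[(0.0, 1.0), (1.0, 5.0), (0.0, 1.0)])
print("best parameters:", best, "objective:", val)
```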

  11. Radionuclide concentrations in bees and honey in the vicinity of Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fresquez, P.R.; Armstrong, D.R.

    Honeybees are effective monitors of environmental pollution; they forage for pollen and nectar over a large area (≅7 km²), accumulate contaminants from air, water, plants, and soil, and return to a fixed location (the hive) for sampling. Los Alamos National Laboratory (LANL), in fact, has maintained a network of honeybee colonies within and around LANL for 16 years (1979 to 1994); the objectives for maintaining this honeybee network were to (1) determine the bioavailability of radionuclides in the environment and (2) estimate the committed effective dose equivalent (CEDE) to people who may consume honey from these beehives (Los Alamos and White Rock/Pajarito Acres townsites). Of all the radionuclides studied over the years, tritium (³H) was consistently picked up by the bees and was most readily transferred to the honey. Tritium in honey collected from hives located within LANL, for example, ranged in concentration from 0.07 Bq mL⁻¹ (1.9 pCi mL⁻¹) to 27.75 Bq mL⁻¹ (749.9 pCi mL⁻¹) (LANL Neutron Science Center); the average concentration of ³H in honey collected from hives located around the LANL area (perimeter) ranged from 0.34 Bq mL⁻¹ (9.3 pCi mL⁻¹) (White Rock/Pajarito Acres townsite) to 3.67 Bq mL⁻¹ (99.3 pCi mL⁻¹) (Los Alamos townsite). Overall, the CEDE from consuming 5 kg (11 lbs) of honey collected from hives located within the townsites of Los Alamos and White Rock/Pajarito Acres, based on the average concentration of all radionuclides measured over the years and after the regional (background) contribution was subtracted, was 0.074 μSv y⁻¹ (0.0074 mrem y⁻¹) and 0.024 μSv y⁻¹ (0.0024 mrem y⁻¹), respectively. The highest CEDE, based on the mean + 2 standard deviations (95% confidence level), was 0.334 μSv y⁻¹ (0.0334 mrem y⁻¹) (Los Alamos townsite).
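
    The dose pathway described above (concentration in honey times annual consumption times an ingestion dose coefficient) can be sketched as simple arithmetic. The honey density and the tritium ingestion dose coefficient below are assumed nominal values, the peak rather than the average concentration is used, and no background subtraction is applied, so the result is not expected to reproduce the CEDE values quoted in the abstract.

```python
# Back-of-the-envelope ingestion dose sketch (not the LANL dose methodology):
# dose = concentration x annual intake x ingestion dose coefficient.
# Assumed values: honey density and an ICRP-style HTO ingestion coefficient.
# Uses the peak townsite concentration and no background subtraction, so it
# will not reproduce the report's CEDE numbers.
BQ_PER_ML = 3.67                  # 3H in honey, Los Alamos townsite hives [Bq/mL]
DENSITY_G_PER_ML = 1.4            # assumed honey density [g/mL]
INTAKE_G = 5000.0                 # 5 kg of honey per year [g]
DOSE_COEFF_SV_PER_BQ = 1.8e-11    # assumed tritiated-water ingestion coefficient

intake_bq = BQ_PER_ML / DENSITY_G_PER_ML * INTAKE_G
dose_sv = intake_bq * DOSE_COEFF_SV_PER_BQ
print(f"intake = {intake_bq:.0f} Bq, dose = {dose_sv * 1e6:.3f} uSv per year")
```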

  12. Accelerating Climate Simulations Through Hybrid Computing

    NASA Technical Reports Server (NTRS)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPUs) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for the connection, we identified two challenges: (1) an identical MPI implementation is required in both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors and two IBM QS22 Cell blades, connected with InfiniBand), allowing for seamless offloading of compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.
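
    The offload pattern itself (keep the host call structure, route the compute-intensive kernel to accelerator-side workers) can be illustrated generically. The sketch below uses a Python process pool purely as a stand-in for the DAV-routed Cell-blade workers described above; the kernel and the data are placeholders.

```python
# Generic illustration of the offload pattern (a stand-in only; the paper uses
# IBM DAV to route calls to Cell blades, not Python multiprocessing): a
# compute-intensive kernel is dispatched to worker processes that play the
# role of remote accelerators, while the host keeps its original call structure.
from concurrent.futures import ProcessPoolExecutor
import math

def radiation_kernel(column):
    """Placeholder for the compute-intensive solar-radiation calculation."""
    return sum(math.exp(-0.01 * i) * v for i, v in enumerate(column))

def compute_radiation(columns, offload=True):
    if not offload:
        return [radiation_kernel(c) for c in columns]        # host-only path
    with ProcessPoolExecutor(max_workers=4) as pool:         # "accelerators"
        return list(pool.map(radiation_kernel, columns))

if __name__ == "__main__":
    columns = [[float(i + j) for i in range(1000)] for j in range(64)]
    print("total:", sum(compute_radiation(columns)))
```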

  13. Environmental Survey preliminary report, Los Alamos National Laboratory, Los Alamos, New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-01-01

    This report presents the preliminary findings from the first phase of the Environmental Survey of the United States Department of Energy's (DOE) Los Alamos National Laboratory (LANL), conducted March 29, 1987 through April 17, 1987. The Survey is being conducted by an interdisciplinary team of environmental specialists, led and managed by the Office of Environment, Safety and Health's Office of Environmental Audit. Individual team components are outside experts being supplied by a private contractor. The objective of the Survey is to identify environmental problems and areas of environmental risk associated with the LANL. The Survey covers all environmental media and all areas of environmental regulation. It is being performed in accordance with the DOE Environmental Survey Manual. The on-site phase of the Survey involves the review of existing site environmental data, observations of the operations carried on at the LANL, and interviews with site personnel. The Survey team developed a Sampling and Analysis Plan to assist in further assessing certain environmental problems identified during its on-site activities. The Sampling and Analysis Plan will be executed by the Idaho National Engineering Laboratory. When completed, the results will be incorporated into the LANL Environmental Survey Interim Report. The Interim Report will reflect the final determinations of the Survey for the LANL. 65 refs., 68 figs., 73 tabs.

  14. Extraordinary Tools for Extraordinary Science: The Impact of SciDAC on Accelerator Science & Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryne, Robert D.

    2006-08-10

    Particle accelerators are among the most complex and versatile instruments of scientific exploration. They have enabled remarkable scientific discoveries and important technological advances that span all programs within the DOE Office of Science (DOE/SC). The importance of accelerators to the DOE/SC mission is evident from an examination of the DOE document, "Facilities for the Future of Science: A Twenty-Year Outlook". Of the 28 facilities listed, 13 involve accelerators. Thanks to SciDAC, a powerful suite of parallel simulation tools has been developed that represent a paradigm shift in computational accelerator science. Simulations that used to take weeks or more now take hours, and simulations that were once thought impossible are now performed routinely. These codes have been applied to many important projects of DOE/SC including existing facilities (the Tevatron complex, the Relativistic Heavy Ion Collider), facilities under construction (the Large Hadron Collider, the Spallation Neutron Source, the Linac Coherent Light Source), and to future facilities (the International Linear Collider, the Rare Isotope Accelerator). The new codes have also been used to explore innovative approaches to charged particle acceleration. These approaches, based on the extremely intense fields that can be present in lasers and plasmas, may one day provide a path to the outermost reaches of the energy frontier. Furthermore, they could lead to compact, high-gradient accelerators that would have huge consequences for US science and technology, industry, and medicine. In this talk I will describe the new accelerator modeling capabilities developed under SciDAC, the essential role of multi-disciplinary collaboration with applied mathematicians, computer scientists, and other IT experts in developing these capabilities, and provide examples of how the codes have been used to support DOE/SC accelerator projects.

  15. Stormwater Pollution Prevention Plan for the TA-60-02 Salvage Warehouse, Los Alamos National Laboratory, Revision 3, January 2018

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burgin, Jillian Elizabeth

    This Storm Water Pollution Prevention Plan (SWPPP) was developed in accordance with the provisions of the Clean Water Act (33 U.S.C. §§1251 et seq., as amended) and the Multi-Sector General Permit for Storm Water Discharges Associated with Industrial Activity (U.S. EPA, June 2015) issued by the U.S. Environmental Protection Agency (EPA) for the National Pollutant Discharge Elimination System (NPDES), using the industry-specific permit requirements for Sector P (Land Transportation and Warehousing) as a guide. The applicable stormwater discharge permit is EPA General Permit Registration Number NMR053915, issued to Los Alamos National Security (LANS) (U.S. EPA, June 2015). Contents of the June 4, 2015 Multi-Sector General Permit can be viewed at: https://www.epa.gov/sites/production/files/2015-10/documents/msgp2015_finalpermit.pdf. This SWPPP applies to discharges of stormwater from the operational areas of the TA-60-02 Salvage and Warehouse facility at Los Alamos National Laboratory. Los Alamos National Laboratory (also referred to as LANL or the “Laboratory”) is owned by the Department of Energy (DOE) and is operated by Los Alamos National Security, LLC (LANS). Throughout this document, the term “facility” refers to the TA-60-02 Salvage/Warehouse and associated areas. The current permit expires at midnight on June 4, 2020. A copy of the facility NOI and the LANS Delegation of Authority Letter are located in Appendix C of this SWPPP.

  16. Environmental surveillance at Los Alamos during 2009

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fuehne, David; Poff, Ben; Hjeresen, Denny

    2010-09-30

    Environmental Surveillance at Los Alamos reports are prepared annually by the Los Alamos National Laboratory (the Laboratory) environmental organization, as required by US Department of Energy Order 5400.1, General Environmental Protection Program, and US Department of Energy Order 231.1A, Environment, Safety, and Health Reporting. These annual reports summarize environmental data that are used to determine compliance with applicable federal, state, and local environmental laws and regulations, executive orders, and departmental policies. Additional data, beyond the minimum required, are also gathered and reported as part of the Laboratory’s efforts to ensure public safety and to monitor environmental quality at and near the Laboratory. Chapter 1 provides an overview of the Laboratory’s major environmental programs and explains the risks and the actions taken to reduce risks at the Laboratory from environmental legacies and waste management operations. Chapter 2 reports the Laboratory’s compliance status for 2009. Chapter 3 provides a summary of the maximum radiological dose the public and biota populations could have potentially received from Laboratory operations and discusses chemical exposures. The environmental surveillance and monitoring data are organized by environmental media (air in Chapter 4; water and sediments in Chapters 5 and 6; soils in Chapter 7; and foodstuffs and biota in Chapter 8) in a format to meet the needs of a general and scientific audience. Chapter 9 provides a summary of the status of environmental restoration work around LANL. The new Chapter 10 describes the Laboratory’s environmental stewardship efforts and provides an overview of the health of the Rio Grande. A glossary and a list of acronyms and abbreviations are in the back of the report. Appendix A explains the standards for environmental contaminants, Appendix B explains the units of measurements used in this report, and Appendix C describes the Laboratory’s technical areas and their associated programs.

  17. Environmental surveillance at Los Alamos during 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kohen, K.; Stoker, A.; Stone, G.

    1994-07-01

    This report describes the environmental surveillance program at Los Alamos National Laboratory during 1992. The Laboratory routinely monitors for radiation and for radioactive and nonradioactive materials at (or on) Laboratory sites as well as in the surrounding region. LANL uses the monitoring results to determine compliance with appropriate standards and to identify potentially undesirable trends. Data were collected in 1992 to assess external penetrating radiation; quantities of airborne emissions and liquid effluents; concentrations of chemicals and radionuclides in ambient air, surface waters and groundwaters, municipal water supply, soils and sediments, and foodstuffs; and environmental compliance. Using comparisons with standards, regulations, and background levels, this report concludes that environmental effects from Laboratory operations are small and do not pose a demonstrable threat to the public, Laboratory employees, or the environment.

  18. Dissemination and support of ARGUS for accelerator applications. Technical progress report, April 24, 1991--January 20, 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The ARGUS code is a three-dimensional code system for simulating interactions among charged particles, electric and magnetic fields, and complex structures. It is a system of modules that share common utilities for grid and structure input, data handling, memory management, diagnostics, and other specialized functions. The code includes the fields due to the space charge and current density of the particles to achieve a self-consistent treatment of the particle dynamics. The physics modules in ARGUS include three-dimensional field solvers for electrostatics and electromagnetics, a three-dimensional electromagnetic frequency-domain module, a full particle-in-cell (PIC) simulation module, and a steady-state PIC model. These are described in the Appendix to this report. This project has a primary mission of developing the capabilities of ARGUS in accelerator modeling for release to the accelerator design community. Five major activities are being pursued in parallel during the first year of the project: to improve the code and/or add new modules that provide capabilities needed for accelerator design; to produce a User's Guide that documents the use of the code for all users; to release the code and the User's Guide to accelerator laboratories for their own use, and to obtain feedback from them; to build an interactive user interface for setting up ARGUS calculations; and to explore the use of ARGUS on high-power workstation platforms.

  19. Accurate and efficient spin integration for particle accelerators

    DOE PAGES

    Abell, Dan T.; Meiser, Dominic; Ranjbar, Vahid H.; ...

    2015-02-01

    Accurate spin tracking is a valuable tool for understanding spin dynamics in particle accelerators and can help improve the performance of an accelerator. In this paper, we present a detailed discussion of the integrators in the spin tracking code GPUSPINTRACK. We have implemented orbital integrators based on drift-kick, bend-kick, and matrix-kick splits. On top of the orbital integrators, we have implemented various integrators for the spin motion. These integrators use quaternions and Romberg quadratures to accelerate both the computation and the convergence of spin rotations. We evaluate their performance and accuracy in quantitative detail for individual elements as well as for the entire RHIC lattice. We exploit the inherently data-parallel nature of spin tracking to accelerate our algorithms on graphics processing units.
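
    A bare-bones version of the quaternion spin-rotation bookkeeping is sketched below (textbook quaternion algebra, not the GPUSPINTRACK implementation): per-element rotations are composed as unit quaternions and the composite is applied to a spin vector.

```python
# Sketch of quaternion-based spin rotation (textbook construction, not the
# GPUSPINTRACK code): compose per-element spin rotations as unit quaternions
# and apply the result to a spin vector. Axes and angles are arbitrary.
import numpy as np

def quat_from_axis_angle(axis, angle):
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    return np.concatenate(([np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis))

def quat_multiply(q1, q2):
    w1, v1 = q1[0], q1[1:]
    w2, v2 = q2[0], q2[1:]
    return np.concatenate(([w1 * w2 - v1 @ v2],
                           w1 * v2 + w2 * v1 + np.cross(v1, v2)))

def rotate_spin(q, s):
    """Rotate spin vector s by unit quaternion q (q * s * q_conjugate)."""
    qs = np.concatenate(([0.0], s))
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_multiply(quat_multiply(q, qs), q_conj)[1:]

# compose two element rotations, then apply to an initially vertical spin
q_total = quat_multiply(quat_from_axis_angle([0, 0, 1], 0.02),
                        quat_from_axis_angle([1, 0, 0], 0.01))
print(rotate_spin(q_total, np.array([0.0, 1.0, 0.0])))
```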

  20. Graphics Processing Unit Acceleration of Gyrokinetic Turbulence Simulations

    NASA Astrophysics Data System (ADS)

    Hause, Benjamin; Parker, Scott

    2012-10-01

    We find a substantial increase in on-node performance using Graphics Processing Unit (GPU) acceleration in gyrokinetic delta-f particle-in-cell simulation. Optimization is performed on a two-dimensional slab gyrokinetic particle simulation using the Portland Group Fortran compiler with the GPU accelerator compiler directives. We have implemented the GPU acceleration on a Core i7 gaming PC with an NVIDIA GTX 580 GPU. We find comparable, or better, acceleration relative to the NERSC DIRAC cluster with the NVIDIA Tesla C2050 computing processor. The Tesla C2050 is about 2.6 times more expensive than the GTX 580 gaming GPU. Optimization strategies and comparisons between DIRAC and the gaming PC will be presented. We will also discuss progress on optimizing the comprehensive three-dimensional general geometry GEM code.

  1. Transport calculations and accelerator experiments needed for radiation risk assessment in space.

    PubMed

    Sihver, Lembit

    2008-01-01

    The major uncertainties on space radiation risk estimates in humans are associated with the poor knowledge of the biological effects of low and high LET radiation, with a smaller contribution coming from the characterization of the space radiation field and its primary interactions with the shielding and the human body. However, to decrease the uncertainties on the biological effects and increase the accuracy of the risk coefficients for charged-particle radiation, the initial charged-particle spectra from the Galactic Cosmic Rays (GCRs) and the Solar Particle Events (SPEs), and the radiation transport through the shielding material of the space vehicle and the human body, must be better estimated. Since it is practically impossible to measure all primary and secondary particles from all possible position-projectile-target-energy combinations needed for a correct risk assessment in space, accurate particle and heavy ion transport codes must be used. These codes are also needed when estimating the risk for radiation-induced failures in advanced microelectronics, such as single-event effects, and the efficiency of different shielding materials. It is therefore important that the models and transport codes be carefully benchmarked and validated to make sure they fulfill preset accuracy criteria, e.g., to be able to predict particle fluence, dose, and energy distributions within a certain accuracy. When validating the accuracy of the transport codes, both space- and ground-based accelerator experiments are needed. The efficiency of passive shielding and protection of electronic devices should also be tested in accelerator experiments and compared to simulations using different transport codes. In this paper, different multipurpose particle and heavy ion transport codes are presented, different concepts of shielding and protection are discussed, and future accelerator experiments needed for testing and validating codes and shielding materials are described.

  2. Multitasking the three-dimensional shock wave code CTH on the Cray X-MP/416

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGlaun, J.M.; Thompson, S.L.

    1988-01-01

    CTH is a software system under development at Sandia National Laboratories, Albuquerque, that models multidimensional, multi-material, large-deformation, strong shock wave physics. CTH was carefully designed to both vectorize and multitask on the Cray X-MP/416. All of the physics routines are vectorized except the thermodynamics and the interface tracer. All of the physics routines are multitasked except the boundary conditions. The Los Alamos National Laboratory multitasking library was used for the multitasking. The resulting code is easy to maintain, easy to understand, gives the same answers as the unitasked code, and achieves a measured speedup of approximately 3.5 on the four-CPU Cray. This document discusses the design, prototyping, development, and debugging of CTH. It also covers the architectural features of CTH that enhance multitasking, the granularity of the tasks, and the synchronization of tasks. The utility of system software and tools such as simulators and interactive debuggers is also discussed. 5 refs., 7 tabs.

  3. Code Verification of the HIGRAD Computational Fluid Dynamics Solver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Buren, Kendra L.; Canfield, Jesse M.; Hemez, Francois M.

    2012-05-04

    The purpose of this report is to outline code and solution verification activities applied to HIGRAD, a Computational Fluid Dynamics (CFD) solver of the compressible Navier-Stokes equations developed at the Los Alamos National Laboratory, and used to simulate various phenomena such as the propagation of wildfires and atmospheric hydrodynamics. Code verification efforts, as described in this report, are an important first step to establish the credibility of numerical simulations. They provide evidence that the mathematical formulation is properly implemented without significant mistakes that would adversely impact the application of interest. Highly accurate analytical solutions are derived for four code verification test problems that exercise different aspects of the code. These test problems are referred to as: (i) the quiet start, (ii) the passive advection, (iii) the passive diffusion, and (iv) the piston-like problem. These problems are simulated using HIGRAD with different levels of mesh discretization, and the numerical solutions are compared to their analytical counterparts. In addition, the rates of convergence are estimated to verify the numerical performance of the solver. The first three test problems produce numerical approximations as expected. The fourth test problem (piston-like) indicates the extent to which the code is able to simulate a 'mild' discontinuity, which is a condition that would typically be better handled by a Lagrangian formulation. The current investigation concludes that the numerical implementation of the solver performs as expected. The quality of solutions is sufficient to provide credible simulations of fluid flows around wind turbines. The main caveat associated with these findings is the low coverage provided by these four problems and the somewhat limited scope of the verification activities. A more comprehensive evaluation of HIGRAD may be beneficial for future studies.
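
    The convergence-rate step mentioned above follows the standard observed-order-of-accuracy formula; the sketch below applies it to hypothetical error values and is not tied to HIGRAD output.

```python
# Sketch of the standard observed-order-of-accuracy estimate used in code
# verification (generic formula, not HIGRAD-specific): given errors against an
# exact solution on two meshes related by refinement ratio r,
#   p = ln(e_coarse / e_fine) / ln(r).
import math

def observed_order(e_coarse, e_fine, refinement_ratio):
    return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

# hypothetical L2 errors from a smooth test problem on 64- and 128-cell meshes
p = observed_order(e_coarse=4.1e-4, e_fine=1.05e-4, refinement_ratio=2.0)
print(f"observed order of accuracy: {p:.2f}")   # ~1.97, i.e. second order
```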

  4. A Handbook for Derivative Classifiers at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sinkula, Barbara Jean

    The Los Alamos Classification Office (within the SAFE-IP group) prepared this handbook as a resource for the Laboratory’s derivative classifiers (DCs). It contains information about United States Government (USG) classification policy, principles, and authorities as they relate to the LANL Classification Program in general, and to the LANL DC program specifically. At a working level, DCs review Laboratory documents and material that are subject to classification review requirements, while the Classification Office provides the training and resources for DCs to perform that vital function.

  5. Modeling multi-GeV class laser-plasma accelerators with INF&RNO

    NASA Astrophysics Data System (ADS)

    Benedetti, Carlo; Schroeder, Carl; Bulanov, Stepan; Geddes, Cameron; Esarey, Eric; Leemans, Wim

    2016-10-01

    Laser plasma accelerators (LPAs) can produce accelerating gradients on the order of tens to hundreds of GV/m, making them attractive as compact particle accelerators for radiation production or as drivers for future high-energy colliders. Understanding and optimizing the performance of LPAs requires detailed numerical modeling of the nonlinear laser-plasma interaction. We present simulation results, obtained with the computationally efficient, PIC/fluid code INF&RNO (INtegrated Fluid & paRticle simulatioN cOde), concerning present (multi-GeV stages) and future (10 GeV stages) LPA experiments performed with the BELLA PW laser system at LBNL. In particular, we will illustrate the issues related to the guiding of a high-intensity, short-pulse, laser when a realistic description for both the laser driver and the background plasma is adopted. Work Supported by the U.S. Department of Energy under contract No. DE-AC02-05CH11231.

  6. pycola: N-body COLA method code

    NASA Astrophysics Data System (ADS)

    Tassev, Svetlin; Eisenstein, Daniel J.; Wandelt, Benjamin D.; Zaldarriaga, Matias

    2015-09-01

    pycola is a multithreaded Python/Cython N-body code, implementing the Comoving Lagrangian Acceleration (COLA) method in the temporal and spatial domains, which trades accuracy at small-scales to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing. The COLA method achieves its speed by calculating the large-scale dynamics exactly using LPT while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos.

  7. Destructive analysis capabilities for plutonium and uranium characterization at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tandon, Lav; Kuhn, Kevin J; Drake, Lawrence R

    Los Alamos National Laboratory's (LANL) Actinide Analytical Chemistry (AAC) group has been in existence since the Manhattan Project. It maintains a complete set of analytical capabilities for performing complete characterization (elemental assay, isotopics, and metallic and nonmetallic trace impurities) of uranium and plutonium samples in different forms. For a majority of the customers there are strong quality assurance (QA) and quality control (QC) objectives, including the highest accuracy and precision with well-defined uncertainties associated with the analytical results. Los Alamos participates in various international and national programs, such as the Plutonium Metal Exchange Program, New Brunswick Laboratory's (NBL's) Safeguards Measurement Evaluation Program (SME), and several other inter-laboratory round-robin exercises, to monitor and evaluate the data quality generated by AAC. These programs also provide independent verification of analytical measurement capabilities and allow any technical problems with analytical measurements to be identified and corrected. This presentation will focus on key analytical capabilities for destructive analysis in AAC and also comparative data between LANL and peer groups for Pu assay and isotopic analysis.

  8. Use of color-coded sleeve shutters accelerates oscillograph channel selection

    NASA Technical Reports Server (NTRS)

    Bouchlas, T.; Bowden, F. W.

    1967-01-01

    Sleeve-type shutters mechanically adjust individual galvanometer light beams onto or away from selected channels on oscillograph papers. In complex test setups, the sleeve-type shutters are color coded to separately identify each oscillograph channel. This technique could be used on any equipment using tubular galvanometer light sources.

  9. The ZPIC educational code suite

    NASA Astrophysics Data System (ADS)

    Calado, R.; Pardal, M.; Ninhos, P.; Helm, A.; Mori, W. B.; Decyk, V. K.; Vieira, J.; Silva, L. O.; Fonseca, R. A.

    2017-10-01

    Particle-in-Cell (PIC) codes are used in almost all areas of plasma physics, such as fusion energy research, plasma accelerators, space physics, ion propulsion, and plasma processing, among many other areas. In this work, we present the ZPIC educational code suite, a new initiative to foster training in plasma physics using computer simulations. Leveraging our expertise and experience from the development and use of the OSIRIS PIC code, we have developed a suite of 1D/2D fully relativistic electromagnetic PIC codes, as well as a 1D electrostatic code. These codes are self-contained and require only a standard laptop/desktop computer with a C compiler to run. The output files are written in a new file format called ZDF that can be easily read using the supplied routines in a number of languages, such as Python and IDL. The code suite also includes a number of example problems that can be used to illustrate several textbook and advanced plasma mechanisms, including instructions for parameter space exploration. We also invite contributions to this repository of test problems, which will be made freely available to the community provided the input files comply with the format defined by the ZPIC team. The code suite is freely available and hosted on GitHub at https://github.com/zambzamb/zpic. Work partially supported by PICKSC.

  10. Laser-driven dielectric electron accelerator for radiobiology researches

    NASA Astrophysics Data System (ADS)

    Koyama, Kazuyoshi; Matsumura, Yosuke; Uesaka, Mitsuru; Yoshida, Mitsuhiro; Natsui, Takuya; Aimierding, Aimidula

    2013-05-01

    In order to estimate the health risk associated with a low radiation dose, the fundamental process of the radiation effects in a living cell must be understood. It is desired that an electron bunch or photon pulse precisely strike a cell nucleus and DNA. The required electron energy and bunch charge are several tens of keV to 1 MeV and 0.1 fC to 1 fC, respectively. A beam size smaller than a micron is better for precise observation. Since a laser-driven dielectric electron accelerator appears well suited to a compact micro-beam source, a phase-modulation-masked-type laser-driven dielectric accelerator was studied. Although a preliminary analysis concluded that the grating period and electron speed must satisfy the matching condition LG/λ = v/c, a deformation of the wavefront in a pillar of the grating relaxed the matching condition and enabled slow electrons to be accelerated. Simulation results using the free FDTD code Meep showed that a low-energy electron of 20 keV experienced an acceleration field strength of 20 MV/m and gradually experienced higher fields as its speed increased; an ultra-relativistic electron experienced a field strength of 600 MV/m. The Meep code also showed that the accelerator length needed to reach an energy of 1 MeV was 3.8 mm, and the required laser power and energy were 11 GW and 350 mJ, respectively. Restrictions on the laser are eased by adopting sequential laser pulses: if the accelerator is illuminated by N sequential pulses, the pulse power, pulse width, and pulse energy are reduced to 1/N, 1/N, and 1/N², respectively. The required laser power per pulse is estimated to be 2.2 GW when ten pairs of sequential laser pulses are irradiated.
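
    The sequential-pulse scaling quoted at the end of the abstract is easy to tabulate. The sketch below applies the stated 1/N and 1/N² reductions to the quoted single-pulse requirement; how the abstract's "ten pairs" maps onto N is not specified here, so the table is purely illustrative of the scaling.

```python
# Quick check of the sequential-pulse scaling quoted in the abstract (simple
# arithmetic, not a simulation): with N sequential pulses, per-pulse power and
# width scale as 1/N and per-pulse energy as 1/N^2.
def per_pulse_requirements(power_single_w, energy_single_j, n_pulses):
    return power_single_w / n_pulses, energy_single_j / n_pulses**2

# single-pulse requirement quoted for the 3.8 mm, 1 MeV structure
POWER_1, ENERGY_1 = 11e9, 0.35          # 11 GW, 350 mJ
for n in (1, 5, 10):
    p, e = per_pulse_requirements(POWER_1, ENERGY_1, n)
    print(f"N = {n:2d}: {p / 1e9:5.2f} GW per pulse, {e * 1e3:7.3f} mJ per pulse")
```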

  11. Validation of the analytical methods in the LWR code BOXER for gadolinium-loaded fuel pins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paratte, J.M.; Arkuszewski, J.J.; Kamboj, B.K.

    1990-01-01

    Due to the very high absorption occurring in gadolinium-loaded fuel pins, calculations of lattices with such pins present are a demanding test of the analysis methods in light water reactor (LWR) cell and assembly codes. Considerable effort has, therefore, been devoted to the validation of code methods for gadolinia fuel. The goal of the work reported in this paper is to check the analysis methods in the LWR cell/assembly code BOXER and its associated cross-section processing code ETOBOX, by comparison of BOXER results with those from a very accurate Monte Carlo calculation for a gadolinium benchmark problem. Initial results of such a comparison have been previously reported. However, the Monte Carlo calculations, done with the MCNP code, were performed at Los Alamos National Laboratory using ENDF/B-V data, while the BOXER calculations were performed at the Paul Scherrer Institute using JEF-1 nuclear data. This difference in the basic nuclear data used for the two calculations, caused by the restricted nature of these evaluated data files, led to associated uncertainties in a comparison of the results for methods validation. In the joint investigations at the Georgia Institute of Technology and PSI, such uncertainty in this comparison was eliminated by using ENDF/B-V data for BOXER calculations at Georgia Tech.

  12. Seismic site coefficients and acceleration design response spectra based on conditions in South Carolina : final report.

    DOT National Transportation Integrated Search

    2014-11-15

    The simplified procedure in design codes for determining earthquake response spectra involves estimating site coefficients to adjust available rock accelerations to site accelerations. Several investigators have noted concerns with the site coeff...

  13. Extraordinary tools for extraordinary science: the impact of SciDAC on accelerator science and technology

    NASA Astrophysics Data System (ADS)

    Ryne, Robert D.

    2006-09-01

    Particle accelerators are among the most complex and versatile instruments of scientific exploration. They have enabled remarkable scientific discoveries and important technological advances that span all programs within the DOE Office of Science (DOE/SC). The importance of accelerators to the DOE/SC mission is evident from an examination of the DOE document, ''Facilities for the Future of Science: A Twenty-Year Outlook.'' Of the 28 facilities listed, 13 involve accelerators. Thanks to SciDAC, a powerful suite of parallel simulation tools has been developed that represent a paradigm shift in computational accelerator science. Simulations that used to take weeks or more now take hours, and simulations that were once thought impossible are now performed routinely. These codes have been applied to many important projects of DOE/SC including existing facilities (the Tevatron complex, the Relativistic Heavy Ion Collider), facilities under construction (the Large Hadron Collider, the Spallation Neutron Source, the Linac Coherent Light Source), and to future facilities (the International Linear Collider, the Rare Isotope Accelerator). The new codes have also been used to explore innovative approaches to charged particle acceleration. These approaches, based on the extremely intense fields that can be present in lasers and plasmas, may one day provide a path to the outermost reaches of the energy frontier. Furthermore, they could lead to compact, high-gradient accelerators that would have huge consequences for US science and technology, industry, and medicine. In this talk I will describe the new accelerator modeling capabilities developed under SciDAC, the essential role of multi-disciplinary collaboration with applied mathematicians, computer scientists, and other IT experts in developing these capabilities, and provide examples of how the codes have been used to support DOE/SC accelerator projects.

  14. Empirical evidence for acceleration-dependent amplification factors

    USGS Publications Warehouse

    Borcherdt, R.D.

    2002-01-01

    Site-specific amplification factors, Fa and Fv, used in current U.S. building codes decrease with increasing base acceleration level as implied by the Loma Prieta earthquake at 0.1g and extrapolated using numerical models and laboratory results. The Northridge earthquake recordings of 17 January 1994 and subsequent geotechnical data permit empirical estimates of amplification at base acceleration levels up to 0.5g. Distance measures and normalization procedures used to infer amplification ratios from soil-rock pairs in predetermined azimuth-distance bins significantly influence the dependence of amplification estimates on base acceleration. Factors inferred using a hypocentral distance norm do not show a statistically significant dependence on base acceleration. Factors inferred using norms implied by the attenuation functions of Abrahamson and Silva show a statistically significant decrease with increasing base acceleration. The decrease is statistically more significant for stiff clay and sandy soil (site class D) sites than for stiffer sites underlain by gravelly soils and soft rock (site class C). The decrease in amplification with increasing base acceleration is more pronounced for the short-period amplification factor, Fa, than for the midperiod factor, Fv.
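
    As a rough illustration of how such factors are defined, the Python sketch below (an assumption-laden example, not the procedure of this paper) averages the soil-to-rock response spectral ratio over short-period and mid-period bands commonly associated with Fa and Fv (taken here as roughly 0.1-0.5 s and 0.4-2.0 s); the spectra themselves are placeholder values.

      import numpy as np

      def amplification_factor(periods, soil_sa, rock_sa, band):
          """Average spectral ratio soil/rock over a period band (s)."""
          periods = np.asarray(periods)
          mask = (periods >= band[0]) & (periods <= band[1])
          ratio = np.asarray(soil_sa)[mask] / np.asarray(rock_sa)[mask]
          return ratio.mean()

      # Hypothetical response spectra (g) on a common period grid -- illustration only.
      T = np.array([0.1, 0.2, 0.3, 0.5, 0.75, 1.0, 1.5, 2.0])
      soil = np.array([0.35, 0.55, 0.60, 0.50, 0.40, 0.30, 0.20, 0.14])
      rock = np.array([0.25, 0.33, 0.32, 0.27, 0.20, 0.15, 0.10, 0.07])

      Fa = amplification_factor(T, soil, rock, (0.1, 0.5))   # short-period factor
      Fv = amplification_factor(T, soil, rock, (0.4, 2.0))   # mid-period factor
      print(f"Fa ~ {Fa:.2f}, Fv ~ {Fv:.2f}")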

  15. A history of the working group to address Los Alamos community health concerns - A case study of community involvement and risk communication

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harry Otway; Jon Johnson

    2000-01-01

    In May 1991, at a Department of Energy (DOE) public hearing at Los Alamos, New Mexico, a local artist claimed there had been a recent brain tumor cluster in a small Los Alamos neighborhood. He suggested the cause was radiation from past operations of Los Alamos National Laboratory. Data from the Laboratory's extensive environmental monitoring program gave no reason to believe this charge to be true but also could not prove it false. These allegations, reported in the local and regional media, alarmed the community and revealed an unsuspected lack of trust in the Laboratory. Having no immediate and definitive response, the Laboratory offered to collaborate with the community to address this concern. The Los Alamos community accepted this offer and a joint Community-Laboratory Working Group met for the first time 29 days later. The working group set as its primary goal the search for possible carcinogens in the local environment. Meanwhile, the DOE announced its intention to fund the New Mexico Department of Health to perform a separate and independent epidemiological study of all Los Alamos cancer rates. In early 1994, after commissioning 17 environmental studies and meeting 34 times, the working group decided that the public health concerns had been resolved to the satisfaction of the community and voted to disband. This paper tells the story of the artist and the working group, and how the media covered their story. It summarizes the environmental studies directed by the working group and briefly reviews the main findings of the epidemiology study. An epilogue records the present-day recollections of some of the key players in this environmental drama.

  16. Enhanced Verification Test Suite for Physics Simulation Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, J R; Brock, J S; Brandon, S T

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of
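
    As a minimal illustration of the kind of quantitative evaluation a strong-sense benchmark calls for, the Python sketch below computes an error norm against a benchmark solution and the observed order of convergence from two grid refinements; the error values and acceptance tolerance are placeholders, not part of the proposed suite.

      import numpy as np

      def l2_error(numerical, exact):
          """Discrete L2 error norm between a code result and a benchmark solution."""
          return np.sqrt(np.mean((np.asarray(numerical) - np.asarray(exact)) ** 2))

      def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
          """Observed order of accuracy p from errors on two grids, e_c/e_f = r**p."""
          return np.log(err_coarse / err_fine) / np.log(refinement_ratio)

      # Hypothetical errors from runs on 100- and 200-cell grids of a 1-D test problem.
      e_coarse, e_fine = 4.0e-3, 1.1e-3
      p = observed_order(e_coarse, e_fine)
      print(f"observed order ~ {p:.2f}")   # compare against the scheme's formal order
      # A simple acceptance criterion might require |p - p_formal| < 0.2, for example.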

  17. Bradbury science museum: your window to Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deck, Linda Theresa

    The Bradbury Science Museum is the public's window to Los Alamos National Laboratory and supports the Community Program Office's mission to develop community support to accomplish LANL's national security and science mission. It does this by stimulating interest in and increasing basic knowledge of science and technology in northern New Mexico audiences, and increasing public understanding and appreciation of how LANL science and technology solve our global problems. In performing these prime functions, the Museum also preserves the history of scientific accomplishment at the Lab by collecting and preserving artifacts of scientific and historical importance.

  18. Spectral-element Seismic Wave Propagation on CUDA/OpenCL Hardware Accelerators

    NASA Astrophysics Data System (ADS)

    Peter, D. B.; Videau, B.; Pouget, K.; Komatitsch, D.

    2015-12-01

    Seismic wave propagation codes are essential tools to investigate a variety of wave phenomena in the Earth. Furthermore, they can now be used for seismic full-waveform inversions in regional- and global-scale adjoint tomography. Although these seismic wave propagation solvers are crucial ingredients to improve the resolution of tomographic images to answer important questions about the nature of Earth's internal processes and subsurface structure, their practical application is often limited due to high computational costs. They thus need high-performance computing (HPC) facilities to improve the current state of knowledge. At present, numerous large HPC systems embed many-core architectures such as graphics processing units (GPUs) to enhance numerical performance. Such hardware accelerators can be programmed using either the CUDA programming environment or the OpenCL language standard. CUDA software development targets NVIDIA graphics cards while OpenCL was adopted by additional hardware accelerators, such as AMD graphics cards, ARM-based processors, and Intel Xeon Phi coprocessors. For seismic wave propagation simulations using the open-source spectral-element code package SPECFEM3D_GLOBE, we incorporated an automatic source-to-source code generation tool (BOAST) which allows us to use meta-programming of all computational kernels for forward and adjoint runs. Using our BOAST kernels, we generate optimized source code for both CUDA and OpenCL languages within the source code package. Thus, seismic wave simulations are now able to fully utilize CUDA and OpenCL hardware accelerators. We show benchmarks of forward seismic wave propagation simulations using SPECFEM3D_GLOBE on CUDA/OpenCL GPUs, validating results and comparing performances for different simulations and hardware usages.
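
    BOAST itself is a Ruby-based meta-programming framework; purely to illustrate the single-source idea (this is not BOAST or the SPECFEM3D_GLOBE build system), the Python sketch below emits the same element-wise kernel as either CUDA C or OpenCL C text from one shared body.

      # Toy illustration of single-source kernel generation for CUDA and OpenCL.
      # It only mimics the idea of emitting per-backend source text from one
      # high-level description of the kernel body.

      KERNEL_BODY = "out[i] = a[i] + scale * b[i];"

      def generate(backend: str, name: str = "saxpy_like") -> str:
          if backend == "cuda":
              return f"""
      __global__ void {name}(const float *a, const float *b, float *out,
                             float scale, int n) {{
          int i = blockIdx.x * blockDim.x + threadIdx.x;
          if (i < n) {{ {KERNEL_BODY} }}
      }}"""
          if backend == "opencl":
              return f"""
      __kernel void {name}(__global const float *a, __global const float *b,
                           __global float *out, float scale, int n) {{
          int i = get_global_id(0);
          if (i < n) {{ {KERNEL_BODY} }}
      }}"""
          raise ValueError(f"unknown backend: {backend}")

      print(generate("cuda"))
      print(generate("opencl"))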

  19. Light element opacities of astrophysical interest from ATOMIC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colgan, J.; Kilcrease, D. P.; Magee, N. H. Jr.

    We present new calculations of local-thermodynamic-equilibrium (LTE) light element opacities from the Los Alamos ATOMIC code for systems of astrophysical interest. ATOMIC is a multi-purpose code that can generate LTE or non-LTE quantities of interest at various levels of approximation. Our calculations, which include fine-structure detail, represent a systematic improvement over previous Los Alamos opacity calculations using the LEDCOP legacy code. The ATOMIC code uses ab-initio atomic structure data computed from the CATS code, which is based on Cowan's atomic structure codes, and photoionization cross section data computed from the Los Alamos ionization code GIPPER. ATOMIC also incorporates a new equation-of-state (EOS) model based on the chemical picture. ATOMIC incorporates some physics packages from LEDCOP and also includes additional physical processes, such as improved free-free cross sections and additional scattering mechanisms. Our new calculations are made for elements of astrophysical interest and for a wide range of temperatures and densities.
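
    As a hedged illustration of how a frequency-dependent LTE opacity table can be collapsed into gray means, the Python sketch below computes Planck and Rosseland means by quadrature; the monochromatic opacity used is a placeholder, not ATOMIC output.

      import numpy as np

      H = 6.62607015e-27      # erg s   (CGS)
      K_B = 1.380649e-16      # erg/K
      C = 2.99792458e10       # cm/s

      def planck_bnu(nu, T):
          """Planck specific intensity B_nu(T) in CGS units."""
          x = H * nu / (K_B * T)
          return (2.0 * H * nu**3 / C**2) / np.expm1(x)

      def dbnu_dT(nu, T):
          """Temperature derivative of B_nu, the Rosseland weighting function."""
          x = H * nu / (K_B * T)
          return (2.0 * H**2 * nu**4 / (C**2 * K_B * T**2)) * np.exp(x) / np.expm1(x)**2

      def planck_mean(nu, kappa_nu, T):
          w = planck_bnu(nu, T)
          return np.trapz(kappa_nu * w, nu) / np.trapz(w, nu)

      def rosseland_mean(nu, kappa_nu, T):
          w = dbnu_dT(nu, T)
          return np.trapz(w, nu) / np.trapz(w / kappa_nu, nu)

      # Placeholder monochromatic opacity (cm^2/g) on a frequency grid, T = 1e6 K.
      nu = np.logspace(14, 18, 400)
      kappa = 1.0 + 50.0 * (nu / 1e16) ** -3      # crude free-free-like falloff
      print(planck_mean(nu, kappa, 1e6), rosseland_mean(nu, kappa, 1e6))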

  20. UFO: A THREE-DIMENSIONAL NEUTRON DIFFUSION CODE FOR THE IBM 704

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Auerbach, E.H.; Jewett, J.P.; Ketchum, M.A.

    A description of UFO, a code for the solution of the few-group neutron diffusion equation in three-dimensional Cartesian coordinates on the IBM 704, is given. An accelerated Liebmann flux iteration scheme is used, and optimum parameters can be calculated by the code whenever they are required. The theory and operation of the program are discussed. (auth)
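
    A minimal sketch of an accelerated Liebmann (successive over-relaxation) sweep is shown below for a one-group, fixed-source diffusion problem on a small 3-D Cartesian grid; it is a Python stand-in with placeholder material data, not the UFO algorithm itself.

      import numpy as np

      def sor_diffusion(n=10, h=1.0, D=1.0, sigma_a=0.02, S=1.0,
                        omega=1.7, tol=1e-6, max_sweeps=5000):
          """One-group, fixed-source diffusion  -D*lap(phi) + sigma_a*phi = S
          on an n^3 cube with zero-flux boundaries, solved by SOR
          (the 'accelerated Liebmann' iteration)."""
          phi = np.zeros((n + 2, n + 2, n + 2))          # ghost layer holds the BC
          diag = 6.0 * D / h**2 + sigma_a
          for sweep in range(max_sweeps):
              max_change = 0.0
              for i in range(1, n + 1):
                  for j in range(1, n + 1):
                      for k in range(1, n + 1):
                          nbr = (phi[i-1, j, k] + phi[i+1, j, k] +
                                 phi[i, j-1, k] + phi[i, j+1, k] +
                                 phi[i, j, k-1] + phi[i, j, k+1])
                          gs = (S + D * nbr / h**2) / diag      # Gauss-Seidel value
                          new = phi[i, j, k] + omega * (gs - phi[i, j, k])
                          max_change = max(max_change, abs(new - phi[i, j, k]))
                          phi[i, j, k] = new
              if max_change < tol:
                  return phi[1:-1, 1:-1, 1:-1], sweep + 1
          return phi[1:-1, 1:-1, 1:-1], max_sweeps

      flux, sweeps = sor_diffusion()
      print(f"converged in {sweeps} sweeps; peak flux ~ {flux.max():.3f}")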

  1. Environmental surveillance at Los Alamos during 1987

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-05-01

    This report describes the environmental surveillance program conducted by Los Alamos National Laboratory during 1987. Routine monitoring for radiation and radioactive or chemical materials is conducted on the Laboratory site as well as in the surrounding region. Monitoring results are used to determine compliance with appropriate standards and to permit early identification of potentially undesirable trends. Results and interpretation of data for 1987 cover: external penetrating radiation; quantities of airborne emissions and liquid effluents; concentrations of chemicals and radionuclides in ambient air, surface and ground waters, municipal water supply, soils and sediments, and foodstuffs; and environmental compliance. Comparisons with appropriate standards, regulations, and background levels provide the basis for concluding that environmental effects from Laboratory operations are insignificant and do not pose a threat to the public, Laboratory employees, or the environment. 113 refs., 33 figs., 120 tabs.

  2. Environmental surveillance at Los Alamos during 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-10-01

    This report describes the environmental surveillance program at Los Alamos National Laboratory (LANL or the Laboratory) during 1995. The Laboratory routinely monitors for radiation and for radioactive and nonradioactive materials at (or on) Laboratory sites as well as in the surrounding region. LANL uses the monitoring results to determine compliance with appropriate standards and to identify potentially undesirable trends. Data were collected in 1995 to assess external penetrating radiation; quantities of airborne emissions and liquid effluents; concentrations of chemicals and radionuclides in ambient air, surface waters and groundwaters, municipal water supply, soils and sediments, and foodstuffs; and environmental compliance. Using comparisons with standards, regulations, and background levels, this report concludes that environmental effects from Laboratory operations are small and do not pose a demonstrable threat to the public, Laboratory employees, or the environment.

  3. Los Alamos Plutonium Facility Waste Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, K.; Montoya, A.; Wieneke, R.

    1997-02-01

    This paper describes the new computer-based transuranic (TRU) Waste Management System (WMS) being implemented at the Plutonium Facility at Los Alamos National Laboratory (LANL). The Waste Management System is a distributed computer processing system stored in a Sybase database and accessed by a graphical user interface (GUI) written in Omnis7. It resides on the local area network at the Plutonium Facility and is accessible by authorized TRU waste originators, count room personnel, radiation protection technicians (RPTs), quality assurance personnel, and waste management personnel for data input and verification. Future goals include bringing outside groups like the LANL Waste Management Facility on-line to participate in this streamlined system. The WMS is changing the TRU paper trail into a computer trail, saving time and eliminating errors and inconsistencies in the process.

  4. 2013 Los Alamos National Laboratory Hazardous Waste Minimization Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salzman, Sonja L.; English, Charles J.

    2015-08-24

    Waste minimization and pollution prevention are inherent goals within the operating procedures of Los Alamos National Security, LLC (LANS). The US Department of Energy (DOE) and LANS are required to submit an annual hazardous waste minimization report to the New Mexico Environment Department (NMED) in accordance with the Los Alamos National Laboratory (LANL or the Laboratory) Hazardous Waste Facility Permit. The report was prepared pursuant to the requirements of Section 2.9 of the LANL Hazardous Waste Facility Permit. This report describes the hazardous waste minimization program (a component of the overall Waste Minimization/Pollution Prevention [WMin/PP] Program) administered by the Environmental Stewardship Group (ENV-ES). This report also supports the waste minimization and pollution prevention goals of the Environmental Programs Directorate (EP) organizations that are responsible for implementing remediation activities and describes its programs to incorporate waste reduction practices into remediation activities and procedures. LANS was very successful in fiscal year (FY) 2013 (October 1-September 30) in WMin/PP efforts. Staff funded four projects specifically related to reduction of waste with hazardous constituents, and LANS won four national awards for pollution prevention efforts from the National Nuclear Security Administration (NNSA). In FY13, there was no hazardous, mixed transuranic (MTRU), or mixed low-level (MLLW) remediation waste generated at the Laboratory. More hazardous waste, MTRU waste, and MLLW was generated in FY13 than in FY12, and the majority of the increase was related to MTRU processing or lab cleanouts. These accomplishments and analysis of the waste streams are discussed in much more detail within this report.

  5. Increasing the power of accelerated molecular dynamics methods and plans to exploit the coming exascale

    NASA Astrophysics Data System (ADS)

    Voter, Arthur

    Many important materials processes take place on time scales that far exceed the roughly one microsecond accessible to molecular dynamics simulation. Typically, this long-time evolution is characterized by a succession of thermally activated infrequent events involving defects in the material. In the accelerated molecular dynamics (AMD) methodology, known characteristics of infrequent-event systems are exploited to make reactive events take place more frequently, in a dynamically correct way. For certain processes, this approach has been remarkably successful, offering a view of complex dynamical evolution on time scales of microseconds, milliseconds, and sometimes beyond. We have recently made advances in all three of the basic AMD methods (hyperdynamics, parallel replica dynamics, and temperature accelerated dynamics (TAD)), exploiting both algorithmic advances and novel parallelization approaches. I will describe these advances, present some examples of our latest results, and discuss what should be possible when exascale computing arrives in roughly five years. Funded by the U.S. Department of Energy, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, and by the Los Alamos Laboratory Directed Research and Development program.
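
    A toy Python illustration of the bookkeeping behind parallel replica dynamics is sketched below: for a first-order escape process, running N independent replicas and crediting the event with the simulated time summed over all replicas reproduces the correct escape-time statistics while shrinking the wall-clock time per event by roughly N. The rates and replica counts are arbitrary placeholders.

      import numpy as np

      rng = np.random.default_rng(0)

      def parrep_escape_times(rate, n_replicas, n_events=20000):
          """Toy parallel-replica bookkeeping for one exponential escape process.

          Each replica evolves independently; the first replica to escape ends the
          event, and the simulated time credited to the event is the time summed
          over all replicas up to that point (n_replicas * t_first)."""
          # time-to-escape in each replica for every event: Exp(rate)
          t = rng.exponential(1.0 / rate, size=(n_events, n_replicas))
          t_first = t.min(axis=1)            # wall-clock-like time per event
          return n_replicas * t_first        # accumulated simulated time per event

      rate = 2.0                             # escape rate (arbitrary units)
      for n_rep in (1, 8, 64):
          times = parrep_escape_times(rate, n_rep)
          print(f"N={n_rep:3d}: mean escape time {times.mean():.3f} "
                f"(exact 1/rate = {1.0 / rate:.3f})")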

  6. Efficacy of Alamo for prophylactic and therapeutic treatment of oak wilt in red oaks, 2004

    Treesearch

    K. Ward; J. Juzwik; S. Bernick

    2004-01-01

    An experiment (prophylactic study) to determine the efficacy of Alamo in preventing spread of C. fagacearum through grafted roots of oak wilt-affected and of apparently healthy red oaks was initiated in eight locations in east-central and southeastern Minnesota in Jul 2002.

  7. Environmental assessment for effluent reduction, Los Alamos National Laboratory, Los Alamos, New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-09-11

    The Department of Energy (DOE) proposes to eliminate industrial effluent from 27 outfalls at Los Alamos National Laboratory (LANL). The Proposed Action includes both simple and extensive plumbing modifications, which would result in the elimination of industrial effluent being released to the environment through 27 outfalls. The industrial effluent currently going to about half of the 27 outfalls under consideration would be rerouted to LANL's sanitary sewer system. Industrial effluent from other outfalls would be eliminated by replacing once-through cooling water systems with recirculation systems, or, in a few instances, operational changes would result in no generation of industrial effluent. After the industrial effluents have been discontinued, the affected outfalls would be removed from the NPDES Permit. The pipes from the source building or structure to the discharge point for the outfalls may be plugged, or excavated and removed. Other outfalls would remain intact and would continue to discharge stormwater. The No Action alternative, which would maintain the status quo for LANL's outfalls, was also analyzed. An alternative in which industrial effluent would be treated at the source facilities was considered but dismissed from further analysis because it would not reasonably meet the DOE's purpose for action, and its potential environmental effects were bounded by the analysis of the Proposed Action and the No Action alternatives.

  8. Accelerator-based validation of shielding codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeitlin, Cary; Heilbronn, Lawrence; Miller, Jack

    2002-08-12

    The space radiation environment poses risks to astronaut health from a diverse set of sources, ranging from low-energy protons and electrons to highly-charged, high-energy atomic nuclei and their associated fragmentation products, including neutrons. The low-energy protons and electrons are the source of most of the radiation dose to Shuttle and ISS crews, while the more energetic particles that comprise the Galactic Cosmic Radiation (protons, He, and heavier nuclei up to Fe) will be the dominant source for crews on long-duration missions outside the earth's magnetic field. Because of this diversity of sources, a broad ground-based experimental effort is required to validate the transport and shielding calculations used to predict doses and dose-equivalents under various mission scenarios. The experimental program of the LBNL group, described here, focuses principally on measurements of charged particle and neutron production in high-energy heavy-ion fragmentation. Other aspects of the program include measurements of the shielding provided by candidate spacesuit materials against low-energy protons (particularly relevant to extra-vehicular activities in low-earth orbit), and the depth-dose relations in tissue for higher-energy protons. The heavy-ion experiments are performed at the Brookhaven National Laboratory's Alternating Gradient Synchrotron and the Heavy-Ion Medical Accelerator in Chiba, Japan. Proton experiments are performed at the Lawrence Berkeley National Laboratory's 88-Inch Cyclotron with a 55 MeV beam, and at the Loma Linda University Proton Facility with 100 to 250 MeV beam energies. The experimental results are an important component of the overall shielding program, as they allow for simple, well-controlled tests of the models developed to handle the more complex radiation environment in space.

  9. A preliminary design of the collinear dielectric wakefield accelerator

    NASA Astrophysics Data System (ADS)

    Zholents, A.; Gai, W.; Doran, S.; Lindberg, R.; Power, J. G.; Strelnikov, N.; Sun, Y.; Trakhtenberg, E.; Vasserman, I.; Jing, C.; Kanareykin, A.; Li, Y.; Gao, Q.; Shchegolkov, D. Y.; Simakov, E. I.

    2016-09-01

    A preliminary design of the multi-meter long collinear dielectric wakefield accelerator that achieves a highly efficient transfer of the drive bunch energy to the wakefields and to the witness bunch is considered. It is made from 0.5 m long accelerator modules containing a vacuum chamber with dielectric-lined walls, a quadrupole wiggler, an rf coupler, and a BPM assembly. The single bunch breakup instability is a major limiting factor for accelerator efficiency, and BNS damping is applied to obtain stable multi-meter-long propagation of a drive bunch. Numerical simulations using a 6D particle tracking computer code are performed and tolerances to various errors are defined.
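
    As a hedged sketch of the transverse piece of such particle tracking (illustrative only; not the 6D code used in the paper), the Python example below transports (x, x', y, y') coordinates through a thin-lens FODO cell standing in for the quadrupole wiggler, with purely illustrative lengths and strengths.

      import numpy as np

      def drift(L):
          """4x4 transfer matrix for a drift of length L (x, x', y, y')."""
          M = np.eye(4)
          M[0, 1] = M[2, 3] = L
          return M

      def thin_quad(f):
          """Thin-lens quadrupole, focusing in x for f > 0 and defocusing in y."""
          M = np.eye(4)
          M[1, 0] = -1.0 / f
          M[3, 2] = +1.0 / f
          return M

      # One FODO cell, applied right-to-left: QF, drift, QD, drift.
      cell = drift(0.25) @ thin_quad(-0.5) @ drift(0.25) @ thin_quad(0.5)

      # Track a small bunch of particles through 40 cells.
      rng = np.random.default_rng(1)
      coords = rng.normal(scale=[1e-4, 1e-5, 1e-4, 1e-5], size=(1000, 4)).T
      for _ in range(40):
          coords = cell @ coords

      print("rms x after 40 cells:", coords[0].std())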

  10. Ecological baseline studies in Los Alamos and Guaje Canyons County of Los Alamos, New Mexico. A two-year study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foxx, T.S.

    1995-11-01

    During the summers of 1993 and 1994, the Biological Resource Evaluations Team (BRET) of the Environmental Protection Group (ESH-8) conducted baseline studies within two canyon systems, Los Alamos and Guaje Canyons. Biological data was collected within each canyon to provide background and baseline information for Ecological Risk models. Baseline studies included establishment of permanent vegetation plots within each canyon along the elevational gradient. Then, in association with the various vegetation types, surveys were conducted for ground dwelling insects, birds, and small mammals. The stream channels associated with the permanent vegetation plots were characterized and aquatic macroinvertebrates collected within the streammore » monthly throughout a six-month period. The Geographic Position System (GPS) in combination with ARC INFO was used to map the study areas. Considerable data was collected during these surveys and are summarized in individual chapters.« less

  11. Los Alamos Science, Number 25 -- 1997: Celebrating the Neutrino

    DOE R&D Accomplishments Database

    Cooper, N. G. ed.

    1997-01-01

    This issue is devoted to the neutrino and its remaining mysteries. It is divided into the following areas: (1) The Reines-Cowan experiment -- detecting the poltergeist; (2) The oscillating neutrino -- an introduction to neutrino masses and mixing; (3) A brief history of neutrino experiments at LAMPF; (4) A thousand eyes -- the story of LSND (Los Alamos neutrino oscillation experiment); (5) The evidence for oscillations; (6) The nature of neutrinos in muon decay and physics beyond the Standard Model; (7) Exorcising ghosts -- in pursuit of the missing solar neutrinos; (8) MSW -- a possible solution to the solar neutrino problem; (9) Neutrinos and supernovae; and (10) Dark matter and massive neutrinos.

  12. Hybrid parallel code acceleration methods in full-core reactor physics calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Courau, T.; Plagne, L.; Ponicot, A.

    2012-07-01

    When dealing with nuclear reactor calculation schemes, the need for three dimensional (3D) transport-based reference solutions is essential for both validation and optimization purposes. Considering a benchmark problem, this work investigates the potential of discrete ordinates (Sn) transport methods applied to 3D pressurized water reactor (PWR) full-core calculations. First, the benchmark problem is described. It involves a pin-by-pin description of a 3D PWR first core, and uses an 8-group cross-section library prepared with the DRAGON cell code. Then, a convergence analysis is performed using the PENTRAN parallel Sn Cartesian code. It discusses the spatial refinement and the associated angular quadrature required to properly describe the problem physics. It also shows that initializing the Sn solution with the EDF SPN solver COCAGNE reduces the number of iterations required to converge by nearly a factor of 6. Using a best estimate model, PENTRAN results are then compared to multigroup Monte Carlo results obtained with the MCNP5 code. Good consistency is observed between the two methods (Sn and Monte Carlo), with discrepancies that are less than 25 pcm for the k-eff, and less than 2.1% and 1.6% for the flux at the pin-cell level and for the pin-power distribution, respectively. (authors)
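
    The quoted discrepancies are typically expressed as in the short Python helper below, which converts a k-eff difference to pcm (one common convention) and reports the maximum relative pin-power difference; the numerical values are placeholders, not results from the paper.

      import numpy as np

      def delta_pcm(k_test, k_ref):
          """k-eff discrepancy in pcm (1 pcm = 1e-5) relative to the reference."""
          return 1.0e5 * (k_test - k_ref) / k_ref

      def max_relative_diff(p_test, p_ref):
          """Maximum relative difference (%) between two pin-power maps."""
          p_test, p_ref = np.asarray(p_test), np.asarray(p_ref)
          return 100.0 * np.max(np.abs(p_test - p_ref) / p_ref)

      # Placeholder values for illustration only.
      print(f"dk = {delta_pcm(1.00210, 1.00190):.1f} pcm")
      pins_sn = [[1.02, 0.98], [1.01, 0.99]]
      pins_mc = [[1.03, 0.97], [1.00, 1.00]]
      print(f"max pin-power diff = {max_relative_diff(pins_sn, pins_mc):.2f} %")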

  13. Los Alamos Guns Take Aim at Material's Mysteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byers, Mark; Moore, David; Dimarino, Steve

    Los Alamos National Laboratory scientists and technicians conduct thousands of experiments a year, delving into the fundamental nature of everything from supernovas to subatomic particles. One set of instruments used to better understand the fundamental nature of various materials are 10 scientific gun systems that fire various projectiles at high-tech targets to create enormous velocities, pressures, and temperatures - and using laser, x-ray, and other diagnostics - explore the very nature of metals and other materials. The hundreds of gun-based experiments conducted every year at the Laboratory require a highly-skilled staff of scientists and technicians, and have given rise to a special organization called the "gun working group" to foster open communications, cooperation, problem-solving, and a healthy safety culture.

  14. Smoking patterns among Los Alamos National Laboratory employees

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahoney, M.C.; Wilkinson, G.S.

    Smoking patterns among 5507 employees at Los Alamos National Laboratory were investigated for those who underwent physical examinations by occupational physicians from 1978 to 1983. More male than female employees smoked, although differences in smoking rates between the sexes were not as large as differences observed for national smoking rates. Employees over 40 were more likely to smoke than younger employees, males consumed more cigarettes than did females, and Anglo employees smoked more cigarettes than did Hispanic employees. Highly educated employees smoked less than did less-educated workers, and staff members exhibited the lowest rates of smoking. Smoking cessation programs for Laboratory employees should be directed toward those subpopulations with the highest rates of smoking. 31 refs., 8 figs., 1 tab.

  15. Los Alamos Guns Take Aim at Material's Mysteries

    ScienceCinema

    Byers, Mark; Moore, David; Dimarino, Steve

    2018-05-30

    Los Alamos National Laboratory scientists and technicians conduct thousands of experiments a year, delving into the fundamental nature of everything from supernovas to subatomic particles. One set of instruments used to better understand the fundamental nature of various materials are 10 scientific gun systems that fire various projectiles at high-tech targets to create enormous velocities, pressures, and temperatures - and using laser, x-ray, and other diagnostics - explore the very nature of metals and other materials. The hundreds of gun-based experiments conducted every year at the Laboratory require a highly-skilled staff of scientists and technicians, and have given rise to a special organization called the "gun working group" to foster open communications, cooperation, problem-solving, and a healthy safety culture.

  16. Stochastic Particle Acceleration in the Hot Spots of FRII Radio Galaxies

    NASA Astrophysics Data System (ADS)

    Liu, Siming; Fan, Z.; Wang, J.; Fryer, C. L.; Li, H.

    2007-12-01

    Chandra, XMM-Newton, and HST observations of FRII radio galaxies, in combination with traditional radio studies, have advanced our understanding of the nature of jets, hot spots, and lobes significantly. The observed radio to optical emission has been attributed to synchrotron processes. The X-ray emission can be produced through synchrotron, synchrotron self-Comptonization, and inverse Comptonization of the CMB or other background photons. Phenomenological modeling of the observed broadband spectra has led to good constraints on the magnetic field and electron distribution. However, the matter and energy contents of the relativistic outflows driven by the central black holes, which power these sources, are still not well-constrained, and we also lack an understanding of the physical processes that determine the energy partition between the electrons and the magnetic field, the low energy cutoff of the electron spectrum, and the electron acceleration rate in these strongly magnetized relativistic plasmas. In the context of stochastic particle acceleration, we propose a model for the hot spots of radio galaxies and show how it may help us to address the above issues. This work was funded in part under the auspices of the US Department of Energy, and supported by its contract W-7405-ENG-36 to Los Alamos National Laboratory.
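
    One standard relation used in such phenomenological fits links the electron Lorentz factor and field strength to the observed synchrotron break: the characteristic frequency nu_c = (3/2) gamma^2 (e B / 2 pi m_e c) sin(alpha). The Python sketch below evaluates it with placeholder hot-spot parameters in CGS units; it is an illustration, not the model of the paper.

      import math

      E_CGS = 4.8032e-10     # electron charge (esu)
      M_E = 9.1094e-28       # electron mass (g)
      C = 2.9979e10          # speed of light (cm/s)

      def synchrotron_nu_c(gamma, B_gauss, pitch_deg=90.0):
          """Characteristic synchrotron frequency (Hz):
          nu_c = 1.5 * gamma^2 * nu_g * sin(alpha), with
          nu_g = e*B/(2*pi*m_e*c) the non-relativistic gyrofrequency."""
          nu_g = E_CGS * B_gauss / (2.0 * math.pi * M_E * C)
          return 1.5 * gamma**2 * nu_g * math.sin(math.radians(pitch_deg))

      # Placeholder parameters: gamma ~ 1e5 electrons in a ~100 microgauss field.
      print(f"nu_c ~ {synchrotron_nu_c(1e5, 1e-4):.2e} Hz")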

  17. Floodplain Assessment for the Upper Cañon de Valle Watershed Enhancement Project in Technical Area 16 at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hathcock, Charles Dean; Keller, David Charles; Sartor, Karla A.

    This floodplain assessment was prepared in accordance with 10 Code of Federal Regulations (CFR) 1022 Compliance with Floodplain and Wetland Environmental Review Requirements, which was promulgated to implement the U.S. Department of Energy (DOE) requirements under Executive Order 11988 Floodplain Management and Executive Order 11990 Wetlands Protection. According to 10 CFR 1022, a 100-year floodplain is defined as “the lowlands adjoining inland and coastal waters and relatively flat areas and flood prone areas of offshore islands.” In this action, DOE is proposing to control the run-on of storm water by slowing water velocity and managing sediments from the upper portions of the Cañon de Valle watershed on Los Alamos National Laboratory (LANL) property with a number of new watershed controls near and within the 100-year floodplain (hereafter floodplain). The proposed work will comply with requirements under the Settlement Agreement and Stipulated Final Compliance Order (Settlement Agreement) Number HWB-14-20.

  18. 77 FR 3257 - Transfer of Land Tracts Located at Los Alamos National Laboratory, New Mexico

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-23

    ... DEPARTMENT OF ENERGY Transfer of Land Tracts Located at Los Alamos National Laboratory, New Mexico AGENCY: National Nuclear Security Administration, U.S. Department of Energy. ACTION: Amended Record of Decision. SUMMARY: The U.S. Department of Energy's National Nuclear Security Administration (DOE/NNSA) is...

  19. Leading Change: A Case Study of Alamo Academies--An Industry-Driven Workforce Partnership Program

    ERIC Educational Resources Information Center

    Hu, Xiaodan; Bowman, Gene

    2016-01-01

    In this study, the authors focus on the initiation and development of the Alamo Academies, aiming to illustrate an exemplary industry-driven model that addresses workforce development in local community. After a brief introduction of the context, the authors summarized major factors that contribute to the success of the collaboration model,…

  20. Accelerator driven reactors and nuclear waste management projects in the Czech Republic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janouch, Frantisek; Mach, Rostislav; Institute of Nuclear Physics, Rez near Prague

    1995-09-15

    The Czech Republic is almost the only country in central Europe that continues with the construction of nuclear power reactors. Its small territory and dense population cause public worries concerning the disposal of spent nuclear fuel. Czech nuclear scientists, the power companies, and the nuclear industry are therefore looking for alternative solutions. The Los Alamos ATW project has received a positive response in the Czech mass media and even in industrial and governmental quarters. The recent scientific symposium "Accelerator driven reactors and nuclear waste management," convened at the Liblice castle near Prague, 27-29 June 1994, and sponsored by the Czech Energy Company CEZ, reviewed the competencies and experimental basis in the Czech Republic and made the first attempt to formulate a national approach and to establish international collaboration in this area.

  1. Using Kokkos for Performant Cross-Platform Acceleration of Liquid Rocket Simulations

    DTIC Science & Technology

    2017-05-08

    Briefing charts (5 April 2017 - 8 May 2017), ERC Incorporated / AFRL-West: "Using Kokkos for Performant Cross-Platform Acceleration of Liquid Rocket Simulations." Covers liquid rocket combustion simulation, including a SPACE simulation of a rotating detonation engine (courtesy of Dr. Christopher Lietz). Distribution A: approved for public release.

  2. Stormwater Pollution Prevention Plan for the TA-03-38 Metals Fabrication Shop, Los Alamos National Laboratory, Revision 3, January 2018

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burgin, Jillian Elizabeth

    This Storm Water Pollution Prevention Plan (SWPPP) was developed in accordance with the provisions of the Clean Water Act (33 U.S.C. §§1251 et seq., as amended), and the Multi-Sector General Permit for Storm Water Discharges Associated with Industrial Activity (U.S. EPA, June 2015) issued by the U.S. Environmental Protection Agency (EPA) for the National Pollutant Discharge Elimination System (NPDES) and using the industry specific permit requirements for Sector AA-Fabricated Metal Products as a guide. This SWPPP applies to discharges of stormwater from the operational areas of the TA-03-38 Metals Fabrication Shop at Los Alamos National Laboratory. Los Alamos National Laboratory (also referred to as LANL or the “Laboratory”) is owned by the Department of Energy (DOE), and is operated by Los Alamos National Security, LLC (LANS). Throughout this document, the term “facility” refers to the TA-03-38 Metals Fabrication Shop and associated areas. The current permit expires at midnight on June 4, 2020.

  3. Stormwater Pollution Prevention Plan for the TA-60-01 Heavy Equipment Shop, Los Alamos National Laboratory, Revision 3, January 2018

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burgin, Jillian Elizabeth

    This Storm Water Pollution Prevention Plan (SWPPP) was developed in accordance with the provisions of the Clean Water Act (33 U.S.C. §§1251 et seq., as amended), and the Multi-Sector General Permit for Storm Water Discharges Associated with Industrial Activity (U.S. EPA, June 2015) issued by the U.S. Environmental Protection Agency (EPA) for the National Pollutant Discharge Elimination System (NPDES) and using the industry specific permit requirements for Sector P-Land Transportation and Warehousing as a guide. This SWPPP applies to discharges of stormwater from the operational areas of the TA-60-01 Heavy Equipment Shop at Los Alamos National Laboratory. Los Alamos National Laboratory (also referred to as LANL or the “Laboratory”) is owned by the Department of Energy (DOE), and is operated by Los Alamos National Security, LLC (LANS). Throughout this document, the term “facility” refers to the TA-60-01 Heavy Equipment Shop and associated areas. The current permit expires at midnight on June 4, 2020.

  4. Environmental Surveillance at Los Alamos during 2007

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Environmental Surveillance at Los Alamos reports are prepared annually by the Los Alamos National Laboratory (the Laboratory) Environmental Directorate, as required by US Department of Energy Order 450.1, General Environmental Protection Program, and US Department of Energy Order 231.1A, Environment, Safety, and Health Reporting. These annual reports summarize environmental data that are used to determine compliance with applicable federal, state, and local environmental laws and regulations, executive orders, and departmental policies. Additional data, beyond the minimum required, are also gathered and reported as part of the Laboratory’s efforts to ensure public safety and to monitor environmental quality at and near the Laboratory. Chapter 1 provides an overview of the Laboratory’s major environmental programs and explains the risks and the actions taken to reduce risks at the Laboratory from environmental legacies and waste management operations. Chapter 2 reports the Laboratory’s compliance status for 2007. Chapter 3 provides a summary of the maximum radiological dose the public and biota populations could have potentially received from Laboratory operations and discusses chemical exposures. The environmental surveillance and monitoring data are organized by environmental media (Chapter 4, air; Chapters 5 and 6, water and sediments; Chapter 7, soils; and Chapter 8, foodstuffs and biota) in a format to meet the needs of a general and scientific audience. Chapter 9 provides a summary of the status of environmental restoration work around LANL. A glossary and a list of acronyms and abbreviations are in the back of the report. Appendix A explains the standards for environmental contaminants, Appendix B explains the units of measurements used in this report, Appendix C describes the laboratory’s technical areas and their associated programs, and Appendix D provides web links to more information. In printed copies of this report or Executive Summary

  5. The "El Alamo" project (1990-1997): two consecutive hospital-based studies of breast cancer outcomes in Spain.

    PubMed

    Martín, M; Mahillo, E; Llombart-Cussac, A; Lluch, A; Munarriz, B; Pastor, M; Alba, E; Ruiz, A; Antón, A; Bermejo, B

    2006-07-01

    The "Alamo" project is a retrospective analysis of 14,854 patients diagnosed with breast cancer between 1990 and 1997 in 50 Spanish hospitals. Alamo I (AI) consisted of 4,532 patients diagnosed with breast cancer between 1990 and 1993. Data were collected in 2000. Alamo II (AII) consisted of 10,322 patients diagnosed between 1994 and 1997. Data were collected in 2003. At presentation, there were (AI vs. AII) 17.6% vs. 24.3% at stage I; 55.5% vs. 53.1% at stage II; 18.7% vs. 15% at stage III; 7.2% vs. 5.9% at stage IV. Median age was 57 (AI) vs. 58 years (AII) and 65.9% vs. 67.2% (AI vs. AII) were post-menopausal. First-line treatment for disease stages I, II and III was surgery in 91% of patients in both studies. Breast conserving surgery rate increased from 20.2% (AI) to 32.7% (AII). Adjuvant systemic treatments were administered to 87.6% (AI) and 92.8% (AII) of patients. Recurrence rate diminished from 36.6% (AI) to 22.5% (AII) and the 9-year survival rate increased from 63.2% (95% CI: 61.5-64.9) to 70.1% (95% CI: 68.5-71.8). Breast cancer outcomes in Spain have improved from 1990-1993 to 1994-1997, likely because of breast cancer screening program implementation and new therapies.

  6. Erosion and Deposition Monitoring Using High-Density Aerial Lidar and Geomorphic Change Detection Software Analysis at Los Alamos National Laboratory, Los Alamos New Mexico, LA-UR-17-26743

    NASA Astrophysics Data System (ADS)

    Walker, T.; Kostrubala, T. L.; Muggleton, S. R.; Veenis, S.; Reid, K. D.; White, A. B.

    2017-12-01

    The Los Alamos National Laboratory storm water program installed sediment transport mitigation structures to reduce the migration of contaminants within the Los Alamos and Pueblo (LA/P) watershed in Los Alamos, NM. The goals of these structures are to minimize storm water runoff and erosion, enhance deposition, and reduce mobility of contaminated sediments. Previous geomorphological monitoring used GPS-surveyed cross-sections on a reach scale to interpolate annual geomorphic change in sediment volumes. While monitoring has confirmed the LA/P watershed structures are performing as designed, the cross-section method proved difficult to estimate uncertainty and the coverage area was limited. A new method, using the Geomorphic Change Detection (GCD) plugin for ESRI ArcGIS developed by Wheaton et al. (2010), with high-density aerial lidar data, has been used to provide high-confidence uncertainty estimates and greater areal coverage. Following the 2014 monsoon season, airborne lidar data has been collected annually and the resulting DEMs processed using the GCD method. Additionally, a more accurate characterization of low-amplitude geomorphic changes, typical of low-flow/low-rainfall monsoon years, has been documented by applying a spatially variable error to volume change calculations using the GCD-based fuzzy inference system (FIS). The FIS method allows for the calculation of uncertainty based on data set quality and density, e.g., point cloud density, ground slope, and degree of surface roughness. At the 95% confidence level, propagated uncertainty estimates of the 2015 and 2016 lidar DEM comparisons yielded detectable changes greater than 0.3 m - 0.46 m. Geomorphic processes identified and verified in the field are typified by low-amplitude, within-channel aggradation and incision and out-of-channel bank collapse that over the course of a monsoon season result in localized and detectable change. While the resulting reach scale volume change from 2015 - 2016 was often
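
    The core DEM-of-difference step described above can be sketched as follows (a simplified Python stand-in for the GCD workflow, not the GCD software itself): difference two DEMs, propagate per-cell errors in quadrature, and mask change below the 95% confidence threshold; a spatially variable FIS-derived error surface can be passed in place of the constant errors used here, and all grids shown are placeholders.

      import numpy as np

      def thresholded_dod(dem_new, dem_old, err_new, err_old, confidence=1.96):
          """DEM of Difference with per-cell error propagated in quadrature.

          err_new/err_old may be scalars or arrays (e.g., an FIS-derived error
          surface); cells whose |change| falls below confidence * propagated
          error are treated as no detectable change (NaN)."""
          dod = np.asarray(dem_new, float) - np.asarray(dem_old, float)
          prop = np.sqrt(np.asarray(err_new, float)**2 + np.asarray(err_old, float)**2)
          dod_sig = np.where(np.abs(dod) >= confidence * prop, dod, np.nan)
          return dod, dod_sig

      # Placeholder 3x3 DEM tiles (m) with a uniform 0.15 m error per survey.
      dem_2016 = np.array([[10.0, 10.2, 10.1], [10.4, 10.9, 10.3], [10.1, 10.2, 10.0]])
      dem_2015 = np.array([[10.1, 10.2, 10.2], [10.3, 10.4, 10.3], [10.2, 10.2, 10.1]])
      dod, dod_sig = thresholded_dod(dem_2016, dem_2015, 0.15, 0.15)
      cell_area = 1.0          # m^2 per cell (placeholder)
      print("net detectable volume change (m^3):", np.nansum(dod_sig) * cell_area)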

  7. Alamos: An International Collaboration to Provide a Space Based Environmental Monitoring Solution for the Deep Space Network

    NASA Astrophysics Data System (ADS)

    Kennedy, S. O.; Dunn, A.; Lecomte, J.; Buchheim, K.; Johansson, E.; Berger, T.

    2018-02-01

    This abstract presents the advantages of an externally mounted instrument in support of the human physiology, space biology, and human health and performance key science area. Alamos provides Space-Based Environmental Monitoring capabilities.

  8. Pinon Pine Tree Study, Los Alamos National Laboratory: Source document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. R. Fresquez; J. D. Huchton; M. A. Mullen

    One of the dominant tree species growing within and around Los Alamos National Laboratory (LANL), Los Alamos, NM, lands is the pinon pine (Pinus edulis) tree. Pinon pine is used for firewood, fence posts, and building materials and is a source of nuts for food--the seeds are consumed by a wide variety of animals and are also gathered by people in the area and eaten raw or roasted. This study investigated the (1) concentration of H-3, Cs-137, Sr-90, total uranium, Pu-238, Pu-239/240, and Am-241 in soils (0- to 12-in. [31 cm] depth underneath the tree), pinon pine shoots (PPS), and pinon pine nuts (PPN) collected from LANL lands and regional background (BG) locations, (2) concentrations of radionuclides in PPN collected from 1977 to the present, (3) committed effective dose equivalent (CEDE) from the ingestion of nuts, and (4) soil to PPS to PPN concentration ratios (CRs). Most radionuclides, with the exception of H-3 in soils, were not significantly higher (p < 0.10) in soils, PPS, and PPN collected from LANL as compared to BG locations, and concentrations of most radionuclides in PPN from LANL have decreased over time. The maximum net CEDE (the CEDE plus two sigma minus BG) at the most conservative ingestion rate (10 lb [4.5 kg]) was 0.0018 mrem (0.018 µSv). Soil-to-nut CRs for most radionuclides were within the range of default values in the literature for common fruits and vegetables.
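
    The CEDE arithmetic behind such an estimate is simple: concentration times annual intake times an ingestion dose coefficient, summed over nuclides. The Python sketch below shows only the form of the calculation; the concentrations and dose coefficients are placeholders, not the values used in this study.

      # CEDE from ingestion: sum over nuclides of
      #   concentration (pCi/g) * intake (g/yr) * dose coefficient (mrem/pCi).
      # The concentrations and dose coefficients below are placeholders for
      # illustration only; they are not the values used in the LANL study.

      INTAKE_G = 4.5e3   # ~10 lb of nuts per year, in grams

      nuclides = {
          #              conc (pCi/g)   DCF (mrem/pCi)  -- placeholder numbers
          "Cs-137":     (2.0e-3,        4.8e-5),
          "Sr-90":      (1.5e-3,        1.0e-4),
          "Pu-239/240": (5.0e-5,        9.0e-4),
      }

      cede = sum(conc * INTAKE_G * dcf for conc, dcf in nuclides.values())
      print(f"CEDE ~ {cede:.2e} mrem/yr")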

  9. Feral Cattle in the White Rock Canyon Reserve at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hathcock, Charles D.; Hansen, Leslie A.

    2014-03-27

    At the request of the Los Alamos Field Office (the Field Office), Los Alamos National Security (LANS) biologists placed remote-triggered wildlife cameras in and around the mouth of Ancho Canyon in the White Rock Canyon Reserve (the Reserve) to monitor use by feral cattle. The cameras were placed in October 2012 and retrieved in January 2013. Two cameras were placed upstream in Ancho Canyon away from the Rio Grande along the perennial flows from Ancho Springs, two cameras were placed at the north side of the mouth to Ancho Canyon along the Rio Grande, and two cameras were placed at the south side of the mouth to Ancho Canyon along the Rio Grande. The cameras recorded three different individual feral cows using this area as well as a variety of local native wildlife. This report details our results and issues associated with feral cattle in the Reserve. Feral cattle pose significant risks to human safety, impact cultural and biological resources, and affect the environmental integrity of the Reserve. Regional stakeholders have communicated to the Field Office that they support feral cattle removal.

  10. Los Alamos National Security, LLC Request for Information from industrial entities that desire to commercialize Laboratory-developed Extremely Low Resource Optical Identifier (ELROI) tech

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Michael Charles

    Los Alamos National Security, LLC (LANS) is the manager and operator of the Los Alamos National Laboratory for the U.S. Department of Energy National Nuclear Security Administration under contract DE-AC52-06NA25396. LANS is a mission-centric Federally Funded Research and Development Center focused on solving the most critical national security challenges through science and engineering for both government and private customers.

  11. Critical assembly: A technical history of Los Alamos during the Oppenheimer years, 1943--1945

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoddeson, L.; Henriksen, P.W.; Meade, R.A.

    1993-11-01

    This volume treats the technical research that led to the first atomic bombs. The authors explore how the "critical assembly" of scientists, engineers, and military personnel at Los Alamos collaborated during World War II, blending their traditions to create a new approach to large-scale research. The research was characterized by strong mission orientation, multidisciplinary teamwork, expansion of the scientists' traditional methodology with engineering techniques, and a trial-and-error methodology responding to wartime deadlines. The book opens with an introduction laying out major themes. After a synopsis of the prehistory of the bomb project, from the discovery of nuclear fission to the start of the Manhattan Engineer District, and an overview of the early materials program, the book examines the establishment of the Los Alamos Laboratory, the implosion and gun assembly programs, nuclear physics research, chemistry and metallurgy, explosives, uranium and plutonium development, confirmation of spontaneous fission in pile-produced plutonium, the thermonuclear bomb, critical assemblies, the Trinity test, and delivery of the combat weapons.

  12. A general multiblock Euler code for propulsion integration. Volume 3: User guide for the Euler code

    NASA Technical Reports Server (NTRS)

    Chen, H. C.; Su, T. Y.; Kao, T. J.

    1991-01-01

    This manual explains the procedures for using the general multiblock Euler (GMBE) code developed under NASA contract NAS1-18703. The code was developed for the aerodynamic analysis of geometrically complex configurations in either free air or wind tunnel environments (vol. 1). The complete flow field is divided into a number of topologically simple blocks within each of which surface fitted grids and efficient flow solution algorithms can easily be constructed. The multiblock field grid is generated with the BCON procedure described in volume 2. The GMBE utilizes a finite volume formulation with an explicit time stepping scheme to solve the Euler equations. A multiblock version of the multigrid method was developed to accelerate the convergence of the calculations. This user guide provides information on the GMBE code, including input data preparations with sample input files and a sample Unix script for program execution in the UNICOS environment.
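
    As a hedged model-problem sketch of the explicit finite-volume time stepping the guide describes (a 1-D linear-advection stand-in in Python, not the GMBE Euler scheme or its multigrid machinery), see the example below using first-order upwind fluxes and periodic boundaries.

      import numpy as np

      def upwind_advection(n_cells=200, a=1.0, cfl=0.8, t_end=0.5):
          """Explicit finite-volume update for 1-D linear advection u_t + a*u_x = 0
          with first-order upwind fluxes and periodic boundaries (a > 0 assumed)."""
          dx = 1.0 / n_cells
          dt = cfl * dx / a
          x = (np.arange(n_cells) + 0.5) * dx
          u = np.exp(-200.0 * (x - 0.3) ** 2)      # initial cell averages
          t = 0.0
          while t < t_end:
              flux = a * u                          # upwind flux at each cell's right face
              u = u - dt / dx * (flux - np.roll(flux, 1))
              t += dt
          return x, u

      x, u = upwind_advection()
      print("peak after advection:", u.max())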

  13. Supplement Analysis for the Site-Wide Environmental Impact Statement for Continued Operation of Los Alamos National Laboratory -- Recovery and Storage of Strontium-90 Fueled Radioisotope Thermal Electric Generators at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    N /A

    2004-01-22

    This Supplement Analysis (SA) has been prepared to determine if the Site-Wide Environmental Impact Statement for Continued Operations of Los Alamos National Laboratory (SWEIS) (DOE/EIS-0238) adequately addresses the environmental effects of recovery and storage for disposal of six strontium-90 (Sr-90) fueled radioisotope thermal electric generators (RTGs) at the Los Alamos National Laboratory (LANL) Technical Area (TA)-54, Area G, or if the SWEIS needs to be supplemented. DOE's National Nuclear Security Administration (NNSA) proposed to recover and store six Sr-90 RTGs from the commercial sector as part of its Offsite-Source Recovery Project (OSRP). The OSRP focuses on the proactive recovery and storage of unwanted radioactive sealed sources exceeding the US Nuclear Regulatory Commission (NRC) limits for Class C low-level waste (also known as Greater than Class C waste, or GTCC). In response to the events of September 11, 2001, NRC conducted a risk-based evaluation of potential vulnerabilities to terrorist threats involving NRC-licensed nuclear facilities and materials. NRC's evaluation concluded that possession of unwanted radioactive sealed sources with no disposal outlet presents a potential vulnerability (NRC 2002). In a November 25, 2003 letter to the manager of the NNSA's Los Alamos Site Office, the NRC Office of Nuclear Security and Incident Response identified recovery of several Sr-90 RTGs as the highest priority and requested that DOE take whatever actions necessary to recover these sources as soon as possible. This SA specifically compares key impact assessment parameters of this proposal to the offsite source recovery program evaluated in the SWEIS and a subsequent SA that evaluated a change to the approach of a portion of the recovery program. It also provides an explanation of any differences between the Proposed Action and activities described in the previous SWEIS and SA analyses.

  14. Post-Cold War Science and Technology at Los Alamos

    NASA Astrophysics Data System (ADS)

    Browne, John C.

    2002-04-01

    Los Alamos National Laboratory serves the nation through the development and application of leading-edge science and technology in support of national security. Our mission supports national security by: ensuring the safety, security, and reliability of the U.S. nuclear stockpile; reducing the threat of weapons of mass destruction in support of counter terrorism and homeland defense; and solving national energy, environment, infrastructure, and health security problems. We require crosscutting fundamental and advanced science and technology research to accomplish our mission. The Stockpile Stewardship Program develops and applies, advanced experimental science, computational simulation, and technology to ensure the safety and reliability of U.S. nuclear weapons in the absence of nuclear testing. This effort in itself is a grand challenge. However, the terrorist attack of September 11, 2001, reminded us of the importance of robust and vibrant research and development capabilities to meet new and evolving threats to our national security. Today through rapid prototyping we are applying new, innovative, science and technology for homeland defense, to address the threats of nuclear, chemical, and biological weapons globally. Synergistically, with the capabilities that we require for our core mission, we contribute in many other areas of scientific endeavor. For example, our Laboratory has been part of the NASA effort on mapping water on the moon and NSF/DOE projects studying high-energy astrophysical phenomena, understanding fundamental scaling phenomena of life, exploring high-temperature superconductors, investigating quantum information systems, applying neutrons to condensed-matter and nuclear physics research, developing large-scale modeling and simulations to understand complex phenomena, and exploring nanoscience that bridges the atomic to macroscopic scales. In this presentation, I will highlight some of these post-cold war science and technology advances

  15. 2016 Los Alamos National Laboratory Hazardous Waste Minimization Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salzman, Sonja L.; English, Charles Joe

    Waste minimization and pollution prevention are goals within the operating procedures of Los Alamos National Security, LLC (LANS). The US Department of Energy (DOE), inclusive of the National Nuclear Security Administration (NNSA) and the Office of Environmental Management, and LANS are required to submit an annual hazardous waste minimization report to the New Mexico Environment Department (NMED) in accordance with the Los Alamos National Laboratory (LANL or the Laboratory) Hazardous Waste Facility Permit. The report was prepared pursuant to the requirements of Section 2.9 of the LANL Hazardous Waste Facility Permit. This report describes the hazardous waste minimization program, which is a component of the overall Pollution Prevention (P2) Program, administered by the Environmental Stewardship Group (EPC-ES). This report also supports the waste minimization and P2 goals of the Associate Directorate of Environmental Management (ADEM) organizations that are responsible for implementing remediation activities and describes its programs to incorporate waste reduction practices into remediation activities and procedures. This report includes data for all waste shipped offsite from LANL during fiscal year (FY) 2016 (October 1, 2015 – September 30, 2016). LANS was active during FY2016 in waste minimization and P2 efforts. Multiple projects were funded that specifically related to reduction of waste with hazardous constituents. In FY2016, there was no hazardous, mixed-transuranic (MTRU), or mixed low-level (MLLW) remediation waste shipped offsite from the Laboratory. More non-remediation hazardous waste and MLLW was shipped offsite from the Laboratory in FY2016 compared to FY2015. Non-remediation MTRU waste was not shipped offsite during FY2016. These accomplishments and analysis of the waste streams are discussed in much more detail within this report.

  16. Particle acceleration at a reconnecting magnetic separator

    NASA Astrophysics Data System (ADS)

    Threlfall, J.; Neukirch, T.; Parnell, C. E.; Eradat Oskoui, S.

    2015-02-01

    Context. While the exact acceleration mechanism of energetic particles during solar flares is (as yet) unknown, magnetic reconnection plays a key role both in the release of stored magnetic energy of the solar corona and the magnetic restructuring during a flare. Recent work has shown that special field lines, called separators, are common sites of reconnection in 3D numerical experiments. To date, 3D separator reconnection sites have received little attention as particle accelerators. Aims: We investigate the effectiveness of separator reconnection as a particle acceleration mechanism for electrons and protons. Methods: We study the particle acceleration using a relativistic guiding-centre particle code in a time-dependent kinematic model of magnetic reconnection at a separator. Results: The effect upon particle behaviour of initial position, pitch angle, and initial kinetic energy are examined in detail, both for specific (single) particle examples and for large distributions of initial conditions. The separator reconnection model contains several free parameters, and we study the effect of changing these parameters upon particle acceleration, in particular in view of the final particle energy ranges that agree with observed energy spectra.
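
    One elementary ingredient of such a guiding-centre calculation is the E x B drift of the guiding centre, v_E = (E x B)/|B|^2; the Python sketch below evaluates it for placeholder fields and is only an illustration, not the relativistic code used in the paper.

      import numpy as np

      def exb_drift(E, B):
          """E x B drift velocity of a guiding centre, v_E = (E x B) / |B|^2 (SI units)."""
          E, B = np.asarray(E, float), np.asarray(B, float)
          return np.cross(E, B) / np.dot(B, B)

      # Placeholder fields near a reconnection region (SI units).
      E = np.array([0.0, 0.0, 1.0e3])       # V/m, along z
      B = np.array([1.0e-3, 0.0, 0.0])      # T, along x
      print("v_E (m/s):", exb_drift(E, B))  # drift along +y here, magnitude E/B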

  17. Los Alamos Explosives Performance Key to Stockpile Stewardship

    ScienceCinema

    Dattelbaum, Dana

    2018-02-14

    As the U.S. Nuclear Deterrent ages, one essential factor in making sure that the weapons will continue to perform as designed is understanding the fundamental properties of the high explosives that are part of a nuclear weapons system. As nuclear weapons go through life extension programs, some changes may be advantageous, particularly through the addition of what are known as "insensitive" high explosives that are much less likely to accidentally detonate than the already very safe "conventional" high explosives that are used in most weapons. At Los Alamos National Laboratory explosives research includes a wide variety of both large- and small-scale experiments that include small contained detonations, gas and powder gun firings, larger outdoor detonations, large-scale hydrodynamic tests, and at the Nevada National Security Site, underground sub-critical experiments.

  18. A review of the Los Alamos effort in the development of nuclear rocket propulsion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durham, F.P.; Kirk, W.L.; Bohl, R.J.

    1991-01-01

    This paper reviews the achievements of the Los Alamos nuclear rocket propulsion program and describes some specific reactor design and testing problems encountered during the development program along with the progress made in solving these problems. The relevance of these problems to a renewed nuclear thermal rocket development program for the Space Exploration Initiative (SEI) is discussed. 11 figs.

  19. Los Alamos Science, Number 25 -- 1997: Celebrating the neutrino

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, N.G.

    1997-12-31

    This issue is devoted to the neutrino and its remaining mysteries. It is divided into the following areas: (1) The Reines-Cowan experiment -- detecting the poltergeist; (2) The oscillating neutrino -- an introduction to neutrino masses and mixing; (3) A brief history of neutrino experiments at LAMPF; (4) A thousand eyes -- the story of LSND (Los Alamos neutrino oscillation experiment); (5) The evidence for oscillations; (6) The nature of neutrinos in muon decay and physics beyond the Standard Model; (7) Exorcising ghosts -- in pursuit of the missing solar neutrinos; (8) MSW -- a possible solution to the solar neutrino problem; (9) Neutrinos and supernovae; and (10) Dark matter and massive neutrinos.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pang, Xiaoying; Rybarcyk, Larry

    HPSim is a GPU-accelerated online multi-particle beam dynamics simulation tool for ion linacs. It was originally developed for use on the Los Alamos 800-MeV proton linac. It is a “z-code” that contains typical linac beam transport elements. The linac RF-gap transformation utilizes transit-time factors to calculate the beam acceleration therein. The space-charge effects are computed using the 2D SCHEFF (Space CHarge EFFect) algorithm, which calculates the radial and longitudinal space charge forces for cylindrically symmetric beam distributions. Other space-charge routines to be incorporated include the 3D PICNIC and a 3D Poisson solver. HPSim can simulate beam dynamics in drift tube linacs (DTLs) and coupled cavity linacs (CCLs). Elliptical superconducting cavity (SC) structures will also be incorporated into the code. The computational core of the code is written in C++ and accelerated using the NVIDIA CUDA technology. Users access the core code, which is wrapped in Python/C APIs, via Python scripts that enable ease-of-use and automation of the simulations. The overall linac description, including the EPICS PV machine control parameters, is kept in an SQLite database that also contains calibration and conversion factors required to transform the machine set points into model values used in the simulation.
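
    As a rough, hedged illustration of the RF-gap transformation mentioned above (ours, not taken from HPSim), the sketch below applies the standard transit-time-factor energy-gain relation dW = q E0 T L cos(phi); the field amplitude, transit-time factor, gap length, and phase are placeholder values.

      # Standard thin-gap energy gain: dW = q * E0 * T * L * cos(phi),
      # with E0 the average axial field, T the transit-time factor,
      # L the gap length and phi the synchronous phase.
      # All numbers below are illustrative, not HPSim inputs.
      import math

      def rf_gap_energy_gain(E0_MV_per_m, T, L_m, phi_deg, q=1.0):
          return q * E0_MV_per_m * T * L_m * math.cos(math.radians(phi_deg))

      # Illustrative numbers only: 2.5 MV/m field, T = 0.8, 4 cm gap, -30 deg phase.
      print(f"energy gain per gap: {rf_gap_energy_gain(2.5, 0.8, 0.04, -30.0):.3f} MeV")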

  1. Los Alamos National Laboratory Meteorology Monitoring Program: 2016 Data Completeness/ Quality Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruggeman, David Alan

    This report summarizes data completeness by tower and by instrument for 2016 and compares that data with the Los Alamos National Laboratory (LANL) and American National Standards Institute (ANSI) 2015 standards. This report is designed to make data users aware of data completeness and any data quality issues. LANL meteorology monitoring goals include 95% completeness for all measurements. The ANSI 2015 standard requires 90% completeness for all measurements. This report documents instrument/tower issues as they impact data completeness.

  2. NORTICA—a new code for cyclotron analysis

    NASA Astrophysics Data System (ADS)

    Gorelov, D.; Johnson, D.; Marti, F.

    2001-12-01

    The new package NORTICA (Numerical ORbit Tracking In Cyclotrons with Analysis) of computer codes for beam dynamics simulations is under development at NSCL. The package was started as a replacement for the code MONSTER [1] developed in the laboratory in the past. The new codes are capable of beam dynamics simulations in both CCF (Coupled Cyclotron Facility) accelerators, the K500 and K1200 superconducting cyclotrons. The general purpose of this package is to assist in setting up and tuning the cyclotrons, taking into account the main field and extraction channel imperfections. The computer platform for the package is an Alpha Station running UNIX with an X-Windows graphical interface. A multiple programming language approach was used in order to combine the reliability of the numerical algorithms developed in the laboratory over a long period of time with the friendliness of a modern-style user interface. This paper describes the capabilities and features of the codes in their present state.

  3. Evaluation of the Intel Xeon Phi 7120 and NVIDIA K80 as accelerators for two-dimensional panel codes

    PubMed Central

    2017-01-01

    Optimizing the geometry of airfoils for a specific application is an important engineering problem. In this context genetic algorithms have enjoyed some success as they are able to explore the search space without getting stuck in local optima. However, these algorithms require the computation of aerodynamic properties for a significant number of airfoil geometries. Consequently, for low-speed aerodynamics, panel methods are most often used as the inner solver. In this paper we evaluate the performance of such an optimization algorithm on modern accelerators (more specifically, the Intel Xeon Phi 7120 and the NVIDIA K80). For that purpose, we have implemented an optimized version of the algorithm on the CPU and Xeon Phi (based on OpenMP, vectorization, and the Intel MKL library) and on the GPU (based on CUDA and the MAGMA library). We present timing results for all codes and discuss the similarities and differences between the three implementations. Overall, we observe a speedup of approximately 2.5 for adding an Intel Xeon Phi 7120 to a dual socket workstation and a speedup between 3.4 and 3.8 for adding an NVIDIA K80 to a dual socket workstation. PMID:28582389

  4. Evaluation of the Intel Xeon Phi 7120 and NVIDIA K80 as accelerators for two-dimensional panel codes.

    PubMed

    Einkemmer, Lukas

    2017-01-01

    Optimizing the geometry of airfoils for a specific application is an important engineering problem. In this context genetic algorithms have enjoyed some success as they are able to explore the search space without getting stuck in local optima. However, these algorithms require the computation of aerodynamic properties for a significant number of airfoil geometries. Consequently, for low-speed aerodynamics, panel methods are most often used as the inner solver. In this paper we evaluate the performance of such an optimization algorithm on modern accelerators (more specifically, the Intel Xeon Phi 7120 and the NVIDIA K80). For that purpose, we have implemented an optimized version of the algorithm on the CPU and Xeon Phi (based on OpenMP, vectorization, and the Intel MKL library) and on the GPU (based on CUDA and the MAGMA library). We present timing results for all codes and discuss the similarities and differences between the three implementations. Overall, we observe a speedup of approximately 2.5 for adding an Intel Xeon Phi 7120 to a dual socket workstation and a speedup between 3.4 and 3.8 for adding an NVIDIA K80 to a dual socket workstation.
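
    The sketch below mirrors the overall workflow described in this record, a genetic algorithm proposing airfoil candidates with an "inner solver" evaluated in parallel, but it is only an illustration: the fitness function is a placeholder, not a panel method, and none of the names or parameters come from the paper.

      # Toy genetic-algorithm loop with parallel candidate evaluation. The inner
      # evaluation (here a placeholder cost) is what the paper offloads to the
      # Xeon Phi or the GPU.
      import random
      from concurrent.futures import ProcessPoolExecutor

      def evaluate_airfoil(params):
          # Placeholder for a 2D panel-method solve returning a figure of merit
          # (e.g. lift-to-drag ratio).
          return -sum((p - 0.3) ** 2 for p in params)

      if __name__ == "__main__":
          population = [[random.random() for _ in range(6)] for _ in range(32)]
          with ProcessPoolExecutor() as pool:
              for generation in range(20):
                  fitness = list(pool.map(evaluate_airfoil, population))
                  ranked = [p for _, p in sorted(zip(fitness, population),
                                                 key=lambda t: t[0], reverse=True)]
                  parents = ranked[:8]
                  children = [[g + random.gauss(0.0, 0.02) for g in random.choice(parents)]
                              for _ in range(len(population) - len(parents))]
                  population = parents + children
          print("best figure of merit found:", max(fitness))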

  5. PIC codes for plasma accelerators on emerging computer architectures (GPUS, Multicore/Manycore CPUS)

    NASA Astrophysics Data System (ADS)

    Vincenti, Henri

    2016-03-01

    The advent of exascale computers will enable 3D simulations of new laser-plasma interaction regimes that are out of reach of current petascale computers. However, the paradigm used to write current PIC codes will have to change in order to fully exploit the potential of these new computing architectures. Indeed, achieving exascale computing facilities in the next decade will be a great challenge in terms of energy consumption and will imply hardware developments directly impacting our way of implementing PIC codes. As data movement (from die to network) is by far the most energy-consuming part of an algorithm, future computers will tend to increase memory locality at the hardware level and reduce energy consumption related to data movement by using more and more cores on each compute node ("fat nodes") that will have a reduced clock speed to allow for efficient cooling. To compensate for the frequency decrease, CPU vendors are making use of long SIMD instruction registers that are able to process multiple data with one arithmetic operator in one clock cycle. SIMD register length is expected to double every four years. GPUs also have a reduced clock speed per core and can process Multiple Instructions on Multiple Data (MIMD). At the software level, Particle-In-Cell (PIC) codes will thus have to achieve both good memory locality and vectorization (for multicore/manycore CPUs) to fully take advantage of these upcoming architectures. In this talk, we present the portable solutions we implemented in our high-performance skeleton PIC code PICSAR to achieve both good memory locality and cache reuse as well as good vectorization on SIMD architectures. We also present the portable solutions used to parallelize the pseudo-spectral quasi-cylindrical code FBPIC on GPUs using the Numba Python compiler.
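
    To make the memory-locality and vectorization argument concrete, the toy sketch below stores particles as structure-of-arrays so the push maps onto long vector operations; it is a generic one-dimensional illustration, not code from PICSAR or FBPIC.

      # Generic structure-of-arrays particle push: the field gather and the
      # position/velocity updates are expressed as whole-array operations, which
      # is the shape of code that vectorizes well on SIMD CPUs and GPUs.
      import numpy as np

      n = 1_000_000
      rng = np.random.default_rng(0)
      x, v = rng.random(n), np.zeros(n)                      # SoA layout
      E = np.sin(2.0 * np.pi * np.linspace(0.0, 1.0, 64))    # toy 1D field grid
      dt, qm = 1.0e-3, -1.0

      for _ in range(10):
          cells = np.minimum((x * E.size).astype(np.int64), E.size - 1)
          v += qm * E[cells] * dt                # gather + update, fully vectorized
          x = (x + v * dt) % 1.0                 # periodic push
      print("mean kinetic energy:", 0.5 * np.mean(v * v))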

  6. Saving Water at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Andy

    Los Alamos National Laboratory decreased its water usage by 26 percent in 2014, with about one-third of the reduction attributable to using reclaimed water to cool a supercomputing center. The Laboratory's goal during 2014 was to use only re-purposed water to support the mission at the Strategic Computing Complex. Using reclaimed water from the Sanitary Effluent Reclamation Facility, or SERF, substantially decreased water usage and supported the overall mission. SERF collects industrial wastewater and treats it for reuse. The reclamation facility contributed more than 27 million gallons of re-purposed water to the Laboratory's computing center, a secured supercomputing facility that supports the Laboratory’s national security mission and is one of the institution’s larger water users. In addition to the strategic water reuse program at SERF, the Laboratory reduced water use in 2014 by focusing conservation efforts on areas that use the most water, upgrading to water-conserving fixtures, and repairing leaks identified in a biennial survey.

  7. Saving Water at Los Alamos National Laboratory

    ScienceCinema

    Erickson, Andy

    2018-01-16

    Los Alamos National Laboratory decreased its water usage by 26 percent in 2014, with about one-third of the reduction attributable to using reclaimed water to cool a supercomputing center. The Laboratory's goal during 2014 was to use only re-purposed water to support the mission at the Strategic Computing Complex. Using reclaimed water from the Sanitary Effluent Reclamation Facility, or SERF, substantially decreased water usage and supported the overall mission. SERF collects industrial wastewater and treats it for reuse. The reclamation facility contributed more than 27 million gallons of re-purposed water to the Laboratory's computing center, a secured supercomputing facility that supports the Laboratory’s national security mission and is one of the institution’s larger water users. In addition to the strategic water reuse program at SERF, the Laboratory reduced water use in 2014 by focusing conservation efforts on areas that use the most water, upgrading to water-conserving fixtures, and repairing leaks identified in a biennial survey.

  8. Designing a Dielectric Laser Accelerator on a Chip

    NASA Astrophysics Data System (ADS)

    Niedermayer, Uwe; Boine-Frankenheim, Oliver; Egenolf, Thilo

    2017-07-01

    Dielectric Laser Acceleration (DLA) achieves gradients of more than 1 GeV/m, which are among the highest in non-plasma accelerators. The long-term goal of the ACHIP collaboration is to provide relativistic (>1 MeV) electrons by means of a laser-driven microchip accelerator. Examples of "slightly resonant" dielectric structures showing gradients in the range of 70% of the incident laser field (1 GV/m) for electrons with beta=0.32 and 200% for beta=0.91 are presented. We demonstrate the bunching and acceleration of low-energy electrons in dedicated ballistic buncher and velocity-matched grating structures. However, the design gradient of 500 MeV/m leads to rapid defocusing. Therefore we present a scheme to bunch the beam in stages, which not only reduces the energy spread but also the transverse defocusing. The designs are made with a dedicated in-house 6D particle tracking code.

  9. IMPACTS OF DRILLING ADDITIVES ON DATA OBTAINED FROM HYDROGEOLOGIC CHARACTERIZATION WELLS AT LOS ALAMOS NATIONAL LABORATORY

    EPA Science Inventory

    Personnel at the EPA Ground Water and Ecosystems Restoration Division (GWERD) were requested by EPA Region 6 to evaluate the impacts of well drilling practices at the Los Alamos National Laboratory (LANL). The focus of this review involved analysis of the impacts of bentonite- a...

  10. Load management strategy for Particle-In-Cell simulations in high energy particle acceleration

    NASA Astrophysics Data System (ADS)

    Beck, A.; Frederiksen, J. T.; Dérouillat, J.

    2016-09-01

    In the wake of the intense effort made for the experimental CILEX project, numerical simulation campaigns have been carried out in order to finalize the design of the facility and to identify optimal laser and plasma parameters. These simulations bring, of course, important insight into the fundamental physics at play. As a by-product, they also characterize the quality of our theoretical and numerical models. In this paper, we compare the results given by different codes and point out algorithmic limitations both in terms of physical accuracy and computational performance. These limitations are illustrated in the context of electron laser wakefield acceleration (LWFA). The main limitation we identify in state-of-the-art Particle-In-Cell (PIC) codes is computational load imbalance. We propose an innovative algorithm to deal with this specific issue as well as milestones towards a modern, accurate high-performance PIC code for high energy particle acceleration.
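
    To illustrate the load-imbalance issue generically (this is not the algorithm proposed in the paper), the sketch below splits a 1D domain so that each rank receives roughly the same number of particles, using a simple prefix-sum rule on a made-up wakefield-like load profile.

      # Balance particle load across ranks: choose slab boundaries so the
      # cumulative particle count is divided evenly. Illustrative only.
      import numpy as np

      def balanced_boundaries(particles_per_cell, n_ranks):
          """Return cell indices that split the domain into n_ranks equal-load slabs."""
          load = np.cumsum(particles_per_cell)
          targets = load[-1] * np.arange(1, n_ranks) / n_ranks
          return np.searchsorted(load, targets)

      rng = np.random.default_rng(1)
      # Hypothetical LWFA-like profile: most particles bunched near the driver.
      cells = rng.poisson(lam=5 + 200 * np.exp(-np.linspace(0.0, 10.0, 512)))
      print("slab boundaries (cell indices):", balanced_boundaries(cells, 8))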

  11. Compact torus accelerator as a driver for ICF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tobin, M.T.; Meier, W.R.; Morse, E.C.

    1986-01-01

    The authors have carried out further investigations of the technical issues associated with using a compact torus (CT) accelerator as a driver for inertial confinement fusion (ICF). In a CT accelerator, a magnetically confined, torus-shaped plasma is compressed, accelerated, and focused by two concentric electrodes. After its initial formation, the torus shape is maintained for lifetimes exceeding 1 ms by inherent poloidal and toroidal currents. Hartman suggests acceleration and focusing of such a plasma ring will not cause dissolution within certain constraints. In this study, we evaluated a point design based on an available capacitor bank energy of 9.2 MJ. This accelerator, which was modeled by a zero-dimensional code, produces a xenon plasma ring with a 0.73-cm radius, a velocity of 4.14 × 10⁹ cm/s, and a mass of 4.42 µg. The energy of the plasma ring as it leaves the accelerator is 3.8 MJ, or 41% of the capacitor bank energy. Our studies confirm the feasibility of producing a plasma ring with the characteristics required to induce fusion in an ICF target with a gain greater than 50. The low cost and high efficiency of the CT accelerator are particularly attractive. Uncertainties concerning propagation, accelerator lifetime, and power supply must be resolved to establish the viability of the accelerator as an ICF driver.
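
    A quick consistency check of the quoted ring parameters (ours, not part of the cited report) reproduces the stated 3.8 MJ and 41% figures from the ring mass and velocity alone.

      # Kinetic energy of a 4.42 microgram ring moving at 4.14e9 cm/s.
      mass_kg = 4.42e-9            # 4.42 micrograms
      velocity_m_s = 4.14e7        # 4.14e9 cm/s
      energy_MJ = 0.5 * mass_kg * velocity_m_s**2 / 1.0e6
      print(f"ring kinetic energy: {energy_MJ:.2f} MJ "
            f"({100 * energy_MJ / 9.2:.0f}% of the 9.2 MJ bank)")
      # -> about 3.8 MJ, i.e. ~41% of the capacitor bank energy, matching the abstract.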

  12. Accelerator driven reactors and nuclear waste management projects in the Czech Republic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janouch, F.; Mach, R.

    1995-10-01

    The Czech Republic is almost the only country in central Europe that is continuing with the construction of nuclear power reactors. Its small territory and dense population cause public concern about the disposal of spent nuclear fuel. Czech nuclear scientists, power companies, and the nuclear industry are therefore looking for alternative solutions. The Los Alamos ATW project has received a positive response in the Czech mass media and even in industrial and governmental quarters. The recent scientific symposium "Accelerator driven reactors and nuclear waste management", convened at the Liblice castle near Prague, 27-29 June 1994 and sponsored by the Czech energy company CEZ, reviewed the competencies and experimental basis in the Czech Republic and made a first attempt to formulate a national approach and to establish international collaboration in this area.

  13. Calculations of beam dynamics in Sandia linear electron accelerators, 1984

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poukey, J.W.; Coleman, P.D.

    1985-03-01

    A number of code and analytic studies were made during 1984 that pertain to the Sandia linear accelerators MABE and RADLAC. In this report the authors summarize the important results of the calculations. New results include a better understanding of gap-induced radial oscillations, leakage currents in a typical MABE gap, emittance growth in a beam passing through a series of gaps, some new diocotron results, and the latest diode simulations for both accelerators. 23 references, 30 figures, 1 table.

  14. Chromaticity calculations and code comparisons for x-ray lithography source XLS and SXLS rings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parsa, Z.

    1988-06-16

    This note presents the chromaticity calculations and code comparison results for the x-ray lithography source XLS (Chasman Green, XUV Cosy lattice) and the SXLS (2-magnet, 4 T) lattices, obtained with standard beam-optics codes, including the programs SYNCH88.5, MAD6, PATRICIA88.4, PATPET88.2, DIMAD, BETA, and MARYLIE. This analysis is part of our ongoing accelerator physics code studies. 4 figs., 10 tabs.
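
    For readers unfamiliar with the quantity being compared across these codes: chromaticity is the slope of betatron tune with relative momentum offset, xi = dQ / (dp/p). The toy sketch below estimates it by finite differences from a made-up tune function; the function is a placeholder, not output from any of the listed codes.

      # Finite-difference chromaticity estimate from a hypothetical Q(dp/p).
      def tune(dp_over_p):
          return 8.17 - 15.2 * dp_over_p + 40.0 * dp_over_p**2   # placeholder lattice model

      def chromaticity(tune_fn, delta=1.0e-4):
          return (tune_fn(delta) - tune_fn(-delta)) / (2.0 * delta)

      print(f"estimated chromaticity: {chromaticity(tune):.2f}")   # -> about -15.2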

  15. A Tracer Test at the Los Alamos Canyon Weir

    NASA Astrophysics Data System (ADS)

    Levitt, D. G.; Stone, W. J.; Newell, D. L.; Wykoff, D. S.

    2002-12-01

    A low-head weir was constructed in the Los Alamos Canyon to reduce the transport of contaminant-bearing sediment caused by fire-enhanced runoff off Los Alamos National Laboratory (LANL) property towards the Rio Grande following the May 2000 Cerro Grande fire at Los Alamos, New Mexico. Fractured basalt was exposed in the channel by grading during construction of the weir, and water temporarily ponds behind the weir following periods of runoff. In order to monitor any downward transport of contaminants into fractured basalt, and potentially downward to the regional ground water, three boreholes (one vertical, one at 43 degrees, and one at 34 degrees from horizontal) were installed for environmental monitoring. The boreholes penetrate to depths ranging from approximately 9 to 82 m below the weir floor. The two angled boreholes are fitted with flexible FLUTe liners with resistance sensors to measure relative moisture content and absorbent sampling pads for contaminant and environmental tracer sampling within the vadose zone. The two angled boreholes are also monitored for relative changes in moisture content by neutron logging. The vertical borehole penetrates three perched water zones and is equipped with four screens and sampling ports. In April 2002, a tracer test was initiated with the application of a 0.2 M (16,000 ppm) solution of potassium bromide (KBr) onto the weir floor. The tracer experiment was intended to provide data on travel times through the complex hydrogeologic media of fractured basalt. A precipitation and runoff event in June 2002 resulted in approximately 0.61 m of standing water behind the weir. If the KBr and flood waters were well mixed, the concentration of KBr in the flood waters was approximately 24 ppm. Bromide was detected in the absorbent membrane in the 43 degree hole at concentrations up to 2 ppm. Resistance sensors in the 43 degree borehole detected moisture increases within 3 days at a depth of 27 m, indicating an average wetting
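
    A quick check (ours, not from the abstract), assuming the quoted 16,000 ppm refers to the bromide ion by mass, confirms that a 0.2 M KBr solution is consistent with that figure.

      # In dilute aqueous solution, 1 ppm is roughly 1 mg/L.
      molarity = 0.2                 # mol/L of KBr
      br_molar_mass = 79.9           # g/mol for the bromide ion
      ppm_br = molarity * br_molar_mass * 1000.0   # mg of Br- per litre
      print(f"bromide concentration: ~{ppm_br:.0f} ppm")   # ~16,000 ppm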

  16. Performance of the upgraded ultracold neutron source at Los Alamos National Laboratory and its implication for a possible neutron electric dipole moment experiment

    DOE PAGES

    Ito, Takeyasu M.; Adamek, E. R.; Callahan, N. B.; ...

    2018-01-29

    We report that the ultracold neutron (UCN) source at Los Alamos National Laboratory (LANL), which uses solid deuterium as the UCN converter and is driven by accelerator spallation neutrons, has been successfully operated for over 10 years, providing UCN to various experiments, as the first production UCN source based on the superthermal process. It has recently undergone a major upgrade. This paper describes the design and performance of the upgraded LANL UCN source. Measurements of the cold neutron spectrum and UCN density are presented and compared to Monte Carlo predictions. The source is shown to perform as modeled. The UCN density measured at the exit of the biological shield was 184(32) UCN/cm³, a fourfold increase from the highest previously reported. Finally, the polarized UCN density stored in an external chamber was measured to be 39(7) UCN/cm³, which is sufficient to perform an experiment to search for the nonzero neutron electric dipole moment with a one-standard-deviation sensitivity of σ(dn) = 3 × 10⁻²⁷ e cm.

  17. Performance of the upgraded ultracold neutron source at Los Alamos National Laboratory and its implication for a possible neutron electric dipole moment experiment

    NASA Astrophysics Data System (ADS)

    Ito, T. M.; Adamek, E. R.; Callahan, N. B.; Choi, J. H.; Clayton, S. M.; Cude-Woods, C.; Currie, S.; Ding, X.; Fellers, D. E.; Geltenbort, P.; Lamoreaux, S. K.; Liu, C.-Y.; MacDonald, S.; Makela, M.; Morris, C. L.; Pattie, R. W.; Ramsey, J. C.; Salvat, D. J.; Saunders, A.; Sharapov, E. I.; Sjue, S.; Sprow, A. P.; Tang, Z.; Weaver, H. L.; Wei, W.; Young, A. R.

    2018-01-01

    The ultracold neutron (UCN) source at Los Alamos National Laboratory (LANL), which uses solid deuterium as the UCN converter and is driven by accelerator spallation neutrons, has been successfully operated for over 10 years, providing UCN to various experiments, as the first production UCN source based on the superthermal process. It has recently undergone a major upgrade. This paper describes the design and performance of the upgraded LANL UCN source. Measurements of the cold neutron spectrum and UCN density are presented and compared to Monte Carlo predictions. The source is shown to perform as modeled. The UCN density measured at the exit of the biological shield was 184(32) UCN/cm³, a fourfold increase from the highest previously reported. The polarized UCN density stored in an external chamber was measured to be 39(7) UCN/cm³, which is sufficient to perform an experiment to search for the nonzero neutron electric dipole moment with a one-standard-deviation sensitivity of σ(dn) = 3 × 10⁻²⁷ e cm.

  18. Performance of the upgraded ultracold neutron source at Los Alamos National Laboratory and its implication for a possible neutron electric dipole moment experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ito, Takeyasu M.; Adamek, E. R.; Callahan, N. B.

    We report that the ultracold neutron (UCN) source at Los Alamos National Laboratory (LANL), which uses solid deuterium as the UCN converter and is driven by accelerator spallation neutrons, has been successfully operated for over 10 years, providing UCN to various experiments, as the first production UCN source based on the superthermal process. It has recently undergone a major upgrade. This paper describes the design and performance of the upgraded LANL UCN source. Measurements of the cold neutron spectrum and UCN density are presented and compared to Monte Carlo predictions. The source is shown to perform as modeled. The UCN density measured at the exit of the biological shield was 184(32) UCN/cm³, a fourfold increase from the highest previously reported. Finally, the polarized UCN density stored in an external chamber was measured to be 39(7) UCN/cm³, which is sufficient to perform an experiment to search for the nonzero neutron electric dipole moment with a one-standard-deviation sensitivity of σ(dn) = 3 × 10⁻²⁷ e cm.

  19. Los Alamos Explosives Performance Key to Stockpile Stewardship

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dattelbaum, Dana

    2014-11-03

    As the U.S. Nuclear Deterrent ages, one essential factor in making sure that the weapons will continue to perform as designed is understanding the fundamental properties of the high explosives that are part of a nuclear weapons system. As nuclear weapons go through life extension programs, some changes may be advantageous, particularly through the addition of what are known as "insensitive" high explosives that are much less likely to accidentally detonate than the already very safe "conventional" high explosives that are used in most weapons. At Los Alamos National Laboratory explosives research includes a wide variety of both large- and small-scale experiments that include small contained detonations, gas and powder gun firings, larger outdoor detonations, large-scale hydrodynamic tests, and at the Nevada National Security Site, underground sub-critical experiments.

  20. Sheath field dynamics from time-dependent acceleration of laser-generated positrons

    NASA Astrophysics Data System (ADS)

    Kerr, Shaun; Fedosejevs, Robert; Link, Anthony; Williams, Jackson; Park, Jaebum; Chen, Hui

    2017-10-01

    Positrons produced in ultraintense laser-matter interactions are accelerated by the sheath fields established by fast electrons, typically resulting in quasi-monoenergetic beams. Experimental results from OMEGA EP show higher order features developing in the positron spectra when the laser energy exceeds one kilojoule. 2D PIC simulations using the LSP code were performed to give insight into these spectral features. They suggest that for high laser energies multiple, distinct phases of acceleration can occur due to time-dependent sheath field acceleration. The detailed dynamics of positron acceleration will be discussed. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract No. DE-AC52-07NA27344, and funded by LDRD 17-ERD-010.

  1. Summary Report of Working Group 2: Computation

    NASA Astrophysics Data System (ADS)

    Stoltz, P. H.; Tsung, R. S.

    2009-01-01

    The working group on computation addressed three physics areas: (i) plasma-based accelerators (laser-driven and beam-driven), (ii) high-gradient structure-based accelerators, and (iii) electron beam sources and transport [1]. Highlights of the talks in these areas included new models of breakdown on the microscopic scale, new three-dimensional multipacting calculations with both finite difference and finite element codes, and detailed comparisons of new electron gun models with standard models such as PARMELA. The group also addressed two areas of advances in computation: (i) new algorithms, including simulation in a Lorentz-boosted frame that can reduce computation time by orders of magnitude, and (ii) new hardware architectures, like graphics processing units and Cell processors, that promise dramatic increases in computing power. Highlights of the talks in these areas included results from the first large-scale parallel finite element particle-in-cell (PIC) code, many-order-of-magnitude speedups, and details of porting the VPIC code to the Roadrunner supercomputer. The working group featured two plenary talks, one by Brian Albright of Los Alamos National Laboratory on the performance of the VPIC code on the Roadrunner supercomputer, and one by David Bruhwiler of Tech-X Corporation on recent advances in computation for advanced accelerators. Highlights of the talk by Albright included the first one-trillion-particle simulations, a sustained performance of 0.3 petaflops, and an eightfold speedup of science calculations, including back-scatter in laser-plasma interaction. Highlights of the talk by Bruhwiler included simulations of 10 GeV laser wakefield accelerator stages including external injection, and new developments in electromagnetic simulations of electron guns using finite difference and finite element approaches.

  2. Summary Report of Working Group 2: Computation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoltz, P. H.; Tsung, R. S.

    2009-01-22

    The working group on computation addressed three physics areas: (i) plasma-based accelerators (laser-driven and beam-driven), (ii) high-gradient structure-based accelerators, and (iii) electron beam sources and transport [1]. Highlights of the talks in these areas included new models of breakdown on the microscopic scale, new three-dimensional multipacting calculations with both finite difference and finite element codes, and detailed comparisons of new electron gun models with standard models such as PARMELA. The group also addressed two areas of advances in computation: (i) new algorithms, including simulation in a Lorentz-boosted frame that can reduce computation time by orders of magnitude, and (ii) new hardware architectures, like graphics processing units and Cell processors, that promise dramatic increases in computing power. Highlights of the talks in these areas included results from the first large-scale parallel finite element particle-in-cell (PIC) code, many-order-of-magnitude speedups, and details of porting the VPIC code to the Roadrunner supercomputer. The working group featured two plenary talks, one by Brian Albright of Los Alamos National Laboratory on the performance of the VPIC code on the Roadrunner supercomputer, and one by David Bruhwiler of Tech-X Corporation on recent advances in computation for advanced accelerators. Highlights of the talk by Albright included the first one-trillion-particle simulations, a sustained performance of 0.3 petaflops, and an eightfold speedup of science calculations, including back-scatter in laser-plasma interaction. Highlights of the talk by Bruhwiler included simulations of 10 GeV laser wakefield accelerator stages including external injection, and new developments in electromagnetic simulations of electron guns using finite difference and finite element approaches.

  3. Pesticide concentrations in water and in suspended and bottom sediments in the New and Alamo rivers, Salton Sea Watershed, California, April 2003

    USGS Publications Warehouse

    LeBlanc, Lawrence A.; Orlando, James L.; Kuivila, Kathryn

    2004-01-01

    This report contains pesticide concentration data for water, and suspended and bed sediment samples collected in April 2003 from twelve sites along the New and Alamo Rivers in the Salton Sea watershed, in southeastern California. The study was done in collaboration with the California State Regional Water Quality Control Board, Colorado River Region, to assess inputs of current-use pesticides associated with water and sediment into the New and Alamo Rivers. Five sites along the New River and seven sites along the Alamo River, downstream of major agricultural drains, were selected and covered the lengths of the rivers from the international boundary to approximately 1.5 km from the river mouths. Sampling from bridges occurred at seven of the twelve sites. At these sites, streamflow measurements were taken. These same sites were also characterized for cross-stream homogeneity by measuring dissolved oxygen, pH, specific conductance, temperature, and suspended solids concentration at several vertical (depths) and horizontal (cross-stream) points across the river. Large volume water samples (200–300 L) were collected for isolation of suspended sediments by flow-through centrifugation. Water from the outflow of the flow-through centrifuge was sampled for the determination of aqueous pesticide concentrations. In addition, bottom sediments were sampled at each site. Current-use pesticides and legacy organochlorine compounds (p,p'-DDT, p,p'-DDE and p,p'-DDD) were extracted from sediments and measured via gas chromatography/mass spectrometry (GC/MS). Organic carbon and percentage of fines were also determined for suspended and bottom sediments. Cross-stream transects of dissolved constituents and suspended sediments showed that the rivers were fairly homogeneous at the sites sampled. Streamflow was higher at the outlet sites, with the Alamo River having higher flow (1,240 cfs) than the New River (798 cfs). Twelve current-use pesticides, one legacy organochlorine compound (p

  4. Los Alamos Discovers Super Efficient Solar Using Perovskite Crystals

    ScienceCinema

    Mohite, Aditya; Nie, Wanyi

    2018-05-11

    State-of-the-art photovoltaics using high-purity, large-area, wafer-scale single-crystalline semiconductors grown by sophisticated, high temperature crystal-growth processes offer promising routes for developing low-cost, solar-based clean global energy solutions for the future. Solar cells composed of the recently discovered organic-inorganic perovskite materials offer the efficiency of silicon, yet suffer from a variety of deficiencies limiting the commercial viability of perovskite photovoltaic technology. In research to appear in Science, Los Alamos National Laboratory researchers reveal a new solution-based hot-casting technique that eliminates these limitations, one that allows for the growth of high-quality, large-area, millimeter-scale perovskite crystals and demonstrates that highly efficient and reproducible solar cells with reduced trap-assisted recombination can be realized.

  5. 75 FR 24957 - Decision to Evaluate a Petition to Designate a Class of Employees From the Los Alamos National...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-06

    ... Laboratory. Location: Los Alamos, New Mexico. Job Titles and/or Job Duties: All employees of the Department.... Hinnefeld, Interim Director, Division of Compensation Analysis and Support, National Institute for...

  6. MHD code using multi graphical processing units: SMAUG+

    NASA Astrophysics Data System (ADS)

    Gyenge, N.; Griffiths, M. K.; Erdélyi, R.

    2018-01-01

    This paper introduces the Sheffield Magnetohydrodynamics Algorithm Using GPUs (SMAUG+), an advanced numerical code for solving magnetohydrodynamic (MHD) problems using multi-GPU systems. Multi-GPU systems facilitate the development of accelerated codes and enable us to investigate larger model sizes and/or more detailed computational domain resolutions. This is a significant advancement over the parent single-GPU MHD code, SMAUG (Griffiths et al., 2015). Here, we demonstrate the validity of the SMAUG+ code, describe the parallelisation techniques and investigate performance benchmarks. The initial configuration of the Orszag-Tang vortex simulations is distributed among 4, 16, 64 and 100 GPUs. Furthermore, different simulation box resolutions are applied: 1000 × 1000, 2044 × 2044, 4000 × 4000 and 8000 × 8000. We also tested the code with the Brio-Wu shock tube simulations, with a model size of 800, employing up to 10 GPUs. Based on the test results, we observed speed-ups and slow-downs, depending on the granularity and the communication overhead of certain parallel tasks. The main aim of the code development is to provide a massively parallel code without the memory limitation of a single GPU. By using our code, the applied model size can be significantly increased. We demonstrate that we are able to successfully compute numerically valid and large 2D MHD problems.
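
    The sketch below illustrates, in the most generic form, the domain decomposition such multi-device codes rely on: the grid is split into slabs, each device advances its slab, and one-cell halo layers are exchanged with neighbours every step. Plain numpy arrays stand in for the GPUs here; this is not SMAUG+ code.

      # 1D slab decomposition with periodic halo exchange, applied to a toy
      # explicit diffusion step. Each "slab" represents one device's sub-domain
      # with one guard cell on each side.
      import numpy as np

      nx, n_dev = 64, 4
      field = np.sin(np.linspace(0.0, 2.0 * np.pi, nx))
      slabs = [np.concatenate(([0.0], s, [0.0])) for s in np.array_split(field, n_dev)]

      def exchange_halos(slabs):
          for i, s in enumerate(slabs):
              s[0] = slabs[i - 1][-2]                    # left halo (wraps periodically)
              s[-1] = slabs[(i + 1) % n_dev][1]          # right halo

      for _ in range(100):
          exchange_halos(slabs)
          for s in slabs:
              s[1:-1] += 0.1 * (s[2:] - 2.0 * s[1:-1] + s[:-2])
      print("field total after 100 steps:", sum(s[1:-1].sum() for s in slabs))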

  7. Reliability enhancement of Navier-Stokes codes through convergence enhancement

    NASA Technical Reports Server (NTRS)

    Choi, K.-Y.; Dulikravich, G. S.

    1993-01-01

    Reduction of the total computing time required by an iterative algorithm for solving the Navier-Stokes equations is an important aspect of making existing and future analysis codes more cost effective. Several attempts have been made to accelerate the convergence of an explicit Runge-Kutta time-stepping algorithm. These acceleration methods are based on local time stepping, implicit residual smoothing, enthalpy damping, and multigrid techniques. In addition, an extrapolation procedure based on the power method and the Minimal Residual Method (MRM) has been applied to Jameson's multigrid algorithm. The MRM uses the same optimal weights for the corrections to every equation in a system and has not been shown to accelerate the scheme without multigridding. Our Distributed Minimal Residual (DMR) method, based on our General Nonlinear Minimal Residual (GNLMR) method, allows each component of the solution vector in a system of equations to have its own convergence speed. The DMR method was found capable of reducing the computation time by 10-75 percent, depending on the test case and grid used. Recently, we have developed and tested a new method, termed Sensitivity-Based DMR (SBMR), that is easier to implement in different codes and is even more robust and computationally efficient than our DMR method.

  8. Reliability enhancement of Navier-Stokes codes through convergence enhancement

    NASA Astrophysics Data System (ADS)

    Choi, K.-Y.; Dulikravich, G. S.

    1993-11-01

    Reduction of the total computing time required by an iterative algorithm for solving the Navier-Stokes equations is an important aspect of making existing and future analysis codes more cost effective. Several attempts have been made to accelerate the convergence of an explicit Runge-Kutta time-stepping algorithm. These acceleration methods are based on local time stepping, implicit residual smoothing, enthalpy damping, and multigrid techniques. In addition, an extrapolation procedure based on the power method and the Minimal Residual Method (MRM) has been applied to Jameson's multigrid algorithm. The MRM uses the same optimal weights for the corrections to every equation in a system and has not been shown to accelerate the scheme without multigridding. Our Distributed Minimal Residual (DMR) method, based on our General Nonlinear Minimal Residual (GNLMR) method, allows each component of the solution vector in a system of equations to have its own convergence speed. The DMR method was found capable of reducing the computation time by 10-75 percent, depending on the test case and grid used. Recently, we have developed and tested a new method, termed Sensitivity-Based DMR (SBMR), that is easier to implement in different codes and is even more robust and computationally efficient than our DMR method.
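
    The sketch below shows a generic residual-minimizing extrapolation in the spirit of the MRM family: given one or more candidate correction vectors, it chooses weights that minimize the linearized residual norm and applies the combined correction. It is an illustrative scheme on a toy nonlinear system, not the authors' DMR or SBMR formulation.

      # Residual-minimizing extrapolation on a toy 2-equation system whose
      # residual vanishes at x = (1, 2).
      import numpy as np

      def extrapolate(x, corrections, residual_fn, eps=1.0e-7):
          """Least-squares weights for a list of candidate correction vectors."""
          r0 = residual_fn(x)
          # Columns: directional change of the residual along each correction.
          J = np.column_stack([(residual_fn(x + eps * c) - r0) / eps for c in corrections])
          w, *_ = np.linalg.lstsq(J, -r0, rcond=None)
          return x + sum(wi * ci for wi, ci in zip(w, corrections))

      residual_fn = lambda x: np.array([x[0]**2 - 1.0, x[1] - 2.0])
      x = np.array([3.0, 0.0])
      for _ in range(5):
          base_correction = -0.5 * residual_fn(x)      # a crude smoother step
          x = extrapolate(x, [base_correction], residual_fn)
      print("state:", x, "residual:", residual_fn(x))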

  9. Inter-view prediction of intra mode decision for high-efficiency video coding-based multiview video coding

    NASA Astrophysics Data System (ADS)

    da Silva, Thaísa Leal; Agostini, Luciano Volcan; da Silva Cruz, Luis A.

    2014-05-01

    Intra prediction is a very important tool in current video coding standards. High-efficiency video coding (HEVC) intra prediction provides significant gains in coding efficiency compared to previous standards, but at the cost of a large increase in computational complexity, since 33 directional angular modes must be evaluated. Motivated by this high complexity, this article presents a complexity reduction algorithm developed to reduce the cost of the HEVC intra mode decision, targeting multiview video. The proposed algorithm provides an efficient fast intra prediction compliant with both single-view and multiview video encoding. This fast solution defines a reduced subset of intra directions according to the video texture and exploits the relationship between prediction units (PUs) at neighboring depth levels of the coding tree. This fast intra coding procedure is used to develop an inter-view prediction method, which exploits the relationship between the intra mode directions of adjacent views to further accelerate the intra prediction process in multiview video encoding applications. When compared to HEVC simulcast, our method achieves a complexity reduction of up to 47.77%, at the cost of an average BD-PSNR loss of 0.08 dB.
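
    To make the idea of a reduced candidate set concrete, the toy rule below seeds the list from the modes chosen by neighbouring PUs (and, for a dependent view, the co-located PU in the adjacent view) plus a few nearby angles; it is a conceptual sketch, not the paper's algorithm.

      # HEVC intra modes: 0 = planar, 1 = DC, 2..34 = angular directions.
      def candidate_modes(neighbour_modes, co_located_mode=None, spread=2):
          seeds = set(m for m in neighbour_modes if m is not None)
          if co_located_mode is not None:
              seeds.add(co_located_mode)
          candidates = {0, 1}                      # always keep planar and DC
          for m in seeds:
              for d in range(-spread, spread + 1):
                  if 2 <= m + d <= 34:
                      candidates.add(m + d)
          return sorted(candidates)

      # Example: left PU used mode 10, above PU mode 26, co-located view PU mode 11.
      print(candidate_modes([10, 26], co_located_mode=11))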

  10. UCLA Final Technical Report for the "Community Petascale Project for Accelerator Science and Simulation”.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mori, Warren

    The UCLA Plasma Simulation Group is a major partner of the “Community Petascale Project for Accelerator Science and Simulation”. This is the final technical report. We include an overall summary, a list of publications, progress for the most recent year, and individual progress reports for each year. We have made tremendous progress during the three years. SciDAC funds have contributed to the development of a large number of skeleton codes that illustrate how to write PIC codes with a hierarchy of parallelism. These codes cover 2D and 3D as well as electrostatic solvers (which are used in beam dynamics codes and quasi-static codes) and electromagnetic solvers (which are used in plasma-based accelerator codes). We also used these ideas to develop a GPU-enabled version of OSIRIS. SciDAC funds also contributed to the development of strategies to eliminate the Numerical Cerenkov Instability (NCI), which is an issue when carrying out laser wakefield accelerator (LWFA) simulations in a boosted frame and when quantifying the emittance and energy spread of self-injected electron beams. This work included the development of a new code called UPIC-EMMA, which is an FFT-based electromagnetic PIC code, and of new hybrid algorithms in OSIRIS. A new hybrid (PIC in r-z and gridless in φ) algorithm was implemented into OSIRIS. In this algorithm the fields and current are expanded into azimuthal harmonics and the complex amplitude for each harmonic is calculated separately. The contributions from each harmonic are summed and then used to push the particles. This algorithm permits modeling plasma-based acceleration with some 3D effects but with the computational load of a 2D r-z PIC code. We developed a rigorously charge-conserving current deposit for this algorithm. Very recently, we made progress in combining the speed up from the quasi-3D algorithm with that from the Lorentz boosted frame. SciDAC funds also contributed to the improvement and speed up of the quasi
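
    The sketch below illustrates the quasi-3D idea in the most generic way: a quantity sampled on a ring in azimuth is decomposed into a few azimuthal harmonics, F(phi) = Re sum_m F_m exp(-i m phi), and only those few complex amplitudes need to be evolved. The discrete decomposition shown is ours, not code from OSIRIS.

      # Azimuthal harmonic decomposition of a toy field with an m=0 part and a
      # small m=1 asymmetry; nearly all the information sits in two modes.
      import numpy as np

      n_phi, n_modes = 64, 3
      phi = 2.0 * np.pi * np.arange(n_phi) / n_phi
      field = 1.0 + 0.2 * np.cos(phi)

      amplitudes = [np.sum(field * np.exp(1j * m * phi)) / n_phi * (1 if m == 0 else 2)
                    for m in range(n_modes)]
      for m, a in enumerate(amplitudes):
          print(f"m = {m}: amplitude = {a.real:+.3f}{a.imag:+.3f}j")
      # -> m=0 amplitude ~1.0, m=1 ~0.2, m=2 ~0: a handful of modes suffice,
      #    which is what makes the quasi-3D algorithm cheap.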

  11. Ponderomotive Acceleration in Coronal Loops

    NASA Astrophysics Data System (ADS)

    Dahlburg, Russell B.; Laming, J. Martin; Taylor, Brian; Obenschain, Keith

    2017-08-01

    Ponderomotive acceleration has been asserted to be a cause of the First Ionization Potential (FIP) effect, the by now well known enhancement in abundance, by a factor of 3-4 over photospheric values, of elements in the solar corona with FIP less than about 10 eV. It is shown here by means of numerical simulations that ponderomotive acceleration occurs in solar coronal loops, with the appropriate magnitude and direction, as a "byproduct" of coronal heating. The numerical simulations are performed with the HYPERION code, which solves the fully compressible three-dimensional magnetohydrodynamic equations including nonlinear thermal conduction and optically thin radiation. Numerical simulations of coronal loops with an axial magnetic field from 0.005 T to 0.02 T and lengths from 25000 km to 75000 km are presented. In the simulations the footpoints of the axial loop magnetic field are convected by random, large-scale motions. There is a continuous formation and dissipation of field-aligned current sheets, which act to heat the loop. As a consequence of coronal magnetic reconnection, small-scale, high-speed jets form. The familiar vortex quadrupoles form at reconnection sites. Between the magnetic footpoints and the corona the reconnection flow merges with the boundary flow. It is in this region that the ponderomotive acceleration occurs. Mirroring the character of the coronal reconnection, the ponderomotive acceleration is also found to be intermittent.

  12. Collaborative Research: Simulation of Beam-Electron Cloud Interactions in Circular Accelerators Using Plasma Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katsouleas, Thomas; Decyk, Viktor

    Final Report for grant DE-FG02-06ER54888, "Simulation of Beam-Electron Cloud Interactions in Circular Accelerators Using Plasma Models", Viktor K. Decyk, University of California, Los Angeles, Los Angeles, CA 90095-1547. The primary goal of this collaborative proposal was to modify the code QuickPIC and apply it to study the long-time stability of beam propagation in low density electron clouds present in circular accelerators. The UCLA contribution to this collaborative proposal was in supporting the development of the pipelining scheme for the QuickPIC code, which extended the parallel scaling of this code by two orders of magnitude. The USC work described here was the PhD research of Ms. Bing Feng, lead author of reference 2 below, who performed the research at USC under the guidance of the PI Tom Katsouleas and in collaboration with Dr. Decyk. The QuickPIC code [1] is a multi-scale Particle-in-Cell (PIC) code. The outer 3D code contains a beam which propagates through a long region of plasma and evolves slowly. The plasma response to this beam is modeled by slices of a 2D plasma code. This plasma response is then fed back to the beam code, and the process repeats. The pipelining is based on the observation that once the beam has passed a 2D slice, its response can be fed back to the beam immediately, without waiting for the beam to pass all the other slices. Thus independent blocks of 2D slices from different time steps can be running simultaneously. The major difficulty was when particles at the edges needed to communicate with other blocks. Two versions of the pipelining scheme were developed, one for the full quasi-static code and the other for the basic quasi-static code used by this e-cloud proposal. Details of the pipelining scheme were published in [2]. The new version of QuickPIC was able to run with more than 1,000 processors, and was successfully applied in modeling e-clouds by our collaborators in this proposal [3-8]. Jean-Luc Vay at Lawrence
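
    As a loose illustration of why pipelining blocks of slices across time steps helps (a generic wavefront-schedule toy, not QuickPIC code), the sketch below counts the wall-clock slots needed when block b can start time step t as soon as it has the beam from block b-1 at that step, versus running every block-step serially.

      # Wavefront/pipeline schedule toy: start[(block, step)] is the earliest slot
      # at which that block can process that time step.
      n_blocks, n_steps = 4, 6
      start = {}
      for step in range(n_steps):
          for block in range(n_blocks):
              ready_prev_block = start.get((block - 1, step), -1) + 1
              ready_prev_step = start.get((block, step - 1), -1) + 1
              start[(block, step)] = max(ready_prev_block, ready_prev_step)
      print("pipelined wall-clock slots :", max(start.values()) + 1)   # n_blocks + n_steps - 1
      print("fully serial slots         :", n_blocks * n_steps)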

  13. Los Alamos Laser Eye Investigation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Odom, C. R.

    2005-01-01

    A student working in a laser laboratory at Los Alamos National Laboratory sustained a serious retinal injury to her left eye when she attempted to view suspended particles in a partially evacuated target chamber. The principal investigator was using the white light from the flash lamp of a Class 4 Nd:YAG laser to illuminate the particles. Since the Q-switch was thought to be disabled at the time of the accident, the principal investigator assumed it would be safe to view the particles without wearing laser eye protection. The Laboratory Director appointed a team to investigate the accident and to report back to him the events and conditions leading up to the accident, equipment malfunctions, safety management causal factors, supervisory and management action/inaction, adequacy of institutional processes and procedures, emergency and notification response, effectiveness of corrective actions and lessons learned from previous similar events, and recommendations for human and institutional safety improvements. The team interviewed personnel, reviewed documents, and characterized systems and conditions in the laser laboratory during an intense six-week investigation. The team determined that the direct and primary failures leading to this accident were, respectively, the principal investigator's unsafe work practices and the institution's inadequate monitoring of worker performance. This paper describes the details of the investigation, the human and institutional failures, and the recommendations for improving the laser safety program.

  14. Generation of low-emittance electron beams in electrostatic accelerators for FEL applications

    NASA Astrophysics Data System (ADS)

    Chen, Teng; Elias, Luis R.

    1995-02-01

    This paper reports results of transverse emittance studies and beam propagation in electrostatic accelerators for free electron laser applications. In particular, we discuss emittance growth analysis of a low-current electron beam system consisting of a miniature thermionic electron gun and a National Electrostatics Accelerator (NEC) tube. The emittance growth phenomenon is discussed in terms of thermal effects in the electron gun cathode and aberrations produced by field gradient changes occurring inside the electron gun and throughout the accelerator tube. A method of reducing aberrations using a solenoidal magnetic field is described. Analysis of electron beam emittance was done with the EGUN code. Beam propagation along the accelerator tube was studied using a cylindrically symmetric beam envelope equation that included beam self-fields and the external accelerator fields, which were derived from POISSON simulations.
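
    The generic form of the model described here is the standard axisymmetric beam-envelope equation, R'' + k(z) R - K/R - eps^2/R^3 = 0, with k(z) the external focusing, K the generalized perveance (beam self-fields) and eps the unnormalized emittance. The sketch below integrates that generic equation with a simple RK4 stepper; all values are placeholders, not the parameters of the NEC tube or the EGUN/POISSON results.

      # Fixed-step RK4 integration of a generic axisymmetric envelope equation.
      import numpy as np

      K, eps = 2.0e-5, 5.0e-6        # perveance, emittance (m-rad) -- illustrative
      k_focus = lambda z: 4.0        # uniform focusing channel, 1/m^2 -- illustrative

      def envelope_rhs(z, y):
          R, Rp = y
          return np.array([Rp, -k_focus(z) * R + K / R + eps**2 / R**3])

      z, dz, y = 0.0, 1.0e-3, np.array([2.0e-3, 0.0])    # R = 2 mm, R' = 0 at entry
      while z < 2.0:                                     # 2 m of accelerating column
          k1 = envelope_rhs(z, y)
          k2 = envelope_rhs(z + dz / 2, y + dz * k1 / 2)
          k3 = envelope_rhs(z + dz / 2, y + dz * k2 / 2)
          k4 = envelope_rhs(z + dz, y + dz * k3)
          y, z = y + dz * (k1 + 2 * k2 + 2 * k3 + k4) / 6, z + dz
      print(f"envelope radius after 2 m: {1.0e3 * y[0]:.3f} mm")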

  15. Particle Acceleration, Magnetic Field Generation in Relativistic Shocks

    NASA Technical Reports Server (NTRS)

    Nishikawa, Ken-Ichi; Hardee, P.; Hededal, C. B.; Richardson, G.; Sol, H.; Preece, R.; Fishman, G. J.

    2005-01-01

    Shock acceleration is a ubiquitous phenomenon in astrophysical plasmas. Plasma waves and their associated instabilities (e.g., the Buneman instability, two-stream instability, and the Weibel instability) created in the shocks are responsible for particle (electron, positron, and ion) acceleration. Using a 3-D relativistic electromagnetic particle (REMP) code, we have investigated particle acceleration associated with a relativistic jet front propagating through an ambient plasma with and without initial magnetic fields. We find only small differences in the results between no ambient and weak ambient parallel magnetic fields. Simulations show that the Weibel instability created in the collisionless shock front accelerates particles perpendicular and parallel to the jet propagation direction. New simulations with an ambient perpendicular magnetic field show a strong interaction between the relativistic jet and the magnetic fields. The magnetic fields are piled up by the jet and the jet electrons are bent, which creates currents and displacement currents. At the nonlinear stage, the magnetic fields are reversed by the current and reconnection may take place. Due to these dynamics the jet and ambient electrons are strongly accelerated in both parallel and perpendicular directions.

  16. Structural Geology of the Northwestern Portion of Los Alamos National Laboratory, Rio Grande Rift, New Mexico: Implications for Seismic Surface Rupture Potential from TA-3 to TA-55

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jamie N. Gardner; Alexis Lavine; Giday WoldeGabriel; Donathon Krier

    1999-03-01

    Los Alamos National Laboratory lies at the western boundary of the Rio Grande rift, a major tectonic feature of the North American continent. Three major faults locally constitute the modern rift boundary, and each of these is potentially seismogenic. In this study we have gathered structural geologic data for the northwestern portion of Los Alamos National Laboratory through high-precision geologic mapping, conventional geologic mapping, stratigraphic studies, drilling, petrologic studies, and stereographic aerial photograph analyses. Our study area encompasses TA-55 and TA-3, where potential for seismic surface rupture is of interest, and is bounded on the north and south by the townsite of Los Alamos and Twomile Canyon, respectively. The study area includes parts of two of the potentially active rift boundary faults, the Pajarito and Rendija Canyon faults, which form a large graben that we name the Diamond Drive graben. The graben embraces the western part of the townsite of Los Alamos, and its southern end is in the TA-3 area, where it is defined by east-southeast-trending cross faults. The cross faults are small, but they accommodate interactions between the two major fault zones and gentle tilting of structural blocks to the north into the graben. North of Los Alamos townsite, the Rendija Canyon fault is a large normal fault with about 120 feet of down-to-the-west displacement over the last 1.22 million years. South from Los Alamos townsite, the Rendija Canyon fault splays to the southwest into a broad zone of deformation. The zone of deformation is about 2,000 feet wide where it crosses Los Alamos Canyon and cuts through the Los Alamos County Landfill. Farther southwest, the fault zone is about 3,000 feet wide at the southeastern corner of TA-3 in upper Mortandad Canyon and about 5,000 feet wide in Twomile Canyon. Net down-to-the-west displacement across the entire fault zone over the last 1.22 million years decreases to the south as the fault zone

  17. Oppenheimer's Box of Chocolates: Remediation of the Manhattan Project Landfill at Los Alamos National Laboratory - 12283

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, Donald L.; Ramsey, Susan S.; Finn, Kevin P.

    2012-07-01

    Material Disposal Area B (MDA B) is the oldest radioactive waste disposal facility at Los Alamos National Laboratory. Operated from 1944-48, MDA B was the disposal facility for the Manhattan Project. Recognized as one of the most challenging environmental remediation projects at Los Alamos, the excavation of MDA B received $110 million from the American Recovery and Reinvestment Act of 2009 to accelerate this complex remediation work. Several factors combined to create significant challenges to remediating the landfill known in the 1940s as the 'contaminated dump'. The secrecy surrounding the Manhattan Project meant that no records were kept of the radiological materials and chemicals disposed of, or of the landfill design. An extensive review of historical documents and interviews with early laboratory personnel resulted in a list of hundreds of hazardous chemicals that could have been buried in MDA B. Also, historical reports of MDA B spontaneously combusting on three occasions (with 50-foot flames and pink smoke spewing across the mesa during the last incident in 1948) indicated that hazardous materials were likely present in MDA B. To complicate matters further, though MDA B was located on an isolated mesa in the 1940s, the landfill has since been surrounded by a Los Alamos commercial district. The local newspaper, hardware store, and a number of other businesses are located directly across the street from MDA B. This close proximity to the public and the potential for hazardous materials in MDA B necessitated conducting remediation work within protective enclosures. Potential chemical hazards and the radiological inventory were better defined prior to excavation using a minimally intrusive sampling method called direct push technology (DPT). Even with extensive sampling and planning, the project team encountered many surprises and challenges during the project. The one area where planning did not fail to meet reality was safety. There were no serious worker

  18. The role of configuration interaction in the LTE opacity of Fe

    NASA Astrophysics Data System (ADS)

    Colgan, James; Kilcrease, David; Magee, Norm; Armstrong, Gregory; Abdallah, Joe; Sherrill, Manolo; Fontes, Christopher; Zhang, Honglin; Hakel, Peter

    2013-05-01

    The Los Alamos National Laboratory code ATOMIC has recently been used to generate a series of local-thermodynamic-equilibrium (LTE) light-element opacities for the elements H through Ne. Our calculations, which include fine-structure detail, represent a systematic improvement over previous Los Alamos opacity calculations using the LEDCOP legacy code. Recent efforts have resulted in comprehensive new calculations of the opacity of Fe. In this presentation we explore the role of configuration interaction (CI) in the Fe opacity, and show where CI influences the monochromatic opacity. We present such comparisons for conditions of astrophysical interest. The Los Alamos National Laboratory is operated by Los Alamos National Security, LLC for the National Nuclear Security Administration of the U.S. Department of Energy under Contract No. DE-AC52-06NA25396.

  19. Symplectic orbit and spin tracking code for all-electric storage rings

    NASA Astrophysics Data System (ADS)

    Talman, Richard M.; Talman, John D.

    2015-07-01

    Proposed methods for measuring the electric dipole moment (EDM) of the proton use an intense, polarized proton beam stored in an all-electric storage ring "trap." At the "magic" kinetic energy of 232.792 MeV, proton spins are "frozen," for example, always parallel to the instantaneous particle momentum. Energy deviation from the magic value causes in-plane precession of the spin relative to the momentum. Any nonzero EDM value will cause out-of-plane precession—measuring this precession is the basis for the EDM determination. A proposed implementation of this measurement shows that a proton EDM value of 10^-29 e-cm or greater will produce a statistically significant, measurable precession after multiply repeated runs, assuming small beam depolarization during 1000 s runs, with high enough precision to test models of the early universe developed to account for the present-day particle/antiparticle population imbalance. This paper describes an accelerator simulation code, eteapot, a new component of the Unified Accelerator Libraries (ual), to be used for long-term tracking of particle orbits and spins in electric bend accelerators, in order to simulate EDM storage ring experiments. Though qualitatively much like magnetic rings, the nonconstant particle velocity in electric rings gives them significantly different properties, especially in weak focusing rings. Like the earlier code teapot (for magnetic ring simulation) this code performs exact tracking in an idealized (approximate) lattice rather than the more conventional approach, which is approximate tracking in a more nearly exact lattice. The Bargmann-Michel-Telegdi (BMT) equation describing the evolution of spin vectors through idealized bend elements is also solved exactly—original to this paper. Furthermore, the idealization permits the code to be exactly symplectic (with no artificial "symplectification"). Any residual spurious damping or antidamping is sufficiently small to permit reliable tracking for the
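
    As a quick numerical cross-check of the "magic" kinetic energy quoted above, the short Python sketch below evaluates the frozen-spin condition for an all-electric ring, G = 1/(beta*gamma)^2, using standard proton constants. It is an illustrative calculation only, not code from eteapot or ual.

      import math

      # Frozen-spin ("magic") condition in an all-electric ring: G = 1/(beta*gamma)^2,
      # i.e. pc = m*c^2 / sqrt(G).  Constants below are standard values, not taken
      # from the eteapot/ual code itself.
      M_P = 938.2720813       # proton rest energy, MeV
      G_P = 1.79284735        # proton anomalous magnetic moment (dimensionless)

      pc = M_P / math.sqrt(G_P)          # magic momentum times c, MeV
      e_total = math.hypot(pc, M_P)      # total energy, MeV
      t_magic = e_total - M_P            # kinetic energy, MeV

      print(f"magic momentum   pc ~ {pc:.1f} MeV")
      print(f"magic kinetic energy ~ {t_magic:.2f} MeV")   # ~232.79 MeV, as quoted above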

  20. Preliminary estimates of nucleon fluxes in a water target exposed to solar-flare protons: BRYNTRN versus Monte Carlo code

    NASA Technical Reports Server (NTRS)

    Shinn, Judy L.; Wilson, John W.; Lone, M. A.; Wong, P. Y.; Costen, Robert C.

    1994-01-01

    A baryon transport code (BRYNTRN) has previously been verified using available Monte Carlo results for a solar-flare spectrum as the reference. Excellent results were obtained, but the comparisons were limited to the available data on dose and dose equivalent for moderate penetration studies that involve minor contributions from secondary neutrons. To further verify the code, the secondary energy spectra of protons and neutrons are calculated using BRYNTRN and LAHET (Los Alamos High-Energy Transport code, which is a Monte Carlo code). These calculations are compared for three locations within a water slab exposed to the February 1956 solar-proton spectrum. Reasonable agreement was obtained when various considerations related to the calculational techniques and their limitations were taken into account. Although the Monte Carlo results are preliminary, it appears that the neutron albedo, which is not currently treated in BRYNTRN, might be a cause for the large discrepancy seen at small penetration depths. It also appears that the nonelastic neutron production cross sections in BRYNTRN may underestimate the number of neutrons produced in proton collisions with energies below 200 MeV. The notion that the poor energy resolution in BRYNTRN may cause a large truncation error in neutron elastic scattering requires further study.

  1. Biological Assessment of the Continued Operation of Los Alamos National Laboratory on Federally Listed Threatened and Endangered Species

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Leslie A.

    2006-09-19

    This biological assessment considers the effects of continuing to operate Los Alamos National Laboratory on Federally listed threatened or endangered species, based on current and future operations identified in the 2006 Site-wide Environmental Impact Statement for the Continued Operation of Los Alamos National Laboratory (SWEIS; DOE In Prep.). We reviewed 40 projects analyzed in the SWEIS as well as two aspects of ongoing operations to determine if these actions had the potential to affect Federally listed species. Eighteen projects that had not already received U.S. Fish and Wildlife Service (USFWS) consultation and concurrence, as well as the two aspects of ongoing operations (ecological risk from legacy contaminants and the Outfall Reduction Project), were determined to have the potential to affect threatened or endangered species. Cumulative impacts were also analyzed.

  2. Global warming accelerates drought-induced forest death

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDowell, Nathan; Pockman, William

    2013-07-09

    Many southwestern forests in the United States will disappear or be heavily altered by 2050, according to a series of joint Los Alamos National Laboratory-University of New Mexico studies. Nathan McDowell, a Los Alamos plant physiologist, and William Pockman, a UNM biology professor, explain that their research, and more from scientists around the world, is forecasting that by 2100 most conifer forests should be heavily disturbed, if not gone, as air temperatures rise in combination with drought. "Everybody knows trees die when there's a drought, if there's bark beetles or fire, yet nobody in the world can predict it with much accuracy," McDowell said. "What's really changed is that the temperature is going up," thus the researchers are imposing artificial drought conditions on segments of wild forest in the Southwest and pushing forests to their limit to discover the exact processes of mortality and survival. The study is centered on drought experiments in woodlands at both Los Alamos and the Sevilleta National Wildlife Refuge in central New Mexico. Both sites are testing hypotheses about how forests die on mature, wild trees, rather than seedlings in a greenhouse, through the ecosystem-scale removal of 50 percent of yearly precipitation through large water-diversion trough systems.

  3. Global warming accelerates drought-induced forest death

    ScienceCinema

    McDowell, Nathan; Pockman, William

    2018-05-16

    Many southwestern forests in the United States will disappear or be heavily altered by 2050, according to a series of joint Los Alamos National Laboratory-University of New Mexico studies. Nathan McDowell, a Los Alamos plant physiologist, and William Pockman, a UNM biology professor, explain that their research, and more from scientists around the world, is forecasting that by 2100 most conifer forests should be heavily disturbed, if not gone, as air temperatures rise in combination with drought. "Everybody knows trees die when there's a drought, if there's bark beetles or fire, yet nobody in the world can predict it with much accuracy," McDowell said. "What's really changed is that the temperature is going up," thus the researchers are imposing artificial drought conditions on segments of wild forest in the Southwest and pushing forests to their limit to discover the exact processes of mortality and survival. The study is centered on drought experiments in woodlands at both Los Alamos and the Sevilleta National Wildlife Refuge in central New Mexico. Both sites are testing hypotheses about how forests die on mature, wild trees, rather than seedlings in a greenhouse, through the ecosystem-scale removal of 50 percent of yearly precipitation through large water-diversion trough systems.

  4. The Los Alamos Neutron Science Center Hydrogen Moderator System

    NASA Astrophysics Data System (ADS)

    Jarmer, J. J.; Knudson, J. N.

    2006-04-01

    At the Los Alamos Neutron Science Center (LANSCE), spallation neutrons are produced by an 800-MeV proton beam interacting with tungsten targets. Gun-barrel-type penetrations through the heavy concrete and steel shielding that surround the targets collimate neutrons to form neutron beams used for scattering experiments. Two liquid hydrogen moderators of one-liter volume each are positioned adjacent to the neutron-production targets. Some of the neutrons that pass through a moderator interact with or scatter from protons in the hydrogen. The neutron-proton interaction reduces the energy or moderates neutrons to lower energies. Lower energy "moderated" neutrons are the most useful for some neutron scattering experiments. We provide a description of the LANSCE hydrogen-moderator system and its cryogenic performance with proton beams of up to 125 micro-amp average current.

  5. Space Science at Los Alamos National Laboratory

    NASA Astrophysics Data System (ADS)

    Smith, Karl

    2017-09-01

    The Space Science and Applications group (ISR-1) in the Intelligence and Space Research (ISR) division at Los Alamos National Laboratory leads a number of space science missions for civilian and defense-related programs. In support of these missions the group develops sensors capable of detecting nuclear emissions and measuring radiation in space, including γ-ray, X-ray, charged-particle, and neutron detection. The group is involved in a number of stages of the lifetime of these sensors, including mission concept and design, simulation and modeling, calibration, and data analysis. These missions support monitoring of the atmosphere and near-Earth space environment for nuclear detonations as well as monitoring of the local space environment, including space-weather events. Expertise in this area has been established over a long history of involvement with cutting-edge projects dating back to the first space-based monitoring mission, Project Vela. The group's interests cut across a large range of topics including non-proliferation, space situational awareness, nuclear physics, material science, space physics, astrophysics, and planetary physics.

  6. Threatened and Endangered Species Habitat Management Plan for Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hathcock, Charles Dean; Keller, David Charles; Thompson, Brent E.

    Los Alamos National Laboratory’s (LANL) Threatened and Endangered Species Habitat Management Plan (HMP) fulfills a commitment made to the U.S. Department of Energy (DOE) in the “Final Environmental Impact Statement for the Dual-Axis Radiographic Hydrodynamic Test Facility Mitigation Action Plan” (DOE 1996). The HMP received concurrence from the U.S. Fish and Wildlife Service (USFWS) in 1999 (USFWS consultation numbers 2-22-98-I-336 and 2-22-95-I-108). This 2017 update retains the management guidelines from the 1999 HMP for listed species, and updates some descriptive information.

  7. Study on radiation production in the charge stripping section of the RISP linear accelerator

    NASA Astrophysics Data System (ADS)

    Oh, Joo-Hee; Oranj, Leila Mokhtari; Lee, Hee-Seock; Ko, Seung-Kook

    2015-02-01

    The linear accelerator of the Rare Isotope Science Project (RISP) accelerates 200 MeV/nucleon 238U ions in multiple charge states. Many kinds of radiation are generated while the primary beam is transported along the beam line. The stripping process using a thin carbon foil leads to complicated radiation environments at the 90-degree bending section. The charge distribution of 238U ions after the carbon charge stripper was calculated using the LISE++ program. The estimates of the radiation environments were carried out using the well-validated Monte Carlo codes PHITS and FLUKA. The tracks of 238U ions in various charge states were identified using the magnetic field subroutine of the PHITS code. The dose distribution caused by U beam losses along those tracks was obtained over the accelerator tunnel. A modified calculation was applied to track the multi-charged U beams because PHITS and FLUKA are fundamentally designed to transport fully ionized ion beams. In this study, the beam loss pattern after the stripping section was observed, and the radiation production by heavy ions was studied. Finally, the performance of the PHITS and FLUKA codes for estimating the radiation production at the stripping section was validated by applying the modified method.
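
    To make the multiple-charge-state issue above concrete, the sketch below computes the magnetic rigidity B*rho = pc/(qe) of 200 MeV/nucleon 238U ions for a few charge states, which is what separates their tracks in the 90-degree bending section. The charge states listed are hypothetical placeholders, not LISE++ output, and the snippet is unrelated to the PHITS/FLUKA models.

      import math

      AMU_MEV = 931.494          # atomic mass unit, MeV/c^2
      A_U, E_PER_U = 238, 200.0  # mass number and kinetic energy per nucleon, MeV/u

      def rigidity_tm(charge_state):
          """Magnetic rigidity B*rho (T m) for a 238U ion of the given charge state."""
          m = A_U * AMU_MEV                    # rest energy, MeV
          e_total = m + A_U * E_PER_U          # total energy, MeV
          pc = math.sqrt(e_total**2 - m**2)    # momentum times c, MeV
          return pc / (299.792458 * charge_state)   # B*rho [T m] = pc [MeV] / (299.79 * q)

      # Hypothetical charge states downstream of the carbon stripper (illustration only).
      for q in (76, 78, 80):
          print(f"238U {q}+ : B*rho ~ {rigidity_tm(q):.2f} T m")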

  8. An experimental topographic amplification study at Los Alamos National Laboratory using ambient vibrations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stolte, Andrew C.; Cox, Brady R.; Lee, Richard C.

    An experimental study aimed at investigating potential topographic amplification of seismic waves was conducted on a 50-m-tall and 185-m-wide soft-rock ridge located at Los Alamos National Laboratory near Los Alamos, New Mexico. Ten portable broadband seismograph stations were placed in arrays across the ridge and left to record ambient vibration data for ~9 hours. Clear evidence of topographic amplification was observed by comparing spectral ratios calculated from ambient noise recordings at the toe, slope, and crest of the instrumented ridge. The inferred resonance frequency of the ridge obtained from the experimental recordings was found to agree well with several simple estimates of the theoretical resonance frequency based on its geometry and stiffness. Results support the feasibility of quantifying the frequency range of topographic amplification solely using ambient vibrations, rather than strong or weak ground motions. Additionally, comparisons have been made between a number of widely used experimental methods for quantifying topographic effects, such as the standard spectral ratio, median reference method, and horizontal-to-vertical spectral ratio. As a result, differences in the amplification and frequency range of topographic effects indicated by these methods highlight the importance of choosing a reference condition that is appropriate for the site-specific conditions and goals associated with an experimental topographic amplification study.
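
    A minimal sketch of the standard-spectral-ratio idea used above is given below, assuming two synchronized ambient-noise records held in NumPy arrays. The sampling rate, smoothing choice and synthetic test signal are illustrative assumptions, not parameters of the study.

      import numpy as np

      def spectral_ratio(crest, reference, fs, smooth=11):
          """Ratio of smoothed Fourier amplitude spectra: crest station over reference station."""
          n = min(len(crest), len(reference))
          freqs = np.fft.rfftfreq(n, d=1.0 / fs)
          amp_c = np.abs(np.fft.rfft(crest[:n] - np.mean(crest[:n])))
          amp_r = np.abs(np.fft.rfft(reference[:n] - np.mean(reference[:n])))
          kernel = np.ones(smooth) / smooth          # simple moving-average smoothing
          amp_c = np.convolve(amp_c, kernel, mode="same")
          amp_r = np.convolve(amp_r, kernel, mode="same")
          return freqs, amp_c / np.maximum(amp_r, 1e-20)

      # Synthetic check: a 5 Hz "resonance" added at the crest shows up as a peak in the ratio.
      fs = 100.0
      t = np.arange(0.0, 600.0, 1.0 / fs)
      rng = np.random.default_rng(0)
      reference = rng.normal(size=t.size)
      crest = reference + 0.5 * np.sin(2 * np.pi * 5.0 * t)
      f, ratio = spectral_ratio(crest, reference, fs)
      print("ratio peaks near", round(float(f[np.argmax(ratio[1:]) + 1]), 2), "Hz")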

  9. An experimental topographic amplification study at Los Alamos National Laboratory using ambient vibrations

    DOE PAGES

    Stolte, Andrew C.; Cox, Brady R.; Lee, Richard C.

    2017-03-14

    An experimental study aimed at investigating potential topographic amplification of seismic waves was conducted on a 50-m-tall and 185-m-wide soft-rock ridge located at Los Alamos National Laboratory near Los Alamos, New Mexico. Ten portable broadband seismograph stations were placed in arrays across the ridge and left to record ambient vibration data for ~9 hours. Clear evidence of topographic amplification was observed by comparing spectral ratios calculated from ambient noise recordings at the toe, slope, and crest of the instrumented ridge. The inferred resonance frequency of the ridge obtained from the experimental recordings was found to agree well with several simple estimates of the theoretical resonance frequency based on its geometry and stiffness. Results support the feasibility of quantifying the frequency range of topographic amplification solely using ambient vibrations, rather than strong or weak ground motions. Additionally, comparisons have been made between a number of widely used experimental methods for quantifying topographic effects, such as the standard spectral ratio, median reference method, and horizontal-to-vertical spectral ratio. As a result, differences in the amplification and frequency range of topographic effects indicated by these methods highlight the importance of choosing a reference condition that is appropriate for the site-specific conditions and goals associated with an experimental topographic amplification study.

  10. Total reaction cross sections in CEM and MCNP6 at intermediate energies

    DOE PAGES

    Kerby, Leslie M.; Mashnik, Stepan G.

    2015-05-14

    Accurate total reaction cross section models are important to achieving reliable predictions from spallation and transport codes. The latest version of the Cascade Exciton Model (CEM) as incorporated in the code CEM03.03, and the Monte Carlo N-Particle transport code (MCNP6), both developed at Los Alamos National Laboratory (LANL), each use such cross sections. Having accurate total reaction cross section models in the intermediate energy region (50 MeV to 5 GeV) is very important for different applications, including analysis of space environments, use in medical physics, and accelerator design, to name just a few. The current inverse cross sections used in the preequilibrium and evaporation stages of CEM are based on the Dostrovsky et al. model, published in 1959. Better cross section models are now available. Implementing better cross section models in CEM and MCNP6 should yield improved predictions for particle spectra and total production cross sections, among other results.
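
    For orientation only, the sketch below evaluates a generic geometric reaction cross section with a simple Coulomb-barrier suppression factor. It is a schematic, textbook-style form chosen for illustration; it is not the Dostrovsky et al. parameterization used in CEM, nor any of the newer models referred to above.

      import math

      R0_FM = 1.2        # radius parameter, fm (illustrative)
      E2_MEV_FM = 1.44   # e^2 / (4*pi*eps0) in MeV*fm

      def geometric_reaction_xs(z_proj, a_proj, z_targ, a_targ, e_mev):
          """Schematic reaction cross section (mb): sigma = pi*R^2 * (1 - V_C/E),
          zero below the Coulomb barrier.  Not the Dostrovsky et al. model."""
          r = R0_FM * (a_proj ** (1.0 / 3.0) + a_targ ** (1.0 / 3.0))   # fm
          v_c = E2_MEV_FM * z_proj * z_targ / r                         # Coulomb barrier, MeV
          if e_mev <= v_c:
              return 0.0
          return math.pi * r * r * (1.0 - v_c / e_mev) * 10.0           # 1 fm^2 = 10 mb

      # Protons on 56Fe at a few energies (numbers are illustrative only).
      for e in (20.0, 50.0, 200.0):
          print(f"E = {e:5.1f} MeV   sigma ~ {geometric_reaction_xs(1, 1, 26, 56, e):.0f} mb")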

  11. Total reaction cross sections in CEM and MCNP6 at intermediate energies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerby, Leslie M.; Mashnik, Stepan G.

    Accurate total reaction cross section models are important to achieving reliable predictions from spallation and transport codes. The latest version of the Cascade Exciton Model (CEM) as incorporated in the code CEM03.03, and the Monte Carlo N-Particle transport code (MCNP6), both developed at Los Alamos National Laboratory (LANL), each use such cross sections. Having accurate total reaction cross section models in the intermediate energy region (50 MeV to 5 GeV) is very important for different applications, including analysis of space environments, use in medical physics, and accelerator design, to name just a few. The current inverse cross sections used in the preequilibrium and evaporation stages of CEM are based on the Dostrovsky et al. model, published in 1959. Better cross section models are now available. Implementing better cross section models in CEM and MCNP6 should yield improved predictions for particle spectra and total production cross sections, among other results.

  12. Hybrid petacomputing meets cosmology: The Roadrunner Universe project

    NASA Astrophysics Data System (ADS)

    Habib, Salman; Pope, Adrian; Lukić, Zarija; Daniel, David; Fasel, Patricia; Desai, Nehal; Heitmann, Katrin; Hsu, Chung-Hsing; Ankeny, Lee; Mark, Graham; Bhattacharya, Suman; Ahrens, James

    2009-07-01

    The target of the Roadrunner Universe project at Los Alamos National Laboratory is a set of very large cosmological N-body simulation runs on the hybrid supercomputer Roadrunner, the world's first petaflop platform. Roadrunner's architecture presents opportunities and difficulties characteristic of next-generation supercomputing. We describe a new code designed to optimize performance and scalability by explicitly matching the underlying algorithms to the machine architecture, and by using the physics of the problem as an essential aid in this process. While applications will differ in specific exploits, we believe that such a design process will become increasingly important in the future. The Roadrunner Universe project code, MC3 (Mesh-based Cosmology Code on the Cell), uses grid and direct particle methods to balance the capabilities of Roadrunner's conventional (Opteron) and accelerator (Cell BE) layers. Mirrored particle caches and spectral techniques are used to overcome communication bandwidth limitations and possible difficulties with complicated particle-grid interaction templates.
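
    The grid half of a grid-plus-direct-particle scheme of the kind described above usually begins with a mass-deposition step. The sketch below shows a minimal cloud-in-cell (CIC) deposition onto a periodic mesh in NumPy; it is a generic illustration of the idea, not code from MC3.

      import numpy as np

      def cic_deposit(positions, ngrid, boxsize):
          """Deposit unit-mass particles onto a periodic 3D mesh with cloud-in-cell weights."""
          rho = np.zeros((ngrid, ngrid, ngrid))
          x = positions / (boxsize / ngrid)        # positions in cell units
          i0 = np.floor(x).astype(int)
          d = x - i0                               # fractional offsets in [0, 1)
          for dx in (0, 1):
              for dy in (0, 1):
                  for dz in (0, 1):
                      w = (np.where(dx, d[:, 0], 1 - d[:, 0]) *
                           np.where(dy, d[:, 1], 1 - d[:, 1]) *
                           np.where(dz, d[:, 2], 1 - d[:, 2]))
                      np.add.at(rho, ((i0[:, 0] + dx) % ngrid,
                                      (i0[:, 1] + dy) % ngrid,
                                      (i0[:, 2] + dz) % ngrid), w)
          return rho

      rng = np.random.default_rng(1)
      rho = cic_deposit(rng.uniform(0.0, 100.0, size=(10000, 3)), ngrid=32, boxsize=100.0)
      print("total deposited mass:", round(float(rho.sum()), 3))   # equals the particle count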

  13. Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics

    NASA Astrophysics Data System (ADS)

    Junghans, Christoph; Mniszewski, Susan; Voter, Arthur; Perez, Danny; Eidenbenz, Stephan

    2014-03-01

    We present an example of a new class of tools that we call application simulators, parameterized fast-running proxies of large-scale scientific applications using parallel discrete event simulation (PDES). We demonstrate our approach with a TADSim application simulator that models the Temperature Accelerated Dynamics (TAD) method, which is an algorithmically complex member of the Accelerated Molecular Dynamics (AMD) family. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We use TADSim to quickly characterize the runtime performance and algorithmic behavior for the otherwise long-running simulation code. We further extend TADSim to model algorithm extensions to standard TAD, such as speculative spawning of the compute-bound stages of the algorithm, and predict performance improvements without having to implement such a method. Focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights into the TAD algorithm behavior and suggested extensions to the TAD method.
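
    For readers unfamiliar with the machinery behind an application simulator such as TADSim, the sketch below shows the bare mechanics of a discrete-event loop driven by a priority queue. The event names and timings are hypothetical placeholders and do not represent the actual TAD algorithm stages.

      import heapq
      import itertools

      def run_des(initial_events, horizon):
          """Minimal discrete-event loop; handlers return follow-up (delay, name, handler) tuples."""
          counter = itertools.count()    # tie-breaker so handler objects are never compared
          queue = [(t, next(counter), name, h) for t, name, h in initial_events]
          heapq.heapify(queue)
          log = []
          while queue:
              clock, _, name, handler = heapq.heappop(queue)
              if clock > horizon:
                  break
              log.append((clock, name))
              for delay, nxt_name, nxt_handler in handler(clock):
                  heapq.heappush(queue, (clock + delay, next(counter), nxt_name, nxt_handler))
          return log

      # Hypothetical stages loosely inspired by an accelerated-dynamics workflow (not real TAD steps).
      def md_block(now):
          return [(1.0, "md_block", md_block), (0.2, "check_transition", check_transition)]

      def check_transition(now):
          return []    # terminal event in this toy model

      trace = run_des([(0.0, "md_block", md_block)], horizon=5.0)
      print(len(trace), "events processed; last at t =", trace[-1][0])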

  14. 76 FR 62330 - Radio Broadcasting Services; Alamo, GA; Alton, MO; Boscobel, WI; Buffalo, OK; Cove, AR; Clayton...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-07

    ... Broadcasting Services; Alamo, GA; Alton, MO; Boscobel, WI; Buffalo, OK; Cove, AR; Clayton, LA; Daisy, AR; Ennis... competitive bidding process, and are considered unsold permits that were included in Auction 91. Interested... competitive bidding process. DATES: Comments must be filed on or before October 31, 2011, and reply comments...

  15. Los Alamos County Fire Department LAFD: TA-55 PF-4 Facility Familiarization Tour, OJT 55260

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutherford, Victor Stephen

    Los Alamos National Laboratory (LANL) will conduct familiarization tours for Los Alamos County Fire Department (LAFD) personnel at the Plutonium Facility (PF-4) at Technical Area (TA)-55. These familiarization tours are official LANL business; the purpose of these tours is to orient the firefighters to the facility so that they can respond efficiently and quickly to a variety of emergency situations. This orientation includes the ingress and egress of the area and buildings, layout and organization of the facility, evacuation procedures and assembly points, and areas of concern within the various buildings at the facility. LAFD firefighters have the skills and abilities to perform firefighting operations and other emergency response tasks that cannot be provided by other LANL personnel who have the required clearance level. This handout provides details of the information, along with maps and diagrams, to be presented during the familiarization tours. The handout will be distributed to the trainees at the time of the tour. A corresponding checklist will also be used as guidance during the familiarization tours to ensure that all required information is presented to LAFD personnel.

  16. Statistical analyses of the background distribution of groundwater solutes, Los Alamos National Laboratory, New Mexico.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Longmire, Patrick A.; Goff, Fraser; Counce, D. A.

    2004-01-01

    Background or baseline water chemistry data and information are required to distinguish between contaminated and non-contaminated waters for environmental investigations conducted at Los Alamos National Laboratory (referred to as the Laboratory). The term 'background' refers to natural waters discharged by springs or penetrated by wells that have not been contaminated by LANL or other municipal or industrial activities, and that are representative of groundwater discharging from their respective aquifer material. These investigations are conducted as part of the Environmental Restoration (ER) Project, Groundwater Protection Program (GWPP), Laboratory Surveillance Program, the Hydrogeologic Workplan, and the Site-Wide Environmental Impact Statement (SWEIS). This poster provides a comprehensive, validated database of inorganic, organic, stable isotope, and radionuclide analyses of up to 136 groundwater samples collected from 15 baseline springs and wells located in and around Los Alamos National Laboratory, New Mexico. The region considered in this investigation extends from the western edge of the Jemez Mountains eastward to the Rio Grande and from Frijoles Canyon northward to Garcia Canyon. Figure 1 shows the fifteen stations sampled for this investigation. The sampling stations and associated aquifer types are summarized in Table 1.

  17. Multilevel acceleration of scattering-source iterations with application to electron transport

    DOE PAGES

    Drumm, Clif; Fan, Wesley

    2017-08-18

    Acceleration/preconditioning strategies available in the SCEPTRE radiation transport code are described. A flexible transport synthetic acceleration (TSA) algorithm that uses a low-order discrete-ordinates (SN) or spherical-harmonics (PN) solve to accelerate convergence of a high-order SN source-iteration (SI) solve is described. Convergence of the low-order solves can be further accelerated by applying off-the-shelf incomplete-factorization or algebraic-multigrid methods. Also available is an algorithm that uses a generalized minimum residual (GMRES) iterative method rather than SI for convergence, using a parallel sweep-based solver to build up a Krylov subspace. TSA has been applied as a preconditioner to accelerate the convergence of the GMRES iterations. The methods are applied to several problems involving electron transport and problems with artificial cross sections with large scattering ratios. These methods were compared and evaluated by considering material discontinuities and scattering anisotropy. The observed accelerations are highly problem dependent, but speedup factors of around 10 have been observed in typical applications.
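
    The sketch below contrasts, on a deliberately tiny surrogate problem, the two convergence strategies named above: plain source iteration versus a Krylov (GMRES) solve of the same fixed-point equation. The random "scattering" operator and scattering ratio are illustrative assumptions; this is not SCEPTRE code and it does not model the TSA preconditioner itself.

      import numpy as np
      from scipy.sparse.linalg import LinearOperator, gmres

      rng = np.random.default_rng(0)
      n = 200
      S = rng.random((n, n))
      S /= S.sum(axis=1, keepdims=True)     # row-stochastic stand-in for a scattering operator
      c = 0.99                              # scattering ratio near 1 makes source iteration slow
      q = np.ones(n)                        # fixed source

      # Source iteration: phi <- q + c*S*phi, swept until the residual is small.
      phi, si_sweeps = np.zeros(n), 0
      while np.linalg.norm(q + c * (S @ phi) - phi) > 1e-5 * np.linalg.norm(q):
          phi = q + c * (S @ phi)
          si_sweeps += 1

      # Same fixed point written as (I - cS) phi = q and handed to GMRES.
      A = LinearOperator((n, n), matvec=lambda v: v - c * (S @ v))
      residuals = []
      phi_gm, info = gmres(A, q, callback=residuals.append, callback_type="pr_norm")
      print(f"source iteration: {si_sweeps} sweeps; GMRES: {len(residuals)} iterations (info={info})")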

  18. SPEAR — ToF neutron reflectometer at the Los Alamos Neutron Science Center

    NASA Astrophysics Data System (ADS)

    Dubey, M.; Jablin, M. S.; Wang, P.; Mocko, M.; Majewski, J.

    2011-11-01

    This article discusses the Surface ProfilE Analysis Reflectometer (SPEAR), a vertical scattering geometry time-of-flight reflectometer, at the Los Alamos National Laboratory Lujan Neutron Scattering Center. SPEAR occupies flight path 9 and receives spallation neutrons from a polychromatic, pulsed (20 Hz) source that pass through a liquid-hydrogen moderator at 20 K coupled with a Be filter to shift their energy spectrum. The spallation neutrons are generated by bombarding a tungsten target with 800 MeV protons obtained from an accelerator. The process produces an integrated neutron flux of ~3.4×10^6 cm^-2 s^-1 at a proton current of 100 μA. SPEAR employs choppers and frame-overlap mirrors to obtain a neutron wavelength range of 4.5-16 Å. SPEAR uses a single 200 mm long 3He linear position-sensitive detector with ~2 mm FWHM resolution for simultaneous studies of both specular and off-specular scattering. SPEAR's moderated neutrons are collimated into a beam which impinges from above upon a level sample with an average angle of 0.9° to the horizontal, to facilitate air-liquid interface studies. In the vertical direction, the beam converges at the sample position. The neutrons can be further collimated to the desired divergence by finely slitting the beam using a set of two 10B4C slit packages. The instrument is ideally suited to study organic and inorganic thin films with total thicknesses between 5 and 3000 Å in a variety of environments. Specifically designed sample chambers available at the instrument provide the opportunity to study biological systems at the solid-liquid interface. SPEAR's unique experimental capabilities are demonstrated by specific examples in this article. Finally, an outlook for SPEAR and perspectives on future instrumentation are discussed.
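
    On a time-of-flight instrument such as SPEAR, wavelength follows directly from arrival time over the known moderator-to-detector distance via lambda = h*t/(m_n*L). The sketch below performs that conversion; the 10 m flight-path length is a hypothetical round number used for illustration, not the actual SPEAR geometry.

      H_PLANCK = 6.62607015e-34       # J s
      M_NEUTRON = 1.67492749804e-27   # kg

      def tof_to_wavelength_angstrom(tof_s, flight_path_m):
          """Convert neutron time of flight to wavelength in angstroms: lambda = h*t / (m_n*L)."""
          return H_PLANCK * tof_s / (M_NEUTRON * flight_path_m) * 1e10

      L = 10.0   # hypothetical moderator-to-detector distance, metres (not the real SPEAR value)
      for tof_ms in (11.4, 40.4):
          lam = tof_to_wavelength_angstrom(tof_ms * 1e-3, L)
          print(f"t = {tof_ms:5.1f} ms  ->  lambda ~ {lam:4.1f} angstrom")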

  19. Climate Change and the Los Alamos National Laboratory. The Adaptation Challenge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fowler, Kimberly M.; Hjeresen, Dennis; Silverman, Josh

    2015-02-01

    The Los Alamos National Laboratory (LANL) has been adapting to climate-change-related impacts that have been occurring on decadal time scales. The region where LANL is located has been subject to a cascade of climate-related impacts: drought, devastating wildfires, and historic flooding events. Instead of buckling under the pressure, LANL and the surrounding communities have integrated climate change mitigation strategies into their daily operations and long-term plans by increasing coordination and communication between the Federal, State, and local agencies in the region, identifying and aggressively managing forested areas in need of near-term attention, addressing flood control and retention issues, and more.

  20. New estimation method of neutron skyshine for a high-energy particle accelerator

    NASA Astrophysics Data System (ADS)

    Oh, Joo-Hee; Jung, Nam-Suk; Lee, Hee-Seock; Ko, Seung-Kook

    2016-09-01

    Skyshine is the dominant component of the prompt radiation dose at off-site locations. Several experimental studies have been done to estimate the neutron skyshine at a few accelerator facilities. In this work, the neutron transport from the source to off-site locations was simulated using the Monte Carlo codes FLUKA and PHITS. The transport paths were classified as skyshine, direct (transport), groundshine and multiple-shine to understand the contribution of each path and to develop a general evaluation method. The effect of each path was estimated in terms of the dose at distant locations. The neutron dose was calculated using the neutron energy spectra obtained from detectors placed up to a maximum of 1 km from the accelerator. The highest altitude of the sky region in this simulation was set at 2 km above the floor of the accelerator facility. The initial model of this study was the 10 GeV electron accelerator PAL-XFEL. Different compositions and densities of air, soil and ordinary concrete were applied in this calculation, and their dependences were reviewed. The estimation method used in this study was compared with the well-known methods suggested by Rindi, Stevenson and Stapleton, and also with the simple code SHINE3. The results obtained using this method agreed well with those using Rindi's formula.

  1. Analysis of secondary particle behavior in multiaperture, multigrid accelerator for the ITER neutral beam injector.

    PubMed

    Mizuno, T; Taniguchi, M; Kashiwagi, M; Umeda, N; Tobari, H; Watanabe, K; Dairaku, M; Sakamoto, K; Inoue, T

    2010-02-01

    The heat load on acceleration grids from secondary particles such as electrons, neutrals, and positive ions is a key issue for long-pulse acceleration of negative ion beams. The complicated behavior of the secondary particles in a multiaperture, multigrid (MAMuG) accelerator has been analyzed using an electrostatic accelerator Monte Carlo code. The analytical result is compared with an experimental one obtained in long-pulse operation of a MeV accelerator from which the second acceleration grid (A2G) was removed to simplify the structure. The analytical results show a relatively high heat load on the third acceleration grid (A3G), since stripped electrons are deposited mainly on A3G. This heat load on the A3G can be suppressed by installing the A2G. Thus, the capability of the MAMuG accelerator to suppress the heat load due to secondary particles by means of the intermediate grids is demonstrated.

  2. Symplectic orbit and spin tracking code for all-electric storage rings

    DOE PAGES

    Talman, Richard M.; Talman, John D.

    2015-07-22

    Proposed methods for measuring the electric dipole moment (EDM) of the proton use an intense, polarized proton beam stored in an all-electric storage ring “trap.” At the “magic” kinetic energy of 232.792 MeV, proton spins are “frozen,” for example, always parallel to the instantaneous particle momentum. Energy deviation from the magic value causes in-plane precession of the spin relative to the momentum. Any nonzero EDM value will cause out-of-plane precession—measuring this precession is the basis for the EDM determination. A proposed implementation of this measurement shows that a proton EDM value of 10^-29 e-cm or greater will produce a statistically significant, measurable precession after multiply repeated runs, assuming small beam depolarization during 1000 s runs, with high enough precision to test models of the early universe developed to account for the present-day particle/antiparticle population imbalance. This paper describes an accelerator simulation code, eteapot, a new component of the Unified Accelerator Libraries (ual), to be used for long-term tracking of particle orbits and spins in electric bend accelerators, in order to simulate EDM storage ring experiments. Though qualitatively much like magnetic rings, the nonconstant particle velocity in electric rings gives them significantly different properties, especially in weak focusing rings. Like the earlier code teapot (for magnetic ring simulation) this code performs exact tracking in an idealized (approximate) lattice rather than the more conventional approach, which is approximate tracking in a more nearly exact lattice. The Bargmann-Michel-Telegdi (BMT) equation describing the evolution of spin vectors through idealized bend elements is also solved exactly—original to this paper. Furthermore, the idealization permits the code to be exactly symplectic (with no artificial “symplectification”). Any residual spurious damping or antidamping is sufficiently small to permit

  3. Model-independent particle accelerator tuning

    DOE PAGES

    Scheinker, Alexander; Pang, Xiaoying; Rybarcyk, Larry

    2013-10-21

    We present a new model-independent dynamic feedback technique, rotation rate tuning, for automatically and simultaneously tuning coupled components of uncertain, complex systems. The main advantages of the method are: 1) it can handle unknown, time-varying systems, 2) it gives known bounds on parameter update rates, 3) we give an analytic proof of its convergence and stability, and 4) it has a simple digital implementation through a control system such as the Experimental Physics and Industrial Control System (EPICS). Because this technique is model independent, it may be useful as a real-time, in-hardware, feedback-based optimization scheme for uncertain and time-varying systems. In particular, it is robust enough to handle uncertainty due to coupling, thermal cycling, misalignments, and manufacturing imperfections. As a result, it may be used as a fine-tuning supplement for existing accelerator tuning/control schemes. We present multi-particle simulation results demonstrating the scheme's ability to simultaneously and adaptively adjust the set points of twenty-two quadrupole magnets and two RF buncher cavities in the Los Alamos Neutron Science Center Linear Accelerator's transport region, while the beam properties and RF phase shift are continuously varying. The tuning is based only on beam current readings, without knowledge of particle dynamics. We also present an outline of how to implement this general scheme in software for optimization, and in hardware for feedback-based control/tuning, for a wide range of systems.
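
    The sketch below conveys the flavor of model-independent, feedback-based tuning: each parameter is perturbed with a distinct dither frequency and nudged using only a scalar performance reading. It is a generic extremum-seeking-style illustration with made-up gains and a toy cost function; it is not the authors' rotation rate tuning algorithm or its EPICS implementation.

      import numpy as np

      def measured_loss(settings):
          """Stand-in for a beam-based scalar reading (distance from an optimum the tuner cannot see)."""
          return float(np.sum((settings - np.array([0.3, -1.2, 0.7])) ** 2))

      def dither_tune(x0, n_steps=4000, dt=0.01, gain=10.0, amp=0.2):
          """Generic dither-based (extremum-seeking style) tuner using only scalar readings."""
          x = np.array(x0, dtype=float)
          omegas = 2.0 * np.pi * np.array([7.0, 11.0, 13.0])   # one dither frequency per knob
          for k in range(n_steps):
              t = k * dt
              probe = x + amp * np.sin(omegas * t)             # perturbed settings actually applied
              y = measured_loss(probe)                         # the only feedback available
              x -= gain * dt * amp * y * np.sin(omegas * t)    # demodulate and descend
          return x

      print("tuned settings:", np.round(dither_tune([0.0, 0.0, 0.0]), 2))   # approaches [0.3, -1.2, 0.7]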

  4. GPU accelerated cell-based adaptive mesh refinement on unstructured quadrilateral grid

    NASA Astrophysics Data System (ADS)

    Luo, Xisheng; Wang, Luying; Ran, Wei; Qin, Fenghua

    2016-10-01

    A GPU accelerated inviscid flow solver is developed on an unstructured quadrilateral grid in the present work. For the first time, the cell-based adaptive mesh refinement (AMR) is fully implemented on GPU for the unstructured quadrilateral grid, which greatly reduces the frequency of data exchange between GPU and CPU. Specifically, the AMR is processed with atomic operations to parallelize list operations, and null memory recycling is realized to improve the efficiency of memory utilization. It is found that results obtained by GPUs agree very well with the exact or experimental results in literature. An acceleration ratio of 4 is obtained between the parallel code running on the old GPU GT9800 and the serial code running on E3-1230 V2. With the optimization of configuring a larger L1 cache and adopting Shared Memory based atomic operations on the newer GPU C2050, an acceleration ratio of 20 is achieved. The parallelized cell-based AMR processes have achieved 2x speedup on GT9800 and 18x on Tesla C2050, which demonstrates that parallel running of the cell-based AMR method on GPU is feasible and efficient. Our results also indicate that the new development of GPU architecture benefits the fluid dynamics computing significantly.

  5. Embedded Streaming Deep Neural Networks Accelerator With Applications.

    PubMed

    Dundar, Aysegul; Jin, Jonghoon; Martini, Berin; Culurciello, Eugenio

    2017-07-01

    Deep convolutional neural networks (DCNNs) have become a very powerful tool in visual perception. DCNNs have applications in autonomous robots, security systems, mobile phones, and automobiles, where high throughput of the feedforward evaluation phase and power efficiency are important. Because of this increased usage, many field-programmable gate array (FPGA)-based accelerators have been proposed. In this paper, we present an optimized streaming method for DCNNs' hardware accelerator on an embedded platform. The streaming method acts as a compiler, transforming a high-level representation of DCNNs into operation codes to execute applications in a hardware accelerator. The proposed method utilizes maximum computational resources available based on a novel-scheduled routing topology that combines data reuse and data concatenation. It is tested with a hardware accelerator implemented on the Xilinx Kintex-7 XC7K325T FPGA. The system fully explores weight-level and node-level parallelizations of DCNNs and achieves a peak performance of 247 G-ops while consuming less than 4 W of power. We test our system with applications on object classification and object detection in real-world scenarios. Our results indicate high-performance efficiency, outperforming all other presented platforms while running these applications.

  6. Los Alamos nEDM Experiment and Demonstration of Ramsey's Method on Stored UCNs at the LANL UCN Source

    NASA Astrophysics Data System (ADS)

    Clayton, Steven; Chupp, Tim; Cude-Woods, Christopher; Currie, Scott; Ito, Takeyasu; Liu, Chen-Yu; Long, Joshua; MacDonald, Stephen; Makela, Mark; O'Shaughnessy, Christopher; Plaster, Brad; Ramsey, John; Saunders, Andy; LANL nEDM Collaboration

    2017-09-01

    The Los Alamos National Laboratory ultracold neutron (UCN) source was recently upgraded for a factor of 5 improvement in stored density, providing the statistical precision needed for a room-temperature neutron electric dipole moment (nEDM) measurement with a sensitivity of 3×10^-27 e-cm, a factor of 10 better than the limit set by the Sussex-RAL-ILL experiment. Here, we show results of a demonstration of Ramsey's separated oscillatory fields method on stored UCNs at the LANL UCN source and in a geometry relevant for an nEDM measurement. We argue that a world-leading nEDM experiment could be performed at LANL with existing technology and a short lead time, providing a physics result with sensitivity intermediate between the current limit set by Sussex-RAL-ILL and the anticipated limit from the complex, cryogenic nEDM experiment planned for the next decade at the ORNL Spallation Neutron Source (SNS-nEDM). This work was supported by the Los Alamos LDRD Program, Project 20140015DR.

  7. High spatial resolution measurements in a single stage ram accelerator

    NASA Technical Reports Server (NTRS)

    Hinkey, J. B.; Burnham, E. A.; Bruckner, A. P.

    1992-01-01

    High spatial resolution experimental tube wall pressure measurements of ram accelerator gas dynamic phenomena are presented in this paper. The ram accelerator is a ramjet-in-tube device which operates in a manner similar to that of a conventional ramjet. The projectile resembles the centerbody of a ramjet and travels supersonically through a tube filled with a combustible gaseous mixture, with the tube acting as the outer cowling. Pressure data are recorded as the projectile passes by sensors mounted in the tube wall at various locations along the tube. Utilization of special highly instrumented sections of tube has allowed the recording of gas dynamic phenomena with high resolution. High spatial resolution tube wall pressure data from the three regimes of propulsion studied to date (subdetonative, transdetonative, and superdetonative) in a single-stage gas mixture are presented and reveal the three-dimensional character of the flow field induced by the projectile fins and by the canting of the fins and the projectile body relative to the tube wall. Also presented for comparison with the experimental data are calculations made with an inviscid, three-dimensional CFD code. The knowledge gained from these experiments and simulations is useful in understanding the underlying nature of ram accelerator propulsive regimes, as well as in validating three-dimensional CFD codes which model unsteady, chemically reactive flows.

  8. Availability of environmental radioactivity to honey bee colonies at Los Alamos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hakonson, T.E.; Bostick, K.V.

    Data are presented on the availability of tritium, cesium-137, and plutonium to honey bee colonies foraging in the environment surrounding the Los Alamos Scientific Laboratory. Sources of these radionuclides in the laboratory environs include liquid and atmospheric effluents and buried solid waste. Honey bee colonies were placed in three canyon liquid waste disposal areas and were sampled frequently, along with honey, surface water, and surrounding vegetation, to qualitatively determine the availability of these radionuclides to bees (Apis mellifera) and to identify potential food chain sources of the elements. Tritium concentrations in bee and honey samples from the canyons increased rapidly from initial values of <1 pCi/ml moisture to as much as 9.2 nCi/ml in 75 days after placement of the hives in the canyons. Seasonal patterns in foraging activities as influenced by weather and food availability were apparent in the data. It appears that several sources of tritium were utilized by the colonies, including surface water in the canyons and vegetation receiving tritium from atmospheric effluents and buried solid waste. Concentrations of cesium-137 and plutonium were generally low or undetectable in bees throughout the study. However, levels of both nuclides increased by factors of 10 to 20 in bees from two of the canyon study areas during a 3-month period in 1973. It was speculated that the liquid effluents in the two canyons were the source of the increased concentrations in bee samples, since this water was the only significant source of 137Cs in the environs. The existence of at least three radionuclide sources in the Los Alamos Scientific Laboratory (LASL) environs complicates the interpretation of the data. However, it is apparent that honey bees can acquire 3H, 137Cs, and Pu from multiple sources in the environs.

  9. 2015 Los Alamos Space Weather Summer School Research Reports

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cowee, Misa; Chen, Yuxi; Desai, Ravindra

    The fifth Los Alamos Space Weather Summer School was held June 1st - July 24th, 2015, at Los Alamos National Laboratory (LANL). With renewed support from the Institute of Geophysics, Planetary Physics, and Signatures (IGPPS) and additional support from the National Aeronautics and Space Administration (NASA) and the Department of Energy (DOE) Office of Science, we hosted a new class of five students from various U.S. and foreign research institutions. The summer school curriculum includes a series of structured lectures as well as mentored research and practicum opportunities. Lectures covering general and specialized topics in the field of space weather were given by a number of researchers affiliated with LANL. Students were given the opportunity to engage in research projects through a mentored practicum experience. Each student works with one or more LANL-affiliated mentors to execute a collaborative research project, typically linked with a larger ongoing research effort at LANL and/or the student’s PhD thesis research. This model provides a valuable learning experience for the student while developing the opportunity for future collaboration. This report includes a summary of the research efforts fostered and facilitated by the Space Weather Summer School. These reports should be viewed as work in progress, as the short session typically only offers sufficient time for preliminary results. At the close of the summer school session, students present a summary of their research efforts. Titles of the papers included in this report are as follows: Full particle-in-cell (PIC) simulation of whistler wave generation, Hybrid simulations of the right-hand ion cyclotron anisotropy instability in a sub-Alfvénic plasma flow, A statistical ensemble for solar wind measurements, Observations and models of substorm injection dispersion patterns, Heavy ion effects on Kelvin-Helmholtz instability: hybrid study, Simulating plasmaspheric electron densities with

  10. Study of the transverse beam motion in the DARHT Phase II accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yu-Jiuan; Fawley, W M; Houck, T L

    1998-08-20

    The accelerator for the second axis of the Dual Axis Radiographic Hydrodynamic Test (DARHT) facility will accelerate a 4-kA, 3-MeV, 2-µs-long electron current pulse to 20 MeV. The energy variation of the beam within the flat-top portion of the current pulse is ±0.5%. The performance of the DARHT Phase II radiographic machine requires the transverse beam motion to be much less than the beam spot size, which is about 1.5 mm in diameter on the x-ray converter. In general, the leading causes of transverse beam motion in an accelerator are the beam breakup instability (BBU) and corkscrew motion. We have modeled the transverse beam motion in the DARHT Phase II accelerator with various magnetic tunes and accelerator cell configurations by using the BREAKUP code. The predicted sensitivity of corkscrew motion and BBU growth to different tuning algorithms will be presented.

  11. Distribution of the background gas in the MITICA accelerator

    NASA Astrophysics Data System (ADS)

    Sartori, E.; Dal Bello, S.; Serianni, G.; Sonato, P.

    2013-02-01

    MITICA is the ITER neutral beam test facility to be built in Padova for the generation of a 40 A D- ion beam with a 16×5×16 array of 1280 beamlets accelerated to 1 MV. The background gas pressure distribution and the particle flows inside the MITICA accelerator are critical aspects for stripping losses, generation of secondary particles and beam non-uniformities. To keep the stripping losses in the extraction and acceleration stages reasonably low, the source pressure should be 0.3 Pa or less. The gas flow in the MITICA accelerator is being studied using a 3D finite element code named Avocado. The gas-wall interaction model is based on the cosine law, and the whole vacuum system geometry is represented by a view factor matrix based on surface discretization and gas property definitions. Pressure distribution and mutual fluxes are then solved linearly. In this paper the result of a numerical simulation is presented, showing the steady-state pressure distribution inside the accelerator when gas enters the system at room temperature. The accelerator model is limited to a horizontal slice 400 mm high (1/4 of the accelerator height). The pressure profile at the solid walls and along the beamlet axis is obtained, allowing evaluation and discussion of the background gas distribution and non-uniformity. The particle flux at the inlet and outlet boundaries (namely the grounded grid apertures and the lateral conductances, respectively) will be discussed.
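
    The "solved linearly" step mentioned above can be pictured as a balance between the gas injected at each surface element and the flux it exchanges with every element that sees it. The sketch below solves such a balance for a hypothetical four-element geometry with a made-up view-factor matrix; it is unrelated to the actual Avocado model or the MITICA geometry.

      import numpy as np

      # Hypothetical view-factor matrix: F[i, j] is the fraction of gas leaving element i whose
      # next hit is element j; any row deficit (1 - row sum) escapes through the pumping aperture.
      F = np.array([
          [0.0, 0.5, 0.3, 0.1],
          [0.4, 0.0, 0.4, 0.1],
          [0.3, 0.5, 0.0, 0.1],
          [0.2, 0.3, 0.3, 0.0],
      ])
      source = np.array([1.0, 0.0, 0.0, 0.0])   # gas injected at element 0, arbitrary units

      # Steady state: flux leaving element j = injected source + flux re-emitted after arriving there,
      # i.e. q = source + F^T q, so (I - F^T) q = source.
      q = np.linalg.solve(np.eye(4) - F.T, source)
      print("outgoing flux per element:", np.round(q, 3))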

  12. Acceleration modules in linear induction accelerators

    NASA Astrophysics Data System (ADS)

    Wang, Shao-Heng; Deng, Jian-Jun

    2014-05-01

    The Linear Induction Accelerator (LIA) is a unique type of accelerator that is capable of accelerating kilo-ampere charged-particle currents to energies of tens of MeV. The recent development of LIAs operating in MHz burst mode and their successful application to a synchrotron have broadened the scope of LIA use. Although the transformer model is widely used to explain the acceleration mechanism of LIAs, for many modern LIAs it is not appropriate to regard the induction electric field as the field that accelerates the charged particles. We have examined the transition of the magnetic cores' functions during the evolution of LIA acceleration modules, distinguished transformer-type and transmission-line-type LIA acceleration modules, and reconsidered several related issues based on the transmission-line-type LIA acceleration module. This clarified understanding should help in the further development and design of LIA acceleration modules.

  13. Evaluation of aircraft crash hazard at Los Alamos National Laboratory facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Selvage, R.D.

    This report selects a method for use in calculating the frequency of an aircraft crash occurring at selected facilities at the Los Alamos National Laboratory (the Laboratory). The Solomon method was chosen to determine these probabilities. Each variable in the Solomon method is defined and a value for each variable is selected for fourteen facilities at the Laboratory. These values and calculated probabilities are to be used in all safety analysis reports and hazards analyses for the facilities addressed in this report. This report also gives detailed directions for performing aircraft-crash frequency calculations for other facilities. This will ensure that future aircraft-crash frequency calculations are consistent with calculations in this report.
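
    Purely as an illustration of how a facility crash-frequency estimate of this general kind is assembled, the sketch below sums, over hypothetical flight categories, the product of annual operations, a per-operation crash rate, a crash-location probability per unit area, and an effective facility area. All names and numbers are placeholders; this is not the Solomon method itself, whose variables and values are defined in the report.

      # Hypothetical flight categories: (annual operations, crash rate per operation,
      # crash-location probability per square mile, effective facility area in square miles).
      categories = {
          "general_aviation": (20000, 2.0e-5, 1.0e-3, 0.01),
          "commercial":       (5000,  5.0e-7, 5.0e-4, 0.01),
          "military":         (1000,  1.0e-5, 2.0e-3, 0.01),
      }

      def crash_frequency(cats):
          """Annual crash frequency at the facility: sum over categories of N * P * f * A."""
          return sum(n * p * f * a for n, p, f, a in cats.values())

      print(f"estimated crash frequency ~ {crash_frequency(categories):.2e} per year")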

  14. User input verification and test driven development in the NJOY21 nuclear data processing code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trainer, Amelia Jo; Conlin, Jeremy Lloyd; McCartney, Austin Paul

    Before physically meaningful data can be used in nuclear simulation codes, the data must be interpreted and manipulated by a nuclear data processing code so as to extract the relevant quantities (e.g. cross sections and angular distributions). Perhaps the most popular and widely trusted of these processing codes is NJOY, which has been developed and improved over the course of 10 major releases since its creation at Los Alamos National Laboratory in the mid-1970s. The current phase of NJOY development is the creation of NJOY21, which will be a vast improvement over its predecessor, NJOY2016. Designed to be fast, intuitive, accessible, and capable of handling both established and modern formats of nuclear data, NJOY21 will address many issues that NJOY users face, while remaining functional for those who prefer the existing format. Although still early in its development, NJOY21 already provides input validation to check user input. By providing rapid and helpful responses to users while they write input files, NJOY21 will prove to be more intuitive and easier to use than any of its predecessors. Furthermore, during its development, NJOY21 is subject to regular testing, such that its test coverage must strictly increase with the addition of any production code. This thorough testing will allow developers and NJOY users to establish confidence in NJOY21 as it gains functionality. This document discusses the current state of input checking and testing practices in NJOY21.
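
    In the spirit of the input checking and test-driven development described above, the sketch below pairs a tiny validator with a unit test that must keep passing as functionality grows. The card layout and field rules are invented for illustration and are not NJOY21's actual input schema.

      import unittest

      def validate_card(card):
          """Return a list of human-readable problems with a (hypothetical) input card."""
          problems = []
          if "material" not in card:
              problems.append("missing required field 'material'")
          elif not isinstance(card["material"], int) or card["material"] <= 0:
              problems.append("'material' must be a positive integer")
          if any(t <= 0 for t in card.get("temperatures", [])):
              problems.append("'temperatures' must all be positive (kelvin)")
          return problems

      class TestValidateCard(unittest.TestCase):
          def test_good_card_passes(self):
              self.assertEqual(validate_card({"material": 9228, "temperatures": [293.6]}), [])

          def test_bad_temperature_is_reported(self):
              problems = validate_card({"material": 9228, "temperatures": [-1.0]})
              self.assertTrue(any("temperatures" in p for p in problems))

      if __name__ == "__main__":
          unittest.main()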

  15. Understanding large SEP events with the PATH code: Modeling of the 13 December 2006 SEP event

    NASA Astrophysics Data System (ADS)

    Verkhoglyadova, O. P.; Li, G.; Zank, G. P.; Hu, Q.; Cohen, C. M. S.; Mewaldt, R. A.; Mason, G. M.; Haggerty, D. K.; von Rosenvinge, T. T.; Looper, M. D.

    2010-12-01

    The Particle Acceleration and Transport in the Heliosphere (PATH) numerical code was developed to understand solar energetic particle (SEP) events in the near-Earth environment. We discuss simulation results for the 13 December 2006 SEP event. The PATH code includes modeling a background solar wind through which a CME-driven oblique shock propagates. The code incorporates a mixed population of both flare and shock-accelerated solar wind suprathermal particles. The shock parameters derived from ACE measurements at 1 AU and observational flare characteristics are used as input into the numerical model. We assume that the diffusive shock acceleration mechanism is responsible for particle energization. We model the subsequent transport of particles originating at the flare site and particles escaping from the shock and propagating in the equatorial plane through the interplanetary medium. We derive spectra for protons, oxygen, and iron ions, together with their time-intensity profiles at 1 AU. Our modeling results show reasonable agreement with in situ measurements by ACE, STEREO, GOES, and SAMPEX for this event. We numerically estimate the Fe/O abundance ratio and discuss the physics underlying a mixed SEP event. We point out that the flare population is as important as shock geometry changes during shock propagation for modeling time-intensity profiles and spectra at 1 AU. The combined effects of seed population and shock geometry will be examined in the framework of an extended PATH code in future modeling efforts.
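
    For the diffusive shock acceleration mechanism assumed above, the steady-state test-particle result ties the accelerated-particle power law to the shock compression ratio: f(p) ~ p^(-q) with q = 3r/(r-1). The snippet below simply evaluates that textbook relation; it is not output of, or code from, the PATH model.

      def dsa_index(compression_ratio):
          """Power-law index q of the phase-space distribution f(p) ~ p^-q from
          steady-state diffusive shock acceleration: q = 3r / (r - 1)."""
          r = compression_ratio
          if r <= 1.0:
              raise ValueError("compression ratio must exceed 1")
          return 3.0 * r / (r - 1.0)

      # A strong non-relativistic shock has r -> 4 and hence q -> 4.
      for r in (2.5, 3.0, 4.0):
          print(f"r = {r:.1f}  ->  q = {dsa_index(r):.2f}")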

  16. Forward and adjoint spectral-element simulations of seismic wave propagation using hardware accelerators

    NASA Astrophysics Data System (ADS)

    Peter, Daniel; Videau, Brice; Pouget, Kevin; Komatitsch, Dimitri

    2015-04-01

    Improving the resolution of tomographic images is crucial to answering important questions about the nature of Earth's subsurface structure and internal processes. Seismic tomography is the most prominent approach, in which seismic signals from ground-motion records are used to infer physical properties of internal structures such as compressional- and shear-wave speeds, anisotropy and attenuation. Recent advances in regional- and global-scale seismic inversions move towards full-waveform inversions, which require accurate simulations of seismic wave propagation in complex 3D media, providing access to the full 3D seismic wavefields. However, these numerical simulations are computationally very expensive and need high-performance computing (HPC) facilities for further improving the current state of knowledge. During recent years, many-core architectures such as graphics processing units (GPUs) have been added to available large HPC systems. Such GPU-accelerated computing, together with advances in multi-core central processing units (CPUs), can greatly accelerate scientific applications. There are two main choices of language support for GPU cards: the CUDA programming environment and the OpenCL language standard. CUDA software development targets NVIDIA graphics cards, while OpenCL has been adopted mainly for AMD graphics cards. In order to employ such hardware accelerators for seismic wave propagation simulations, we incorporated the code generation tool BOAST into the existing spectral-element code package SPECFEM3D_GLOBE. This allows us to use meta-programming of computational kernels and generate optimized source code for both CUDA and OpenCL languages, running simulations on either CUDA or OpenCL hardware accelerators. We show here applications of forward and adjoint seismic wave propagation on CUDA/OpenCL GPUs, validating results and comparing performance for different simulations and hardware usage.
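
    The sketch below illustrates the general meta-programming idea of emitting either CUDA or OpenCL source from one kernel description. It is a toy string-template generator written for this summary; it is not the BOAST tool or its API, and the kernel is a placeholder.

      KERNEL_BODY = "out[i] = a[i] + b[i];"   # toy elementwise kernel body shared by both targets

      def generate_kernel(target, name="vec_add"):
          """Emit CUDA or OpenCL source text for the same elementwise kernel body."""
          if target == "cuda":
              head = f"__global__ void {name}(const float* a, const float* b, float* out, int n)"
              index = "int i = blockIdx.x * blockDim.x + threadIdx.x;"
          elif target == "opencl":
              head = (f"__kernel void {name}(__global const float* a, __global const float* b, "
                      "__global float* out, int n)")
              index = "int i = get_global_id(0);"
          else:
              raise ValueError("target must be 'cuda' or 'opencl'")
          return f"{head} {{\n  {index}\n  if (i < n) {KERNEL_BODY}\n}}\n"

      print(generate_kernel("cuda"))
      print(generate_kernel("opencl"))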

  17. Environmental surveillance and compliance at Los Alamos during 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-09-01

    This report presents environmental data that characterize environmental performance and addresses compliance with environmental standards and requirements at Los Alamos National Laboratory (LANL or the Laboratory) during 1996. The Laboratory routinely monitors for radiation and for radioactive and nonradioactive materials at Laboratory sites as well as in the surrounding region. LANL uses the monitoring results to determine compliance with appropriate standards and to identify potentially undesirable trends. Data were collected in 1996 to assess external penetrating radiation; quantities of airborne emissions; and concentrations of chemicals and radionuclides in ambient air, surface waters and groundwaters, the municipal water supply, soils and sediments, and foodstuffs. Using comparisons with standards and regulations, this report concludes that environmental effects from Laboratory operations are small and do not pose a demonstrable threat to the public, Laboratory employees, or the environment. Laboratory operations were in compliance with all major environmental regulations.

  18. Los Alamos National Laboratory Science Education Programs. Progress report, October 1, 1994--December 31, 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gill, D.H.

    During the 1994 summer institute, NTEP teachers worked in coordination with LANL, the Los Alamos Middle School, and Mountain Elementary School to gain experience in communicating on-line, in gathering information from the Internet, and in using electronic Bulletin Board Systems (BBSs) to exchange ideas with other teachers. To build on their telecommunications skills, NTEP teachers participated in the International Telecommunications In Education Conference (Tel*ED '94) at the Albuquerque Convention Center on November 11 and 12, 1994. They attended the multimedia keynote address, various workshops highlighting many aspects of educational telecommunications skills, and the Telecomm Rodeo sponsored by Los Alamos National Laboratory. The Rodeo featured many presentations by Laboratory personnel and educational institutions on ways in which telecommunications technologies can be used in the classroom. Many were of the "hands-on" type, so that teachers were able to try out methods and equipment and evaluate their usefulness in their own schools and classrooms. Some of the presentations featured were the Geonet educational BBS system, the Supercomputing Challenge, and the Sunrise Project, all sponsored by LANL; the "CU-seeMe" live video software; various simulation software packages; networking help; and many other interesting and useful exhibits.

  19. Beam Loss Measurements at the Los Alamos Proton Storage Ring

    NASA Astrophysics Data System (ADS)

    Spickermann, Thomas

    2005-06-01

    During normal operation the Los Alamos Proton Storage Ring (PSR) accumulates up to 4×10^13 protons over 625 μs with a repetition rate of 20 Hz, corresponding to a current of 125 μA delivered to the Lujan Neutron Science Center. Beam losses in the ring as well as in the extraction beam line, and the subsequent activation of material, are a limiting factor at these currents. Careful tuning of the injection, ring, and extraction line is paramount to limiting losses to acceptable levels. Losses are typically not uniform around the ring, but occur at significantly higher levels in certain "hot spots". Here I will report on losses related to the stripper foil, which are the dominant source of losses in the ring. First results of a comparison with simulations will also be presented.

  20. Neutron Capture Experiments Using the DANCE Array at Los Alamos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dashdorj, D.; MonAme Scientific Research Center, Ulaanbaatar; Mitchell, G. E.

    2009-03-31

    The Detector for Advanced Neutron Capture Experiments (DANCE) is designed for neutron capture measurements on very small and/or radioactive targets. The DANCE array of 160 BaF2 scintillation detectors is located at the Lujan Center at the Los Alamos Neutron Science Center (LANSCE). Accurate measurements of neutron capture data are important for many current applications as well as for a basic understanding of neutron capture. The gamma rays following neutron capture reactions have been studied by the time-of-flight technique using the DANCE array. The high granularity of the array allows measurements of the gamma-ray multiplicity. The gamma-ray multiplicities and the energy spectra for different multiplicities can be measured and analyzed for spin and parity determination of the resolved resonances.

  1. The Los Alamos Scientific Laboratory - An Isolated Nuclear Research Establishment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradbury, Norris E.; Meade, Roger Allen

    Early in his twenty-five year career as the Director of the Los Alamos Scientific Laboratory, Norris Bradbury wrote at length about the atomic bomb and the many implications the bomb might have on the world. His themes were both technical and philosophical. In 1963, after nearly twenty years of leading the nation’s first nuclear weapons laboratory, Bradbury took the opportunity to broaden his writing. In a paper delivered to the International Atomic Energy Agency’s symposium on the “Criteria in the Selection of Sites for the Construction of Reactors and Nuclear Research Centers,” Bradbury took the opportunity to talk about the business of nuclear research and the human component of operating a scientific laboratory. This report is the transcript of his talk.

  2. Common ground: An environmental ethic for Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menlove, F.L.

    1991-01-01

    Three predominant philosophies have characterized American business ethical thinking over the past several decades. The first phase is the "ethics of self-interest," which argues that maximizing self-interest coincidentally maximizes the common good. The second phase is "legality ethics." Proponents argue that what is important is knowing the rules and following them scrupulously. The third phase might be called "stakeholder ethics." A central tenet is that everyone affected by a decision has a moral hold on the decision maker. This paper discusses one recent initiative of the Los Alamos National Laboratory to move beyond rules and regulations toward an environmental ethic that integrates the values of "stakeholder ethics" into the Laboratory's historical culture and value systems. These Common Ground Principles are described. 11 refs.

  3. Inductive and electrostatic acceleration in relativistic jet-plasma interactions.

    PubMed

    Ng, Johnny S T; Noble, Robert J

    2006-03-24

    We report on the observation of rapid particle acceleration in numerical simulations of relativistic jet-plasma interactions and discuss the underlying mechanisms. The dynamics of a charge-neutral, narrow, electron-positron jet propagating through an unmagnetized electron-ion plasma was investigated using a three-dimensional, electromagnetic, particle-in-cell computer code. The interaction excited magnetic filamentation as well as electrostatic plasma instabilities. In some cases, the longitudinal electric fields generated inductively and electrostatically reached the cold plasma-wave-breaking limit, and the longitudinal momentum of about half the positrons increased by 50% with a maximum gain exceeding a factor of 2 during the simulation period. Particle acceleration via these mechanisms occurred when the criteria for Weibel instability were satisfied.

  4. Activation assessment of the soil around the ESS accelerator tunnel

    NASA Astrophysics Data System (ADS)

    Rakhno, I. L.; Mokhov, N. V.; Tropin, I. S.; Ene, D.

    2018-06-01

    Activation of the soil surrounding the ESS accelerator tunnel, calculated with the MARS15 code, is presented. A detailed composition of the soil, which comprises about 30 chemical elements, is considered. Spatial distributions of the produced activity are provided in both the transverse and longitudinal directions. A realistic irradiation profile for the entire planned lifetime of the facility is used. The nuclear transmutation and decay of the produced radionuclides are calculated with the DeTra code, a built-in tool of the MARS15 code. Radionuclide production by low-energy neutrons is calculated using the ENDF/B-VII evaluated nuclear data library. In order to estimate the quality of this activation assessment, a comparison between calculated and measured activation of various foils in a similar radiation environment is presented.

  5. Activation Assessment of the Soil Around the ESS Accelerator Tunnel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rakhno, I. L.; Mokhov, N. V.; Tropin, I. S.

    Activation of the soil surrounding the ESS accelerator tunnel, calculated with the MARS15 code, is presented. A detailed composition of the soil, which comprises about 30 different chemical elements, is considered. Spatial distributions of the produced activity are provided in both the transverse and longitudinal directions. A realistic irradiation profile for the entire planned lifetime of the facility is used. The nuclear transmutation and decay of the produced radionuclides are calculated with the DeTra code, a built-in tool of the MARS15 code. Radionuclide production by low-energy neutrons is calculated using the ENDF/B-VII evaluated nuclear data library. In order to estimate the quality of this activation assessment, a comparison between calculated and measured activation of various foils in a similar radiation environment is presented.

  6. Particle Acceleration, Magnetic Field Generation, and Emission in Relativistic Shocks

    NASA Technical Reports Server (NTRS)

    Nishikawa, Ken-IchiI.; Hededal, C.; Hardee, P.; Richardson, G.; Preece, R.; Sol, H.; Fishman, G.

    2004-01-01

    Shock acceleration is a ubiquitous phenomenon in astrophysical plasmas. Plasma waves and their associated instabilities (e.g., the Buneman instability, the two-stream instability, and the Weibel instability) created in the shocks are responsible for particle (electron, positron, and ion) acceleration. Using a 3-D relativistic electromagnetic particle (REMP) code, we have investigated particle acceleration associated with a relativistic jet front propagating through an ambient plasma with and without initial magnetic fields. We find only small differences in the results between no ambient and weak ambient parallel magnetic fields. Simulations show that the Weibel instability created in the collisionless shock front accelerates particles perpendicular and parallel to the jet propagation direction. New simulations with an ambient perpendicular magnetic field show the strong interaction between the relativistic jet and the magnetic fields. The magnetic fields are piled up by the jet and the jet electrons are bent, which creates currents and displacement currents. At the nonlinear stage, the magnetic fields are reversed by the current and reconnection may take place. Due to these dynamics the jet and ambient electrons are strongly accelerated in both the parallel and perpendicular directions.

  7. Rate heterogeneity in six protein-coding genes from the holoparasite Balanophora (Balanophoraceae) and other taxa of Santalales

    PubMed Central

    Su, Huei-Jiun; Hu, Jer-Ming

    2012-01-01

    Background and Aims The holoparasitic flowering plant Balanophora displays extreme floral reduction and was previously found to have enormous rate acceleration in the nuclear 18S rDNA region. So far, it remains unclear whether non-ribosomal, protein-coding genes of Balanophora also evolve in an accelerated fashion and whether the genes with high substitution rates retain their functionality. To tackle these issues, six different genes were sequenced from two Balanophora species and their rate variation and expression patterns were examined. Methods Sequences including nuclear PI, euAP3, TM6, LFY and RPB2 and mitochondrial matR were determined from two Balanophora spp. and compared with selected hemiparasitic species of Santalales and autotrophic core eudicots. Gene expression was detected for the six protein-coding genes and the expression patterns of the three B-class genes (PI, AP3 and TM6) were further examined across different organs of B. laxiflora using RT-PCR. Key Results Balanophora mitochondrial matR is highly accelerated in both nonsynonymous (dN) and synonymous (dS) substitution rates, whereas the rate variation of the nuclear genes LFY, PI, euAP3, TM6 and RPB2 is less dramatic. Significant dS increases were detected in Balanophora PI, TM6 and RPB2, and dN acceleration in euAP3. All of the protein-coding genes are expressed in inflorescences, indicative of their functionality. PI is restrictively expressed in tepals, synandria and floral bracts, whereas AP3 and TM6 are widely expressed in both male and female inflorescences. Conclusions Despite the observation that rates of sequence evolution are generally higher in Balanophora than in hemiparasitic species of Santalales and autotrophic core eudicots, the five nuclear protein-coding genes are functional and are evolving at a much slower rate than 18S rDNA. The mechanism or mechanisms responsible for rapid sequence evolution and concomitant rate acceleration for 18S rDNA and matR are currently not well

  8. Compiler-based code generation and autotuning for geometric multigrid on GPU-accelerated supercomputers

    DOE PAGES

    Basu, Protonu; Williams, Samuel; Van Straalen, Brian; ...

    2017-04-05

    GPUs, with their high bandwidths and computational capabilities, are an increasingly popular target for scientific computing. Unfortunately, to date, harnessing the power of the GPU has required use of a GPU-specific programming model like CUDA, OpenCL, or OpenACC. Thus, in order to deliver portability across CPU-based and GPU-accelerated supercomputers, programmers are forced to write and maintain two versions of their applications or frameworks. In this paper, we explore the use of a compiler-based autotuning framework based on CUDA-CHiLL to deliver not only portability, but also performance portability across CPU- and GPU-accelerated platforms for the geometric multigrid linear solvers found in many scientific applications. We also show that with autotuning we can attain near Roofline (a performance bound for a computation and target architecture) performance across the key operations in the miniGMG benchmark for both CPU- and GPU-based architectures, as well as for multiple stencil discretizations and smoothers. We show that our technology is readily interoperable with MPI, resulting in performance at scale equal to that obtained via a hand-optimized MPI+CUDA implementation.
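
    Empirical autotuning of the kind described here works by generating several code variants of the same operation, benchmarking them on the target machine, and selecting the fastest. CUDA-CHiLL does this at the compiler level for GPU kernels; the minimal Python sketch below, with two hypothetical variants of a Jacobi-style smoother for a Poisson-like problem (not the miniGMG kernels), only illustrates the benchmark-and-select loop.

      import time
      import numpy as np

      # Minimal benchmark-and-select autotuning loop: two candidate variants of
      # the same 5-point stencil smoother are timed and the faster one is chosen.

      def smooth_slices(u, rhs, h2):
          out = u.copy()
          out[1:-1, 1:-1] = 0.25 * (u[2:, 1:-1] + u[:-2, 1:-1] +
                                    u[1:-1, 2:] + u[1:-1, :-2] -
                                    h2 * rhs[1:-1, 1:-1])
          return out

      def smooth_roll(u, rhs, h2):
          lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                 np.roll(u, 1, 1) + np.roll(u, -1, 1))
          out = u.copy()
          out[1:-1, 1:-1] = 0.25 * (lap[1:-1, 1:-1] - h2 * rhs[1:-1, 1:-1])
          return out

      def autotune(candidates, u, rhs, h2, reps=10):
          best_fn, best_t = None, float("inf")
          for name, fn in candidates.items():
              t0 = time.perf_counter()
              for _ in range(reps):
                  fn(u, rhs, h2)
              elapsed = (time.perf_counter() - t0) / reps
              print(f"{name}: {elapsed * 1e3:.2f} ms per sweep")
              if elapsed < best_t:
                  best_fn, best_t = fn, elapsed
          return best_fn

      n = 512
      u = np.random.rand(n, n)
      rhs = np.random.rand(n, n)
      smoother = autotune({"slices": smooth_slices, "roll": smooth_roll},
                          u, rhs, h2=1.0 / n ** 2)
      u = smoother(u, rhs, 1.0 / n ** 2)   # use the selected variant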

  9. Compiler-based code generation and autotuning for geometric multigrid on GPU-accelerated supercomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Basu, Protonu; Williams, Samuel; Van Straalen, Brian

    GPUs, with their high bandwidths and computational capabilities, are an increasingly popular target for scientific computing. Unfortunately, to date, harnessing the power of the GPU has required use of a GPU-specific programming model like CUDA, OpenCL, or OpenACC. Thus, in order to deliver portability across CPU-based and GPU-accelerated supercomputers, programmers are forced to write and maintain two versions of their applications or frameworks. In this paper, we explore the use of a compiler-based autotuning framework based on CUDA-CHiLL to deliver not only portability, but also performance portability across CPU- and GPU-accelerated platforms for the geometric multigrid linear solvers found in many scientific applications. We also show that with autotuning we can attain near Roofline (a performance bound for a computation and target architecture) performance across the key operations in the miniGMG benchmark for both CPU- and GPU-based architectures, as well as for multiple stencil discretizations and smoothers. We show that our technology is readily interoperable with MPI, resulting in performance at scale equal to that obtained via a hand-optimized MPI+CUDA implementation.

  10. Validation of a Laser-Ray Package in an Eulerian Code

    NASA Astrophysics Data System (ADS)

    Bradley, Paul; Hall, Mike; McKenty, Patrick; Collins, Tim; Keller, David

    2014-10-01

    A laser-ray absorption package was recently installed in the RAGE code by the Laboratory for Laser Energetics (LLE). In this presentation, we describe our use of this package to model implosions of Omega 60-beam symmetric direct-drive capsules. The capsules have outer diameters of about 860 microns, CH plastic shell thicknesses between 8 and 32 microns, and DD or DT gas fills between 5 and 20 atmospheres, and are driven by a 1 ns square pulse of 23 to 27 kJ. These capsule implosions were previously modeled with a calibrated energy source in the outer layer of the capsule, where we matched bang time and burn ion temperature well, but the simulated yields were two to three times higher than the data. We will run simulations with laser-ray energy deposition for these experiments and compare the results to the yield and spectroscopic data. Work performed by Los Alamos National Laboratory under Contract DE-AC52-06NA25396 for the National Nuclear Security Administration of the U.S. Department of Energy.

  11. Los Alamos Shows Airport Security Technology at Work

    ScienceCinema

    Espy, Michelle; Schultz, Larry; Hunter, James

    2018-05-30

    Los Alamos scientists have advanced a Magnetic Resonance Imaging (MRI) technology that may provide a breakthrough for screening liquids at airport security. They've added low-power X-ray data to the mix, and as a result have unlocked a new detection technology. Funded in part by the Department of Homeland Security's Science and Technology Directorate, the new system is named MagRay. The goal is to quickly and accurately distinguish between liquids that visually appear identical. For example, what appears to be a bottle of white wine could potentially be nitromethane, a liquid that could be used to make an explosive. Both are clear liquids, one would be perfectly safe on a commercial aircraft, the other would be strictly prohibited. How to tell them apart quickly without error at an airport security area is the focus of Michelle Espy, Larry Schultz and their team. In this video, Espy and the MagRay team explain how the new technology works, how they've developed an easy operator interface, and what the next steps might be in transitioning this technology to the private sector.

  12. World's Largest Gold Crystal Studied at Los Alamos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vogel, Sven; Nakotte, Heinz

    2014-04-03

    When geologist John Rakovan needed better tools to investigate whether a dazzling 217.78-gram piece of gold was in fact the world's largest single-crystal specimen - a distinguishing factor that would not only drastically increase its market value but also provide a unique research opportunity - he traveled to Los Alamos National Laboratory's Lujan Neutron Scattering Center to peer deep inside the mineral using neutron diffractometry. Neutrons, different from other probes such as X-rays and electrons, are able to penetrate many centimeters deep into most materials. Revealing the inner structure of a crystal without destroying the sample - imperative, as this one is worth an estimated $1.5 million - would allow Rakovan and Lujan Center collaborators Sven Vogel and Heinz Nakotte to prove that this exquisite nugget, which seemed almost too perfect and too big to be real, was a single crystal and hence a creation of nature. Its owner, who lives in the United States, provided the samples to Rakovan to assess the crystallinity of four specimens, all of which had been found decades ago in Venezuela.

  13. World's Largest Gold Crystal Studied at Los Alamos

    ScienceCinema

    Vogel, Sven; Nakotte, Heinz

    2018-02-07

    When geologist John Rakovan needed better tools to investigate whether a dazzling 217.78-gram piece of gold was in fact the world's largest single-crystal specimen - a distinguishing factor that would not only drastically increase its market value but also provide a unique research opportunity - he traveled to Los Alamos National Laboratory's Lujan Neutron Scattering Center to peer deep inside the mineral using neutron diffractometry. Neutrons, different from other probes such as X-rays and electrons, are able to penetrate many centimeters deep into most materials. Revealing the inner structure of a crystal without destroying the sample - imperative, as this one is worth an estimated $1.5 million - would allow Rakovan and Lujan Center collaborators Sven Vogel and Heinz Nakotte to prove that this exquisite nugget, which seemed almost too perfect and too big to be real, was a single crystal and hence a creation of nature. Its owner, who lives in the United States, provided the samples to Rakovan to assess the crystallinity of four specimens, all of which had been found decades ago in Venezuela.

  14. Design of an electromagnetic accelerator for turbulent hydrodynamic mix studies

    NASA Astrophysics Data System (ADS)

    Susoeff, A. R.; Hawke, R. S.; Morrison, J. J.; Dimonte, G.; Remington, B. A.

    1993-12-01

    An electromagnetic accelerator in the form of a linear electric motor (LEM) has been designed to achieve controlled acceleration profiles of a carriage containing hydrodynamically unstable fluids for the investigation of the development of turbulent mix. The Rayleigh-Taylor instability is investigated by accelerating two dissimilar-density fluids with the LEM to achieve a wide variety of acceleration and deceleration profiles. The acceleration profiles are achieved by independent control of rail and augmentation currents. A variety of acceleration-time profiles are possible, including: (1) constant, (2) impulsive, and (3) shaped. The LEM and support structure are of robust design in order to withstand high loads with limited deflection and to mitigate operational vibration. Vibration of the carriage during acceleration could create artifacts in the data which would interfere with the intended study of the Rayleigh-Taylor instability. The design allows clear access for diagnostic techniques such as laser-induced fluorescence radiography, shadowgraphs, and particle imaging velocimetry. Electromagnetic modeling codes were used to optimize the rail and augmentation coil positions within the support structure framework. Results of contemporary studies of non-arcing sliding contact for solid armatures are used in the design of the driving armature and the dynamic electromagnetic braking system. A 0.6 MJ electrolytic capacitor bank is used for energy storage to drive the LEM. This report discusses a LEM design that will accelerate masses of up to 3 kg to a maximum of about 3000 g_0, where g_0 is the acceleration due to gravity.

  15. Acceleration of Monte Carlo simulation of photon migration in complex heterogeneous media using Intel many-integrated core architecture.

    PubMed

    Gorshkov, Anton V; Kirillin, Mikhail Yu

    2015-08-01

    Over the past two decades, the Monte Carlo technique has become a gold standard in simulation of light propagation in turbid media, including biotissues. Technological solutions provide further advances of this technique. The Intel Xeon Phi coprocessor is a new type of accelerator for highly parallel general-purpose computing, which allows execution of a wide range of applications without substantial code modification. We present a technical approach of porting our previously developed Monte Carlo (MC) code for simulation of light transport in tissues to the Intel Xeon Phi coprocessor. We show that employing the accelerator allows reducing the computational time of MC simulation and obtaining a simulation speed-up comparable to that of a GPU. We demonstrate the performance of the developed code for simulation of light transport in the human head and determination of the measurement volume in near-infrared spectroscopy brain sensing.
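
    The computational core of such a simulation is a photon-packet random walk whose independent trajectories parallelize naturally across coprocessor threads. The sketch below is a deliberately simplified, single-threaded Python version for a homogeneous slab with isotropic scattering; the optical properties, geometry, and weight cutoff are illustrative and not taken from the published code, which handles heterogeneous tissue and anisotropic scattering.

      import numpy as np

      # Minimal Monte Carlo photon-migration sketch: photon packets random-walk
      # through a homogeneous scattering/absorbing slab and the transmitted
      # weight is tallied.  Optical properties and geometry are illustrative.

      rng = np.random.default_rng(42)
      mu_a, mu_s = 0.1, 2.0            # absorption / scattering coefficients (1/mm)
      mu_t = mu_a + mu_s
      albedo = mu_s / mu_t
      thickness = 2.0                  # slab thickness (mm)
      n_photons = 20_000

      transmitted = 0.0
      for _ in range(n_photons):
          z, uz, weight = 0.0, 1.0, 1.0        # launch at the surface, heading inward
          while weight > 1e-3:
              step = -np.log(rng.random()) / mu_t    # sample free path length
              z += uz * step
              if z >= thickness:                     # escaped through the far side
                  transmitted += weight
                  break
              if z < 0.0:                            # back-scattered out of the slab
                  break
              weight *= albedo                       # part of the weight is absorbed
              uz = 2.0 * rng.random() - 1.0          # isotropic scattering direction

      print("transmitted fraction:", transmitted / n_photons)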

  16. Particle acceleration, magnetic field generation, and emission in relativistic pair jets

    NASA Technical Reports Server (NTRS)

    Nishikawa, K.-I.; Ramirez-Ruiz, E.; Hardee, P.; Hededal, C.; Kouveliotou, C.; Fishman, G. J.; Mizuno, Y.

    2005-01-01

    Shock acceleration is a ubiquitous phenomenon in astrophysical plasmas. Recent simulations show that the Weibel instability created by relativistic pair jets is responsible for particle (electron, positron, and ion) acceleration. Using a 3-D relativistic electromagnetic particle (REMP) code, we have investigated particle acceleration associated with a relativistic jet propagating through an ambient plasma with and without initial magnetic fields. The growth rates of the Weibel instability depend on the distribution of the pair jets. The Weibel instability created in the collisionless shock accelerates particles perpendicular and parallel to the jet propagation direction. This instability is also responsible for generating and amplifying highly nonuniform, small-scale magnetic fields, which contribute to the electrons' transverse deflection behind the jet head. The jitter radiation from deflected electrons has different properties than synchrotron radiation, which is calculated for a uniform magnetic field. This jitter radiation may be important to understanding the complex time evolution and/or spectral structure in gamma-ray bursts, relativistic jets, and supernova remnants.

  17. Cryogenic distribution box for Fermi National Accelerator Laboratory

    NASA Astrophysics Data System (ADS)

    Svehla, M. R.; Bonnema, E. C.; Cunningham, E. K.

    2017-12-01

    Meyer Tool & Mfg., Inc (Meyer Tool) of Oak Lawn, Illinois is manufacturing a cryogenic distribution box for Fermi National Accelerator Laboratory (FNAL). The distribution box will be used for the Muon-to-electron conversion (Mu2e) experiment. The box includes twenty-seven cryogenic valves, two heat exchangers, a thermal shield, and an internal nitrogen separator vessel, all contained within a six-foot diameter ASME coded vacuum vessel. This paper discusses the design and manufacturing processes that were implemented to meet the unique fabrication requirements of this distribution box. Design and manufacturing features discussed include: 1) Thermal strap design and fabrication, 2) Evolution of piping connections to heat exchangers, 3) Nitrogen phase separator design, 4) ASME code design of vacuum vessel, and 5) Cryogenic valve installation.

  18. Direct measurement of the image displacement instability in a linear induction accelerator

    NASA Astrophysics Data System (ADS)

    Burris-Mog, T. J.; Ekdahl, C. A.; Moir, D. C.

    2017-06-01

    The image displacement instability (IDI) has been measured on the 20 MeV Axis I of the Dual Axis Radiographic Hydrodynamic Test facility and compared to theory. A 0.23 kA electron beam was accelerated across 64 gaps in a low solenoid focusing field, and the position of the beam centroid was measured to 34.3 meters downstream from the cathode. One beam dynamics code was used to model the IDI from first principles, while another code characterized the effects of the resistive wall instability and the beam break-up (BBU) instability. Although the BBU instability was not found to influence the IDI, it appears that the IDI influences the BBU. Because BBU theory does not fully account for the dependence of the coupling to cavity transverse magnetic modes on beam position, the effect of the IDI is missing from that theory. This is of particular concern to users of linear induction accelerators operating at or near low magnetic guide-field tunes.

  19. GPU-accelerated phase-field simulation of dendritic solidification in a binary alloy

    NASA Astrophysics Data System (ADS)

    Yamanaka, Akinori; Aoki, Takayuki; Ogawa, Satoi; Takaki, Tomohiro

    2011-03-01

    The phase-field simulation for dendritic solidification of a binary alloy has been accelerated by using a graphics processing unit (GPU). To perform the phase-field simulation of the alloy solidification on a GPU, a program code was developed with the compute unified device architecture (CUDA). In this paper, the implementation technique of the phase-field model on the GPU is presented. We also evaluated the acceleration performance of the three-dimensional solidification simulation by using a single NVIDIA TESLA C1060 GPU and the developed program code. The results showed that the GPU calculation for 576^3 computational grid points achieved a performance of 170 GFLOPS by utilizing the shared memory as a software-managed cache. Furthermore, the computation with the GPU is 100 times faster than that with a single CPU core. From the obtained results, we confirmed the feasibility of realizing a real-time, fully three-dimensional phase-field simulation of microstructure evolution on a personal desktop computer.
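
    The workload that maps onto the GPU in such simulations is an explicit stencil update applied at every grid point each time step. As a rough, CPU-side illustration of that structure (not the binary-alloy dendrite model of the paper), the Python sketch below advances a toy Allen-Cahn-type phase field on a 2-D periodic grid; a CUDA implementation would assign one thread per grid point and stage neighbor values in shared memory, as described above.

      import numpy as np

      # Explicit time stepping of a toy Allen-Cahn-type phase-field model on a
      # 2-D periodic grid; this is the per-grid-point stencil workload that a
      # GPU kernel would parallelize.  Model and parameters are illustrative.

      n, dx, dt, eps2 = 256, 1.0, 0.1, 1.0
      rng = np.random.default_rng(0)
      phi = 0.1 * rng.random((n, n))                            # noisy initial field
      phi[n // 2 - 8:n // 2 + 8, n // 2 - 8:n // 2 + 8] = 1.0   # solid seed

      def laplacian(f):
          return (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                  np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4.0 * f) / dx ** 2

      for step in range(200):
          # dphi/dt = eps^2 * Laplacian(phi) - dW/dphi, with double well
          # W(phi) = phi^2 (1 - phi)^2 so that phi = 0 and phi = 1 are stable.
          drive = -2.0 * phi * (1.0 - phi) * (1.0 - 2.0 * phi)
          phi = phi + dt * (eps2 * laplacian(phi) + drive)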

  20. Maturation profile of inferior olivary neurons expressing ionotropic glutamate receptors in rats: role in coding linear accelerations.

    PubMed

    Li, Chuan; Han, Lei; Ma, Chun-Wai; Lai, Suk-King; Lai, Chun-Hong; Shum, Daisy Kwok Yan; Chan, Ying-Shing

    2013-07-01

    Using sinusoidal oscillations of linear acceleration along both the horizontal and vertical planes to stimulate otolith organs in the inner ear, we charted the postnatal time at which responsive neurons in the rat inferior olive (IO) first showed Fos expression, an indicator of neuronal recruitment into the otolith circuit. Neurons in the subnucleus dorsomedial cell column (DMCC) were activated by vertical stimulation as early as P9 and by horizontal (interaural) stimulation as early as P11. By P13, neurons in the β subnucleus of the IO (IOβ) became responsive to horizontal stimulation along the interaural and antero-posterior directions. By P21, neurons in the rostral IOβ also became responsive to vertical stimulation, but those in the caudal IOβ remained responsive only to horizontal stimulation. Nearly all functionally activated neurons in the DMCC and IOβ were immunopositive for the NR1 subunit of the NMDA receptor and the GluR2/3 subunit of the AMPA receptor. In situ hybridization studies further indicated abundant mRNA signals of the glutamate receptor subunits by the end of the second postnatal week. This is reinforced by whole-cell patch-clamp data in which glutamate receptor-mediated miniature excitatory postsynaptic currents of rostral IOβ neurons showed a postnatal increase in amplitude, reaching the adult level by P14. Further, these neurons exhibited subthreshold oscillations in membrane potential from P14 onward. Taken together, our results support the view that ionotropic glutamate receptors in the IO enable postnatal coding of gravity-related information and that the rostral IOβ is the only IO subnucleus that encodes spatial orientations in 3-D.

  1. Overview of Progress on the LANSCE Accelerator and Target Facilities Improvement Program

    NASA Astrophysics Data System (ADS)

    Macek, R. J.; Brun, T.; Donahue, J. B.; Fitzgerald, D. H.

    1997-05-01

    Three projects to improve the performance of the accelerator and target facilities for the Los Alamos Neutron Science Center have been initiated since 1994. The LANSCE Reliability Improvement Project was separated into two phases. Phase I, completed in 1995, was targeted at near-term improvements to beam availability that could be completed in a year. Phase II, now underway, consists of two projects: 1) converting beam injection into the Proton Storage Ring (PSR) from the present two-step process (H^- to H^0 to H^+) to direct injection of the H^- beam in one step (H^- to H^+), and 2) an upgrade of the spallation neutron production target which will reduce the target change-out time from about a year to about three weeks. The third project, the SPSS Enhancement Project, is aimed at increasing the PSR output beam current from the present 70 μA at 20 Hz to 200 μA at 30 Hz, plus implementing seven new neutron scattering instruments. Objectives, plans, results, and progress to date will be summarized.

  2. GASFLOW: A Computational Fluid Dynamics Code for Gases, Aerosols, and Combustion, Volume 3: Assessment Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Müller, C.; Hughes, E. D.; Niederauer, G. F.

    1998-10-01

    Los Alamos National Laboratory (LANL) and Forschungszentrum Karlsruhe (FzK) are developing GASFLOW, a three-dimensional (3D) fluid dynamics field code, as a best-estimate tool to characterize local phenomena within a flow field. Examples of 3D phenomena include circulation patterns; flow stratification; hydrogen distribution mixing and stratification; combustion and flame propagation; effects of noncondensable gas distribution on local condensation and evaporation; and aerosol entrainment, transport, and deposition. An analysis with GASFLOW will result in a prediction of the gas composition and discrete particle distribution in space and time throughout the facility and the resulting pressure and temperature loadings on the walls and internal structures with or without combustion. A major application of GASFLOW is for predicting the transport, mixing, and combustion of hydrogen and other gases in nuclear reactor containment and other facilities. It has been applied to situations involving transporting and distributing combustible gas mixtures. It has been used to study gas dynamic behavior in low-speed, buoyancy-driven flows, as well as sonic flows or diffusion-dominated flows, and during chemically reacting flows, including deflagrations. The effects of controlling such mixtures by safety systems can be analyzed. The code version described in this manual is designated GASFLOW 2.1, which combines previous versions of the United States Nuclear Regulatory Commission code HMS (for Hydrogen Mixing Studies) and the Department of Energy and FzK versions of GASFLOW. The code was written in standard Fortran 90. This manual comprises three volumes. Volume I describes the governing physical equations and computational model. Volume II describes how to use the code to set up a model geometry, specify gas species and material properties, define initial and boundary conditions, and specify different outputs, especially graphical displays. Sample problems are included.

  3. Delivering Insight The History of the Accelerated Strategic Computing Initiative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larzelere II, A R

    2007-01-03

    The history of the Accelerated Strategic Computing Initiative (ASCI) tells of the development of computational simulation into a third fundamental piece of the scientific method, on a par with theory and experiment. ASCI did not invent the idea, nor was it alone in bringing it to fruition. But ASCI provided the wherewithal - hardware, software, environment, funding, and, most of all, the urgency - that made it happen. On October 1, 2005, the Initiative completed its tenth year of funding. The advances made by ASCI over its first decade are truly incredible. Lawrence Livermore, Los Alamos, and Sandia National Laboratories, along with leadership provided by the Department of Energy's Defense Programs Headquarters, fundamentally changed computational simulation and how it is used to enable scientific insight. To do this, astounding advances were made in simulation applications, computing platforms, and user environments. ASCI dramatically changed existing - and forged new - relationships, both among the Laboratories and with outside partners. By its tenth anniversary, despite daunting challenges, ASCI had accomplished all of the major goals set at its beginning. The history of ASCI is about the vision, leadership, endurance, and partnerships that made these advances possible.

  4. Carbon Stripper Foils Used in the Los Alamos PSR

    NASA Astrophysics Data System (ADS)

    Borden, M.; Plum, M. A.; Sugai, I.

    1997-05-01

    Carbon stripper foils produced by the modified controlled ACDC arc discharge method (mCADAD) at the Institute for Nuclear Study by Dr. Isao Sugai have been tested and used for high-current 800-MeV beam production in the Proton Storage Ring (PSR) since 1993. Two approximately 110 μg/cm^2 foils are sandwiched together to produce an equivalent 220 μg/cm^2 foil. The combined foil is supported by 4-5 μm diameter carbon fibers attached to an aluminum frame. These foils have survived as long as five months during normal PSR beam production at nearly 70 μA average current on target. Typical lifetimes of other foils vary from seven to fourteen days at lower on-target average current. Beam loss data also indicate that Sugai's foils have slower shrinkage rates than other foils. Equipment has been assembled and used to produce foils by the mCADAD method at Los Alamos. These foils will be tested during 1997 operation.

  5. Energetic properties' investigation of removing flattening filter at phantom surface: Monte Carlo study using BEAMnrc code, DOSXYZnrc code and BEAMDP code

    NASA Astrophysics Data System (ADS)

    Bencheikh, Mohamed; Maghnouj, Abdelmajid; Tajmouati, Jaouad

    2017-11-01

    The Monte Carlo method is considered the most accurate approach for dose calculation in radiotherapy and for beam characterization studies. In this study, the Varian Clinac 2100 medical linear accelerator was modelled with and without the flattening filter (FF). The objective was to determine the impact of the flattening filter on particle energy properties at the phantom surface in terms of energy fluence, mean energy, and energy fluence distribution. The Monte Carlo codes used in this study were the BEAMnrc code for simulating the linac head, the DOSXYZnrc code for simulating the absorbed dose in a water phantom, and BEAMDP for extracting energy properties. The field size was 10 × 10 cm^2, the simulated photon beam energy was 6 MV, and the SSD was 100 cm. The Monte Carlo geometry was validated by a gamma index acceptance rate of 99% for PDD and 98% for dose profiles; the gamma criteria were 3% for dose difference and 3 mm for distance to agreement. Without the FF, the energy properties changed as follows: the electron contribution increased by more than 300% in energy fluence, almost 14% in mean energy, and 1900% in energy fluence distribution, while the photon contribution increased by 50% in energy fluence, almost 18% in mean energy, and almost 35% in energy fluence distribution. Removing the flattening filter increases the electron contamination energy relative to the photon energy; our study can contribute to the evolution of flattening-filter-free configurations in future linacs.
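
    For readers unfamiliar with the gamma-index validation quoted above (3% dose difference, 3 mm distance to agreement), the Python sketch below evaluates a 1-D global gamma index on synthetic profiles. It is a minimal illustration only; clinical gamma tools add interpolation, 2-D/3-D search, and local-normalization options.

      import numpy as np

      # Minimal 1-D global gamma-index evaluation with 3% / 3 mm criteria.
      # The profiles here are synthetic Gaussians, not measured linac data.

      def gamma_index(ref, ev, dx_mm, dose_crit=0.03, dist_crit_mm=3.0):
          """Gamma value at each reference point, global dose normalization."""
          x = np.arange(len(ref)) * dx_mm
          norm = dose_crit * ref.max()
          gammas = np.empty(len(ref))
          for i, (xi, di) in enumerate(zip(x, ref)):
              dose_term = (ev - di) / norm
              dist_term = (x - xi) / dist_crit_mm
              gammas[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
          return gammas

      x = np.arange(200)
      ref = np.exp(-((x - 100.0) / 40.0) ** 2)            # reference profile
      ev = 1.01 * np.exp(-((x - 101.0) / 40.0) ** 2)      # slightly shifted/scaled
      g = gamma_index(ref, ev, dx_mm=1.0)
      print("gamma pass rate (gamma <= 1):", np.mean(g <= 1.0))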

  6. Shielding analyses for repetitive high energy pulsed power accelerators

    NASA Astrophysics Data System (ADS)

    Jow, H. N.; Rao, D. V.

    Sandia National Laboratories (SNL) designs, tests, and operates a variety of accelerators that generate large amounts of high-energy Bremsstrahlung radiation over an extended time. Typically, groups of similar accelerators are housed in a large building that is inaccessible to the general public. To facilitate independent operation of each accelerator, test cells are constructed around each accelerator to shield the radiation workers occupying surrounding test cells and work areas. These test cells, about 9 ft high, are constructed of high-density concrete block walls that provide direct radiation shielding. Above the target areas (radiation sources), lead or steel plates are used to minimize skyshine radiation. Space, accessibility, and cost considerations impose certain restrictions on the design of these test cells. The SNL Health Physics division is tasked to evaluate the adequacy of each test cell design and compare the resultant dose rates with the design criteria stated in DOE Order 5480.11. In response, SNL Health Physics has undertaken an intensive effort to assess existing radiation shielding codes and compare their predictions against measured dose rates. This paper provides a summary of that effort and its results.

  7. Environmental surveillance at Los Alamos during 1991. Environmental protection group

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dewart, J.; Kohen, K.L.

    1993-08-01

    This report describes the environmental surveillance program conducted by Los Alamos National Laboratory during 1991. Routine monitoring for radiation and for radioactive and chemical materials is conducted on the Laboratory site as well as in the surrounding region. Monitoring results are used to determine compliance with appropriate standards and to permit early identification of potentially undesirable trends. Results and interpretation of data for 1991 cover external penetrating radiation; quantities of airborne emissions and effluents; concentrations of chemicals and radionuclides in ambient air, surface waters and groundwaters, municipal water supply, soils and sediments, and foodstuffs; and environmental compliance. Comparisons with appropriate standards, regulations, and background levels provide the basis for concluding that environmental effects from Laboratory operations are small and do not pose a threat to the public, Laboratory employees, or the environment.

  8. Accelerator system and method of accelerating particles

    NASA Technical Reports Server (NTRS)

    Wirz, Richard E. (Inventor)

    2010-01-01

    An accelerator system and method that utilize dust as the primary mass flux for generating thrust are provided. The accelerator system can include an accelerator capable of operating in a self-neutralizing mode and having a discharge chamber and at least one ionizer capable of charging dust particles. The system can also include a dust particle feeder that is capable of introducing the dust particles into the accelerator. By applying a pulsed positive and negative charge voltage to the accelerator, the charged dust particles can be accelerated thereby generating thrust and neutralizing the accelerator system.

  9. Recent Research with the Detector for Advanced Neutron Capture Experiments (DANCE) at the Los Alamos Neutron Science Center

    NASA Astrophysics Data System (ADS)

    Ullmann, J. L.

    2014-09-01

    The DANCE detector at Los Alamos is a 160 element, nearly 4π BaF2 detector array designed to make measurements of neutron capture on rare or radioactive nuclides. It has also been used to make measurements of gamma-ray multiplicity following capture and gamma-ray output from fission. Several examples of measurements are briefly discussed.

  10. GPU-Accelerated Molecular Modeling Coming Of Age

    PubMed Central

    Stone, John E.; Hardy, David J.; Ufimtsev, Ivan S.

    2010-01-01

    Graphics processing units (GPUs) have traditionally been used in molecular modeling solely for visualization of molecular structures and animation of trajectories resulting from molecular dynamics simulations. Modern GPUs have evolved into fully programmable, massively parallel co-processors that can now be exploited to accelerate many scientific computations, typically providing about one order of magnitude speedup over CPU code and in special cases providing speedups of two orders of magnitude. This paper surveys the development of molecular modeling algorithms that leverage GPU computing, the advances already made and remaining issues to be resolved, and the continuing evolution of GPU technology that promises to become even more useful to molecular modeling. Hardware acceleration with commodity GPUs is expected to benefit the overall computational biology community by bringing teraflops performance to desktop workstations and in some cases potentially changing what were formerly batch-mode computational jobs into interactive tasks. PMID:20675161

  11. Acceleration of Magnetospheric Relativistic Electrons by Ultra-Low Frequency Waves: A Comparison between Two Cases Observed by Cluster and LANL Satellites

    NASA Technical Reports Server (NTRS)

    Shao, X.; Fung, S. F.; Tan, L. C.; Sharma, A. S.

    2010-01-01

    Understanding the origin and acceleration of magnetospheric relativistic electrons (MREs) in the Earth's radiation belt during geomagnetic storms is an important subject and yet one of the outstanding questions in space physics. It has been statistically suggested that during geomagnetic storms, ultra-low-frequency (ULF) Pc-5 wave activity in the magnetosphere is correlated with order-of-magnitude increases of MRE fluxes in the outer radiation belt. Yet physical and observational understanding of resonant interactions between ULF waves and MREs remains minimal. In this paper, we show two events during storms on September 25, 2001 and November 25, 2001; the solar wind speeds in both cases were > 500 km/s, while Cluster observations indicate the presence of strong ULF waves in the magnetosphere at noon and dusk, respectively, during an approximately 3-hour period. MRE observations by the Los Alamos (LANL) spacecraft show a quadrupling of 1.1-1.5 MeV electron fluxes in the September 25, 2001 event, but only a negligible increase in the November 25, 2001 event. We present a detailed comparison between these two events. Our results suggest that the effectiveness of MRE acceleration during the September 25, 2001 event can be attributed to the compressional wave mode with strong ULF wave activity, and that the physical origin of MRE acceleration depends more on the distribution of toroidal and poloidal ULF waves in the outer radiation belt.

  12. Nuclear criticality safety staff training and qualifications at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Monahan, S.P.; McLaughlin, T.P.

    1997-05-01

    Operations involving significant quantities of fissile material have been conducted at Los Alamos National Laboratory continuously since 1943. Until the advent of the Laboratory's Nuclear Criticality Safety Committee (NCSC) in 1957, line management had sole responsibility for controlling criticality risks. From 1957 until 1961, the NCSC was the Laboratory body which promulgated policy guidance as well as some technical guidance for specific operations. In 1961 the Laboratory created the position of Nuclear Criticality Safety Officer (in addition to the NCSC). In 1980, Laboratory management moved the Criticality Safety Officer (and one other LACEF staff member who, by that time, was also working nearly full-time on criticality safety issues) into the Health Division office. Later that same year the Criticality Safety Group, H-6 (at that time), was created within H-Division and staffed by these two individuals. The training and education of these individuals in the art of criticality safety was almost entirely self-regulated, depending heavily on technical interactions with each other, as well as with NCSC, LACEF, operations, other facility, and broader criticality safety community personnel. Although the Los Alamos criticality safety group has grown both in size and in formality of operations since 1980, the basic philosophy that a criticality specialist must be developed through mentoring and self-motivation remains the same. Formally, this philosophy has been captured in the internal policy document "Conduct of Business in the Nuclear Criticality Safety Group." There are no shortcuts or substitutes in the development of a criticality safety specialist. A person must have a self-motivated personality, excellent communication skills, a thorough understanding of the principles of neutron physics, a safety-conscious and helpful attitude, a good perspective of real risk, as well as a detailed understanding of process operations and credible upsets.

  13. Capabilities for high explosive pulsed power research at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goforth, James H; Oona, Henn; Tasker, Douglas G

    2008-01-01

    Research on topics requiring high magnetic fields and high currents has been pursued using high explosive pulsed power (HEPP) techniques at Los Alamos National Laboratory since the 1950s. We have developed many sophisticated HEPP systems through the years, and most of them depend on technology available from the nuclear weapons program. Through the 1980s and 1990s, our budgets would sustain parallel efforts in z-pinch research using both HEPP and capacitor banks. In recent years, many changes have occurred that are driven by concerns such as safety, security, and environment, as well as reduced budgets and downsizing of the National Nuclear Security Administration (NNSA) complex due to the end of the cold war era. In this paper, we review the techniques developed to date, and the adaptations that are driven by changes in budgets and our changing complex. One new Ranchero-based solid-liner z-pinch experimental design is also presented. Explosives that are cast to shape instead of being machined, and initiation systems that depend on arrays of slapper detonators, are important new tools. Some materials that are seen as hazardous to the environment are avoided in designs. The process continues to allow a wide range of research, however, and there are few, if any, experiments that we have done in the past that could not be performed today. The HEPP firing facility at Los Alamos continues to have a 2000 lb high explosive limit, and our 2.4 MJ capacitor bank remains a mainstay of the effort. Modern diagnostic and data analysis capabilities allow fewer personnel to achieve better results, and in the broad sense we continue to have a robust capability.

  14. RMG An Open Source Electronic Structure Code for Multi-Petaflops Calculations

    NASA Astrophysics Data System (ADS)

    Briggs, Emil; Lu, Wenchang; Hodak, Miroslav; Bernholc, Jerzy

    RMG (Real-space Multigrid) is an open source, density functional theory code for quantum simulations of materials. It solves the Kohn-Sham equations on real-space grids, which allows for natural parallelization via domain decomposition. Either subspace or Davidson diagonalization, coupled with multigrid methods, is used to accelerate convergence. RMG is a cross-platform open source package which has been used in the study of a wide range of systems, including semiconductors, biomolecules, and nanoscale electronic devices. It can optionally use GPU accelerators to improve performance on systems where they are available. The recently released versions (>2.0) support multiple GPUs per compute node, have improved performance and scalability, and offer enhanced accuracy and support for additional hardware platforms. New versions of the code are regularly released at http://www.rmgdft.org. The releases include binaries for Linux, Windows, and Macintosh systems, automated builds for clusters using cmake, as well as versions adapted to the major supercomputing installations and platforms. Several recent, large-scale applications of RMG will be discussed.

  15. Kinetic Modeling of Next-Generation High-Energy, High-Intensity Laser-Ion Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Albright, Brian James; Yin, Lin; Stark, David James

    One of the long-standing problems in the community is the question of how we can model “next-generation” laser-ion acceleration in a computationally tractable way. A new particle tracking capability in the LANL VPIC kinetic plasma modeling code has enabled us to solve this problem.

  16. Development and Implementation of Photonuclear Cross-Section Data for Mutually Coupled Neutron-Photon Transport Calculations in the Monte Carlo N-Particle (MCNP) Radiation Transport Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Morgan C.

    2000-07-01

    The fundamental motivation for the research presented in this dissertation was the need to develop a more accurate prediction method for characterization of mixed radiation fields around medical electron accelerators (MEAs). Specifically, a model is developed for simulation of neutron and other particle production from photonuclear reactions and incorporated in the Monte Carlo N-Particle (MCNP) radiation transport code. This extension of the capability within the MCNP code provides for more accurate assessment of the mixed radiation fields. The Nuclear Theory and Applications group of the Los Alamos National Laboratory has recently provided first-of-a-kind evaluated photonuclear data for a select group of isotopes. These data provide the reaction probabilities as functions of incident photon energy with angular and energy distribution information for all reaction products. The availability of these data is the cornerstone of the new methodology for state-of-the-art mutually coupled photon-neutron transport simulations. The dissertation includes details of the model development and implementation necessary to use the new photonuclear data within MCNP simulations. A new data format has been developed to include tabular photonuclear data. Data are processed from the Evaluated Nuclear Data Format (ENDF) to the new class "u" A Compact ENDF (ACE) format using a standalone processing code. MCNP modifications have been completed to enable Monte Carlo sampling of photonuclear reactions. Note that both neutron and gamma production are included in the present model. The new capability has been subjected to extensive verification and validation (V&V) testing. Verification testing has established the expected basic functionality. Two validation projects were undertaken. First, comparisons were made to benchmark data from the literature. These calculations demonstrate the accuracy of the new data and transport routines to better than 25 percent. Second, the ability

  17. The UPSF code: a metaprogramming-based high-performance automatically parallelized plasma simulation framework

    NASA Astrophysics Data System (ADS)

    Gao, Xiatian; Wang, Xiaogang; Jiang, Binhao

    2017-10-01

    UPSF (Universal Plasma Simulation Framework) is a new plasma simulation code designed for maximum flexibility by using cutting-edge techniques supported by the C++17 standard. Through the use of metaprogramming techniques, UPSF provides arbitrary-dimensional data structures and methods to support various kinds of plasma simulation models, such as Vlasov, particle-in-cell (PIC), fluid, and Fokker-Planck models, together with their variants and hybrid methods. Through C++ metaprogramming, a single code can be applied to systems of arbitrary dimensionality with no loss of performance. UPSF can also automatically parallelize the distributed data structures and accelerate matrix and tensor operations via BLAS. A three-dimensional particle-in-cell code has been developed based on UPSF. Two test cases, Landau damping and the Weibel instability for the electrostatic and electromagnetic cases respectively, are presented to show the validity and performance of the UPSF code.
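
    As a language-neutral illustration of the particle-in-cell building blocks such a framework organizes (charge deposition, field solve, gather, and push), the Python sketch below implements a minimal 1-D electrostatic PIC cycle with a small Landau-damping-style perturbation. All names, units, and parameters are hypothetical and unrelated to the actual UPSF implementation.

      import numpy as np

      # Minimal 1-D electrostatic particle-in-cell cycle: cloud-in-cell charge
      # deposition, spectral Poisson solve, field gather, and leapfrog push.

      ng, npart = 64, 40000
      L = 4.0 * np.pi                       # domain length (mode k = 0.5 fits)
      dx, dt, nsteps = L / ng, 0.1, 200
      qm = -1.0                             # electron charge-to-mass ratio
      w = L / npart                         # weight: mean electron density = 1

      rng = np.random.default_rng(1)
      x0 = rng.uniform(0.0, L, npart)
      x = (x0 + 0.05 * np.cos(0.5 * x0)) % L    # small density perturbation
      v = rng.normal(0.0, 1.0, npart)           # Maxwellian electron velocities

      def deposit(x):
          """Cloud-in-cell deposition of net charge (uniform ions - electrons)."""
          g = x / dx
          i = np.floor(g).astype(int)
          f = g - i
          ne = (np.bincount(i % ng, weights=(1.0 - f) * w, minlength=ng) +
                np.bincount((i + 1) % ng, weights=f * w, minlength=ng)) / dx
          return 1.0 - ne

      def solve_field(rho):
          """Spectral solve of dE/dx = rho on the periodic grid."""
          k = 2.0 * np.pi * np.fft.fftfreq(ng, d=dx)
          rhok = np.fft.fft(rho)
          Ek = np.zeros_like(rhok)
          Ek[1:] = rhok[1:] / (1j * k[1:])
          return np.fft.ifft(Ek).real

      for step in range(nsteps):
          E = solve_field(deposit(x))
          g = x / dx
          i = np.floor(g).astype(int)
          f = g - i
          Epart = (1.0 - f) * E[i % ng] + f * E[(i + 1) % ng]   # gather to particles
          v += qm * Epart * dt                                  # accelerate
          x = (x + v * dt) % L                                  # drift, periodic wrap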

  18. The FLUKA code for space applications: recent developments

    NASA Technical Reports Server (NTRS)

    Andersen, V.; Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.

    2004-01-01

    The FLUKA Monte Carlo transport code is widely used for fundamental research, radioprotection and dosimetry, hybrid nuclear energy system and cosmic ray calculations. The validity of its physical models has been benchmarked against a variety of experimental data over a wide range of energies, ranging from accelerator data to cosmic ray showers in the earth atmosphere. The code is presently undergoing several developments in order to better fit the needs of space applications. The generation of particle spectra according to up-to-date cosmic ray data as well as the effect of the solar and geomagnetic modulation have been implemented and already successfully applied to a variety of problems. The implementation of suitable models for heavy ion nuclear interactions has reached an operational stage. At medium/high energy FLUKA is using the DPMJET model. The major task of incorporating heavy ion interactions from a few GeV/n down to the threshold for inelastic collisions is also progressing and promising results have been obtained using a modified version of the RQMD-2.4 code. This interim solution is now fully operational, while waiting for the development of new models based on the FLUKA hadron-nucleus interaction code, a newly developed QMD code, and the implementation of the Boltzmann master equation theory for low energy ion interactions. © 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.

  19. A boundary-Fitted Coordinate Code for General Two-Dimensional Regions with Obstacles and Boundary Intrusions.

    DTIC Science & Technology

    1983-03-01

    Excerpted fragments from the report describe the values of these functions on the two sides of the slits and the iteration acceleration parameters: the acceleration parameters for the iteration at each point are in the field array WACC(I,J); the code will calculate a locally optimum value at each point in the field, these values being placed in the field array WACC; the changes in x and y are calculated by calling subroutine ERROR; and the acceleration parameter is placed in the field array WACC. The addition to the

  20. Kinetic Simulations of Plasma Energization and Particle Acceleration in Interacting Magnetic Flux Ropes

    NASA Astrophysics Data System (ADS)

    Du, S.; Guo, F.; Zank, G. P.; Li, X.; Stanier, A.

    2017-12-01

    The interaction between magnetic flux ropes has been suggested as a process that leads to efficient plasma energization and particle acceleration (e.g., Drake et al. 2013; Zank et al. 2014). However, the underlying plasma dynamics and acceleration mechanisms require examination with numerical simulations. As a first step of this effort, we carry out 2D fully kinetic simulations using the VPIC code to study the plasma energization and particle acceleration during the coalescence of two magnetic flux ropes. Our analysis shows that the reconnection electric field and compression effect are important in plasma energization. The results may help understand the energization process associated with magnetic flux ropes frequently observed in the solar wind near the heliospheric current sheet.

  1. Overview of the 1997 Dirac High-Magnetic Series at Los Alamos

    NASA Astrophysics Data System (ADS)

    Clark, D. A.; Campbell, L. J.; Forman, K. C.; Fowler, C. M.; Goettee, J. D.; Mielke, C. H.; Rickel, D. G.; Marshall, B. R.

    2004-11-01

    During the summer of 1997, a series of high magnetic field experiments was conducted at Los Alamos National Laboratory. Four experiments utilizing Russian-built MC-1 generators, which can reach fields as high as 10 Megagauss, and four smaller strip generator experiments at fields near 1.5 Megagauss were conducted. Experiments mounted on the devices included magnetoresistance of high temperature superconductors and semiconductors, optical reflectivity (conductivity) of semiconductors, magnetization of a magnetic cluster material and a semiconductor, Faraday rotation in a semiconductor and a magnetic cluster material, and transmission spectroscopy of molecules. Brief descriptions of the experimental setups, magnetic field measurement techniques, field results, and various experiments are presented. Magnetic field data and other information on Dirac '97 can be found online.

  2. MANHATTAN: The View From Los Alamos of History's Most Secret Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carr, Alan Brady

    This presentation covers the political and scientific events leading up to the creation of the Manhattan Project. The creation of the Manhattan Project’s three most significant sites--Los Alamos, Oak Ridge, and Hanford--is also discussed. The lecture concludes by exploring the use of the atomic bombs at the end of World War II. The presentation slides include three videos. The first is a short clip of the 100-Ton Test, history’s largest measured blast at that point in time; it was a pre-test for Trinity, the world’s first nuclear detonation. The second clip features views of Trinity followed by a short statement by the Laboratory’s first director, J. Robert Oppenheimer. The final clip shows Norris Bradbury talking about arms control.

  3. The Los Alamos Seismic Network (LASN): Recent Network Upgrades and Northern New Mexico Earthquake Catalog Updates

    NASA Astrophysics Data System (ADS)

    Roberts, P. M.; House, L. S.; Greene, M.; Ten Cate, J. A.; Schultz-Fellenz, E. S.; Kelley, R.

    2012-12-01

    Since the first data were recorded in the fall of 1973, the Los Alamos Seismograph Network (LASN) has operated for nearly 40 years. LASN data have been used to locate more than 2,500 earthquakes in north-central New Mexico. The network was installed for seismic verification research, as well as to monitor and locate earthquakes near Los Alamos National Laboratory (LANL). LASN stations are the only earthquake monitoring stations in New Mexico north of Albuquerque. In the late 1970s, LASN included 22 stations spread over a geographic area of 150 km (N-S) by 350 km (E-W) in northern New Mexico. In the early 1980s, the available funding limited the stations that could be operated to a set of 7, located within an area of about 15 km (N-S) by 15 km (E-W) centered on Los Alamos. Over the last 3 years, 6 additional stations have been installed, which have considerably expanded the spatial coverage of the network. These new stations take advantage of state-of-the-art broadband sensors as well as digital recording and telemetry technology. Currently, 7 stations have broadband, three-component seismometers with digital telemetry, and the remaining 6 have traditional 1 Hz short-period seismometers with analog telemetry. In addition, a vertical array of accelerometers was installed in a wellbore on LANL property; this borehole station has 3-component digital strong-motion sensors. Four forensic strong-motion accelerometers (SMAs) are also operated at LANL facilities. With 3 of the new broadband stations in and around the nearby Valles Caldera, LASN is now able to monitor any very small volcano-seismic events that may be associated with the caldera. We will present a complete description of the current LASN station, instrumentation, and telemetry configurations, as well as the data acquisition and event-detection software structure used to record events in Earthworm. More than 2,000 earthquakes were detected and located in north-central New Mexico during the first 11

  4. Los Alamos Shows Airport Security Technology at Work

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Espy, Michelle; Schultz, Larry; Hunter, James

    Los Alamos scientists have advanced a Magnetic Resonance Imaging (MRI) technology that may provide a breakthrough for screening liquids at airport security. They've added low-power X-ray data to the mix, and as a result have unlocked a new detection technology. Funded in part by the Department of Homeland Security's Science and Technology Directorate, the new system is named MagRay. The goal is to quickly and accurately distinguish between liquids that visually appear identical. For example, what appears to be a bottle of white wine could potentially be nitromethane, a liquid that could be used to make an explosive. Both are clear liquids; one would be perfectly safe on a commercial aircraft, the other strictly prohibited. How to tell them apart quickly without error at an airport security area is the focus of Michelle Espy, Larry Schultz and their team. In this video, Espy and the MagRay team explain how the new technology works, how they've developed an easy operator interface, and what the next steps might be in transitioning this technology to the private sector.

  5. Organizational cultural survey of the Los Alamos Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    An Organizational Survey (OS) was administered at the Los Alamos Site that queried employees on the subjects of organizational culture, various aspects of communications, employee commitment, work group cohesion, coordination of work, environmental, safety, and health concerns, hazardous nature of work, safety, and overall job satisfaction. The purpose of the OS is to measure in a quantitative and objective way the notion of "culture"; that is, the values, attitudes, and beliefs of the individuals working within the organization. In addition, through the OS, a broad sample of individuals can be reached that would probably not be interviewed or observed during the course of a typical assessment. The OS also provides a descriptive profile of the organization at one point in time that can then be compared to a profile taken at a different point in time to assess changes in the culture of the organization. While comparisons among groups are made, it is not the purpose of this report to make evaluative statements of which profile may be positive or negative. However, using the data presented in this report in conjunction with other evaluative activities may provide useful insight into the organization.

  7. Numerical simulations of the superdetonative ram accelerator combusting flow field

    NASA Technical Reports Server (NTRS)

    Soetrisno, Moeljo; Imlay, Scott T.; Roberts, Donald W.

    1993-01-01

    The effects of projectile canting and fins on the ram accelerator combusting flowfield and the possible cause of the ram accelerator unstart are investigated by performing axisymmetric, two-dimensional, and three-dimensional calculations. Calculations are performed using the INCA code for solving the Navier-Stokes equations and the quasi-global combustion model of Westbrook and Dryer (1981, 1984), which includes N2 and nine reacting species (CH4, CO, CO2, H2, H, O2, O, OH, and H2O) that are allowed to undergo a 12-step reaction mechanism. It is found that, without canting, interactions between the fins, boundary layers, and combustion fronts are insufficient to unstart the projectile at superdetonative velocities. With canting, the projectile will unstart at flow conditions where it appears to accelerate without canting. Unstart occurs at some critical canting angle. It is also found that three-dimensionality plays an important role in the overall combustion process.

  8. Particle Acceleration, Magnetic Field Generation, and Emission in Relativistic Pair Jets

    NASA Technical Reports Server (NTRS)

    Nishikawa, K.-I.; Ramirez-Ruiz, E.; Hardee, P.; Hededal, C.; Mizuno, Y.

    2005-01-01

    Shock acceleration is a ubiquitous phenomenon in astrophysical plasmas. Plasma waves and their associated instabilities (e.g., the Buneman instability, the two-stream instability, and the Weibel instability) created by relativistic pair jets are responsible for particle (electron, positron, and ion) acceleration. Using a 3-D relativistic electromagnetic particle (REMP) code, we have investigated particle acceleration associated with a relativistic jet propagating through an ambient plasma with and without initial magnetic fields. The growth rates of the Weibel instability depend on the distribution of the pair jets. Simulations show that the Weibel instability created in the collisionless shock accelerates particles perpendicular and parallel to the jet propagation direction. The simulation results show that this instability is responsible for generating and amplifying highly nonuniform, small-scale magnetic fields, which contribute to the electron's transverse deflection behind the jet head. The "jitter" radiation from deflected electrons has different properties than synchrotron radiation, which is calculated in a uniform magnetic field. This jitter radiation may be important to understanding the complex time evolution and/or spectral structure in gamma-ray bursts, relativistic jets, and supernova remnants.

  9. Particle Acceleration, Magnetic Field Generation, and Emission in Relativistic Pair Jets

    NASA Technical Reports Server (NTRS)

    Nishikawa, K. I.; Hardee, P.; Hededal, C. B.; Richardson, G.; Sol, H.; Preece, R.; Fishman, G. J.

    2004-01-01

    Shock acceleration is a ubiquitous phenomenon in astrophysical plasmas. Plasma waves and their associated instabilities (e.g., Buneman, Weibel, and other two-stream instabilities) created in collisionless shocks are responsible for particle (electron, positron, and ion) acceleration. Using a 3-D relativistic electromagnetic particle (REMP) code, we have investigated particle acceleration associated with a relativistic jet front propagating into an ambient plasma. We find that the growth times depend on the Lorentz factors of the jets; jets with larger Lorentz factors grow more slowly. Simulations show that the Weibel instability created in the collisionless shock front accelerates jet and ambient particles both perpendicular and parallel to the jet propagation direction. The small-scale magnetic field structure generated by the Weibel instability is appropriate to the generation of "jitter" radiation from deflected electrons (positrons) as opposed to synchrotron radiation. The jitter radiation resulting from small-scale magnetic field structures may be important for understanding the complex time structure and spectral evolution observed in gamma-ray bursts or other astrophysical sources containing relativistic jets and relativistic collisionless shocks.

  10. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  11. Transcoding method from H.264/AVC to high efficiency video coding based on similarity of intraprediction, interprediction, and motion vector

    NASA Astrophysics Data System (ADS)

    Liu, Mei-Feng; Zhong, Guo-Yun; He, Xiao-Hai; Qing, Lin-Bo

    2016-09-01

    Currently, most online video resources are encoded in the H.264/AVC format. Smoother video transmission can be obtained if these resources are encoded in the newest international video coding standard: high efficiency video coding (HEVC). In order to improve online video transmission and storage, a transcoding method from H.264/AVC to HEVC is proposed. In this transcoding algorithm, the coding information of intraprediction, interprediction, and motion vectors (MVs) in the H.264/AVC video stream is used to accelerate the coding in HEVC. It is found through experiments that the interpredicted regions in HEVC overlap those in H.264/AVC. Therefore, intraprediction can be skipped in HEVC for regions that are interpredicted in H.264/AVC, reducing coding complexity. Several macroblocks in H.264/AVC are combined into one prediction unit (PU) in HEVC when the MV difference between two of the macroblocks is lower than a threshold; this method selects only one coding unit depth and one PU mode, further reducing the coding complexity. An MV interpolation method for the combined PU in HEVC is proposed according to the areas of the macroblocks and the distances between the center of each H.264/AVC macroblock and that of the HEVC PU. The predicted MV accelerates the motion estimation for HEVC coding. The simulation results show that the proposed algorithm achieves a significant reduction in coding time with only a small rate-distortion loss, compared to existing transcoding algorithms and normal HEVC coding.
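
    The macroblock-merging rule described above can be sketched as follows. This is an assumed simplification, not the authors' implementation: macroblocks whose motion vectors differ pairwise by less than a threshold are merged into one PU, and an area-weighted average MV (a stand-in for the paper's area- and distance-based interpolation) seeds motion estimation for the merged PU. All identifiers are hypothetical.

```cpp
// Schematic sketch of MV-threshold merging of H.264/AVC macroblocks into one
// HEVC prediction unit (assumed logic, for illustration only).
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

struct MotionVector { double x, y; };

struct Macroblock {
    MotionVector mv;
    double area;  // in pixels (16x16 = 256 for an H.264/AVC macroblock)
};

// L1 distance between two motion vectors.
double mvDistance(const MotionVector& a, const MotionVector& b) {
    return std::fabs(a.x - b.x) + std::fabs(a.y - b.y);
}

// Merge the macroblocks into one PU if all pairwise MV differences are below
// the threshold; on success, return an area-weighted seed MV for the PU.
bool tryMergeIntoPU(const std::vector<Macroblock>& blocks, double threshold,
                    MotionVector& mergedMv) {
    for (std::size_t i = 0; i < blocks.size(); ++i)
        for (std::size_t j = i + 1; j < blocks.size(); ++j)
            if (mvDistance(blocks[i].mv, blocks[j].mv) >= threshold)
                return false;  // MVs too different: keep the blocks separate

    double totalArea = 0.0, sx = 0.0, sy = 0.0;
    for (const auto& b : blocks) {
        totalArea += b.area;
        sx += b.mv.x * b.area;
        sy += b.mv.y * b.area;
    }
    mergedMv = {sx / totalArea, sy / totalArea};
    return true;
}

int main() {
    std::vector<Macroblock> blocks = {
        {{1.0, 0.5}, 256}, {{1.2, 0.4}, 256}, {{0.9, 0.6}, 256}, {{1.1, 0.5}, 256}};
    MotionVector mv{};
    if (tryMergeIntoPU(blocks, /*threshold=*/1.0, mv))
        std::printf("merged PU seed MV = (%.2f, %.2f)\n", mv.x, mv.y);
}
```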

  12. NESSY: NLTE spectral synthesis code for solar and stellar atmospheres

    NASA Astrophysics Data System (ADS)

    Tagirov, R. V.; Shapiro, A. I.; Schmutz, W.

    2017-07-01

    Context. Physics-based models of solar and stellar magnetically-driven variability are based on the calculation of synthetic spectra for various surface magnetic features as well as quiet regions, which are a function of their position on the solar or stellar disc. Such calculations are performed with radiative transfer codes tailored for modeling broad spectral intervals. Aims: We aim to present the NLTE Spectral SYnthesis code (NESSY), which can be used for modeling the entire (UV-visible-IR and radio) spectra of solar and stellar magnetic features and quiet regions. Methods: NESSY is a further development of the COde for Solar Irradiance (COSI), in which we have implemented an accelerated Λ-iteration (ALI) scheme for co-moving frame (CMF) line radiation transfer based on a new estimate of the local approximate Λ-operator. Results: We show that the new version of the code performs substantially faster than the previous one and yields a reliable calculation of the entire solar spectrum, in good agreement with the available observations.

  13. Stormwater Pollution Prevention Plan for the TA-03-38 Carpenter's Shop, Los Alamos National Laboratory, Revision 3, January 2018

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burgin, Jillian Elizabeth

    This Storm Water Pollution Prevention Plan (SWPPP) was developed in accordance with the provisions of the Clean Water Act (33 U.S.C. §§1251 et seq., as amended), and the Multi-Sector General Permit for Storm Water Discharges Associated with Industrial Activity (U.S. EPA, June 2015) issued by the U.S. Environmental Protection Agency (EPA) for the National Pollutant Discharge Elimination System (NPDES) and using the industry specific permit requirements for Sector A–Timber Products, Subsector A4 (Wood Products Facilities not elsewhere classified) as a guide. This SWPPP applies to discharges of stormwater from the operational areas of the TA-03-38 Carpenter’s Shop at Los Alamos National Laboratory. Los Alamos National Laboratory (also referred to as LANL or the “Laboratory”) is owned by the Department of Energy (DOE), and is operated by Los Alamos National Security, LLC (LANS). Throughout this document, the term “facility” refers to the TA-03-38 Carpenter’s Shop and associated areas. The current permit expires at midnight on June 4, 2020.

  14. New methods in WARP, a particle-in-cell code for space-charge dominated beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grote, D., LLNL

    1998-01-12

    The current U.S. approach for a driver for inertial confinement fusion power production is a heavy-ion induction accelerator; high-current beams of heavy ions are focused onto the fusion target. The space-charge of the high-current beams affects the behavior more strongly than does the temperature (the beams are described as being "space-charge dominated") and the beams behave like non-neutral plasmas. The particle simulation code WARP has been developed and used to study the transport and acceleration of space-charge dominated ion beams in a wide range of applications, from basic beam physics studies, to ongoing experiments, to fusion driver concepts. WARP combines aspects of a particle simulation code and an accelerator code; it uses multi-dimensional, electrostatic particle-in-cell (PIC) techniques and has a rich mechanism for specifying the lattice of externally applied fields. There are both two- and three-dimensional versions, the former including axisymmetric (r-z) and transverse slice (x-y) models. WARP includes a number of novel techniques and capabilities that both enhance its performance and make it applicable to a wide range of problems. Some of these have been described elsewhere. Several recent developments will be discussed in this paper. A transverse slice model has been implemented with the novel capability of including bends, allowing more rapid simulation while retaining essential physics. An interface using Python as the interpreter layer instead of Basis has been developed. A parallel version of WARP has been developed using Python.

  15. GPU-accelerated Tersoff potentials for massively parallel Molecular Dynamics simulations

    NASA Astrophysics Data System (ADS)

    Nguyen, Trung Dac

    2017-03-01

    The Tersoff potential is one of the empirical many-body potentials that has been widely used in simulation studies at atomic scales. Unlike pair-wise potentials, the Tersoff potential involves three-body terms, which require much more arithmetic operations and data dependency. In this contribution, we have implemented the GPU-accelerated version of several variants of the Tersoff potential for LAMMPS, an open-source massively parallel Molecular Dynamics code. Compared to the existing MPI implementation in LAMMPS, the GPU implementation exhibits a better scalability and offers a speedup of 2.2X when run on 1000 compute nodes on the Titan supercomputer. On a single node, the speedup ranges from 2.0 to 8.0 times, depending on the number of atoms per GPU and hardware configurations. The most notable features of our GPU-accelerated version include its design for MPI/accelerator heterogeneous parallelism, its compatibility with other functionalities in LAMMPS, its ability to give deterministic results and to support both NVIDIA CUDA- and OpenCL-enabled accelerators. Our implementation is now part of the GPU package in LAMMPS and accessible for public use.

  16. GPU-accelerated molecular modeling coming of age.

    PubMed

    Stone, John E; Hardy, David J; Ufimtsev, Ivan S; Schulten, Klaus

    2010-09-01

    Graphics processing units (GPUs) have traditionally been used in molecular modeling solely for visualization of molecular structures and animation of trajectories resulting from molecular dynamics simulations. Modern GPUs have evolved into fully programmable, massively parallel co-processors that can now be exploited to accelerate many scientific computations, typically providing about one order of magnitude speedup over CPU code and in special cases providing speedups of two orders of magnitude. This paper surveys the development of molecular modeling algorithms that leverage GPU computing, the advances already made and remaining issues to be resolved, and the continuing evolution of GPU technology that promises to become even more useful to molecular modeling. Hardware acceleration with commodity GPUs is expected to benefit the overall computational biology community by bringing teraflops performance to desktop workstations and in some cases potentially changing what were formerly batch-mode computational jobs into interactive tasks. (c) 2010 Elsevier Inc. All rights reserved.

  17. Laser beam coupling with capillary discharge plasma for laser wakefield acceleration applications

    NASA Astrophysics Data System (ADS)

    Bagdasarov, G. A.; Sasorov, P. V.; Gasilov, V. A.; Boldarev, A. S.; Olkhovskaya, O. G.; Benedetti, C.; Bulanov, S. S.; Gonsalves, A.; Mao, H.-S.; Schroeder, C. B.; van Tilborg, J.; Esarey, E.; Leemans, W. P.; Levato, T.; Margarone, D.; Korn, G.

    2017-08-01

    One of the most robust methods demonstrated to date for accelerating electron beams with laser-plasma sources is the use of plasma channels generated by capillary discharges. Although the spatial structure of the installation is simple in principle, there may be important effects caused by the open ends of the capillary, by the supply channels, etc., which require detailed 3D modeling of the processes. In the present work, such simulations are performed using the code MARPLE. First, the filling of the capillary with cold hydrogen through the side supply channels, before the discharge is fired, is simulated. Second, the simulation of the capillary discharge is performed with the goal of obtaining a time-dependent spatial distribution of the electron density near the open ends of the capillary as well as inside it. Finally, to evaluate the effectiveness of the beam coupling with the channeling plasma waveguide and of the electron acceleration, modeling of the laser-plasma interaction was performed with the code INF&RNO.

  18. Direct measurement of the image displacement instability in a linear induction accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burris-Mog, T. J.; Ekdahl, C. A.; Moir, D. C.

    The image displacement instability (IDI) has been measured on the 20 MeV Axis I of the dual axis radiographic hydrodynamic test facility and compared to theory. A 0.23 kA electron beam was accelerated across 64 gaps in a low solenoidal focusing field, and the position of the beam centroid was measured out to 34.3 meters downstream from the cathode. One beam dynamics code was used to model the IDI from first principles, while another code characterized the effects of the resistive wall instability and the beam break-up (BBU) instability. Although the BBU instability was not found to influence the IDI, it appears that the IDI influences the BBU. Because the BBU theory does not fully account for the dependence on beam position for coupling to cavity transverse magnetic modes, the effect of the IDI is missing from the BBU theory. Finally, this becomes of particular concern to users of linear induction accelerators operating at or near low magnetic guide field tunes.

  19. Direct measurement of the image displacement instability in a linear induction accelerator

    DOE PAGES

    Burris-Mog, T. J.; Ekdahl, C. A.; Moir, D. C.

    2017-06-19

    The image displacement instability (IDI) has been measured on the 20 MeV Axis I of the dual axis radiographic hydrodynamic test facility and compared to theory. A 0.23 kA electron beam was accelerated across 64 gaps in a low solenoidal focusing field, and the position of the beam centroid was measured out to 34.3 meters downstream from the cathode. One beam dynamics code was used to model the IDI from first principles, while another code characterized the effects of the resistive wall instability and the beam break-up (BBU) instability. Although the BBU instability was not found to influence the IDI, it appears that the IDI influences the BBU. Because the BBU theory does not fully account for the dependence on beam position for coupling to cavity transverse magnetic modes, the effect of the IDI is missing from the BBU theory. Finally, this becomes of particular concern to users of linear induction accelerators operating at or near low magnetic guide field tunes.

  20. Historic Manhattan Project Sites at Los Alamos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGehee, Ellen

    The Manhattan Project laboratory constructed at Los Alamos, New Mexico, beginning in 1943, was intended from the start to be temporary and to go up with amazing speed. Because most of those WWII-era facilities were built with minimal materials and so quickly, much of the original infrastructure was torn down in the late '40s and early '50s and replaced by more permanent facilities. However, a few key facilities remained, and are being preserved and maintained for historic significance. Four such sites are visited briefly in this video, taking viewers to V-Site, the buildings where the first nuclear explosive device was pre-assembled in preparation for the Trinity Test in Southern New Mexico. Included is another WWII area, Gun Site. So named because it was the area where scientists and engineers tested the so-called "gun method" of assembling nuclear materials -- the fundamental design of the Little Boy weapon that was eventually dropped on Hiroshima. The video also goes to Pajarito Site, home of the "Slotin Building" and "Pond Cabin." The Slotin Building is the place where scientist Louis Slotin conducted a criticality experiment that went awry in early 1946, leading to his unfortunate death, and the Pond Cabin served the team of eminent scientist Emilio Segre who did early chemistry work on plutonium that ultimately led to the Fat Man weapon.

  1. Historic Manhattan Project Sites at Los Alamos

    ScienceCinema

    McGehee, Ellen

    2018-05-11

    The Manhattan Project laboratory constructed at Los Alamos, New Mexico, beginning in 1943, was intended from the start to be temporary and to go up with amazing speed. Because most of those WWII-era facilities were built with minimal materials and so quickly, much of the original infrastructure was torn down in the late '40s and early '50s and replaced by more permanent facilities. However, a few key facilities remained, and are being preserved and maintained for historic significance. Four such sites are visited briefly in this video, taking viewers to V-Site, the buildings where the first nuclear explosive device was pre-assembled in preparation for the Trinity Test in Southern New Mexico. Included is another WWII area, Gun Site. So named because it was the area where scientists and engineers tested the so-called "gun method" of assembling nuclear materials -- the fundamental design of the Little Boy weapon that was eventually dropped on Hiroshima. The video also goes to Pajarito Site, home of the "Slotin Building" and "Pond Cabin." The Slotin Building is the place where scientist Louis Slotin conducted a criticality experiment that went awry in early 1946, leading to his unfortunate death, and the Pond Cabin served the team of eminent scientist Emilio Segre who did early chemistry work on plutonium that ultimately led to the Fat Man weapon.

  2. Particle Acceleration and Radiation associated with Magnetic Field Generation from Relativistic Collisionless Shocks

    NASA Technical Reports Server (NTRS)

    Nishikawa, K.; Hardee, P. E.; Richardson, G. A.; Preece, R. D.; Sol, H.; Fishman, G. J.

    2003-01-01

    Shock acceleration is a ubiquitous phenomenon in astrophysical plasmas. Plasma waves and their associated instabilities (e.g., the Buneman instability, the two-stream instability, and the Weibel instability) created in the shocks are responsible for particle (electron, positron, and ion) acceleration. Using a 3-D relativistic electromagnetic particle (REMP) code, we have investigated particle acceleration associated with a relativistic jet front propagating through an ambient plasma with and without initial magnetic fields. We find only small differences in the results between no ambient and weak ambient magnetic fields. Simulations show that the Weibel instability created in the collisionless shock front accelerates particles perpendicular and parallel to the jet propagation direction. While some Fermi acceleration may occur at the jet front, the majority of electron acceleration takes place behind the jet front and cannot be characterized as Fermi acceleration. The simulation results show that this instability is responsible for generating and amplifying highly nonuniform, small-scale magnetic fields, which contribute to the electron's transverse deflection behind the jet head. The "jitter" radiation from deflected electrons has different properties than synchrotron radiation, which is calculated in a uniform magnetic field. This jitter radiation may be important to understanding the complex time evolution and/or spectral structure in gamma-ray bursts, relativistic jets, and supernova remnants.

  3. Tuning the DARHT Axis-II linear induction accelerator focusing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekdahl, Carl A.

    2012-04-24

    Flash radiography of large hydrodynamic experiments driven by high explosives is a well-known diagnostic technique in use at many laboratories, and the Dual-Axis Radiography for Hydrodynamic Testing (DARHT) facility at Los Alamos produces flash radiographs of large hydrodynamic experiments. Two linear induction accelerators (LIAs) make the bremsstrahlung radiographic source spots for orthogonal views of each test. The 2-kA, 20-MeV Axis-I LIA creates a single 60-ns radiography pulse. The 1.7-kA, 16.5-MeV Axis-II LIA creates up to four radiography pulses by kicking them out of a longer pulse that has a 1.6-μs flattop. The Axis-II injector, LIA, kicker, and downstream transport (DST) to the bremsstrahlung converter are described. Adjusting the magnetic focusing and steering elements to optimize the electron-beam transport through an LIA is often called 'tuning.' As in all high-current LIAs, the focusing field is designed to be as close to that of the ideal continuous solenoid as physically possible. In ideal continuous solenoidal transport a smoothly varying beam size can easily be found for which radial forces balance, and the beam is said to be 'matched' to the focusing field. A 'mismatched' beam exhibits unwanted oscillations in size, which are a source of free energy that contributes to emittance growth. This is undesirable, because in the absence of beam-target effects, the radiographic spot size is proportional to the emittance. Tuning the Axis-II LIA is done in two steps. First, the solenoidal focusing elements are set to values designed to provide a matched beam with little or no envelope oscillations, and little or no beam-breakup (BBU) instability growth. Then, steering elements are adjusted to minimize the motion of the centroid of a well-centered beam at the LIA exit. This article only describes the design of the tune for the focusing solenoids. The DARHT Axis-II LIA was required to be re-tuned after installing an accelerator cell to replace a

  4. Single event effects in high-energy accelerators

    NASA Astrophysics Data System (ADS)

    García Alía, Rubén; Brugger, Markus; Danzeca, Salvatore; Cerutti, Francesco; de Carvalho Saraiva, Joao Pedro; Denz, Reiner; Ferrari, Alfredo; Foro, Lionel L.; Peronnard, Paul; Røed, Ketil; Secondo, Raffaello; Steckert, Jens; Thurel, Yves; Toccafondo, Iacocpo; Uznanski, Slawosz

    2017-03-01

    The radiation environment encountered at high-energy hadron accelerators strongly differs from the environment relevant for space applications. The mixed field expected at modern accelerators is composed of charged and neutral hadrons (protons, pions, kaons and neutrons), photons, electrons, positrons and muons, ranging from very low (thermal) energies up to the TeV range. This complex field, which is extensively simulated by Monte Carlo codes (e.g., FLUKA), is due to beam losses in the experimental areas, distributed along the machine (e.g., collimation points), and deriving from the interaction with the residual gas inside the beam pipe. The resulting intensity, energy distribution, and proportion of the different particles largely depend on the distance and angle with respect to the interaction point as well as the amount of installed shielding material. Electronics operating in the vicinity of the accelerator will therefore be subject both to cumulative damage from radiation (total ionizing dose, displacement damage) and to single event effects, which can seriously compromise the operation of the machine. This, combined with the extensive use of commercial-off-the-shelf components for budget, performance, and availability reasons, results in the need to carefully characterize the response of the devices and systems to representative radiation conditions.

  5. Numerical investigation on the effects of acceleration reversal times in Rayleigh-Taylor Instability with multiple reversals

    NASA Astrophysics Data System (ADS)

    Farley, Zachary; Aslangil, Denis; Banerjee, Arindam; Lawrie, Andrew G. W.

    2017-11-01

    An implicit large eddy simulation (ILES) code, MOBILE, is used to explore the growth rate of the mixing layer width of the acceleration-driven Rayleigh-Taylor instability (RTI) under variable acceleration histories. The sets of computations performed consist of a series of accel-decel-accel (ADA) cases in addition to baseline constant acceleration and accel-decel (AD) cases. The ADA cases vary the time of the second acceleration reversal (t2) and show drastic differences in the growth rates. During the deceleration phase, the kinetic energy of the flow is shifted into internal wave-like patterns. These waves are evidenced by the differences in growth rate during the second acceleration phase across the set of ADA cases. Here, we investigate global parameters that include the mixing width, growth rates, and the anisotropy tensor for the kinetic energy to better understand the behavior of the growth during the re-acceleration period. The authors acknowledge financial support from DOE-SSAA (DE-NA0003195) and NSF CAREER (#1453056) awards.

  6. Graphics Processing Unit Acceleration of Gyrokinetic Turbulence Simulations

    NASA Astrophysics Data System (ADS)

    Hause, Benjamin; Parker, Scott; Chen, Yang

    2013-10-01

    We find a substantial increase in on-node performance using Graphics Processing Unit (GPU) acceleration in gyrokinetic delta-f particle-in-cell simulation. Optimization is performed on a two-dimensional slab gyrokinetic particle simulation using the Portland Group Fortran compiler with OpenACC compiler directives and CUDA Fortran. A mixed implementation of both OpenACC and CUDA is demonstrated; CUDA is required for optimizing the particle deposition algorithm. We have implemented the GPU acceleration on a third-generation Core i7 gaming PC with two NVIDIA GTX 680 GPUs. We find comparable, or better, acceleration relative to the NERSC DIRAC cluster with the NVIDIA Tesla C2050 computing processor. The Tesla C2050 is about 2.6 times more expensive than the GTX 580 gaming GPU. We also see enormous speedups (a factor of 10 or more) on the Titan supercomputer at Oak Ridge with Kepler K20 GPUs. Results show speed-ups comparable to or better than those of OpenMP models utilizing multiple cores. The use of hybrid OpenACC, CUDA Fortran, and MPI models across many nodes will also be discussed. Optimization strategies will be presented. We will discuss progress on optimizing the comprehensive three-dimensional, general-geometry GEM code.
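
    The directive-based side of the mixed OpenACC/CUDA approach described above applies compiler pragmas to loops that do not need hand-written kernels. The sketch below is a generic illustration, not part of the GEM code: a simple field-gather loop offloaded with a single OpenACC directive. It requires an OpenACC-capable compiler (e.g. nvc++); other compilers ignore the pragma and run the loop serially.

```cpp
// Generic sketch: offloading a 1-D field gather (linear interpolation of a
// field to particle positions) with an OpenACC parallel loop directive.
#include <cstdio>
#include <vector>

int main() {
    const int nParticles = 1 << 20;
    const int nGrid = 1024;
    std::vector<double> field(nGrid, 1.0e-3);   // 1-D field values
    std::vector<double> x(nParticles);          // particle positions in [0, nGrid)
    std::vector<double> force(nParticles, 0.0);

    for (int i = 0; i < nParticles; ++i)
        x[i] = (i % nGrid) + 0.5;

    double* f = field.data();
    double* xs = x.data();
    double* out = force.data();

    // Gather: each particle interpolates the field at its own position.
    #pragma acc parallel loop copyin(f[0:nGrid], xs[0:nParticles]) copyout(out[0:nParticles])
    for (int i = 0; i < nParticles; ++i) {
        int j = (int)xs[i];
        double w = xs[i] - j;
        int jp = (j + 1) % nGrid;
        out[i] = (1.0 - w) * f[j] + w * f[jp];
    }

    std::printf("force[0] = %g\n", out[0]);
}
```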

  7. Floodplain Assessment for the Middle Los Alamos Canyon Aggregate Area Investigations in Technical Area 02 at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hathcock, Charles Dean

    The proposed action being assessed in this document occurs in TA-02 in the bottom of Los Alamos Canyon. The DOE proposes to conduct soil sampling at AOC 02-011(d), AOC 02-011(a)(ii), and SWMU 02-005, and excavate soils in AOC 02-011(a)(ii) as part of a corrective actions effort. Additional shallow surface soil samples (soil grab samples) will be collected throughout the TA-02 area, including within the floodplain, to perform ecotoxicology studies (Figures 1 and 2). The excavation boundaries in AOC 02-011(a)(ii) are slightly within the delineated 100-year floodplain. The project will use a variety of techniques for soil sampling and remediation efforts, including hand digging, standard hand auger sampling, and excavation using machinery such as a backhoe, front-end loader, and small drill rig. Heavy equipment will traverse the floodplain and spoils piles will be staged in the floodplain within developed or previously disturbed areas (e.g., existing paved roads and parking areas). The project will utilize and maintain appropriate best management practices (BMPs) to contain excavated materials, and all pollutants, including oil from machinery/vehicles. The project will stabilize disturbed areas as appropriate at the end of the project.

  8. GASPRNG: GPU accelerated scalable parallel random number generator library

    NASA Astrophysics Data System (ADS)

    Gao, Shuang; Peterson, Gregory D.

    2013-04-01

    Graphics processors represent a promising technology for accelerating computational science applications. Many computational science applications require fast and scalable random number generation with good statistical properties, so they use the Scalable Parallel Random Number Generators library (SPRNG). We present the GPU Accelerated SPRNG library (GASPRNG) to accelerate SPRNG in GPU-based high performance computing systems. GASPRNG includes code for a host CPU and CUDA code for execution on NVIDIA graphics processing units (GPUs) along with a programming interface to support various usage models for pseudorandom numbers and computational science applications executing on the CPU, GPU, or both. This paper describes the implementation approach used to produce high performance and also describes how to use the programming interface. The programming interface allows a user to be able to use GASPRNG the same way as SPRNG on traditional serial or parallel computers as well as to develop tightly coupled programs executing primarily on the GPU. We also describe how to install GASPRNG and use it. To help illustrate linking with GASPRNG, various demonstration codes are included for the different usage models. GASPRNG on a single GPU shows up to 280x speedup over SPRNG on a single CPU core and is able to scale for larger systems in the same manner as SPRNG. Because GASPRNG generates identical streams of pseudorandom numbers as SPRNG, users can be confident about the quality of GASPRNG for scalable computational science applications. Catalogue identifier: AEOI_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEOI_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: UTK license. No. of lines in distributed program, including test data, etc.: 167900 No. of bytes in distributed program, including test data, etc.: 1422058 Distribution format: tar.gz Programming language: C and CUDA. Computer: Any PC or
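
    The core idea GASPRNG exposes, giving each parallel worker its own independent, reproducible random stream, can be illustrated without the library itself. The sketch below deliberately does not use the GASPRNG or SPRNG API; it builds per-stream generators with the C++ standard library only, seeding each stream distinctly so that results are reproducible regardless of how the streams are later assigned to CPU threads or GPU blocks.

```cpp
// Conceptual sketch only (not the GASPRNG/SPRNG API): one independent,
// reproducibly seeded random stream per parallel worker.
#include <cstdio>
#include <random>
#include <vector>

int main() {
    const int nStreams = 4;          // e.g. one per MPI rank or GPU block (assumed)
    const int samplesPerStream = 3;

    std::vector<std::mt19937_64> streams;
    for (int s = 0; s < nStreams; ++s) {
        std::seed_seq seq{12345, s};  // distinct, reproducible seed per stream
        streams.emplace_back(seq);
    }

    std::uniform_real_distribution<double> uni(0.0, 1.0);
    for (int s = 0; s < nStreams; ++s) {
        std::printf("stream %d:", s);
        for (int k = 0; k < samplesPerStream; ++k)
            std::printf(" %.4f", uni(streams[s]));
        std::printf("\n");
    }
}
```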

  9. Overview of the Tusas Code for Simulation of Dendritic Solidification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trainer, Amelia J.; Newman, Christopher Kyle; Francois, Marianne M.

    2016-01-07

    The aim of this project is to conduct a parametric investigation into the modeling of two-dimensional dendritic solidification using the phase-field model. Specifically, we use the Tusas code, which is for coupled heat and phase-field simulation of dendritic solidification. Dendritic solidification, which may occur in the presence of an unstable solidification interface, results in treelike microstructures that often grow perpendicular to the rest of the growth front. The interface may become unstable if the enthalpy of the solid material is less than that of the liquid material, or if the solute is less soluble in solid than it is in liquid, potentially causing a partition [1]. A key motivation behind this research is that a broadened understanding of phase-field formulation and microstructural developments can be utilized for macroscopic simulations of phase change. This may be directly implemented as a part of the Telluride project at Los Alamos National Laboratory (LANL), through which a computational additive manufacturing simulation tool is being developed, ultimately to become part of the Advanced Simulation and Computing Program within the U.S. Department of Energy [2].

  10. Accelerator test of the coded aperture mask technique for gamma-ray astronomy

    NASA Technical Reports Server (NTRS)

    Jenkins, T. L.; Frye, G. M., Jr.; Owens, A.; Carter, J. N.; Ramsden, D.

    1982-01-01

    A prototype gamma-ray telescope employing the coded aperture mask technique has been constructed and its response to a point source of 20 MeV gamma-rays has been measured. The point spread function is approximately a Gaussian with a standard deviation of 12 arc minutes. This resolution is consistent with the cell size of the mask used and the spatial resolution of the detector. In the context of the present experiment, the error radius of the source position (90 percent confidence level) is 6.1 arc minutes.

  11. Chemical decontamination technical resources at Los Alamos National Laboratory (2008)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, Murray E

    This document supplies information resources for a person seeking to create planning or pre-planning documents for chemical decontamination operations. A building decontamination plan can be separated into four different sections: Pre-planning, Characterization, Decontamination (initial response and also complete cleanup), and Clearance. The identified Los Alamos resources can be matched with these four sections: Pre-planning -- Dave Seidel, EO-EPP, Emergency Planning and Preparedness; David DeCroix and Bruce Letellier, D-3, Computational fluids modeling of structures; Murray E. Moore, RP-2, Aerosol sampling and ventilation engineering. Characterization (this can include development projects) -- Beth Perry, IAT-3, Nuclear Counterterrorism Response (SNIPER database); Fernando Garzon, MPA-11, Sensors and Electrochemical Devices (development); George Havrilla, C-CDE, Chemical Diagnostics and Engineering; Kristen McCabe, B-7, Biosecurity and Public Health. Decontamination -- Adam Stively, EO-ER, Emergency Response; Dina Matz, IHS-IP, Industrial hygiene; Don Hickmott, EES-6, Chemical cleanup. Clearance (validation) -- Larry Ticknor, CCS-6, Statistical Sciences.

  12. Environmental Assessment and Finding of No Significant Impact: The Proposed Issuance of an Easement to Public Service Company of New Mexico for the Construction and Operation of a 12-inch Natural Gas Pipeline within Los Alamos National Laboratory, Los Alamos, New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    N /A

    2002-07-30

    The National Nuclear Security Administration (NNSA) has assigned a continuing role to Los Alamos National Laboratory (LANL) in carrying out NNSA's national security mission. Enabling LANL to continue this enduring responsibility requires that NNSA maintain the capabilities and capacities required in support of its national mission assignments at LANL. To carry out its Congressionally assigned mission requirements, NNSA must maintain a safe and reliable infrastructure at LANL. Upgrades to the various utility services at LANL have been ongoing together with routine maintenance activities over the years. However, the replacement of a certain portion of natural gas service transmission pipeline is now necessary as this delivery system element has been operating well beyond its original design life for the past 20 to 30 years and components of the line are suffering from normal stresses, strains, and general failures. The Proposed Action is to grant an easement to the Public Service Company of New Mexico (PNM) to construct, operate, and maintain approximately 15,000 feet (4,500 meters) of 12-inch (in.) (30-centimeter [cm]) coated steel natural gas transmission mainline on NNSA-administered land within LANL along Los Alamos Canyon. The new gas line would begin at the existing valve setting located at the bottom of Los Alamos Canyon near the Los Alamos County water well pump house and adjacent to the existing 12-in. (30-cm) PNM gas transmission mainline. The new gas line (owned by PNM) would then cross the streambed and continue east in a new easement obtained by PNM from the NNSA, paralleling the existing electrical power line along the bottom of the canyon. The gas line would then turn northeast near State Road (SR) 4 and be connected to the existing 12-in. (30-cm) coated steel gas transmission mainline, located within the right-of-way (ROW) of SR 502. The Proposed Action would also involve crossing a streambed twice. PNM would bore under the streambed for pipe

  13. Monte Carlo method for calculating the radiation skyshine produced by electron accelerators

    NASA Astrophysics Data System (ADS)

    Kong, Chaocheng; Li, Quanfeng; Chen, Huaibi; Du, Taibin; Cheng, Cheng; Tang, Chuanxiang; Zhu, Li; Zhang, Hui; Pei, Zhigang; Ming, Shenjin

    2005-06-01

    Using the MCNP4C Monte Carlo code, the X-ray skyshine produced by 9 MeV, 15 MeV, and 21 MeV electron linear accelerators was calculated with a new two-step method combined with splitting and Russian roulette variance reduction techniques. Results of the Monte Carlo simulations, the empirical formulas used for skyshine calculation, and the dose measurements were analyzed and compared. The skyshine dose measurements agreed reasonably well with the results computed by the Monte Carlo method, but deviated from the results given by the empirical formulas. The effect of different accelerator head structures on the skyshine dose is also discussed in this paper.
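
    The splitting and roulette techniques named above can be demonstrated on a toy problem. The sketch below is not MCNP, and all parameters (interaction coefficient, slab thickness, split factor, weight cutoff) are invented: particles carry statistical weights, geometry splitting multiplies particles crossing an important surface, and Russian roulette terminates low-weight histories without biasing the tally.

```cpp
// Toy deep-penetration Monte Carlo with geometry splitting and Russian roulette
// (schematic only; all physics and parameters are assumed for illustration).
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

struct Particle { double x; double w; };  // position (cm) and statistical weight

int main() {
    std::mt19937_64 rng(2024);
    std::uniform_real_distribution<double> uni(0.0, 1.0);

    const double sigmaT = 0.5;      // total interaction coefficient (1/cm), assumed
    const double survive = 0.5;     // non-absorption fraction per collision, assumed
    const double slab = 10.0;       // slab thickness (cm)
    const double splitPlane = 5.0;  // geometry-splitting surface
    const int    nSplit = 4;        // split factor when crossing the surface
    const double wCut = 0.05;       // roulette threshold on particle weight
    const long   nHist = 200000;

    double leak = 0.0;              // total weight leaking through the slab
    for (long h = 0; h < nHist; ++h) {
        std::vector<Particle> bank{{0.0, 1.0}};
        while (!bank.empty()) {
            Particle p = bank.back();
            bank.pop_back();
            while (true) {
                double xNew = p.x - std::log(1.0 - uni(rng)) / sigmaT;  // next collision site
                if (p.x < splitPlane && xNew >= splitPlane) {
                    // Geometry splitting: nSplit copies, each with 1/nSplit of the weight.
                    Particle child{splitPlane, p.w / nSplit};
                    for (int k = 1; k < nSplit; ++k) bank.push_back(child);
                    p = child;                      // this copy continues; flight is resampled
                    continue;
                }
                if (xNew >= slab) { leak += p.w; break; }  // escaped: tally and stop
                p.x = xNew;
                p.w *= survive;                            // implicit capture at the collision
                if (p.w < wCut) {                          // Russian roulette on low weights
                    if (uni(rng) < 0.5) p.w *= 2.0;        // survives with doubled weight
                    else break;                            // killed, unbiased on average
                }
            }
        }
    }
    std::printf("transmitted weight fraction ~ %.3e\n", leak / (double)nHist);
}
```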

  14. Particle acceleration magnetic field generation, and emission in Relativistic pair jets

    NASA Technical Reports Server (NTRS)

    Nishikawa, K.-I.; Ramirez-Ruiz, E.; Hardee, P.; Hededal, C.; Kouveliotou, C.; Fishman, G. J.

    2005-01-01

    Plasma waves and their associated instabilities (e.g., the Buneman instability, the two-stream instability, and the Weibel instability) are responsible for particle acceleration in relativistic pair jets. Using a 3-D relativistic electromagnetic particle (REMP) code, we have investigated particle acceleration associated with a relativistic pair jet propagating through a pair plasma. Simulations show that the Weibel instability created in the collisionless shock accelerates particles perpendicular and parallel to the jet propagation direction. Simulation results show that this instability generates and amplifies highly nonuniform, small-scale magnetic fields, which contribute to the electron's transverse deflection behind the jet head. The "jitter" radiation from deflected electrons can have different properties than synchrotron radiation, which is calculated in a uniform magnetic field. This jitter radiation may be important to understanding the complex time evolution and/or spectral structure in gamma-ray bursts, relativistic jets, and supernova remnants. The growth rate of the Weibel instability and the resulting particle acceleration depend on the magnetic field strength and orientation, and on the initial particle distribution function. In this presentation we explore some of the dependencies of the Weibel instability and resulting particle acceleration on the magnetic field strength and orientation, and the particle distribution function.

  15. Dynamics of electron injection and acceleration driven by laser wakefield in tailored density profiles

    DOE PAGES

    Lee, Patrick; Maynard, G.; Audet, T. L.; ...

    2016-11-16

    The dynamics of electron acceleration driven by laser wakefield is studied in detail using the particle-in-cell code WARP with the objective of generating high-quality electron bunches with narrow energy spread and small emittance, relevant for the electron injector of a multistage accelerator. Simulation results, using experimentally achievable parameters, show that electron bunches with an energy spread of ~11% can be obtained by using an ionization-induced injection mechanism in a mm-scale length plasma. By controlling the focusing of a moderate laser power and tailoring the longitudinal plasma density profile, the electron injection beginning and end positions can be adjusted, while the electron energy can be finely tuned in the last acceleration section.

  16. Enhanced quasi-static particle-in-cell simulation of electron cloud instabilities in circular accelerators

    NASA Astrophysics Data System (ADS)

    Feng, Bing

    Electron cloud instabilities have been observed in many circular accelerators around the world and have raised concerns for future accelerators and possible upgrades. In this thesis, electron cloud instabilities are studied with the quasi-static particle-in-cell (PIC) code QuickPIC. Modeling in three dimensions the long-timescale propagation of a beam through electron clouds in circular accelerators requires faster and more efficient simulation codes. Thousands of processors are easily available for parallel computations. However, it is not straightforward to increase the effective speed of the simulation by running the same problem size on an increasing number of processors, because there is a limit to the domain size in the decomposition of the two-dimensional part of the code. A pipelining algorithm applied to the fully parallelized particle-in-cell code QuickPIC is implemented to overcome this limit. The pipelining algorithm uses multiple groups of processors and optimizes the job allocation on the processors in parallel computing. With this novel algorithm, it is possible to use on the order of 10^2 processors, and to expand the scale and the speed of the simulation with QuickPIC by a similar factor. In addition to the efficiency improvement with the pipelining algorithm, the fidelity of QuickPIC is enhanced by adding two physics models, the beam space charge effect and the dispersion effect. Simulation of two specific circular machines is performed with the enhanced QuickPIC. First, the proposed upgrade to the Fermilab Main Injector is studied with an eye upon guiding the design of the upgrade and code validation. Moderate emittance growth is observed for the upgrade of increasing the bunch population by 5 times, but the simulation also shows that increasing the beam energy from 8 GeV to 20 GeV or above can effectively limit the emittance growth. Then the enhanced QuickPIC is used to simulate the electron cloud effect on the electron beam in the Cornell Energy Recovery Linac
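
    The pipelining idea described above rests on dividing the available processors into groups that work on different portions of the problem. The sketch below shows only the communicator bookkeeping with MPI_Comm_split; the group count and rank-to-group mapping are assumptions for illustration and do not reproduce QuickPIC's actual decomposition.

```cpp
// Minimal sketch (assumed structure): split MPI ranks into pipeline groups,
// each with its own sub-communicator for the group-local 2-D work.
#include <cstdio>
#include <mpi.h>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);

    int worldRank = 0, worldSize = 1;
    MPI_Comm_rank(MPI_COMM_WORLD, &worldRank);
    MPI_Comm_size(MPI_COMM_WORLD, &worldSize);

    const int nGroups = 4;                // number of pipeline stages (assumed)
    int group = worldRank % nGroups;      // which pipeline stage this rank joins

    // Ranks sharing the same "color" land in the same sub-communicator.
    MPI_Comm groupComm;
    MPI_Comm_split(MPI_COMM_WORLD, group, worldRank, &groupComm);

    int groupRank = 0, groupSize = 1;
    MPI_Comm_rank(groupComm, &groupRank);
    MPI_Comm_size(groupComm, &groupSize);

    std::printf("world rank %d/%d -> pipeline group %d, local rank %d/%d\n",
                worldRank, worldSize, group, groupRank, groupSize);

    // In a real pipelined PIC code, group g would receive partially advanced
    // field data from group g-1, push its own beam slices, and forward results
    // to group g+1; here only the communicator setup is shown.

    MPI_Comm_free(&groupComm);
    MPI_Finalize();
    return 0;
}
```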

  17. GPU Linear Algebra Libraries and GPGPU Programming for Accelerating MOPAC Semiempirical Quantum Chemistry Calculations.

    PubMed

    Maia, Julio Daniel Carvalho; Urquiza Carvalho, Gabriel Aires; Mangueira, Carlos Peixoto; Santana, Sidney Ramos; Cabral, Lucidio Anjos Formiga; Rocha, Gerd B

    2012-09-11

    In this study, we present some modifications in the semiempirical quantum chemistry MOPAC2009 code that accelerate single-point energy calculations (1SCF) of medium-size (up to 2500 atoms) molecular systems using GPU coprocessors and multithreaded shared-memory CPUs. Our modifications consisted of using a combination of highly optimized linear algebra libraries for both CPU (LAPACK and BLAS from Intel MKL) and GPU (MAGMA and CUBLAS) to hasten time-consuming parts of MOPAC such as the pseudodiagonalization, full diagonalization, and density matrix assembling. We have shown that it is possible to obtain large speedups just by using CPU serial linear algebra libraries in the MOPAC code. As a special case, we show a speedup of up to 14 times for a methanol simulation box containing 2400 atoms and 4800 basis functions, with even greater gains in performance when using multithreaded CPUs (2.1 times in relation to the single-threaded CPU code using linear algebra libraries) and GPUs (3.8 times). This degree of acceleration opens new perspectives for modeling larger structures which appear in inorganic chemistry (such as zeolites and MOFs), biochemistry (such as polysaccharides, small proteins, and DNA fragments), and materials science (such as nanotubes and fullerenes). In addition, we believe that this parallel (GPU-GPU) MOPAC code will make it feasible to use semiempirical methods in lengthy molecular simulations using both hybrid QM/MM and QM/QM potentials.
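
    The CPU-side portion of the strategy above, handing the dense symmetric eigenproblem to an optimized linear algebra library, can be shown in a few lines. The sketch below is not MOPAC code; it diagonalizes a small symmetric matrix through the LAPACKE C interface (provided by Intel MKL, OpenBLAS, or reference LAPACK), which is the kind of call that replaces hand-rolled diagonalization loops before a GPU library such as MAGMA is substituted.

```cpp
// Minimal sketch: symmetric eigensolve via LAPACKE (link with e.g.
// -llapacke -llapack -lblas); not taken from MOPAC.
#include <cstdio>
#include <vector>
#include <lapacke.h>

int main() {
    const lapack_int n = 3;
    // Symmetric matrix stored row-major; the upper triangle is referenced.
    std::vector<double> a = {
        2.0, -1.0,  0.0,
       -1.0,  2.0, -1.0,
        0.0, -1.0,  2.0};
    std::vector<double> w(n);  // eigenvalues, returned in ascending order

    // 'V': also compute eigenvectors (returned in a); 'U': use upper triangle.
    lapack_int info = LAPACKE_dsyev(LAPACK_ROW_MAJOR, 'V', 'U', n,
                                    a.data(), n, w.data());
    if (info != 0) { std::fprintf(stderr, "dsyev failed: %d\n", (int)info); return 1; }

    for (lapack_int i = 0; i < n; ++i)
        std::printf("eigenvalue %d = %.6f\n", (int)i, w[i]);
    return 0;
}
```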

  18. Accelerators for Fusion Materials Testing

    NASA Astrophysics Data System (ADS)

    Knaster, Juan; Okumura, Yoshikazu

    Fusion materials research is a worldwide endeavor as old as the parallel one working toward the long term stable confinement of ignited plasma. In a fusion reactor, the preservation of the required minimum thermomechanical properties of the in-vessel components exposed to the severe irradiation and heat flux conditions is an indispensable factor for safe operation; it is also an essential goal for the economic viability of fusion. Energy from fusion power will be extracted from the 14 MeV neutron freed as a product of the deuterium-tritium fusion reactions; thus, this kinetic energy must be absorbed and efficiently evacuated and electricity eventually generated by the conventional methods of a thermal power plant. Worldwide technological efforts to understand the degradation of materials exposed to 14 MeV neutron fluxes >10¹⁸ m⁻²s⁻¹, as expected in future fusion power plants, have been intense over the last four decades. Existing neutron sources can reach suitable dpa (“displacement-per-atom”, the figure of merit to assess materials degradation from being exposed to neutron irradiation), but the differences in the neutron spectrum of fission reactors and spallation sources do not allow one to unravel the physics and to anticipate the degradation of materials exposed to fusion neutrons. Fusion irradiation conditions can be achieved through Li (d, xn) nuclear reactions with suitable deuteron beam current and energy, and an adequate flowing lithium screen. This idea triggered in the late 1970s at Los Alamos National Laboratory (LANL) a campaign working toward the feasibility of continuous wave (CW) high current linacs framed by the Fusion Materials Irradiation Test (FMIT) project. These efforts continued with the Low Energy Demonstrating Accelerator (LEDA) (a validating prototype of the canceled Accelerator Production of Tritium (APT) project), which was proposed in 2002 to the fusion community as a 6.7 MeV, 100 mA CW beam injector for a Li (d, xn) source to bridge

  20. Advances in petascale kinetic plasma simulation with VPIC and Roadrunner

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowers, Kevin J; Albright, Brian J; Yin, Lin

    2009-01-01

    VPIC, a first-principles 3D electromagnetic charge-conserving relativistic kinetic particle-in-cell (PIC) code, was recently adapted to run on Los Alamos's Roadrunner, the first supercomputer to break a petaflop (10¹⁵ floating point operations per second) in the TOP500 supercomputer performance rankings. They give a brief overview of the modeling capabilities and optimization techniques used in VPIC and the computational characteristics of petascale supercomputers like Roadrunner. They then discuss three applications enabled by VPIC's unprecedented performance on Roadrunner: modeling laser plasma interaction in upcoming inertial confinement fusion experiments at the National Ignition Facility (NIF), modeling short pulse laser GeV ion acceleration, and modeling reconnection in magnetic confinement fusion experiments.

  1. Post-acceleration of laser driven protons with a compact high field linac

    NASA Astrophysics Data System (ADS)

    Sinigardi, Stefano; Londrillo, Pasquale; Rossi, Francesco; Turchetti, Giorgio; Bolton, Paul R.

    2013-05-01

    We present a start-to-end 3D numerical simulation of a hybrid scheme for the acceleration of protons. The scheme is based on a first-stage laser acceleration, followed by a transport line with a solenoid or a multiplet of quadrupoles, and then a post-acceleration section in a compact linac. Our simulations show that from a laser-accelerated proton bunch with energy selection at ~30 MeV, it is possible to obtain a high quality monochromatic beam of 60 MeV with intensity at the threshold of interest for medical use. In present-day experiments using solid targets, the TNSA mechanism describes accelerated bunches with an exponential energy spectrum up to a cut-off value typically below ~60 MeV and a wide angular distribution. At the cut-off energy, the number of protons to be collimated and post-accelerated in a hybrid scheme is still too low. We investigate laser-plasma acceleration to improve the quality and number of the injected protons at ~30 MeV in order to assure efficient post-acceleration in the hybrid scheme. The results are obtained with 3D PIC simulations using a code in which optical acceleration with over-dense targets, transport, and post-acceleration in a linac can all be investigated in an integrated framework. The high-intensity experiments at Nara are taken as reference benchmarks for our virtual laboratory. If experimentally confirmed, a hybrid scheme could be the core of a medium-sized infrastructure for medical research, capable of producing protons for therapy and x-rays for diagnosis, which complements the development of all-optical systems.

  2. Stepwise Distributed Open Innovation Contests for Software Development: Acceleration of Genome-Wide Association Analysis

    PubMed Central

    Hill, Andrew; Loh, Po-Ru; Bharadwaj, Ragu B.; Pons, Pascal; Shang, Jingbo; Guinan, Eva; Lakhani, Karim; Kilty, Iain

    2017-01-01

    Background: The association of differing genotypes with disease-related phenotypic traits offers great potential to both help identify new therapeutic targets and support stratification of patients who would gain the greatest benefit from specific drug classes. Development of low-cost genotyping and sequencing has made collecting large-scale genotyping data routine in population and therapeutic intervention studies. In addition, a range of new technologies is being used to capture numerous new and complex phenotypic descriptors. As a result, genotype and phenotype datasets have grown exponentially. Genome-wide association studies associate genotypes and phenotypes using methods such as logistic regression. As existing tools for association analysis limit the efficiency by which value can be extracted from increasing volumes of data, there is a pressing need for new software tools that can accelerate association analyses on large genotype-phenotype datasets. Results: Using open innovation (OI) and contest-based crowdsourcing, the logistic regression analysis in a leading, community-standard genetics software package (PLINK 1.07) was substantially accelerated. OI allowed us to do this in <6 months by providing rapid access to highly skilled programmers with specialized, difficult-to-find skill sets. Through a crowd-based contest a combination of computational, numeric, and algorithmic approaches was identified that accelerated the logistic regression in PLINK 1.07 by 18- to 45-fold. Combining contest-derived logistic regression code with coarse-grained parallelization, multithreading, and associated changes to data initialization code further developed through distributed innovation, we achieved an end-to-end speedup of 591-fold for a data set size of 6678 subjects by 645 863 variants, compared to PLINK 1.07's logistic regression. This represents a reduction in run time from 4.8 hours to 29 seconds. Accelerated logistic regression code developed in this
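
    For intuition about what was accelerated, the kernel is a per-variant logistic regression of case/control status on genotype dosage, repeated across hundreds of thousands of variants. The sketch below is a hedged illustration of one standard speedup strategy, vectorizing Newton-Raphson updates across many variants at once with NumPy; it is not the contest-derived PLINK code, and the two-parameter model (intercept plus genotype, no covariates) is a simplifying assumption.

      # Batched logistic regression across variants: one intercept + one genotype
      # effect per variant, Newton-Raphson updates applied to all variants at once.
      # Illustrative sketch only; no covariates, no convergence or separation checks.
      import numpy as np

      def batched_logreg(genotypes, phenotype, n_iter=8):
          """genotypes: (n_subjects, n_variants) dosages 0/1/2; phenotype: (n_subjects,) 0/1."""
          y = phenotype.astype(float)
          m = genotypes.shape[1]
          b0 = np.zeros(m)                                   # intercepts
          b1 = np.zeros(m)                                   # genotype effects
          for _ in range(n_iter):
              eta = b0[None, :] + genotypes * b1[None, :]
              p = 1.0 / (1.0 + np.exp(-eta))
              w = p * (1.0 - p)                              # IRLS weights
              r = y[:, None] - p                             # residuals
              g0 = r.sum(axis=0)                             # gradient components
              g1 = (genotypes * r).sum(axis=0)
              h00 = w.sum(axis=0)                            # 2x2 Hessian per variant
              h01 = (w * genotypes).sum(axis=0)
              h11 = (w * genotypes ** 2).sum(axis=0)
              det = h00 * h11 - h01 ** 2
              b0 += (h11 * g0 - h01 * g1) / det              # Newton step, closed-form 2x2 solve
              b1 += (h00 * g1 - h01 * g0) / det
          return b0, b1

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          g = rng.integers(0, 3, size=(500, 1000)).astype(float)
          p_true = 1.0 / (1.0 + np.exp(-(-0.2 + 0.3 * g[:, 0])))
          y = rng.binomial(1, p_true)
          b0, b1 = batched_logreg(g, y)
          print("effect estimate for variant 0:", round(b1[0], 3))

    A production tool adds covariates, missing-genotype handling, and convergence and separation checks, but this batched structure is what makes the multithreading and coarse-grained parallelization described above pay off.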

  3. Simulations of an accelerator-based shielding experiment using the particle and heavy-ion transport code system PHITS.

    PubMed

    Sato, T; Sihver, L; Iwase, H; Nakashima, H; Niita, K

    2005-01-01

    In order to estimate the biological effects of HZE particles, an accurate knowledge of the physics of interaction of HZE particles is necessary. Since the heavy ion transport problem is a complex one, there is a need for both experimental and theoretical studies to develop accurate transport models. RIST and JAERI (Japan), GSI (Germany) and Chalmers (Sweden) are therefore currently developing and benchmarking the General-Purpose Particle and Heavy-Ion Transport code System (PHITS), which is based on NMTC and MCNP for nucleon/meson and neutron transport, respectively, and the JAM hadron cascade model. PHITS uses JAERI Quantum Molecular Dynamics (JQMD) and the Generalized Evaporation Model (GEM) for calculations of fission and evaporation processes, a model developed at NASA Langley for calculation of total reaction cross sections, and the SPAR model for stopping power calculations. The future development of PHITS includes better parameterization of the JQMD model used for nucleus-nucleus reactions, improvement of the models used for calculating total reaction cross sections, addition of routines for calculating elastic scattering of heavy ions, and inclusion of radioactivity and burn-up processes. As part of an extensive benchmarking of PHITS, we have compared energy spectra of secondary neutrons created by reactions of HZE particles with different targets, with thicknesses ranging from <1 to 200 cm. We have also compared simulated and measured spatial, fluence and depth-dose distributions from different high energy heavy ion reactions. In this paper, we report simulations of an accelerator-based shielding experiment, in which a beam of 1 GeV/n Fe ions passed through thin slabs of polyethylene, Al, and Pb at an acceptance angle of up to 4 degrees. © 2005 Published by Elsevier Ltd on behalf of COSPAR.

  4. New Developments in Proton Radiography at the Los Alamos Neutron Science Center (LANSCE)

    DOE PAGES

    Morris, C. L.; Brown, E. N.; Agee, C.; ...

    2015-12-30

    An application of nuclear physics, a facility for using protons for flash radiography, was developed at the Los Alamos Neutron Science Center (LANSCE). Protons have proven far superior to high energy x-rays for flash radiography because of their long mean free path, good position resolution, and low scatter background. Although this facility is primarily used for studying very fast phenomena such as high explosive driven experiments, it is finding increasing application in other fields, such as tomography of static objects, phase changes in materials and the dynamics of chemical reactions. The advantages of protons are discussed, data from some recent experiments are reviewed, and concepts for new techniques are introduced.

  5. Additive manufacturing capabilities applied to inertial confinement fusion at Los Alamos National Laboratory

    DOE PAGES

    Cardenas, Tana; Schmidt, Derek William; Peterson, Dominic S.

    2016-08-01

    We describe the use at Los Alamos National Laboratory of additive manufacturing (AM) for a variety of jigs and coating, assembly, and radiography fixtures. Additive manufacturing has also been used to produce shipping containers of complex design that would be too costly to have fabricated using traditional techniques. The current goal for AM use in target fabrication is to increase target accuracy and rigidity. This has been realized by implementing AM into target stalk fabrication, allowing increased complexity to address target strength and the addition of features for alignment at facilities. As a result, we will describe the fabrication of these components and our plans to utilize AM in the future.

  6. Performance of the New Los Alamos UCN Source and Implications for Future Experiments

    NASA Astrophysics Data System (ADS)

    Makela, Mark; LANL UCN Team

    2017-01-01

    The Los Alamos Ultracold Neutron (UCN) source was replaced during this past summer and has been commissioned during the last few months. The new source is the result of lessons learned during the 10-year operation of the first UCN source and of extensive Monte Carlo analysis. The new source is a spallation-driven source based on a solid deuterium UCN moderator similar to the previous one. This talk will present an overview of the new source design and the results of commissioning tests. The talk will conclude with a brief overview of the implications of source performance for the neutron lifetime and LANL nEDM experiments. This work was funded by LANL LDRD.

  7. Electron linear accelerator system for natural rubber vulcanization

    NASA Astrophysics Data System (ADS)

    Rimjaem, S.; Kongmon, E.; Rhodes, M. W.; Saisut, J.; Thongbai, C.

    2017-09-01

    Development of an electron accelerator system, beam diagnostic instruments, an irradiation apparatus and an electron beam processing methodology for natural rubber vulcanization is underway at the Plasma and Beam Physics Research Facility, Chiang Mai University, Thailand. The project is carried out with the aim of improving the quality of natural rubber products. The system consists of a DC thermionic electron gun, a 5-cell standing-wave radio-frequency (RF) linear accelerator (linac) with side-coupling cavities and an electron beam irradiation apparatus. This system is used to produce electron beams with an adjustable energy between 0.5 and 4 MeV and a pulse current of 10-100 mA at a pulse repetition rate of 20-400 Hz. An average absorbed dose between 160 and 640 Gy is expected to be achieved for a 4 MeV electron beam when the accelerator is operated at 400 Hz. The research activities focus first on assembly of the accelerator system, studies of accelerator properties and electron beam dynamics simulations. The resonant frequency of the RF linac in the π/2 operating mode is 2996.82 MHz at an operating temperature of 35 °C. The beam dynamics simulations were conducted using the code ASTRA. Simulation results suggest that electron beams with an average energy of 4.002 MeV can be obtained when the linac accelerating gradient is 41.7 MV/m. The rms transverse beam size and normalized rms transverse emittance at the linac exit are 0.91 mm and 10.48 π mm·mrad, respectively. This information can then be used as input for Monte Carlo simulations to estimate the electron beam penetration depth and dose distribution in the natural rubber latex. The results of this research will be used to define optimal conditions for natural rubber vulcanization with different electron beam energies and doses, which is very useful for the development of future practical industrial accelerator units.

  8. Emittance Growth in the DARHT-II Linear Induction Accelerator

    DOE PAGES

    Ekdahl, Carl; Carlson, Carl A.; Frayer, Daniel K.; ...

    2017-10-03

    The dual-axis radiographic hydrodynamic test (DARHT) facility uses bremsstrahlung radiation source spots produced by the focused electron beams from two linear induction accelerators (LIAs) to radiograph large hydrodynamic experiments driven by high explosives. Radiographic resolution is determined by the size of the source spot, and beam emittance is the ultimate limitation to spot size. On the DARHT-II LIA, we measure an emittance higher than predicted by theoretical simulations, and even though this accelerator produces submillimeter source spots, we are exploring ways to improve the emittance. Some of the possible causes for the discrepancy have been investigated using particle-in-cell codes. Finally, the simulations establish that the most likely source of emittance growth is a mismatch of the beam to the magnetic transport, which can cause beam halo.

  9. Emittance Growth in the DARHT-II Linear Induction Accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekdahl, Carl; Carlson, Carl A.; Frayer, Daniel K.

    The dual-axis radiographic hydrodynamic test (DARHT) facility uses bremsstrahlung radiation source spots produced by the focused electron beams from two linear induction accelerators (LIAs) to radiograph large hydrodynamic experiments driven by high explosives. Radiographic resolution is determined by the size of the source spot, and beam emittance is the ultimate limitation to spot size. On the DARHT-II LIA, we measure an emittance higher than predicted by theoretical simulations, and even though this accelerator produces submillimeter source spots, we are exploring ways to improve the emittance. Some of the possible causes for the discrepancy have been investigated using particle-in-cell codes. Finally, the simulations establish that the most likely source of emittance growth is a mismatch of the beam to the magnetic transport, which can cause beam halo.

  10. Combining Acceleration and Displacement Dependent Modal Frequency Responses Using an MSC/NASTRAN DMAP Alter

    NASA Technical Reports Server (NTRS)

    Barnett, Alan R.; Widrick, Timothy W.; Ludwiczak, Damian R.

    1996-01-01

    Solving for dynamic responses of free-free launch vehicle/spacecraft systems acted upon by buffeting winds is commonly performed throughout the aerospace industry. Due to the unpredictable nature of this wind loading event, these problems are typically solved using frequency response random analysis techniques. To generate dynamic responses for spacecraft with statically-indeterminate interfaces, spacecraft contractors prefer to develop models which have response transformation matrices developed for mode acceleration data recovery. This method transforms spacecraft boundary accelerations and displacements into internal responses. Unfortunately, standard MSC/NASTRAN modal frequency response solution sequences cannot be used to combine the acceleration- and displacement-dependent responses required for spacecraft mode acceleration data recovery. External user-written computer codes can be used with MSC/NASTRAN output to perform such combinations, but these methods can be labor and computer resource intensive. Taking advantage of the analytical and computer resource efficiencies inherent within MSC/NASTRAN, a DMAP Alter has been developed to combine acceleration- and displacement-dependent modal frequency responses for performing spacecraft mode acceleration data recovery. The Alter has been used successfully to efficiently solve a common aerospace buffeting wind analysis.
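
    As a rough illustration of the combination the Alter performs, the sketch below forms internal responses at each frequency line from boundary accelerations and displacements through two response transformation matrices. The matrix names (ATM, DTM), their sizes, and the harmonic relation used to derive displacements from accelerations are illustrative assumptions, not MSC/NASTRAN data blocks or the DMAP Alter itself.

      # Mode-acceleration-style recovery sketch: internal = ATM * accel + DTM * disp,
      # evaluated at every frequency line. Names and sizes are illustrative only.
      import numpy as np

      rng = np.random.default_rng(1)
      n_freq, n_boundary, n_internal = 200, 6, 50

      ATM = rng.standard_normal((n_internal, n_boundary))   # acceleration-dependent transformation
      DTM = rng.standard_normal((n_internal, n_boundary))   # displacement-dependent transformation

      freqs = np.linspace(1.0, 100.0, n_freq)               # Hz
      omega = 2.0 * np.pi * freqs
      accel = (rng.standard_normal((n_freq, n_boundary))
               + 1j * rng.standard_normal((n_freq, n_boundary)))   # boundary accelerations
      disp = -accel / omega[:, None] ** 2                   # harmonic motion: x = -a / omega^2

      internal = accel @ ATM.T + disp @ DTM.T               # combined recovery, (n_freq, n_internal)
      print(internal.shape)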

  11. Probabilistic seismic hazard zonation for the Cuban building code update

    NASA Astrophysics Data System (ADS)

    Garcia, J.; Llanes-Buron, C.

    2013-05-01

    A probabilistic seismic hazard assessment has been performed in response to a revision and update of the Cuban building code (NC-46-99) for earthquake-resistant building construction. The hazard assessment has been done according to the standard probabilistic approach (Cornell, 1968), adopting the procedures used by other nations that have revised and updated their national building codes. Problems of earthquake catalogue treatment, attenuation of peak and spectral ground acceleration, and seismic source definition have been rigorously analyzed, and a logic-tree approach was used to represent the inevitable uncertainties encountered throughout the seismic hazard estimation process. The seismic zonation proposed here consists of a map of spectral acceleration values for short (0.2 s) and long (1.0 s) periods on rock conditions with a 1642-year return period, which is considered the maximum credible earthquake (ASCE 7-05). In addition, three other design levels are proposed (severe earthquake: 808-year return period; ordinary earthquake: 475-year return period; minimum earthquake: 225-year return period). The proposed seismic zonation fulfils international standards (IBC-ICC) as well as current worldwide practice.
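
    The four design levels quoted above are consistent with the usual mapping between return period T and probability of exceedance p over a design life t, namely T = -t / ln(1 - p). The 50-year design life and the specific exceedance probabilities used below are the conventional choices and are an inference here, not values stated in the record.

      # Return periods from exceedance probabilities over an assumed 50-year design life.
      import math

      def return_period(p_exceed, life_years=50.0):
          return -life_years / math.log(1.0 - p_exceed)

      for p in (0.03, 0.06, 0.10, 0.20):
          print(f"{p:.0%} in 50 yr -> {return_period(p):.0f}-year return period")
      # ~1642, ~808, ~475 and ~224 years, closely matching the four design levels above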

  12. Turbulent Heating and Wave Pressure in Solar Wind Acceleration Modeling: New Insights to Empirical Forecasting of the Solar Wind

    NASA Astrophysics Data System (ADS)

    Woolsey, L. N.; Cranmer, S. R.

    2013-12-01

    The study of solar wind acceleration has made several important advances recently due to improvements in modeling techniques. Existing code and simulations test the competing theories for coronal heating, which include reconnection/loop-opening (RLO) models and wave/turbulence-driven (WTD) models. In order to compare and contrast the validity of these theories, we need flexible tools that predict the emergent solar wind properties from a wide range of coronal magnetic field structures such as coronal holes, pseudostreamers, and helmet streamers. ZEPHYR (Cranmer et al. 2007) is a one-dimensional magnetohydrodynamics code that includes Alfven wave generation and reflection and the resulting turbulent heating to accelerate solar wind in open flux tubes. We present the ZEPHYR output for a wide range of magnetic field geometries to show the effect of the magnetic field profiles on wind properties. We also investigate the competing acceleration mechanisms found in ZEPHYR to determine the relative importance of increased gas pressure from turbulent heating and the separate pressure source from the Alfven waves. To do so, we developed a code that will become publicly available for solar wind prediction. This code, TEMPEST, provides an outflow solution based on only one input: the magnetic field strength as a function of height above the photosphere. It uses correlations found in ZEPHYR between the magnetic field strength at the source surface and the temperature profile of the outflow solution to compute the wind speed profile based on the increased gas pressure from turbulent heating. With this initial solution, TEMPEST then adds in the Alfven wave pressure term to the modified Parker equation and iterates to find a stable solution for the wind speed. This code, therefore, can make predictions of the wind speeds that will be observed at 1 AU based on extrapolations from magnetogram data, providing a useful tool for empirical forecasting of the solar wind.
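
    As a baseline for the outflow solutions these codes iterate on, the classic isothermal Parker wind can be solved in a few lines. The sketch below omits the turbulent heating and Alfven wave pressure terms that ZEPHYR and TEMPEST add, and the coronal temperature is just a representative value, so it is context rather than a stand-in for either code.

      # Transonic isothermal Parker wind: bisection on the correct branch of
      # (u/a)^2 - ln((u/a)^2) = 4 ln(r/r_c) + 4 r_c/r - 3. Baseline only; no wave pressure.
      import math

      G, M_SUN, K_B, M_P = 6.674e-11, 1.989e30, 1.381e-23, 1.673e-27

      def parker_speed(r, T=1.5e6):
          """Wind speed u(r) in m/s for an isothermal corona at temperature T (r >~ R_sun)."""
          a = math.sqrt(K_B * T / M_P)          # isothermal sound speed
          r_c = G * M_SUN / (2.0 * a * a)       # critical (sonic) radius
          rhs = 4.0 * math.log(r / r_c) + 4.0 * r_c / r - 3.0

          def f(u):
              x = (u / a) ** 2
              return x - math.log(x) - rhs

          # subsonic branch inside r_c, supersonic branch outside
          lo, hi = (1e-6 * a, a * (1.0 - 1e-9)) if r < r_c else (a * (1.0 + 1e-9), 20.0 * a)
          for _ in range(200):
              mid = 0.5 * (lo + hi)
              if (f(lo) > 0.0) == (f(mid) > 0.0):
                  lo = mid
              else:
                  hi = mid
          return 0.5 * (lo + hi)

      if __name__ == "__main__":
          au = 1.496e11
          print(f"u(1 AU) ~ {parker_speed(au) / 1e3:.0f} km/s")

    For a 1.5 MK corona this gives roughly 400 km/s at 1 AU; the wave pressure and turbulence-driven temperature profiles described above modify this baseline.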

  13. Computation of Thermodynamic Equilibria Pertinent to Nuclear Materials in Multi-Physics Codes

    NASA Astrophysics Data System (ADS)

    Piro, Markus Hans Alexander

    components at each iterative step, and the objective is to minimize the residuals of the mass balance equations. Several numerical advantages are achieved through this simplification. In particular, computational expense is reduced and the rate of convergence is enhanced. Furthermore, the software has demonstrated the ability to solve systems involving as many as 118 component elements. An early version of the code has already been integrated into the Advanced Multi-Physics (AMP) code under development by the Oak Ridge National Laboratory, Los Alamos National Laboratory, Idaho National Laboratory and Argonne National Laboratory. Keywords: Engineering, Nuclear -- 0552, Engineering, Material Science -- 0794, Chemistry, Mathematics -- 0405, Computer Science -- 0984

  14. Automated and Assistive Tools for Accelerated Code migration of Scientific Computing on to Heterogeneous MultiCore Systems

    DTIC Science & Technology

    2017-04-13

    modelling code, a parallel benchmark, and a communication-avoiding version of the QR algorithm. Further, several improvements to the OmpSs model were... movement; and a port of the dynamic load balancing library to OmpSs. Finally, several updates to the tools infrastructure were accomplished, including: an... OmpSs: a basic algorithm on image processing applications, a mini application representative of an ocean modelling code, a parallel benchmark, and a

  15. Benchmarking the MCNP Monte Carlo code with a photon skyshine experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olsher, R.H.; Hsu, Hsiao Hua; Harvey, W.F.

    1993-07-01

    The MCNP Monte Carlo transport code is used by the Los Alamos National Laboratory Health and Safety Division for a broad spectrum of radiation shielding calculations. One such application involves the determination of skyshine dose for a variety of photon sources. To verify the accuracy of the code, it was benchmarked with the Kansas State Univ. (KSU) photon skyshine experiment of 1977. The KSU experiment for the unshielded source geometry was simulated in great detail to include the contribution of groundshine, in-silo photon scatter, and the effect of spectral degradation in the source capsule. The standard deviation of the KSU experimental data was stated to be 7%, while the statistical uncertainty of the simulation was kept at or under 1%. The results of the simulation agreed closely with the experimental data, generally to within 6%. At distances of under 100 m from the silo, the modeling of the in-silo scatter was crucial to achieving close agreement with the experiment. Specifically, scatter off the top layer of the source cask accounted for approximately 12% of the dose at 50 m. At distances >300 m, using the ⁶⁰Co line spectrum led to a dose overresponse as great as 19% at 700 m. It was necessary to use the actual source spectrum, which includes a Compton tail from photon collisions in the source capsule, to achieve close agreement with the experimental data. These results highlight the importance of using Monte Carlo transport techniques to account for the nonideal features of even simple experiments.

  16. Workshop on Probing Frontiers in Matter with Neutron Scattering, Wrap-up Session Chaired by John C. Browne on December 14, 1997, at Fuller Lodge, Los Alamos, New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mezei, F.; Thompson, J.

    1998-12-01

    The Workshop on Probing Frontiers in Matter with Neutron Scattering consisted of a series of lectures and discussions about recent highlights in neutron scattering. In this report, we present the transcript of the concluding discussion session (wrap-up session) chaired by John C. Browne, Director of Los Alamos National Laboratory. The workshop had covered a spectrum of topics ranging from high-Tc superconductivity to polymer science, from glasses to molecular biology, a broad review aimed at identifying trends and future needs in condensed matter research. The focus of the wrap-up session was to summarize the workshop participants' views on developments to come. Most of the highlights presented during the workshop were the result of experiments performed at the leading reactor-based neutron scattering facilities. However, recent advances with very high power accelerators open up opportunities to develop new approaches to the spallation technique that could decisively advance neutron scattering research in areas for which reactor sources are today by far the best choice. The powerful combination of neutron scattering and increasingly accurate computer modeling emerged as another area of opportunity for research in the coming decades.

  17. Activation of accelerator construction materials by heavy ions

    NASA Astrophysics Data System (ADS)

    Katrík, P.; Mustafin, E.; Hoffmann, D. H. H.; Pavlovič, M.; Strašík, I.

    2015-12-01

    Activation data for an aluminum target irradiated by a 200 MeV/u ²³⁸U ion beam are presented in this paper. The target was irradiated in the stacked-foil geometry and analyzed using gamma-ray spectroscopy. The purpose of the experiment was to study the role of primary particles, projectile fragments, and target fragments in the activation process using depth profiling of the residual activity. The study showed which particles contribute dominantly to the target activation. The experimental data were compared with Monte Carlo simulations by the FLUKA 2011.2c.0 code. This study is part of a research program devoted to the activation of accelerator construction materials by high-energy (⩾200 MeV/u) heavy ions at GSI Darmstadt. The experimental data are needed to validate the computer codes used to simulate the interaction of swift heavy ions with matter.

  18. First muon acceleration using a radio-frequency accelerator

    NASA Astrophysics Data System (ADS)

    Bae, S.; Choi, H.; Choi, S.; Fukao, Y.; Futatsukawa, K.; Hasegawa, K.; Iijima, T.; Iinuma, H.; Ishida, K.; Kawamura, N.; Kim, B.; Kitamura, R.; Ko, H. S.; Kondo, Y.; Li, S.; Mibe, T.; Miyake, Y.; Morishita, T.; Nakazawa, Y.; Otani, M.; Razuvaev, G. P.; Saito, N.; Shimomura, K.; Sue, Y.; Won, E.; Yamazaki, T.

    2018-05-01

    Muons have been accelerated using a radio-frequency accelerator for the first time. Negative muonium atoms (Mu⁻), which are bound states of a positive muon (μ⁺) and two electrons, are generated from μ⁺'s through the electron capture process in an aluminum degrader. The generated Mu⁻'s are initially accelerated electrostatically and injected into a radio-frequency quadrupole linac (RFQ). In the RFQ, the Mu⁻'s are accelerated to 89 keV. The accelerated Mu⁻'s are identified by momentum measurement and time of flight. This compact muon linac opens the door to various muon accelerator applications, including particle physics measurements and the construction of a transmission muon microscope.
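
    The time-of-flight identification can be sanity-checked with simple relativistic kinematics: a Mu⁻ ion has a rest energy of about one muon plus two electrons (roughly 106.7 MeV), so at 89 keV it moves at about 4% of the speed of light. The 1 m drift length in the sketch below is an assumed illustrative number, not the actual beamline geometry.

      # Kinematics sketch for an 89 keV Mu^- (muon plus two electrons, binding neglected).
      # The 1.0 m drift length is an assumption for illustration only.
      import math

      MUON_MC2 = 105.658      # MeV
      ELECTRON_MC2 = 0.511    # MeV
      C = 299_792_458.0       # m/s

      def beta_of(kinetic_mev, rest_mev):
          gamma = 1.0 + kinetic_mev / rest_mev
          return math.sqrt(1.0 - 1.0 / gamma ** 2)

      rest = MUON_MC2 + 2.0 * ELECTRON_MC2     # ~106.68 MeV
      beta = beta_of(0.089, rest)              # 89 keV kinetic energy
      drift = 1.0                              # m (assumed)
      print(f"beta = {beta:.4f}, TOF over {drift:.1f} m = {drift / (beta * C) * 1e9:.1f} ns")

    Particles of different mass or kinetic energy arrive at measurably different times over such a drift, which is what the time-of-flight selection exploits.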

  19. Threatened and Endangered Species Habitat Management Plan for Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keller, David Charles; Hathcock, Charles Dean

    Los Alamos National Laboratory’s (LANL) Threatened and Endangered Species Habitat Management Plan (HMP) fulfills a commitment made to the U.S. Department of Energy (DOE) in the “Final Environmental Impact Statement for the Dual-Axis Radiographic Hydrodynamic Test Facility Mitigation Action Plan” (DOE 1996). The HMP received concurrence from the U.S. Fish and Wildlife Service (USFWS) in 1999 (USFWS consultation numbers 2-22-98-I-336 and 2-22-95-I-108). This 2015 update retains the management guidelines from the 1999 HMP for listed species, updates some descriptive information, and adds the New Mexico Meadow Jumping Mouse (Zapus hudsonius luteus) and Yellow-billed Cuckoo (Coccyzus americanus), which were federally listed in 2014 (Keller 2015: USFWS consultation number 02ENNM00-2015-I-0538).

  20. Pickup ion acceleration in the successive appearance of corotating interaction regions

    NASA Astrophysics Data System (ADS)

    Tsubouchi, K.

    2017-04-01

    Acceleration of pickup ions (PUIs) in an environment surrounded by a pair of corotating interaction regions (CIRs) was investigated by numerical simulations using a hybrid code. Energetic particles associated with CIRs have been considered to be a result of the acceleration at their shock boundaries, but recent observations identified the ion flux peaks in the sub-MeV to MeV energy range in the rarefaction region, where two separate CIRs were likely connected by the magnetic field. Our simulation results confirmed these observational features. As the accelerated PUIs repeatedly bounce back and forth along the field lines between the reverse shock of the first CIR and the forward shock of the second one, the energetic population is accumulated in the rarefaction region. It was also verified that PUI acceleration in the dual CIR system had two different stages. First, because PUIs have large gyroradii, multiple shock crossing is possible for several tens of gyroperiods, and there is an energy gain in the component parallel to the magnetic field via shock drift acceleration. Second, as the field rarefaction evolves and the radial magnetic field becomes dominant, Fermi-type reflection takes place at the shock. The converging nature of two shocks results in a net energy gain. The PUI energy acquired through these processes is close to 0.5 MeV, which may be large enough for further acceleration, possibly resulting in the source of anomalous cosmic rays.