Science.gov

Sample records for fusion simulation project

  1. Fusion Simulation Project Workshop Report

    NASA Astrophysics Data System (ADS)

    Kritz, Arnold; Keyes, David

    2009-03-01

    The mission of the Fusion Simulation Project is to develop a predictive capability for the integrated modeling of magnetically confined plasmas. This FSP report adds to the previous activities that defined an approach to integrated modeling in magnetic fusion. These previous activities included a Fusion Energy Sciences Advisory Committee panel that was charged to study integrated simulation in 2002. The report of that panel [Journal of Fusion Energy 20, 135 (2001)] recommended the prompt initiation of a Fusion Simulation Project. In 2003, the Office of Fusion Energy Sciences formed a steering committee that developed a project vision, roadmap, and governance concepts [Journal of Fusion Energy 23, 1 (2004)]. The current FSP planning effort involved 46 physicists, applied mathematicians and computer scientists, from 21 institutions, formed into four panels and a coordinating committee. These panels were constituted to consider: Status of Physics Components, Required Computational and Applied Mathematics Tools, Integration and Management of Code Components, and Project Structure and Management. The ideas, reported here, are the products of these panels, working together over several months and culminating in a 3-day workshop in May 2007.

  2. AI/Simulation Fusion Project at Lawrence Livermore National Laboratory

    SciTech Connect

    Erickson, S.A.

    1984-04-25

This presentation first discusses the motivation for the AI/Simulation Fusion project. After briefly reviewing what expert systems and object-oriented languages are, along with some observed features of typical combat simulations, it explains why combining artificial intelligence with combat simulation makes sense. We then describe the first demonstration goal for this fusion project.

  3. Fusion Simulation Project. Workshop sponsored by the U.S. Department of Energy Rockville, MD, May 16-18, 2007

    SciTech Connect

    2007-05-16

    The mission of the Fusion Simulation Project is to develop a predictive capability for the integrated modeling of magnetically confined plasmas. This FSP report adds to the previous activities that defined an approach to integrated modeling in magnetic fusion. These previous activities included a Fusion Energy Sciences Advisory Committee panel that was charged to study integrated simulation in 2002. The report of that panel [Journal of Fusion Energy 20, 135 (2001)] recommended the prompt initiation of a Fusion Simulation Project. In 2003, the Office of Fusion Energy Sciences formed a steering committee that developed a project vision, roadmap, and governance concepts [Journal of Fusion Energy 23, 1 (2004)]. The current FSP planning effort involved forty-six physicists, applied mathematicians and computer scientists, from twenty-one institutions, formed into four panels and a coordinating committee. These panels were constituted to consider: Status of Physics Components, Required Computational and Applied Mathematics Tools, Integration and Management of Code Components, and Project Structure and Management. The ideas, reported here, are the products of these panels, working together over several months and culminating in a three-day workshop in May 2007.

  4. Fusion Simulation Project. Workshop Sponsored by the U.S. Department of Energy, Rockville, MD, May 16-18, 2007

    SciTech Connect

    Kritz, A.; Keyes, D.

    2007-05-18

    The mission of the Fusion Simulation Project is to develop a predictive capability for the integrated modeling of magnetically confined plasmas. This FSP report adds to the previous activities that defined an approach to integrated modeling in magnetic fusion. These previous activities included a Fusion Energy Sciences Advisory Committee panel that was charged to study integrated simulation in 2002. The report of that panel [Journal of Fusion Energy 20, 135 (2001)] recommended the prompt initiation of a Fusion Simulation Project. In 2003, the Office of Fusion Energy Sciences formed a steering committee that developed a project vision, roadmap, and governance concepts [Journal of Fusion Energy 23, 1 (2004)]. The current FSP planning effort involved forty-six physicists, applied mathematicians and computer scientists, from twenty-one institutions, formed into four panels and a coordinating committee. These panels were constituted to consider: Status of Physics Components, Required Computational and Applied Mathematics Tools, Integration and Management of Code Components, and Project Structure and Management. The ideas, reported here, are the products of these panels, working together over several months and culminating in a three-day workshop in May 2007.

  5. Lessons Learned from ASCI applied to the Fusion Simulation Project (FSP)

    NASA Astrophysics Data System (ADS)

    Post, Douglass

    2003-10-01

The magnetic fusion program has proposed a $20M-per-year project to develop a computational predictive capability for magnetic fusion experiments. The DOE NNSA launched a program in 1996, the Accelerated Strategic Computing Initiative (ASCI), to achieve the same goal for nuclear weapons, allowing certification of the stockpile without testing. We present a "lessons learned" analysis of the $3B, 7-year ASCI program with the goal of improving the FSP to maximize the likelihood of success. The major lessons from ASCI are: 1. Build on your institution's successful history; 2. Teams are the key element; 3. Sound software project management is essential; 4. Requirements, schedule and resources must be consistent; 5. Practices, not processes, are important; 6. Minimize and mitigate risks; 7. Minimize the computer science research aspect and maximize the physics elements; and 8. Verification and validation are essential. We map this experience and these recommendations onto the FSP.

  6. Advanced fusion concepts: project summaries

    SciTech Connect

    1980-12-01

This report contains descriptions of the activities of all the projects supported by the Advanced Fusion Concepts Branch of the Office of Fusion Energy, US Department of Energy. These descriptions are project summaries of each of the individual projects, and contain the following: title, principal investigators, funding levels, purpose, approach, progress, plans, milestones, graduate students, graduates, other professional staff, and recent publications. Information is given for each of the following programs: (1) reverse-field pinch, (2) compact toroid, (3) alternate fuel/multipoles, (4) stellarator/torsatron, (5) linear magnetic fusion, (6) liners, and (7) Tormac. (MOW)

  7. Fusion Simulation Program

    SciTech Connect

    Project Staff

    2012-02-29

    Under this project, General Atomics (GA) was tasked to develop the experimental validation plans for two high priority ISAs, Boundary and Pedestal and Whole Device Modeling in collaboration with the theory, simulation and experimental communities. The following sections have been incorporated into the final FSP Program Plan (www.pppl.gov/fsp), which was delivered to the US Department of Energy (DOE). Additional deliverables by GA include guidance for validation, development of metrics to evaluate success and procedures for collaboration with experiments. These are also part of the final report.

  8. Fusion Plasma Theory project summaries

    SciTech Connect

    Not Available

    1993-10-01

This Project Summary book is a published compilation consisting of short descriptions of each project supported by the Fusion Plasma Theory and Computing Group of the Advanced Physics and Technology Division of the Department of Energy, Office of Fusion Energy. The summaries contained in this volume were written by the individual contractors with minimal editing by the Office of Fusion Energy. Previous summaries were published in February 1982 and December 1987. The Plasma Theory program is responsible for the development of concepts and models that describe and predict the behavior of a magnetically confined plasma. Emphasis is given to modelling and understanding the processes controlling transport of energy and particles in a toroidal plasma, and to supporting the design of the International Thermonuclear Experimental Reactor (ITER). A tokamak transport initiative was begun in 1989 to improve understanding of how energy and particles are lost from the plasma by mechanisms that transport them across field lines. The Plasma Theory program has actively participated in this initiative. Recently, increased attention has been given to issues of importance to the proposed Tokamak Physics Experiment (TPX). Particular attention has been paid to containment and thermalization of fast alpha particles produced in a burning fusion plasma, as well as to control of sawteeth, current drive, impurity control, and design of improved auxiliary heating. In addition, general models of plasma behavior are developed from physics features common to different confinement geometries. This work uses both analytical and numerical techniques. The Fusion Theory program supports research projects at US government laboratories, universities and industrial contractors. Its support of theoretical work at universities contributes to the Office of Fusion Energy mission of training scientific manpower for the US Fusion Energy Program.

  9. Fusion Simulation Program Definition. Final report

    SciTech Connect

    Cary, John R.

    2012-09-05

    We have completed our contributions to the Fusion Simulation Program Definition Project. Our contributions were in the overall planning with concentration in the definition of the area of Software Integration and Support. We contributed to the planning of multiple meetings, and we contributed to multiple planning documents.

  10. Simulation of Fusion Plasmas

    ScienceCinema

Holland, Chris [UC San Diego, San Diego, California, United States]

    2016-07-12

    The upcoming ITER experiment (www.iter.org) represents the next major milestone in realizing the promise of using nuclear fusion as a commercial energy source, by moving into the “burning plasma” regime where the dominant heat source is the internal fusion reactions. As part of its support for the ITER mission, the US fusion community is actively developing validated predictive models of the behavior of magnetically confined plasmas. In this talk, I will describe how the plasma community is using the latest high performance computing facilities to develop and refine our models of the nonlinear, multiscale plasma dynamics, and how recent advances in experimental diagnostics are allowing us to directly test and validate these models at an unprecedented level.

  11. SECAD-- a Schema-based Environment for Configuring, Analyzing and Documenting Integrated Fusion Simulations. Final report

    SciTech Connect

    Shasharina, Svetlana

    2012-05-23

    SECAD is a project that developed a GUI for running integrated fusion simulations as implemented in FACETS and SWIM SciDAC projects. Using the GUI users can submit simulations locally and remotely and visualize the simulation results.

  12. Component Framework for Coupled Integrated Fusion Plasma Simulation

    SciTech Connect

    Elwasif, Wael R; Bernholdt, David E; Berry, Lee A; Batchelor, Donald B

    2007-01-01

Successful simulation of the complex physics that affects magnetically confined fusion plasmas remains an important target milestone toward the development of viable fusion energy. Major advances in the underlying physics formulations, mathematical modeling, and computational tools and techniques are needed to enable a complete fusion simulation on the emerging class of large-scale capability parallel computers coming on-line in the next few years. Several pilot projects are currently under way to explore different (partial) code integration and coupling problems, and possible solutions that may guide the larger integration endeavor. In this paper, we present the design and implementation details of one such project, a component-based approach to coupling existing codes that model the interaction between high-power radio frequency (RF) electromagnetic waves and magnetohydrodynamic (MHD) aspects of the burning plasma. The framework and component design use a light coupling approach, based on a high-level view of the constituent codes, that facilitates rapid incorporation of new components into the integrated simulation framework. The work illustrates the viability of light coupling as a way to better understand the physics and the dependencies and interactions of stand-alone computer codes, as a precursor to a more tightly coupled integrated simulation environment.

  13. FASTER project - data fusion for trafficability assessment

    NASA Astrophysics Data System (ADS)

    Skocki, K.; Nevatia, Y.

    2013-09-01

Martian surface missions since Sojourner have typically used a robotic rover platform to carry the science instrumentation. This concept, successfully demonstrated by the twin MER rovers, nevertheless carries risk from unrecognized patches of low-trafficability soil. Soil traversability assessment is the basis of the FASTER project activities. This article briefly presents topics of special interest for safe path finding and decision making by planetary rovers, and analyzes the data fusion aspect of this process.

  14. Development of our laser fusion integration simulation

    NASA Astrophysics Data System (ADS)

    Li, Jinghong; Zhai, Chuanlei; Li, Shuanggui; Li, Xin; Zheng, Wudi; Yong, Heng; Zeng, Qinghong; Hang, Xudeng; Qi, Jin; Yang, Rong; Cheng, Juan; Song, Peng; Gu, Peijun; Zhang, Aiqing; An, Hengbin; Xu, Xiaowen; Guo, Hong; Cao, Xiaolin; Mo, Zeyao; Pei, Wenbing; Jiang, Song; Zhu, Shao-ping

    2013-11-01

In the target design of the Inertial Confinement Fusion (ICF) program, it is common practice to apply radiation hydrodynamics codes to study the key physical processes in ICF, such as hohlraum physics, radiation drive symmetry, and capsule implosion physics in the radiation-drive approach. Recently, much effort has been devoted to developing our 2D integrated simulation capability for laser fusion with a variety of optional physical models and numerical methods. In order to effectively integrate the existing codes and to facilitate the development of new codes, we are developing an object-oriented structured-mesh parallel code-supporting infrastructure called JASMIN. Based on the two-dimensional three-temperature hohlraum physics code LARED-H and the two-dimensional multi-group radiative transfer code LARED-R, we have developed a new-generation two-dimensional laser fusion code under the JASMIN infrastructure, which enables us to simulate the whole process of laser fusion from the laser beams' entrance into the hohlraum to the end of implosion. In this paper, we give a brief description of this code, named LARED-Integration, especially its physical models, and present some hohlraum simulation results.

  15. Simulation of MTF experiments at General Fusion

    NASA Astrophysics Data System (ADS)

    Reynolds, Meritt; Froese, Aaron; Barsky, Sandra; Devietien, Peter; Toth, Gabor; Brennan, Dylan; Hooper, Bick

    2016-10-01

    General Fusion (GF) aims to develop a magnetized target fusion (MTF) power plant based on compression of magnetically-confined plasma by liquid metal. GF is testing this compression concept by collapsing solid aluminum liners onto spheromak or tokamak plasmas. To simulate the evolution of the compressing plasma in these experiments, we integrated a moving-mesh method into a finite-volume MHD code (VAC). The single-fluid model includes temperature-dependent resistivity and anisotropic heat transport. The trajectory of the liner is based on experiments and LS-DYNA simulations. During compression the geometry remains axially symmetric, but the MHD simulation is fully 3D to capture ideal and resistive plasma instabilities. We compare simulation to experiment through the primary diagnostic of Mirnov probes embedded in the inner coaxial surface against which the magnetic flux and plasma are compressed by the imploding liner. The MHD simulation reproduces the appearance of n=1 mode activity observed in experiments performed in negative D-shape geometry (MRT and PROSPECTOR machines). The same code predicts more favorable compression in spherical tokamak geometry, having positive D-shape (SPECTOR machine).

  16. Gyrokinetic Simulation of TAE in Fusion plasmas

    NASA Astrophysics Data System (ADS)

    Wang, Zhixuan

Linear gyrokinetic simulation of fusion plasmas finds a radial localization of the toroidal Alfvén eigenmodes (TAE) due to the non-perturbative energetic-particle (EP) contribution. The EP-driven TAE has a radial mode width much smaller than that predicted by magnetohydrodynamic (MHD) theory. The TAE radial position stays around the strongest EP pressure gradients when the EP profile evolves. The non-perturbative EP contribution is also the main cause of the breaking of the radial symmetry of the ballooning mode structure and of the dependence of the TAE frequency on the toroidal mode number. These phenomena are beyond the picture of conventional MHD theory. Linear gyrokinetic simulation of the electron cyclotron heating (ECH) experiments on DIII-D successfully recovers the TAE and RSAE. The EP profile, rather than the electron temperature, is found to be the key factor determining whether the TAE or the RSAE is the dominant mode in the system in our simulation. Investigation of the nonlinear gyrokinetic simulation model reveals a missing nonlinear term which makes important contributions to the zonal magnetic fields. A new fluid-electron hybrid model is proposed to keep this nonlinear term in the lowest-order fluid part. Nonlinear simulation of the TAE using DIII-D parameters confirms the importance of this new term for the zonal magnetic fields. It is also found that zonal structures dominated by zonal electric fields are driven at about twice the linear growth rate of the TAE during the linear phase. The zonal flows then limit the nonlinear saturation level by tearing the eigenmode structures apart. In the nonlinear phase of the simulation, the major frequency in the system chirps down by about 30% and stays there.

  17. Purdue Contribution of Fusion Simulation Program

    SciTech Connect

    Jeffrey Brooks

    2011-09-30

The overall science goal of the FSP is to develop predictive simulation capability for magnetically confined fusion plasmas at an unprecedented level of integration and fidelity. This will directly support and enable effective U.S. participation in research related to the International Thermonuclear Experimental Reactor (ITER) and the overall mission of delivering practical fusion energy. The FSP will address a rich set of scientific issues together with experimental programs, producing validated integrated physics results. This is very well aligned with the mission of the ITER Organization to coordinate with its members the integrated modeling and control of fusion plasmas, including benchmarking and validation activities [1]. Initial FSP research will focus on two critical areas: 1) the plasma edge and 2) whole device modeling, including disruption avoidance. The first of these problems involves the narrow plasma boundary layer and its complex interactions with the plasma core and the surrounding material wall. The second requires development of a computationally tractable but comprehensive model that describes all equilibrium and dynamic processes at a sufficient level of detail to provide useful prediction of the temporal evolution of fusion plasma experiments. The initial driver for the whole device model (WDM) will be prediction and avoidance of discharge-terminating disruptions, especially at high performance, which are a critical impediment to successful operation of machines like ITER. If disruptions cannot be avoided, their associated dynamics and effects will be addressed in the next phase of the FSP. The FSP plan targets the needed modeling capabilities by developing Integrated Science Applications (ISAs) specific to their needs. The Pedestal-Boundary model will include boundary magnetic topology, cross-field transport of multi-species plasmas, parallel plasma transport, neutral transport, atomic physics, and interactions with the plasma wall.

  18. Quality assurance in the Antares laser fusion construction project

    SciTech Connect

    Reichelt, W.H.

    1984-01-01

The Antares CO2 laser facility came on line in November 1983 as an experimental physics facility; it is the world's largest CO2 laser fusion system. Antares is a major component of the Department of Energy's Inertial Confinement Fusion Program. Antares is a one-of-a-kind laser system that is used in an experimental environment. Given limited project funds and tight schedules, the quality assurance program was tailored to achieve project goals without imposing oppressive constraints. The discussion will review the Antares quality assurance program and the utility of various portions to completion of the project.

  19. Fusion Cross Sections of Astrophysics Interest Within the STELLA Project

    NASA Astrophysics Data System (ADS)

    Courtin, Sandrine; Fruet, Guillaume; Jenkins, David G.; Heine, Marcel; Montanari, Daniele; Morris, Luke G.; Lotay, Gavin; Regan, Patrick H.; Kirsebom, Oliver S.; Della Negra, Serge; Hammache, Faïrouz; de Sereville, Nicolas; Bastin, Beyhan; de Oliveira, François; Randisi, Giacomo; Stodel, Christelle; Beck, Christian; Haas, Florent

    Low energy fusion between light heavy-ions is a key feature of the evolution of massive stars. In systems of astrophysical interest, the process may be strongly affected by molecular configurations of the compound nucleus, leading to resonant S factors. In particular, the 12C+12C fusion reaction has been the object of numerous experimental investigations. The STELLA project has been developed to extend these investigations to lower energies towards the Gamow window.

  20. Submodeling Simulations in Fusion Welds: Part II

    NASA Astrophysics Data System (ADS)

    Bonifaz, E. A.

    2013-11-01

In part I, three-dimensional transient non-linear submodeling heat transfer simulations were performed to study the thermal histories and thermal cycles that occur during the welding process at the macro, meso and micro scales. In the present work, the corresponding non-uniform temperature changes were imposed as load conditions on structural calculations to study the evolution of localized plastic strains and residual stresses at these sub-level scales. To reach the goal, a three-dimensional finite element elastic-plastic model (ABAQUS code) was developed. The submodeling technique, proposed for coupling phase-field (and/or digital microstructure) codes with finite element codes, was used to mesh a local part of the model with a refined mesh based on interpolation of the solution from an initial, relatively coarse, macro global model. The meso submodel is then the global model for the subsequent micro submodel. The strategy used to calculate temperatures, strains and residual stresses at the macro, meso and micro scale levels is flexible and can be applied to any number of levels. The objective of this research was to initiate the development of microstructural models to identify fusion welding process parameters for preserving the single crystal nature of gas turbine blades during repair procedures. The multi-scale submodeling approach can be used to capture weld pool features at the macro-meso scale level, and micro residual stress and secondary dendrite arm spacing features at the micro scale level.
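The core of the submodeling technique, interpolating a coarse global solution onto the nodes of a refined local mesh, can be sketched in one dimension. The node coordinates and temperature field below are illustrative assumptions, not the paper's ABAQUS model:

```python
def interpolate_global_to_submodel(global_nodes, global_temps, sub_nodes):
    """Linearly interpolate a coarse global temperature field onto the
    (finer) nodes of a submodel: the 'cut boundary' driving step."""
    sub_temps = []
    for x in sub_nodes:
        # find the enclosing coarse interval and interpolate within it
        for i in range(len(global_nodes) - 1):
            x0, x1 = global_nodes[i], global_nodes[i + 1]
            if x0 <= x <= x1:
                t = (x - x0) / (x1 - x0)
                sub_temps.append((1 - t) * global_temps[i] + t * global_temps[i + 1])
                break
    return sub_temps

# coarse global solution (macro scale) ...
global_x = [0.0, 1.0, 2.0]
global_T = [300.0, 900.0, 500.0]
# ... drives a refined submodel over [0.5, 1.5] (meso scale)
sub_x = [0.5 + 0.1 * k for k in range(11)]
sub_T = interpolate_global_to_submodel(global_x, global_T, sub_x)
print(sub_T)
```

The same step applies recursively: the meso solution becomes the global field driving the micro submodel, which is why the strategy extends to any number of levels.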

  1. Simulation of National Intelligence Process with Fusion

    DTIC Science & Technology

    2008-03-01

    modelled as zero mean Gaussian noise. The state estimate provided by a Kalman filter is statistically optimal in that it minimizes the mean squared error...the fusion methods above contain logical and mathematical algorithms based on either continuous or discrete quantifiable data, so to use these methods...method for capturing statistics about the performance of different architectures, it fails to capture the synergy of intelligence or information fusion
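The snippet above appeals to the optimality of the Kalman filter for state estimation under zero-mean Gaussian noise. A minimal one-dimensional sketch (all noise parameters are assumed toy values, not from the report) shows the predict/update cycle it refers to:

```python
import random

def kalman_1d(measurements, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    """Minimal 1-D Kalman filter for a (nearly) constant state.

    q: process-noise variance, r: measurement-noise variance
    (both assumed here; a real tracker would estimate them)."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                 # predict: uncertainty grows by process noise
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update: blend prediction with measurement
        p = (1 - k) * p           # posterior variance shrinks
        estimates.append(x)
    return estimates

# noisy measurements of a true value of 1.0
random.seed(0)
zs = [1.0 + random.gauss(0, 0.5) for _ in range(200)]
est = kalman_1d(zs)
print(est[-1])
```

The estimate converges toward the true value because the gain weights each new measurement by the ratio of predicted to total uncertainty, which is exactly the minimum-mean-squared-error blend under the Gaussian assumptions.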

  2. Spherically symmetric simulation of plasma liner driven magnetoinertial fusion

    SciTech Connect

    Samulyak, Roman; Parks, Paul; Wu Lingling

    2010-09-15

Spherically symmetric simulations of the implosion of plasma liners and compression of plasma targets in the plasma-jet-driven magnetoinertial fusion concept have been performed using the method of front tracking. The cases of single deuterium and xenon liners and double-layer deuterium-xenon liners compressing various deuterium-tritium targets have been investigated, optimized for maximum fusion energy gains, and compared with the theoretical predictions and scaling laws of [P. Parks, Phys. Plasmas 15, 062506 (2008)]. In agreement with the theory, the fusion gain was significantly below unity for deuterium-tritium targets compressed by Mach 60 deuterium liners. The optimal setup for a given chamber size contained a target with an initial radius of 20 cm compressed by a 10 cm thick, Mach 60 xenon liner, achieving a fusion energy gain of 10 with a 10 GJ fusion yield. Simulations also showed that composite deuterium-xenon liners reduce the energy gain due to lower target compression rates. The effect of heating of targets by alpha particles on the fusion energy gain has also been investigated.

  3. Secondary fusion coupled deuteron/triton transport simulation and thermal-to-fusion neutron convertor measurement

    SciTech Connect

    Wang, G. B.; Wang, K.; Liu, H. G.; Li, R. D.

    2013-07-01

A Monte Carlo tool, RSMC (Reaction Sequence Monte Carlo), was developed to simulate the coupled deuteron/triton transport and reaction problem. The 'forced particle production' variance reduction technique was used to improve the simulation speed, which made the secondary product play a major role. A mono-energy 14 MeV fusion neutron source was employed as a validation. The thermal-to-fusion neutron convertor was then studied with our tool. Moreover, an in-core conversion efficiency measurement experiment was performed with ⁶LiD and ⁶LiH converters. Threshold activation foils were used to indicate the fast and fusion neutron flux. In addition, two other pivotal parameters were calculated theoretically. Finally, the conversion efficiency of ⁶LiD is obtained as 1.97×10⁻⁴, which matches well with the theoretical result. (authors)
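The 'forced particle production' idea can be illustrated with a toy tally: rather than sampling a rare secondary with small probability, every history emits it while carrying that probability as a statistical weight, preserving the expected tally but removing its sampling variance. The probabilities below are toy values, not RSMC physics:

```python
import random

def analog_tally(n, p_secondary, rng):
    """Analog sampling: secondary produced with probability p, weight 1."""
    return sum(1.0 for _ in range(n) if rng.random() < p_secondary)

def forced_tally(n, p_secondary):
    """Forced production: every history emits the secondary, weighted by p.
    Same expected tally as analog sampling, zero variance in this toy model."""
    return sum(p_secondary for _ in range(n))

rng = random.Random(42)
n, p = 100_000, 1e-3
analog = analog_tally(n, p, rng)
forced = forced_tally(n, p)
print(analog, forced)
```

With a rare event (p = 1e-3), the analog tally scatters around its mean while the forced tally hits it exactly, which is why forcing the secondaries lets them "play a major role" without a prohibitive number of histories.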

  4. Projective simulation for artificial intelligence.

    PubMed

    Briegel, Hans J; De las Cuevas, Gemma

    2012-01-01

    We propose a model of a learning agent whose interaction with the environment is governed by a simulation-based projection, which allows the agent to project itself into future situations before it takes real action. Projective simulation is based on a random walk through a network of clips, which are elementary patches of episodic memory. The network of clips changes dynamically, both due to new perceptual input and due to certain compositional principles of the simulation process. During simulation, the clips are screened for specific features which trigger factual action of the agent. The scheme is different from other, computational, notions of simulation, and it provides a new element in an embodied cognitive science approach to intelligent action and learning. Our model provides a natural route for generalization to quantum-mechanical operation and connects the fields of reinforcement learning and quantum computation.
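The clip-network random walk described above can be sketched in a few lines. The network layout, hop weights, and reward rule below are illustrative assumptions, not the authors' implementation:

```python
import random

def deliberate(network, percept, rng, max_hops=10):
    """Random walk through the clip network from a percept clip until an
    action clip is reached; returns the visited path of clips."""
    path = [percept]
    while not path[-1].startswith("action") and len(path) <= max_hops:
        neighbors = network[path[-1]]
        clips = list(neighbors)
        weights = [neighbors[c] for c in clips]
        path.append(rng.choices(clips, weights=weights)[0])
    return path

def reinforce(network, path, reward=1.0):
    """Strengthen every edge traversed on a rewarded walk (assumed rule)."""
    for a, b in zip(path, path[1:]):
        network[a][b] += reward

# toy clip network: a percept hops to memory clips, which hop to actions
net = {
    "percept:obstacle": {"clip:turn": 1.0, "clip:stop": 1.0},
    "clip:turn": {"action:left": 1.0, "action:right": 1.0},
    "clip:stop": {"action:brake": 1.0},
}
path = deliberate(net, "percept:obstacle", random.Random(1))
reinforce(net, path)
print(path)
```

Rewarded walks increase the weight of the edges they traversed, so the episodic memory reshapes itself dynamically and future walks favor previously successful routes.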

  5. Projective simulation for artificial intelligence

    NASA Astrophysics Data System (ADS)

    Briegel, Hans J.; de Las Cuevas, Gemma

    2012-05-01

    We propose a model of a learning agent whose interaction with the environment is governed by a simulation-based projection, which allows the agent to project itself into future situations before it takes real action. Projective simulation is based on a random walk through a network of clips, which are elementary patches of episodic memory. The network of clips changes dynamically, both due to new perceptual input and due to certain compositional principles of the simulation process. During simulation, the clips are screened for specific features which trigger factual action of the agent. The scheme is different from other, computational, notions of simulation, and it provides a new element in an embodied cognitive science approach to intelligent action and learning. Our model provides a natural route for generalization to quantum-mechanical operation and connects the fields of reinforcement learning and quantum computation.

  6. Human Sensing Fusion Project for Safety and Health Society

    NASA Astrophysics Data System (ADS)

    Maenaka, Kazusuke

This paper introduces the objectives and status of the “Human sensing fusion project” in the Exploratory Research for Advanced Technology (ERATO) scheme run by the Japan Science and Technology Agency (JST). This project was started in December 2007, and the laboratory, with 11 members, opened in April 2008. The aim of this project is to realize a human activity-monitoring device combining many kinds of sensors in an extremely small package, so that the device can be pasted or patched onto the human body, and to establish the algorithms for inferring the human condition, both physical and mental, from the obtained data. Such a system can be used to help prevent accidents and maintain health. The actual research has just begun, and preparations for the project are well under way.

  7. Atomic Data and Modelling for Fusion: the ADAS Project

    SciTech Connect

    Summers, H. P.; O'Mullane, M. G.

    2011-05-11

The paper is an update on the Atomic Data and Analysis Structure, ADAS, since ICAM-DATA06 and a forward look to its evolution in the next five years. ADAS is an international project supporting principally magnetic confinement fusion research. It has participant laboratories throughout the world, including ITER and all its partner countries. In parallel with ADAS, the ADAS-EU Project provides enhanced support for fusion research at Associated Laboratories and Universities in Europe and ITER. OPEN-ADAS, sponsored jointly by the ADAS Project and IAEA, is the mechanism for open access to principal ADAS atomic data classes and facilitating software for their use. EXTENDED-ADAS comprises a variety of special, integrated application software, beyond the purely atomic bounds of ADAS, tuned closely to specific diagnostic analyses and plasma models. The current scientific content and scope of these various ADAS and ADAS related activities are briefly reviewed. These span a number of themes including heavy element spectroscopy and models, charge exchange spectroscopy, beam emission spectroscopy and special features which provide a broad baseline of atomic modelling and support. Emphasis will be placed on 'lifting the fundamental data baseline'--a principal ADAS task for the next few years. This will include discussion of ADAS and ADAS-EU coordinated and shared activities and some of the methods being exploited.

  8. Atomic Data and Modelling for Fusion: the ADAS Project

    NASA Astrophysics Data System (ADS)

    Summers, H. P.; O'Mullane, M. G.

    2011-05-01

    The paper is an update on the Atomic Data and Analysis Structure, ADAS, since ICAM-DATA06 and a forward look to its evolution in the next five years. ADAS is an international project supporting principally magnetic confinement fusion research. It has participant laboratories throughout the world, including ITER and all its partner countries. In parallel with ADAS, the ADAS-EU Project provides enhanced support for fusion research at Associated Laboratories and Universities in Europe and ITER. OPEN-ADAS, sponsored jointly by the ADAS Project and IAEA, is the mechanism for open access to principal ADAS atomic data classes and facilitating software for their use. EXTENDED-ADAS comprises a variety of special, integrated application software, beyond the purely atomic bounds of ADAS, tuned closely to specific diagnostic analyses and plasma models. The current scientific content and scope of these various ADAS and ADAS related activities are briefly reviewed. These span a number of themes including heavy element spectroscopy and models, charge exchange spectroscopy, beam emission spectroscopy and special features which provide a broad baseline of atomic modelling and support. Emphasis will be placed on `lifting the fundamental data baseline'—a principal ADAS task for the next few years. This will include discussion of ADAS and ADAS-EU coordinated and shared activities and some of the methods being exploited.

  9. Simulated disparity and peripheral blur interact during binocular fusion

    PubMed Central

    Maiello, Guido; Chessa, Manuela; Solari, Fabio; Bex, Peter J

    2014-01-01

    We have developed a low-cost, practical gaze-contingent display in which natural images are presented to the observer with dioptric blur and stereoscopic disparity that are dependent on the three-dimensional structure of natural scenes. Our system simulates a distribution of retinal blur and depth similar to that experienced in real-world viewing conditions by emmetropic observers. We implemented the system using light-field photographs taken with a plenoptic camera which supports digital refocusing anywhere in the images. We coupled this capability with an eye-tracking system and stereoscopic rendering. With this display, we examine how the time course of binocular fusion depends on depth cues from blur and stereoscopic disparity in naturalistic images. Our results show that disparity and peripheral blur interact to modify eye-movement behavior and facilitate binocular fusion, and the greatest benefit was gained by observers who struggled most to achieve fusion. Even though plenoptic images do not replicate an individual's aberrations, the results demonstrate that a naturalistic distribution of depth-dependent blur may improve 3-D virtual reality, and that interruptions of this pattern (e.g., with intraocular lenses) which flatten the distribution of retinal blur may adversely affect binocular fusion. PMID:25034260

  10. KULL: LLNL's ASCI Inertial Confinement Fusion Simulation Code

    SciTech Connect

    Rathkopf, J. A.; Miller, D. S.; Owen, J. M.; Zike, M. R.; Eltgroth, P. G.; Madsen, N. K.; McCandless, K. P.; Nowak, P. F.; Nemanic, M. K.; Gentile, N. A.; Stuart, L. M.; Keen, N. D.; Palmer, T. S.

    2000-01-10

    KULL is a three dimensional, time dependent radiation hydrodynamics simulation code under development at Lawrence Livermore National Laboratory. A part of the U.S. Department of Energy's Accelerated Strategic Computing Initiative (ASCI), KULL's purpose is to simulate the physical processes in Inertial Confinement Fusion (ICF) targets. The National Ignition Facility, where ICF experiments will be conducted, and ASCI are part of the experimental and computational components of DOE's Stockpile Stewardship Program. This paper provides an overview of ASCI and describes KULL, its hydrodynamic simulation capability and its three methods of simulating radiative transfer. Particular emphasis is given to the parallelization techniques essential to obtain the performance required of the Stockpile Stewardship Program and to exploit the massively parallel processor machines that ASCI is procuring.

  11. Simulation of polyethylene glycol and calcium-mediated membrane fusion

    SciTech Connect

    Pannuzzo, Martina; De Jong, Djurre H.; Marrink, Siewert J.; Raudino, Antonio

    2014-03-28

    We report on the mechanism of membrane fusion mediated by polyethylene glycol (PEG) and Ca{sup 2+} by means of a coarse-grained molecular dynamics simulation approach. Our data provide a detailed view on the role of cations and polymer in modulating the interaction between negatively charged apposed membranes. The PEG chains cause a reduction of the inter-lamellar distance and an increase in the concentration of divalent cations. When thermally driven fluctuations bring the membranes into close contact, a switch from cis to trans Ca{sup 2+}-lipid complexes stabilizes a focal contact acting as a nucleation site for further expansion of the adhesion region. Flipping of lipid tails induces subsequent stalk formation. Together, our results provide a molecular explanation for the synergistic effect of Ca{sup 2+} and PEG on membrane fusion.

  12. Simulating Intense Ion Beams for Inertial Fusion Energy

    SciTech Connect

    Friedman, A.

    2001-02-20

    The Heavy Ion Fusion (HIF) program's goal is the development of the body of knowledge needed for Inertial Fusion Energy (IFE) to realize its promise. The intense ion beams that will drive HIF targets are nonneutral plasmas and exhibit collective, nonlinear dynamics which must be understood using the kinetic models of plasma physics. This beam physics is both rich and subtle: a wide range in spatial and temporal scales is involved, and effects associated with both instabilities and non-ideal processes must be understood. Ion beams have a ''long memory,'' and initialization of a beam at mid-system with an idealized particle distribution introduces uncertainties; thus, it will be crucial to develop, and to extensively use, an integrated and detailed ''source-to-target'' HIF beam simulation capability. We begin with an overview of major issues.

  13. Simulating Intense Ion Beams for Inertial Fusion Energy

    SciTech Connect

    Friedman, A

    2001-02-20

    The Heavy Ion Fusion (HIF) program's goal is the development of the body of knowledge needed for Inertial Fusion Energy (IFE) to realize its promise. The intense ion beams that will drive HIF targets are nonneutral plasmas and exhibit collective, nonlinear dynamics which must be understood using the kinetic models of plasma physics. This beam physics is both rich and subtle: a wide range in spatial and temporal scales is involved, and effects associated with both instabilities and non-ideal processes must be understood. Ion beams have a ''long memory'', and initialization of a beam at mid-system with an idealized particle distribution introduces uncertainties; thus, it will be crucial to develop, and to extensively use, an integrated and detailed ''source-to-target'' HIF beam simulation capability. We begin with an overview of major issues.

  14. Terascale simulations for heavy ion inertial fusion energy

    SciTech Connect

    Friedman, A; Cohen, R H; Grote, D P; Sharp, W M; Celata, C M; Lee, E P; Vay, J-L; Davidson, R C; Kaganovich, I; Lee, W W; Qin, H; Welch, D R; Haber, I; Kishek, R A

    2000-06-08

    The intense ion beams in a heavy ion Inertial Fusion Energy (IFE) driver and fusion chamber are non-neutral plasmas whose dynamics are largely dominated by space charge. We propose to develop a ''source-to-target'' Heavy Ion Fusion (HIF) beam simulation capability: a description of the kinetic behavior of this complex, nonlinear system which is both integrated and detailed. We will apply this new capability to further our understanding of key scientific issues in the physics of ion beams for IFE. The simulations will entail self-consistent field descriptions that require interprocessor communication, but are scalable and will run efficiently on terascale architectures. This new capability will be based on the integration of three types of simulations, each requiring terascale computing: (1) simulations of acceleration and confinement of the space-charge-dominated ion beams through the driver (accelerator, pulse compression line, and final focusing system) which accurately describe their dynamics, including emittance growth (phase-space dilution) effects; these are particle-in-cell (PIC) models; (2) electromagnetic (EM) and magnetoinductive (Darwin) simulations which describe the beam and the fusion chamber environment, including multibeam, neutralization, stripping, beam and plasma ionization processes, and return current effects; and (3) highly detailed simulations (δf, multispecies PIC, continuum Vlasov), which can examine electron effects and collective modes in the driver and chamber, and can study halo generation with excellent statistics, to ensure that these effects do not disrupt the focusability of the beams. The code development will involve: (i) adaptation of existing codes to run efficiently on multi-SMP computers that use a hybrid of shared and distributed memory; (ii) development of new and improved numerical algorithms, e.g., averaging techniques that will afford larger timesteps; and (iii) incorporation of improved physics models (e.g., for self

  15. The Progress of Research Project for Magnetized Target Fusion in China

    NASA Astrophysics Data System (ADS)

    Yang, Xian-Jun

    2015-11-01

    The fusion of magnetized plasma, called Magnetized Target Fusion (MTF), has recently become an active research area; it may significantly reduce the cost and size of a fusion system. Great progress has been achieved over the past decades around the world. Five years ago, China initiated an MTF project and has made progress as follows: 1. Verifying the feasibility of MTF ignition by means of first-principles and MHD simulation; 2. Generating a magnetic field over 1400 Tesla, which can suppress heat conduction by charged particles, deposit the energy of alpha particles to promote the ignition process, and produce the stable magnetized plasma required for an ignition target; 3. The imploding facility FP-1 can deliver several megajoules of energy to a solid liner of about ten grams with a rise time in the microsecond range, while a simulation tool has been developed for design and analysis of the process; 4. The FRC target can be generated by the ``YG 1 facility'', and simulation tools for it have been developed. Over the next five years, the above theoretical work and the MTF experiments may be integrated and stepped up to a National project, in which our team is expected to play a leading role and to achieve further progress in China. Supported by the National Natural Science Foundation of China under Grant No 11175028.

  16. Bohunice Simulator Data Collection Project

    SciTech Connect

    Cillik, Ivan; Prochaska, Jan

    2002-07-01

    The paper describes the way and results of human reliability data analysis collected as a part of the Bohunice Simulator Data Collection Project (BSDCP), which was performed by VUJE Trnava, Inc. with funding support from the U.S. DOE, National Nuclear Security Administration. The goal of the project was to create a methodology for simulator data collection and analysis to support activities in probabilistic safety assessment (PSA) and human reliability assessment for Jaslovske Bohunice nuclear power plant consisting of two sets of twin units: two VVER 440/V-230 (V1) and two VVER 440/V-213 (V2) reactors. During the project training of V-2 control room crews was performed at VUJE-Trnava simulator. The simulator training and the data collection were done in parallel. The main goal of BSDCP was to collect suitable data of human errors under simulated conditions requiring the use of symptom-based emergency operating procedures (SBEOPs). The subjects of the data collection were scenario progress time data, operator errors, and real-time technological parameters. The paper contains three main parts. The first part presents preparatory work and semi-automatic computer-based methods used to collect data and to check technological parameters in order to find hidden errors of operators, to be able to retrace the course of each scenario for purposes of further analysis, and to document the whole training process. The first part also gives an overview of collected data scope, human error taxonomy, and state classifications for SBEOP instructions coding. The second part describes analytical work undertaken to describe time distribution necessary for execution of various kinds of instructions performed by operators according to the classification for coding of SBEOP instructions. It also presents the methods used for determination of probability distribution for different operator errors. Results from the data evaluation are presented in the last part of the paper.

  17. Computer modeling and simulation in inertial confinement fusion

    SciTech Connect

    McCrory, R.L.; Verdon, C.P.

    1989-03-01

    The complex hydrodynamic and transport processes associated with the implosion of an inertial confinement fusion (ICF) pellet place considerable demands on numerical simulation programs. Processes associated with implosion can usually be described using relatively simple models, but their complex interplay requires that programs model most of the relevant physical phenomena accurately. Most hydrodynamic codes used in ICF incorporate a one-fluid, two-temperature model. Electrons and ions are assumed to flow as one fluid (no charge separation). Due to the relatively weak coupling between the ions and electrons, each species is treated separately in terms of its temperature. In this paper we describe some of the major components associated with an ICF hydrodynamics simulation code. To serve as an example we draw heavily on a two-dimensional Lagrangian hydrodynamic code (ORCHID) written at the University of Rochester's Laboratory for Laser Energetics. 46 refs., 19 figs., 1 tab.
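    The one-fluid, two-temperature model described above can be written schematically as separate electron and ion internal-energy equations coupled by a relaxation term. The following is a generic textbook form under those assumptions (single velocity field, no charge separation), not necessarily ORCHID's exact formulation; the symbols (specific internal energies ε, conductivities κ, coupling rate ν_ei, sources S) are illustrative:

```latex
\begin{align}
\frac{\partial (\rho \varepsilon_e)}{\partial t}
  + \nabla\cdot(\rho \varepsilon_e \mathbf{u}) + p_e \nabla\cdot\mathbf{u}
  &= \nabla\cdot(\kappa_e \nabla T_e) + \rho\, c_v^{e}\, \nu_{ei} (T_i - T_e) + S_e \\
\frac{\partial (\rho \varepsilon_i)}{\partial t}
  + \nabla\cdot(\rho \varepsilon_i \mathbf{u}) + p_i \nabla\cdot\mathbf{u}
  &= \nabla\cdot(\kappa_i \nabla T_i) - \rho\, c_v^{e}\, \nu_{ei} (T_i - T_e) + S_i
\end{align}
```

    Both species share the single fluid velocity u; the equal-and-opposite coupling terms exchange energy between electrons and ions without creating or destroying it, which is the "relatively weak coupling" the abstract refers to.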

  18. Computer modeling and simulation in inertial confinement fusion

    NASA Astrophysics Data System (ADS)

    McCrory, R. L.; Verdon, C. P.

    1989-03-01

    The complex hydrodynamic and transport processes associated with the implosion of an inertial confinement fusion (ICF) pellet place considerable demands on numerical simulation programs. Processes associated with implosion can usually be described using relatively simple models, but their complex interplay requires that programs model most of the relevant physical phenomena accurately. Most hydrodynamic codes used in ICF incorporate a one-fluid, two-temperature model. Electrons and ions are assumed to flow as one fluid (no charge separation). Due to the relatively weak coupling between the ions and electrons, each species is treated separately in terms of its temperature. In this paper we describe some of the major components associated with an ICF hydrodynamics simulation code. To serve as an example we draw heavily on a two-dimensional Lagrangian hydrodynamic code (ORCHID) written at the University of Rochester's Laboratory for Laser Energetics.

  19. Report of the Fusion Energy Sciences Advisory Committee. Panel on Integrated Simulation and Optimization of Magnetic Fusion Systems

    SciTech Connect

    Dahlburg, Jill; Corones, James; Batchelor, Donald; Bramley, Randall; Greenwald, Martin; Jardin, Stephen; Krasheninnikov, Sergei; Laub, Alan; Leboeuf, Jean-Noel; Lindl, John; Lokke, William; Rosenbluth, Marshall; Ross, David; Schnack, Dalton

    2002-11-01

    Fusion is potentially an inexhaustible energy source whose exploitation requires a basic understanding of high-temperature plasmas. The development of a science-based predictive capability for fusion-relevant plasmas is a challenge central to fusion energy science, in which numerical modeling has played a vital role for more than four decades. A combination of the very wide range in temporal and spatial scales, extreme anisotropy, the importance of geometric detail, and the requirement of causality which makes it impossible to parallelize over time, makes this problem one of the most challenging in computational physics. Sophisticated computational models are under development for many individual features of magnetically confined plasmas and increases in the scope and reliability of feasible simulations have been enabled by increased scientific understanding and improvements in computer technology. However, full predictive modeling of fusion plasmas will require qualitative improvements and innovations to enable cross coupling of a wider variety of physical processes and to allow solution over a larger range of space and time scales. The exponential growth of computer speed, coupled with the high cost of large-scale experimental facilities, makes an integrated fusion simulation initiative a timely and cost-effective opportunity. Worldwide progress in laboratory fusion experiments provides the basis for a recent FESAC recommendation to proceed with a burning plasma experiment (see FESAC Review of Burning Plasma Physics Report, September 2001). Such an experiment, at the frontier of the physics of complex systems, would be a huge step in establishing the potential of magnetic fusion energy to contribute to the world’s energy security. An integrated simulation capability would dramatically enhance the utilization of such a facility and lead to optimization of toroidal fusion plasmas in general. This science-based predictive capability, which was cited in the FESAC

  20. The Mars Gravity Simulation Project

    NASA Technical Reports Server (NTRS)

    Korienek, Gene

    1998-01-01

    Human beings who make abrupt transitions between one gravitational environment and another undergo severe disruptions of their visual perception and visual-motor coordination, frequently accompanied by "space sickness." Clearly, such immediate effects of exposure to a novel gravitational condition have significant implications for human performance. For example, when astronauts first arrive in Earth orbit their attempts to move about in the spacecraft and to perform their duties are uncoordinated, inaccurate, and inefficient. Other inter-gravitational transitions for which these difficulties can be expected include going from the 0 g of the spacecraft to the .16 g of the Moon, from 0 g to the .38 g of Mars, and from 0 g back to the 1.0 g of Earth. However, after astronauts have actively interacted with their new gravitational environment for several days, these problems tend to disappear, evidence that some sort of adaptive process has taken place. It would be advantageous, therefore, if there were some way to minimize or perhaps even to eliminate this potentially hazardous adaptive transition period by allowing astronauts to adapt to the altered gravitational conditions before actually entering them. Simultaneous adaptation to both the altered and the normal gravitational environments can develop as a result of repeatedly adapting to one and readapting to the other, a phenomenon known as dual adaptation. The objective of the Mars Gravity Simulator (MGS) Project is to construct a simulation of the visual and bodily effects of altered gravity. This perceptual-motor simulation is created through the use of: 1) differential body pressure to produce simulated hypo-gravity and 2) treadmill-controlled virtual reality to create a corresponding visual effect. It is expected that this combination will produce sensory motor perturbations in the subjects. Both the immediate and adaptive behavioral (postural and ambulatory) responses to these sensory perturbations will be assessed.

  1. Inertial Fusion Energy Studies on an Earth Simulator-Class Computer

    SciTech Connect

    Friedman, A; Stephens, R

    2002-08-13

    The U.S. is developing fusion energy based on inertial confinement of the burning fusion fuel, as a complement to the magnetic confinement approach. DOE's Inertial Fusion Energy (IFE) program within the Office of Fusion Energy Sciences (OFES) is coordinated with, and gains leverage from, the much larger Inertial Confinement Fusion program of the National Nuclear Security Administration (NNSA). Advanced plasma and particle beam simulations play a major role in the IFE effort, and the program is well poised to benefit from an Earth Simulator-class resource. Progress in all key physics areas of IFE, including heavy-ion ''drivers'' which impart the energy to the fusion fuel, the targets for both ion- and laser-driven approaches, and an advanced concept known as fast ignition, would be dramatically accelerated by an Earth Simulator-class resource.

  2. Deuterium-Tritium Simulations of the Enhanced Reversed Shear Mode in the Tokamak Fusion Test Reactor

    SciTech Connect

    Mikkelsen, D.R.; Manickam, J.; Scott, S.D.; Zarnstorff

    1997-04-01

    The potential performance, in deuterium-tritium plasmas, of a new enhanced confinement regime with reversed magnetic shear (ERS mode) is assessed. The equilibrium conditions for an ERS mode plasma are estimated by solving the plasma transport equations using the thermal and particle diffusivities measured in a short duration ERS mode discharge in the Tokamak Fusion Test Reactor [F. M. Levinton, et al., Phys. Rev. Letters, 75, 4417, (1995)]. The plasma performance depends strongly on Zeff and neutral beam penetration to the core. The steady state projections typically have a central electron density of {approx}2.5x10{sup 20} m{sup -3} and nearly equal central electron and ion temperatures of {approx}10 keV. In time dependent simulations the peak fusion power, {approx} 25 MW, is twice the steady state level. Peak performance occurs during the density rise when the central ion temperature is close to the optimal value of {approx} 15 keV. The simulated pressure profiles can be stable to ideal MHD instabilities with toroidal mode number n = 1, 2, 3, 4 and {infinity} for {beta}{sub norm} up to 2.5; the simulations have {beta}{sub norm} {le} 2.1. The enhanced reversed shear mode may thus provide an opportunity to conduct alpha physics experiments in conditions similar to those proposed for advanced tokamak reactors.

  3. Lipid Tail Protrusion in Simulations Predicts Fusogenic Activity of Influenza Fusion Peptide Mutants and Conformational Models

    PubMed Central

    Larsson, Per; Kasson, Peter M.

    2013-01-01

    Fusion peptides from influenza hemagglutinin act on membranes to promote membrane fusion, but the mechanism by which they do so remains unknown. Recent theoretical work has suggested that contact of protruding lipid tails may be an important feature of the transition state for membrane fusion. If this is so, then influenza fusion peptides would be expected to promote tail protrusion in proportion to the ability of the corresponding full-length hemagglutinin to drive lipid mixing in fusion assays. We have performed molecular dynamics simulations of influenza fusion peptides in lipid bilayers, comparing the X-31 influenza strain against a series of N-terminal mutants. As hypothesized, the probability of lipid tail protrusion correlates well with the lipid mixing rate induced by each mutant. This supports the conclusion that tail protrusion is important to the transition state for fusion. Furthermore, it suggests that tail protrusion can be used to examine how fusion peptides might interact with membranes to promote fusion. Previous models for native influenza fusion peptide structure in membranes include a kinked helix, a straight helix, and a helical hairpin. Our simulations visit each of these conformations. Thus, the free energy differences between each are likely low enough that specifics of the membrane environment and peptide construct may be sufficient to modulate the equilibrium between them. However, the kinked helix promotes lipid tail protrusion in our simulations much more strongly than the other two structures. We therefore predict that the kinked helix is the most fusogenic of these three conformations. PMID:23505359

  4. Evaluation of performance of select fusion experiments and projected reactors

    NASA Technical Reports Server (NTRS)

    Miley, G. H.

    1978-01-01

    The performance of NASA Lewis fusion experiments (SUMMA and Bumpy Torus) is compared with other experiments and that necessary for a power reactor. Key parameters cited are gain (fusion power/input power) and the time average fusion power, both of which may be more significant for real fusion reactors than the commonly used Lawson parameter. The NASA devices are over 10 orders of magnitude below the required powerplant values in both gain and time average power. The best experiments elsewhere are also as much as 4 to 5 orders of magnitude low. However, the NASA experiments compare favorably with other alternate approaches that have received less funding than the mainline experiments. The steady-state character and efficiency of plasma heating are strong advantages of the NASA approach. The problem, though, is to move ahead to experiments of sufficient size to advance in gain and average power parameters.

  5. Projection-Based linear constrained estimation and fusion over long-haul links

    SciTech Connect

    Rao, Nageswara S

    2016-01-01

    We study estimation and fusion with linear dynamics in long-haul sensor networks, wherein a number of sensors are remotely deployed over a large geographical area for performing tasks such as target tracking, and a remote fusion center serves to combine the information provided by these sensors in order to improve the overall tracking accuracy. In reality, the motion of a dynamic target might be subject to certain constraints, for instance, those defined by a road network. We explore how the accuracy of projection-based constrained estimation and fusion methods is affected by information loss over the long-haul links. We use an example to compare the tracking errors under various implementations of centralized and distributed projection-based estimation and fusion methods and demonstrate the effectiveness of using projection-based methods in these settings.
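    The projection step described in the abstract above can be sketched for a linear equality constraint. The following is a minimal, illustrative version (not the paper's actual implementation): an unconstrained estimate x with covariance P is projected onto the constraint set {x : Dx = d} using the standard covariance-weighted (minimum-variance) projection.

```python
import numpy as np

def project_estimate(x, P, D, d):
    """Project estimate x (covariance P) onto the linear constraint D @ x = d
    using the covariance-weighted (minimum-variance) projection."""
    S = D @ P @ D.T                      # constraint-space covariance
    K = P @ D.T @ np.linalg.inv(S)       # projection gain
    x_c = x - K @ (D @ x - d)            # constrained estimate
    P_c = P - K @ D @ P                  # reduced covariance
    return x_c, P_c

# Example: a 2-D position estimate constrained to the "road" x1 = x2.
x = np.array([3.0, 1.0])
P = np.eye(2)
D = np.array([[1.0, -1.0]])
d = np.array([0.0])
x_c, P_c = project_estimate(x, P, D, d)
# x_c lands on the constraint line: [2.0, 2.0]
```

    In a long-haul setting, this projection can be applied either at each sensor before transmission or at the fusion center after combining tracks; the paper compares such centralized and distributed placements.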

  6. Fusion

    NASA Astrophysics Data System (ADS)

    Herman, Robin

    1990-10-01

    The book abounds with fascinating anecdotes about fusion's rocky path: the spurious claim by Argentine dictator Juan Peron in 1951 that his country had built a working fusion reactor, the rush by the United States to drop secrecy and publicize its fusion work as a propaganda offensive after the Russian success with Sputnik; the fortune Penthouse magazine publisher Bob Guccione sank into an unconventional fusion device, the skepticism that met an assertion by two University of Utah chemists in 1989 that they had created "cold fusion" in a bottle. Aimed at a general audience, the book describes the scientific basis of controlled fusion--the fusing of atomic nuclei, under conditions hotter than the sun, to release energy. Using personal recollections of scientists involved, it traces the history of this little-known international race that began during the Cold War in secret laboratories in the United States, Great Britain and the Soviet Union, and evolved into an astonishingly open collaboration between East and West.

  7. Mesh refinement for particle-in-cell plasma simulations: Applications to - and benefits for - heavy ion fusion

    SciTech Connect

    Vay, J.L.; Colella, P.; McCorquodale, P.; Van Straalen, B.; Friedman, A.; Grote, D.P.

    2002-05-24

    The numerical simulation of the driving beams in a heavy ion fusion power plant is a challenging task, and simulation of the power plant as a whole, or even of the driver, is not yet possible. Despite the rapid progress in computer power, past and anticipated, one must consider the use of the most advanced numerical techniques if the goal is to be reached expeditiously. One of the difficulties of these simulations resides in the disparity of scales, in time and in space, which must be resolved. When these disparities are in distinctive zones of the simulation region, a method which has proven to be effective in other areas (e.g., fluid dynamics simulations) is the mesh refinement technique. They discuss the challenges posed by the implementation of this technique into plasma simulations (due to the presence of particles and electromagnetic waves). They present the prospects for and projected benefits of its application to heavy ion fusion, in particular to the simulation of the ion source and the final beam propagation in the chamber. A collaboration project is under way at LBNL between the Applied Numerical Algorithms Group (ANAG) and the HIF group to couple the Adaptive Mesh Refinement (AMR) library CHOMBO developed by the ANAG group to the Particle-In-Cell accelerator code (WARP) developed by the HIF-VNL. They describe their progress and present their initial findings.
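    The abstract above describes coupling the AMR library CHOMBO to the PIC code WARP. As a toy illustration of the particle-grid part of that idea (this is not WARP's or CHOMBO's API; all names and grids here are invented), the sketch below performs 1D cloud-in-cell charge deposition on a coarse grid plus one 2x-refined patch; the field solve and particle push are omitted.

```python
import numpy as np

def deposit(positions, q, x0, dx, n_cells):
    """Cloud-in-cell (linear-weighting) charge deposition onto a
    uniform 1D grid with nodes x0 + k*dx, k = 0..n_cells."""
    rho = np.zeros(n_cells + 1)
    s = (positions - x0) / dx            # position in cell units
    i = np.floor(s).astype(int)          # left node index
    w = s - i                            # fractional offset from left node
    np.add.at(rho, i, q * (1.0 - w))     # share of charge to left node
    np.add.at(rho, i + 1, q * w)         # share of charge to right node
    return rho

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 1000)          # particle positions in [0, 1)
q = 1.0 / x.size                         # equal charge per particle

# Coarse grid over the whole domain (16 cells)...
rho_coarse = deposit(x, q, 0.0, 1.0 / 16, 16)

# ...plus a 2x-refined patch over [0.25, 0.5), where (say) the beam
# core needs finer resolution; only particles inside the patch deposit.
in_patch = (x >= 0.25) & (x < 0.5)
rho_fine = deposit(x[in_patch], q, 0.25, 1.0 / 32, 8)
```

    Charge is conserved level by level, which is one of the consistency requirements when AMR is combined with PIC; the harder issues the paper alludes to (spurious self-forces near refinement boundaries, wave reflection at the coarse-fine interface) arise in the omitted field solve.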

  8. Networking Industry and Academia: Evidence from FUSION Projects in Ireland

    ERIC Educational Resources Information Center

    Stephens, Simon; Onofrei, George

    2009-01-01

    Graduate development programmes such as FUSION continue to be seen by policy makers, higher education institutions and small and medium-sized enterprises (SMEs) as primary means of strengthening higher education-business links and in turn improving the match between graduate output and the needs of industry. This paper provides evidence from case…

  9. Projection-Based Linear Constrained Estimation and Fusion over Long-Haul Links

    SciTech Connect

    Rao, Nageswara S

    2016-01-01

    In this work, we study estimation and fusion with linear dynamics in long-haul sensor networks, wherein a number of sensors are remotely deployed over a large geographical area for performing tasks such as target tracking, and a remote fusion center serves to combine the information provided by these sensors in order to improve the overall tracking accuracy. In reality, the motion of a dynamic target might be subject to certain constraints, for instance, those defined by a road network. We explore how the accuracy of projection-based constrained estimation and fusion methods is affected by information loss over the long-haul links. We use a tracking example to compare the tracking errors under various implementations of centralized and distributed projection-based estimation and fusion methods.

  10. Internet and web projects for fusion plasma science and education. Final technical report

    SciTech Connect

    Eastman, Timothy E.

    1999-08-30

    The plasma web site at http://www.plasmas.org provides comprehensive coverage of all plasma science and technology with site links worldwide. Prepared to serve the general public, students, educators, researchers, and decision-makers, the site covers basic plasma physics, fusion energy, magnetic confinement fusion, high energy density physics including ICF, space physics and astrophysics, pulsed-power, lighting, waste treatment, plasma technology, plasma theory, and simulations and modeling.

  11. Phase space structures in gyrokinetic simulations of fusion plasma turbulence

    NASA Astrophysics Data System (ADS)

    Ghendrih, Philippe; Norscini, Claudia; Cartier-Michaud, Thomas; Dif-Pradalier, Guilhem; Abiteboul, Jérémie; Dong, Yue; Garbet, Xavier; Gürcan, Ozgür; Hennequin, Pascale; Grandgirard, Virginie; Latu, Guillaume; Morel, Pierre; Sarazin, Yanick; Storelli, Alexandre; Vermare, Laure

    2014-10-01

    Gyrokinetic simulations of fusion plasmas give extensive information in 5D on turbulence and transport. This paper highlights a few of these challenging physics issues in global, flux-driven simulations using experimental inputs from Tore Supra shot TS45511. The electrostatic gyrokinetic code GYSELA is used for these simulations. The 3D structure of avalanches indicates that these structures propagate radially at localised toroidal angles and then expand along the field line at sound speed to form the filaments. Analysing the poloidal mode structure of the potential fluctuations (at a given toroidal location), one finds that the low modes m = 0 and m = 1 exhibit a global structure; the magnitude of the m = 0 mode is much larger than that of the m = 1 mode. The shear layers of the corrugation structures are thus found to be dominated by the m = 0 contribution, comparable to that of the zonal flows. This global mode seems to localise the m = 2 mode but has little effect on the localisation of the higher mode numbers. However, when analysing the pulsation of the latter modes one finds that all modes exhibit a similar phase velocity, comparable to the local zonal flow velocity. The consequent dispersion-like relation between the mode pulsation and the mode numbers provides a means to measure the zonal flow. Temperature fluctuations and the turbulent heat flux are localised between the corrugation structures. Temperature fluctuations are found to exhibit two scales: small fluctuations that are localised by the corrugation shear layers and appear to bounce back and forth radially, and large fluctuations, also readily observed on the flux, which are associated with the disruption of the corrugations. The radial ballistic velocity of both avalanche events is of the order of 0.5ρ∗c0, where ρ∗ = ρ0/a, a being the tokamak minor radius and ρ0 being the characteristic Larmor radius, ρ0 = c0/Ω0. c0 is the reference ion thermal velocity and Ω0 = qiB0/mi the reference

  12. Psychology on Computers: Simulations, Experiments and Projects.

    ERIC Educational Resources Information Center

    Belcher, Duane M.; Smith, Stephen D.

    PSYCOM is a unique mixed media package which combines high interest projects on the computer with a written text of expository material. It goes beyond most computer-assisted instruction which emphasizes drill and practice and testing of knowledge. A project might consist of a simulation or an actual experiment, or it might be a demonstration, a…

  13. Image Fusion Software in the Clearpem-Sonic Project

    NASA Astrophysics Data System (ADS)

    Pizzichemi, M.; di Vara, N.; Cucciati, G.; Ghezzi, A.; Paganoni, M.; Farina, F.; Frisch, B.; Bugalho, R.

    2012-08-01

    ClearPEM-Sonic is a mammography scanner that combines Positron Emission Tomography with 3D ultrasound echographic and elastographic imaging. It has been developed to improve early stage detection of breast cancer by combining metabolic and anatomical information. The PET system has been developed by the Crystal Clear Collaboration, while the 3D ultrasound probe has been provided by SuperSonic Imagine. In this framework, the visualization and fusion software is an essential tool for the radiologists in the diagnostic process. This contribution discusses the design choices, the issues faced during the implementation, and the commissioning of the software tools developed for ClearPEM-Sonic.

  14. Humanoid Flight Metabolic Simulator Project

    NASA Technical Reports Server (NTRS)

    Ross, Stuart

    2015-01-01

    NASA's Evolvable Mars Campaign (EMC) has identified several areas of technology that will require significant improvements in performance, capacity, and efficiency in order to make a manned mission to Mars possible. These include the crew vehicle Environmental Control and Life Support System (ECLSS), EVA suit Portable Life Support System (PLSS) and Information Systems, autonomous environmental monitoring, radiation exposure monitoring and protection, and vehicle thermal control systems (TCS). MADMACS in a Suit can be configured to simulate human metabolism, consuming crew resources (oxygen) in the process. In addition to providing support for testing Life Support on unmanned flights, MADMACS will also support testing of suit thermal controls, and will monitor radiation exposure, body zone temperatures, moisture, and loads.

  15. Programmable AC power supply for simulating power transient expected in fusion reactor

    SciTech Connect

    Halimi, B.; Suh, K. Y.

    2012-07-01

    This paper focuses on the control engineering of a programmable AC power source capable of simulating the power transients expected in a fusion reactor. To generate the programmable power source, an AC-AC power electronics converter is adopted to control the power of a set of heaters representing the transient behavior of heat exchangers or heat sources in a fusion reactor. The International Thermonuclear Experimental Reactor (ITER) plasma operation scenario is used as the basic reference for producing this transient power source. (authors)

  16. Gamma Efficiency Simulations towards Coincidence Measurements for Fusion Cross Sections

    NASA Astrophysics Data System (ADS)

    Heine, M.; Courtin, S.; Fruet, G.; Jenkins, D. G.; Montanari, D.; Morris, L.; Regan, P. H.; Rudigier, M.; Symochko, D.

    2016-10-01

    With the experimental station STELLA (STELlar LAboratory) we will measure fusion cross sections of astrophysical relevance, making use of the coincident detection of charged particles and gamma rays for background reduction. For the measurement of gamma rays from the de-excitation of fusion products, a compact array of 36 UK FATIMA LaBr3 detectors is designed based on efficiency studies with Geant4. The photo-peak efficiency in the region of interest is comparable to that of other gamma detection systems used in this field. The features of the internal decay of 138La are used in a background study to obtain an online calibration of the gamma detectors. Background data are fit to the Monte Carlo model of the self-activity, assuming a crude exponential behavior of the external background. Accuracy in the region of interest is of the order of a few keV in this first study.

  17. Simulation of RF-fields in a fusion device

    SciTech Connect

    De Witte, Dieter; Bogaert, Ignace; De Zutter, Daniel; Van Oost, Guido; Van Eester, Dirk

    2009-11-26

    In this paper the problem of scattering off a fusion plasma is approached from the point of view of integral equations. Using the volume equivalence principle an integral equation is derived which describes the electromagnetic fields in a plasma. The equation is discretized with MoM using conforming basis functions. This reduces the problem to solving a dense matrix equation. This can be done iteratively. Each iteration can be sped up using FFTs.
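
    The speed-up mentioned above rests on the fact that a translation-invariant kernel makes the dense MoM matrix-vector product a convolution. A minimal 1D sketch (toy kernel and dimensions, not the actual plasma formulation) of wrapping an FFT-based matvec in an iterative Krylov solver:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

# Hypothetical 1D analogue: the discretized integral operator has a
# translation-invariant kernel, so its action on a vector is a (circular)
# convolution that an FFT evaluates in O(n log n) rather than O(n^2).
n = 256
kernel = 1.0 / (1.0 + np.arange(n) ** 2)   # toy Green's-function samples
k_hat = np.fft.fft(kernel)                 # precomputed kernel spectrum

def matvec(x):
    """Apply (I + K) x without ever forming the dense matrix K."""
    return x + np.real(np.fft.ifft(k_hat * np.fft.fft(x)))

A = LinearOperator((n, n), matvec=matvec)
b = np.ones(n)                             # toy right-hand side
x, info = gmres(A, b)                      # each iteration is one FFT matvec
assert info == 0                           # 0 means converged
```

    Each GMRES iteration then costs one FFT-based matvec instead of a dense O(n²) product, which is the essence of the acceleration described in the abstract.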

  18. Graduate Training: Evidence from FUSION Projects in Ireland

    ERIC Educational Resources Information Center

    Hegarty, Cecilia; Johnston, Janet

    2008-01-01

    Purpose: This paper aims to explore graduate training through SME-based project work. The views and behaviours of graduates are examined along with the perceptions of the SMEs and academic partner institutions charged with training graduates. Design/methodology/approach: The data are largely qualitative and derived from the experiences of…

  19. One-dimensional particle simulations of Knudsen-layer effects on D-T fusion

    SciTech Connect

    Cohen, Bruce I.; Dimits, Andris M.; Zimmerman, George B.; Wilks, Scott C.

    2014-12-15

    Particle simulations are used to solve the fully nonlinear, collisional kinetic equation describing the interaction of a high-temperature, high-density, deuterium-tritium plasma with absorbing boundaries, a plasma source, and the influence of kinetic effects on fusion reaction rates. Both hydrodynamic and kinetic effects influence the end losses, and the simulations show departures of the ion velocity distributions from Maxwellian due to the reduction of the population of the highest energy ions (Knudsen-layer effects). The particle simulations show that the interplay between sources, plasma dynamics, and end losses results in temperature anisotropy, plasma cooling, and concomitant reductions in the fusion reaction rates. However, for the model problems and parameters considered, particle simulations show that Knudsen-layer modifications do not significantly affect the velocity distribution function for velocities most important in determining the fusion reaction rates, i.e., the thermal fusion reaction rates using the local densities and bulk temperatures give good estimates of the kinetic fusion reaction rates.
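
    The Knudsen-layer effect described above (depletion of the fastest ions) can be illustrated with a toy Monte Carlo estimate: fusion reactivity is weighted toward the suprathermal tail, so truncating the tail lowers the rate. All numbers below are illustrative, not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample ion energies (in units of kT) from a 3D Maxwellian: E ~ Gamma(3/2)
E = rng.gamma(shape=1.5, scale=1.0, size=1_000_000)

# Toy Gamow-peaked reactivity weight w ~ exp(-b / sqrt(E)); b is a made-up
# barrier parameter chosen so the reaction is strongly tail-weighted.
b = 8.0
w = np.exp(-b / np.sqrt(E))

full_rate = w.mean()

# Knudsen-layer-like truncation: drop the contribution of ions above a few
# kT, mimicking the loss of the highest-energy ions to the boundaries.
truncated_rate = np.where(E < 4.0, w, 0.0).mean()

print(truncated_rate / full_rate)   # < 1: tail depletion reduces the rate
```

    The paper's finding is that for its parameters this ratio stays close to unity at the velocities that matter; the sketch merely shows the mechanism by which it could depart from unity.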

  20. Synaptic fusion pore structure and AMPA receptor activation according to Brownian simulation of glutamate diffusion.

    PubMed

    Ventriglia, Francesco; Maio, Vito Di

    2003-03-01

    The rising phase of fast, AMPA-mediated Excitatory Post Synaptic Currents (EPSCs) has a primary role in the computational ability of neurons. The structure and radial expansion velocity of the fusion pore between the vesicle and the presynaptic membrane could be important factors in determining the time course of the EPSC. We have used a Brownian simulation model for glutamate neurotransmitter diffusion to test two hypotheses on the fusion pore structure, namely, the proteinaceous pore and the purely lipidic pore. Three more hypotheses on the radial expansion velocity were also tested. The rising phases of the EPSC, computed under various conditions, were compared with experimental data from the literature. Our present results show that a proteinaceous fusion pore should produce a more marked foot at the beginning of the rising phase of the EPSC. They also confirm the hypothesis that the structure of the fusion pore and its radial expansion velocity play significant roles in shaping the fast EPSC time course.
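
    A minimal sketch of the Brownian-dynamics idea (toy geometry and step sizes, not the authors' model): transmitter particles random-walk out of the vesicle through a pore whose radius grows linearly in time, so a faster radial expansion velocity releases glutamate earlier and steepens the simulated EPSC rise:

```python
import numpy as np

rng = np.random.default_rng(1)

def release_times(pore_speed, n=2000, dt=1e-3, steps=4000, sigma=0.05):
    # Molecules start just below the membrane plane z = 0; one escapes when
    # a Brownian step carries it across the plane within the current pore
    # radius r(t) = pore_speed * t, otherwise it reflects off the membrane.
    pos = rng.normal(scale=0.1, size=(n, 3))
    pos[:, 2] = -np.abs(pos[:, 2]) - 0.01          # all inside the vesicle
    released = np.full(n, np.inf)
    for k in range(1, steps + 1):
        t = k * dt
        active = np.isinf(released)
        pos[active] += rng.normal(scale=sigma, size=(active.sum(), 3))
        crossed = active.copy()
        crossed[active] = pos[active, 2] > 0.0
        inside = np.hypot(pos[:, 0], pos[:, 1]) < pore_speed * t
        released[crossed & inside] = t              # escaped through the pore
        pos[crossed & ~inside, 2] *= -1.0           # reflected by the membrane
    return released

fast = release_times(pore_speed=2.0)
slow = release_times(pore_speed=0.2)
# A faster-expanding pore releases transmitter sooner -> steeper EPSC rise
print(np.median(fast[np.isfinite(fast)]), np.median(slow[np.isfinite(slow)]))
```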

  1. Overview of Theory and Simulations in the Heavy Ion Fusion Science Virtual National Laboratory

    SciTech Connect

    Friedman, A

    2006-07-03

    The Heavy Ion Fusion Science Virtual National Laboratory (HIFS-VNL) is a collaboration of Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, and Princeton Plasma Physics Laboratory. These laboratories, in cooperation with researchers at other institutions, are carrying out a coordinated effort to apply intense ion beams as drivers for studies of the physics of matter at extreme conditions, and ultimately for inertial fusion energy. Progress on this endeavor depends upon coordinated application of experiments, theory, and simulations. This paper describes the state of the art, with an emphasis on the coordination of modeling and experiment; developments in the simulation tools, and in the methods that underlie them, are also treated.

  2. Overview of Theory and Simulations in the Heavy Ion Fusion ScienceVirtual National Laboratory

    SciTech Connect

    Friedman, Alex

    2006-07-09

    The Heavy Ion Fusion Science Virtual National Laboratory (HIFS-VNL) is a collaboration of Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, and Princeton Plasma Physics Laboratory. These laboratories, in cooperation with researchers at other institutions, are carrying out a coordinated effort to apply intense ion beams as drivers for studies of the physics of matter at extreme conditions, and ultimately for inertial fusion energy. Progress on this endeavor depends upon coordinated application of experiments, theory, and simulations. This paper describes the state of the art, with an emphasis on the coordination of modeling and experiment; developments in the simulation tools, and in the methods that underlie them, are also treated.

  3. Simulations of the performance of the Fusion-FEM, for an increased e-beam emittance

    SciTech Connect

    Tulupov, A.V.; Urbanus, W.H.; Caplan, M.

    1995-12-31

    The original design of the Fusion-FEM, which is under construction at the FOM-Institute for Plasma Physics, was based on an electron beam emittance of 50 π mm mrad. Recent measurements of the emittance of the beam emitted by the electron gun showed that the actual emittance is 80 π mm mrad. This results in a 2.5 times lower beam current density inside the undulator, which in turn changes the linear gain, the start-up time, the saturation level and the frequency spectrum. The main goal of the FEM project is to demonstrate a stable microwave output power of at least 1 MW. The decrease of the electron beam current density has to be compensated by variations of the other FEM parameters, such as the reflection (feedback) coefficient of the microwave cavity and the length of the drift gap between the two sections of the step-tapered undulator. All basic dependencies of the linear and nonlinear gain, and of the output power, on the main FEM parameters have been simulated numerically with the CRMFEL code. Regimes of stable operation of the FEM with the increased emittance have been found; these regimes could be found because of the flexibility of the original FEM design.

  4. Phasor Simulator for Operator Training Project

    SciTech Connect

    Dyer, Jim

    2016-09-14

    Synchrophasor systems are being deployed in power systems throughout the North American Power Grid and there are plans to integrate this technology and its associated tools into Independent System Operator (ISO)/utility control room operations. A pre-requisite to using synchrophasor technologies in control rooms is for operators to obtain training and understand how to use this technology in real-time situations. The Phasor Simulator for Operator Training (PSOT) project objective was to develop, deploy and demonstrate a pre-commercial training simulator for operators on the use of this technology and to promote acceptance of the technology in utility and ISO/Regional Transmission Owner (RTO) control centers.

  5. Data assimilation and data fusion in a regional simulation

    NASA Astrophysics Data System (ADS)

    Hoareau, N.; Umbert, M.; Turiel, A.; Ballabrera, J.; Portabella, M.

    2012-04-01

    An Ensemble Kalman filter [Ballabrera-Poy et al., 2009] has been used to assimilate Sea Surface Temperature (SST) and Argo data into a regional configuration of the NEMO-OPA ocean model. Our validation of the data assimilation experiments includes the comparison against a random ensemble of Argo profilers previously set aside (cross-validation), where the usual metrics (root mean square, mean value, standard deviation) are estimated from the differences between our data assimilation fields and the Argo data. We have also developed another metric based on the multifractal structure of the flow, comparing the histograms of singularity exponents of the observations with those of the background and analysis fields. While the first approach directly measures the point-by-point difference between the model data and the independent in-situ observations, the second method focuses on the geophysical coherence of dynamical structures, as it gives information about multi-point spatial correlations. In the second part of this work we have analysed the relative advantages and drawbacks of data assimilation (here based on the Ensemble Kalman filter) versus data fusion (here based on the Multifractal Microcanonical Formalism, see Pottier et al., 2008) when applied to the production of quality remote sensing products of ocean observation. We have thus used both methods for the generation of SMOS Level 4 products of Sea Surface Salinity; the resulting maps have been validated with our metrics and analyzed on global and regional bases.
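
    For readers unfamiliar with the method, a single stochastic Ensemble Kalman filter analysis step can be sketched in a few lines (a toy three-variable state with one observed component; the operational NEMO-OPA setup is of course far larger):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy state: 3 grid values; we observe only the first one (H picks it out).
n_ens, n_state = 50, 3
ens = rng.normal(loc=[14.0, 15.0, 16.0], scale=1.0, size=(n_ens, n_state)).T
H = np.array([[1.0, 0.0, 0.0]])
obs, obs_err = 13.0, 0.5

# Kalman gain from the ensemble covariance: K = P H^T (H P H^T + R)^-1
P = np.cov(ens)
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + np.array([[obs_err ** 2]]))

# Perturbed-observation update, applied member by member
perturbed = obs + rng.normal(scale=obs_err, size=n_ens)
analysis = ens + K @ (perturbed[None, :] - H @ ens)

print(ens[0].mean(), "->", analysis[0].mean())  # drawn toward the observation
```

    The gain blends the background ensemble with the observation in proportion to their uncertainties, and the same ensemble covariance spreads the correction to the unobserved state components.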

  6. Agent-Based Simulations for Project Management

    NASA Technical Reports Server (NTRS)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

    Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique being used at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.
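
    The resource-based idea, with duration as an output and with productivity effects and management corrective actions included, can be sketched in a few lines (a hypothetical toy model, not the ViaSim tool):

```python
# Duration is *computed* from work divided by effective effort, with a
# coordination-overhead productivity drag and a simple management rule
# that adds a resource whenever the task falls behind its planned pace.
def simulate_task(work=100.0, staff=2, rate=1.0, planned_finish=40, max_staff=6):
    day, done = 0, 0.0
    while done < work:
        day += 1
        # productivity drag: each extra person adds coordination overhead
        effective = staff * rate * (1.0 - 0.05 * (staff - 1))
        done += effective
        # management corrective action: behind plan -> add a resource
        expected = work * day / planned_finish
        if done < expected and staff < max_staff:
            staff += 1
    return day

print(simulate_task())            # finish day emerges from the model
print(simulate_task(staff=4))     # more initial staff -> earlier finish
```

    In a CPM tool the 40-day duration would simply be typed in; here the finish date emerges from work, staffing, and the corrective-action rule.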

  7. Sensitivity of mix in Inertial Confinement Fusion simulations to diffusion processes

    NASA Astrophysics Data System (ADS)

    Melvin, Jeremy; Cheng, Baolian; Rana, Verinder; Lim, Hyunkyung; Glimm, James; Sharp, David H.

    2015-11-01

    We explore two themes related to the simulation of mix within an Inertial Confinement Fusion (ICF) implosion, the role of diffusion (viscosity, mass diffusion and thermal conduction) processes and the impact of front tracking on the growth of the hydrodynamic instabilities. Using the University of Chicago HEDP code FLASH, we study the sensitivity of post-shot simulations of a NIC cryogenic shot to the diffusion models and front tracking of the material interfaces. Results of 1D and 2D simulations are compared to experimental quantities and an analysis of the current state of fully integrated ICF simulations is presented.

  8. Surface Roughness Instability Simulations of Inertial Confinement Fusion Implosions

    NASA Astrophysics Data System (ADS)

    McGlinchey, Kristopher; Niasse, Nicolas; Chittenden, Jeremy

    2016-10-01

    Understanding hydrodynamic instabilities seeded by the inherent roughness of a capsule's surface is critical in quantifying an implosion's performance. Combined with instabilities on the ice-gas interface during the deceleration phase, their growth can lead to inhomogeneity in the shell's areal density. Recent work carried out at the National Ignition Facility (NIF) on surface-roughness Rayleigh-Taylor Instability (RTI) growth rates shows larger amplitudes in experiment than in simulation, even with a deliberately roughened surface. We report on simulations of ICF experiments occurring at NIF using the Chimera code developed at Imperial College. Chimera is a fully explicit, Eulerian 3D multi-group radiation-hydrodynamics code utilising P1/3 automatic flux-limiting radiation transport with opacity data from a non-LTE atomic model also developed at Imperial College. One-dimensional simulations are briefly presented to highlight that proper shock timing and stagnation properties have been achieved, as are 2D harmonic perturbation simulations to benchmark their growth rates. Surface-roughness implosions (initialised from metrology data) were then simulated for shot N120321, a low-foot implosion with large surface perturbations, and shot N130927, a high-foot implosion. Synthetic radiographs of these implosions were constructed at low convergence ratio (3-4) for comparison to experiment, and at higher convergence to investigate what will be observable by new diagnostics in development at NIF.

  9. Radiation-MHD Simulations of Plasma-Jet-Driven Magneto-Inertial Fusion Gain Using USim

    NASA Astrophysics Data System (ADS)

    Stoltz, Peter; Beckwith, Kristian; Kundrapu, Mahdusudhan; Hsu, Scott; Langendorf, Samuel

    2016-10-01

    One goal of the modeling effort for the PLX-α project is to identify plasma-jet-driven magneto-inertial fusion (PJMIF) configurations with potential net fusion-energy gain. We use USim, a tool for modeling high-energy-density plasmas using multi-fluid models coupled to electromagnetics through fully implicit iterative solvers, combined with finite-volume discretizations on unstructured meshes. We include physical viscosity and advanced-EOS modeling capability, and are investigating the effects of different radiation (including flux-limited diffusion) and alpha-transport models. We compare 2D and 1D gain calculations for various liner geometries, parameters, and plasma species, and consider the effects of liner non-uniformities on fusion-gain degradation. Supported by the ARPA-E ALPHA Program.

  10. The UPSCALE project: a large simulation campaign

    NASA Astrophysics Data System (ADS)

    Mizielinski, Matthew; Roberts, Malcolm; Vidale, Pier Luigi; Schiemann, Reinhard; Demory, Marie-Estelle; Strachan, Jane

    2014-05-01

    The development of a traceable hierarchy of HadGEM3 global climate models, based upon the Met Office Unified Model, at resolutions from 135 km to 25 km, now allows the impact of resolution on the mean state, variability and extremes of climate to be studied in a robust fashion. In 2011 we successfully obtained a single-year grant of 144 million core hours of supercomputing time from the PRACE organization to run ensembles of 27-year atmosphere-only (HadGEM3-A GA3.0) climate simulations at 25 km resolution, as used in present global weather forecasting, on HERMIT at HLRS. Through 2012 the UPSCALE project (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) ran over 650 years of simulation at resolutions of 25 km (N512), 60 km (N216) and 135 km (N96) to look at the value of high-resolution climate models in the study of both the present climate and a potential future climate scenario based on RCP8.5. Over 400 TB of data was produced using HERMIT, with additional simulations run on HECToR (UK supercomputer) and MONSooN (Met Office NERC Supercomputing Node). The data generated was transferred to the JASMIN super-data cluster, hosted by STFC CEDA in the UK, where analysis facilities are allowing rapid scientific exploitation of the data set. Many groups across the UK and Europe are already taking advantage of these facilities and we welcome approaches from other interested scientists. This presentation will briefly cover the following points: the purpose and requirements of the UPSCALE project and the facilities used; technical implementation and hurdles (model porting and optimisation, automation, numerical failures, data transfer); ensemble specification; and current analysis projects and access to the data set. A full description of UPSCALE and the data set generated has been submitted to Geoscientific Model Development, with overview information available from http://proj.badc.rl.ac.uk/upscale .

  11. Gyrokinetic Simulation of Energetic Particles Turbulence and Transport in Fusion Plasmas

    NASA Astrophysics Data System (ADS)

    Zhang, Wenlu; Lin, Zhihong; Holod, Ihor; Xiao, Yong; Bierwage, Andreas; Spong, Donald; Chu, Ming

    2009-05-01

    The confinement of energetic particles (EP) is a critical issue in the International Thermonuclear Experimental Reactor (ITER), since ignition relies on self-heating by the fusion products. Shear Alfven wave excitations by EP in toroidal systems, for example the Toroidal Alfven Eigenmode (TAE) and the Energetic Particle Mode (EPM), have been investigated as primary candidates for fluctuation-induced transport of EP in fusion plasmas. In this work, TAE excitations by energetic particles are investigated in large-scale first-principles simulations of fusion plasmas using the global gyrokinetic toroidal code (GTC) [Lin, Science 1998]. Comprehensive linear benchmarking results are reported between GTC, GYRO, the fluid code TAEFL, and the magnetohydrodynamic-gyrokinetic hybrid code HMGC.

  12. Simulating Halos with the Caterpillar Project

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-04-01

    The Caterpillar Project is a beautiful series of high-resolution cosmological simulations. The goal of this project is to examine the evolution of dark-matter halos like the Milky Way's, to learn about how galaxies like ours formed. This immense computational project is still in progress, but the Caterpillar team is already providing a look at some of its first results. Lessons from Dark-Matter Halos: Why simulate the dark-matter halos of galaxies? Observationally, the formation history of our galaxy is encoded in galactic fossil-record clues, like the tidal debris from disrupted satellite galaxies in the outer reaches of our galaxy, or chemical abundance patterns throughout our galactic disk and stellar halo. But to interpret this information in a way that lets us learn about our galaxy's history, we need to first test galaxy formation and evolution scenarios via cosmological simulations. Then we can compare the end result of these simulations to what we observe today. This figure illustrates the difference that mass resolution makes: in the left panel, the mass resolution is 1.5*10^7 solar masses per particle; in the right panel, it is 3*10^4 solar masses per particle [Griffen et al. 2016]. A Computational Challenge: Because such simulations are computationally expensive, previous N-body simulations of the growth of Milky-Way-like halos have consisted of only one or a few halos each. But in order to establish a statistical understanding of how galaxy halos form (and find out whether the Milky Way's halo is typical or unusual!) it is necessary to simulate a larger number of halos. In addition, in order to accurately follow the formation and evolution of substructure within the dark-matter halos, these simulations must be able to resolve the smallest dwarf galaxies, which are around a million solar masses.
This requires an extremely high mass resolution, which adds to the computational expense of the simulation. First Outcomes: These are the challenges faced by

  13. StabilimaxNZ® versus simulated fusion: evaluation of adjacent-level effects

    PubMed Central

    Henderson, Gweneth; James, Yue; Timm, Jens Peter

    2007-01-01

    The rationale behind motion preservation devices is to eliminate the accelerated adjacent-level effects (ALE) associated with spinal fusion. We evaluated the multidirectional flexibilities and ALEs of StabilimaxNZ® and of simulated fusion applied to a decompressed spine. StabilimaxNZ® was applied at L4–L5 after creating a decompression (laminectomy of L4 plus bilateral medial facetectomy at L4–L5). Multidirectional Flexibility and Hybrid tests were performed on six fresh cadaveric human specimens (T12–S1). Decompression increased average flexion–extension rotation to 124.0% of the intact value. StabilimaxNZ® and simulated fusion decreased the motion to 62.4 and 23.8% of intact, respectively. In lateral bending, the corresponding increase was 121.6% and the decreases were 57.5 and 11.9%. In torsion, the corresponding increase was 132.7%, and the decreases were 36.3% for fusion and none for StabilimaxNZ®. ALE was defined as the percentage increase over the intact value. The ALE at L3–4 was 15.3% for StabilimaxNZ® versus 33.4% for fusion, while at L5–S1 the ALEs were 5.0% vs. 11.3%, respectively. In lateral bending, the corresponding ALE values were 3.0% vs. 19.1%, and 11.3% vs. 35.8%, respectively. In torsion, the corresponding values were 3.7% vs. 20.6%, and 4.0% vs. 33.5%, respectively. In conclusion, this in vitro study using the Flexibility and Hybrid test methods showed that StabilimaxNZ® stabilized the decompressed spinal level effectively in the sagittal and frontal planes, while allowing a good portion of the normal rotation, and concurrently did not produce significant ALEs as compared to fusion. However, it did not stabilize the decompressed specimen in torsion. PMID:17924151

  14. SciDAC Fusiongrid Project--A National Collaboratory to Advance the Science of High Temperature Plasma Physics for Magnetic Fusion

    SciTech Connect

    SCHISSEL, D.P.; ABLA, G.; BURRUSS, J.R.; FEIBUSH, E.; FREDIAN, T.W.; GOODE, M.M.; GREENWALD, M.J.; KEAHEY, K.; LEGGETT, T.; LI, K.; McCUNE, D.C.; PAPKA, M.E.; RANDERSON, L.; SANDERSON, A.; STILLERMAN, J.; THOMPSON, M.R.; URAM, T.; WALLACE, G.

    2006-08-31

    This report summarizes the work of the National Fusion Collaboratory (NFC) Project, funded by the United States Department of Energy (DOE) under the Scientific Discovery through Advanced Computing Program (SciDAC), to develop a persistent infrastructure to enable scientific collaboration for magnetic fusion research. A five-year project initiated in 2001, it built on the past collaborative work performed within the U.S. fusion community and added a component of computer science research done with the USDOE Office of Science, Office of Advanced Scientific Computing Research. The project was a collaboration itself, uniting fusion scientists from General Atomics, MIT, and PPPL and computer scientists from ANL, LBNL, Princeton University, and the University of Utah to form a coordinated team. The group leveraged existing computer science technology where possible and extended or created new capabilities where required. Developing a reliable energy system that is economically and environmentally sustainable is the long-term goal of Fusion Energy Science (FES) research. In the U.S., FES experimental research is centered at three large facilities with a replacement value of over $1B. As these experiments have increased in size and complexity, there has been a concurrent growth in the number and importance of collaborations among large groups at the experimental sites and smaller groups located nationwide. Teaming with the experimental community is a theoretical and simulation community whose efforts range from applied analysis of experimental data to fundamental theory (e.g., realistic nonlinear 3D plasma models) that runs on massively parallel computers. Looking toward the future, the large-scale experiments needed for FES research are staffed by correspondingly large, globally dispersed teams.
The fusion program will be increasingly oriented toward the International Thermonuclear Experimental Reactor (ITER) where even now, a decade before operation begins, a large

  15. Apparatus and method for simulating material damage from a fusion reactor

    DOEpatents

    Smith, D.L.; Greenwood, L.R.; Loomis, B.A.

    1988-05-20

    This paper discusses an apparatus and method for simulating a fusion environment on a first wall or blanket structure. A material test specimen is contained in a capsule made of a material having a low hydrogen solubility and permeability. The capsule is partially filled with a lithium solution, such that the test specimen is encapsulated by the lithium. The capsule is irradiated by a fast fission neutron source.

  16. Study of Plasma Liner Driven Magnetized Target Fusion Via Advanced Simulations

    SciTech Connect

    Samulyak, Roman V.; Parks, Paul

    2013-08-31

    The feasibility of the plasma liner driven Magnetized Target Fusion (MTF) concept is assessed via terascale numerical simulations. In the MTF concept, a plasma liner, formed by the merging of a number (60 or more) of radial, highly supersonic plasma jets, implodes on a target in the form of two compact plasma toroids and compresses it to fusion ignition conditions. By avoiding major difficulties associated with both traditional laser-driven inertial confinement fusion and solid-liner-driven MTF, plasma liner driven MTF potentially provides a low-cost and fast R&D path towards the demonstration of practical fusion energy. High-fidelity numerical simulations of the full nonlinear models associated with plasma liner MTF, using state-of-the-art numerical algorithms and terascale computing, are necessary in order to resolve uncertainties and provide guidance for future experiments. At Stony Brook University, we have developed unique computational capabilities that are ideally suited to the MTF problem. The FronTier code, developed in collaboration with BNL and LANL under DOE funding (including SciDAC) for the simulation of 3D multi-material hydro and MHD flows, has been benchmarked and used for fundamental and engineering problems in energy science applications. We have performed 3D simulations of converging supersonic plasma jets, their merger and the formation of the plasma liner, and a study of the corresponding oblique shock problem. We have studied the implosion of the plasma liner on the magnetized plasma target by resolving Rayleigh-Taylor instabilities in 2D and 3D and other relevant physics, and estimated the thermodynamic conditions of the target at the moment of maximum compression and the hydrodynamic efficiency of the method.

  17. Three-dimensional simulations of the implosion of inertial confinement fusion targets

    SciTech Connect

    Town, R.P.J.; Bell, A.R.

    1991-09-30

    The viability of inertial confinement fusion depends crucially on implosion symmetry. A spherical three-dimensional hydrocode called PLATO has been developed to model the growth in asymmetries during an implosion. Results are presented in the deceleration phase which show indistinguishable linear growth rates, but greater nonlinear growth of the Rayleigh-Taylor instability than is found in two-dimensional cylindrical simulations. The three-dimensional enhancement of the nonlinear growth is much smaller than that found by Sakagami and Nishihara.
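
    For context, the linear growth rates being compared follow the classical incompressible Rayleigh-Taylor dispersion relation γ = √(A g k) for Atwood number A, deceleration g and perturbation wavenumber k. A quick numerical sketch with illustrative values (not taken from the paper):

```python
import math

def rt_growth_rate(atwood, g, k):
    # classical incompressible linear RT growth rate: gamma = sqrt(A * g * k)
    return math.sqrt(atwood * g * k)

# Illustrative deceleration-phase numbers: Atwood number 0.5,
# deceleration 1e14 m/s^2, 10-micron perturbation wavelength.
gamma = rt_growth_rate(atwood=0.5, g=1e14, k=2 * math.pi / 1e-5)
print(gamma)                      # growth rate in 1/s
print(math.exp(gamma * 1e-9))     # amplitude amplification over 1 ns
```

    In the linear regime this rate is the same in 2D and 3D, which is consistent with the indistinguishable linear growth rates reported above; the 2D/3D differences appear only once the perturbations go nonlinear.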

  18. Apparatus and method for simulating material damage from a fusion reactor

    DOEpatents

    Smith, Dale L.; Greenwood, Lawrence R.; Loomis, Benny A.

    1989-01-01

    An apparatus and method for simulating a fusion environment on a first wall or blanket structure. A material test specimen is contained in a capsule made of a material having a low hydrogen solubility and permeability. The capsule is partially filled with a lithium solution, such that the test specimen is encapsulated by the lithium. The capsule is irradiated by a fast fission neutron source.

  19. Apparatus and method for simulating material damage from a fusion reactor

    DOEpatents

    Smith, Dale L.; Greenwood, Lawrence R.; Loomis, Benny A.

    1989-03-07

    An apparatus and method for simulating a fusion environment on a first wall or blanket structure. A material test specimen is contained in a capsule made of a material having a low hydrogen solubility and permeability. The capsule is partially filled with a lithium solution, such that the test specimen is encapsulated by the lithium. The capsule is irradiated by a fast fission neutron source.

  20. Analyzing large data sets from XGC1 magnetic fusion simulations using apache spark

    SciTech Connect

    Churchill, R. Michael

    2016-11-21

    Apache Spark is explored as a tool for analyzing large data sets from the magnetic fusion simulation code XGC1. Implementation details of Apache Spark on the NERSC Edison supercomputer are discussed, including binary file reading and parameter setup. Here, an unsupervised machine learning algorithm, k-means clustering, is applied to XGC1 particle distribution function data, showing that highly turbulent spatial regions do not have common coherent structures, but rather broad, ring-like structures in velocity space.
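    The clustering step described above can be illustrated without Spark. The following is a minimal numpy sketch of Lloyd's k-means algorithm applied to synthetic two-dimensional "velocity-space" samples; the data and parameters are invented for illustration and are not XGC1 output.

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    """Lloyd's algorithm: alternate nearest-centroid assignment and
    centroid update until the iteration budget is exhausted."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each sample to its nearest centroid (Euclidean distance).
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute each centroid as the mean of its assigned samples.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# Synthetic samples: two well-separated blobs standing in for structures
# in velocity space.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (200, 2)),
               rng.normal(1.0, 0.1, (200, 2))])
labels, centroids = kmeans(X, k=2)
```

In a Spark setting the same algorithm would run distributed over partitioned data; the serial version above only shows the core iteration.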

  1. Lipid droplets fusion in adipocyte differentiated 3T3-L1 cells: A Monte Carlo simulation

    SciTech Connect

    Boschi, Federico; Rizzatti, Vanni; Zamboni, Mauro; Sbarbati, Andrea

    2014-02-15

    Several widespread human diseases, such as obesity, type 2 diabetes, hepatic steatosis, atherosclerosis and other metabolic pathologies, are related to the excessive accumulation of lipids in cells. Lipids accumulate in spherical cellular inclusions called lipid droplets (LDs), whose sizes in adipocytes range from a fraction of a micrometer to one hundred micrometers. It has been suggested that LDs can grow in size due to a fusion process by which a larger LD is obtained with spherical shape and volume equal to the sum of the progenitors' volumes. In this study, the size distribution of two populations of LDs was analyzed in immature and mature (5-days differentiated) 3T3-L1 adipocytes (first and second populations, respectively) after Oil Red O staining. A Monte Carlo simulation of interaction between LDs has been developed in order to quantify the size distribution and the number of fusion events needed to obtain the distribution of the second population size starting from the first one. Four models are presented here based on different kinds of interaction: a surface weighted interaction (R2 Model), a volume weighted interaction (R3 Model), a random interaction (Random Model) and an interaction related to the place where the LDs are born (Nearest Model). The last two models mimic quite well the behavior found in the experimental data. This work represents a first step in developing numerical simulations of the LD growth process. Due to the complex phenomena involving LDs (absorption, growth through additional neutral lipid deposition in existing droplets, de novo formation and catabolism), the study focuses on the fusion process. The results suggest that, to obtain the observed size distribution, a number of fusion events comparable with the number of LDs themselves is needed. Moreover, the MC approach proves to be a powerful tool for investigating the LD growth process. Highlights: • We evaluated the role of the fusion process in the synthesis of the lipid droplets. • We compared the
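    The volume-conserving fusion rule described above (a merged droplet's volume equals the sum of its progenitors') can be sketched as a simple Monte Carlo loop. Weighting the pair selection by r³ mimics the volume-weighted "R3 Model"; all parameter values below are illustrative, not taken from the paper.

```python
import numpy as np

def fuse_droplets(radii, n_events, weight_power=3, seed=0):
    """Monte Carlo fusion: repeatedly pick two droplets, with probability
    proportional to r**weight_power (3 for a volume-weighted 'R3'-style
    model), and merge them into one sphere of equal total volume."""
    rng = np.random.default_rng(seed)
    radii = list(radii)
    for _ in range(n_events):
        w = np.array(radii) ** weight_power
        i, j = rng.choice(len(radii), size=2, replace=False, p=w / w.sum())
        # Volume conservation: r_new**3 == r_i**3 + r_j**3.
        r_new = (radii[i] ** 3 + radii[j] ** 3) ** (1.0 / 3.0)
        # Remove the two progenitors, append the fused droplet.
        for idx in sorted((i, j), reverse=True):
            radii.pop(idx)
        radii.append(r_new)
    return np.array(radii)

initial = np.full(1000, 1.0)            # 1000 droplets of unit radius
final = fuse_droplets(initial, n_events=400)
```

Each fusion event reduces the droplet count by one while keeping total volume fixed, so the size distribution shifts toward fewer, larger droplets as the number of events grows.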

  2. High-level multifunction radar simulation for studying the performance of multisensor data fusion systems

    NASA Astrophysics Data System (ADS)

    Huizing, Albert G.; Bosse, Eloi

    1998-07-01

    This paper presents the basic requirements for a simulation of the main capabilities of a shipborne MultiFunction Radar (MFR) that can be used in conjunction with other sensor simulations in scenarios for studying Multi Sensor Data Fusion (MSDF) systems. This simulation is being used to support an ongoing joint effort (Canada - The Netherlands) in the development of MSDF testbeds. This joint effort is referred to as Joint-FACET (Fusion Algorithms & Concepts Exploration Testbed), a highly modular and flexible series of applications capable of processing both real and synthetic input data. The question raised here is how realistic the sensor simulations must be for the MSDF performance assessment to be trusted. A partial answer is that, at a minimum, the dominant perturbing effects on sensor detection (true or false) must be sufficiently represented. Following this philosophy, the MFR model presented here takes into account the sensor's design parameters and external environmental effects such as clutter, propagation and jamming. Previous radar simulations capture most of these dominant effects. In this paper the emphasis is on the MFR scheduler, the key element that must be added to previous simulations to represent the MFR's capability to search for and track a large number of targets while simultaneously supporting a large number of (semi-active) surface-to-air missiles (SAMs) for the engagement of multiple hostile targets.
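    The scheduling problem sketched above, interleaving search, track, and missile-guidance dwells on one aperture, can be illustrated with a toy earliest-deadline-first scheduler. The task names, periods, and priorities below are hypothetical and are not taken from the paper's MFR model.

```python
import heapq

def schedule(tasks, horizon):
    """Toy MFR dwell scheduler: repeatedly execute the pending task that
    is due soonest (ties broken by priority number, lower = more urgent),
    then re-queue it one period later. Each dwell takes one time unit."""
    # Each task is a tuple (due_time, priority, name, period).
    heap = list(tasks)
    heapq.heapify(heap)
    timeline = []
    t = 0
    while t < horizon and heap:
        due, prio, name, period = heapq.heappop(heap)
        t = max(t, due)              # idle until the task becomes due
        if t >= horizon:
            break
        timeline.append((t, name))
        heapq.heappush(heap, (t + period, prio, name, period))
        t += 1                       # dwell occupies one time unit
    return timeline

# Hypothetical task mix: surveillance search, track updates, SAM guidance.
tasks = [(0, 2, "search", 4), (0, 1, "track", 3), (0, 0, "guidance", 5)]
timeline = schedule(tasks, horizon=20)
```

A real MFR scheduler must also handle variable dwell lengths, transmit-duty constraints, and preemption; this sketch only shows the core queue discipline.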

  3. An interprojection sensor fusion approach to estimate blocked projection signal in synchronized moving grid-based CBCT system

    SciTech Connect

    Zhang, Hong; Kong, Vic; Ren, Lei; Giles, William; Zhang, You; Jin, Jian-Yue

    2016-01-15

    Purpose: A preobject grid can reduce and correct scatter in cone beam computed tomography (CBCT). However, half of the signal in each projection is blocked by the grid. A synchronized moving grid (SMOG) has been proposed to acquire two complementary projections at each gantry position and merge them into one complete projection. That approach, however, suffers from increased scanning time and the technical difficulty of accurately merging the two projections per gantry angle. Herein, the authors present a new SMOG approach which acquires a single projection per gantry angle, with complementary grid patterns for any two adjacent projections, and uses an interprojection sensor fusion (IPSF) technique to estimate the blocked signal in each projection. The method may have the additional benefit of reduced imaging dose due to the grid blocking half of the incident radiation. Methods: The IPSF considers multiple paired observations from two adjacent gantry angles as approximations of the blocked signal and uses a weighted least squares regression of these observations to finally determine the blocked signal. The method was first tested with a simulated SMOG on a head phantom. The signal to noise ratio (SNR), which represents the difference of the recovered CBCT image from the original image without the SMOG, was used to evaluate the ability of the IPSF to recover the missing signal. The IPSF approach was then tested using a Catphan phantom on a prototype SMOG assembly installed in a bench top CBCT system. Results: In the simulated SMOG experiment, the SNRs increased from 15.1 and 12.7 dB to 35.6 and 28.9 dB compared with a conventional interpolation method (inpainting) for a projection and the reconstructed 3D image, respectively, suggesting that the IPSF successfully recovered most of the blocked signal. In the prototype SMOG experiment, the authors successfully reconstructed a CBCT image using the IPSF-SMOG approach. The detailed geometric features in the
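    The weighted least squares step at the heart of the IPSF can be illustrated in simplified scalar form: when a constant blocked value is approximated by several observations with different weights, the minimizer of the weighted squared error has a closed form. The observation values and weights below are hypothetical and stand in for candidate pixel values drawn from the two adjacent complementary-grid projections.

```python
import numpy as np

def ipsf_estimate(observations, weights):
    """Weighted least squares estimate of a scalar blocked signal.

    Modeling each observation as y_i = x + noise_i with weight w_i,
    minimizing sum(w_i * (y_i - x)**2) over x gives the closed form
    x_hat = sum(w_i * y_i) / sum(w_i)."""
    y = np.asarray(observations, dtype=float)
    w = np.asarray(weights, dtype=float)
    return float(np.sum(w * y) / np.sum(w))

# Hypothetical candidate values of one blocked detector pixel, taken
# from adjacent gantry angles and weighted by an assumed similarity
# measure (higher weight = more trusted observation).
estimate = ipsf_estimate([102.0, 98.0, 110.0], [0.5, 0.4, 0.1])
```

With equal weights the estimator reduces to the plain mean, so the weighting is what lets more similar neighboring observations dominate the reconstruction.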

  4. An interprojection sensor fusion approach to estimate blocked projection signal in synchronized moving grid-based CBCT system

    PubMed Central

    Zhang, Hong; Ren, Lei; Kong, Vic; Giles, William; Zhang, You; Jin, Jian-Yue

    2016-01-01

    Purpose: A preobject grid can reduce and correct scatter in cone beam computed tomography (CBCT). However, half of the signal in each projection is blocked by the grid. A synchronized moving grid (SMOG) has been proposed to acquire two complementary projections at each gantry position and merge them into one complete projection. That approach, however, suffers from increased scanning time and the technical difficulty of accurately merging the two projections per gantry angle. Herein, the authors present a new SMOG approach which acquires a single projection per gantry angle, with complementary grid patterns for any two adjacent projections, and uses an interprojection sensor fusion (IPSF) technique to estimate the blocked signal in each projection. The method may have the additional benefit of reduced imaging dose due to the grid blocking half of the incident radiation. Methods: The IPSF considers multiple paired observations from two adjacent gantry angles as approximations of the blocked signal and uses a weighted least squares regression of these observations to finally determine the blocked signal. The method was first tested with a simulated SMOG on a head phantom. The signal to noise ratio (SNR), which represents the difference of the recovered CBCT image from the original image without the SMOG, was used to evaluate the ability of the IPSF to recover the missing signal. The IPSF approach was then tested using a Catphan phantom on a prototype SMOG assembly installed in a bench top CBCT system. Results: In the simulated SMOG experiment, the SNRs increased from 15.1 and 12.7 dB to 35.6 and 28.9 dB compared with a conventional interpolation method (inpainting) for a projection and the reconstructed 3D image, respectively, suggesting that the IPSF successfully recovered most of the blocked signal. In the prototype SMOG experiment, the authors successfully reconstructed a CBCT image using the IPSF-SMOG approach. The detailed geometric features in the

  5. Transition from Beam-Target to Thermonuclear Fusion in High-Current Deuterium Z-Pinch Simulations

    NASA Astrophysics Data System (ADS)

    Offermann, Dustin; Welch, Dale; Rose, Dave; Thoma, Carsten; Clark, Robert; Mostrom, Chris; Schmidt, Andrea; Link, Anthony

    2016-10-01

    Fusion yields from dense, Z-pinch plasmas are known to scale with the drive current, which is favorable for many potential applications. Decades of experimental studies, however, show an unexplained drop in yield for currents above a few mega-ampere (MA). In this work, simulations of DD Z-Pinch plasmas have been performed in 1D and 2D for a constant pinch time and initial radius using the code LSP, and observations of a shift in scaling are presented. The results show that yields below 3 MA are enhanced relative to pure thermonuclear scaling by beamlike particles accelerated in the Rayleigh-Taylor induced electric fields, while yields above 3 MA are reduced because of energy lost by the instability and the inability of the beamlike ions to enter the pinch region. This research was developed with funding from the Defense Advanced Research Projects Agency (DARPA).

  6. Simulations of mixing in Inertial Confinement Fusion with front tracking and sub-grid scale models

    NASA Astrophysics Data System (ADS)

    Rana, Verinder; Lim, Hyunkyung; Melvin, Jeremy; Cheng, Baolian; Glimm, James; Sharp, David

    2015-11-01

    We present two related results. The first discusses the Richtmyer-Meshkov instability (RMI) and Rayleigh-Taylor instability (RTI) and their evolution in Inertial Confinement Fusion simulations. We show the evolution of the RMI to the late time RTI under transport effects and tracking. Sub-grid scale models help capture the interaction of turbulence with diffusive processes. The second result assesses the effects of concentration on the physics model and examines the mixing properties in the low Reynolds number hot spot. We discuss the effect of concentration on the Schmidt number. The simulation results are produced using the University of Chicago code FLASH and Stony Brook University's front tracking algorithm.

  7. A Simulation Environment for Benchmarking Sensor Fusion-Based Pose Estimators

    PubMed Central

    Ligorio, Gabriele; Sabatini, Angelo Maria

    2015-01-01

    In-depth analysis and performance evaluation of sensor fusion-based estimators may be critical when performed using real-world sensor data. For this reason, simulation is widely recognized as one of the most powerful tools for algorithm benchmarking. In this paper, we present a simulation framework suitable for assessing the performance of sensor fusion-based pose estimators. The systems used for implementing the framework were magnetic/inertial measurement units (MIMUs) and a camera, although the addition of further sensing modalities is straightforward. Typical nuisance factors were also included for each sensor. The proposed simulation environment was validated using real-life sensor data employed for motion tracking. The highest mismatch between real and simulated sensors was about 5% of the measured quantity (for the camera simulation), whereas the lowest correlation was found for one axis of the gyroscope (0.90). In addition, a real benchmarking example of an extended Kalman filter for pose estimation from MIMU and camera data is presented. PMID:26703603

  8. Online Simulation of Radiation Track Structure Project

    NASA Technical Reports Server (NTRS)

    Plante, Ianik

    2015-01-01

    Space radiation comprises protons, helium and high charge and energy (HZE) particles. High-energy particles are a concern for human space flight, because there are no known options for shielding astronauts from them. When these ions interact with matter, they damage molecules and create radiolytic species. The pattern of energy deposition and the positions of the radiolytic species, called the radiation track structure, are highly dependent on the charge and energy of the ion. The radiolytic species damage biological molecules, which may lead to several long-term health effects such as cancer. Because of the importance of heavy ions, the radiation community is very interested in the interaction of HZE particles with DNA, notably with regard to the track structure. A desktop program named RITRACKS was developed to simulate radiation track structure. The goal of this project is to create a web interface to allow registered internal users to use RITRACKS remotely.

  9. Neutral Buoyancy Simulator - EASE Project (NB32)

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Once the United States' space program had progressed from Earth's orbit into outer space, the prospect of building and maintaining a permanent presence in space was realized. To accomplish this feat, NASA launched a temporary workstation, Skylab, to discover the effects of low gravity and weightlessness on the human body, and also to develop tools and equipment that would be needed in the future to build and maintain a more permanent space station. The structures, techniques, and work schedules had to be carefully designed to fit this unique construction site. The components had to be lightweight for transport into orbit, yet durable. The station also had to be made with removable parts for easy servicing and repairs by astronauts. All of the tools necessary for service and repairs had to be designed for easy manipulation by a suited astronaut. Construction methods had to be efficient due to the limited time the astronauts could remain outside their controlled environment. In light of all the specific needs of this project, an environment had to be developed on Earth that could simulate a low gravity atmosphere. A Neutral Buoyancy Simulator (NBS) was constructed by NASA's Marshall Space Flight Center (MSFC) in 1968. Since then, NASA scientists have used this facility to understand how humans work best in low gravity and also to provide information about the different kinds of structures that can be built. Pictured is a Massachusetts Institute of Technology (MIT) student working in a spacesuit on the Experimental Assembly of Structures in Extravehicular Activity (EASE) project, which was developed as a joint effort between MSFC and MIT. The EASE experiment required that crew members assemble small components to form larger components, working from the payload bay of the space shuttle. The MIT student in this photo is assembling two six-beam tetrahedrons.

  10. Characterizing the Conformational Landscape of Flavivirus Fusion Peptides via Simulation and Experiment.

    PubMed

    Marzinek, Jan K; Lakshminarayanan, Rajamani; Goh, Eunice; Huber, Roland G; Panzade, Sadhana; Verma, Chandra; Bond, Peter J

    2016-01-20

    Conformational changes in the envelope proteins of flaviviruses help to expose the highly conserved fusion peptide (FP), a region which is critical to membrane fusion and host cell infection, and which represents a significant target for antiviral drugs and antibodies. In principle, extended timescale atomic-resolution simulations may be used to characterize the dynamics of such peptides. However, the resultant accuracy is critically dependent upon both the underlying force field and sufficient conformational sampling. In the present study, we report a comprehensive comparison of three simulation methods and four force fields comprising a total of more than 40 μs of sampling. Additionally, we describe the conformational landscape of the FP fold across all flavivirus family members. All investigated methods sampled conformations close to available X-ray structures, but exhibited differently populated ensembles. The best force field / sampling combination was sufficiently accurate to predict that the solvated peptide fold is less ordered than in the crystallographic state, which was subsequently confirmed via circular dichroism and spectrofluorometric measurements. Finally, the conformational landscape of a mutant incapable of membrane fusion was significantly shallower than wild-type variants, suggesting that dynamics should be considered when therapeutically targeting FP epitopes.

  11. Characterizing the Conformational Landscape of Flavivirus Fusion Peptides via Simulation and Experiment

    PubMed Central

    Marzinek, Jan K.; Lakshminarayanan, Rajamani; Goh, Eunice; Huber, Roland G.; Panzade, Sadhana; Verma, Chandra; Bond, Peter J.

    2016-01-01

    Conformational changes in the envelope proteins of flaviviruses help to expose the highly conserved fusion peptide (FP), a region which is critical to membrane fusion and host cell infection, and which represents a significant target for antiviral drugs and antibodies. In principle, extended timescale atomic-resolution simulations may be used to characterize the dynamics of such peptides. However, the resultant accuracy is critically dependent upon both the underlying force field and sufficient conformational sampling. In the present study, we report a comprehensive comparison of three simulation methods and four force fields comprising a total of more than 40 μs of sampling. Additionally, we describe the conformational landscape of the FP fold across all flavivirus family members. All investigated methods sampled conformations close to available X-ray structures, but exhibited differently populated ensembles. The best force field / sampling combination was sufficiently accurate to predict that the solvated peptide fold is less ordered than in the crystallographic state, which was subsequently confirmed via circular dichroism and spectrofluorometric measurements. Finally, the conformational landscape of a mutant incapable of membrane fusion was significantly shallower than wild-type variants, suggesting that dynamics should be considered when therapeutically targeting FP epitopes. PMID:26785994

  12. Characterizing the Conformational Landscape of Flavivirus Fusion Peptides via Simulation and Experiment

    NASA Astrophysics Data System (ADS)

    Marzinek, Jan K.; Lakshminarayanan, Rajamani; Goh, Eunice; Huber, Roland G.; Panzade, Sadhana; Verma, Chandra; Bond, Peter J.

    2016-01-01

    Conformational changes in the envelope proteins of flaviviruses help to expose the highly conserved fusion peptide (FP), a region which is critical to membrane fusion and host cell infection, and which represents a significant target for antiviral drugs and antibodies. In principle, extended timescale atomic-resolution simulations may be used to characterize the dynamics of such peptides. However, the resultant accuracy is critically dependent upon both the underlying force field and sufficient conformational sampling. In the present study, we report a comprehensive comparison of three simulation methods and four force fields comprising a total of more than 40 μs of sampling. Additionally, we describe the conformational landscape of the FP fold across all flavivirus family members. All investigated methods sampled conformations close to available X-ray structures, but exhibited differently populated ensembles. The best force field / sampling combination was sufficiently accurate to predict that the solvated peptide fold is less ordered than in the crystallographic state, which was subsequently confirmed via circular dichroism and spectrofluorometric measurements. Finally, the conformational landscape of a mutant incapable of membrane fusion was significantly shallower than wild-type variants, suggesting that dynamics should be considered when therapeutically targeting FP epitopes.

  13. The GEM Detector projective alignment simulation system

    SciTech Connect

    Wuest, C.R.; Belser, F.C.; Holdener, F.R.; Roeben, M.D.; Paradiso, J.A.; Mitselmakher, G.; Ostapchuk, A.; Pier-Amory, J.

    1993-07-09

    Precision position knowledge (< 25 microns RMS) of the GEM Detector muon system at the Superconducting Super Collider Laboratory (SSCL) is an important physics requirement necessary to minimize sagitta error in detecting and tracking high energy muons that are deflected by the magnetic field within the GEM Detector. To validate the concept of the sagitta correction function determined by projective alignment of the muon detectors (Cathode Strip Chambers or CSCs), the basis of the proposed GEM alignment scheme, a facility called the "Alignment Test Stand" (ATS) is being constructed. This system simulates the environment that the CSCs and chamber alignment systems are expected to experience in the GEM Detector, albeit without the 0.8 T magnetic field and radiation environment. The ATS experimental program will allow systematic study and characterization of the projective alignment approach, as well as general mechanical engineering of muon chamber mounting concepts, positioning systems and study of the mechanical behavior of the proposed 6-layer CSCs. The ATS will consist of a stable local coordinate system in which mock-ups of muon chambers (i.e., non-working mechanical analogs, representing the three superlayers of a selected barrel and endcap alignment tower) are implemented, together with a sufficient number of alignment monitors to overdetermine the sagitta correction function, providing a self-consistency check. This paper describes the approach to be used for the alignment of the GEM muon system, the design of the ATS, and the experiments to be conducted using the ATS.

  14. Quasi-spherical direct drive fusion simulations for the Z machine and future accelerators.

    SciTech Connect

    VanDevender, J. Pace; McDaniel, Dillon Heirman; Roderick, Norman Frederick; Nash, Thomas J.

    2007-11-01

    We explored the potential of Quasi-Spherical Direct Drive (QSDD) to reduce the cost and risk of a future fusion driver for Inertial Confinement Fusion (ICF) and to produce megajoule thermonuclear yield on the renovated Z Machine with a pulse-shortening Magnetically Insulated Current Amplifier (MICA). Analytic relationships for constant implosion velocity and constant pusher stability have been derived and show that the required current scales as the implosion time. Therefore, a MICA is necessary to drive QSDD capsules with hot-spot ignition on Z. We have optimized the LASNEX parameters for QSDD with realistic walls and mitigated many of the risks. Although the mix-degraded 1D yield is computed to be ~30 MJ on Z, unmitigated wall expansion under the > 100 gigabar pressure just before burn prevents ignition in the 2D simulations. A squeezer system of adjacent implosions may mitigate the wall expansion and permit the plasma to burn.

  15. Adaptive multifocus image fusion using block compressed sensing with smoothed projected Landweber integration in the wavelet domain.

    PubMed

    V S, Unni; Mishra, Deepak; Subrahmanyam, G R K S

    2016-12-01

    The need for image fusion in current image processing systems is increasing, mainly due to the increased number and variety of image acquisition techniques. Image fusion is the process of combining substantial information from several sensors using mathematical techniques in order to create a single composite image that is more comprehensive and thus more useful for a human operator or other computer vision tasks. This paper presents a new approach to multifocus image fusion based on sparse signal representation. Block-based compressive sensing, integrated with a projection-driven compressive sensing (CS) recovery that encourages sparsity in the wavelet domain, is used to obtain the focused image from a set of out-of-focus images. Compression is achieved during the image acquisition process using a block compressive sensing method. An adaptive thresholding technique within the smoothed projected Landweber recovery process reconstructs high-resolution focused images from low-dimensional CS measurements of out-of-focus images. The discrete wavelet transform and the dual-tree complex wavelet transform are used as the sparsifying bases for the proposed fusion. The main finding is that sparsification enables a better selection of the fusion coefficients and hence better fusion. A Laplacian mixture model is fitted in the wavelet domain, and estimation of the probability density function (pdf) parameters by expectation maximization leads to the proper selection of the coefficients of the fused image. Compared with the same fusion scheme without the projected Landweber (PL) step and with other existing CS-based fusion approaches, the proposed method outperforms the alternatives even with fewer samples.
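    The projected-Landweber-style recovery used above can be sketched as Landweber gradient steps interleaved with a soft-thresholding projection that promotes sparsity. For brevity this sketch thresholds in the identity basis rather than a wavelet basis, and the signal, measurement matrix, and parameters are synthetic.

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding operator: shrinks coefficients toward zero,
    promoting sparsity in the recovered signal."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def projected_landweber(A, y, n_iter=200, tau=0.05):
    """Landweber iterations x <- x + step * A.T @ (y - A @ x), each
    followed by soft thresholding (an ISTA-style scheme). A wavelet-
    domain variant would threshold the transform coefficients instead."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # step below 2/||A||^2 converges
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft(x + step * A.T @ (y - A @ x), tau * step)
    return x

# Synthetic compressed-sensing setup: a 3-sparse signal of length 128
# observed through 64 random measurements.
rng = np.random.default_rng(0)
n, m = 128, 64
x_true = np.zeros(n)
x_true[[5, 40, 90]] = [1.0, -2.0, 1.5]
A = rng.normal(size=(m, n)) / np.sqrt(m)
y = A @ x_true
x_rec = projected_landweber(A, y)
```

The thresholding step is what distinguishes this from plain Landweber iteration: without it, the underdetermined system has no unique solution, while the sparsity prior steers the iterates toward the sparse signal.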

  16. Detector Simulations for the COREA Project

    NASA Astrophysics Data System (ADS)

    Lee, Sungwon; Kang, Hyesung

    2006-12-01

    The COREA (COsmic ray Research and Education Array in Korea) project aims to build a ground array of particle detectors distributed over the Korean Peninsula, through collaborations of high school students, educators, and university researchers, in order to study the origin of ultra high energy cosmic rays. At its final configuration, the COREA array will consist of about 2000 detector stations covering an area of several hundred km² and will detect electrons and muons in extensive air-showers triggered by high energy particles. During the initial phase, the COREA array will start with a small number of detector stations in Seoul area schools. In this paper, we have studied by Monte Carlo simulations how to select detector sites for optimal detection efficiency for proton triggered air-showers. We considered several model clusters with up to 30 detector stations and calculated the effective number of air-shower events that can be detected per year for each cluster. The greatest detection efficiency is achieved when the mean distance between detector stations of a cluster is comparable to the effective radius of the air-shower at a given proton energy. We find the detection efficiency of a cluster with randomly selected detector sites is comparable to that of clusters with uniform detector spacing. We also considered a hybrid cluster with 60 detector stations that combines a small cluster with Δl ≈ 100 m and a large cluster with Δl ≈ 1 km. We suggest that this can be an ideal configuration for the initial phase of the COREA project, since it can measure cosmic rays over a wide energy range, 10^16 eV ≤ E ≤ 10^19 eV, with a reasonable detection rate.
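    The spacing-versus-shower-radius effect described above can be mimicked with a toy Monte Carlo: shower cores land uniformly over a box, a station fires when it lies within the effective shower radius of the core, and an event is "detected" when enough stations fire. All geometry and thresholds below are invented for illustration, not COREA parameters.

```python
import numpy as np

def detection_fraction(stations, r_shower, n_events=5000, box=2000.0,
                       min_hits=3, seed=0):
    """Fraction of randomly placed shower cores detected: a station
    fires if it is within r_shower of the core, and an event counts
    when at least min_hits stations fire."""
    rng = np.random.default_rng(seed)
    cores = rng.uniform(0.0, box, size=(n_events, 2))
    d = np.linalg.norm(stations[None, :, :] - cores[:, None, :], axis=2)
    hits = (d < r_shower).sum(axis=1)
    return float((hits >= min_hits).mean())

# Hypothetical uniform 10x10 station grid with 200 m spacing in a 2 km box.
xs = np.linspace(100.0, 1900.0, 10)
grid = np.array([(x, y) for x in xs for y in xs])
eff_small = detection_fraction(grid, r_shower=100.0)  # radius << spacing
eff_large = detection_fraction(grid, r_shower=400.0)  # radius ~ 2x spacing
```

When the shower radius is well below the station spacing almost no event reaches the multi-station coincidence threshold, while a radius comparable to (or larger than) the spacing detects most events, which is the qualitative optimum the paper's simulations quantify.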

  17. Simulations of longitudinal beam dynamics of space-charge dominated beams for heavy ion fusion

    SciTech Connect

    Miller, Debra Ann Callahan

    1994-12-01

    The longitudinal instability has potentially disastrous effects on the ion beams used for heavy ion driven inertial confinement fusion. This instability is a "resistive wall" instability, with the impedance coming from the induction modules in the accelerator used as a driver. It can greatly amplify perturbations launched from the beam head and can prevent focusing of the beam onto the small spot necessary for fusion. The instability has been studied using the WARPrz particle-in-cell code, a 2 1/2 dimensional electrostatic axisymmetric code that includes a model for the impedance of the induction modules. Simulations with resistances similar to those expected in a driver show moderate growth of the instability as a perturbation travels from beam head to tail, as predicted by cold beam fluid theory. The perturbation reflects off the beam tail and decays as it travels toward the beam head. Nonlinear effects cause the perturbation to steepen during reflection. Including the capacitive component of the module impedance has a partially stabilizing effect on the longitudinal instability. This reduction in the growth rate is seen both in cold beam fluid theory and in simulations with WARPrz. Instability growth rates for warm beams measured from WARPrz are lower than cold beam fluid theory predicts. Longitudinal thermal spread cannot account for this decrease in the growth rate. A mechanism for coupling the transverse thermal spread to decay of the longitudinal waves is presented. The longitudinal instability is no longer a threat to the heavy ion fusion program. The simulations in this thesis have shown that the growth rate for this instability will not be as large as earlier calculations predicted.

  18. Adjoint Monte Carlo simulation of fusion product activation probe experiment in ASDEX Upgrade tokamak

    NASA Astrophysics Data System (ADS)

    Äkäslompolo, S.; Bonheure, G.; Tardini, G.; Kurki-Suonio, T.; The ASDEX Upgrade Team

    2015-10-01

    The activation probe is a robust tool to measure the flux of fusion products from a magnetically confined plasma. A carefully chosen solid sample is exposed to the flux, and the impinging ions transmute the material, making it radioactive. Ultra-low level gamma-ray spectroscopy is used post mortem to measure the activity and, thus, the number of fusion products. This contribution presents the numerical analysis of the first measurement in the ASDEX Upgrade tokamak, which was also the first experiment to measure a single discharge. The ASCOT suite of codes was used to perform adjoint/reverse Monte Carlo calculations of the fusion products. The analysis facilitates, for the first time, a comparison of numerical and experimental values for an absolutely calibrated flux. The results agree to within a factor of about two, which can be considered quite good given that not all features of the plasma can be accounted for in the simulations. An alternative to the present probe orientation was also studied. The results suggest that a better optimized orientation could measure the flux from a significantly larger part of the plasma. A shorter version of this contribution is due to be published in PoS at: 1st EPS conference on Plasma Diagnostics

  19. Improving Project Management with Simulation and Completion Distribution Functions

    NASA Technical Reports Server (NTRS)

    Cates, Grant R.

    2004-01-01

    Despite the critical importance of project completion timeliness, management practices in place today remain inadequate for addressing the persistent problem of project completion tardiness. A major culprit in late projects is uncertainty, which most, if not all, projects are inherently subject to. This uncertainty resides in the estimates for activity durations, the occurrence of unplanned and unforeseen events, and the availability of critical resources. In response to this problem, this research developed a comprehensive simulation based methodology for conducting quantitative project completion time risk analysis. It is called the Project Assessment by Simulation Technique (PAST). This new tool enables project stakeholders to visualize uncertainty or risk, i.e. the likelihood of their project completing late and the magnitude of the lateness, by providing them with a completion time distribution function of their projects. Discrete event simulation is used within PAST to determine the completion distribution function for the project of interest. The simulation is populated with both deterministic and stochastic elements. The deterministic inputs include planned project activities, precedence requirements, and resource requirements. The stochastic inputs include activity duration growth distributions, probabilities for events that can impact the project, and other dynamic constraints that may be placed upon project activities and milestones. These stochastic inputs are based upon past data from similar projects. The time for an entity to complete the simulation network, subject to both the deterministic and stochastic factors, represents the time to complete the project. Repeating the simulation hundreds or thousands of times allows one to create the project completion distribution function. The Project Assessment by Simulation Technique was demonstrated to be effective for the on-going NASA project to assemble the International Space Station. 

  20. Multi-step formation of a hemifusion diaphragm for vesicle fusion revealed by all-atom molecular dynamics simulations.

    PubMed

    Tsai, Hui-Hsu Gavin; Chang, Che-Ming; Lee, Jian-Bin

    2014-06-01

    Membrane fusion is essential for intracellular trafficking and virus infection, but the molecular mechanisms underlying the fusion process remain poorly understood. In this study, we employed all-atom molecular dynamics simulations to investigate the membrane fusion mechanism using vesicle models which were pre-bound by inter-vesicle Ca(2+)-lipid clusters to approximate Ca(2+)-catalyzed fusion. Our results show that the formation of the hemifusion diaphragm for vesicle fusion is a multi-step event. This result contrasts with the assumptions made in most continuum models. The neighboring hemifused states are separated by an energy barrier on the energy landscape. The hemifusion diaphragm is much thinner than the planar lipid bilayers. The thinning of the hemifusion diaphragm during its formation results in the opening of a fusion pore for vesicle fusion. This work provides new insights into the formation of the hemifusion diaphragm and thus increases understanding of the molecular mechanism of membrane fusion. This article is part of a Special Issue entitled: Membrane Structure and Function: Relevance in the Cell's Physiology, Pathology and Therapy.

  1. Final Report for Project "Framework Application for Core-Edge Transport Simulations (FACETS)"

    SciTech Connect

    Estep, Donald

    2014-01-17

    This is the final report for the Colorado State University component of the FACETS project. FACETS focused on the development of a multiphysics, parallel framework application that could enable whole-device fusion reactor modeling and, in the process, the development of the modeling infrastructure and computational understanding needed for ITER. FACETS was intended to be highly flexible, through the use of modern computational methods including component technology and object-oriented design, to facilitate switching from one model to another for a given aspect of the physics, and to make it possible to use simplified models for rapid turnaround or high-fidelity models that take advantage of the largest supercomputer hardware. FACETS was designed in a heterogeneous parallel context, where different parts of the application can exploit parallelism based on task farming, domain decomposition, and/or pipelining as needed and applicable. As with all fusion simulations, an integral part of the FACETS project was the treatment of the coupling of different physical processes at different scales interacting closely. A primary example for the FACETS project is the coupling of existing core and edge simulations, with the transport and wall interactions described by reduced models. However, core and edge simulations themselves involve significant coupling of different processes with large scale differences. Numerical treatment of coupling is affected by a number of factors, including scale differences, the form of the information transferred between processes, the implementation of solvers in the different codes, and high-performance computing concerns. Operator decomposition, in which the individual processes are computed separately using appropriate simulation codes and the component simulations are then linked and synchronized at regular points in space and time, is the de facto approach to high-performance simulation of multiphysics

  2. Computational Plasma Physics at the Bleeding Edge: Simulating Kinetic Turbulence Dynamics in Fusion Energy Sciences

    NASA Astrophysics Data System (ADS)

    Tang, William

    2013-04-01

    Advanced computing is generally recognized to be an increasingly vital tool for accelerating progress in scientific research in the 21st Century. The imperative is to translate the combination of rapid advances in supercomputing power and the emergence of effective new algorithms and computational methodologies into corresponding increases in the physics fidelity and performance of the scientific codes used to model complex physical systems. If properly validated against experimental measurements and verified with mathematical tests and computational benchmarks, these codes can provide more reliable predictive capability for the behavior of complex systems, including fusion-energy-relevant high temperature plasmas. The magnetic fusion energy research community has made excellent progress in developing advanced codes for which computer run-time and problem size scale very well with the number of processors on massively parallel supercomputers. A good example is the effective usage of the full power of modern leadership-class computational platforms, from the terascale to the petascale and beyond, to produce nonlinear particle-in-cell simulations which have accelerated progress in understanding the nature of plasma turbulence in magnetically confined high temperature plasmas. Illustrative results provide great encouragement that increasingly realistic dynamics can be included in extreme-scale computing campaigns to enable predictive simulations with unprecedented physics fidelity. Illustrative examples are presented of algorithmic progress in the magnetic fusion energy sciences in dealing with the low-memory-per-core challenges of extreme-scale computing on the current top three supercomputers worldwide. These include advanced CPU systems (the IBM Blue Gene/Q system and the Fujitsu K machine) as well as the GPU-CPU hybrid system Titan.

  3. Progress in theory and simulation of ion cyclotron emission from magnetic confinement fusion plasmas

    NASA Astrophysics Data System (ADS)

    Dendy, Richard; Chapman, Ben; Chapman, Sandra; Cook, James; Reman, Bernard; McClements, Ken; Carbajal, Leopoldo

    2016-10-01

    Suprathermal ion cyclotron emission (ICE) is detected from all large tokamak and stellarator plasmas. Its frequency spectrum has narrow peaks at sequential cyclotron harmonics of the energetic ion population (fusion-born or neutral beam-injected) at the outer edge of the plasma. ICE was the first collective radiative instability driven by confined fusion-born ions to be observed in deuterium-tritium plasmas in JET and TFTR, and the magnetoacoustic cyclotron instability is the most likely emission mechanism. Contemporary ICE measurements are taken at very high sampling rates from the LHD stellarator and from the conventional-aspect-ratio KSTAR tokamak. A correspondingly advanced modelling capability for the ICE emission mechanism has been developed using 1D3V PIC and hybrid-PIC codes, supplemented by analytical theory. These kinetic codes simulate the self-consistent full orbit dynamics of energetic and thermal ions, together with the electric and magnetic fields and the electrons. We report recent progress in theory and simulation that addresses: the scaling of ICE intensity with energetic particle density; the transition between super-Alfvénic and sub-Alfvénic regimes for the collectively radiating particles; and the rapid time evolution that is seen for some ICE measurements. This work was supported in part by the RCUK Energy Programme [Grant Number EP/I501045] and by Euratom.

  4. Investigation on reduced thermal models for simulating infrared images in fusion devices

    NASA Astrophysics Data System (ADS)

    Gerardin, J.; Aumeunier, M.-H.; Firdaouss, M.; Gardarein, J.-L.; Rigollet, F.

    2016-09-01

    In fusion facilities, the in-vessel wall receives heat flux densities of up to 20 MW/m2. The monitoring of in-vessel components is usually ensured by infrared (IR) thermography, but with all-metallic walls, disturbance phenomena such as reflections may lead to inaccurate temperature estimates, potentially endangering machine safety. A full predictive photonic simulation is therefore used to accurately assess the IR measurements. This paper investigates reduced thermal models (semi-infinite wall, thermal quadrupoles) for predicting the surface temperature from the particle loads on components for a given plasma scenario. The results are compared with a reference 3D finite element method (Ansys Mechanical) and used as input for simulating IR images. The performance of the reduced thermal models is analysed by comparing the resulting IR images.

  5. Verification of particle simulation of radio frequency waves in fusion plasmas

    SciTech Connect

    Kuley, Animesh; Lin, Z.; Wang, Z. X.; Wessel, F.

    2013-10-15

    Radio frequency (RF) waves can provide heating, current and flow drive, as well as instability control for steady state operations of fusion experiments. A particle simulation model has been developed in this work to provide a first-principles tool for studying the RF nonlinear interactions with plasmas. In this model, ions are considered as fully kinetic particles using the Vlasov equation and electrons are treated as guiding centers using the drift kinetic equation. This model has been implemented in a global gyrokinetic toroidal code using real electron-to-ion mass ratio. To verify the model, linear simulations of ion plasma oscillation, ion Bernstein wave, and lower hybrid wave are carried out in cylindrical geometry and found to agree well with analytic predictions.

  6. Using Geostatistical Data Fusion Techniques and MODIS Data to Upscale Simulated Wheat Yield

    NASA Astrophysics Data System (ADS)

    Castrignano, A.; Buttafuoco, G.; Matese, A.; Toscano, P.

    2014-12-01

    Population growth increases food demand. Assessing food demand and predicting the actual supply for a given location are critical components of strategic food security planning at the regional scale. Crop yield can be simulated using crop models because it is site-specific and determined by weather, management, length of growing season, and soil properties. Crop models require reliable location-specific data that are not generally available. Obtaining these data at a large number of locations is time-consuming, costly, and sometimes simply not feasible. An upscaling method is therefore required to extend coverage of sparse estimates of crop yield to an appropriate extrapolation domain. This work investigates the applicability of a geostatistical data fusion approach for merging remote sensing data with the predictions of a simulation model of wheat growth and production based on ground data. The study area is the Capitanata plain (4000 km2) in the Apulia Region, mostly cropped with durum wheat. The MODIS EVI/NDVI data products for the Capitanata plain were downloaded from the Land Processes Distributed Active Archive Center (LPDAAC) for the whole crop cycle of durum wheat. Phenological development, biomass growth, and grain quantity of durum wheat were simulated by the Delphi system, based on a crop simulation model linked to a database of soil properties and agronomical and meteorological data. Multicollocated cokriging was used to integrate the secondary exhaustive information (multi-spectral MODIS data) with the primary variable (sparsely distributed biomass/yield model predictions for durum wheat). The model estimates were strongly spatially correlated with the radiance data (red and NIR bands), and the data fusion approach proved suitable and flexible for integrating data of different types and supports.

  7. Detailed high-resolution three-dimensional simulations of OMEGA separated reactants inertial confinement fusion experiments

    NASA Astrophysics Data System (ADS)

    Haines, Brian M.; Grim, Gary P.; Fincke, James R.; Shah, Rahul C.; Forrest, Chad J.; Silverstein, Kevin; Marshall, Frederic J.; Boswell, Melissa; Fowler, Malcolm M.; Gore, Robert A.; Hayes-Sterbenz, Anna C.; Jungman, Gerard; Klein, Andreas; Rundberg, Robert S.; Steinkamp, Michael J.; Wilhelmy, Jerry B.

    2016-07-01

    We present results from the comparison of high-resolution three-dimensional (3D) simulations with data from the implosions of inertial confinement fusion capsules with separated reactants performed on the OMEGA laser facility. Each capsule, referred to as a "CD Mixcap," is filled with tritium and has a polystyrene (CH) shell with a deuterated polystyrene (CD) layer whose burial depth is varied. In these implosions, fusion reactions between deuterium and tritium ions can occur only in the presence of atomic mix between the gas fill and shell material. The simulations feature accurate models for all known experimental asymmetries and do not employ any adjustable parameters to improve agreement with experimental data. Simulations are performed with the RAGE radiation-hydrodynamics code using an Implicit Large Eddy Simulation (ILES) strategy for the hydrodynamics. We obtain good agreement with the experimental data, including the DT/TT neutron yield ratios used to diagnose mix, for all burial depths of the deuterated shell layer. Additionally, simulations demonstrate good agreement with converged simulations employing explicit models for plasma diffusion and viscosity, suggesting that the implicit sub-grid model used in ILES is sufficient to model these processes in these experiments. In our simulations, mixing is driven by short-wavelength asymmetries and longer-wavelength features are responsible for developing flows that transport mixed material towards the center of the hot spot. Mix material transported by this process is responsible for most of the mix (DT) yield even for the capsule with a CD layer adjacent to the tritium fuel. Consistent with our previous results, mix does not play a significant role in TT neutron yield degradation; instead, this is dominated by the displacement of fuel from the center of the implosion due to the development of turbulent instabilities seeded by long-wavelength asymmetries. Through these processes, the long

  8. Detailed high-resolution three-dimensional simulations of OMEGA separated reactants inertial confinement fusion experiments

    DOE PAGES

    Haines, Brian Michael; Grim, Gary P.; Fincke, James R.; ...

    2016-07-29

    Here, we present results from the comparison of high-resolution three-dimensional (3D) simulations with data from the implosions of inertial confinement fusion capsules with separated reactants performed on the OMEGA laser facility. Each capsule, referred to as a “CD Mixcap,” is filled with tritium and has a polystyrene (CH) shell with a deuterated polystyrene (CD) layer whose burial depth is varied. In these implosions, fusion reactions between deuterium and tritium ions can occur only in the presence of atomic mix between the gas fill and shell material. The simulations feature accurate models for all known experimental asymmetries and do not employ any adjustable parameters to improve agreement with experimental data. Simulations are performed with the RAGE radiation-hydrodynamics code using an Implicit Large Eddy Simulation (ILES) strategy for the hydrodynamics. We obtain good agreement with the experimental data, including the DT/TT neutron yield ratios used to diagnose mix, for all burial depths of the deuterated shell layer. Additionally, simulations demonstrate good agreement with converged simulations employing explicit models for plasma diffusion and viscosity, suggesting that the implicit sub-grid model used in ILES is sufficient to model these processes in these experiments. In our simulations, mixing is driven by short-wavelength asymmetries and longer-wavelength features are responsible for developing flows that transport mixed material towards the center of the hot spot. Mix material transported by this process is responsible for most of the mix (DT) yield even for the capsule with a CD layer adjacent to the tritium fuel. Consistent with our previous results, mix does not play a significant role in TT neutron yield degradation; instead, this is dominated by the displacement of fuel from the center of the implosion due to the development of turbulent instabilities seeded by long-wavelength asymmetries. Through these processes, the long

  9. Detailed high-resolution three-dimensional simulations of OMEGA separated reactants inertial confinement fusion experiments

    SciTech Connect

    Haines, Brian Michael; Grim, Gary P.; Fincke, James R.; Shah, Rahul C.; Forrest, Chad J.; Silverstein, Kevin; Marshall, Frederic J.; Boswell, Melissa; Fowler, Malcolm M.; Gore, Robert A.; Hayes-Sterbenz, Anna Catherine; Jungman, Gerard; Klein, Andreas; Rundberg, Robert S.; Steinkamp, Michael James; Wilhelmy, Jerry B.

    2016-07-29

    Here, we present results from the comparison of high-resolution three-dimensional (3D) simulations with data from the implosions of inertial confinement fusion capsules with separated reactants performed on the OMEGA laser facility. Each capsule, referred to as a “CD Mixcap,” is filled with tritium and has a polystyrene (CH) shell with a deuterated polystyrene (CD) layer whose burial depth is varied. In these implosions, fusion reactions between deuterium and tritium ions can occur only in the presence of atomic mix between the gas fill and shell material. The simulations feature accurate models for all known experimental asymmetries and do not employ any adjustable parameters to improve agreement with experimental data. Simulations are performed with the RAGE radiation-hydrodynamics code using an Implicit Large Eddy Simulation (ILES) strategy for the hydrodynamics. We obtain good agreement with the experimental data, including the DT/TT neutron yield ratios used to diagnose mix, for all burial depths of the deuterated shell layer. Additionally, simulations demonstrate good agreement with converged simulations employing explicit models for plasma diffusion and viscosity, suggesting that the implicit sub-grid model used in ILES is sufficient to model these processes in these experiments. In our simulations, mixing is driven by short-wavelength asymmetries and longer-wavelength features are responsible for developing flows that transport mixed material towards the center of the hot spot. Mix material transported by this process is responsible for most of the mix (DT) yield even for the capsule with a CD layer adjacent to the tritium fuel. Consistent with our previous results, mix does not play a significant role in TT neutron yield degradation; instead, this is dominated by the displacement of fuel from the center of the implosion due to the development of turbulent instabilities seeded by long-wavelength asymmetries. Through these processes, the long

  10. Fusion core start-up, ignition and burn simulations of reversed-field pinch (RFP) reactors

    SciTech Connect

    Chu, Yuh-Yi

    1988-01-01

    A transient reactor simulation model is developed to investigate and simulate the start-up, ignition and burn of a reversed-field pinch reactor. The simulation is based upon a spatially averaged plasma balance model with field profiles obtained from MHD quasi-equilibrium analysis. Alpha particle heating is estimated from Fokker-Planck calculations. The instantaneous plasma current is derived from a self-consistent circuit analysis for plasma/coil/eddy current interactions. The simulation code is applied to the TITAN RFP reactor design which features a compact, high-power-density reversed-field pinch fusion system. A contour analysis is performed using the steady-state global plasma balance. The results are presented with contours of constant plasma current. A saddle point is identified in the contour plot which determines the minimum value of plasma current required to achieve ignition. An optimized start-up to ignition and burn path can be obtained by passing through the saddle point. The simulation code is used to study and optimize the start-up scenario. In the simulations of the TITAN RFP reactor, the OH-driven superconducting EF coils are found to deviate from the required equilibrium values as the induced plasma current increases. This results in the modification of superconducting EF coils and the addition of a set of EF trim coils. The design of the EF coil system is performed with the simulation code subject to the optimization of trim-coil power and current. In addition, the trim-coil design is subject to the constraints of vertical-field stability index and maintenance access. A power crowbar is also needed to prevent the superconducting EF coils from generating excessive vertical field. A set of basic results from the simulation of TITAN RFP reactor yield a picture of RFP plasma operation in a reactor. Investigations of eddy current are also presented. 145 refs., 37 figs., 2 tabs.
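The contour analysis described above hinges on locating a saddle point of the required-current surface, since the saddle fixes the minimum plasma current for ignition and the optimal start-up path passes through it. A minimal numerical sketch: scan a grid of operating points and flag interior points that are a minimum along one axis and a maximum along the other. The analytic surface below is purely illustrative, not a real RFP power balance:

```python
# Toy "required current" surface with a saddle near (1, 1); the real
# analysis would use the steady-state global plasma balance instead.
def required_current(n, t):
    return (n - 1.0) ** 2 - (t - 1.0) ** 2 + 2.0

ns = [0.5 + 0.01 * i for i in range(101)]   # density-like grid
ts = [0.5 + 0.01 * j for j in range(101)]   # temperature-like grid

saddles = []
for i in range(1, len(ns) - 1):
    for j in range(1, len(ts) - 1):
        c = required_current(ns[i], ts[j])
        # minimum along the density axis, maximum along the temperature axis
        min_in_n = (c < required_current(ns[i - 1], ts[j])
                    and c < required_current(ns[i + 1], ts[j]))
        max_in_t = (c > required_current(ns[i], ts[j - 1])
                    and c > required_current(ns[i], ts[j + 1]))
        if min_in_n and max_in_t:
            saddles.append((ns[i], ts[j], c))

print(saddles)
```

The saddle value (here 2.0, at (1, 1)) is the analogue of the minimum ignition current identified in the abstract.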

  11. FATRAS - the ATLAS Fast Track Simulation project

    NASA Astrophysics Data System (ADS)

    Mechnich, Jörg; ATLAS Collaboration

    2011-12-01

    The Monte Carlo simulation of the detector response is an integral component of any analysis performed with data from the LHC experiments. As these simulated data sets must be both large and precise, their production is a CPU-intensive task. ATLAS has developed full and fast detector simulation techniques to achieve this goal within the computing limits of the collaboration. At the current early stage of data-taking, it is necessary to reprocess the Monte Carlo event samples continuously, while integrating adaptations to the simulation modules in order to improve agreement with data taken by the detector itself. FATRAS is a fast track simulation engine which produces a Monte Carlo simulation based on the modules and geometry of the standard ATLAS track reconstruction algorithm. It can be combined with a fast parametrized-response simulation of the calorimeters. This approach shows a high level of agreement with the full simulation, while achieving a relative timing gain of two orders of magnitude. FATRAS was designed to provide a fast feedback cycle for tuning the MC simulation with real data: this includes the material distribution inside the detector, the integration of misalignment and current conditions, as well as calibration at the hit level. We present the updated and calibrated version of FATRAS based on the first LHC data. Extensive comparisons of the fast track simulation with the full simulation and with data at 900 GeV are shown.

  12. Angular radiation temperature simulation for time-dependent capsule drive prediction in inertial confinement fusion

    SciTech Connect

    Jing, Longfei; Yang, Dong; Li, Hang; Zhang, Lu; Lin, Zhiwei; Li, Liling; Kuang, Longyu; Jiang, Shaoen; Ding, Yongkun; Huang, Yunbao

    2015-02-15

    The x-ray drive on a capsule in an inertial confinement fusion setup is crucial for ignition. Unfortunately, a direct measurement has not been possible so far. We propose an angular radiation temperature simulation to predict the time-dependent drive on the capsule. A simple model, based on the view-factor method for the simulation of the radiation temperature, is presented and compared with the experimental data obtained using the OMEGA laser facility and the simulation results acquired with VISRAD code. We found a good agreement between the time-dependent measurements and the simulation results obtained using this model. The validated model was then used to analyze the experimental results from the Shenguang-III prototype laser facility. More specifically, the variations of the peak radiation temperatures at different view angles with the albedo of the hohlraum, the motion of the laser spots, the closure of the laser entrance holes, and the deviation of the laser power were investigated. Furthermore, the time-dependent radiation temperature at different orientations and the drive history on the capsule were calculated. The results indicate that the radiation temperature from “U20W112” (named according to the diagnostic hole ID on the target chamber) can be used to approximately predict the drive temperature on the capsule. In addition, the influence of the capsule on the peak radiation temperature is also presented.
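The view-factor idea underlying the simple model above can be illustrated in a few lines: the radiation temperature seen along one line of sight is the view-factor-weighted average of T^4 over the surfaces it sees (re-emitting wall, hot laser spots, laser entrance holes). The function name, weights, and temperatures below are illustrative assumptions, not values from the paper:

```python
def radiation_temperature(surfaces):
    """surfaces: list of (view_factor, temperature_eV) pairs seen along
    one line of sight; the view factors must sum to 1. Returns the
    effective radiation temperature from the T^4-weighted balance."""
    total_f = sum(f for f, _ in surfaces)
    assert abs(total_f - 1.0) < 1e-9, "view factors must sum to 1"
    t4 = sum(f * t ** 4 for f, t in surfaces)
    return t4 ** 0.25

# Hypothetical line of sight: 70% cold wall re-emission, 20% hot laser
# spots, 10% laser entrance hole (radiating essentially nothing back).
t_rad = radiation_temperature([(0.70, 180.0), (0.20, 260.0), (0.10, 0.0)])
print(f"{t_rad:.1f} eV")
```

Varying the weights with time (albedo rise, spot motion, hole closure) is what makes the predicted drive temperature time-dependent, as in the abstract.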

  13. Application of proton boron fusion reaction to radiation therapy: A Monte Carlo simulation study

    NASA Astrophysics Data System (ADS)

    Yoon, Do-Kun; Jung, Joo-Young; Suh, Tae Suk

    2014-12-01

    Three alpha particles are emitted from the point of reaction between a proton and a boron nucleus. The alpha particles are effective in inducing tumor cell death. After boron has accumulated in the tumor region, protons delivered from outside the body can react with the boron there. The boron increases the proton's maximum dose level, so that the tumor cells alone are damaged more severely. In addition, a prompt gamma ray is emitted from the proton-boron reaction point. Here, the effectiveness of proton boron fusion therapy was verified using Monte Carlo simulations. We found that a dramatic increase, by more than half, of the proton's maximum dose level was induced by the boron in the tumor region. This increase occurred only when the proton's maximum dose point was located within the boron uptake region. In addition, the 719 keV prompt gamma ray peak produced by the proton boron fusion reaction was positively detected. This therapy method offers advantages such as the application of the Bragg peak to therapy, accurate tumor targeting, improved therapeutic effect, and monitoring of the treated region during treatment.

  14. Neural-network accelerated fusion simulation with self-consistent core-pedestal coupling

    NASA Astrophysics Data System (ADS)

    Meneghini, O.; Candy, J.; Snyder, P. B.; Staebler, G.; Belli, E.

    2016-10-01

    Practical fusion Whole Device Modeling (WDM) simulations require predictions that are fast yet account for the sensitivity of fusion performance to the boundary constraint imposed by the pedestal structure of H-mode plasmas, a consequence of the stiffness of core transport models. This poster presents the development of a set of neural-network (NN) models for the pedestal structure (as predicted by the EPED model) and for the neoclassical and turbulent transport fluxes (as predicted by the NEO and TGLF codes, respectively), and their self-consistent coupling within the TGYRO transport code. The results are benchmarked against those obtained via the coupling scheme described in [Meneghini PoP 2016]. By substituting the most demanding codes with their NN-accelerated versions, the solution can be found at a fraction of the computational cost of the original coupling scheme, thereby combining the accuracy of a high-fidelity model with the fast turnaround time of a reduced model. Work supported by U.S. DOE DE-FC02-04ER54698 and DE-FG02-95ER54309.
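The surrogate-substitution pattern behind this NN acceleration can be sketched without a neural network: fit a cheap model to samples of an expensive flux code offline, then use the surrogate inside the iterative transport solve. Everything below is illustrative; `expensive_flux` merely stands in for a TGLF-like stiff flux model, and the surrogate is a lookup table rather than an NN, but the coupling logic is the same:

```python
def expensive_flux(grad_t):
    """Stand-in for a costly turbulence code: stiff heat flux above a
    critical gradient of 1.0. Purely illustrative."""
    return 0.1 * max(grad_t - 1.0, 0.0) ** 2

# Offline "training": sample the expensive model once to build a cheap
# surrogate (linear interpolation on a table instead of a neural net).
XS = [i * 0.1 for i in range(101)]          # gradients 0..10
YS = [expensive_flux(x) for x in XS]

def surrogate_flux(grad_t):
    i = min(int(grad_t / 0.1), len(XS) - 2)
    frac = (grad_t - XS[i]) / 0.1
    return YS[i] + frac * (YS[i + 1] - YS[i])

def solve_gradient(target_flux, flux_fn, lo=0.0, hi=10.0, iters=60):
    """Bisection on the (monotone) flux model for the gradient carrying a
    prescribed flux -- the inner loop a transport solver repeats at every
    radius and iteration, which is where surrogate speed pays off."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if flux_fn(mid) < target_flux:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

g_fast = solve_gradient(0.4, surrogate_flux)
g_slow = solve_gradient(0.4, expensive_flux)
print(f"surrogate: {g_fast:.3f}, expensive: {g_slow:.3f}")
```

The two answers agree closely while the surrogate avoids re-running the expensive model inside the loop, which is the essence of the speedup claimed in the abstract.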

  15. Application of proton boron fusion reaction to radiation therapy: A Monte Carlo simulation study

    SciTech Connect

    Yoon, Do-Kun; Jung, Joo-Young; Suh, Tae Suk

    2014-12-01

    Three alpha particles are emitted from the point of reaction between a proton and a boron nucleus. The alpha particles are effective in inducing tumor cell death. After boron has accumulated in the tumor region, protons delivered from outside the body can react with the boron there. The boron increases the proton's maximum dose level, so that the tumor cells alone are damaged more severely. In addition, a prompt gamma ray is emitted from the proton-boron reaction point. Here, the effectiveness of proton boron fusion therapy was verified using Monte Carlo simulations. We found that a dramatic increase, by more than half, of the proton's maximum dose level was induced by the boron in the tumor region. This increase occurred only when the proton's maximum dose point was located within the boron uptake region. In addition, the 719 keV prompt gamma ray peak produced by the proton boron fusion reaction was positively detected. This therapy method offers advantages such as the application of the Bragg peak to therapy, accurate tumor targeting, improved therapeutic effect, and monitoring of the treated region during treatment.

  16. Three-Dimensional Simulations of the Deceleration Phase of Inertial Fusion Implosions

    NASA Astrophysics Data System (ADS)

    Woo, K. M.; Betti, R.; Bose, A.; Epstein, R.; Delettrez, J. A.; Anderson, K. S.; Yan, R.; Chang, P.-Y.; Jonathan, D.; Charissis, M.

    2015-11-01

    The three-dimensional radiation-hydrodynamics code DEC3D has been developed to model the deceleration phase of direct-drive inertial confinement fusion implosions. The code uses an approximate Riemann solver on a moving mesh to achieve high resolution near discontinuities. A domain decomposition parallelization strategy is implemented via the Message Passing Interface (MPI) to maintain high computational efficiency for the 3-D calculation. The implicit thermal diffusion is solved by parallel successive over-relaxation iteration. Results from 3-D simulations of low-mode Rayleigh-Taylor instability are presented and compared with 2-D results. A systematic comparison of yields, pressures, temperatures, and areal densities between 2-D and 3-D is carried out to determine the additional degradation in target performance caused by the three-dimensionality of the nonuniformities. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Numbers DE-NA0001944 and DE-FC02-04ER54789 (Fusion Science Center).
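The implicit-diffusion step solved by successive over-relaxation (SOR) in the abstract can be sketched serially in two dimensions: one backward-Euler step of thermal diffusion, (I - dt*D*Lap)T_new = T_old, swept with over-relaxed Gauss-Seidel updates. Grid size, time step, and the relaxation factor omega below are illustrative; the production code parallelizes this sweep:

```python
def sor_diffusion_step(T, dt=0.1, D=1.0, omega=1.5, sweeps=200):
    """One backward-Euler diffusion step on a unit grid (dx = 1) with
    fixed zero boundaries, solved by SOR iteration."""
    n = len(T)
    a = dt * D                      # off-diagonal neighbor coupling
    diag = 1.0 + 4.0 * a            # diagonal of (I - dt*D*Lap)
    Tn = [row[:] for row in T]      # initial guess: the old field
    for _ in range(sweeps):
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                rhs = T[i][j] + a * (Tn[i + 1][j] + Tn[i - 1][j]
                                     + Tn[i][j + 1] + Tn[i][j - 1])
                gauss_seidel = rhs / diag
                # over-relax the Gauss-Seidel update by the factor omega
                Tn[i][j] += omega * (gauss_seidel - Tn[i][j])
    return Tn

# A hot spot in the middle of a cold grid spreads outward after one step.
n = 17
T0 = [[0.0] * n for _ in range(n)]
T0[n // 2][n // 2] = 1.0
T1 = sor_diffusion_step(T0)
print(T1[n // 2][n // 2], T1[n // 2][n // 2 + 1])
```

Because the update at (i, j) uses already-updated neighbors, parallel versions color or decompose the grid so subdomains can sweep concurrently, which is what the MPI strategy in the abstract provides.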

  17. Project SAFE: Simulating Alternative Futures in Education.

    ERIC Educational Resources Information Center

    Debenham, Jerry Dean

    Simulating Alternative Futures in Education (SAFE) is a simulation game dealing with the future of education from 1975 to 2024 and beyond. It is computerized on an APL direct-interaction system and can be played at any location over telephone lines. It takes approximately 1.8 hours of computer time to play, with 5 to 9 hours of preparation, and…

  18. Integrated simulation of magnetic-field-assist fast ignition laser fusion

    NASA Astrophysics Data System (ADS)

    Johzaki, T.; Nagatomo, H.; Sunahara, A.; Sentoku, Y.; Sakagami, H.; Hata, M.; Taguchi, T.; Mima, K.; Kai, Y.; Ajimi, D.; Isoda, T.; Endo, T.; Yogo, A.; Arikawa, Y.; Fujioka, S.; Shiraga, H.; Azechi, H.

    2017-01-01

    To enhance the core heating efficiency in fast ignition laser fusion, the concept of relativistic electron beam guiding by external magnetic fields was evaluated by integrated simulations for FIREX-class targets. For the cone-attached shell target, the core heating performance deteriorates when magnetic fields are applied, since the core is considerably deformed and most of the fast electrons are reflected by the magnetic mirror formed through the implosion. In the case of a cone-attached solid ball target, on the other hand, the implosion is more stable under the kilotesla-class magnetic field. In addition, a feasible magnetic field configuration is formed through the implosion. As a result, the core heating efficiency is doubled by magnetic guiding. The dependence of the core heating properties on the timing of the heating pulse shot was also investigated for the solid ball target.

  19. Multimode guidance project low frequency ECM simulator: Hardware description

    NASA Astrophysics Data System (ADS)

    Kaye, H. M.

    1982-10-01

    The Multimode Guidance (MMG) Project, part of the Army/Navy Area Defense SAM Technology Prototyping Program, was established to conduct a feasibility demonstration of multimode guidance concepts. Prototype guidance units for advanced, long range missiles are being built and tested under MMG Project sponsorship. The Johns Hopkins University Applied Physics Laboratory has been designated as Government Agent for countermeasures for this project. In support of this effort, a family of computer-controlled ECM simulators is being developed for validation of contractors' multimode guidance prototype designs. The design of the Low Frequency ECM Simulator is documented in two volumes. This report, Volume A, describes the hardware design of the simulator; Volume B describes the software design. This computer-controlled simulator can simulate up to six surveillance frequency jammers in B through F bands and will be used to evaluate the performance of home-on-jamming guidance modes in multiple jammer environments.

  20. Modelling and simulation of new generation powerful gyrotrons for the fusion research

    NASA Astrophysics Data System (ADS)

    Sabchevski, S.; Zhelyazkov, I.

    2007-04-01

    One of the important issues related to cyclotron resonance heating (CRH) and current drive of fusion plasmas in thermonuclear reactors (tokamaks and stellarators) is the development of multi-megawatt-class gyrotrons. There are generally three stages in the implementation of that task, notably (i) elaborating a novel generation of software tools for the physical modelling and simulation of such gyrotrons, (ii) their computer-aided design (CAD) and construction on the basis of the simulation results, and finally (iii) testing the gyrotrons under real experimental conditions. This tutorial paper concerns the first item: the development of software tools. In co-operation with the Institute for Pulsed Power and Microwave Technology at the Forschungszentrum Karlsruhe, Germany, and the Centre de Recherches en Physique des Plasmas at École Polytechnique Fédérale de Lausanne, Switzerland, we work on the conceptual design of the software tools under development. The basic conclusions are that numerical codes for gyrotron modelling should possess the following essential characteristics: (a) portability; (b) extensibility; (c) orientation toward the solution of practical problems (i.e., computer programs that can be used in the design process); (d) a basis in self-consistent 3D physical models that take into account the departure from axial symmetry; and (e) the ability to simulate time-dependent processes (electrostatic PIC simulation) alongside trajectory analysis (ray tracing). Here, we discuss how various existing numerical codes have to be improved and implemented via advanced programming technologies for state-of-the-art computer systems, including clusters, grids, parallel platforms, and supercomputers.

  1. Three dimensional simulations of space charge dominated heavy ion beams with applications to inertial fusion energy

    SciTech Connect

    Grote, David Peter

    1994-11-01

    Heavy ion fusion requires injection, transport, and acceleration of high-current beams. Detailed simulation of such beams requires fully self-consistent space-charge fields and three dimensions. WARP3D, developed for this purpose, is a particle-in-cell plasma simulation code optimized to work within the framework of an accelerator's lattice of accelerating, focusing, and bending elements. The code has been used to study several test problems and for simulations and design of experiments. Two applications are drift compression experiments on the MBE-4 facility at LBL and design of the electrostatic quadrupole injector for the proposed ILSE facility. With aggressive drift compression on MBE-4, anomalous emittance growth was observed. Simulations carried out to examine possible causes showed that essentially all of the emittance growth is the result of external forces on the beam and not of internal beam space-charge fields. The dominant external forces are the dodecapole component of the focusing fields, the image forces on the surrounding pipe and conductors, and the octopole fields that result from the structure of the quadrupole focusing elements. The goal of the electrostatic quadrupole injector design is to produce a beam of as low an emittance as possible. The simulations show that the dominant effects that increase the emittance are the nonlinear octopole fields and the energy effect (off-axis fields in the axial direction). Injectors were designed that minimized the beam envelope in order to reduce the effect of the nonlinear fields. Alterations to the quadrupole structure that further reduce the nonlinear fields were examined. Comparisons with a scaled experiment resulted in very good agreement.
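The deposit-then-solve cycle behind self-consistent space-charge fields in a PIC code can be sketched in one dimension. This is a generic illustration of the technique, not WARP3D itself; the grid size, particle count, and unit conventions are all invented:

```python
# A minimal 1D particle-in-cell deposit/solve cycle, in the spirit of
# (but far simpler than) WARP3D.  Illustrative only: periodic domain,
# eps0 = 1, and an arbitrary beam slice of 1000 particles.
import numpy as np

def deposit_charge(x, q, n_cells, dx):
    """Cloud-in-cell deposition of particle charge onto a periodic grid."""
    rho = np.zeros(n_cells)
    cell = np.floor(x / dx).astype(int) % n_cells
    frac = x / dx - np.floor(x / dx)
    np.add.at(rho, cell, q * (1.0 - frac) / dx)
    np.add.at(rho, (cell + 1) % n_cells, q * frac / dx)
    return rho

def solve_field(rho, dx):
    """Periodic Poisson solve via FFT: phi'' = -rho, then E = -phi'."""
    n = len(rho)
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)
    rho_k = np.fft.fft(rho)
    phi_k = np.zeros_like(rho_k)
    nonzero = k != 0.0
    phi_k[nonzero] = rho_k[nonzero] / k[nonzero] ** 2
    # E_k = -i k phi_k
    return np.real(np.fft.ifft(-1j * k * phi_k))

# One deposition/solve cycle for a random beam slice of unit total charge.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 1000)          # particle positions in [0, 1)
rho = deposit_charge(x, q=1.0 / 1000, n_cells=64, dx=1.0 / 64)
E = solve_field(rho, dx=1.0 / 64)
```

Cloud-in-cell deposition conserves total charge exactly, which is why the field solve can be checked against the deposited density.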

  2. Overview of the Simulation of Wave Interactions with MHD Project (SWIM)

    NASA Astrophysics Data System (ADS)

    Batchelor, Donald

    2010-11-01

    The SWIM center has the scientific objectives of: improving our understanding of the interactions that both RF wave and particle sources have with extended-MHD phenomena; improving our capability for predicting and optimizing the performance of burning plasmas; developing an integrated computational system for treating multi-physics phenomena with the flexibility and extensibility required to serve as a prototype for the Fusion Simulation Project; addressing mathematics issues related to the multi-scale, coupled physics of RF waves and extended MHD; and optimizing the integrated system on high-performance computers. Our Center has now built an end-to-end computational system that allows existing physics codes to function together in a parallel environment and connects them to utility software components and data management systems. We have used this framework to couple together state-of-the-art fusion energy codes to produce a unique and world-class simulation capability. A physicist's overview of the Integrated Plasma Simulator (IPS) will be given and applications described. For example, the IPS is being employed to support ITER with operational scenario studies.

  3. An Overview of the Distributed Space Exploration Simulation (DSES) Project

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Chung, Victoria I.; Blum, Michael G.; Bowman, James D.

    2007-01-01

    This paper describes the Distributed Space Exploration Simulation (DSES) Project, a research and development collaboration between NASA centers that investigates technologies and processes related to integrated, distributed simulation of complex space systems in support of NASA's Exploration Initiative. In particular, it describes the three major components of DSES: network infrastructure, software infrastructure, and simulation development. With regard to network infrastructure, DSES is developing a Distributed Simulation Network for use by all NASA centers. With regard to software, DSES is developing software models, tools, and procedures that streamline distributed simulation development and provide an interoperable infrastructure for agency-wide integrated simulation. Finally, with regard to simulation development, DSES is developing an integrated end-to-end simulation capability to support NASA development of new exploration spacecraft and missions. This paper presents the current status and plans for these three areas, including examples of specific simulations.

  4. Using a Scientific Process for Curriculum Development and Formative Evaluation: Project FUSION

    ERIC Educational Resources Information Center

    Doabler, Christian; Cary, Mari Strand; Clarke, Benjamin; Fien, Hank; Baker, Scott; Jungjohann, Kathy

    2011-01-01

    Given the vital importance of using a scientific approach for curriculum development, the authors employed a design experiment methodology (Brown, 1992; Shavelson et al., 2003) to develop and evaluate FUSION, a first-grade mathematics intervention intended for students with, or at risk for, mathematics disabilities. FUSION, funded through IES…

  5. Final Report for the "Fusion Application for Core-Edge Transport Simulations (FACETS)"

    SciTech Connect

    Cary, John R; Kruger, Scott

    2014-10-02

    Over its lifetime, the FACETS project developed the first self-consistent core-edge coupling capability, a new solver for modeling transport in tokamak cores, and a new code for modeling wall physics over long time scales, and it significantly improved the capabilities and performance of the legacy components UEDGE, NUBEAM, GLF23, GYRO, and BOUT++. These improved capabilities leveraged the team's expertise in applied mathematics (solvers and algorithms) and computer science (performance improvements and language interoperability). The project pioneered new methods for tackling the complexity of simulating tokamak experiments.

  6. Project ITCH: Interactive Digital Simulation in Electrical Engineering Education.

    ERIC Educational Resources Information Center

    Bailey, F. N.; Kain, R. Y.

    A two-stage project is investigating the educational potential of a low-cost time-sharing system used as a simulation tool in Electrical Engineering (EE) education. Phase I involves a pilot study and Phase II a full integration. The system employs interactive computer simulation to teach engineering concepts which are not well handled by…

  7. The IDA Advanced Technology Combat Simulation Project

    DTIC Science & Technology

    1990-09-01

    This paper was prepared as part of IDA Project 9000-623 under the IDA Central Research…

  8. FY2014 FES (Fusion Energy Sciences) Theory & Simulation Performance Target, Final Report

    SciTech Connect

    Fu, Guoyong; Budny, Robert; Gorelenkov, Nikolai; Poli, Francesca; Chen, Yang; McClenaghan, Joseph; Lin, Zhihong; Spong, Don; Bass, Eric; Waltz, Ron

    2014-10-14

    We report here the work done for the FY14 OFES Theory Performance Target, as given below: "Understanding alpha particle confinement in ITER, the world's first burning plasma experiment, is a key priority for the fusion program. In FY 2014, determine linear instability trends and thresholds of energetic particle-driven shear Alfven eigenmodes in ITER for a range of parameters and profiles using a set of complementary simulation models (gyrokinetic, hybrid, and gyrofluid). Carry out initial nonlinear simulations to assess the effects of the unstable modes on energetic particle transport." In the past year (FY14), a systematic study of the alpha-driven Alfven modes in ITER was carried out jointly by researchers from six institutions, involving seven codes: the transport simulation code TRANSP (R. Budny and F. Poli, PPPL); three gyrokinetic codes, GEM (Y. Chen, Univ. of Colorado), GTC (J. McClenaghan, Z. Lin, UCI), and GYRO (E. Bass, R. Waltz, UCSD/GA); the hybrid code M3D-K (G.Y. Fu, PPPL); the gyro-fluid code TAEFL (D. Spong, ORNL); and the linear kinetic stability code NOVA-K (N. Gorelenkov, PPPL). A range of ITER parameters and profiles was specified by TRANSP simulation of a hybrid scenario case and a steady-state scenario case. Based on the specified ITER equilibria, linear stability calculations were done to determine the stability boundary of alpha-driven high-n TAEs using the five initial-value codes (GEM, GTC, GYRO, M3D-K, and TAEFL) and the kinetic stability code (NOVA-K). The effects of both alpha particles and beam ions have been considered. Finally, the effects of the unstable modes on energetic particle transport have been explored using GEM and M3D-K.

  9. A Project Management Approach to Using Simulation for Cost Estimation on Large, Complex Software Development Projects

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with any high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.
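The idea of conveying estimate uncertainty can be sketched as a Monte Carlo over task-effort distributions, reporting percentiles instead of a single number. The phase names and triangular distributions below are invented for illustration; the paper's model is a full software-process simulation:

```python
# A minimal Monte Carlo cost estimate that reports a range (p10/p50/p90)
# rather than a point value.  All task names and (low, mode, high) effort
# figures, in person-months, are assumptions made up for this sketch.
import random

def simulate_project_cost(n_trials=10000, seed=42):
    rng = random.Random(seed)
    # (low, mode, high) effort for three notional phases.
    phases = {"design": (4, 6, 12), "code": (8, 12, 24), "test": (6, 9, 20)}
    totals = []
    for _ in range(n_trials):
        # random.triangular takes (low, high, mode).
        totals.append(sum(rng.triangular(lo, hi, mode)
                          for lo, mode, hi in phases.values()))
    totals.sort()
    return {"p10": totals[n_trials // 10],
            "p50": totals[n_trials // 2],
            "p90": totals[9 * n_trials // 10]}

estimate = simulate_project_cost()
```

Reporting the p10-p90 spread makes the early-stage uncertainty explicit, which is the central point the abstract argues for.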

  10. Fusion studies with low-intensity radioactive ion beams using an active-target time projection chamber

    NASA Astrophysics Data System (ADS)

    Kolata, J. J.; Howard, A. M.; Mittig, W.; Ahn, T.; Bazin, D.; Becchetti, F. D.; Beceiro-Novo, S.; Chajecki, Z.; Febbrarro, M.; Fritsch, A.; Lynch, W. G.; Roberts, A.; Shore, A.; Torres-Isea, R. O.

    2016-09-01

    The total fusion excitation function for 10Be+40Ar has been measured over the center-of-momentum (c.m.) energy range from 12 to 24 MeV using a time-projection chamber (TPC). The main purpose of this experiment, which was carried out in a single run of duration 90 h using a ≈100 particle per second (pps) 10Be beam, was to demonstrate the capability of an active-target TPC to determine fusion excitation functions for extremely weak radioactive ion beams. Cross sections as low as 12 mb were measured with acceptable (50%) statistical accuracy. It also proved to be possible to separate events in which charged particles were emitted from the fusion residue from those in which only neutrons were evaporated. The method permits simultaneous measurement of incomplete fusion, break-up, scattering, and transfer reactions, and therefore fully exploits the opportunities presented by the very exotic beams that will be available from the new generation of radioactive beam facilities.
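The event counts behind such low cross sections follow from the thin-target rate estimate N = N_beam × n_target × σ. A quick sketch; the effective target areal density is an assumed value for illustration, not a number from the experiment:

```python
# Back-of-envelope event count for a weak radioactive beam: a ~100 pps
# beam over 90 h at 12 mb.  The 1e20 atoms/cm^2 effective areal density
# of the active-target gas is an assumption for this sketch only.

MB_TO_CM2 = 1e-27                        # 1 millibarn in cm^2

def expected_events(sigma_mb, beam_pps, hours, target_atoms_per_cm2):
    n_beam = beam_pps * hours * 3600.0   # total beam particles delivered
    return n_beam * target_atoms_per_cm2 * sigma_mb * MB_TO_CM2

n = expected_events(sigma_mb=12.0, beam_pps=100.0, hours=90.0,
                    target_atoms_per_cm2=1e20)
```

Even a few tens of expected events is enough to see why an active target, which uses the full gas volume and tags reactions event by event, is attractive at these intensities.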

  11. Large-scale molecular dynamics simulations of dense plasmas: The Cimarron Project

    NASA Astrophysics Data System (ADS)

    Graziani, Frank R.; Batista, Victor S.; Benedict, Lorin X.; Castor, John I.; Chen, Hui; Chen, Sophia N.; Fichtl, Chris A.; Glosli, James N.; Grabowski, Paul E.; Graf, Alexander T.; Hau-Riege, Stefan P.; Hazi, Andrew U.; Khairallah, Saad A.; Krauss, Liam; Langdon, A. Bruce; London, Richard A.; Markmann, Andreas; Murillo, Michael S.; Richards, David F.; Scott, Howard A.; Shepherd, Ronnie; Stanton, Liam G.; Streitz, Fred H.; Surh, Michael P.; Weisheit, Jon C.; Whitley, Heather D.

    2012-03-01

    We describe the status of a new time-dependent simulation capability for dense plasmas. The backbone of this multi-institutional effort, the Cimarron Project, is the massively parallel molecular dynamics (MD) code "ddcMD," developed at Lawrence Livermore National Laboratory. The project's focus is material conditions such as exist in inertial confinement fusion experiments, and in many stellar interiors: high temperatures, high densities, significant electromagnetic fields, mixtures of high- and low-Z elements, and non-Maxwellian particle distributions. Of particular importance is our ability to incorporate into this classical MD code key atomic, radiative, and nuclear processes, so that their interacting effects under non-ideal plasma conditions can be investigated. This paper summarizes progress in computational methodology, discusses strengths and weaknesses of quantum statistical potentials as effective interactions for MD, explains the model used for quantum events possibly occurring in a collision, describes two new experimental efforts that play a central role in our validation work, highlights some significant results obtained to date, outlines concepts now being explored to deal more efficiently with the very disparate dynamical timescales that arise in fusion plasmas, and provides a careful comparison of quantum effects on electron trajectories predicted by more elaborate dynamical methods.

  12. Proposed best practice for projects that involve modelling and simulation.

    PubMed

    O'Kelly, M.; Anisimov, V.; Campbell, C.; Hamilton, S.

    2016-11-03

    Modelling and simulation has been used in many ways when developing new treatments. To be useful and credible, it is generally agreed that modelling and simulation should be undertaken according to some kind of best practice. A number of authors have suggested elements required for best practice in modelling and simulation. Elements that have been suggested include the pre-specification of goals, assumptions, methods, and outputs. However, a project that involves modelling and simulation could be simple or complex and could be of relatively low or high importance to the project. It has been argued that the level of detail and the strictness of pre-specification should be allowed to vary, depending on the complexity and importance of the project. This best practice document does not prescribe how to develop a statistical model. Rather, it describes the elements required for the specification of a project and requires that the practitioner justify in the specification the omission of any of the elements and, in addition, justify the level of detail provided about each element. This document is an initiative of the Special Interest Group for modelling and simulation. The Special Interest Group for modelling and simulation is a body open to members of Statisticians in the Pharmaceutical Industry and the European Federation of Statisticians in the Pharmaceutical Industry. Examples of a very detailed specification and a less detailed specification are included as appendices.

  13. Using gaming engines and editors to construct simulations of fusion algorithms for situation management

    NASA Astrophysics Data System (ADS)

    Lewis, Lundy M.; DiStasio, Nolan; Wright, Christopher

    2010-04-01

    In this paper we discuss issues in testing various cognitive fusion algorithms for situation management. We provide a proof-of-principle discussion and demo showing how gaming technologies and platforms could be used to devise and test various fusion algorithms, including input, processing, and output, and we look at how the proof-of-principle could lead to more advanced test beds and methods for high-level fusion in support of situation management. We develop four simple fusion scenarios and one more complex scenario in which a simple rule-based system is scripted to govern the behavior of battlespace entities.
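A rule-based system scripted to govern entity behavior, as in the scenarios above, can be as small as a fused track plus an ordered rule table. The report fields, fusion rule, and thresholds below are invented for illustration:

```python
# A toy rule-based controller in the spirit of the paper's scripted
# battlespace entities: fuse two sensor reports, then pick a behavior
# from an ordered rule table.  All fields and thresholds are invented.

def fuse_reports(radar, infrared):
    """Naive fusion: average the positions, take the maximum threat score."""
    return {
        "pos": tuple((a + b) / 2 for a, b in zip(radar["pos"], infrared["pos"])),
        "threat": max(radar["threat"], infrared["threat"]),
    }

# Rules are checked in order; the first matching condition wins.
RULES = [
    (lambda t: t["threat"] >= 0.8, "engage"),
    (lambda t: t["threat"] >= 0.4, "track"),
    (lambda t: True, "ignore"),
]

def decide(track):
    for condition, action in RULES:
        if condition(track):
            return action

track = fuse_reports({"pos": (0.0, 2.0), "threat": 0.5},
                     {"pos": (1.0, 3.0), "threat": 0.9})
action = decide(track)  # -> "engage"
```

In a game-engine test bed, `decide` would run each tick to drive an entity's scripted behavior, which is essentially what the paper's proof-of-principle demos do.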

  14. Mechanisms of Plastic and Fracture Instabilities for Alloy Development of Fusion Materials. Final Project Report for period July 15, 1998 - July 14, 2003

    SciTech Connect

    Ghoniem, N. M.

    2003-07-14

    The main objective of this research was to develop new computational tools for the simulation and analysis of plasticity and fracture mechanisms of fusion materials, and to assist in planning and assessment of corresponding radiation experiments.

  15. A web-based repository of surgical simulator projects.

    PubMed

    Leskovský, Peter; Harders, Matthias; Székely, Gábor

    2006-01-01

    The use of computer-based surgical simulators for training of prospective surgeons has been a topic of research for more than a decade. As a result, a large number of academic projects have been carried out, and a growing number of commercial products are available on the market. Keeping track of all these endeavors for established groups as well as for newly started projects can be quite arduous. Gathering information on existing methods, already traveled research paths, and problems encountered is a time consuming task. To alleviate this situation, we have established a modifiable online repository of existing projects. It contains detailed information about a large number of simulator projects gathered from web pages, papers and personal communication. The database is modifiable (with password protected sections) and also allows for a simple statistical analysis of the collected data. For further information, the surgical repository web page can be found at www.virtualsurgery.vision.ee.ethz.ch.

  16. Modeling and simulation support for ICRF heating of fusion plasmas. Annual report, 1990

    SciTech Connect

    1990-03-15

    Recent experimental, theoretical, and computational results have shown the need for and usefulness of a combined approach to the design, analysis, and evaluation of ICH antenna configurations. The work at the University of Wisconsin (UW) in particular has shown that much-needed information on the vacuum operation of ICH antennas can be obtained with a modest experimental and computational effort. These model experiments at UW, together with the SAIC simulations, have dramatically shown the potential for positive impact on the ICRF program. Results of the UW-SAIC joint ICRF antenna analysis effort have been presented at several international meetings and numerous meetings in the United States. The PPPL bay M antenna has been modeled using the ARGUS code; the results of this effort are shown in Appendix C. SAIC has recently begun a collaboration with the ICRF antenna design and analysis group at ORNL. At present there are two separate projects underway. The first concerns simulating and determining the effect of adding slots in the antenna septum and side walls. The second concerns the modeling and simulation of the ORNL folded waveguide (FWG) concept.

  17. Parallel mesh support for particle-in-cell methods in magnetic fusion simulations

    NASA Astrophysics Data System (ADS)

    Yoon, Eisung; Shephard, Mark S.; Seol, E. Seegyoung; Kalyanaraman, Kaushik; Ibanez, Daniel

    2016-10-01

    As supercomputing power continues to increase, particle-in-cell (PIC) methods are being widely adopted for transport simulations of magnetic fusion devices. Current implementations place a copy of the entire continuum mesh, and the fields used in the PIC calculations, on every node. This is in general not a scalable solution, as computational power continues to grow faster than node-level memory. To address this scalability issue, while still maintaining sufficient mesh per node to control costly inter-node communication, a new unstructured mesh distribution method and an associated mesh-based PIC calculation procedure are being developed, building on the parallel unstructured mesh infrastructure (PUMI). Key components to be outlined in the presentation include (i) the mesh distribution strategy, (ii) how the particles are tracked during a push cycle, taking advantage of the unstructured mesh adjacency structures and searches based on that structure, and (iii) how the field-solve steps and particle migration are controlled. Performance comparisons to the current approach will also be presented.
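The adjacency-based particle search in item (ii) can be sketched as a walk across element neighbors: starting from the particle's last known element, step through the edge on whose far side the particle lies. The triangle-mesh layout below is invented for illustration; PUMI's actual data structures differ:

```python
# A sketch of adjacency-walk point location on a 2D triangle mesh.
# `neighbors[e][i]` is the element across edge i of element e (None on
# the boundary).  Boundary handling is simplified for brevity.

def sign(p, a, b):
    """Positive when p lies to the left of directed edge b->a (CCW inside)."""
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def locate(p, start, verts, tris, neighbors):
    """Walk the mesh from `start` until the triangle containing p is found."""
    elem = start
    for _ in range(len(tris)):               # bounded walk
        a, b, c = (verts[i] for i in tris[elem])
        for edge, s in enumerate((sign(p, a, b), sign(p, b, c), sign(p, c, a))):
            if s < 0 and neighbors[elem][edge] is not None:
                elem = neighbors[elem][edge]  # step toward the particle
                break
        else:
            return elem                       # p is inside (or on) elem
    return None

# Two CCW triangles sharing the diagonal of the unit square.
verts = [(0, 0), (1, 0), (1, 1), (0, 1)]
tris = [(0, 1, 2), (0, 2, 3)]
neighbors = [(None, None, 1), (0, None, None)]
found = locate((0.25, 0.75), start=0, verts=verts, tris=tris, neighbors=neighbors)
```

Because a pushed particle usually moves at most a few elements per step, this walk touches only local adjacency data, which is what makes it attractive on a distributed mesh.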

  18. Atomistic simulations of deuterium irradiation on iron-based alloys in future fusion reactors

    DOE PAGES

    Safi, E.; Polvi, J.; Lasa, A.; ...

    2016-10-14

    Iron-based alloys are now being considered as plasma-facing materials for the first wall of future fusion reactors. Therefore, iron (Fe) and carbon (C) erosion will play a key role in predicting the lifetime and viability of reactors with steel walls. In this work, the surface erosion and morphology changes due to deuterium (D) irradiation in pure Fe, Fe with 1% C impurity, and cementite are studied using molecular dynamics (MD) simulations, varying surface temperature and impact energy. The sputtering yields for both Fe and C were found to increase with incoming energy. In iron carbide, C sputtering was preferential to Fe, and the deuterium was mainly trapped as D2 in bubbles, while mostly atomic D was present in Fe and Fe–1%C. The sputtering yields obtained from MD were compared to SDTrimSP yields. At lower impact energies, the sputtering mechanism was of both physical and chemical origin, while at higher energies (>100 eV) physical sputtering dominated.

  19. Simulation of X-ray Irradiation on Optics and Chamber Wall Materials for Inertial Fusion Energy

    SciTech Connect

    Reyes, S; Latkowski, J F; Abbott, R P; Stein, W

    2003-09-10

    We have used the ABLATOR code to analyze the effect of the x-ray emission from direct-drive targets on the optics and the first wall of a conceptual laser inertial fusion energy (IFE) power plant. For this purpose, the ABLATOR code has been modified to incorporate the predicted x-ray spectrum from a generic direct-drive target. We have also introduced elongation calculations in ABLATOR to predict the thermal stresses in the optic and first-wall materials. These results have been validated with thermal diffusion calculations using the LLNL heat transfer and dynamic structural finite element codes Topaz3d and Dyna3d. One of the most significant upgrades to the ABLATOR code is the ability to accommodate multi-material simulations. This new feature allows more realistic modeling of typical IFE optics and first-wall materials, which may have a number of different layers. Finally, we have used the XAPPER facility at LLNL to develop our predictive capability and validate the results. The ABLATOR code will be further modified, as necessary, to predict the effects of x-ray irradiation in both the IFE real case and our experiments on the XAPPER facility.

  20. Atomistic simulations of deuterium irradiation on iron-based alloys in future fusion reactors

    SciTech Connect

    Safi, E.; Polvi, J.; Lasa, A.; Nordlund, K.

    2016-10-14

    Iron-based alloys are now being considered as plasma-facing materials for the first wall of future fusion reactors. Therefore, iron (Fe) and carbon (C) erosion will play a key role in predicting the lifetime and viability of reactors with steel walls. In this work, the surface erosion and morphology changes due to deuterium (D) irradiation in pure Fe, Fe with 1% C impurity, and cementite are studied using molecular dynamics (MD) simulations, varying surface temperature and impact energy. The sputtering yields for both Fe and C were found to increase with incoming energy. In iron carbide, C sputtering was preferential to Fe, and the deuterium was mainly trapped as D2 in bubbles, while mostly atomic D was present in Fe and Fe–1%C. The sputtering yields obtained from MD were compared to SDTrimSP yields. At lower impact energies, the sputtering mechanism was of both physical and chemical origin, while at higher energies (>100 eV) physical sputtering dominated.

  1. Microscopic dynamics simulations of heavy-ion fusion reactions induced by neutron-rich nuclei

    NASA Astrophysics Data System (ADS)

    Wang, Ning; Ou, Li; Zhang, Yingxun; Li, Zhuxia

    2014-06-01

    Heavy-ion fusion reactions induced by neutron-rich nuclei are investigated with the improved quantum molecular dynamics (ImQMD) model. With careful treatment of the neutron skin thickness of nuclei and the symmetry potential, the stability of nuclei and the fusion excitation functions of the heavy-ion fusion reactions 16O + 76Ge, 16O + 154Sm, 40Ca + 96Zr, and 132Sn + 40Ca are systematically studied. The fusion cross sections of these reactions at energies around the Coulomb barrier can be well reproduced using the ImQMD model. The corresponding slope parameter of the symmetry energy adopted in the calculations is L ≈ 78 MeV, and the surface energy coefficient is gsur = 18 ± 1.5 MeV fm². In addition, it is found that the surface-symmetry term significantly influences the fusion cross sections of neutron-rich fusion systems. For sub-barrier fusion, the dynamical fluctuations in the densities of the reaction partners and the enhanced surface diffuseness on the neck side result in a lowering of the fusion barrier.
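Fusion excitation functions like these are commonly compared against the one-dimensional Wong barrier-penetration formula, sigma(E) = (hbar*omega * R_b^2 / 2E) * ln{1 + exp[2*pi*(E - V_b) / (hbar*omega)]}. A sketch with purely illustrative barrier parameters, not values from the ImQMD study:

```python
# Wong (1973) one-barrier estimate of the fusion cross section.
# E, Vb (barrier height), and hw (hbar*omega, barrier curvature) in MeV;
# Rb_fm (barrier radius) in fm.  The parameter values used below are
# made-up illustrations, not fitted to any of the reactions above.
import math

def wong_cross_section_mb(E, Vb, Rb_fm, hw):
    """Wong formula in millibarns (1 fm^2 = 10 mb)."""
    pre = 10.0 * hw * Rb_fm ** 2 / (2.0 * E)
    return pre * math.log1p(math.exp(2.0 * math.pi * (E - Vb) / hw))

# The cross section rises steeply through the barrier region:
sigma_below = wong_cross_section_mb(55.0, Vb=60.0, Rb_fm=10.0, hw=4.0)
sigma_above = wong_cross_section_mb(65.0, Vb=60.0, Rb_fm=10.0, hw=4.0)
```

Well above the barrier the expression reduces to the classical pi*Rb^2*(1 - Vb/E), so the sketch recovers the familiar geometric limit.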

  2. ICCS network simulation LDRD project final report summary

    SciTech Connect

    Bryant, B

    1999-01-09

    A critical component of the NIF Integrated Computer Controls System (ICCS) is the local area network (LAN) that enables timely and reliable communication between control applications running on the 600+ computer systems distributed throughout the NIF facility. This project analyzed critical portions of the NIF ICCS network (referred to as "the network" in this report), applying the OPNET Modeler discrete-event simulation package to model and simulate network operation, and the Network Associates Distributed Sniffer network analyzer to collect actual network performance data in the ICCS Testbed. These tools were selected and procured for use on this project. Simulations and initial network analysis indicate that the network is capable of meeting system requirements. ICCS application software is currently in development, so test software was used to collect performance data. As application software is tested in the Testbed environment, more accurate timing information can be collected, which will allow for more accurate large-scale simulations.

  3. Fast discontinuous Galerkin lattice-Boltzmann simulations on GPUs via maximal kernel fusion

    NASA Astrophysics Data System (ADS)

    Mazzeo, Marco D.

    2013-03-01

    A GPU implementation of the discontinuous Galerkin lattice-Boltzmann method with square spectral elements, highly optimised for speed and precision of calculations, is presented. An extensive analysis of the numerous variants of the fluid solver reveals that the best performance is obtained by maximising CUDA kernel fusion and by arranging the resulting kernel tasks so as to trigger memory-coherent and scattered loads in a specific manner, albeit at the cost of introducing cross-thread load unbalancing. Surprisingly, any attempt to eliminate this imbalance, to maximise thread occupancy, or to adopt conventional work tiling or distinct custom kernels highly tuned via ad hoc data and computation layouts invariably deteriorates performance. As such, this work sheds light on the possibility of hiding fetch latencies of workloads involving heterogeneous loads in a way that is more effective than what is achieved with frequently suggested techniques. When simulating the lid-driven cavity on an NVIDIA GeForce GTX 480 via a 5-stage 4th-order Runge-Kutta (RK) scheme, the first four digits (or more) of the obtained centreline velocity values converge to those of the state-of-the-art literature data at a simulation speed of 7.0G primitive-variable updates per second during the collision stage and 4.4G during each RK step of the advection, employing double-precision arithmetic (DPA) and a computational grid of only 64² 4×4-point elements. The new programming engine leads to about 2× the performance of the best programming guidelines in the field. The new fluid solver on the above GPU is also 20-30 times faster than a highly optimised version running on a single core of an Intel Xeon X5650 2.66 GHz.
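Kernel fusion pays off by cutting memory traffic: a fused stage reads each value once and writes only the final result, instead of materialising intermediates between passes. A CPU-side Python analogy of that data-movement idea (the paper's actual work is CUDA-specific; this only illustrates why fusing element-wise stages helps, not how fast it is):

```python
# Unfused: three passes, each writing a full intermediate array to memory.
# Fused: one sweep computing the composed expression per element.
# The three stages (scale, shift, sqrt) are arbitrary stand-ins.
import numpy as np

def unfused(f):
    g = f * 2.0          # pass 1: read f, write g
    h = g + 1.0          # pass 2: read g, write h
    return np.sqrt(h)    # pass 3: read h, write result

def fused(f):
    out = np.empty_like(f)
    for i in range(len(f)):                  # one sweep per element
        out[i] = (f[i] * 2.0 + 1.0) ** 0.5   # no intermediate arrays
    return out
```

On a GPU the fused version corresponds to one kernel launch touching global memory once per element, which is the traffic reduction the abstract's "maximal kernel fusion" exploits.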

  4. M3D project for simulation studies of plasmas

    SciTech Connect

    Park, W.; Belova, E.V.; Fu, G.Y.; Strauss, H.R.; Sugiyama, L.E.

    1998-12-31

    The M3D (Multi-level 3D) project carries out simulation studies of plasmas in various regimes using multiple levels of physics, geometry, and mesh schemes in one code package. This paper, and the papers by Strauss, Sugiyama, and Belova in this workshop, describe the project and present examples of current applications. The currently available physics models of the M3D project are MHD, two-fluid, gyrokinetic hot particle/MHD hybrid, and gyrokinetic particle ion/two-fluid hybrid models. The code can be run with both structured and unstructured meshes.

  5. Response to FESAC survey, non-fusion connections to Fusion Energy Sciences. Applications of the FES-supported beam and plasma simulation code, Warp

    SciTech Connect

    Friedman, A.; Grote, D. P.; Vay, J. L.

    2015-05-29

    The Fusion Energy Sciences Advisory Committee’s subcommittee on non-fusion applications (FESAC NFA) is conducting a survey to obtain information from the fusion community about non-fusion work that has resulted from their DOE-funded fusion research. The subcommittee has requested that members of the community describe recent developments connected to the activities of the DOE Office of Fusion Energy Sciences. Two questions in particular were posed by the subcommittee. This document contains the authors’ responses to those questions.

  6. SIMRAND I- SIMULATION OF RESEARCH AND DEVELOPMENT PROJECTS

    NASA Technical Reports Server (NTRS)

    Miles, R. F.

    1994-01-01

    The Simulation of Research and Development Projects program (SIMRAND) aids in the optimal allocation of R&D resources needed to achieve project goals. SIMRAND models the system subsets or project tasks as various network paths to a final goal. Each path is described in terms of task variables such as cost per hour, cost per unit, availability of resources, etc. Uncertainty is incorporated by treating task variables as random variables. SIMRAND calculates the measure of preference for each alternative network. The networks yielding the highest utility function (or certainty equivalent) are then ranked as the optimal network paths. SIMRAND has been used in several economic-potential studies at NASA's Jet Propulsion Laboratory involving solar dish power systems and photovoltaic array construction. However, any project whose tasks can be reduced to equations and related by measures of preference can be modeled. SIMRAND analysis consists of three phases: reduction, simulation, and evaluation. In the reduction phase, analytical techniques from probability theory and simulation techniques are used to reduce the complexity of the alternative networks. In the simulation phase, a Monte Carlo simulation is used to derive statistics on the variables of interest for each alternative network path. In the evaluation phase, the simulation statistics are compared and the networks are ranked in preference by a selected decision rule. The user must supply the project subsystems in terms of equations based on variables (for example, parallel and series assembly-line tasks in terms of number of items, cost factors, time limits, etc.). The associated cumulative distribution functions and utility functions for each variable must also be provided (allowable upper and lower limits, group decision factors, etc.). SIMRAND is written in Microsoft FORTRAN 77 for batch execution and has been implemented on an IBM PC series computer operating under DOS.
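SIMRAND's simulation and evaluation phases can be sketched as a Monte Carlo loop followed by a decision rule. The task cost distributions, the variance-penalty utility, and the network definitions below are invented for illustration, not taken from SIMRAND itself:

```python
# Monte Carlo ranking of alternative R&D networks, SIMRAND-style:
# sample each network's total cost, then rank by a risk-averse decision
# rule (mean plus a variance penalty).  Everything here is illustrative.
import random

def sample_cost(tasks, rng):
    # Series tasks add; each task cost is uniform in [low, high].
    return sum(rng.uniform(lo, hi) for lo, hi in tasks)

def rank_networks(networks, n_trials=5000, seed=1):
    rng = random.Random(seed)
    scores = {}
    for name, tasks in networks.items():
        costs = [sample_cost(tasks, rng) for _ in range(n_trials)]
        mean = sum(costs) / n_trials
        var = sum((c - mean) ** 2 for c in costs) / n_trials
        scores[name] = mean + 0.1 * var      # lower score is better
    return sorted(scores, key=scores.get)

# Network A: cheaper and less uncertain than B, so it should rank first.
networks = {"A": [(10, 14), (5, 7)], "B": [(8, 20), (4, 9)]}
ranking = rank_networks(networks)
```

Swapping the decision rule (pure expected cost, a utility curve, a percentile) changes only the scoring line, which mirrors SIMRAND's selectable decision rules in the evaluation phase.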

  7. NASA/Haughton-Mars Project 2006 Lunar Medical Contingency Simulation

    NASA Technical Reports Server (NTRS)

    Scheuring, Richard A.; Jones, J. A.; Lee, P.; Comtois, J. M.; Chappell, S.; Rafiq, A.; Braham, S.

    2007-01-01

    A viewgraph presentation describing NASA's Haughton-Mars Project (HMP) medical requirements and lunar surface operations is shown. The topics include: 1) Mission Purpose/Overview; 2) HMP as a Moon/Mars Analog; 3) Simulation objectives; 4) Discussion; and 5) Forward work.

  8. Exploring International Investment through a Classroom Portfolio Simulation Project

    ERIC Educational Resources Information Center

    Chen, Xiaoying; Yur-Austin, Jasmine

    2013-01-01

    A rapid integration of financial markets has prevailed during the last three decades. Investors are able to diversify investment beyond national markets to mitigate return volatility of a "pure domestic portfolio." This article discusses a simulation project through which students learn the role of international investment by managing…

  9. Integrated Simulation Studies of Plasma Performances and Fusion Reactions in the Deuterium Experiment of LHD

    NASA Astrophysics Data System (ADS)

    Murakami, S.; Yamaguchi, H.; Homma, M.; Maeta, S.; Saito, Y.; Fukuyama, A.; Nagaoka, K.; Takahashi, H.; Nakano, H.; Osakabe, M.; Yokoyama, M.; Tanaka, K.; Ida, K.; Yoshinuma, M.; Isobe, M.; Tomita, H.; Ogawa, K.; LHD Exp Group Team

    2016-10-01

    The deuterium experiment project starting in 2017 is planned in LHD, in which deuterium NBI heating beams with a total power of more than 30 MW are injected into a deuterium plasma. The principal objectives of this project are to clarify the isotope effect on heat and particle transport in the helical plasma and to study energetic-particle confinement in a helical magnetic configuration by measuring triton burn-up neutrons. We study the deuterium experiment plasma of LHD by applying the integrated simulation code TASK3D [Murakami, PPCF 2015] and the 5-D drift kinetic equation solver GNET [Murakami, NF 2006]. (i) An ion temperature increment of more than 20% is obtained in the deuterium plasma (nD/(nH+nD) = 0.8) due to the isotope effect, assuming the turbulent transport model based on the H/He plasma experiment of LHD. (ii) The triton burn-up simulation shows the triton slowing-down distribution and a strong magnetic-configuration dependence of the triton burn-up ratio in LHD. This work was supported by JSPS KAKENHI Grant Number 26420851.

  10. The SIMRAND methodology - Simulation of Research and Development Projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1984-01-01

    In research and development projects, a commonly occurring management decision is concerned with the optimum allocation of resources to achieve the project goals. Because of resource constraints, management has to make a decision regarding the set of proposed systems or tasks which should be undertaken. SIMRAND (Simulation of Research and Development Projects) is a methodology which was developed for aiding management in this decision. Attention is given to a problem description, aspects of model formulation, the reduction phase of the model solution, the simulation phase, and the evaluation phase. The implementation of the considered approach is illustrated with the aid of an example which involves a simplified network of the type used to determine the price of silicon solar cells.

  11. Thermal-to-fusion neutron convertor and Monte Carlo coupled simulation of deuteron/triton transport and secondary products generation

    NASA Astrophysics Data System (ADS)

    Wang, Guan-bo; Liu, Han-gang; Wang, Kan; Yang, Xin; Feng, Qi-jie

    2012-09-01

    A thermal-to-fusion neutron convertor has been studied at the China Academy of Engineering Physics (CAEP). Current Monte Carlo codes, such as MCNP and GEANT, are inadequate when applied to such multi-step reaction problems. A Monte Carlo tool, RSMC (Reaction Sequence Monte Carlo), has been developed to simulate the coupled problem, from neutron absorption, to charged-particle ionization, to secondary neutron generation. A "forced particle production" variance reduction technique has been implemented to improve the calculation speed markedly by ensuring that deuteron/triton-induced secondary products play a major role. Nuclear data are taken from ENDF or TENDL, and stopping powers from SRIM, which better describes low-energy deuteron/triton interactions. As a validation, an accelerator-driven mono-energetic 14 MeV fusion neutron source is employed, which has been studied in depth and includes deuteron transport and secondary neutron generation. Various parameters, including the fusion neutron angular distribution, the average neutron energy at different emission directions, and differential and integral energy distributions, are calculated with our tool and with a traditional deterministic method as reference. Finally, we present the calculation results for the convertor obtained with RSMC, including the conversion ratio of 1 mm 6LiD under a typical thermal neutron (Maxwell spectrum) incidence and the fusion neutron spectrum, which will be used for our experiment.
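
    The "forced particle production" idea mentioned above follows a standard variance-reduction pattern: rather than waiting for a rare production event to be sampled, every history is forced to produce the secondary and carries the event probability as a statistical weight. The toy sketch below (hypothetical numbers, not the RSMC implementation) contrasts the analog and forced estimators for a rare-event yield:

```python
import random

# Sketch of forced-production variance reduction (an assumed simple form):
# instead of sampling a rare secondary-production event with probability p,
# every history produces the secondary but carries statistical weight p.

def analog_estimate(p, yield_per_event, n, rng):
    # Analog Monte Carlo: the event occurs with probability p per history.
    hits = sum(yield_per_event for _ in range(n) if rng.random() < p)
    return hits / n

def forced_estimate(p, yield_per_event, n):
    # Forced production: deterministic event, weighted by p.
    # For this trivial tally the estimator has zero variance.
    return sum(p * yield_per_event for _ in range(n)) / n

rng = random.Random(42)
p, y = 1e-3, 2.0            # rare production probability, secondaries per event
exact = p * y
forced = forced_estimate(p, y, 1000)
analog = analog_estimate(p, y, 1000, rng)
```

With only 1000 histories the analog estimator is dominated by whether any rare event happened at all, while the weighted estimator reproduces the expected yield exactly.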

  12. The GeantV project: Preparing the future of simulation

    SciTech Connect

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, Ph; Carminati, F.; Duhem, L.; Elvira, D.; de Fine Licht, J.; Gheata, A.; Iope, R. L.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.

    2015-12-23

    Detector simulation is consuming at least half of the HEP computing cycles, and even so, experiments have to take hard decisions on what to simulate, as their needs greatly surpass the availability of computing resources. New experiments still in the design phase such as FCC, CLIC and ILC as well as upgraded versions of the existing LHC detectors will push further the simulation requirements. Since the increase in computing resources is not likely to keep pace with our needs, it is therefore necessary to explore innovative ways of speeding up simulation in order to sustain the progress of High Energy Physics. The GeantV project aims at developing a high performance detector simulation system integrating fast and full simulation that can be ported on different computing architectures, including CPU accelerators. After more than two years of R&D the project has produced a prototype capable of transporting particles in complex geometries exploiting micro-parallelism, SIMD and multithreading. Portability is obtained via C++ template techniques that allow the development of machine-independent computational kernels. Furthermore, a set of tables derived from Geant4 for cross sections and final states provides a realistic shower development and, having been ported into a Geant4 physics list, can be used as a basis for a direct performance comparison.

  13. The GeantV project: preparing the future of simulation

    NASA Astrophysics Data System (ADS)

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, Ph; Carminati, F.; Duhem, L.; Elvira, D.; de Fine Licht, J.; Gheata, A.; Iope, R. L.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.

    2015-12-01

    Detector simulation is consuming at least half of the HEP computing cycles, and even so, experiments have to take hard decisions on what to simulate, as their needs greatly surpass the availability of computing resources. New experiments still in the design phase such as FCC, CLIC and ILC as well as upgraded versions of the existing LHC detectors will push further the simulation requirements. Since the increase in computing resources is not likely to keep pace with our needs, it is therefore necessary to explore innovative ways of speeding up simulation in order to sustain the progress of High Energy Physics. The GeantV project aims at developing a high performance detector simulation system integrating fast and full simulation that can be ported on different computing architectures, including CPU accelerators. After more than two years of R&D the project has produced a prototype capable of transporting particles in complex geometries exploiting micro-parallelism, SIMD and multithreading. Portability is obtained via C++ template techniques that allow the development of machine-independent computational kernels. A set of tables derived from Geant4 for cross sections and final states provides a realistic shower development and, having been ported into a Geant4 physics list, can be used as a basis for a direct performance comparison.

  14. The GeantV project: Preparing the future of simulation

    DOE PAGES

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; ...

    2015-12-23

    Detector simulation is consuming at least half of the HEP computing cycles, and even so, experiments have to take hard decisions on what to simulate, as their needs greatly surpass the availability of computing resources. New experiments still in the design phase such as FCC, CLIC and ILC as well as upgraded versions of the existing LHC detectors will push further the simulation requirements. Since the increase in computing resources is not likely to keep pace with our needs, it is therefore necessary to explore innovative ways of speeding up simulation in order to sustain the progress of High Energy Physics. The GeantV project aims at developing a high performance detector simulation system integrating fast and full simulation that can be ported on different computing architectures, including CPU accelerators. After more than two years of R&D the project has produced a prototype capable of transporting particles in complex geometries exploiting micro-parallelism, SIMD and multithreading. Portability is obtained via C++ template techniques that allow the development of machine-independent computational kernels. Furthermore, a set of tables derived from Geant4 for cross sections and final states provides a realistic shower development and, having been ported into a Geant4 physics list, can be used as a basis for a direct performance comparison.

  15. WE-EF-207-04: An Inter-Projection Sensor Fusion (IPSF) Approach to Estimate Missing Projection Signal in Synchronized Moving Grid (SMOG) System

    SciTech Connect

    Zhang, H; Kong, V; Jin, J; Ren, L; Zhang, Y; Giles, W

    2015-06-15

    Purpose: A synchronized moving grid (SMOG) has been proposed to reduce scatter and lag artifacts in cone beam computed tomography (CBCT). However, information is missing in each projection because certain areas are blocked by the grid. A previous solution to this issue was to acquire two complementary projections at each position, which increases scanning time. This study reports our first result using an inter-projection sensor fusion (IPSF) method to estimate the missing projection signal in our prototype SMOG-based CBCT system. Methods: An in-house SMOG assembly with a 1:1 grid with a 3 mm gap has been installed in a CBCT benchtop. The grid moves back and forth with a 3 mm amplitude and up to 20 Hz frequency. A control program in LabView synchronizes the grid motion with the platform rotation and x-ray firing so that the grid patterns for any two neighboring projections are complementary. A Catphan was scanned with 360 projections. After scatter correction, the IPSF algorithm was applied to estimate the missing signal for each projection using the information from the two neighboring projections. The Feldkamp-Davis-Kress (FDK) algorithm was applied to reconstruct CBCT images. The CBCTs were compared to those reconstructed using normal projections without applying the SMOG system. Results: The SMOG-IPSF method may reduce image dose by half because the grid blocks half of the radiation. The method almost completely removed scatter-related artifacts, such as cupping artifacts. The evaluation of line-pair patterns in the Catphan suggested that spatial resolution degradation was minimal. Conclusion: The SMOG-IPSF method is promising for reducing scatter artifacts and improving image quality while reducing radiation dose.
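
    The IPSF estimation step can be pictured with a one-dimensional toy model: because the grid patterns of consecutive projections are complementary, pixels blocked in the current projection can be estimated from the two neighboring projections. The sketch below is an assumed simplification (plain averaging of the two neighbors; the abstract does not specify the actual IPSF estimator):

```python
# Toy 1-D sketch of inter-projection estimation under a complementary grid.
STRIPE = 3  # grid period in pixels (hypothetical)

def blocked(proj_index, pixel):
    # Complementary stripe pattern for consecutive projections.
    return ((pixel // STRIPE) + proj_index) % 2 == 0

def fill_missing(prev_proj, cur_proj, next_proj):
    # Replace blocked (None) pixels by the mean of the two neighbors.
    filled = list(cur_proj)
    for i, v in enumerate(cur_proj):
        if v is None:
            filled[i] = 0.5 * (prev_proj[i] + next_proj[i])
    return filled

# Hypothetical smooth signal: neighbors fully known, current one has gaps.
prev_p = [float(i) for i in range(12)]
next_p = [float(i) + 2.0 for i in range(12)]
cur_p = [None if blocked(1, i) else float(i) + 1.0 for i in range(12)]
estimate = fill_missing(prev_p, cur_p, next_p)
```

For this linearly varying signal the neighbor average recovers the blocked pixels exactly; real projections only approximate this.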

  16. Introduction to SIMRAND: Simulation of research and development project

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1982-01-01

    SIMRAND: SIMulation of Research ANd Development Projects is a methodology developed to aid the engineering and management decision process in selecting the optimal set of systems or tasks to be funded on a research and development project. A project may have a set of systems or tasks under consideration whose total cost exceeds the allocated budget. Other factors, such as personnel and facilities, may also enter as constraints. Thus the project's management must select, from among the complete set of systems or tasks under consideration, a partial set that satisfies all project constraints. The SIMRAND methodology uses the analytical techniques of probability theory, the decision analysis of management science, and computer simulation in the selection of this optimal partial set. The SIMRAND methodology is truly a management tool: it initially specifies the information that must be generated by the engineers, thereby guiding management's direction of the engineering effort, and it ranks the alternatives according to the preferences of the decision makers.

  17. Community Petascale Project for Accelerator Science and Simulation

    SciTech Connect

    Warren B. Mori

    2013-02-01

    The UCLA Plasma Simulation Group is a major partner of the "Community Petascale Project for Accelerator Science and Simulation." This is the final technical report. We include an overall summary, a list of publications, and individual progress reports for each year. During the past five years we have made tremendous progress in enhancing the capabilities of OSIRIS and QuickPIC, in developing new algorithms and data structures for PIC codes to run on GPUs and many-core architectures, and in using these codes to model experiments and to make new scientific discoveries. Here we summarize some highlights to which SciDAC was a major contributor.

  18. Adaptive quantum computation in changing environments using projective simulation

    PubMed Central

    Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

    2015-01-01

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks. PMID:26260263
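
    A projective simulator of the kind used by the agent above can be sketched, in a much reduced form, as a two-layer network of percept-action edges whose weights (h-values) are reinforced by reward and relaxed by a damping term; the damping is what lets the agent re-adapt when the environment changes. The environment, parameters, and update rule below are illustrative assumptions, not the authors' implementation:

```python
import random

# Minimal two-layer projective-simulation-style agent (simplified sketch):
# one percept, several actions, h-values updated by reward with damping
# gamma so the agent can track a changing environment.

class PSAgent:
    def __init__(self, n_actions, gamma=0.05, seed=0):
        self.h = [1.0] * n_actions   # edge weights (h-values)
        self.gamma = gamma
        self.rng = random.Random(seed)

    def act(self):
        # Sample an action with probability proportional to its h-value.
        r = self.rng.random() * sum(self.h)
        for a, w in enumerate(self.h):
            r -= w
            if r <= 0:
                return a
        return len(self.h) - 1

    def learn(self, action, reward):
        # Damping pulls all h-values back toward 1; reward reinforces
        # the edge that was actually used.
        self.h = [h - self.gamma * (h - 1.0) for h in self.h]
        self.h[action] += reward

agent = PSAgent(n_actions=2)
for step in range(2000):
    good = 0 if step < 1000 else 1   # the environment switches halfway
    a = agent.act()
    agent.learn(a, 1.0 if a == good else 0.0)
```

After the switch, damping erodes the stale preference for action 0 and the agent relearns to favor action 1.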

  19. Adaptive quantum computation in changing environments using projective simulation.

    PubMed

    Tiersch, M; Ganahl, E J; Briegel, H J

    2015-08-11

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent's learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent's performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover's search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks.

  20. Adaptive quantum computation in changing environments using projective simulation

    NASA Astrophysics Data System (ADS)

    Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

    2015-08-01

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks.

  1. The AGORA High-resolution Galaxy Simulations Comparison Project

    NASA Astrophysics Data System (ADS)

    Kim, Ji-hoon; Abel, Tom; Agertz, Oscar; Bryan, Greg L.; Ceverino, Daniel; Christensen, Charlotte; Conroy, Charlie; Dekel, Avishai; Gnedin, Nickolay Y.; Goldbaum, Nathan J.; Guedes, Javiera; Hahn, Oliver; Hobbs, Alexander; Hopkins, Philip F.; Hummels, Cameron B.; Iannuzzi, Francesca; Keres, Dusan; Klypin, Anatoly; Kravtsov, Andrey V.; Krumholz, Mark R.; Kuhlen, Michael; Leitner, Samuel N.; Madau, Piero; Mayer, Lucio; Moody, Christopher E.; Nagamine, Kentaro; Norman, Michael L.; Onorbe, Jose; O'Shea, Brian W.; Pillepich, Annalisa; Primack, Joel R.; Quinn, Thomas; Read, Justin I.; Robertson, Brant E.; Rocha, Miguel; Rudd, Douglas H.; Shen, Sijing; Smith, Britton D.; Szalay, Alexander S.; Teyssier, Romain; Thompson, Robert; Todoroki, Keita; Turk, Matthew J.; Wadsley, James W.; Wise, John H.; Zolotov, Adi; the AGORA Collaboration

    2014-01-01

    We introduce the Assembling Galaxies Of Resolved Anatomy (AGORA) project, a comprehensive numerical study of well-resolved galaxies within the ΛCDM cosmology. Cosmological hydrodynamic simulations with force resolutions of ~100 proper pc or better will be run with a variety of code platforms to follow the hierarchical growth, star formation history, morphological transformation, and the cycle of baryons in and out of eight galaxies with halo masses M_vir ≈ 10^10, 10^11, 10^12, and 10^13 M_⊙ at z = 0 and two different ("violent" and "quiescent") assembly histories. The numerical techniques and implementations used in this project include the smoothed particle hydrodynamics codes GADGET and GASOLINE, and the adaptive mesh refinement codes ART, ENZO, and RAMSES. The codes share common initial conditions and common astrophysics packages including UV background, metal-dependent radiative cooling, metal and energy yields of supernovae, and stellar initial mass function. These are described in detail in the present paper. Subgrid star formation and feedback prescriptions will be tuned to provide a realistic interstellar and circumgalactic medium using a non-cosmological disk galaxy simulation. Cosmological runs will be systematically compared with each other using a common analysis toolkit and validated against observations to verify that the solutions are robust, i.e., that the astrophysical assumptions are responsible for any success, rather than artifacts of particular implementations. The goals of the AGORA project are, broadly speaking, to raise the realism and predictive power of galaxy simulations and the understanding of the feedback processes that regulate galaxy "metabolism." The initial conditions for the AGORA galaxies as well as simulation outputs at various epochs will be made publicly available to the community. The proof-of-concept dark-matter-only test of the formation of a galactic halo with a z = 0 mass of M_vir ≈ 1.7 × 10^11 M_⊙ by nine different

  2. IERAPSI project: simulation of a canal wall-up mastoidectomy.

    PubMed

    Neri, E; Sellari Franceschini, S; Berrettini, S; Caramella, D; Bartolozzi, C

    2006-03-01

    Among the various EU research projects concerning medical applications of virtual reality, project IST-1999-12175, called IERAPSI (Integrated Environment for the Rehearsal and Planning of Surgical Interventions), specifically addressed the creation of a virtual and interactive surgical field for the temporal bone using three-dimensional images derived from CT data. We report on the experience obtained in the IERAPSI project in simulating a canal wall-up mastoidectomy. A surgeon with extensive experience in surgery of the petrous bone performed the mastoidectomy. The operative field included the mastoid, with its substantial differences in density between the cortex and the pneumatized bone, together with soft-tissue structures both on the border of and inside the bone. The simulation performs better in the first part of the operation than in the second, which suffers from a lack of haptic feedback between soft tissue and the surgical tool in deeper regions and from under-representation of the variability inherent in pneumatized bone. That said, the excellent representation of dust production and removal, the 3D simulation through color, and the very good visual and haptic feedback in the early stage of the procedure are impressive. IERAPSI represents a potential surgical planning theater for the training of students and young surgeons, but is also expected to aid expert surgeons in the preoperative planning of difficult cases.

  3. Review of fusion synfuels

    SciTech Connect

    Fillo, J.A.

    1980-01-01

    Thermonuclear fusion offers an inexhaustible source of energy for the production of hydrogen from water. Depending on design, electric generation efficiencies of approx. 40 to 60% and hydrogen production efficiencies by high-temperature electrolysis of approx. 50 to 65% are projected for fusion reactors using high-temperature blankets. Fusion/coal symbiotic systems appear economically promising for the first generation of commercial fusion synfuels plants. Coal production requirements and the environmental effects of large-scale coal usage would be greatly reduced by a fusion/coal system. In the long term, there could be a gradual transition to an inexhaustible energy system based solely on fusion.

  4. Equation Free Projective Integration and its Applicability for Simulating Plasma

    NASA Astrophysics Data System (ADS)

    Jemella, B.; Shay, M. A.; Drake, J. F.; Dorland, W.

    2004-12-01

    We examine a novel simulation scheme called equation free projective integration^1 which has the potential to allow global simulations of plasmas while still including the global effects of microscale physics. These simulation codes would be ideal for such multiscale problems as the Earth's magnetosphere, tokamaks, and the solar corona. In this method, the global plasma variables stepped forward in time are not time-integrated directly using dynamical differential equations, hence the name "equation free." Instead, these variables are represented on a microgrid using a kinetic simulation. This microsimulation is integrated forward long enough to determine the time derivatives of the global plasma variables, which are then used to integrate forward the global variables with much larger time steps. We are exploring the feasibility of applying this scheme to simulate plasma, and we will present the results of exploratory test problems including the development of 1-D shocks and magnetic reconnection. ^1 I. G. Kevrekidis et al., ``Equation-free multiscale computation: Enabling microscopic simulators to perform system-level tasks,'' arXiv:physics/0209043.
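
    The scheme described above can be sketched on a toy problem. Here the "microsimulator" is a black-box fine-step integrator of dy/dt = -y (standing in for a kinetic code), and the macro scheme estimates the coarse time derivative from a short micro burst before taking a much larger projective leap; all parameters are illustrative:

```python
# Sketch of equation-free projective integration on a toy problem.

def micro_burst(y, dt_micro, n_micro):
    # Black-box fine-scale simulator: n_micro small explicit Euler steps.
    for _ in range(n_micro):
        y = y + dt_micro * (-y)
    return y

def projective_step(y, dt_micro=0.001, n_micro=10, dt_macro=0.05):
    y_end = micro_burst(y, dt_micro, n_micro)
    # Estimate the coarse time derivative from the micro burst ...
    dydt = (y_end - y) / (n_micro * dt_micro)
    # ... then leap forward with a much larger macro step.
    return y_end + dt_macro * dydt

y, t = 1.0, 0.0
while t < 1.0:
    y = projective_step(y)
    t += 10 * 0.001 + 0.05   # burst duration + projective leap
```

Each composite step advances time by 0.06 while the microsimulator only runs for 0.01, yet the result stays close to the exact decay exp(-t).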

  5. Equation free projective integration and its applicability for simulating plasma

    NASA Astrophysics Data System (ADS)

    Shay, Michael A.; Drake, James F.; Dorland, William; Swisdak, Marc

    2004-11-01

    We examine a novel simulation scheme called equation free projective integration^1 which has the potential to allow global simulations of plasmas while still including the global effects of microscale physics. These simulation codes would be ideal for such multiscale problems as tokamaks, the Earth's magnetosphere, and the solar corona. In this method, the global plasma variables stepped forward in time are not time-integrated directly using dynamical differential equations, hence the name ``equation free.'' Instead, these variables are represented on a microgrid using a kinetic simulation. This microsimulation is integrated forward long enough to determine the time derivatives of the global plasma variables, which are then used to integrate forward the global variables with much larger time steps. We are exploring the feasibility of applying this scheme to simulate plasma, and we will present the results of exploratory test problems including the development of 1-D shocks and magnetic reconnection. ^1 I. G. Kevrekidis et al., ``Equation-free multiscale computation: Enabling microscopic simulators to perform system-level tasks,'' arXiv:physics/0209043.

  6. Progress of the NASA/USGS Lunar Regolith Simulant Project

    NASA Technical Reports Server (NTRS)

    Rickman, Douglas; McLemore, C.; Stoeser, D.; Schrader, C.; Fikes, J.; Street, K.

    2009-01-01

    Beginning in 2004, personnel at MSFC began serious efforts to develop a new generation of lunar simulants. The first two products were a replication of the previous JSC-1 simulant under a contract to Orbitec and a major workshop in 2005 on future simulant development. It was recognized in early 2006 that there were serious limitations with the standard approach of simply taking a single terrestrial rock and grinding it. To a geologist, even a cursory examination of the Lunar Sourcebook shows that matching lunar heterogeneity, crystal size, relative mineral abundances, lack of H2O, plagioclase chemistry, and glass abundance simply cannot be done with any simple combination of terrestrial rocks. Thus the project refocused its efforts and approached simulant development in a new and more comprehensive manner, examining new approaches in simulant development and ways to more accurately compare simulants to actual lunar materials. This led to a multi-year effort with five major tasks running in parallel: Requirements, Lunar Analysis, Process Development, Feed Stocks, and Standards.

  7. Three-dimensional gyrokinetic particle-in-cell simulation of plasmas on a massively parallel computer: Final report on LDRD Core Competency Project, FY 1991--FY 1993

    SciTech Connect

    Byers, J.A.; Williams, T.J.; Cohen, B.I.; Dimits, A.M.

    1994-04-27

    One of the programs of the Magnetic Fusion Energy (MFE) Theory and Computations Program is studying the anomalous transport of thermal energy across the field lines in the core of a tokamak. We use the method of gyrokinetic particle-in-cell simulation in this study. For this LDRD project we employed massively parallel processing, new algorithms, and new formal techniques to improve this research. Specifically, we sought to take steps toward: researching experimentally relevant parameters in our simulations, learning parallel computing to have as a resource for our group, and achieving a 100-fold speedup over the performance of our starting-point Cray-2 simulation code.

  8. A Particle-in-Cell Simulation for the Traveling Wave Direct Energy Converter (TWDEC) for Fusion Propulsion

    NASA Technical Reports Server (NTRS)

    Chap, Andrew; Tarditi, Alfonso G.; Scott, John H.

    2013-01-01

    A particle-in-cell simulation model has been developed to study the physics of the Traveling Wave Direct Energy Converter (TWDEC) applied to the conversion of charged fusion products into electricity. In this model the availability of a beam of collimated fusion products is assumed; the simulation is focused on the conversion of the beam kinetic energy into alternating current (AC) electric power. The model is electrostatic, as the electrodynamics of the relatively slow ions can be treated in the quasistatic approximation. A two-dimensional, axisymmetric (radial-axial coordinates) geometry is considered. Ion beam particles are injected on one end and travel along the axis through ring-shaped electrodes with externally applied time-varying voltages, thus modulating the beam by forming a sinusoidal pattern in the beam density. Further downstream, the modulated beam passes through another set of ring electrodes, now electrically floating. The modulated beam induces a time-alternating potential difference between adjacent electrodes. Power can be drawn from the electrodes by connecting a resistive load. As energy is dissipated in the load, a corresponding drop in beam energy is measured. The simulation encapsulates the TWDEC process by reproducing the time-dependent transfer of energy and the particle deceleration due to the time variations of the electric field phase.
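
    The deceleration principle behind the TWDEC can be illustrated with a single test particle in one dimension: an ion moving slightly faster than a traveling wave, launched in the decelerating phase, gives up kinetic energy to the wave (in the device, to the electrode circuit). The model below is an illustrative sketch in normalized units with assumed parameters, not the paper's PIC simulation:

```python
import math

# 1-D test-particle sketch of TWDEC-style deceleration (normalized units).
q_m = 1.0          # charge-to-mass ratio (assumed)
E0 = 0.05          # wave field amplitude (assumed)
k = 2 * math.pi    # wavenumber
v_wave = 1.0       # wave phase velocity
v0 = 1.05          # initial ion velocity, slightly above v_wave
dt = 0.001

# Start a quarter wavelength ahead of the crest, in the decelerating phase.
x, v = 0.25, v0
for step in range(2000):
    t = step * dt
    # Electric field of the traveling wave seen at the ion position.
    a = -q_m * E0 * math.sin(k * (x - v_wave * t))
    v += a * dt          # simple explicit pusher
    x += v * dt

energy_ratio = v ** 2 / v0 ** 2   # fraction of initial kinetic energy left
```

Over this short run the ion stays in the decelerating phase, so its kinetic energy drops monotonically; in the full device that energy appears as AC power on the floating electrodes.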

  9. SciDAC - Center for Plasma Edge Simulation - Project Summary

    SciTech Connect

    Parker, Scott

    2014-11-03

    Final Technical Report: Center for Plasma Edge Simulation (CPES). Principal Investigator: Scott Parker, University of Colorado, Boulder. First-principles simulations of edge pedestal micro-turbulence are performed with the global gyrokinetic turbulence code GEM for both low- and high-confinement tokamak plasmas. The high-confinement plasmas show a larger growth rate but, nonlinearly, a lower particle and heat flux. Numerical profiles are obtained from the XGC0 neoclassical code. XGC0/GEM code coupling is implemented under the EFFIS ("End-to-end Framework for Fusion Integrated Simulation") framework. Investigations are underway to clearly identify the micro-instabilities in the edge pedestal using global and flux-tube gyrokinetic simulation with realistic experimental high-confinement profiles. We use both experimental profiles and those obtained using the EFFIS XGC0/GEM coupled-code framework. We find there are three types of instabilities at the edge: a low-n, high-frequency electron mode; a high-n, low-frequency ion mode; and possibly an ion mode like the kinetic ballooning mode (KBM). Investigations are underway on the effects of the radial electric field. Finally, we have been investigating how, in plasmas dominated by ion-temperature-gradient (ITG) driven turbulence, cold deuterium and tritium ions near the edge naturally pinch radially inward towards the core. We call this mechanism "natural fueling." It is due to the quasi-neutral, heat-flux-dominated nature of the turbulence and still applies when trapped and passing kinetic electron effects are included. To understand this mechanism, consider the situation where the electrons are adiabatic and there is an ion heat flux. In such a case, lower-energy particles move inward and higher-energy particles move outward. If a trace amount of cold particles is added, they will move inward.

  10. Verification of gyrokinetic particle simulation of current-driven instability in fusion plasmas. III. Collisionless tearing mode

    SciTech Connect

    Liu, Dongjian; Bao, Jian; Han, Tao; Wang, Jiaqi; Lin, Zhihong

    2016-02-15

    A finite-mass electron fluid model for low frequency electromagnetic fluctuations, particularly the collisionless tearing mode, has been implemented in the gyrokinetic toroidal code. Using this fluid model, linear properties of the collisionless tearing mode have been verified. Simulations verify that the linear growth rate of the single collisionless tearing mode is proportional to D_e^2, where D_e is the electron skin depth. On the other hand, the growth rate of a double tearing mode is proportional to D_e in the parameter regime of fusion plasmas.
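
    A scaling claim such as "growth rate proportional to D_e squared" is typically verified by fitting a power-law exponent to growth rates measured at several values of D_e. The sketch below does this with synthetic numbers (not data from the paper) via least squares in log-log space:

```python
import math

# Fit the exponent b in gamma = a * De^b by linear regression on logs.
def fit_exponent(xs, ys):
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(xs)
    mx, my = sum(lx) / n, sum(ly) / n
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den   # slope in log-log space = power-law exponent

de = [0.01, 0.02, 0.04, 0.08]           # electron skin depth (arbitrary units)
gamma = [2.0 * d ** 2 for d in de]      # synthetic rates obeying gamma ∝ De^2
exponent = fit_exponent(de, gamma)
```

Applied to simulation scans, an exponent near 2 for the single tearing mode and near 1 for the double tearing mode would reproduce the verification stated above.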

  11. 15 MW HArdware-in-the-loop Grid Simulation Project

    SciTech Connect

    Rigas, Nikolaos; Fox, John Curtiss; Collins, Randy; Tuten, James; Salem, Thomas; McKinney, Mark; Hadidi, Ramtin; Gislason, Benjamin; Boessneck, Eric; Leonard, Jesse

    2014-10-31

    The 15 MW Hardware-in-the-Loop (HIL) Grid Simulator project set out to (1) design, (2) construct and (3) commission a state-of-the-art grid integration testing facility for testing multi-megawatt devices through a ‘shared facility’ model open to all innovators, promoting the rapid introduction of new technology into the energy market to lower the cost of delivered energy. The 15 MW HIL Grid Simulator project now serves as the cornerstone of the Duke Energy Electric Grid Research, Innovation and Development (eGRID) Center. This project leveraged the 24 kV utility interconnection and electrical infrastructure of the US DOE EERE funded WTDTF project at the Clemson University Restoration Institute in North Charleston, SC. Additionally, the project has spurred interest from other technology sectors, including large PV inverter and energy storage testing and several leading-edge research proposals dealing with smart grid technologies, grid modernization and grid cyber security. The key components of the project are the power amplifier units, capable of providing up to 20 MW of defined power to the research grid. The project has also developed a one-of-a-kind solution for performing fault ride-through testing by combining a reactive divider network and a large power converter into a hybrid method. This unique hybrid method of performing fault ride-through analysis will allow the research team at the eGRID Center to investigate the complex differences between the alternative methods of performing fault ride-through evaluations and will ultimately further the science behind this testing. With the final goal of being able to perform HIL experiments and demonstration projects, the eGRID team undertook a significant challenge in developing a control system capable of communicating in real time with several different pieces of equipment using different communication protocols. The eGRID team developed a custom fiber optical network that is based upon FPGA

  12. Uncertainties of soil moisture in historical simulations and future projections

    NASA Astrophysics Data System (ADS)

    Cheng, Shanjun; Huang, Jianping; Ji, Fei; Lin, Lei

    2017-02-01

    Uncertainties of soil moisture in historical simulations (1920-2005) and future projections (2006-2080) were investigated by using the outputs from the Coupled Model Intercomparison Project Phase 5 and Community Earth System Model. The results showed that soil moisture climatology varies greatly among models despite the good agreement between the ensemble mean of simulated soil moisture and the Global Land Data Assimilation System data. The uncertainties of initial conditions and model structure showed similar spatial patterns and magnitudes, with high uncertainties in dry regions and low uncertainties in wet regions. In addition, the long-term variability of model structure uncertainty rapidly decreased before 1980 and increased thereafter, but the uncertainty in initial conditions showed an upward trend over the entire time span. The model structure and initial conditions can cause uncertainties at all time scales. Despite these large uncertainties, almost all of the simulations showed significant decreasing linear trends in soil moisture for the 21st century, especially in the Mediterranean region, northeast and southwest South America, southern Africa, and southwestern USA.
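
    As an illustration of how such an uncertainty partition is typically computed (the numbers below are invented, not from the study): the spread across ensemble members that differ only in initial conditions estimates initial-condition uncertainty, while the spread across different models estimates model-structure uncertainty.

    ```python
    from statistics import pstdev

    # Invented illustrative numbers, not data from the study: annual
    # soil-moisture anomalies (kg/m^2) at one grid cell for three ensemble
    # members of one model that differ only in their initial conditions.
    initial_condition_members = [
        [0.2, 0.1, -0.1, 0.0],
        [0.3, 0.0, -0.2, 0.1],
        [0.1, 0.2, -0.1, -0.1],
    ]
    # Time-mean anomaly simulated by three structurally different models.
    model_means = [0.05, -0.10, 0.20]

    # Spread across members at each time -> initial-condition uncertainty.
    ic_uncertainty = [pstdev(col) for col in zip(*initial_condition_members)]
    # Spread across the models' means -> model-structure uncertainty.
    structure_uncertainty = pstdev(model_means)

    print([round(s, 3) for s in ic_uncertainty], round(structure_uncertainty, 3))
    ```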

  13. Three-dimensional simulation strategy to determine the effects of turbulent mixing on inertial-confinement-fusion capsule performance.

    PubMed

    Haines, Brian M; Grinstein, Fernando F; Fincke, James R

    2014-05-01

    In this paper, we present and justify an effective strategy for performing three-dimensional (3D) inertial-confinement-fusion (ICF) capsule simulations. We have evaluated a frequently used strategy in which two-dimensional (2D) simulations are rotated to 3D once sufficient relevant 2D flow physics has been captured and fine resolution requirements can be restricted to relatively small regions. This addresses situations typical of ICF capsules which are otherwise prohibitively intensive computationally. We tested this approach for our previously reported fully 3D simulations of laser-driven reshock experiments where we can use the available 3D data as reference. Our studies indicate that simulations that begin as purely 2D lead to significant underprediction of mixing and turbulent kinetic energy production at later time when compared to the fully 3D simulations. If, however, additional suitable nonuniform perturbations are applied at the time of rotation to 3D, we show that one can obtain good agreement with the purely 3D simulation data, as measured by vorticity distributions as well as integrated mixing and turbulent kinetic energy measurements. Next, we present results of simulations of a simple OMEGA-type ICF capsule using the developed strategy. These simulations are in good agreement with available experimental data and suggest that the dominant mechanism for yield degradation in ICF implosions is hydrodynamic instability growth seeded by long-wavelength surface defects. This effect is compounded by drive asymmetries and amplified by repeated shock interactions with an increasingly distorted shell, which results in further yield reduction. Our simulations are performed with and without drive asymmetries in order to compare the importance of these effects to those of surface defects; our simulations indicate that long-wavelength surface defects degrade yield by approximately 60% and short-wavelength drive asymmetry degrades yield by a further 30%.

  14. Progress of the NASA/USGS Lunar Regolith Simulant Project

    NASA Technical Reports Server (NTRS)

    Rickman, Doug; McLemore, Carole; Wilson, Steve; Stoeser, Doug; Schrader, Christian; Fikes, John; Street, Kenneth

    2009-01-01

    In 2004, personnel at MSFC began serious efforts to develop a new generation of lunar simulants. The first two products were a replication of the previous JSC-1 simulant, under a contract to Orbitec, and a major workshop in 2005 on future simulant development. Beginning in 2006 the project refocused its efforts and approached simulant development in a new and more comprehensive manner, examining new approaches to simulant development and ways to more accurately compare simulants to actual lunar materials. This led to a multi-year effort with five major tasks running in parallel: Requirements, Lunar Analysis, Process Development, Feed Stocks, and Standards. Major progress has been made in all five areas. A substantial draft of a formal requirements document now exists and has been largely stable since 2007, though it evolves as specific details of the Standards and Lunar Analysis efforts proceed. Lunar analysis has turned out to be vastly more difficult than anticipated. After great effort to mine the existing published and gray literature, the team realized the necessity of making new measurements of the Apollo samples, an effort that is currently in progress. Process development is substantially ahead of the 2006 expectations. It is now practical to synthesize glasses of appropriate composition and purity. It is also possible to make agglutinate particles in significant quantities. A series of minerals commonly found on the Moon has been synthesized. Separation of mineral constituents from starting rock material is also proceeding. Customized grinding and mixing processes have been developed and tested and are now being documented. Identification and development of appropriate feedstocks has been both easier and more difficult than anticipated.
The Stillwater Mining Company, operating in the Stillwater layered mafic intrusive complex of Montana, has been an amazing resource for the project, but finding adequate sources for some of the components

  15. Integrated fusion simulation with self-consistent core-pedestal coupling

    NASA Astrophysics Data System (ADS)

    Meneghini, O.; Snyder, P. B.; Smith, S. P.; Candy, J.; Staebler, G. M.; Belli, E. A.; Lao, L. L.; Park, J. M.; Green, D. L.; Elwasif, W.; Grierson, B. A.; Holland, C.

    2016-04-01

    Accurate prediction of fusion performance in present and future tokamaks requires taking into account the strong interplay between core transport, pedestal structure, current profile, and plasma equilibrium. An integrated modeling workflow capable of calculating the steady-state self-consistent solution to this strongly coupled problem has been developed. The workflow leverages state-of-the-art components for collisional and turbulent core transport, equilibrium and pedestal stability. Testing against a DIII-D discharge shows that the workflow is capable of robustly predicting the kinetic profiles (electron and ion temperature and electron density) from the axis to the separatrix in good agreement with the experiments. An example application is presented, showing self-consistent optimization for the fusion performance of the 15 MA D-T ITER baseline scenario as functions of the pedestal density and ion effective charge Zeff.
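
    The "steady-state self-consistent solution" is in essence a joint fixed point of the coupled component models. A toy Picard-iteration sketch (the linear stand-in models and coefficients are invented for illustration; the real workflow iterates transport, equilibrium and pedestal codes):

    ```python
    def coupled_solve(core0, ped0, tol=1e-10, max_iter=200):
        """Toy Picard iteration to a joint core/pedestal fixed point.
        The two linear stand-in 'models' below are invented for illustration;
        the real workflow iterates transport, equilibrium and pedestal codes."""
        core, ped = core0, ped0
        for _ in range(max_iter):
            new_ped = 1.0 + 0.3 * core       # stand-in pedestal model
            new_core = 2.0 + 0.5 * new_ped   # stand-in core transport model
            if abs(new_core - core) < tol and abs(new_ped - ped) < tol:
                return new_core, new_ped
            core, ped = new_core, new_ped
        return core, ped

    core, ped = coupled_solve(0.0, 0.0)
    print(round(core, 3), round(ped, 3))  # -> 2.941 1.882
    ```

    Because each stand-in model is a contraction, the iteration converges in a few tens of steps; the real problem is strongly coupled in the same sense but with expensive physics components in place of the linear maps.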

  16. Integrated fusion simulation with self-consistent core-pedestal coupling

    SciTech Connect

    Meneghini, O.; Snyder, P. B.; Smith, S. P.; Candy, J.; Staebler, G. M.; Belli, E. A.; Lao, L. L.; Park, J. M.; Green, D. L.; Elwasif, W.; Grierson, B. A.; Holland, C.

    2016-04-20

    In this study, accurate prediction of fusion performance in present and future tokamaks requires taking into account the strong interplay between core transport, pedestal structure, current profile and plasma equilibrium. An integrated modeling workflow capable of calculating the steady-state self-consistent solution to this strongly coupled problem has been developed. The workflow leverages state-of-the-art components for collisional and turbulent core transport, equilibrium and pedestal stability. Validation against DIII-D discharges shows that the workflow is capable of robustly predicting the kinetic profiles (electron and ion temperature and electron density) from the axis to the separatrix in good agreement with the experiments. An example application is presented, showing self-consistent optimization for the fusion performance of the 15 MA D-T ITER baseline scenario as functions of the pedestal density and ion effective charge Zeff.

  17. Integrated fusion simulation with self-consistent core-pedestal coupling

    DOE PAGES

    Meneghini, O.; Snyder, P. B.; Smith, S. P.; ...

    2016-04-20

    In this study, accurate prediction of fusion performance in present and future tokamaks requires taking into account the strong interplay between core transport, pedestal structure, current profile and plasma equilibrium. An integrated modeling workflow capable of calculating the steady-state self-consistent solution to this strongly coupled problem has been developed. The workflow leverages state-of-the-art components for collisional and turbulent core transport, equilibrium and pedestal stability. Validation against DIII-D discharges shows that the workflow is capable of robustly predicting the kinetic profiles (electron and ion temperature and electron density) from the axis to the separatrix in good agreement with the experiments. An example application is presented, showing self-consistent optimization for the fusion performance of the 15 MA D-T ITER baseline scenario as functions of the pedestal density and ion effective charge Zeff.

  18. Inertial electrostatic confinement and DD fusion at interelectrode media of nanosecond vacuum discharge. PIC simulations and experiment

    NASA Astrophysics Data System (ADS)

    Kurilenkov, Yu K.; Tarakanov, V. P.; Skowronek, M.; Guskov, S. Yu; Dufty, J.

    2009-05-01

    The generation of energetic ions and DD neutrons from microfusion in the interelectrode space of a low-energy nanosecond vacuum discharge has been demonstrated recently [1, 2]. However, the physics of the fusion processes and some neutron-yield results from the accumulated database were poorly understood. The present work presents a detailed particle-in-cell (PIC) simulation of the discharge experimental conditions using a fully electrodynamic code. The dynamics of all charged particles were reconstructed in time and in anode-cathode (AC) space. The principal role of a virtual cathode (VC), and of the corresponding single and double potential wells formed in the interelectrode space, is recognized. The calculated depth of the quasistationary potential well (PW) of the VC is about 50-60 keV, and D+ ions trapped by this well accelerate up to the energies needed for collisional DD nuclear synthesis. The correlation between the calculated potential well structures (and dynamics) and the observed neutron yield is discussed. In particular, ions in the potential well undergo high-frequency (~80 MHz) harmonic oscillations accompanied by a corresponding oscillatory regime of neutron yield. Both experiment and PIC simulations illustrate favorable scaling of the fusion power density for the chosen IECF scheme based on a nanosecond vacuum discharge.
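
    The ~80 MHz figure is consistent with simple bounce motion in a deep electrostatic well. A back-of-the-envelope sketch (the parabolic well shape and radius are our assumptions; only the 50-60 keV depth range comes from the paper):

    ```python
    import math

    # Back-of-the-envelope sketch, not from the paper: bounce frequency of a
    # deuteron in an ASSUMED parabolic well phi(r) = U0*(r/a)^2, for which
    # m*r'' = -2*q*U0*r/a^2  =>  omega = sqrt(2*q*U0/(m*a^2)).
    q = 1.602e-19    # deuteron charge, C
    m_D = 3.344e-27  # deuteron mass, kg
    U0 = 55e3        # well depth in volts (paper reports 50-60 keV)
    a = 5e-3         # ASSUMED well radius, m (anode-cathode length scale)

    omega = math.sqrt(2.0 * q * U0 / (m_D * a**2))
    f_MHz = omega / (2.0 * math.pi) / 1e6
    print(round(f_MHz))  # tens of MHz, the same order as the reported ~80 MHz
    ```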

  19. Laser fusion

    SciTech Connect

    Smit, W.A.; Boskma, P.

    1980-12-01

    Unrestricted laser fusion offers nations an opportunity to circumvent arms control agreements and develop thermonuclear weapons. Early laser weapons research sought a clean radiation-free bomb to replace the fission bomb, but this was deceptive because a fission bomb was needed to trigger the fusion reaction and additional radioactivity was induced by generating fast neutrons. As laser-implosion experiments focused on weapons physics, simulating weapons effects, and applications for new weapons, the military interest shifted from developing a laser-ignited hydrogen bomb to more sophisticated weapons and civilian applications for power generation. Civilian and military research now overlap, making it possible for several countries to continue weapons activities and permitting proliferation of nuclear weapons. These countries are reluctant to include inertial confinement fusion research in the Non-Proliferation Treaty.

  20. The Jefferson Project: Large-eddy simulations of a watershed

    NASA Astrophysics Data System (ADS)

    Watson, C.; Cipriani, J.; Praino, A. P.; Treinish, L. A.; Tewari, M.; Kolar, H.

    2015-12-01

    The Jefferson Project is a new endeavor at Lake George, NY by IBM Research, Rensselaer Polytechnic Institute (RPI) and The Fund for Lake George. Lake George is an oligotrophic lake - one of low nutrients - and a 30-year study recently published by RPI's Darrin Fresh Water Institute highlighted that the lake's renowned water quality is declining from the injection of salt (from runoff), algae, and invasive species. In response, the Jefferson Project is developing a system to provide extensive data on the relevant physical, chemical and biological parameters that drive ecosystem function. The system will be capable of real-time observations and interactive modeling of the atmosphere, watershed hydrology, lake circulation and food web dynamics. In this presentation, we describe the development of the operational forecast system used to simulate the atmosphere in the model stack, Deep Thunder™ (a configuration of the ARW-WRF model). The model performs 48-hr forecasts twice daily in a nested configuration, and in this study we present results from ongoing tests where the innermost domains have grid spacings of 333 m and 111 m. We discuss the model's ability to simulate boundary layer processes, lake surface conditions (an input to the lake model), and precipitation (an input to the hydrology model) during different weather regimes, and the challenges of data assimilation and validation at this scale. We also explore the potential for additional nests over select regions of the watershed to better capture turbulent boundary layer motions.

  1. Tomographic data fusion with CFD simulations associated with a planar sensor

    NASA Astrophysics Data System (ADS)

    Liu, J.; Liu, S.; Sun, S.; Zhou, W.; Schlaberg, I. H. I.; Wang, M.; Yan, Y.

    2017-04-01

    Tomographic techniques have a great ability to interrogate combustion processes, especially when combined with physical models of the combustion itself. In this study, a data fusion algorithm is developed to investigate the flame distribution of a swirl-induced environmental (EV) burner, a new type of burner for low-NOx combustion. An electrical capacitance tomography (ECT) system is used to acquire 3D flame images, and computational fluid dynamics (CFD) is applied to calculate an initial distribution of the temperature profile for the EV burner. Experiments were also carried out to visualize flames at a series of locations above the burner. While the ECT images essentially agree with the CFD temperature distribution, discrepancies exist at certain heights. When data fusion is applied, the discrepancy is visibly reduced and the ECT images are improved. The methods used in this study can open a new route by which combustion visualization can be much improved and applied to clean energy conversion and new burner development.

  2. An Electrothermal Plasma Source Developed for Simulation of Transient Heat Loads in Future Large Fusion Devices

    NASA Astrophysics Data System (ADS)

    Gebhart, Trey; Baylor, Larry; Winfrey, Leigh

    2016-10-01

    The realization of fusion energy requires materials that can withstand high heat and particle fluxes at the plasma material interface. In this work, an electrothermal (ET) plasma source has been designed as a possible transient heat flux source for a linear plasma material interaction device. An ET plasma source operates in the ablative arc regime, which is driven by a DC capacitive discharge. The current travels through the 4 mm bore of a boron nitride liner and subsequently ablates and ionizes the liner material. This results in a high density plasma with a large unidirectional bulk flow out of the source exit. The pulse length for the ET source has been optimized using a pulse forming network to have a duration of 1 ms at full-width half-maximum. The peak currents and maximum source energies seen in this system are 2 kA and 5 kJ. The goal of this work is to show that the ET source produces electron densities and heat fluxes that are comparable to transient events in future large magnetic confinement fusion devices. Heat flux, plasma temperature, and plasma density were determined for each test shot using infrared imaging and optical spectroscopy techniques. This work will compare the ET source output (heat flux, temperature, and density) with and without an applied magnetic field. Research sponsored by the Laboratory Directed Research and Development Program of Oak Ridge National Laboratory, managed by UT-Battelle, LLC, for the U. S. Department of Energy.

  3. Membrane insertion of fusion peptides from Ebola and Marburg viruses studied by replica-exchange molecular dynamics simulations.

    PubMed

    Olson, Mark A; Lee, Michael S; Yeh, In-Chul

    2017-01-28

    This work presents replica-exchange molecular dynamics simulations of inserting a 16-residue Ebola virus fusion peptide into a membrane bilayer. A computational approach is applied for modeling the peptide at the explicit all-atom level and the membrane-aqueous bilayer by a generalized Born continuum model with a smoothed switching function (GBSW). We provide an assessment of the model calculations in terms of three metrics: (1) the ability to reproduce the NMR structure of the peptide determined in the presence of SDS micelles and comparable structural data on other fusion peptides; (2) determination of the effects of the mutation Trp-8 to Ala and sequence discrimination of the homologous Marburg virus; and (3) calculation of potentials of mean force for estimating the partitioning free energy and their comparison to predictions from the Wimley-White interfacial hydrophobicity scale. We found the GBSW implicit membrane model to produce results of limited accuracy in conformational properties of the peptide when compared to the NMR structure, yet the model resolution is sufficient to determine the effect of sequence differentiation on peptide-membrane integration.
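
    Replica exchange itself rests on the standard parallel-tempering Metropolis swap criterion; a generic sketch (not specific to the GBSW setup in the paper):

    ```python
    import math
    import random

    KB = 0.0019872  # Boltzmann constant, kcal/(mol*K)

    def swap_accept(E_i, E_j, T_i, T_j):
        """Parallel-tempering Metropolis criterion: accept exchanging the
        configurations of replicas i and j with probability
        min(1, exp[(beta_i - beta_j) * (E_i - E_j)])."""
        delta = (1.0 / (KB * T_i) - 1.0 / (KB * T_j)) * (E_i - E_j)
        return delta >= 0.0 or random.random() < math.exp(delta)

    # A swap that hands the colder replica the lower energy is always accepted:
    print(swap_accept(E_i=-80.0, E_j=-120.0, T_i=300.0, T_j=350.0))  # -> True
    ```

    Periodically attempting such swaps between neighboring temperatures lets conformations trapped at low temperature escape via the high-temperature replicas, which is what makes the method effective for peptide-insertion sampling.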

  4. NASA GRC UAS Project: Communications Modeling and Simulation Status

    NASA Technical Reports Server (NTRS)

    Kubat, Greg

    2013-01-01

    The integration of Unmanned Aircraft Systems (UAS) in the National Airspace represents new operational concepts required in civil aviation. These new concepts are evolving as the nation moves toward the Next Generation Air Transportation System (NextGen) under the leadership of the Joint Planning and Development Office (JPDO), and through ongoing work by the Federal Aviation Administration (FAA). The desire and ability to fly UAS in the National Air Space (NAS) in the near term has increased dramatically, and this multi-agency effort to develop and implement a national plan to successfully address the challenges of UAS access to the NAS in a safe and timely manner is well underway. As part of the effort to integrate UAS in the National Airspace, NASA Glenn Research Center is currently involved with providing research into Communications systems and Communication system operations in order to assist with developing requirements for this implementation. In order to provide data and information regarding communication systems performance that will be necessary, NASA GRC is tasked with developing and executing plans for simulations of candidate future UAS command and control communications, in line with architectures and communications technologies being developed and/or proposed by NASA and relevant aviation organizations (in particular, RTCA SC-203). The simulations and related analyses will provide insight into the ability of proposed communications technologies and system architectures to enable safe operation of UAS, meeting UAS in the NAS project goals (including performance requirements, scalability, and interoperability), and ultimately leading to a determination of the ability of NextGen communication systems to accommodate UAS. 
This presentation, compiled by the NASA GRC team, provides a view of the overall planned simulation effort and its objectives, a description of the simulation concept, and the status of the design and development work to date.

  5. Accuracy and convergence of coupled finite-volume/Monte Carlo codes for plasma edge simulations of nuclear fusion reactors

    SciTech Connect

    Ghoos, K.; Dekeyser, W.; Samaey, G.; Börner, P.; Baelmans, M.

    2016-10-01

    The plasma and neutral transport in the plasma edge of a nuclear fusion reactor is usually simulated using coupled finite volume (FV)/Monte Carlo (MC) codes. However, under conditions of future reactors like ITER and DEMO, convergence issues become apparent. This paper examines the convergence behaviour and the numerical error contributions with a simplified FV/MC model for three coupling techniques: Correlated Sampling, Random Noise and Robbins-Monro. Also, practical procedures to estimate the errors in complex codes are proposed. Moreover, first results with more complex models show that an order of magnitude speedup can be achieved without any loss in accuracy by making use of averaging in the Random Noise coupling technique.
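
    Of the three coupling techniques, Robbins-Monro is the easiest to sketch: successive noisy Monte Carlo evaluations are blended with decaying weights so the iteration averages the noise away rather than resolving it within each call. A toy stand-in (the "MC code" here is just a noisy constant, invented for illustration):

    ```python
    import random

    def robbins_monro(noisy_eval, x0, iters=2000, seed=1):
        """Robbins-Monro stochastic approximation of the fixed point
        x = E[g(x)]: x_{n+1} = x_n + a_n * (g(x_n) - x_n) with decaying
        gains a_n = 1/n, so Monte Carlo noise is averaged across iterations
        rather than resolved within each evaluation."""
        rng = random.Random(seed)
        x = x0
        for n in range(1, iters + 1):
            x += (noisy_eval(x, rng) - x) / n
        return x

    # Toy stand-in for a noisy MC evaluation: the true answer plus noise.
    target = 3.0
    estimate = robbins_monro(lambda x, rng: target + rng.gauss(0.0, 0.5), x0=0.0)
    print(abs(estimate - target) < 0.1)  # -> True (noise averaged out)
    ```

    With gains 1/n the iterate is exactly the running mean of the noisy evaluations, so the residual statistical error shrinks like 1/sqrt(n) without ever requiring a low-noise MC run.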

  6. A fully non-linear multi-species Fokker–Planck–Landau collision operator for simulation of fusion plasma

    SciTech Connect

    Hager, Robert; Yoon, E.S.; Ku, S.; D'Azevedo, E.F.; Worley, P.H.; Chang, C.S.

    2016-06-15

    Fusion edge plasmas can be far from thermal equilibrium and require the use of a non-linear collision operator for accurate numerical simulations. In this article, the non-linear single-species Fokker–Planck–Landau collision operator developed by Yoon and Chang (2014) [9] is generalized to include multiple particle species. The finite volume discretization used in this work naturally yields exact conservation of mass, momentum, and energy. The implementation of this new non-linear Fokker–Planck–Landau operator in the gyrokinetic particle-in-cell codes XGC1 and XGCa is described and results of a verification study are discussed. Finally, the numerical techniques that make our non-linear collision operator viable on high-performance computing systems are described, including specialized load balancing algorithms and nested OpenMP parallelization. The collision operator's good weak and strong scaling behavior is shown.

  7. A fully non-linear multi-species Fokker–Planck–Landau collision operator for simulation of fusion plasma

    SciTech Connect

    Hager, Robert; Yoon, E. S.; Ku, S.; D'Azevedo, E. F.; Worley, P. H.; Chang, C. S.

    2016-04-04

    Fusion edge plasmas can be far from thermal equilibrium and require the use of a non-linear collision operator for accurate numerical simulations. The non-linear single-species Fokker–Planck–Landau collision operator developed by Yoon and Chang (2014) [9] is generalized to include multiple particle species. Moreover, the finite volume discretization used in this work naturally yields exact conservation of mass, momentum, and energy. The implementation of this new non-linear Fokker–Planck–Landau operator in the gyrokinetic particle-in-cell codes XGC1 and XGCa is described and results of a verification study are discussed. Finally, the numerical techniques that make our non-linear collision operator viable on high-performance computing systems are described, including specialized load balancing algorithms and nested OpenMP parallelization. As a result, the collision operator's good weak and strong scaling behavior is shown.

  8. A fully non-linear multi-species Fokker–Planck–Landau collision operator for simulation of fusion plasma

    DOE PAGES

    Hager, Robert; Yoon, E. S.; Ku, S.; ...

    2016-04-04

    Fusion edge plasmas can be far from thermal equilibrium and require the use of a non-linear collision operator for accurate numerical simulations. The non-linear single-species Fokker–Planck–Landau collision operator developed by Yoon and Chang (2014) [9] is generalized to include multiple particle species. Moreover, the finite volume discretization used in this work naturally yields exact conservation of mass, momentum, and energy. The implementation of this new non-linear Fokker–Planck–Landau operator in the gyrokinetic particle-in-cell codes XGC1 and XGCa is described and results of a verification study are discussed. Finally, the numerical techniques that make our non-linear collision operator viable on high-performance computing systems are described, including specialized load balancing algorithms and nested OpenMP parallelization. As a result, the collision operator's good weak and strong scaling behavior is shown.

  9. A fully non-linear multi-species Fokker-Planck-Landau collision operator for simulation of fusion plasma

    NASA Astrophysics Data System (ADS)

    Hager, Robert; Yoon, E. S.; Ku, S.; D'Azevedo, E. F.; Worley, P. H.; Chang, C. S.

    2016-06-01

    Fusion edge plasmas can be far from thermal equilibrium and require the use of a non-linear collision operator for accurate numerical simulations. In this article, the non-linear single-species Fokker-Planck-Landau collision operator developed by Yoon and Chang (2014) [9] is generalized to include multiple particle species. The finite volume discretization used in this work naturally yields exact conservation of mass, momentum, and energy. The implementation of this new non-linear Fokker-Planck-Landau operator in the gyrokinetic particle-in-cell codes XGC1 and XGCa is described and results of a verification study are discussed. Finally, the numerical techniques that make our non-linear collision operator viable on high-performance computing systems are described, including specialized load balancing algorithms and nested OpenMP parallelization. The collision operator's good weak and strong scaling behavior is shown.
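
    A toy contrast may clarify why the exact conservation property of the finite-volume discretization matters; this BGK-style sketch (not the actual XGC1/XGCa operator) shows that even discrete density conservation requires constructing the discrete Maxwellian consistently with the grid:

    ```python
    import math

    # Toy BGK-style relaxation, NOT the actual XGC1/XGCa Landau operator.
    # Step: f -> f + dt*nu*(fM - f). If the discrete Maxwellian fM is
    # rescaled to carry exactly the discrete density of f, density is
    # conserved to round-off; momentum and energy remain only approximate,
    # which is what a conservative finite-volume construction fixes.
    dv = 0.1
    v = [-3.0 + dv * i for i in range(61)]
    f = [math.exp(-(vi - 0.4) ** 2) * (1.0 + 0.3 * math.sin(3.0 * vi)) for vi in v]

    def moments(g):
        """Discrete density, mean velocity, and temperature of g on the grid."""
        n = sum(g) * dv
        u = sum(gi * vi for gi, vi in zip(g, v)) * dv / n
        T = sum(gi * (vi - u) ** 2 for gi, vi in zip(g, v)) * dv / n
        return n, u, T

    n, u, T = moments(f)
    fM = [math.exp(-(vi - u) ** 2 / (2.0 * T)) for vi in v]
    scale = n / (sum(fM) * dv)          # enforce exact discrete density
    fM = [scale * fMi for fMi in fM]

    dt_nu = 0.5
    f_new = [fi + dt_nu * (fMi - fi) for fi, fMi in zip(f, fM)]
    dn = moments(f_new)[0] - n
    print(abs(dn) < 1e-12)  # -> True: density conserved by construction
    ```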

  10. NASA GRC UAS Project - Communications Modeling and Simulation Development Status

    NASA Technical Reports Server (NTRS)

    Apaza, Rafael; Bretmersky, Steven; Dailey, Justin; Satapathy, Goutam; Ditzenberger, David; Ye, Chris; Kubat, Greg; Chevalier, Christine; Nguyen, Thanh

    2014-01-01

    The integration of Unmanned Aircraft Systems (UAS) in the National Airspace represents new operational concepts required in civil aviation. These new concepts are evolving as the nation moves toward the Next Generation Air Transportation System (NextGen) under the leadership of the Joint Planning and Development Office (JPDO), and through ongoing work by the Federal Aviation Administration (FAA). The desire and ability to fly UAS in the National Air Space (NAS) in the near term has increased dramatically, and this multi-agency effort to develop and implement a national plan to successfully address the challenges of UAS access to the NAS in a safe and timely manner is well underway. As part of the effort to integrate UAS in the National Airspace, NASA Glenn Research Center is currently involved with providing research into Communications systems and Communication system operations in order to assist with developing requirements for this implementation. In order to provide data and information regarding communication systems performance that will be necessary, NASA GRC is tasked with developing and executing plans for simulations of candidate future UAS command and control communications, in line with architectures and communications technologies being developed and/or proposed by NASA and relevant aviation organizations (in particular, RTCA SC-203). The simulations and related analyses will provide insight into the ability of proposed communications technologies and system architectures to enable safe operation of UAS, meeting UAS in the NAS project goals (including performance requirements, scalability, and interoperability), and ultimately leading to a determination of the ability of NextGen communication systems to accommodate UAS. This presentation, compiled by the NASA GRC Modeling and Simulation team, will provide an update to this ongoing effort at NASA GRC as follow-up to the overview of the planned simulation effort presented at ICNS in 2013. 
The objective

  11. GUMICS4 Synthetic and Dynamic Simulations of the ECLAT Project

    NASA Astrophysics Data System (ADS)

    Facsko, G.; Palmroth, M. M.; Gordeev, E.; Hakkinen, L. V.; Honkonen, I. J.; Janhunen, P.; Sergeev, V. A.; Kauristie, K.; Milan, S. E.

    2012-12-01

    The European Commission funded the European Cluster Assimilation Techniques (ECLAT) project as a collaboration of five leading European universities and research institutes. A main contribution of the Finnish Meteorological Institute (FMI) is to provide a wide range of global MHD runs with the Grand Unified Magnetosphere Ionosphere Coupling simulation (GUMICS). The runs are divided into two categories: synthetic runs investigating the extent of solar wind drivers that can influence magnetospheric dynamics, and dynamic runs using measured solar wind data as input. Here we consider the first set of runs, with synthetic solar wind input. The solar wind density, velocity and interplanetary magnetic field had different magnitudes and orientations; furthermore, two F10.7 flux values were selected for solar radiation minimum and maximum. The solar wind parameter values were held constant so that a stable steady-state solution was achieved. All configurations were run several times with three different tilt angles (-15°, 0°, +15°) in the GSE X-Z plane. The Cray XT supercomputer of the FMI provides a unique opportunity in global magnetohydrodynamic simulation: running GUMICS-4 on one year of real solar wind data. Solar wind magnetic field, density, temperature and velocity data based on Advanced Composition Explorer (ACE) and WIND measurements are downloaded from the OMNIWeb open database, and a special input file is created for each Cluster orbit. All data gaps are replaced with linear interpolations between the last valid data value before and the first valid data value after the gap. A minimum variance transformation is applied to the interplanetary magnetic field data to clean it and to avoid divergence in the code. The Cluster orbits are divided into slices, allowing parallel computation, and each slice has an average tilt angle value. The file timestamps start one hour before perigee to provide time for building up a magnetosphere in the simulation space. The real
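
    The gap-filling step described above is plain linear interpolation between the bounding valid samples; a minimal sketch with a hypothetical helper (not FMI's actual code):

    ```python
    def fill_gaps(series):
        """Linearly interpolate None gaps between the last valid value before
        and the first valid value after each gap; gaps touching either end of
        the series are left untouched."""
        out = list(series)
        i = 0
        while i < len(out):
            if out[i] is None:
                j = i
                while j < len(out) and out[j] is None:
                    j += 1
                if i > 0 and j < len(out):      # gap bounded by valid data
                    a, b = out[i - 1], out[j]
                    span = j - (i - 1)
                    for k in range(i, j):
                        out[k] = a + (b - a) * (k - (i - 1)) / span
                i = j
            else:
                i += 1
        return out

    print(fill_gaps([1.0, None, None, 4.0]))  # -> [1.0, 2.0, 3.0, 4.0]
    ```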

  12. MULTI-IFE-A one-dimensional computer code for Inertial Fusion Energy (IFE) target simulations

    NASA Astrophysics Data System (ADS)

    Ramis, R.; Meyer-ter-Vehn, J.

    2016-06-01

    The code MULTI-IFE is a numerical tool devoted to the study of Inertial Fusion Energy (IFE) microcapsules. It includes the relevant physics for the implosion and thermonuclear ignition and burning: hydrodynamics of two-component plasmas (ions and electrons), three-dimensional laser light ray-tracing, thermal diffusion, multigroup radiation transport, deuterium-tritium burning, and alpha particle diffusion. The corresponding differential equations are discretized in spherical one-dimensional Lagrangian coordinates. Two typical application examples, a high-gain laser-driven capsule and a low-gain radiation-driven marginally igniting capsule, are discussed. In addition to phenomena relevant for IFE, the code also includes components (planar and cylindrical geometries, transport coefficients at low temperature, explicit treatment of Maxwell's equations) that extend its range of applicability to laser-matter interaction at moderate intensities (<1016 W cm-2). The source code design has been kept simple and structured with the aim of encouraging users' modifications for specialized purposes.
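
    As context for the deuterium-tritium burn component, a commonly quoted rule-of-thumb for the DT burn-up fraction of an igniting capsule can be evaluated as below (the burn-parameter value H_B is an assumption, typically cited as ~6-7 g/cm² at burn temperatures; this is not MULTI-IFE's burn model):

    ```python
    def burn_fraction(rho_R, H_B=7.0):
        """Rule-of-thumb DT burn-up fraction: f ~ rho*R / (rho*R + H_B),
        with the areal density rho*R and the burn parameter H_B both in
        g/cm^2 (H_B ~ 6-7 g/cm^2 at typical burn temperatures; assumed)."""
        return rho_R / (rho_R + H_B)

    for rho_R in (0.3, 1.0, 3.0):
        print(f"rho*R = {rho_R:4.1f} g/cm^2 -> burn fraction ~ {burn_fraction(rho_R):.2f}")
    ```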

  13. Neutral Buoyancy Simulator-EASE Project (NB32)

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Once the United States' space program had progressed from Earth's orbit into outer space, the prospect of building and maintaining a permanent presence in space was realized. To accomplish this feat, NASA launched a temporary workstation, Skylab, to discover the effects of low gravity and weightlessness on the human body, and also to develop tools and equipment that would be needed in the future to build and maintain a more permanent space station. The structures, techniques, and work schedules had to be carefully designed to fit this unique construction site. The components had to be lightweight for transport into orbit, yet durable. The station also had to be made with removable parts for easy servicing and repairs by astronauts. All of the tools necessary for service and repairs had to be designed for easy manipulation by a suited astronaut. Construction methods had to be efficient due to the limited time the astronauts could remain outside their controlled environment. In light of all the specific needs of this project, an environment on Earth had to be developed that could simulate a low gravity atmosphere. A Neutral Buoyancy Simulator (NBS) was constructed by NASA's Marshall Space Flight Center (MSFC) in 1968. Since then, NASA scientists have used this facility to understand how humans work best in low gravity and also to provide information about the different kinds of structures that can be built. Pictured is a Massachusetts Institute of Technology (MIT) student working in a spacesuit on the Experimental Assembly of Structures in Extravehicular Activity (EASE) project, which was developed as a joint effort between MSFC and MIT. The EASE experiment required that crew members assemble small components to form larger components, working from the payload bay of the space shuttle. The MIT student in this photo is assembling two six-beam tetrahedrons.

  14. Computer simulations for minds-on learning with ``Project Spectra!''

    NASA Astrophysics Data System (ADS)

    Wood, E. L.; Renfrow, S.; Marks, N.; Christofferson, R.

    2010-12-01

    How do we gain information about the Sun? How do we know Mars has CO2 or that Titan has a nitrogen-rich atmosphere? How do we use light in astronomy? These concepts are something education professionals generally struggle with because they are abstract. Making use of visualizations and presenting material so it can be manipulated is the easiest way to conquer abstractions and bring them home to students. Using simulations and computer interactives (games) where students experience and manipulate the information makes concepts accessible. “Project Spectra!” is a science and engineering program that uses computer-based Flash interactives to expose students to astronomical spectroscopy and actual data in a way that is not possible with traditional in-class activities. Visualizing lessons with multi-media is a way to solidify understanding and retention of knowledge and is completely unlike its paper-and-pencil counterpart. To engage students in “Project Spectra!”, students are given a mission, which connects them with the research at hand. Missions range from exploring remote planetary atmospheres and surfaces, to experimenting with the Sun using different filters, to comparing spectroscopic atmospheric features between different bodies. Additionally, students have an opportunity to learn about NASA missions, view movies, and see images connected with their mission. In the end, students are asked critical thinking questions and conduct web-based research. These interactives complement the in-class activities where students engineer spectrographs and explore the electromagnetic spectrum.

  15. Susitna Hydroelectric Project: terrestrial environmental workshop and preliminary simulation model

    USGS Publications Warehouse

    Everitt, Robert R.; Sonntag, Nicholas C.; Auble, Gregory T.; Roelle, James E.; Gazey, William

    1982-01-01

    The technical feasibility, economic viability, and environmental impacts of a hydroelectric development project in the Susitna River Basin are being studied by Acres American, Inc. on behalf of the Alaska Power Authority. As part of these studies, Acres American recently contracted LGL Alaska Research Associates, Inc. to coordinate the terrestrial environmental studies being performed by the Alaska Department of Fish and Game and, as subcontractors to LGL, several University of Alaska research groups. LGL is responsible for further quantifying the potential impacts of the project on terrestrial wildlife and vegetation, and for developing a plan to mitigate adverse impacts on the terrestrial environment. The impact assessment and mitigation plan will be included as part of a license application to the Federal Energy Regulatory Commission (FERC) scheduled for the first quarter of 1983. The quantification of impacts, mitigation planning, and design of future research is being organized using a computer simulation modelling approach. Through a series of workshops attended by researchers, resource managers, and policy-makers, a computer model is being developed and refined for use in the quantification of impacts on terrestrial wildlife and vegetation, and for evaluating different mitigation measures such as habitat enhancement and the designation of replacement lands to be managed as wildlife habitat. This report describes the preliminary model developed at the first workshop held August 23-27, 1982 in Anchorage.

  16. Simulation and Experimental Study on the Efficiency of Traveling Wave Direct Energy Conversion for Application to Aneutronic Fusion Reactions

    NASA Astrophysics Data System (ADS)

    Tarditi, Alfonso; Chap, Andrew; Miley, George; Scott, John

    2013-10-01

    A study based on both particle-in-cell (PIC) simulations and experiments is under way to investigate the physics of the Traveling Wave Direct Energy Converter (TWDEC), with a view toward application to aneutronic fusion reaction products and space propulsion. The PIC model investigates the key TWDEC physics processes in detail by simulating the time-dependent transfer of energy from the ion beam to an electric load connected to ring-type electrodes in cylindrical symmetry. An experimental effort is in progress on a TWDEC test article at NASA Johnson Space Center with the purpose of studying the conditions for improving the efficiency of the direct energy conversion process. Using a scaled-down ion energy source, the experiment is primarily focused on the effect of the (bunched) beam density on the efficiency and on the optimization of the electrode design. The simulation model is guiding the development of the experimental configuration and will provide details of the beam dynamics for direct comparison with experimental diagnostics. Work supported by NASA, Johnson Space Center.

  17. Models and Simulations of C60-Fullerene Plasma Jets for Disruption Mitigation and Magneto-Inertial Fusion

    NASA Astrophysics Data System (ADS)

    Bogatu, Ioan-Niculae; Galkin, Sergei A.; Kim, Jin-Soo

    2009-11-01

    We present the models and simulation results of C60-fullerene plasma jets proposed to be used for disruption mitigation on ITER and for magneto-inertial fusion (MIF). The model describing the fast production of a large mass of C60 molecular gas in the pulsed power source by explosive sublimation of C60 micro-grains is detailed. Several aspects of the magnetic ``piston'' model and the 2D interchange (magnetic Rayleigh-Taylor) instability in the rail gun arc dynamics are described. A plasma jet adiabatic expansion model is used to investigate the in-flight three-body recombination during jet transport to the plasma boundary. Our LSP PIC code 3D simulations show that the heavy C60 plasmoid penetrates deeply through a transverse magnetic barrier, demonstrating self-polarization and magnetic field expulsion effects. The LSP code 3D simulation of head-on injection of two plasma jets along magnetic field lines for MIF is also discussed.

  18. Mixed Waste Treatment Project: Computer simulations of integrated flowsheets

    SciTech Connect

    Dietsche, L.J.

    1993-12-01

    The disposal of mixed waste, that is, waste containing both hazardous and radioactive components, is a challenging waste management problem of particular concern to DOE sites throughout the United States. Traditional technologies used for the destruction of hazardous wastes need to be re-evaluated for their ability to handle mixed wastes, and in some cases new technologies need to be developed. The Mixed Waste Treatment Project (MWTP) was set up by DOE's Waste Operations Program (EM30) to provide guidance on mixed waste treatment options. One of MWTP's charters is to develop flowsheets for prototype integrated mixed waste treatment facilities which can serve as models for sites developing their own treatment strategies. Evaluation of these flowsheets is being facilitated through the use of computer modelling. The objective of the flowsheet simulations is to provide mass and energy balances, product compositions, and equipment sizing (leading to cost) information. The modelled flowsheets need to be easily modified to examine how alternative technologies and varying feed streams affect the overall integrated process. One such commercially available simulation program is ASPEN PLUS. This report contains details of the ASPEN PLUS program.
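
    As a toy illustration of the mass-balance information such a flowsheet simulation provides (hypothetical unit names, feed rate, and split fractions; this is not the ASPEN PLUS input language or its models):

    ```python
    # A minimal steady-state mass-balance sketch for a two-unit flowsheet.
    # Every stream leaving the flowsheet must be accounted for by the feed.

    def split(feed_kg_h, fraction_to_first):
        """Split a stream into two outlets by mass fraction."""
        first = feed_kg_h * fraction_to_first
        return first, feed_kg_h - first

    feed = 1000.0                              # kg/h mixed-waste feed (assumed)
    to_thermal, to_aqueous = split(feed, 0.60) # 60% routed to thermal treatment
    offgas, ash = split(to_thermal, 0.85)      # 85% of that leaves as off-gas

    # Overall balance must close: feed equals the sum of all outlet streams
    total_out = offgas + ash + to_aqueous
    print(f"feed {feed:.0f} kg/h -> off-gas {offgas:.0f}, ash {ash:.0f}, "
          f"aqueous {to_aqueous:.0f}; closure error {feed - total_out:.1e} kg/h")
    ```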

  19. Appreciating the Complexity of Project Management Execution: Using Simulation in the Classroom

    ERIC Educational Resources Information Center

    Hartman, Nathan S.; Watts, Charles A.; Treleven, Mark D.

    2013-01-01

    As the popularity and importance of project management increase, so does the need for well-prepared project managers. This article discusses our experiences using a project management simulation in undergraduate and MBA classes to help students better grasp the complexity of project management. This approach gives students hands-on experience with…

  20. Hydrogen species in diamond: Molecular dynamics simulation in bulk diamond for fusion applications

    NASA Astrophysics Data System (ADS)

    Delgado, D.; Vila, R.

    2014-09-01

    For an electron cyclotron resonance heating system, a diamond window seems to be the only material able to withstand the high microwave power and radiation effects and at the same time act as a tritium barrier. Therefore it is important to understand the evolution of hydrogen isotopes in diamond. Both hydrogen content and radiation can quite rapidly degrade its excellent properties. Hydrogen isotopes can be introduced into the material by two processes: (1) during the growth process of synthetic samples and (2) as a neutron radiation effect when devices are exposed to a fusion irradiation environment. In the latter case, both device performance (thermal, optical and dielectric properties degradation) and hands-on maintenance of the window (tritium inventory) demand a good knowledge of hydrogen species concentrations and their evolution with lattice damage. In this paper, a classical molecular dynamics study analyses the hydrogen equilibrium sites in diamond, and also their bulk and interstitial vibrational characteristics, including isotopic shifts. Some interesting results are presented and discussed. We confirm that the bond-centred site is the most stable configuration for H. Vibrational studies show lines in the C-H stretching region. Isotopic studies reveal ratios close to the theoretical ones for BC and ET sites. On the contrary, the AB site vibrations obtained suggest the existence of a local carbon oscillation.
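
    The theoretical harmonic isotope-shift ratio referred to above can be estimated from reduced masses alone (a simple diatomic approximation treating the C-H unit as an isolated oscillator with an isotope-independent force constant; integer atomic masses assumed):

    ```python
    from math import sqrt

    def reduced_mass(m1, m2):
        return m1 * m2 / (m1 + m2)

    m_C, m_H, m_D = 12.0, 1.0, 2.0  # atomic masses in amu (integer approximation)

    # Harmonic diatomic model: frequency ~ sqrt(k/mu), so for the same
    # C-H / C-D bond force constant k the isotopic frequency ratio is
    ratio = sqrt(reduced_mass(m_C, m_H) / reduced_mass(m_C, m_D))
    print(f"nu(C-D)/nu(C-H) ~ {ratio:.3f}")  # ~0.734 in this simple model
    ```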

  1. Plasma viscosity with mass transport in spherical inertial confinement fusion implosion simulations

    NASA Astrophysics Data System (ADS)

    Vold, E. L.; Joglekar, A. S.; Ortega, M. I.; Moll, R.; Fenn, D.; Molvig, K.

    2015-11-01

    The effects of viscosity and small-scale atomic-level mixing on plasmas in inertial confinement fusion (ICF) currently represent challenges in ICF research. Many current ICF hydrodynamic codes ignore the effects of viscosity though recent research indicates viscosity and mixing by classical transport processes may have a substantial impact on implosion dynamics. We have implemented a Lagrangian hydrodynamic code in one-dimensional spherical geometry with plasma viscosity and mass transport and including a three temperature model for ions, electrons, and radiation treated in a gray radiation diffusion approximation. The code is used to study ICF implosion differences with and without plasma viscosity and to determine the impacts of viscosity on temperature histories and neutron yield. It was found that plasma viscosity has substantial impacts on ICF shock dynamics characterized by shock burn timing, maximum burn temperatures, convergence ratio, and time history of neutron production rates. Plasma viscosity reduces the need for artificial viscosity to maintain numerical stability in the Lagrangian formulation and also modifies the flux-limiting needed for electron thermal conduction.
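
    The artificial viscosity mentioned here is, in many Lagrangian hydrodynamic codes, the standard von Neumann-Richtmyer quadratic term; a minimal sketch follows (coefficient value assumed, hypothetical helper; this is not the authors' code):

    ```python
    import numpy as np

    def artificial_viscosity(rho, du, c_q=2.0):
        """Von Neumann-Richtmyer quadratic artificial viscosity for a 1D
        Lagrangian cell: q = c_q^2 * rho * du^2 when the cell is compressing
        (du < 0), zero otherwise. c_q ~ 2 is a typical (assumed) coefficient."""
        du = np.asarray(du, dtype=float)
        q = c_q**2 * np.asarray(rho, dtype=float) * du**2
        return np.where(du < 0.0, q, 0.0)

    rho = np.array([1.0, 1.0, 1.0])
    du  = np.array([-0.5, 0.0, 0.3])  # velocity jump across each cell
    print(artificial_viscosity(rho, du))  # nonzero only in the compressing cell
    ```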

  2. Plasma viscosity with mass transport in spherical inertial confinement fusion implosion simulations

    SciTech Connect

    Vold, Erik Lehman; Joglekar, Archis S.; Ortega, Mario I.; Moll, Ryan; Fenn, Daniel; Molvig, Kim

    2015-11-20

    The effects of viscosity and small-scale atomic-level mixing on plasmas in inertial confinement fusion (ICF) currently represent challenges in ICF research. Many current ICF hydrodynamic codes ignore the effects of viscosity though recent research indicates viscosity and mixing by classical transport processes may have a substantial impact on implosion dynamics. In this paper, we have implemented a Lagrangian hydrodynamic code in one-dimensional spherical geometry with plasma viscosity and mass transport and including a three temperature model for ions, electrons, and radiation treated in a gray radiation diffusion approximation. The code is used to study ICF implosion differences with and without plasma viscosity and to determine the impacts of viscosity on temperature histories and neutron yield. It was found that plasma viscosity has substantial impacts on ICF shock dynamics characterized by shock burn timing, maximum burn temperatures, convergence ratio, and time history of neutron production rates. Finally, plasma viscosity reduces the need for artificial viscosity to maintain numerical stability in the Lagrangian formulation and also modifies the flux-limiting needed for electron thermal conduction.

  3. Plasma viscosity with mass transport in spherical inertial confinement fusion implosion simulations

    DOE PAGES

    Vold, Erik Lehman; Joglekar, Archis S.; Ortega, Mario I.; ...

    2015-11-20

    The effects of viscosity and small-scale atomic-level mixing on plasmas in inertial confinement fusion (ICF) currently represent challenges in ICF research. Many current ICF hydrodynamic codes ignore the effects of viscosity though recent research indicates viscosity and mixing by classical transport processes may have a substantial impact on implosion dynamics. In this paper, we have implemented a Lagrangian hydrodynamic code in one-dimensional spherical geometry with plasma viscosity and mass transport and including a three temperature model for ions, electrons, and radiation treated in a gray radiation diffusion approximation. The code is used to study ICF implosion differences with and without plasma viscosity and to determine the impacts of viscosity on temperature histories and neutron yield. It was found that plasma viscosity has substantial impacts on ICF shock dynamics characterized by shock burn timing, maximum burn temperatures, convergence ratio, and time history of neutron production rates. Finally, plasma viscosity reduces the need for artificial viscosity to maintain numerical stability in the Lagrangian formulation and also modifies the flux-limiting needed for electron thermal conduction.

  4. Plasma viscosity with mass transport in spherical inertial confinement fusion implosion simulations

    SciTech Connect

    Vold, E. L.; Molvig, K.; Joglekar, A. S.; Ortega, M. I.; Moll, R.; Fenn, D.

    2015-11-15

    The effects of viscosity and small-scale atomic-level mixing on plasmas in inertial confinement fusion (ICF) currently represent challenges in ICF research. Many current ICF hydrodynamic codes ignore the effects of viscosity though recent research indicates viscosity and mixing by classical transport processes may have a substantial impact on implosion dynamics. We have implemented a Lagrangian hydrodynamic code in one-dimensional spherical geometry with plasma viscosity and mass transport and including a three temperature model for ions, electrons, and radiation treated in a gray radiation diffusion approximation. The code is used to study ICF implosion differences with and without plasma viscosity and to determine the impacts of viscosity on temperature histories and neutron yield. It was found that plasma viscosity has substantial impacts on ICF shock dynamics characterized by shock burn timing, maximum burn temperatures, convergence ratio, and time history of neutron production rates. Plasma viscosity reduces the need for artificial viscosity to maintain numerical stability in the Lagrangian formulation and also modifies the flux-limiting needed for electron thermal conduction.

  5. Experiments with Memory-to-Memory Coupling for End-to-End fusion Simulation Workflows

    SciTech Connect

    Docan, Ciprian; Zhang, Fan; Parashar, Manish; Cummings, Julian; Podhorszki, Norbert; Klasky, Scott A

    2010-01-01

    Scientific applications are striving to accurately simulate multiple interacting physical processes that comprise complex phenomena being modeled. Efficient and scalable parallel implementations of these coupled simulations present challenging interaction and coordination requirements, especially when the coupled physical processes are computationally heterogeneous and progress at different speeds. In this paper, we present the design, implementation and evaluation of a memory-to-memory coupling framework for coupled scientific simulations on high-performance parallel computing platforms. The framework is driven by the coupling requirements of the Center for Plasma Edge Simulation, and it provides simple coupling abstractions as well as efficient asynchronous (RDMA-based) memory-to-memory data transport mechanisms that complement existing parallel programming systems and data sharing frameworks. The framework enables flexible coupling behaviors that are asynchronous in time and space, and it supports dynamic coupling between heterogeneous simulation processes without enforcing any synchronization constraints. We evaluate the performance and scalability of the coupling framework using a specific coupling scenario, on the Jaguar Cray XT5 system at Oak Ridge National Laboratory.
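
    A toy sketch of the asynchronous producer/consumer pattern such coupling enables, with Python threads and a queue standing in for the RDMA-based memory-to-memory transport (illustrative only; this is not the framework's API):

    ```python
    import threading
    import queue

    # The faster "edge" code publishes its state every step; the slower
    # "core" code drains the buffer and keeps only the newest state, so
    # neither side blocks the other on a step-by-step lockstep schedule.
    coupling_buffer = queue.Queue()

    def fast_producer(steps):
        for step in range(steps):
            coupling_buffer.put({"step": step})

    def slow_consumer(results, last_step):
        latest = None
        while latest is None or latest["step"] < last_step:
            latest = coupling_buffer.get()  # drain; keep the newest item
        results.append(latest)

    results = []
    p = threading.Thread(target=fast_producer, args=(10,))
    c = threading.Thread(target=slow_consumer, args=(results, 9))
    p.start(); c.start(); p.join(); c.join()
    print(results[0]["step"])  # -> 9: the consumer ends with the final state
    ```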

  6. A simulation-based and analytic analysis of the off-Hugoniot response of alternative inertial confinement fusion ablator materials

    NASA Astrophysics Data System (ADS)

    Moore, Alastair S.; Prisbrey, Shon; Baker, Kevin L.; Celliers, Peter M.; Fry, Jonathan; Dittrich, Thomas R.; Wu, Kuang-Jen J.; Kervin, Margaret L.; Schoff, Michael E.; Farrell, Mike; Nikroo, Abbas; Hurricane, Omar A.

    2016-09-01

    The attainment of self-propagating fusion burn in an inertial confinement target at the National Ignition Facility will require the use of an ablator with high rocket-efficiency and ablation pressure. The ablation material used during the National Ignition Campaign (Lindl et al. 2014) [1], a glow-discharge polymer (GDP), does not couple to the multiple-shock-inducing radiation drive environment created by the laser power profile (Robey et al., 2012) as efficiently as simulations indicated. We investigate the performance of two other ablators, boron carbide (B4C) and high-density carbon (HDC), compared to the performance of GDP under the same hohlraum conditions. Ablation performance is determined through measurement of the shock speed produced in planar samples of the ablator material subjected to identical multiple-shock-inducing radiation drive environments that are similar to a generic three-shock ignition drive. Simulations are in better agreement with the off-Hugoniot performance of B4C than either HDC or GDP, and analytic estimations of the ablation pressure indicate that while the pressure produced by B4C and GDP is similar when the ablator is allowed to release, the pressure reached by B4C seems to exceed that of HDC when backed by a Au/quartz layer.

  7. Negative ion extraction via particle simulation for fusion: critical assessment of recent contributions

    NASA Astrophysics Data System (ADS)

    Garrigues, L.; Fubiani, G.; Boeuf, J. P.

    2017-01-01

    Particle-in-cell (PIC) models have been extensively used in the last few years to describe negative ion extraction for neutral beam injection applications. We show that some of these models have been employed in conditions far from the requirements of particle simulations and that questionable conclusions about negative ion extraction, not supported by experimental evidence, have been obtained. We present a critical analysis of the method that has led to these conclusions and propose directions toward a more accurate and realistic description of negative ion extraction. We show in particular that, as expected in PIC simulations, mesh convergence is reached only if the grid spacing is on the order of or smaller than the minimum Debye length in the simulation domain, and that strong aberrations in the extracted beam are observed if this constraint is not respected. The method of injection of charged particles in the simulated plasma is also discussed, and we show that some injection methods used in the literature lead to unphysical results.
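
    The mesh-convergence criterion stated here (grid spacing on the order of the Debye length or smaller) is straightforward to check numerically; a sketch with illustrative source-plasma parameters (values assumed, not taken from the paper):

    ```python
    from math import sqrt

    EPS0 = 8.854e-12   # vacuum permittivity, F/m
    QE   = 1.602e-19   # elementary charge, C

    def debye_length(n_e_m3, T_e_eV):
        """Electron Debye length lambda_D = sqrt(eps0 * kB*Te / (n_e * e^2));
        with Te given in eV, kB*Te = QE * T_e_eV joules."""
        return sqrt(EPS0 * QE * T_e_eV / (n_e_m3 * QE**2))

    # Illustrative negative-ion-source-like parameters (assumed)
    n_e, T_e = 1e17, 2.0            # m^-3, eV
    lam_D = debye_length(n_e, T_e)
    dx = 5e-5                       # candidate PIC grid spacing, m

    # Convergence requires dx on the order of lambda_D or smaller
    print(f"lambda_D = {lam_D*1e6:.1f} um, dx/lambda_D = {dx/lam_D:.2f}")
    ```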

  8. Fusion Studies in Japan

    NASA Astrophysics Data System (ADS)

    Ogawa, Yuichi

    2016-05-01

    A new strategic energy plan decided by the Japanese Cabinet in 2014 strongly supports the steady promotion of nuclear fusion development activities, including the ITER project and the Broader Approach activities, from the long-term viewpoint. The Atomic Energy Commission (AEC) of Japan formulated the Third Phase Basic Program so as to promote an experimental fusion reactor project. In 2005, the AEC reviewed this program and discussed selection and concentration among the many projects of fusion reactor development. In addition to the promotion of the ITER project, advanced tokamak research with JT-60SA, the helical plasma experiment with LHD, the FIREX project in laser fusion research, and fusion engineering with IFMIF were highly prioritized. Although the basic concepts of tokamak, helical, and laser fusion research are quite different, they share many common features, such as plasma physics in 3-D magnetic geometry and high power heat loads on plasma-facing components. Therefore, a synergetic scenario for fusion reactor development among the various plasma confinement concepts would be important.

  9. Model-data fusion across ecosystems: from multi-site optimizations to global simulations

    NASA Astrophysics Data System (ADS)

    Kuppel, S.; Peylin, P.; Maignan, F.; Chevallier, F.; Kiely, G.; Montagnani, L.; Cescatti, A.

    2014-05-01

    This study uses a variational data assimilation framework to simultaneously constrain a global ecosystem model with eddy covariance measurements of daily net ecosystem exchange (NEE) and latent heat (LE) fluxes from a large number of sites grouped in seven plant functional types (PFTs). It is an attempt to bridge the gap between the numerous site-specific parameter optimization works found in the literature and the generic parameterization used by most land surface models within each PFT. The present multi-site approach allows deriving PFT-generic sets of optimized parameters enhancing the agreement between measured and simulated fluxes at most of the sites considered, with performances often comparable to those of the corresponding site-specific optimizations. Besides reducing the PFT-averaged model-data root-mean-square difference (RMSD) and the associated daily output uncertainty, the optimization improves the simulated CO2 balance at tropical and temperate forest sites. The major site-level NEE adjustments at the seasonal scale are: reduced amplitude in C3 grasslands and boreal forests, increased seasonality in temperate evergreen forests, and better model-data phasing in temperate deciduous broadleaf forests. Conversely, the poorer performances in tropical evergreen broadleaf forests point to deficiencies regarding the modeling of phenology and soil water stress for this PFT. An evaluation with data-oriented estimates of photosynthesis (GPP) and ecosystem respiration (Reco) rates indicates distinctively improved simulations of both gross fluxes. The multi-site parameter sets are then tested against CO2 concentrations measured at 53 locations around the globe, showing significant adjustments of the modeled seasonality of atmospheric CO2 concentration, whose relevance seems PFT-dependent, along with an improved interannual variability. 
Lastly, a global scale evaluation with remote sensing NDVI measurements indicates an improvement of the simulated seasonal variations of
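
    The RMSD misfit metric reduced by the parameter optimization is, for reference (toy values, not site data):

    ```python
    import numpy as np

    def rmsd(observed, simulated):
        """Root-mean-square difference between measured and modeled fluxes."""
        observed = np.asarray(observed, dtype=float)
        simulated = np.asarray(simulated, dtype=float)
        return float(np.sqrt(np.mean((observed - simulated) ** 2)))

    # Toy daily NEE series (illustrative values only)
    nee_obs = [-2.0, -1.5, 0.5, 1.0]
    nee_mod = [-1.0, -1.5, 0.0, 2.0]
    print(f"RMSD = {rmsd(nee_obs, nee_mod):.3f}")  # -> RMSD = 0.750
    ```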

  10. Model-data fusion across ecosystems: from multisite optimizations to global simulations

    NASA Astrophysics Data System (ADS)

    Kuppel, S.; Peylin, P.; Maignan, F.; Chevallier, F.; Kiely, G.; Montagnani, L.; Cescatti, A.

    2014-11-01

    This study uses a variational data assimilation framework to simultaneously constrain a global ecosystem model with eddy covariance measurements of daily net ecosystem exchange (NEE) and latent heat (LE) fluxes from a large number of sites grouped in seven plant functional types (PFTs). It is an attempt to bridge the gap between the numerous site-specific parameter optimization works found in the literature and the generic parameterization used by most land surface models within each PFT. The present multisite approach allows deriving PFT-generic sets of optimized parameters enhancing the agreement between measured and simulated fluxes at most of the sites considered, with performances often comparable to those of the corresponding site-specific optimizations. Besides reducing the PFT-averaged model-data root-mean-square difference (RMSD) and the associated daily output uncertainty, the optimization improves the simulated CO2 balance at tropical and temperate forest sites. The major site-level NEE adjustments at the seasonal scale are reduced amplitude in C3 grasslands and boreal forests, increased seasonality in temperate evergreen forests, and better model-data phasing in temperate deciduous broadleaf forests. Conversely, the poorer performances in tropical evergreen broadleaf forests point to deficiencies regarding the modelling of phenology and soil water stress for this PFT. An evaluation with data-oriented estimates of photosynthesis (GPP - gross primary productivity) and ecosystem respiration (Reco) rates indicates distinctively improved simulations of both gross fluxes. The multisite parameter sets are then tested against CO2 concentrations measured at 53 locations around the globe, showing significant adjustments of the modelled seasonality of atmospheric CO2 concentration, whose relevance seems PFT-dependent, along with an improved interannual variability. 
Lastly, a global-scale evaluation with remote sensing NDVI (normalized difference vegetation index

  11. Magnetic fusion reactor economics

    SciTech Connect

    Krakowski, R.A.

    1995-12-01

    An almost primordial trend in the conversion and use of energy is an increased complexity and cost of conversion systems designed to utilize cheaper and more-abundant fuels; this trend is exemplified by the progression fossil → fission → fusion. The present projections of the latter indicate that capital costs of the fusion "burner" far exceed any commensurate savings associated with the cheapest and most abundant of fuels. These projections suggest competitive fusion power only if internal costs associated with the use of fossil or fission fuels emerge to make them either uneconomic, unacceptable, or both with respect to expensive fusion systems. This "implementation-by-default" plan for fusion is re-examined by identifying in general terms fusion power-plant embodiments that might compete favorably under conditions where internal costs (both economic and environmental) of fossil and/or fission are not as great as is needed to justify the contemporary vision for fusion power. Competitive fusion power in this context will require a significant broadening of an overly focused program to explore the physics and symbiotic technologies leading to more compact, simplified, and efficient plasma-confinement configurations that reside at the heart of an attractive fusion power plant.

  12. Simulation of plasma-surface interactions in a fusion reactor by means of QSPA plasma streams: recent results and prospects

    NASA Astrophysics Data System (ADS)

    Garkusha, I. E.; Aksenov, N. N.; Byrka, O. V.; Makhlaj, V. A.; Herashchenko, S. S.; Malykhin, S. V.; Petrov, Yu V.; Staltsov, V. V.; Surovitskiy, S. V.; Wirtz, M.; Linke, J.; Sadowski, M. J.; Skladnik-Sadowska, E.

    2016-09-01

    This paper is devoted to plasma-surface interaction issues at the high heat loads typical of fusion reactors. For the International Thermonuclear Experimental Reactor (ITER), which is now under construction, knowledge of erosion processes and the behaviour of various structural materials under extreme conditions is a very critical issue that will determine the successful realization of the project. The most important plasma-surface interaction (PSI) effects in 3D geometry have been studied using the QSPA Kh-50 powerful quasi-stationary plasma accelerator. Mechanisms of droplet and dust generation have been investigated in detail. It was found that droplet emission from castellated surfaces has a threshold character and a cyclic nature. It begins only after a certain number of irradiating plasma pulses, when molten and shifted material has accumulated at the edges of the castellated structure. This new erosion mechanism, connected with the edge effects, results in an increase in the size of the emitted droplets (as compared with those emitted from a flat surface). This mechanism can even induce the ejection of sub-mm particles. A concept for a new-generation QSPA facility, the current maintenance status of this device, and prospects for further experiments are also presented.

  13. Teaching Engineering Statistics with Technology, Group Learning, Contextual Projects, Simulation Models and Student Presentations

    ERIC Educational Resources Information Center

    Romeu, Jorge Luis

    2008-01-01

    This article discusses our teaching approach in graduate level Engineering Statistics. It is based on the use of modern technology, learning groups, contextual projects, simulation models, and statistical and simulation software to entice student motivation. The use of technology to facilitate group projects and presentations, and to generate,…

  14. Cyclokinetic models and simulations for high-frequency turbulence in fusion plasmas

    NASA Astrophysics Data System (ADS)

    Deng, Zhao; Waltz, R. E.; Wang, Xiaogang

    2016-10-01

    Gyrokinetics is widely applied in plasma physics. However, this framework is limited to weak turbulence levels and low drift-wave frequencies because the high-frequency gyro-motion is removed by gyro-phase averaging. In order to test where gyrokinetics breaks down, Waltz and Deng developed a new theory, called cyclokinetics [R. E. Waltz and Zhao Deng, Phys. Plasmas 20, 012507 (2013)]. Cyclokinetics dynamically follows the high-frequency ion gyro-motion, which is nonlinearly coupled to the low-frequency drift-waves, interrupting and suppressing gyro-averaging. Cyclokinetics is valid in the high-frequency (ion cyclotron frequency) regime or at high turbulence levels: the ratio of the cyclokinetic perturbed distribution function to the equilibrium distribution function, δf/F, can approach 1. This work presents, for the first time, a numerical simulation of nonlinear cyclokinetic theory for ions, and describes the first attempt to completely solve the ion gyro-phase motion in a nonlinear turbulence system. Simulations are performed [Zhao Deng and R. E. Waltz, Phys. Plasmas 22(5), 056101 (2015)] in a local flux-tube geometry, with the parallel motion and variation suppressed, using a newly developed code named rCYCLO, which is executed in parallel using an implicit time-advanced Eulerian (or continuum) scheme [Zhao Deng and R. E. Waltz, Comp. Phys. Comm. 195, 23 (2015)]. A novel numerical treatment of the magnetic moment velocity space derivative operator guarantees accurate conservation of incremental entropy. By comparing the more fundamental cyclokinetic simulations with the corresponding gyrokinetic simulations, the gyrokinetic breakdown condition is quantitatively tested. Gyrokinetic transport and turbulence levels recover those of cyclokinetics at high relative ion cyclotron frequencies and low turbulence levels, as required. Cyclokinetic transport and turbulence levels are found to be lower than those of gyrokinetics at high turbulence levels and low Ω* values.

  15. Subcascade formation in displacement cascade simulations: Implications for fusion reactor materials

    SciTech Connect

    Stoller, R.E.; Greenwood, L.R.

    1998-03-01

    Primary radiation damage formation in iron has been investigated by the method of molecular dynamics (MD) for cascade energies up to 40 keV. The initial energy E_MD given to the simulated PKA is approximately equivalent to the damage energy in the standard secondary displacement model by Norgett, Robinson, and Torrens (NRT); hence, E_MD is less than the corresponding PKA energy. Using the values of E_MD in Table 1, the corresponding E_PKA and the NRT defects in iron have been calculated using the procedure described in Ref. 1 with the recommended 40 eV displacement threshold. These values are also listed in Table 1. Note that the difference between E_MD and the PKA energy increases as the PKA energy increases, and that the highest simulated PKA energy of 61.3 keV is the average for a collision with a 1.77 MeV neutron. Thus, these simulations have reached well into the fast neutron energy regime. For purposes of comparison, the parameters for the maximum DT neutron energy of 14.1 MeV are also included in Table 1. Although the primary damage parameters derived from the MD cascades exhibited a strong dependence on cascade energy up to 10 keV, this dependence was diminished and slightly reversed between 20 and 40 keV, apparently due to the formation of well-defined subcascades in this energy region. Such an explanation is only qualitative at this time, and additional analysis of the high-energy cascades is underway in an attempt to obtain a quantitative measure of the relationship between cascade morphology and defect survival.
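    The NRT secondary-displacement model referenced above has a simple closed form, ν_NRT = 0.8·T_dam/(2·E_d), with a single-pair regime just above threshold. A minimal sketch (the function name and structure are illustrative, not from the report):

    ```python
    def nrt_defects(damage_energy_ev: float, threshold_ev: float = 40.0) -> float:
        """Frenkel pairs predicted by the Norgett-Robinson-Torrens (NRT) model.

        damage_energy_ev: damage energy T_dam available for displacements (eV).
        threshold_ev: displacement threshold E_d (40 eV recommended for iron).
        """
        if damage_energy_ev < threshold_ev:
            return 0.0                       # below threshold: no stable defect
        if damage_energy_ev < 2.0 * threshold_ev / 0.8:
            return 1.0                       # single Frenkel-pair regime
        return 0.8 * damage_energy_ev / (2.0 * threshold_ev)

    # A 40 keV damage energy in iron (E_d = 40 eV) gives 400 NRT defects.
    print(nrt_defects(40e3))  # → 400.0
    ```

    This is the baseline against which the MD defect-survival fractions in the abstract are compared.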

  16. Kinetic simulations of stimulated Raman backscattering and related processes for the shock-ignition approach to inertial confinement fusion

    SciTech Connect

    Riconda, C.; Weber, S.; Tikhonchuk, V. T.; Heron, A.

    2011-09-15

    A detailed description of stimulated Raman backscattering and related processes for the purpose of inertial confinement fusion requires multi-dimensional kinetic simulations of a full speckle in a high-temperature, large-scale, inhomogeneous plasma. In particular for the shock-ignition scheme, operating at high laser intensities, kinetic aspects are predominant. High-intensity (Iλo² ≈ 5×10¹⁵ W μm²/cm²) as well as low-intensity (Iλo² ≈ 10¹⁵ W μm²/cm²) cases show the predominance of collisionless, collective processes for the interaction. While the two-plasmon decay instability and the cavitation scenario are hardly affected by the intensity variation, inflationary Raman backscattering proves to be very sensitive. Brillouin backscattering evolves on longer time scales and dominates the reflectivities, although it is sensitive to the intensity. Filamentation and self-focusing do occur in all cases, but on time scales too long to affect Raman backscattering.

  17. Fission thrust sail as booster for high Δv fusion based propulsion

    NASA Astrophysics Data System (ADS)

    Ceyssens, Frederik; Wouters, Kristof; Driesen, Maarten

    2015-12-01

    The fission thrust sail as a booster for nuclear fusion-based rocket propulsion for future starships is introduced and studied. First-order calculations are used together with Monte Carlo simulations to assess system performance. If a D-D fusion rocket, such as that considered in Project Icarus, has relatively low efficiency (~30%) in converting fusion fuel to a directed exhaust, adding a fission sail is shown to be beneficial for the obtainable delta-v. In addition, this type of fission-fusion hybrid propulsion has the potential to improve acceleration and to act as a micrometeorite shield.
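    The delta-v bookkeeping behind such staging follows from the Tsiolkovsky rocket equation, Δv = v_e·ln(m0/m1), with stage contributions adding. The numbers below are purely illustrative assumptions, not values from the paper:

    ```python
    import math

    def delta_v(exhaust_velocity: float, mass_ratio: float) -> float:
        """Tsiolkovsky rocket equation: dv = v_e * ln(m0 / m1)."""
        return exhaust_velocity * math.log(mass_ratio)

    # Two stages simply add their delta-v contributions. Hypothetical
    # figures (m/s and mass ratios) chosen only to show the arithmetic:
    booster = delta_v(1.0e5, 1.5)   # fission-sail booster stage
    fusion = delta_v(5.0e6, 3.0)    # D-D fusion stage
    total = booster + fusion
    ```

    Because the booster's contribution is independent of the fusion stage's exhaust efficiency, it matters most when that efficiency is low, which is the regime the abstract identifies.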

  18. Exponential yield sensitivity to long-wavelength asymmetries in three-dimensional simulations of inertial confinement fusion capsule implosions

    SciTech Connect

    Haines, Brian M.

    2015-08-15

    In this paper, we perform a series of high-resolution 3D simulations of an OMEGA-type inertial confinement fusion (ICF) capsule implosion with varying levels of initial long-wavelength asymmetries in order to establish the physical energy loss mechanism for observed yield degradation due to long-wavelength asymmetries in symcap (gas-filled capsule) implosions. These simulations demonstrate that, as the magnitude of the initial asymmetries is increased, shell kinetic energy is increasingly retained in the shell instead of being converted to fuel internal energy. This is caused by the displacement of fuel mass away from and shell material into the center of the implosion due to complex vortical flows seeded by the long-wavelength asymmetries. These flows are not fully turbulent, but demonstrate mode coupling through non-linear instability development during shell stagnation and late-time shock interactions with the shell interface. We quantify this effect by defining a separation lengthscale between the fuel mass and internal energy and show that this is correlated with yield degradation. The yield degradation shows an exponential sensitivity to the RMS magnitude of the long-wavelength asymmetries. This strong dependence may explain the lack of repeatability frequently observed in OMEGA ICF experiments. In contrast to previously reported mechanisms for yield degradation due to turbulent instability growth, yield degradation is not correlated with mixing between shell and fuel material. Indeed, an integrated measure of mixing decreases with increasing initial asymmetry magnitude due to delayed shock interactions caused by growth of the long-wavelength asymmetries without a corresponding delay in disassembly.

  19. Simulation of motions of the plasma in a fusion reactor for obtaining of future energy

    NASA Astrophysics Data System (ADS)

    Zhumabekov, Askhat

    2017-01-01

    According to even the most conservative estimates, world energy consumption will double by the middle of the 21st century, as a consequence of global economic development, population growth, and other geopolitical and economic factors. Energy consumption in the world is growing much faster than its production, and industrial use of new advanced technologies in the energy sector will, for objective reasons, not begin until 2030. This paper discusses how nuclear energy can be obtained and developed, drawing on the experience of the National Nuclear Center. A model for the problem of plasma confinement is implemented, and the main achievements of modern construction and the National Nuclear Center megaproject in Kurchatov, Republic of Kazakhstan, are presented. A social survey was conducted in the East Kazakhstan region on the theme "Prospects for the development of nuclear energy in Kazakhstan," and citizens' opinions are reported. Also described are the new priorities announced on May 22, 2015 in Ust-Kamenogorsk at the industrial park "Altai," based on a competition of innovative green-technology projects at the international exhibition "OSKEMEN EXPO 2015," with the participation of the regional authorities of the Republic of Kazakhstan, representatives of JSC NC "Astana Expo," and delegations from Japan, Russia, Canada, the USA, and South Korea.

  20. Experimental Characterization of a Plasma Deflagration Accelerator for Simulating Fusion Wall Response to Disruption Events

    NASA Astrophysics Data System (ADS)

    Underwood, Thomas; Loebner, Keith; Cappelli, Mark

    2016-10-01

    In this work, the suitability of a pulsed deflagration accelerator for simulating the interaction of edge-localized modes with plasma first-wall materials is investigated. Experimental measurements derived from a suite of diagnostics are presented that focus on both the properties of the plasma jet and the manner in which such jets couple with material interfaces. Detailed measurements of the thermodynamic plasma state variables within the jet are presented using a quadruple Langmuir probe operating in current-saturation mode. These data, in conjunction with spectroscopic measurements of Hα Stark broadening via a fast-framing, intensified CCD camera, provide spatial and temporal measurements of how the plasma density and temperature scale as a function of input energy. Using these measurements, estimates for the energy flux associated with the deflagration accelerator are found to be completely tunable over a range spanning 150 MW m⁻² to 30 GW m⁻². The plasma-material interface is investigated using tungsten tokens exposed to the plasma plume under variable conditions. Visualizations of the resulting shock structures are achieved through Schlieren cinematography, and energy transfer dynamics are discussed by presenting temperature measurements of the exposed materials. This work is supported by the U.S. Department of Energy Stewardship Science Academic Program and by the National Defense Science and Engineering Graduate Fellowship.

  1. Numerical analysis of applied magnetic field dependence in Malmberg-Penning Trap for compact simulator of energy driver in heavy ion fusion

    NASA Astrophysics Data System (ADS)

    Sato, T.; Park, Y.; Soga, Y.; Takahashi, K.; Sasaki, T.; Kikuchi, T.; Harada, Nob

    2016-05-01

    To simulate the pulse compression process of space-charge-dominated beams in heavy ion fusion, we have demonstrated a multi-particle numerical simulation of an equivalent beam using the Malmberg-Penning trap device. The results show that both transverse and longitudinal velocities, as functions of external magnetic field strength, increase during the longitudinal compression. The influence of the space-charge effect, which is related to the external magnetic field, was observed as an increase in high-velocity particles at weak external magnetic field.

  2. The Maya Project: Numerical Simulations of Black Hole Collisions

    NASA Astrophysics Data System (ADS)

    Smith, Kenneth; Calabrese, Gioel; Garrison, David; Kelly, Bernard; Laguna, Pablo; Lockitch, Keith; Pullin, Jorge; Shoemaker, Deirdre; Tiglio, Manuel

    2001-04-01

    The main objective of the MAYA project is the development of a numerical code to solve the vacuum Einstein's field equations for spacetimes containing multiple black hole singularities. Incorporating knowledge gained from previous similar efforts (Binary Black Holes Alliance and the AGAVE project) as well as one-dimensional numerical studies, MAYA has been built from the ground up within the architecture of Cactus 4.0, with particular attention paid to the software engineering aspects of code development. The goal of this new effort is to ultimately have a robust, efficient, readable, and stable numerical code for black hole evolution. This poster presents an overview of the project, focusing on the innovative aspects of the project as well as its current development status.

  3. Secretarial Administration: Project In/Vest: Insurance Simulation Insures Learning

    ERIC Educational Resources Information Center

    Geier, Charlene

    1978-01-01

    Describes a simulated model office to replicate various insurance occupations set up in Greenfield High School, Wisconsin. Local insurance agents and students from other disciplines, such as distributive education, are involved in the simulation. The training is applicable to other business office positions, as it models not only an insurance…

  4. Project Shuttle simulation math model coordination catalog, revision 1

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A catalog is presented of subsystem and environment math models used or planned for space shuttle simulations. Its purpose is to facilitate the sharing of similar math models between shuttle simulations. It provides information on math model requirements, formulations, schedules, and contact persons for further information.

  5. A Student Project to use Geant4 Simulations for a TMS-PET combination

    SciTech Connect

    Altamirano, A.; Chamorro, A.; Hurtado, K.; Romero, C.; Wahl, D.; Zamudio, A.; Rueda, A.; Solano Salinas, C. J.

    2007-10-26

    Geant4 is one of the most powerful tools for MC simulation of detectors and their applications. We present a student project to simulate a combined Transcranial Magnetic Stimulation-Positron Emission Tomography (TMS-PET) system using Geant4. This project aims to study PET-TMS systems by implementing a model for the brain response to the TMS pulse and studying the simulated PET response. In order to increase the speed of the simulations we parallelise our programs and investigate the possibility of using GRID computing.

  6. Computer Simulation Performed for Columbia Project Cooling System

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia system (10,240 Intel Itanium processors). The simulation assessed the performance of the cooling system, identified deficiencies, and recommended modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools, to generate a geometric model of the computer room, and the OVERFLOW-2 code, for fluid and thermal simulation. This state-of-the-art technology can easily be extended to provide a general capability for air flow analyses in any modern computer room.

  7. Warm starting the projected Gauss-Seidel algorithm for granular matter simulation

    NASA Astrophysics Data System (ADS)

    Wang, Da; Servin, Martin; Berglund, Tomas

    2016-03-01

    The effect on convergence of warm starting the projected Gauss-Seidel solver for nonsmooth discrete element simulation of granular matter is investigated. It is found that the computational performance can be increased by a factor of 2-5.
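    As a rough illustration of the method (not the authors' code), a projected Gauss-Seidel solver for a linear complementarity problem, with an optional initial guess that enables warm starting, might look like:

    ```python
    def projected_gauss_seidel(A, b, x0=None, iters=50):
        """Solve the LCP  x >= 0,  A x + b >= 0,  x . (A x + b) = 0
        by projected Gauss-Seidel sweeps; x0 enables warm starting."""
        n = len(b)
        x = list(x0) if x0 is not None else [0.0] * n
        for _ in range(iters):
            for i in range(n):
                r = b[i] + sum(A[i][j] * x[j] for j in range(n))
                x[i] = max(0.0, x[i] - r / A[i][i])   # project onto x_i >= 0
        return x

    # Warm starting: reuse the previous time step's impulses as the initial
    # guess, so far fewer sweeps are needed when contacts change slowly.
    A = [[4.0, 1.0], [1.0, 3.0]]
    b = [-1.0, -2.0]
    cold = projected_gauss_seidel(A, b, iters=50)
    warm = projected_gauss_seidel(A, b, x0=cold, iters=5)
    ```

    The 2-5x speedup reported above comes precisely from the `x0` path: a converged solution from the previous step is already near the new solution.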

  8. How historic simulation-observation discrepancy affects future warming projections in a very large model ensemble

    NASA Astrophysics Data System (ADS)

    Goodwin, Philip

    2016-10-01

    Projections of future climate made by model ensembles have credibility because the historic simulations by these models are consistent, or nearly consistent, with historic observations. However, it is not known how small inconsistencies between the ranges of observed and simulated historic climate change affect the future projections made by a model ensemble. Here, the impact of historical simulation-observation inconsistencies on future warming projections is quantified in a 4-million member Monte Carlo ensemble from a new efficient Earth System Model (ESM). Of the 4 million ensemble members, a subset of 182,500 are consistent with the historic ranges of warming, heat uptake and carbon uptake simulated by the Coupled Model Intercomparison Project phase 5 (CMIP5) ensemble. This simulation-consistent subset projects future warming ranges similar to the CMIP5 ensemble for all four RCP scenarios, indicating that the new ESM represents an efficient tool to explore parameter space for future warming projections based on historic performance. A second subset of 14,500 ensemble members are consistent with historic observations of warming, heat uptake and carbon uptake. This observation-consistent subset projects a narrower range of future warming, with the lower bounds of projected warming still similar to CMIP5, but the upper warming bounds reduced by 20-35%. These findings suggest that part of the upper range of twenty-first century CMIP5 warming projections may reflect historical simulation-observation inconsistencies. However, the agreement of the lower bounds for projected warming implies that the likelihood of warming exceeding dangerous levels over the twenty-first century is unaffected by small discrepancies between CMIP5 models and observations.
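    The ensemble-filtering idea can be sketched as simple rejection of Monte Carlo members whose historic behaviour falls outside an observed range; everything below (the toy model, the single parameter, the bounds) is a hypothetical stand-in for the actual ESM:

    ```python
    import random

    def history_matching(n_members, simulate, hist_range, seed=1):
        """Keep only ensemble members whose simulated historic change lies
        inside an observed (or CMIP5-derived) range; return the surviving
        members' future projections."""
        random.seed(seed)
        kept = []
        for _ in range(n_members):
            sensitivity = random.uniform(0.5, 6.0)   # hypothetical parameter
            hist, future = simulate(sensitivity)
            if hist_range[0] <= hist <= hist_range[1]:
                kept.append(future)
        return kept

    # Toy model: historic and future warming both scale with sensitivity,
    # so constraining the historic value narrows the projected range.
    toy = lambda s: (0.3 * s, 1.0 * s)
    future = history_matching(10000, toy, hist_range=(0.6, 1.2))
    lo, hi = min(future), max(future)   # constrained projection bounds
    ```

    Tightening `hist_range` (observations instead of the wider CMIP5 range) shrinks the surviving subset and, as in the abstract, mainly pulls in the upper bound.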

  9. Non-Gaussian fluctuations and non-Markovian effects in the nuclear fusion process: Langevin dynamics emerging from quantum molecular dynamics simulations.

    PubMed

    Wen, Kai; Sakata, Fumihiko; Li, Zhu-Xia; Wu, Xi-Zhen; Zhang, Ying-Xun; Zhou, Shan-Gui

    2013-07-05

    Macroscopic parameters as well as precise information on the random force characterizing the Langevin-type description of the nuclear fusion process around the Coulomb barrier are extracted from the microscopic dynamics of individual nucleons by exploiting the numerical simulation of the improved quantum molecular dynamics. It turns out that the dissipation dynamics of the relative motion between two fusing nuclei is caused by a non-Gaussian distribution of the random force. We find that the friction coefficient as well as the time correlation function of the random force takes particularly large values in a region a little bit inside of the Coulomb barrier. A clear non-Markovian effect is observed in the time correlation function of the random force. It is further shown that an emergent dynamics of the fusion process can be described by the generalized Langevin equation with memory effects by appropriately incorporating the microscopic information of individual nucleons through the random force and its time correlation function.

  10. Revitalizing Fusion via Fission Fusion

    NASA Astrophysics Data System (ADS)

    Manheimer, Wallace

    2001-10-01

    Existing tokamaks could generate significant nuclear fuel. TFTR, operating steady state with DT, might generate enough fuel for a 300 MW nuclear reactor. The immediate goals of the magnetic fusion program would necessarily shift from the study of advanced plasma regimes in larger devices to mostly known plasma regimes, but at steady-state or high-duty-cycle operation in DT plasmas. The science and engineering of breeding blankets would be equally important. Follow-on projects could possibly produce nuclear fuel in large quantity at low price. Although today there is strong opposition to nuclear power in the United States, in a 21st century world of 10 billion people, all of whom will demand a middle-class life style, nuclear energy will be important. Concern over greenhouse gases will also drive the world toward nuclear power. There are studies indicating that the world will need 10 TW of carbon-free energy by 2050. It is difficult to see how this can be achieved without the breeding of nuclear fuel. By using the thorium cycle, proliferation risks are minimized [1], [2]. [1] W. Manheimer, Fusion Technology 36, 1 (1999); [2] W. Manheimer, Physics and Society 29(3), 5 (July 2000).

  11. Improving Faculty Perceptions of and Intent to Use Simulation: An Intervention Project

    ERIC Educational Resources Information Center

    Tucker, Charles

    2013-01-01

    Human patient simulation is an innovative teaching strategy that can facilitate practice development and preparation for entry into today's healthcare environment for nursing students. Unfortunately, the use of human patient simulation has been limited due to the perceptions of nursing faculty members. This project sought to explore those…

  12. The Redesign of PROJECT SIMULATION for Microcomputer-Assisted Instruction in Psychology and Research Methodology.

    ERIC Educational Resources Information Center

    King, Alan R.; King, Barry F.

    1988-01-01

    This article offers guidelines to assist simulation developers in maximizing the lifespan of their software products through structured designs and creative attempts at integrating their programs into standard courses. Current efforts to redesign PROJECT SIMULATION, a computer-assisted instructional software package for teaching methodology in…

  13. Final Technical Report for Center for Plasma Edge Simulation Research

    SciTech Connect

    Pankin, Alexei Y.; Bateman, Glenn; Kritz, Arnold H.

    2012-02-29

    The CPES research carried out by the Lehigh fusion group has sought to satisfy the evolving requirements of the CPES project. Overall, the Lehigh group has focused on verification and validation of the codes developed and/or integrated in the CPES project. Consequently, contacts and interaction with experimentalists have been maintained during the course of the project. Prof. Arnold Kritz, the leader of the Lehigh Fusion Group, has participated in the executive management of the CPES project. The code development and simulation studies carried out by the Lehigh fusion group are described in more detail in the sections below.

  14. The Multi-SAG project: filling the MultiDark simulations with semi-analytic galaxies

    NASA Astrophysics Data System (ADS)

    Vega-Martínez, C. A.; Cora, S. A.; Padilla, N. D.; Muñoz Arancibia, A. M.; Orsi, A. A.; Ruiz, A. N.

    2016-08-01

    The semi-analytical model sag is a galaxy formation and evolution code that is applied to halo catalogs and merger trees extracted from cosmological N-body simulations of dark matter. This contribution describes a project to construct a catalog of simulated galaxies by adapting and applying the sag model to two publicly available dark matter simulations of the Spanish MultiDark Project. These simulations use boxes with sizes of 1000 Mpc and 400 Mpc, respectively, with Planck cosmological parameters. They cover a large range of masses and halo mass resolutions, and each simulation is able to produce more than 150 million simulated galaxies. A detailed description of the method is given, and first statistical results are shown.

  15. WE-EF-207-08: Improve Cone Beam CT Using a Synchronized Moving Grid, An Inter-Projection Sensor Fusion and a Probability Total Variation Reconstruction

    SciTech Connect

    Zhang, H; Kong, V; Jin, J; Ren, L; Zhang, Y; Giles, W

    2015-06-15

    Purpose: To present a cone beam computed tomography (CBCT) system, which uses a synchronized moving grid (SMOG) to reduce and correct scatter, an inter-projection sensor fusion (IPSF) algorithm to estimate the missing information blocked by the grid, and a probability total variation (pTV) algorithm to reconstruct the CBCT image. Methods: A prototype SMOG-equipped CBCT system was developed and used to acquire gridded projections with complementary grid patterns in two neighboring projections. Scatter was reduced by the grid, and the remaining scatter was corrected by measuring it under the grid. An IPSF algorithm was used to estimate the missing information in a projection from data in its two neighboring projections. The Feldkamp-Davis-Kress (FDK) algorithm was used to reconstruct the initial CBCT image for pTV, using projections after IPSF processing. A probability map was generated depending on the confidence of estimation in IPSF for the regions of missing data and penumbra. pTV was finally used to reconstruct the CBCT image of a Catphan phantom, and was compared to a conventional CBCT image without SMOG, to images without IPSF (SMOG + FDK and SMOG + mask-TV), and to an image without pTV (SMOG + IPSF + FDK). Results: The conventional CBCT image without SMOG shows apparent scatter-induced cup artifacts. The approaches with SMOG but without IPSF show severe (SMOG + FDK) or additional (SMOG + mask-TV) artifacts, possibly due to using projections with missing data. The two approaches with SMOG + IPSF remove the cup artifacts, and the pTV approach is superior to FDK, substantially reducing the noise. Using the SMOG also halves the imaging dose. Conclusion: The proposed technique is promising in improving CBCT image quality while reducing imaging dose.
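    The inter-projection estimation step can be illustrated with a deliberately simplified stand-in (the published IPSF algorithm is more sophisticated than plain averaging): because neighboring projections carry complementary grid patterns, a pixel blocked by the grid in one projection is open in both of its neighbors and can be estimated from them:

    ```python
    def ipsf_estimate(prev_proj, next_proj, blocked_mask):
        """Hypothetical, simplified inter-projection estimation: a pixel
        blocked by the grid in the current projection is filled from the
        same pixel in the two neighboring projections (here by averaging;
        unblocked pixels are left at zero in the returned estimate)."""
        rows, cols = len(blocked_mask), len(blocked_mask[0])
        est = [[0.0] * cols for _ in range(rows)]
        for r in range(rows):
            for c in range(cols):
                if blocked_mask[r][c]:
                    est[r][c] = 0.5 * (prev_proj[r][c] + next_proj[r][c])
        return est
    ```

    The confidence of each such estimate is what the paper's probability map feeds into the pTV reconstruction.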

  16. Spinal Fusion

    MedlinePlus

    ... concept of fusion is similar to that of welding in industry. Spinal fusion surgery, however, does not ... bone taken from the patient has a long history of use and results in predictable healing. Autograft ...

  17. A system simulation development project: Leveraging resources through partnerships

    NASA Technical Reports Server (NTRS)

    Drummond, Colin K.; Owen, A. Karl; Davis, Milt W.

    1995-01-01

    Partnerships between government agencies are an intellectually attractive method of conducting scientific research; the goal is to establish mutually beneficial participant roles for technology exchange that ultimately pay off in a stronger R&D program for each partner. Anticipated and current aerospace research budgetary pressures through the 1990s provide additional impetus for government research agencies to candidly assess their R&D for those simulation activities no longer unique enough to warrant 'going it alone,' or for those elements where partnerships or teams can offset development costs. This paper describes a specific inter-agency system simulation activity that leverages the development cost of mutually beneficial R&D. While the direct positive influence of partnerships on complex technology developments is our main thesis, we also address ongoing teaming issues and hope to impart to the reader the immense indirect (sometimes immeasurable) benefits that meaningful inter-agency partnerships can produce.

  18. Fast simulation of x-ray projections of spline-based surfaces using an append buffer

    NASA Astrophysics Data System (ADS)

    Maier, Andreas; Hofmann, Hannes G.; Schwemmer, Chris; Hornegger, Joachim; Keil, Andreas; Fahrig, Rebecca

    2012-10-01

    Many scientists in the field of x-ray imaging rely on the simulation of x-ray images. As phantom models become more and more realistic, their projection requires high computational effort. Since x-ray images are based on transmission, many standard graphics acceleration algorithms cannot be applied to this task. However, if adapted properly, the simulation speed can be increased dramatically using state-of-the-art graphics hardware. A custom graphics pipeline that simulates transmission projections for tomographic reconstruction was implemented based on moving spline surface models. All steps, from tessellation of the splines to projection onto the detector and drawing, are implemented in OpenCL. For increased performance, we introduced a special append buffer that stores the intersections with the scene for every ray. Intersections are then sorted and resolved to materials. Lastly, an absorption model is evaluated to yield an absorption value for each projection pixel. Projection of a moving spline structure is fast and accurate. Projections of size 640 × 480 can be generated within 254 ms. Reconstructions using the projections show errors below 1 HU with a sharp reconstruction kernel. Traditional GPU-based acceleration schemes are not suitable for our reconstruction task: even in the absence of noise, they result in errors up to 9 HU on average, although the projection images appear correct under visual examination. Projections generated with our new method are suitable for the validation of novel CT reconstruction algorithms. For complex simulations, such as the evaluation of motion-compensated reconstruction algorithms, this kind of x-ray simulation will reduce the computation time dramatically.
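    The append-buffer idea (record every surface intersection per detector ray, then sort and resolve the list into material path lengths for an absorption model) can be sketched as follows. This is a CPU illustration under assumed names; the actual implementation runs in OpenCL, and the simple Beer-Lambert absorption here stands in for the paper's absorption model:

    ```python
    import math
    from collections import defaultdict

    def transmission(ray_hits, mu):
        """Resolve an append buffer of intersections into a transmitted
        fraction per detector pixel.

        ray_hits: {pixel: [(depth, material), ...]} in append order;
                  entry/exit hits alternate per material along each ray.
        mu:       {material: linear attenuation coefficient (1/mm)}.
        """
        out = {}
        for pixel, hits in ray_hits.items():
            hits = sorted(hits)                    # order along the ray
            path = defaultdict(float)
            entry = {}
            for depth, mat in hits:
                if mat in entry:                   # exit: accumulate length
                    path[mat] += depth - entry.pop(mat)
                else:                              # entry into the material
                    entry[mat] = depth
            line_integral = sum(mu[m] * length for m, length in path.items())
            out[pixel] = math.exp(-line_integral)  # Beer-Lambert
        return out

    # One pixel whose ray crosses 12 mm of tissue containing 8 mm of bone:
    hits = {0: [(12.0, "bone"), (2.0, "tissue"), (4.0, "bone"), (14.0, "tissue")]}
    mu = {"tissue": 0.02, "bone": 0.05}
    print(transmission(hits, mu))
    ```

    Sorting per ray rather than depth-testing is what distinguishes this transmission pipeline from standard opaque rasterization.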

  19. Extensible Modeling and Simulation Framework (XMSF) 2004 Project Summary Report

    DTIC Science & Technology

    2007-11-02

    common solution is the use of metamodels. We recommend starting work on a common layered metamodel to organize and map the proposed standards, very...process; e.g., one (or more) describing the natural environment, one (or more) describing military and non-military forces to be represented in the...simulation. Markup languages based on XML, Internet technologies, and Web services are combining to enable a new generation of distributed M&S

  20. Tactical Aviation Mission System Simulation Situational Awareness Project

    DTIC Science & Technology

    2004-04-01

    CH-146 Griffon helicopter flown by the Department of National Defence (DND). The Carleton...the CH-146 Griffon. HELISIM accepts inputs from the collective, cyclic and pedals of the simulator and, using the defined flight model, updates... helicopters, (d) one wrecked CH149, (e) two CH149s flying in small loop formations, (f) two hovering CH-146 Griffon helicopters, (g) one formation of

  1. Community Project for Accelerator Science and Simulation (ComPASS)

    SciTech Connect

    Simmons, Christopher; Carey, Varis

    2016-10-12

    After concluding our initial exercise (solving a simplified statistical inverse problem with laser intensity as the unknown parameter) of coupling Vorpal and our parallel statistical library QUESO, we shifted the application focus to DLA. Our efforts focused on developing a Gaussian process (GP) emulator within QUESO for efficient optimization of power couplers within woodpiles. The smaller simulation size (compared with LPA) allows for sufficient "training runs" to develop a reasonable GP statistical emulator for a parameter space of moderate dimension.
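    A GP emulator of the kind described can be sketched in a few lines. This is a generic zero-mean GP regression with an RBF kernel, not QUESO's implementation; the training runs would be the expensive simulations, and the emulator's cheap posterior mean is what the optimizer then queries:

    ```python
    import math

    def rbf(x1, x2, length=1.0):
        """Squared-exponential (RBF) covariance between two scalar inputs."""
        return math.exp(-0.5 * ((x1 - x2) / length) ** 2)

    def solve(A, b):
        """Gaussian elimination with partial pivoting (small dense systems)."""
        n = len(b)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for col in range(n):
            p = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[p] = M[p], M[col]
            for r in range(col + 1, n):
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
        return x

    def gp_mean(train_x, train_y, query_x, noise=1e-8):
        """Posterior mean of a zero-mean GP fit to (train_x, train_y)."""
        n = len(train_x)
        K = [[rbf(train_x[i], train_x[j]) + (noise if i == j else 0.0)
              for j in range(n)] for i in range(n)]
        alpha = solve(K, train_y)               # alpha = K^-1 y
        return sum(rbf(query_x, train_x[i]) * alpha[i] for i in range(n))
    ```

    With near-zero noise the emulator interpolates the training runs, which is the desired behaviour for deterministic simulations.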

  2. A GPU tool for efficient, accurate, and realistic simulation of cone beam CT projections

    PubMed Central

    Jia, Xun; Yan, Hao; Cerviño, Laura; Folkerts, Michael; Jiang, Steve B.

    2012-01-01

    Purpose: Simulation of x-ray projection images plays an important role in cone beam CT (CBCT) related research projects, such as the design of reconstruction algorithms or scanners. A projection image contains primary signal, scatter signal, and noise. It is computationally demanding to perform accurate and realistic computations for all of these components. In this work, the authors develop a package on the graphics processing unit (GPU), called gDRR, for the accurate and efficient computation of x-ray projection images in CBCT under clinically realistic conditions. Methods: The primary signal is computed by a trilinear ray-tracing algorithm. A Monte Carlo (MC) simulation is then performed, yielding the primary signal and the scatter signal, both with noise. A denoising process specifically designed for Poisson noise removal is applied to obtain a smooth scatter signal. The noise component is then obtained by combining the difference between the MC primary and the ray-tracing primary signals with the difference between the MC simulated scatter and the denoised scatter signals. Finally, a calibration step converts the calculated noise signal into a realistic one by scaling its amplitude according to a specified mAs level. The computations of gDRR include a number of realistic features, e.g., a bowtie filter, a polyenergetic spectrum, and detector response. The implementation is fine-tuned for a GPU platform to yield high computational efficiency. Results: For a typical CBCT projection with a polyenergetic spectrum, the calculation time for the primary signal using the ray-tracing algorithm is 1.2–2.3 s, while the MC simulations take 28.1–95.3 s, depending on the voxel size. Computation time for all other steps is negligible. The ray-tracing primary signal matches well with the primary part of the MC simulation result. The MC simulated scatter signal using gDRR is in agreement with EGSnrc results, with a relative difference of 3.8%. A noise calibration process is
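    The per-pixel signal bookkeeping described above (noise taken as the MC-minus-ray-tracing primary difference plus the MC-minus-denoised scatter difference, then rescaled for a target mAs level and recombined with the smooth components) can be sketched as follows; the function is a hypothetical illustration, not the gDRR API:

    ```python
    def combine_signals(rt_primary, mc_primary, mc_scatter, denoised_scatter,
                        mas_scale=1.0):
        """Per-pixel recombination: the smooth primary and scatter signals
        carry the image content, while the residual differences carry the
        noise, whose amplitude is rescaled to the target mAs level."""
        noise = (mc_primary - rt_primary) + (mc_scatter - denoised_scatter)
        return rt_primary + denoised_scatter + mas_scale * noise

    # One pixel: smooth primary 100, noisy MC primary 103, noisy MC scatter
    # 22, denoised scatter 20 -> noise component 5.
    print(combine_signals(100.0, 103.0, 22.0, 20.0))  # → 125.0
    ```

    Setting `mas_scale=0.0` recovers the noiseless image (smooth primary plus denoised scatter), which makes the decomposition easy to validate.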

  3. Radioscapholunate Fusions

    PubMed Central

    McGuire, Duncan Thomas; Bain, Gregory Ian

    2012-01-01

    Radiocarpal fusions are performed for a variety of indications, most commonly for debilitating painful arthritis. The goal of a wrist fusion is to fuse the painful, diseased joints and to preserve motion through the healthy joints. Depending on the extent of the disease process, radiocarpal fusions may take the form of radiolunate, radioscapholunate, or total wrist fusions. Surgical techniques and instrumentation have advanced over the last few decades, and consequently the functional outcomes have improved and complications decreased. Techniques for partial carpal fusions have improved and now include distal scaphoid and triquetrum excision, which improves range of motion and fusion rates. In this article we discuss the various surgical techniques and fixation methods available and review the corresponding evidence in the literature. The authors' preferred surgical technique of radioscapholunate fusion with distal scaphoid and triquetrum excision is outlined. New implants and new concepts are also discussed. PMID:24179717

  4. Scenario Based Education as a Framework for Understanding Students Engagement and Learning in a Project Management Simulation Game

    ERIC Educational Resources Information Center

    Misfeldt, Morten

    2015-01-01

    In this paper I describe how students use a project management simulation game based on an attack-defense mechanism where two teams of players compete by challenging each other's projects. The project management simulation game is intended to be played by pre-service construction workers and engineers. The gameplay has two parts: a planning part,…

  5. Projection collimator optics for DMD-based infrared scene simulator

    NASA Astrophysics Data System (ADS)

    Zheng, Yawei; Hu, Yu; Li, Junnan; Huang, Meili; Gao, Jiaobo; Wang, Jun; Sun, Kefeng; Li, Jianjun; Zhang, Fang

    2016-10-01

    The design of a collimator for dynamic infrared (IR) scene simulation based on digital micro-mirror devices (DMD) is presented in this paper. The collimator adopts a reimaging configuration to limit its physical size and cost. An aspheric lens is used in the relay optics to improve image quality and simplify the optical configuration. A total internal reflection (TIR) prism is located between the last surface of the optics and the DMD to fold the ray paths of the IR light source. The optics collimates the output from a 1024×768-element DMD in the 8–10.3 μm waveband and enables an imaging system to be tested over an 8° field of view (FOV). The long pupil distance of 800 mm accommodates seekers under test at a remote location.

  6. Modeling and Simulation Optimization and Feasibility Studies for the Neutron Detection without Helium-3 Project

    SciTech Connect

    Ely, James H.; Siciliano, Edward R.; Swinhoe, Martyn T.; Lintereur, Azaree T.

    2013-01-01

    This report details the results of the modeling and simulation work accomplished for the ‘Neutron Detection without Helium-3’ project during the 2011 and 2012 fiscal years. The primary focus of the project is to investigate commercially available technologies that might be used in safeguards applications in the relatively near term. Other technologies that are being developed may be more applicable in the future, but are outside the scope of this study.

  7. Projected 2050 Model Simulations for the Chesapeake Bay ...

    EPA Pesticide Factsheets

    The Chesapeake Bay Program has been tasked with assessing how changes in climate systems are expected to alter key variables and processes within the Watershed in concurrence with land use changes. EPA's Office of Research and Development will be conducting historical and future (2050) Weather Research and Forecasting (WRF) meteorological and Community Multiscale Air Quality (CMAQ) chemical transport model simulations to provide meteorological and nutrient deposition estimates for inclusion in the Chesapeake Bay Program's assessment of how climate and land use change may impact water quality and ecosystem health. This presentation provides the project timeline and research updates. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  8. Project ARGO: Gas phase formation in simulated microgravity

    NASA Technical Reports Server (NTRS)

    Powell, Michael R.; Waligora, James M.; Norfleet, William T.; Kumar, K. Vasantha

    1993-01-01

    The ARGO study investigated the reduced incidence of joint pain decompression sickness (DCS) encountered in microgravity as compared with an expected incidence of joint pain DCS experienced by test subjects in Earth-based laboratories (unit gravity) with similar protocols. Individuals who are decompressed from saturated conditions usually acquire joint pain DCS in the lower extremities. Our hypothesis is that the incidence of joint pain DCS can be limited by a significant reduction in the tissue gas micronuclei formed by stress-assisted nucleation. Reductions in dynamic and kinetic stresses in vivo are linked to hypokinetic and adynamic conditions of individuals in zero g. We employed the Doppler ultrasound bubble detection technique in simulated microgravity studies to determine quantitatively the degree of gas phase formation in the upper and lower extremities of test subjects during decompression. We found no evidence of right-to-left shunting through the pulmonary vasculature. The volume of gas bubbles following decompression was examined and compared with the number following saline contrast injection. From this, we predict a reduced incidence of DCS on orbit, although the incidence of predicted mild DCS still remains larger than that encountered on orbit.

  9. Advances in POST2 End-to-End Descent and Landing Simulation for the ALHAT Project

    NASA Technical Reports Server (NTRS)

    Davis, Jody L.; Striepe, Scott A.; Maddock, Robert W.; Hines, Glenn D.; Paschall, Stephen, II; Cohanim, Babak E.; Fill, Thomas; Johnson, Michael C.; Bishop, Robert H.; DeMars, Kyle J.; Sostaric, Ronald r.; Johnson, Andrew E.

    2008-01-01

    Program to Optimize Simulated Trajectories II (POST2) is used as a basis for an end-to-end descent and landing trajectory simulation that is essential in determining design and integration capability and system performance of the lunar descent and landing system and environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. The POST2 simulation provides a six degree-of-freedom capability necessary to test, design and operate a descent and landing system for successful lunar landing. This paper presents advances in the development and model-implementation of the POST2 simulation, as well as preliminary system performance analysis, used for the testing and evaluation of ALHAT project system models.

  10. A Strategy for Autogeneration of Space Shuttle Ground Processing Simulation Models for Project Makespan Estimations

    NASA Technical Reports Server (NTRS)

    Madden, Michael G.; Wyrick, Roberta; O'Neill, Dale E.

    2005-01-01

    Space Shuttle Processing is a complicated and highly variable project. The planning and scheduling problem, categorized as a Resource-Constrained Stochastic Project Scheduling Problem (RC-SPSP), has a great deal of variability in the Orbiter Processing Facility (OPF) process flow from one flight to the next. Simulation modeling is a useful tool for estimating the makespan of the overall process. However, simulation requires a model to be developed, which is itself a labor- and time-consuming effort. With such a dynamic process, the model would often be out of synchronization with the actual process, limiting the applicability of the simulation answers to the actual estimation problem. Integration of TEAMS model-enabling software with our existing scheduling software is the basis of our solution. This paper explains the approach used to auto-generate a simulation model from planning and scheduling efforts and available data.

  11. Ten years of computer visual simulations on large scale projects in the western United States

    SciTech Connect

    Ellsworth, J.C.

    1999-07-01

    Computer visual simulations are used to portray proposed landscape changes with true color, photo-realistic quality, and high levels of accuracy and credibility. This sophisticated technology is a valuable tool for planners, landscape architects, architects, engineers, environmental consultants, government agencies, and private operators in the design and planning of surface mining operations. This paper presents examples of the application of computer visual simulations to large-scale projects in the western United States, including those that generally require an environmental impact statement under the National Environmental Policy Act of 1969 (e.g., open-pit coal mines, gold surface mines, highways and bridges, oil and gas development, and alpine ski areas). The presentation describes the development criteria, process, and use of computer visual simulations for these types of projects. The issues of computer visual simulation accuracy, bias, credibility, ethics, and realism are discussed, with emphasis on application in real-world situations. The use of computer visual simulations as a tool in the planning and design of these types of projects is presented, along with discussion of their use in project permitting and public involvement.

  12. Final Report for LDRD Project on Rapid Problem Setup for Mesh-Based Simulation (Rapsodi)

    SciTech Connect

    Brown, D L; Henshaw, W; Petersson, N A; Fast, P; Chand, K

    2003-02-07

    Under LLNL Exploratory Research LDRD funding, the Rapsodi project developed rapid setup technology for computational physics and engineering problems that require computational representations of complex geometry. Many simulation projects at LLNL involve the solution of partial differential equations in complex 3-D geometries. A significant bottleneck in carrying out these simulations arises in converting some specification of a geometry, such as a computer-aided design (CAD) drawing to a computationally appropriate 3-D mesh that can be used for simulation and analysis. Even using state-of-the-art mesh generation software, this problem setup step typically has required weeks or months, which is often much longer than required to carry out the computational simulation itself. The Rapsodi project built computational tools and designed algorithms that help to significantly reduce this setup time to less than a day for many realistic problems. The project targeted rapid setup technology for computational physics and engineering problems that use mixed-element unstructured meshes, overset meshes or Cartesian-embedded boundary (EB) meshes to represent complex geometry. It also built tools that aid in constructing computational representations of geometry for problems that do not require a mesh. While completely automatic mesh generation is extremely difficult, the amount of manual labor required can be significantly reduced. By developing novel, automated, component-based mesh construction procedures and automated CAD geometry repair and cleanup tools, Rapsodi has significantly reduced the amount of hand crafting required to generate geometry and meshes for scientific simulation codes.

  13. Fusion breeder

    SciTech Connect

    Moir, R.W.

    1982-04-20

    The fusion breeder is a fusion reactor designed with special blankets to maximize the transmutation by 14 MeV neutrons of uranium-238 to plutonium or thorium to uranium-233 for use as a fuel for fission reactors. Breeding fissile fuels has not been a goal of the US fusion energy program. This paper suggests it is time for a policy change to make the fusion breeder a goal of the US fusion program and the US nuclear energy program. The purpose of this paper is to suggest this policy change be made and tell why it should be made, and to outline specific research and development goals so that the fusion breeder will be developed in time to meet fissile fuel needs.

  14. Fusion breeder

    SciTech Connect

    Moir, R.W.

    1982-02-22

    The fusion breeder is a fusion reactor designed with special blankets to maximize the transmutation by 14 MeV neutrons of uranium-238 to plutonium or thorium to uranium-233 for use as a fuel for fission reactors. Breeding fissile fuels has not been a goal of the US fusion energy program. This paper suggests it is time for a policy change to make the fusion breeder a goal of the US fusion program and the US nuclear energy program. The purpose of this paper is to suggest this policy change be made and tell why it should be made, and to outline specific research and development goals so that the fusion breeder will be developed in time to meet fissile fuel needs.

  15. Information integration for data fusion

    SciTech Connect

    Bray, O.H.

    1997-01-01

    Data fusion has been identified by the Department of Defense as a critical technology for the U.S. defense industry. Data fusion requires combining expertise in two areas - sensors and information integration. Although data fusion is a rapidly growing area, there is little synergy and use of common, reusable, and/or tailorable objects and models, especially across different disciplines. The Laboratory-Directed Research and Development project had two purposes: to see if a natural language-based information modeling methodology could be used for data fusion problems, and if so, to determine whether this methodology would help identify commonalities across areas and achieve greater synergy. The project confirmed both of the initial hypotheses: that the natural language-based information modeling methodology could be used effectively in data fusion areas and that commonalities could be found that would allow synergy across various data fusion areas. The project found five common objects that are the basis for all of the data fusion areas examined: targets, behaviors, environments, signatures, and sensors. Many of the objects and the specific facts related to these objects were common across several areas and could easily be reused. In some cases, even the terminology remained the same. In other cases, different areas had their own terminology, but the concepts were the same. This commonality is important with the growing use of multisensor data fusion. Data fusion is much more difficult if each type of sensor uses its own objects and models rather than building on a common set. This report introduces data fusion, discusses how the synergy generated by this LDRD would have benefited an earlier successful project and contains a summary information model from that project, describes a preliminary management information model, and explains how information integration can facilitate cross-treaty synergy for various arms control treaties.
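The five common objects identified in the abstract (targets, behaviors, environments, signatures, and sensors) can be illustrated as a minimal information model. This is only a sketch of the idea of reusable, common objects; the class and field names are assumptions, not taken from the LDRD report.

```python
from dataclasses import dataclass, field

@dataclass
class Sensor:
    name: str
    modality: str                    # e.g. "radar", "seismic", "acoustic"

@dataclass
class Environment:
    terrain: str
    weather: str

@dataclass
class Signature:
    sensor: Sensor                   # the sensor that observes this signature
    features: dict = field(default_factory=dict)

@dataclass
class Behavior:
    description: str

@dataclass
class Target:
    identifier: str
    behaviors: list = field(default_factory=list)
    signatures: list = field(default_factory=list)
```

The point of such a shared model is that a Target aggregates its behaviors and the signatures different sensors produce, so new sensor types extend a common vocabulary rather than introducing their own objects.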

  16. A fusion of minds

    NASA Astrophysics Data System (ADS)

    Corfield, Richard

    2013-02-01

    Mystery still surrounds the visit of the astronomer Sir Bernard Lovell to the Soviet Union in 1963. But his collaboration - and that of other British scientists - eased geopolitical tensions at the height of the Cold War and paved the way for today's global ITER fusion project, as Richard Corfield explains.

  17. ITER Fusion Energy

    ScienceCinema

    Dr. Norbert Holtkamp

    2016-07-12

    ITER (in Latin “the way”) is designed to demonstrate the scientific and technological feasibility of fusion energy. Fusion is the process by which two light atomic nuclei combine to form a heavier one and thus release energy. In the fusion process two isotopes of hydrogen – deuterium and tritium – fuse together to form a helium atom and a neutron. Thus fusion could provide large scale energy production without greenhouse effects; essentially limitless fuel would be available all over the world. The principal goals of ITER are to generate 500 megawatts of fusion power for periods of 300 to 500 seconds with a fusion power multiplication factor, Q, of at least 10: Q ≥ 10 (input power 50 MW / output power 500 MW). The ITER Organization was officially established in Cadarache, France, on 24 October 2007. The seven members engaged in the project – China, the European Union, India, Japan, Korea, Russia and the United States – represent more than half the world’s population. The costs for ITER are shared by the seven members. The cost for the construction will be approximately 5.5 billion Euros; a similar amount is foreseen for the twenty-year phase of operation and the subsequent decommissioning.

  18. Final Report for "Community Petascale Project for Accelerator Science and Simulations".

    SciTech Connect

    Cary, J. R.; Bruhwiler, D. L.; Stoltz, P. H.; Cormier-Michel, E.; Cowan, B.; Schwartz, B. T.; Bell, G.; Paul, K.; Veitzer, S.

    2013-04-19

    This final report describes the work that has been accomplished over the past 5 years under the Community Petascale Project for Accelerator Science and Simulations (ComPASS) at Tech-X Corporation. Tech-X has been involved in the full range of ComPASS activities: simulation of laser plasma accelerator concepts, mainly in collaboration with the LOASIS program at LBNL; simulation of coherent electron cooling in collaboration with BNL; modeling of electron clouds in high-intensity accelerators, in collaboration with researchers at Fermilab; and accurate modeling of superconducting RF cavities in collaboration with Fermilab, JLab, and the Cockcroft Institute in the UK.

  19. Retrieval process development and enhancements project Fiscal year 1995: Simulant development technology task progress report

    SciTech Connect

    Golcar, G.R.; Bontha, J.R.; Darab, J.G.

    1997-01-01

    The mission of the Retrieval Process Development and Enhancements (RPD&E) project is to develop an understanding of retrieval processes, including emerging and existing technologies, gather data on these technologies, and relate the data to specific tank problems such that end-users have the requisite technical bases to make retrieval and closure decisions. The development of waste simulants is an integral part of this effort. The work of the RPD&E simulant-development task is described in this document. The key FY95 accomplishments of the RPD&E simulant-development task are summarized below.

  20. The Use of Physiological Indices in Simulation Research: A Report on Project CORES (Covert and Overt Responses to Educational Simulations). A Symposium.

    ERIC Educational Resources Information Center

    Dyrenfurth, Michael; And Others

    In two separate reports, the founding and setup of Project CORES were outlined, and then a specific research project was described. Project CORES began in the efforts of three men who felt a more systematic investigation of simulation effects was needed. The criteria felt to be most sensitive were the physiological activities of galvanic skin potential…

  1. Project Report on DOE Young Investigator Grant (Contract No. DE-FG02-02ER25525) Dynamic Scheduling and Fusion of Irregular Computation (August 15, 2002 to August 14, 2005)

    SciTech Connect

    Ding, Chen

    2005-08-16

    Computer simulation has become increasingly important in many scientific disciplines, but its performance and scalability are severely limited by the memory throughput on today's computer systems. With the support of this grant, we first designed training-based prediction, which accurately predicts the memory performance of large applications before their execution. Then we developed optimization techniques using dynamic computation fusion and large-scale data transformation. The research work has three major components. The first is modeling and prediction of cache behavior. We have developed a new technique, which uses reuse distance information from training inputs and then extracts a parameterized model of the program's cache miss rates for any input size and for any size of fully associative cache. Using the model we have built a web-based tool using three-dimensional visualization. The new model can help to build cost-effective computer systems, design better benchmark suites, and improve task scheduling on heterogeneous systems. The second component is global computation for improving cache performance. We have developed an algorithm for dynamic data partitioning using sampling theory and probability distribution. Recent work from a number of groups shows that manual or semi-manual computation fusion has significant benefits in physical, mechanical, and biological simulations as well as information retrieval and machine verification. We have developed an automatic tool that measures the potential of computation fusion. The new system can be used by high-performance application programmers to estimate the potential of locality improvement for a program before trying complex transformations for a specific cache system. The last component studies models of spatial locality and the problem of data layout. In scientific programs, most data are stored in arrays. Grand challenge problems such as hydrodynamics simulation and data mining may use
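The reuse distance underlying the cache model above is the number of distinct data items accessed between two uses of the same item; a fully associative LRU cache of size C hits exactly when the reuse distance is below C. A simple O(N²) sketch (illustrative only; practical measurement tools use tree-based O(N log N) algorithms):

```python
def reuse_distances(trace):
    """For each memory access in `trace`, return the number of distinct
    addresses touched since the previous access to the same address
    (infinity for a first-time access)."""
    last = {}      # address -> index of its most recent access
    dists = []
    for i, addr in enumerate(trace):
        if addr in last:
            # count distinct addresses between the two accesses (O(N) per access)
            dists.append(len(set(trace[last[addr] + 1 : i])))
        else:
            dists.append(float("inf"))
        last[addr] = i
    return dists
```

A histogram of these distances over a training input is the raw material from which a parameterized miss-rate model can be fitted.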

  2. The Virtual Liver Project: Simulating Tissue Injury Through Molecular and Cellular Processes

    EPA Science Inventory

    Efficiently and humanely testing the safety of thousands of environmental chemicals is a challenge. The US EPA Virtual Liver Project (v-Liver™) is aimed at simulating the effects of environmental chemicals computationally in order to estimate the risk of toxic outcomes in humans...

  3. The Virtual Liver Project: Modeling Tissue Response To Chemicals Through Multiscale Simulation

    EPA Science Inventory

    The US EPA Virtual Liver Project is aimed at simulating the risk of toxic effects from environmental chemicals in silico. The computational systems model of organ injury due to chronic chemical exposure is based on: (i) the dynamics of perturbed molecular pathways, (ii) their lin...

  4. Simulating Limb Formation in the U.S. EPA Virtual Embryo - Risk Assessment Project

    EPA Science Inventory

    The U.S. EPA’s Virtual Embryo project (v-Embryo™) is a computer model simulation of morphogenesis that integrates cell and molecular level data from mechanistic and in vitro assays with knowledge about normal development processes to assess in silico the effects of chemicals on d...

  5. Simulator verification effort at the South Texas project electric generating station

    SciTech Connect

    Bellmore, P.E.; Albury, C.R.

    1987-01-01

    This paper presents the work being done at Houston Lighting and Power Company to verify the South Texas Project Electric Generating Station (STPEGS) simulator. The purpose of that work is to assure that the STPEGS simulator adequately reflects plant response during normal and abnormal transients. The work also provides an enhanced understanding of the engineering and organizational needs of a simulator verification program. This paper presents the techniques used to develop a best-estimate model. The best-estimate model generates plant response data for comparison with the STPEGS simulator. A typical licensing model is inadequate for this work because of the conservative assumptions in the model. The authors examine, in this paper, the interaction between the various groups responsible for simulator verification.

  6. Image fusion

    NASA Technical Reports Server (NTRS)

    Pavel, M.

    1993-01-01

    The topics covered include the following: a system overview of the basic components of a system designed to improve the ability of a pilot to fly through low-visibility conditions such as fog; the role of visual sciences; fusion issues; sensor characterization; sources of information; image processing; and image fusion.

  7. Magnetized Target Fusion

    NASA Technical Reports Server (NTRS)

    Griffin, Steven T.

    2002-01-01

    Magnetized target fusion (MTF) is under consideration as a means of building a low mass, high specific impulse, and high thrust propulsion system for interplanetary travel. This unique combination is the result of the generation of a high temperature plasma by the nuclear fusion process. This plasma can then be deflected by magnetic fields to provide thrust. Fusion is initiated by a small fraction of the energy generated in the magnetic coils due to the plasma's compression of the magnetic field. The power gain from a fusion reaction is such that inefficiencies due to thermal neutrons and coil losses can be overcome. Since the fusion reaction products are directly used for propulsion and the power to initiate the reaction is directly obtained from the thrust generation, no massive power supply for energy conversion is required. The result should be a low engine mass, high specific impulse and high thrust system. The key is to successfully initiate fusion as a proof-of-principle for this application. Currently MSFC is implementing MTF proof-of-principle experiments. This involves many technical details and ancillary investigations. Of these, selected pertinent issues include the properties, orientation and timing of the plasma guns and the convergence and interface development of the "pusher" plasma. Computer simulations of the target plasma's behavior under compression and the convergence and mixing of the gun plasma are under investigation. This work focuses on gun characterization and development as it relates to plasma initiation and repeatability.

  8. Toward the credibility of Northeast United States summer precipitation projections in CMIP5 and NARCCAP simulations

    NASA Astrophysics Data System (ADS)

    Thibeault, Jeanne M.; Seth, A.

    2015-10-01

    Precipitation projections for the northeast United States and nearby Canada (Northeast) are examined for 15 Fifth Phase of the Coupled Model Intercomparison Project (CMIP5) models. A process-based evaluation of atmospheric circulation features associated with wet Northeast summers is performed to examine whether credibility can be differentiated within the multimodel ensemble. Based on these evaluations, and an analysis of the interannual statistical properties of area-averaged precipitation, model subsets were formed. Multimodel precipitation projections from each subset were compared to the multimodel projection from all of the models. Higher-resolution North American Regional Climate Change Assessment Program (NARCCAP) regional climate models (RCMs) were subjected to a similar evaluation, grouping into subsets, and examination of future projections. CMIP5 models adequately simulate most large-scale circulation features associated with wet Northeast summers, though all have errors in simulating observed sea level pressure and moisture divergence anomalies in the western tropical Atlantic/Gulf of Mexico. Relevant large-scale processes simulated by the RCMs resemble those of their driving global climate models (GCMs), which are not always realistic. Future RCM studies could benefit from a process analysis of potential driving GCMs prior to dynamical downscaling. No CMIP5 or NARCCAP models were identified as clearly more credible, but six GCMs and four RCMs performed consistently better. Among the "Better" models, there is no consistency in the direction of future summer precipitation change. CMIP5 projections suggest that the Northeast precipitation response depends on the dynamics of the North Atlantic anticyclone and associated circulation and moisture convergence patterns, which vary among "Better" models. Even when model credibility cannot be clearly differentiated, examination of simulated processes provides important insights into their evolution under

  9. Comparisons of Simulated Hydrodynamics and Water Quality for Projected Demands in 2046, Pueblo Reservoir, Southeastern Colorado

    USGS Publications Warehouse

    Ortiz, Roderick F.; Galloway, Joel M.; Miller, Lisa D.; Mau, David P.

    2008-01-01

    Pueblo Reservoir is one of southeastern Colorado's most valuable water resources. The reservoir provides irrigation, municipal, and industrial water to various entities throughout the region. The reservoir also provides flood control, recreational activities, sport fishing, and wildlife enhancement to the region. The Bureau of Reclamation is working to meet its goal to issue a Final Environmental Impact Statement (EIS) on the Southern Delivery System project (SDS). SDS is a regional water-delivery project that has been proposed to provide a safe, reliable, and sustainable water supply through the foreseeable future (2046) for Colorado Springs, Fountain, Security, and Pueblo West. Discussions with the Bureau of Reclamation and the U.S. Geological Survey led to a cooperative agreement to simulate the hydrodynamics and water quality of Pueblo Reservoir. This work has been completed and described in a previously published report, U.S. Geological Survey Scientific Investigations Report 2008-5056. Additionally, there was a need to make comparisons of simulated hydrodynamics and water quality for projected demands associated with the various EIS alternatives and plans by Pueblo West to discharge treated water into the reservoir. Plans by Pueblo West are fully independent of the SDS project. This report compares simulated hydrodynamics and water quality for projected demands in Pueblo Reservoir resulting from changes in inflow and water quality entering the reservoir, and from changes to withdrawals from the reservoir as projected for the year 2046. Four of the seven EIS alternatives were selected for scenario simulations. The four U.S. Geological Survey simulation scenarios were the No Action scenario (EIS Alternative 1), the Downstream Diversion scenario (EIS Alternative 2), the Upstream Return-Flow scenario (EIS Alternative 4), and the Upstream Diversion scenario (EIS Alternative 7). 
Additionally, the results of an Existing Conditions scenario (water years 2000 through

  10. The fusion of gerontology and technology in nursing education: History and demonstration of the Gerontological Informatics Reasoning Project--GRIP.

    PubMed

    Dreher, H Michael; Cornelius, Fran; Draper, Judy; Pitkar, Harshad; Manco, Janet; Song, Il-Yeol

    2006-01-01

    Phase I of our Gerontological Reasoning Informatics Project (GRIP) began in the summer of 2002 when all 37 senior undergraduate nursing students in our accelerated BSN nursing program were given PDAs. These students were oriented to use a digitalized geriatric nursing assessment tool embedded into their PDA in a variety of geriatric clinical agencies. This informatics project was developed to make geriatric nursing more technology oriented and focused on seven modules of geriatric assessment: intellect (I), nutrition (N), self-concept (S), physical activity (P), interpersonal functioning (I), restful sleep (R), and elimination (E)--INSPIRE. Through phase II and now phase III, the GRIP Project has become a major collaboration between the College of Nursing & Health Professions and College of Information Science and Technology at Drexel University. The digitalized geriatric nursing health assessment tool has undergone a second round of reliability and validity testing and is now used to conduct a 20 minute comprehensive geriatric health assessment on the PDA, making our undergraduate gerontology course the most high tech clinical course in our nursing curriculum.

  11. The Physics Education Technology Project: Web-based interactive simulations to support student learning

    NASA Astrophysics Data System (ADS)

    Adams, Wendy; Perkins, Kathy; Finkelstein, Noah; Lemaster, Ron; Reid, Sam; Dubson, Mike; Wieman, Carl

    2004-05-01

    We introduce the Physics Education Technology (PhET) Project^1,2, a new initiative to provide a suite of online tools for teaching and learning introductory physics at the high school and college levels. The project focuses on the development of relatively elaborate Java- and Flash-based animated simulations that are designed to help students develop visual and conceptual models of physical phenomena. We are also developing guiding questions that will utilize the simulation to address specific conceptual difficulties, help students experience the relationships among variables, and connect physics to real-world experiences and observations. These simulations create an interactive experience for the student that is designed to promote active thinking and encourage experimentation. We have implemented the simulations as lecture demonstrations, homework tools, a replacement for laboratory equipment, and as a preparation activity for class. We will present a summary of the simulations currently available and our preliminary research results on what makes a simulation effective and how it can be used effectively as an educational tool. 1. See http://www.colorado.edu/physics/phet 2. Supported by NSF, the Kavli Foundation, and CU.

  12. Post-processing of multi-hydrologic model simulations for improved streamflow projections

    NASA Astrophysics Data System (ADS)

    khajehei, sepideh; Ahmadalipour, Ali; Moradkhani, Hamid

    2016-04-01

    Hydrologic model outputs are prone to bias and uncertainty due to deficiencies in model structure and data. Uncertainty in hydroclimatic projections arises from the hydrologic model as well as from the epistemic or aleatory uncertainties in GCM parameterization and development. This study is conducted to: 1) evaluate the recently developed multivariate post-processing method for historical simulations, and 2) assess the effect of post-processing on the uncertainty and reliability of future streamflow projections in both high-flow and low-flow conditions. The first objective is performed for the historical period of 1970-1999. Future streamflow projections are generated for 10 statistically downscaled GCMs from two widely used downscaling methods: Bias Corrected Statistically Downscaled (BCSD) and Multivariate Adaptive Constructed Analogs (MACA), over the period 2010-2099 for two representative concentration pathways, RCP4.5 and RCP8.5. Three semi-distributed hydrologic models were employed and calibrated at 1/16 degree latitude-longitude resolution for over 100 points across the Columbia River Basin (CRB) in the Pacific Northwest USA. Streamflow outputs are post-processed through a Bayesian framework based on copula functions. The post-processing approach relies on a transfer function developed from the bivariate joint distribution between observations and simulations in the historical period. Results show that applying the post-processing technique leads to considerably higher accuracy in historical simulations and reduces model uncertainty in future streamflow projections.
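
    The copula-based transfer function described above can be sketched in miniature. The following stdlib-only Python assumes a Gaussian copula with empirical marginals; all names are illustrative, not the authors' code:

    ```python
    from bisect import bisect_right
    from statistics import NormalDist

    _N = NormalDist()

    def _normal_scores(xs, ref):
        # empirical-CDF transform of xs against a reference sample,
        # then probit transform to standard-normal scores
        ref_sorted, n = sorted(ref), len(ref)
        out = []
        for x in xs:
            u = bisect_right(ref_sorted, x) / (n + 1.0)
            out.append(_N.inv_cdf(min(max(u, 1e-6), 1 - 1e-6)))
        return out

    def _pearson(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        va = sum((x - ma) ** 2 for x in a)
        vb = sum((y - mb) ** 2 for y in b)
        return cov / (va * vb) ** 0.5

    def _empirical_quantile(ref, u):
        ref_sorted = sorted(ref)
        return ref_sorted[min(int(u * len(ref_sorted)), len(ref_sorted) - 1)]

    def copula_postprocess(sim_hist, obs_hist, sim_new):
        """Map simulated flows to observation space via the conditional
        distribution of obs given sim, fitted on the historical period."""
        z_sim = _normal_scores(sim_hist, sim_hist)
        z_obs = _normal_scores(obs_hist, obs_hist)
        rho = _pearson(z_sim, z_obs)  # Gaussian-copula dependence
        # conditional mean in normal-score space, back through obs marginal
        return [_empirical_quantile(obs_hist, _N.cdf(rho * z))
                for z in _normal_scores(sim_new, sim_hist)]
    ```

    Applied to a simulation biased high, the corrected series is pulled back onto the observed marginal while preserving rank order.
    
    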

  13. Asian summer monsoon onset in simulations and CMIP5 projections using four Chinese climate models

    NASA Astrophysics Data System (ADS)

    Zou, Liwei; Zhou, Tianjun

    2015-06-01

    The reproducibility and future changes of the onset of the Asian summer monsoon were analyzed based on the simulations and projections under the Representative Concentration Pathways (RCP) scenario in which anthropogenic emissions continue to rise throughout the 21st century (i.e. RCP8.5) by all realizations from four Chinese models that participated in the Coupled Model Intercomparison Project Phase 5 (CMIP5). Delayed onset of the monsoon over the Arabian Sea was evident in all simulations for present-day climate, which was associated with a too weak simulation of the low-level Somali jet in May. A consistent advanced onset of the monsoon was found only over the Arabian Sea in the projections, where the advanced onset of the monsoon was accompanied by an increase of rainfall and an anomalous anticyclone over the northern Indian Ocean. In all the models except FGOALS-g2, the enhanced low-level Somali jet transported more water vapor to the Arabian Sea, whereas in FGOALS-g2 the enhanced rainfall was determined more by the increased wind convergence. Furthermore, and again in all models except FGOALS-g2, the equatorial SST warming, with maximum increase over the eastern Pacific, enhanced convection in the central West Pacific and reduced convection over the eastern Indian Ocean and Maritime Continent region, which drove the anomalous anticyclonic circulation over the western Indian Ocean. In contrast, in FGOALS-g2, there was minimal (near-zero) warming of projected SST in the central equatorial Pacific, with decreased convection in the central West Pacific and enhanced convection over the Maritime Continent. The broader-scale differences among the models across the Pacific were related to both the differences in the projected SST pattern and in the present-day simulations.

  14. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks are to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND I Computer Program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
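
    The core loop is simple to sketch: sample each alternative's task-level random variables, combine them into the measure of preference, apply a cardinal utility function, and rank alternatives by expected utility. The Python below is a hypothetical illustration of that idea, not the SIMRAND I code itself:

    ```python
    import random

    def simrand_sketch(alternatives, utility, n_trials=20000, seed=42):
        """Rank alternative networks by Monte Carlo expected utility.
        `alternatives` maps a name to a list of task-cost samplers;
        `utility` is a cardinal utility function over total cost."""
        rng = random.Random(seed)
        expected = {}
        for name, task_samplers in alternatives.items():
            total = 0.0
            for _ in range(n_trials):
                cost = sum(sampler(rng) for sampler in task_samplers)
                total += utility(cost)
            expected[name] = total / n_trials
        best = max(expected, key=expected.get)
        return best, expected

    # illustrative alternatives: two tasks vs. one costlier task
    alts = {
        "path_a": [lambda r: r.uniform(1, 2), lambda r: r.uniform(1, 2)],
        "path_b": [lambda r: r.uniform(4, 6)],
    }
    best, scores = simrand_sketch(alts, lambda c: -c)  # risk-neutral utility
    ```

    Swapping in a concave utility function would express the decision makers' risk aversion without changing the sampling loop.
    
    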

  15. Description of convective-scale numerical weather simulation use in a flight simulator within the Flysafe project

    NASA Astrophysics Data System (ADS)

    Pradier-Vabre, S.; Forster, C.; Heesbeen, W. W. M.; Pagé, C.; Sénési, S.; Tafferner, A.; Bernard-Bouissières, I.; Caumont, O.; Drouin, A.; Ducrocq, V.; Guillou, Y.; Josse, P.

    2009-03-01

    Within the framework of the Flysafe project, dedicated tools aiming at improving flight safety are developed. In particular, efforts are directed towards the development of the Next Generation-Integrated Surveillance System (NG-ISS), i.e. a combination of new on-board systems and ground-based tools which provides the pilot with integrated information on three risks playing a major role in aircraft accidents: collision with another aircraft, collision with terrain, and adverse weather conditions. For the latter, Weather Information Management Systems (WIMSs) based on nowcasts of atmospheric hazards are developed. This paper describes the set-up of a test-bed for the NG-ISS incorporating two types of WIMS data, those related to aircraft in-flight icing and thunderstorm risks. The test-bed is based on convective-scale numerical simulations of a particular weather scenario with thunderstorms and icing in the area of the Innsbruck airport. Raw simulated fields as well as more elaborate diagnostics (synthetic reflectivity and satellite brightness temperature) feed both the flight simulator including the NG-ISS and the algorithms in charge of producing WIMS data. WIMS outputs based on the synthetic data are discussed, showing that the high-resolution simulated fields benefit the NG-ISS test-bed and support its technical feasibility.

  16. Final Project Report: Data Locality Enhancement of Dynamic Simulations for Exascale Computing

    SciTech Connect

    Shen, Xipeng

    2016-04-27

    The goal of this project is to develop a set of techniques and software tools to enhance the matching between memory accesses in dynamic simulations and the prominent features of modern and future manycore systems, alleviating the memory performance issues of exascale computing. In the first three years, the PI and his group achieved significant progress toward that goal, producing a set of novel techniques for improving memory performance and data locality in manycore systems, yielding 18 conference and workshop papers and 4 journal papers, and graduating 6 Ph.D. students. This report summarizes the research results of the project through that period.

  17. Cosmic rays Monte Carlo simulations for the Extreme Energy Events Project

    NASA Astrophysics Data System (ADS)

    Abbrescia, M.; Agocs, A.; Aiola, S.; Antolini, R.; Avanzini, C.; Baldini Ferroli, R.; Bencivenni, G.; Bossini, E.; Bressan, E.; Chiavassa, A.; Cicalò, C.; Cifarelli, L.; Coccia, E.; De Gruttola, D.; De Pasquale, S.; Di Giovanni, A.; D'Incecco, M.; Dreucci, M.; Fabbri, F. L.; Frolov, V.; Garbini, M.; Gemme, G.; Gnesi, I.; Gustavino, C.; Hatzifotiadou, D.; La Rocca, P.; Li, S.; Librizzi, F.; Maggiora, A.; Massai, M.; Miozzi, S.; Panareo, M.; Paoletti, R.; Perasso, L.; Pilo, F.; Piragino, G.; Regano, A.; Riggi, F.; Righini, G. C.; Sartorelli, G.; Scapparone, E.; Scribano, A.; Selvi, M.; Serci, S.; Siddi, E.; Spandre, G.; Squarcia, S.; Taiuti, M.; Tosello, F.; Votano, L.; Williams, M. C. S.; Yánez, G.; Zichichi, A.; Zuyeuski, R.

    2014-08-01

    The Extreme Energy Events Project (EEE Project) is an innovative experiment to study very high energy cosmic rays by means of the detection of the associated air shower muon component. It consists of a network of tracking detectors installed inside Italian High Schools. Each tracking detector, called EEE telescope, is composed of three Multigap Resistive Plate Chambers (MRPCs). At present, 43 telescopes are installed and taking data, opening the way for the detection of far-away coincidences over a total area of about 3 × 10⁵ km². In this paper we present the Monte Carlo simulations that have been performed to predict the expected coincidence rate between distant EEE telescopes.

  18. Beam dynamics simulations and measurements at the Project X Test Facility

    SciTech Connect

    Gianfelice-Wendt, E.; Scarpine, V.E.; Webber, R.C.; /Fermilab

    2011-03-01

    Project X, under study at Fermilab, is a multitask high-power superconducting RF proton beam facility, aiming to provide high intensity protons for rare processes experiments and nuclear physics at low energy, and simultaneously for the production of neutrinos, as well as muon beams in the long term. A beam test facility - formerly known as the High Intensity Neutrino Source (HINS) - is under commissioning for testing critical components of the project, e.g. dynamics and diagnostics at low beam energies, broadband beam chopping, and RF power generation and distribution. In this paper we describe the layout of the test facility and present beam dynamics simulations and measurements.

  19. A Hybrid Parachute Simulation Environment for the Orion Parachute Development Project

    NASA Technical Reports Server (NTRS)

    Moore, James W.

    2011-01-01

    A parachute simulation environment (PSE) has been developed that aims to take advantage of legacy parachute simulation codes and modern object-oriented programming techniques. This hybrid simulation environment provides the parachute analyst with a natural and intuitive way to construct simulation tasks while preserving the pedigree and authority of established parachute simulations. NASA currently employs four simulation tools for developing and analyzing air-drop tests performed by the CEV Parachute Assembly System (CPAS) Project. These tools were developed at different times, in different languages, and with different capabilities in mind. As a result, each tool has a distinct interface and set of inputs and outputs. However, regardless of the simulation code that is most appropriate for the type of test, engineers typically perform similar tasks for each drop test such as prediction of loads, assessment of altitude, and sequencing of disreefs or cut-aways. An object-oriented approach to simulation configuration allows the analyst to choose models of real physical test articles (parachutes, vehicles, etc.) and sequence them to achieve the desired test conditions. Once configured, these objects are translated into traditional input lists and processed by the legacy simulation codes. This approach minimizes the number of simulation inputs that the engineer must track while configuring an input file. An object-oriented approach to simulation output allows a common set of post-processing functions to perform routine tasks such as plotting and timeline generation with minimal sensitivity to the simulation that generated the data. Flight test data may also be translated into the common output class to simplify test reconstruction and analysis.
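
    The translation step described above can be sketched as follows. Every class, field, and keyword here is hypothetical (the CPAS codes' actual input formats are not public in this abstract); the sketch only shows the pattern of flattening an object graph into a legacy input list:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Parachute:
        name: str
        drag_area_ft2: float   # hypothetical full-open drag area
        deploy_time_s: float

    @dataclass
    class DropTest:
        vehicle_mass_lb: float
        release_alt_ft: float
        parachutes: list = field(default_factory=list)

        def to_legacy_inputs(self):
            # flatten the object graph into the ordered (keyword, value)
            # pairs a fixed-format legacy input file would expect
            rows = [("MASS", self.vehicle_mass_lb),
                    ("ALT0", self.release_alt_ft)]
            for i, p in enumerate(self.parachutes, start=1):
                rows += [(f"CDS{i}", p.drag_area_ft2),
                         (f"TDEP{i}", p.deploy_time_s)]
            return rows

    drop = DropTest(21000.0, 25000.0,
                    [Parachute("drogue", 70.0, 1.5),
                     Parachute("main", 5300.0, 10.0)])
    legacy_inputs = drop.to_legacy_inputs()
    ```

    The analyst works only with the `DropTest` object; the keyword list is generated, so re-sequencing parachutes never requires hand-editing the input file.
    
    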

  20. Early Career. Harnessing nanotechnology for fusion plasma-material interface research in an in-situ particle-surface interaction facility

    SciTech Connect

    Allain, Jean Paul

    2014-08-08

    This project consisted of fundamental and applied research of advanced in-situ particle-beam interactions with surfaces/interfaces to discover novel materials able to tolerate intense conditions at the plasma-material interface (PMI) in future fusion burning plasma devices. The project established a novel facility that is capable of not only characterizing new fusion nanomaterials but, more importantly, probing and manipulating materials at the nanoscale while performing subsequent single-effect in-situ testing of their performance under simulated fusion PMI environments.

  1. Integrated Vehicle Health Management Project-Modeling and Simulation for Wireless Sensor Applications

    NASA Technical Reports Server (NTRS)

    Wallett, Thomas M.; Mueller, Carl H.; Griner, James H., Jr.

    2009-01-01

    This paper describes the efforts in modeling and simulating electromagnetic transmission and reception as in a wireless sensor network through a realistic wing model for the Integrated Vehicle Health Management project at the Glenn Research Center. A computer model in a standard format for an S-3 Viking aircraft was obtained, converted to a Microwave Studio software format, and scaled to proper dimensions in Microwave Studio. The left wing portion of the model was used with two antenna models, one transmitting and one receiving, to simulate radio frequency transmission through the wing. Transmission and reception results were inconclusive.

  2. Fusion Power.

    ERIC Educational Resources Information Center

    Dingee, David A.

    1979-01-01

    Discusses the extraordinary potential, the technical difficulties, and the financial problems that are associated with research and development of fusion power plants as a major source of energy. (GA)

  3. Hardware Accelerated Simulated Radiography

    SciTech Connect

    Laney, D; Callahan, S; Max, N; Silva, C; Langer, S; Frank, R

    2005-04-12

    We present the application of hardware accelerated volume rendering algorithms to the simulation of radiographs as an aid to scientists designing experiments, validating simulation codes, and understanding experimental data. The techniques presented take advantage of 32 bit floating point texture capabilities to obtain validated solutions to the radiative transport equation for X-rays. An unsorted hexahedron projection algorithm is presented for curvilinear hexahedra that produces simulated radiographs in the absorption-only regime. A sorted tetrahedral projection algorithm is presented that simulates radiographs of emissive materials. We apply the tetrahedral projection algorithm to the simulation of experimental diagnostics for inertial confinement fusion experiments on a laser at the University of Rochester. We show that the hardware accelerated solution is faster than the current technique used by scientists.
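
    In the absorption-only regime mentioned above, the radiative transport equation along a ray reduces to Beer-Lambert exponential attenuation of the source intensity, which is why cell contributions can be accumulated in any order (and hence why the hexahedron projection can run unsorted). A minimal per-ray sketch, with the GPU texture and projection machinery omitted:

    ```python
    import math

    def ray_intensity(segments, i0=1.0):
        """Absorption-only radiograph sample: attenuate intensity i0
        through (absorption_coefficient, path_length) pairs for the cells
        a ray crosses.  The sum of mu*dl is the optical depth, so the
        result is independent of traversal order."""
        tau = sum(mu * dl for mu, dl in segments)
        return i0 * math.exp(-tau)
    ```

    Emissive materials break this order independence, since each cell adds light that is attenuated only by the material in front of it; that is what forces the sorted tetrahedral projection in the second algorithm.
    
    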

  4. Overview of High-Fidelity Modeling Activities in the Numerical Propulsion System Simulations (NPSS) Project

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    2002-01-01

    A high-fidelity simulation of a commercial turbofan engine has been created as part of the Numerical Propulsion System Simulation Project. The high-fidelity computer simulation utilizes computer models that were developed at NASA Glenn Research Center in cooperation with turbofan engine manufacturers. The average-passage (APNASA) Navier-Stokes based viscous flow computer code is used to simulate the 3D flow in the compressors and turbines of the advanced commercial turbofan engine. The 3D National Combustion Code (NCC) is used to simulate the flow and chemistry in the advanced aircraft combustor. The APNASA turbomachinery code and the NCC combustor code exchange boundary conditions at the interface planes at the combustor inlet and exit. This computer simulation technique can evaluate engine performance at steady operating conditions. The 3D flow models provide detailed knowledge of the airflow within the fan and compressor, the high and low pressure turbines, and the flow and chemistry within the combustor. The models simulate the performance of the engine at operating conditions that include sea level takeoff and the altitude cruise condition.

  5. Modelling and Simulation of National Electronic Product Code Network Demonstrator Project

    NASA Astrophysics Data System (ADS)

    Mo, John P. T.

    The National Electronic Product Code (EPC) Network Demonstrator Project (NDP) was the first large scale consumer goods track and trace investigation in the world using the full EPC protocol system for applying RFID technology in supply chains. The NDP demonstrated the methods of sharing information securely using the EPC Network, providing authentication to interacting parties, and enhancing the ability to track and trace movement of goods within the entire supply chain involving transactions among multiple enterprises. Due to project constraints, the actual run of the NDP lasted only 3 months and could not produce consolidated quantitative results. This paper discusses the modelling and simulation of activities in the NDP in a discrete event simulation environment and provides an estimation of the potential benefits that could be derived from the NDP if it were continued for one whole year.
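
    A discrete event simulation of this kind reduces to an event queue ordered by time, where each handler may schedule follow-on events. The sketch below is illustrative only (the paper's model and event types are not specified here); the event kinds are hypothetical:

    ```python
    import heapq

    def run_des(initial_events, handlers):
        """Minimal discrete-event loop: events are (time, kind, payload)
        tuples kept in a time-ordered heap; each handler returns the list
        of follow-on events it schedules."""
        queue = list(initial_events)
        heapq.heapify(queue)
        log = []
        while queue:
            t, kind, payload = heapq.heappop(queue)
            log.append((t, kind))
            handler = handlers.get(kind, lambda t, p: [])
            for ev in handler(t, payload):
                heapq.heappush(queue, ev)
        return log

    # hypothetical supply-chain fragment: a pallet arrives, then its
    # RFID tag is read one time unit later
    handlers = {
        "arrive": lambda t, p: [(t + 1.0, "rfid_read", p)],
        "rfid_read": lambda t, p: [],
    }
    event_log = run_des([(0.0, "arrive", "pallet-1")], handlers)
    ```

    Annual benefit estimates then come from running the same loop over a year of synthetic arrivals and aggregating the logged events.
    
    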

  6. Fusion of cone-beam CT and 3D photographic images for soft tissue simulation in maxillofacial surgery

    NASA Astrophysics Data System (ADS)

    Chung, Soyoung; Kim, Joojin; Hong, Helen

    2016-03-01

    During maxillofacial surgery, prediction of the facial outcome after surgery is a main concern for both surgeons and patients. However, registration of facial CBCT images and 3D photographic images is difficult: regions around the eyes and mouth are affected by facial expressions, and registration is slow due to the dense point clouds on the surfaces. Therefore, we propose a framework for the fusion of facial CBCT images and 3D photos with skin segmentation and two-stage surface registration. Our method is composed of three major steps. First, to obtain a CBCT skin surface for the registration with the 3D photographic surface, skin is automatically segmented from CBCT images and the skin surface is generated by surface modeling. Second, to roughly align the scale and the orientation of the CBCT skin surface and 3D photographic surface, point-based registration with four corresponding landmarks located around the mouth is performed. Finally, to merge the CBCT skin surface and 3D photographic surface, Gaussian-weight-based surface registration is performed within a narrow band of the 3D photographic surface.
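
    The second step (rough scale and orientation alignment from four landmark pairs) can be sketched as a similarity-transform fit. The code below uses the standard Umeyama construction as a stand-in, assuming numpy; it is an illustration of the technique, not the authors' implementation:

    ```python
    import numpy as np

    def similarity_from_landmarks(src, dst):
        """Fit scale s, rotation R, translation t with dst ≈ s*R@src + t
        from paired landmark rows (3-D points), via the Umeyama method."""
        src, dst = np.asarray(src, float), np.asarray(dst, float)
        mu_s, mu_d = src.mean(0), dst.mean(0)
        xs, xd = src - mu_s, dst - mu_d
        cov = xd.T @ xs / len(src)              # cross-covariance
        U, D, Vt = np.linalg.svd(cov)
        S = np.eye(3)
        if np.linalg.det(U) * np.linalg.det(Vt) < 0:
            S[2, 2] = -1.0                      # guard against reflections
        R = U @ S @ Vt
        var_s = (xs ** 2).sum() / len(src)
        s = np.trace(np.diag(D) @ S) / var_s    # least-squares scale
        t = mu_d - s * R @ mu_s
        return s, R, t
    ```

    Four well-spread, non-coplanar landmarks are enough to determine the transform; the result then seeds the finer Gaussian-weighted surface registration.
    
    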

  7. Haughton-Mars Project (HMP)/NASA 2006 Lunar Medical Contingency Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Scheuring, R. A.; Jones, J. A.; Lee, P.; Comtois, J. M.; Chappell, S.; Rafiq, A.; Braham, S.; Hodgson, E.; Sullivan, P.; Wilkinson, N.

    2006-01-01

    Medical requirements are currently being developed for NASA's space exploration program. Lunar surface operations for crews returning to the moon will be performed on a daily basis to conduct scientific research and construct a lunar habitat. Inherent to aggressive surface activities is the potential risk of injury to crew members. To develop an evidence base for handling medical contingencies on the lunar surface, a simulation project was conducted using the moon-Mars analog environment at Devon Island, Nunavut, in the high Canadian Arctic. A review of the Apollo lunar surface activities and personal communications with Apollo lunar crew members provided a knowledge base of plausible scenarios that could potentially injure an astronaut during a lunar extravehicular activity. Objectives were established to 1) demonstrate stabilization, field extraction, and transfer of an injured crew member to the habitat and 2) evaluate audio, visual, and biomedical communication capabilities with ground controllers at multiple mission control centers. The simulation project's objectives were achieved. Among these objectives were 1) extracting a crew member from a sloped terrain by a two-member team in a 1-g analog environment, 2) establishing real-time communication to multiple space centers, 3) providing biomedical data to flight controllers and crew members, and 4) establishing a medical diagnosis and treatment plan from a remote site. The simulation project provided evidence for the types of equipment and methods needed for planetary space exploration. During the project, the crew members were confronted with a number of unexpected scenarios including environmental, communications, EVA suit, and navigation challenges. These trials provided insight into the challenges of carrying out a medical contingency in an austere environment. The knowledge gained from completing the objectives of this project will be incorporated into the exploration medical requirements involving an incapacitated crew member.

  8. Hanford Waste Simulants Created to Support the Research and Development on the River Protection Project - Waste Treatment Plant

    SciTech Connect

    Eibling, R.E.

    2001-07-26

    The development of nonradioactive waste simulants to support the River Protection Project - Waste Treatment Plant bench and pilot-scale testing is crucial to the design of the facility. This report documents the simulant development performed to support the SRTC programs and the strategies used to produce the simulants.

  9. The diffusive finite state projection algorithm for efficient simulation of the stochastic reaction-diffusion master equation

    PubMed Central

    Drawert, Brian; Lawson, Michael J.; Petzold, Linda; Khammash, Mustafa

    2010-01-01

    We have developed a computational framework for accurate and efficient simulation of stochastic spatially inhomogeneous biochemical systems. The new computational method employs a fractional step hybrid strategy. A novel formulation of the finite state projection (FSP) method, called the diffusive FSP method, is introduced for the efficient and accurate simulation of diffusive transport. Reactions are handled by the stochastic simulation algorithm. PMID:20170209
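
    The reaction step named above, the stochastic simulation algorithm (SSA), is Gillespie's direct method: draw an exponential waiting time from the total propensity, then pick a reaction in proportion to its propensity. A self-contained illustrative version (the paper's diffusive-FSP coupling is not reproduced here):

    ```python
    import random

    def ssa(x0, reactions, t_end, seed=7):
        """Gillespie direct-method SSA.  `reactions` is a list of
        (propensity_fn, state_change) pairs; state is a list of counts."""
        rng = random.Random(seed)
        x, t = list(x0), 0.0
        while True:
            props = [a(x) for a, _ in reactions]
            a0 = sum(props)
            if a0 <= 0.0:                   # no reaction can fire
                return t, x
            t += rng.expovariate(a0)        # time to next reaction
            if t >= t_end:
                return t_end, x
            u, acc = rng.random() * a0, 0.0
            for p, (_, change) in zip(props, reactions):
                acc += p
                if u < acc:                 # fire this reaction
                    for i, dx in enumerate(change):
                        x[i] += dx
                    break
    ```

    For example, pure decay A → ∅ with propensity 0.1·A drives the count monotonically to zero; the fractional-step scheme in the paper alternates steps of such reaction sampling with the diffusive FSP transport step.
    
    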

  10. Towards the petaflop for Lattice QCD simulations the PetaQCD project

    NASA Astrophysics Data System (ADS)

    Anglès d'Auriac, Jean-Christian; Barthou, Denis; Becirevic, Damir; Bilhaut, René; Bodin, François; Boucaud, Philippe; Brand-Foissac, Olivier; Carbonell, Jaume; Eisenbeis, Christine; Gallard, Pascal; Grosdidier, Gilbert; Guichon, Pierre; Honoré, Pierre-François; Le Meur, Guy; Pène, Olivier; Rilling, Louis; Roudeau, Patrick; Seznec, André; Stocchi, Achille; Touze, François

    2010-04-01

    The study and design of a very ambitious petaflop cluster exclusively dedicated to Lattice QCD simulations started in early '08 among a consortium of 7 laboratories (IN2P3, CNRS, INRIA, CEA) and 2 SMEs. This consortium received a grant from the French ANR agency in July '08, and the PetaQCD project kickoff took place in January '09. Building upon several years of fruitful collaborative studies in this area, the aim of this project is to demonstrate that the simulation of a 256 × 128³ lattice can be achieved through the HMC/ETMC software, using a machine with efficient speed/cost/reliability/power consumption ratios. It is expected that this machine can be built out of a rather limited number of processors (e.g. between 1000 and 4000), although capable of a sustained petaflop CPU performance. The proof-of-concept should be a mock-up cluster built as much as possible with off-the-shelf components, and two particularly attractive axes will be mainly investigated, in addition to fast all-purpose multi-core processors: the use of the new brand of IBM-Cell processors (with on-chip accelerators) and the very recent Nvidia GP-GPUs (off-chip co-processors). This cluster will obviously be massively parallel, and heterogeneous. Communication issues between processors, implied by the Physics of the simulation and the lattice partitioning, will certainly be a major key to the project.

  11. High-resolution global climate modelling: the UPSCALE project, a large-simulation campaign

    NASA Astrophysics Data System (ADS)

    Mizielinski, M. S.; Roberts, M. J.; Vidale, P. L.; Schiemann, R.; Demory, M.-E.; Strachan, J.; Edwards, T.; Stephens, A.; Lawrence, B. N.; Pritchard, M.; Chiu, P.; Iwi, A.; Churchill, J.; del Cano Novales, C.; Kettleborough, J.; Roseblade, W.; Selwood, P.; Foster, M.; Glover, M.; Malcolm, A.

    2014-08-01

    The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985-2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km) as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present-climate simulations a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core-hour, single-year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Environmental Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves and include details of the model configuration and the composition of the UPSCALE data set. This data set is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.

  12. Ion-kinetic simulations of D-3He gas-filled inertial confinement fusion target implosions with moderate to large Knudsen number

    NASA Astrophysics Data System (ADS)

    Larroche, O.; Rinderknecht, H. G.; Rosenberg, M. J.; Hoffman, N. M.; Atzeni, S.; Petrasso, R. D.; Amendt, P. A.; Séguin, F. H.

    2016-01-01

    Experiments designed to investigate the transition to non-collisional behavior in D3He-gas inertial confinement fusion target implosions display increasingly large discrepancies with respect to simulations by standard hydrodynamics codes as the expected ion mean-free-paths λc increase with respect to the target radius R (i.e., when the Knudsen number NK=λc/R grows). To properly take large NK's into account, multi-ion-species Vlasov-Fokker-Planck computations of the inner gas in the capsules have been performed, for two different values of NK, one moderate and one large. The results, including nuclear yield, reactivity-weighted ion temperatures, nuclear emissivities, and surface brightness, have been compared with the experimental data and with the results of hydrodynamical simulations, some of which include an ad hoc model of kinetic effects. The experimental results are quite accurately rendered by the kinetic calculations in the smaller-NK case, much better than by the hydrodynamical calculations. The kinetic effects at play in this case are thus correctly understood. However, in the higher-NK case, the agreement is much worse. The remaining discrepancies are shown to arise from kinetic phenomena (e.g., inter-species diffusion) occurring at the gas-pusher interface, which should be investigated in future work.

  13. Ion-kinetic simulations of D-3He gas-filled inertial confinement fusion target implosions with moderate to large Knudsen number

    DOE PAGES

    Larroche, O.; Rinderknecht, H. G.; Rosenberg, M. J.; ...

    2016-01-06

    Experiments designed to investigate the transition to non-collisional behavior in D3He-gas inertial confinement fusion target implosions display increasingly large discrepancies with respect to simulations by standard hydrodynamics codes as the expected ion mean-free-paths λc increase with respect to the target radius R (i.e., when the Knudsen number NK = λc/R grows). To take properly into account large NK's, multi-ion-species Vlasov-Fokker-Planck computations of the inner gas in the capsules have been performed, for two different values of NK, one moderate and one large. The results, including nuclear yield, reactivity-weighted ion temperatures, nuclear emissivities, and surface brightness, have been compared with the experimental data and with the results of hydrodynamical simulations, some of which include an ad hoc modeling of kinetic effects. The experimental results are quite accurately rendered by the kinetic calculations in the smaller-NK case, much better than by the hydrodynamical calculations. The kinetic effects at play in this case are thus correctly understood. However, in the higher-NK case, the agreement is much worse. Furthermore, the remaining discrepancies are shown to arise from kinetic phenomena (e.g., inter-species diffusion) occurring at the gas-pusher interface, which should be investigated in future work.

  14. Ion-kinetic simulations of D-3He gas-filled inertial confinement fusion target implosions with moderate to large Knudsen number

    SciTech Connect

    Larroche, O.; Rinderknecht, H. G.; Rosenberg, M. J.; Hoffman, N. M.; Atzeni, S.; Petrasso, R. D.; Amendt, P. A.; Seguin, F. H.

    2016-01-06

    Experiments designed to investigate the transition to non-collisional behavior in D3He-gas inertial confinement fusion target implosions display increasingly large discrepancies with respect to simulations by standard hydrodynamics codes as the expected ion mean-free-paths λc increase with respect to the target radius R (i.e., when the Knudsen number NK = λc/R grows). To take properly into account large NK's, multi-ion-species Vlasov-Fokker-Planck computations of the inner gas in the capsules have been performed, for two different values of NK, one moderate and one large. The results, including nuclear yield, reactivity-weighted ion temperatures, nuclear emissivities, and surface brightness, have been compared with the experimental data and with the results of hydrodynamical simulations, some of which include an ad hoc modeling of kinetic effects. The experimental results are quite accurately rendered by the kinetic calculations in the smaller-NK case, much better than by the hydrodynamical calculations. The kinetic effects at play in this case are thus correctly understood. However, in the higher-NK case, the agreement is much worse. Furthermore, the remaining discrepancies are shown to arise from kinetic phenomena (e.g., inter-species diffusion) occurring at the gas-pusher interface, which should be investigated in future work.

  15. Education and Public Outreach at The Pavilion Lake Research Project: Fusion of Science and Education using Web 2.0

    NASA Astrophysics Data System (ADS)

    Cowie, B. R.; Lim, D. S.; Pendery, R.; Laval, B.; Slater, G. F.; Brady, A. L.; Dearing, W. L.; Downs, M.; Forrest, A.; Lees, D. S.; Lind, R. A.; Marinova, M.; Reid, D.; Seibert, M. A.; Shepard, R.; Williams, D.

    2009-12-01

    The Pavilion Lake Research Project (PLRP) is an international multi-disciplinary science and exploration effort to explain the origin and preservation potential of freshwater microbialites in Pavilion Lake, British Columbia, Canada. Using multiple exploration platforms including one-person DeepWorker submersibles, Autonomous Underwater Vehicles, and SCUBA divers, the PLRP acts as an analogue research site for conducting science in extreme environments, such as the Moon or Mars. In 2009, the PLRP integrated several Web 2.0 technologies to provide a pilot-scale Education and Public Outreach (EPO) program targeting the internet-savvy generation. The seamless integration of multiple technologies including Google Earth, Wordpress, Youtube, Twitter and Facebook facilitated the rapid distribution of exciting and accessible science and exploration information over multiple channels. Field updates, science reports, and multimedia including videos, interactive maps, and immersive visualization were rapidly available through multiple social media channels, partly due to the ease of integration of these multiple technologies. Additionally, the successful application of videoconferencing via a readily available technology (Skype) has greatly increased the capacity of our team to conduct real-time education and public outreach from remote locations. The improved communication afforded by Web 2.0 has increased the quality of EPO provided by the PLRP, and has enabled a higher level of interaction between the science team and the community at large. Feedback from these online interactions suggests that remote communication via Web 2.0 technologies is an effective tool for increasing public discourse and awareness of the science and exploration activity at Pavilion Lake.

  16. High Level Information Fusion (HLIF) with nested fusion loops

    NASA Astrophysics Data System (ADS)

    Woodley, Robert; Gosnell, Michael; Fischer, Amber

    2013-05-01

    Situation modeling and threat prediction require higher levels of data fusion in order to provide actionable information. Beyond the sensor data and sources the analyst has access to, the use of out-sourced and re-sourced data is becoming common. Through the years, some common frameworks have emerged for dealing with information fusion—perhaps the most ubiquitous being the JDL Data Fusion Group and their initial 4-level data fusion model. Since these initial developments, numerous models of information fusion have emerged, hoping to better capture the human-centric process of data analyses within a machine-centric framework. 21st Century Systems, Inc. has developed Fusion with Uncertainty Reasoning using Nested Assessment Characterizer Elements (FURNACE) to address challenges of high level information fusion and handle bias, ambiguity, and uncertainty (BAU) for Situation Modeling, Threat Modeling, and Threat Prediction. It combines JDL fusion levels with nested fusion loops and state-of-the-art data reasoning. Initial research has shown that FURNACE is able to reduce BAU and improve the fusion process by allowing high level information fusion (HLIF) to affect lower levels without the double counting of information or other biasing issues. The initial FURNACE project was focused on the underlying algorithms to produce a fusion system able to handle BAU and repurposed data in a cohesive manner. FURNACE supports analyst's efforts to develop situation models, threat models, and threat predictions to increase situational awareness of the battlespace. FURNACE will not only revolutionize the military intelligence realm, but also benefit the larger homeland defense, law enforcement, and business intelligence markets.

  17. A National Collaboratory to Advance the Science of High Temperature Plasma Physics for Magnetic Fusion

    SciTech Connect

    Schissel, David P.; Abla, G.; Burruss, J. R.; Feibush, E.; Fredian, T. W.; Goode, M. M.; Greenwald, M. J.; Keahey, K.; Leggett, T.; Li, K.; McCune, D. C.; Papka, M. E.; Randerson, L.; Sanderson, A.; Stillerman, J.; Thompson, M. R.; Uram, T.; Wallace, G.

    2012-12-20

    This report summarizes the work of the National Fusion Collaboratory (NFC) Project to develop a persistent infrastructure to enable scientific collaboration for magnetic fusion research. The original objective of the NFC project was to develop and deploy a national FES Grid (FusionGrid) that would be a system for secure sharing of computation, visualization, and data resources over the Internet. The goal of FusionGrid was to allow scientists at remote sites to participate as fully in experiments and computational activities as if they were working on site, thereby creating a unified virtual organization of the geographically dispersed U.S. fusion community. The vision for FusionGrid was that experimental and simulation data, computer codes, analysis routines, visualization tools, and remote collaboration tools are to be thought of as network services. In this model, an application service provider (ASP) provides and maintains software resources as well as the necessary hardware resources. The project would create a robust, user-friendly collaborative software environment and make it available to the US FES community. This Grid's resources would be protected by a shared security infrastructure including strong authentication to identify users and authorization to allow stakeholders to control their own resources. In this environment, access to services is stressed rather than data or software portability.

  18. Hybrid-view programming of nuclear fusion simulation code in the PGAS parallel programming language XcalableMP

    SciTech Connect

    Tsugane, Keisuke; Boku, Taisuke; Murai, Hitoshi; Sato, Mitsuhisa; Tang, William; Wang, Bei

    2016-06-01

    Recently, the Partitioned Global Address Space (PGAS) parallel programming model has emerged as a usable distributed memory programming model. XcalableMP (XMP) is a PGAS parallel programming language that extends base languages such as C and Fortran with directives in OpenMP-like style. XMP supports a global-view model that allows programmers to define global data and to map them to a set of processors, which execute the distributed global data as a single thread. In XMP, the concept of a coarray is also employed for local-view programming. In this study, we port Gyrokinetic Toroidal Code - Princeton (GTC-P), which is a three-dimensional gyrokinetic PIC code developed at Princeton University to study the microturbulence phenomenon in magnetically confined fusion plasmas, to XMP as an example of hybrid memory model coding with the global-view and local-view programming models. In local-view programming, the coarray notation is simple and intuitive compared with Message Passing Interface (MPI) programming while the performance is comparable to that of the MPI version. Thus, because the global-view programming model is suitable for expressing the data parallelism for a field of grid space data, we implement a hybrid-view version using a global-view programming model to compute the field and a local-view programming model to compute the movement of particles. Finally, the performance is degraded by 20% compared with the original MPI version, but the hybrid-view version facilitates more natural data expression for static grid space data (in the global-view model) and dynamic particle data (in the local-view model), and it also increases the readability of the code for higher productivity.

  19. Hybrid-view programming of nuclear fusion simulation code in the PGAS parallel programming language XcalableMP

    DOE PAGES

    Tsugane, Keisuke; Boku, Taisuke; Murai, Hitoshi; ...

    2016-06-01

    Recently, the Partitioned Global Address Space (PGAS) parallel programming model has emerged as a usable distributed memory programming model. XcalableMP (XMP) is a PGAS parallel programming language that extends base languages such as C and Fortran with directives in OpenMP-like style. XMP supports a global-view model that allows programmers to define global data and to map them to a set of processors, which execute the distributed global data as a single thread. In XMP, the concept of a coarray is also employed for local-view programming. In this study, we port Gyrokinetic Toroidal Code - Princeton (GTC-P), which is a three-dimensional gyrokinetic PIC code developed at Princeton University to study the microturbulence phenomenon in magnetically confined fusion plasmas, to XMP as an example of hybrid memory model coding with the global-view and local-view programming models. In local-view programming, the coarray notation is simple and intuitive compared with Message Passing Interface (MPI) programming while the performance is comparable to that of the MPI version. Thus, because the global-view programming model is suitable for expressing the data parallelism for a field of grid space data, we implement a hybrid-view version using a global-view programming model to compute the field and a local-view programming model to compute the movement of particles. Finally, the performance is degraded by 20% compared with the original MPI version, but the hybrid-view version facilitates more natural data expression for static grid space data (in the global-view model) and dynamic particle data (in the local-view model), and it also increases the readability of the code for higher productivity.

  20. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    SciTech Connect

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix

    1999-10-01

    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.
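The core idea, sampling the input space of component properties rather than simulating a single nominal point, can be sketched with a Monte Carlo sweep. The 5%-tolerance resistor divider below is a made-up example, not a circuit from the report:

```python
# Minimal sketch of variability analysis: sample component values within
# their tolerance bands and observe the spread of the circuit output.
import random

random.seed(0)
NOMINAL_R1, NOMINAL_R2, TOL = 1000.0, 2000.0, 0.05  # ohms, 5% tolerance
V_IN = 5.0                                          # volts

samples = []
for _ in range(10000):
    r1 = random.uniform(NOMINAL_R1 * (1 - TOL), NOMINAL_R1 * (1 + TOL))
    r2 = random.uniform(NOMINAL_R2 * (1 - TOL), NOMINAL_R2 * (1 + TOL))
    samples.append(V_IN * r2 / (r1 + r2))  # divider output

mean = sum(samples) / len(samples)
spread = max(samples) - min(samples)
print(f"mean V_out = {mean:.3f} V, worst-case spread = {spread:.3f} V")
```

The spread, not the nominal 3.333 V, is what determines design margin.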

  1. Fusion - An energy source for synthetic fuels

    NASA Astrophysics Data System (ADS)

    Fillo, J. A.; Powell, J.; Steinberg, M.

    1980-05-01

    An important first step in the synthesis of liquid and gaseous fuels is the production of hydrogen. Thermonuclear fusion offers an inexhaustible source of energy for the production of hydrogen from water. Depending on design, electric generation efficiencies of 40 to 60% and hydrogen production efficiencies by high temperature electrolysis of 50 to 70% are projected for fusion reactors using high temperature blankets. Fusion/coal symbiotic systems appear economically promising for the first generation of commercial fusion synfuels plants. In the long term, there could be a gradual transition to an inexhaustible energy system based solely on fusion.

  2. (Fusion energy research)

    SciTech Connect

    Phillips, C.A.

    1988-01-01

    This report discusses the following topics: principal parameters achieved in experimental devices (FY88); tokamak fusion test reactor; Princeton beta Experiment-Modification; S-1 Spheromak; current drive experiment; x-ray laser studies; spacecraft glow experiment; plasma deposition and etching of thin films; theoretical plasma; tokamak modeling; compact ignition tokamak; international thermonuclear experimental reactor; Engineering Department; Project Planning and Safety Office; quality assurance and reliability; and technology transfer.

  3. Numerical tokamak turbulence project (OFES grand challenge)

    SciTech Connect

    Beer, M; Cohen, B I; Crotinger, J; Dawson, J; Decyk, V; Dimits, A M; Dorland, W D; Hammett, G W; Kerbel, G D; Leboeuf, J N; Lee, W W; Lin, Z; Nevins, W M; Reynders, J; Shumaker, D E; Smith, S; Sydora, R; Waltz, R E; Williams, T

    1999-08-27

    The primary research objective of the Numerical Tokamak Turbulence Project (NTTP) is to develop a predictive ability in modeling turbulent transport due to drift-type instabilities in the core of tokamak fusion experiments, through the use of three-dimensional kinetic and fluid simulations and the derivation of reduced models.

  4. Fusion Power measurement at ITER

    SciTech Connect

    Bertalot, L.; Barnsley, R.; Krasilnikov, V.; Stott, P.; Suarez, A.; Vayakis, G.; Walsh, M.

    2015-07-01

    Nuclear fusion research aims to provide energy for the future in a sustainable way and the ITER project scope is to demonstrate the feasibility of nuclear fusion energy. ITER is a nuclear experimental reactor based on a large scale fusion plasma (tokamak type) device generating Deuterium - Tritium (DT) fusion reactions with emission of 14 MeV neutrons producing up to 700 MW fusion power. The measurement of fusion power, i.e. total neutron emissivity, will play an important role for achieving ITER goals, in particular the fusion gain factor Q related to the reactor performance. Particular attention is given also to the development of the neutron calibration strategy whose main scope is to achieve the required accuracy of 10% for the measurement of fusion power. Neutron Flux Monitors located in diagnostic ports and inside the vacuum vessel will measure ITER total neutron emissivity, expected to range from 10^14 n/s in Deuterium - Deuterium (DD) plasmas up to almost 10^21 n/s in DT plasmas. The neutron detection systems, as well as all other ITER diagnostics, have to withstand high nuclear radiation and electromagnetic fields as well as ultrahigh vacuum and thermal loads. (authors)
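The link between total neutron emissivity and fusion power is direct bookkeeping: each DT reaction emits one ~14 MeV neutron and releases ~17.6 MeV in total. A back-of-envelope sketch (standard DT reaction energetics, not figures taken from this paper):

```python
# Convert a DT neutron emission rate into fusion power.
MEV_TO_J = 1.602e-13   # joules per MeV
E_DT_MEV = 17.6        # total energy released per DT reaction, MeV

def fusion_power_mw(neutron_rate_per_s):
    """Fusion power in MW, assuming one neutron per DT reaction."""
    return neutron_rate_per_s * E_DT_MEV * MEV_TO_J / 1e6

# A rate of ~2.5e20 n/s corresponds to roughly 700 MW of DT fusion power.
print(f"{fusion_power_mw(2.5e20):.0f} MW")
```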

  5. Accelerator & Fusion Research Division 1991 summary of activities

    SciTech Connect

    Not Available

    1991-12-01

    This report discusses research projects in the following areas: Heavy-ion fusion accelerator research; magnetic fusion energy; advanced light source; center for x-ray optics; exploratory studies; superconducting magnets; and bevalac operations.

  6. Accelerator Fusion Research Division 1991 summary of activities

    SciTech Connect

    Berkner, Klaus H.

    1991-12-01

    This report discusses research projects in the following areas: Heavy-ion fusion accelerator research; magnetic fusion energy; advanced light source; center for x-ray optics; exploratory studies; superconducting magnets; and bevalac operations.

  7. Toward Unanimous Projections for Sea Ice Using CMIP5 Multi-model Simulations

    NASA Astrophysics Data System (ADS)

    Yang, S.; Christensen, J. H.; Langen, P. P.; Thejll, P.

    2015-12-01

    Coupled global climate models have been used to provide future climate projections as major objective tools based on physical laws that govern the dynamics and thermodynamics of the climate system. However, while climate models in general predict declines in Arctic sea ice cover (i.e., ice extent and volume) from late 20th century through the next decades in response to increase of anthropogenic forcing, the model simulated Arctic sea ice demonstrates considerable biases in both the mean and the declining trend in comparison with the observations over the satellite era (1979-present). The models also show wide inter-model spread in hindcast and projected sea ice decline, raising the question of uncertainty in model-predicted polar climate. In order to address the model uncertainty in the Arctic sea ice projection, we analyze the Arctic sea ice extent under the context of surface air temperature (SAT) as simulated in the historical, RCP4.5 and RCP8.5 experiments by 27 CMIP5 models. These 27 models are all we could obtain from the CMIP5 archive with sufficient grid information for processing the sea ice data. Unlike many previous studies in which only a limited number of models were selected based on metrics of modeled sea ice characteristics for getting projected ice with reduced uncertainty, our analysis is applied to all model simulations with no discrimination. It is found that the changes in total Arctic sea ice in various seasons from one model are closely related to the changes in global mean SAT in the corresponding model. This relationship appears very similar in all models and agrees well with that in the observational data. In particular, the ratio of the total Arctic sea ice changes in March, September and annual mean with respect to the baseline climatology (1979-2008) are seen to linearly correlate to the global mean annual SAT anomaly, suggesting unanimous projection of the sea ice extent may be possible with this relationship. Further analysis is
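The linear relationship the abstract describes can be sketched as a simple regression of fractional sea-ice change on global-mean SAT anomaly. The data points below are synthetic stand-ins, for illustration only:

```python
# Fit the fractional September ice change vs. global SAT anomaly, in the
# spirit of the abstract's ratio-vs-SAT relationship. Synthetic data.
import numpy as np

sat_anom = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])             # K
ice_change = np.array([0.0, -0.10, -0.22, -0.30, -0.41, -0.52])  # fraction vs 1979-2008

slope, intercept = np.polyfit(sat_anom, ice_change, 1)
print(f"ice sensitivity: {slope:.2f} fractional change per K of warming")
```

A single shared slope across models is what would make a "unanimous" SAT-conditioned projection possible.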

  8. The Geoengineering Model Intercomparison Project Phase 6 (GeoMIP6): simulation design and preliminary results

    NASA Astrophysics Data System (ADS)

    Kravitz, B.; Robock, A.; Tilmes, S.; Boucher, O.; English, J. M.; Irvine, P. J.; Jones, A.; Lawrence, M. G.; MacCracken, M.; Muri, H.; Moore, J. C.; Niemeier, U.; Phipps, S. J.; Sillmann, J.; Storelvmo, T.; Wang, H.; Watanabe, S.

    2015-10-01

    We present a suite of new climate model experiment designs for the Geoengineering Model Intercomparison Project (GeoMIP). This set of experiments, named GeoMIP6 (to be consistent with the Coupled Model Intercomparison Project Phase 6), builds on the previous GeoMIP project simulations, and has been expanded to address several further important topics, including key uncertainties in extreme events, the use of geoengineering as part of a portfolio of responses to climate change, and the relatively new idea of cirrus cloud thinning to allow more longwave radiation to escape to space. We discuss experiment designs, as well as the rationale for those designs, showing preliminary results from individual models when available. We also introduce a new feature, called the GeoMIP Testbed, which provides a platform for simulations that will be performed with a few models and subsequently assessed to determine whether the proposed experiment designs will be adopted as core (Tier 1) GeoMIP experiments. This is meant to encourage various stakeholders to propose new targeted experiments that address their key open science questions, with the goal of making GeoMIP more relevant to a broader set of communities.

  9. 3-D simulation of urban warming in Tokyo and proposal of air-cooled city project

    SciTech Connect

    Saitoh, T.S.; Yamada, Noboru

    1999-07-01

    Recent computer projection of the urban warming in Tokyo metropolitan area around the year 2030 showed that the urban temperature near Otemachi, heart of Tokyo, will exceed 43 ± 2 °C (110 °F) at 6 p.m. in the summer. In the present paper, modeling and 3-D simulation results of urban warming in the Tokyo metropolitan area were presented and discussed. Furthermore, the effect of the reduction of carbon dioxide (CO2) emissions was discussed by using a newly developed 3-D simulation code. Finally, the authors proposed a new concept, the cool-air-ventilated city project, which alleviates urban warming, air pollution, and urban discomfort. In this project, the urban outdoor and indoor spaces are ventilated by clean cooled air, which is produced in the rural or mountainous regions located far away from the urban area. Water of a huge reservoir is cooled below 4 °C in winter by utilizing sky radiation cooling and will be kept until the summer for indoor and outdoor space cooling. In this study, the feasibility of this system was discussed.
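The seasonal cold-storage idea rests on simple heat-capacity arithmetic: energy absorbed is Q = ρ·V·cp·ΔT as the stored water warms while cooling the city. The reservoir volume and allowable warming below are hypothetical figures, not from the paper:

```python
# Rough feasibility arithmetic for seasonal cold-water storage.
RHO = 1000.0       # water density, kg/m^3
CP = 4186.0        # specific heat of water, J/(kg K)
volume_m3 = 1.0e7  # assumed reservoir volume: 10 million m^3
delta_t = 8.0      # assumed allowable warming from 4 C to 12 C, K

cooling_joules = RHO * volume_m3 * CP * delta_t
print(f"stored cooling capacity ~ {cooling_joules / 3.6e12:.0f} GWh")
```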

  10. Examinations of cloud variability and future change in the coupled model intercomparison project phase 3 simulations

    NASA Astrophysics Data System (ADS)

    Shin, Sun-Hee; Lee, Myong-In; Kim, Ok-Yeon

    2014-08-01

    Low-level cloud variability is critical to the radiation balance of Earth due to its wide spatial coverage. Using the adjusted International Satellite Cloud Climatology Project (ISCCP) observations of Clement et al. (2009), and the Coupled Model Intercomparison Project Phase 3 (CMIP3) model simulations, this study examines the observed and the simulated low-cloud variations and their relationships with large-scale environmental variables. From the observational analysis, significant correlations are found between low clouds and sea surface temperature (SST), lower tropospheric stability (LTS), and sea level pressure (SLP) over tropical marine areas of low cloud prevailing regions during most of the year. Increase of SST coincides with the reduction of LTS and increased vertical motion, which tends to reduce low-level clouds in subtropical oceans. Among the 14 models investigated, CGCM3 and HadGEM1 exhibit more realistic representation of the observed relationship between low-level clouds and large-scale environments. In future climate projection, these two models show a good agreement in the reduction of low-cloud throughout much of the global oceans in response to greenhouse gas forcing, suggesting a positive low-cloud feedback in a climate change context.

  11. An actuator line model simulation with optimal body force projection length scales

    NASA Astrophysics Data System (ADS)

    Martinez-Tossas, Luis; Churchfield, Matthew J.; Meneveau, Charles

    2016-11-01

    In recent work (Martínez-Tossas et al. "Optimal smoothing length scale for actuator line models of wind turbine blades", preprint), an optimal body force projection length-scale for an actuator line model has been obtained. This optimization is based on 2-D aerodynamics and is done by comparing an analytical solution of inviscid linearized flow over a Gaussian body force to the potential flow solution of flow over a Joukowski airfoil. The optimization provides a non-dimensional optimal scale ɛ / c for different Joukowski airfoils, where ɛ is the width of the Gaussian kernel and c is the chord. A Gaussian kernel with different widths in the chord and thickness directions can further reduce the error. The 2-D theory developed is extended by simulating a full scale rotor using the optimal body force projection length scales. Using these values, the tip losses are captured by the LES and thus, no additional explicit tip-loss correction is needed for the actuator line model. The simulation with the optimal values provides excellent agreement with Blade Element Momentum Theory. This research is supported by the National Science Foundation (Grant OISE-1243482, the WINDINSPIRE project).
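In an actuator line model, the blade force is not applied at a point but spread over the grid with a normalized Gaussian kernel of width ɛ; the cited optimization picks ɛ/c. A minimal sketch of that projection kernel (ɛ/c = 0.2 is an assumed value here, not the paper's optimum):

```python
# 1-D Gaussian body-force projection kernel used by actuator line models.
import numpy as np

def gaussian_kernel_1d(x, x0, eps):
    """Normalized kernel eta(x); multiplying by the blade force F spreads
    F over the grid while conserving its integral."""
    return np.exp(-((x - x0) / eps) ** 2) / (eps * np.sqrt(np.pi))

chord = 1.0
eps = 0.2 * chord                 # assumed epsilon/c, for illustration
x = np.linspace(-2.0, 2.0, 4001)  # grid coordinate, chord units
dx = x[1] - x[0]
eta = gaussian_kernel_1d(x, 0.0, eps)

# The kernel must integrate to 1 so the total applied force is conserved.
total = (eta * dx).sum()
print(f"kernel integral ~ {total:.4f}")
```

Too small an ɛ under-resolves the force on the LES grid; too large an ɛ smears the aerodynamics, which is why an optimal, chord-scaled width matters.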

  12. The Geoengineering Model Intercomparison Project Phase 6 (GeoMIP6): Simulation design and preliminary results

    DOE PAGES

    Kravitz, Benjamin S.; Robock, Alan; Tilmes, S.; ...

    2015-10-27

    We present a suite of new climate model experiment designs for the Geoengineering Model Intercomparison Project (GeoMIP). This set of experiments, named GeoMIP6 (to be consistent with the Coupled Model Intercomparison Project Phase 6), builds on the previous GeoMIP project simulations, and has been expanded to address several further important topics, including key uncertainties in extreme events, the use of geoengineering as part of a portfolio of responses to climate change, and the relatively new idea of cirrus cloud thinning to allow more long wave radiation to escape to space. We discuss experiment designs, as well as the rationale for those designs, showing preliminary results from individual models when available. We also introduce a new feature, called the GeoMIP Testbed, which provides a platform for simulations that will be performed with a few models and subsequently assessed to determine whether the proposed experiment designs will be adopted as core (Tier 1) GeoMIP experiments. In conclusion, this is meant to encourage various stakeholders to propose new targeted experiments that address their key open science questions, with the goal of making GeoMIP more relevant to a broader set of communities.

  13. The Geoengineering Model Intercomparison Project Phase 6 (GeoMIP6): Simulation design and preliminary results

    SciTech Connect

    Kravitz, Benjamin S.; Robock, Alan; Tilmes, S.; Boucher, Olivier; English, J. M.; Irvine, Peter J.; Jones, Andrew; Lawrence, M. G.; MacCracken, Michael C.; Muri, Helene O.; Moore, John C.; Niemeier, Ulrike; Phipps, Steven J.; Sillmann, Jana; Storelvmo, Trude; Wang, Hailong; Watanabe, Shingo

    2015-10-27

    We present a suite of new climate model experiment designs for the Geoengineering Model Intercomparison Project (GeoMIP). This set of experiments, named GeoMIP6 (to be consistent with the Coupled Model Intercomparison Project Phase 6), builds on the previous GeoMIP project simulations, and has been expanded to address several further important topics, including key uncertainties in extreme events, the use of geoengineering as part of a portfolio of responses to climate change, and the relatively new idea of cirrus cloud thinning to allow more long wave radiation to escape to space. We discuss experiment designs, as well as the rationale for those designs, showing preliminary results from individual models when available. We also introduce a new feature, called the GeoMIP Testbed, which provides a platform for simulations that will be performed with a few models and subsequently assessed to determine whether the proposed experiment designs will be adopted as core (Tier 1) GeoMIP experiments. In conclusion, this is meant to encourage various stakeholders to propose new targeted experiments that address their key open science questions, with the goal of making GeoMIP more relevant to a broader set of communities.

  14. Fusion Data Grid Service

    NASA Astrophysics Data System (ADS)

    Shasharina, Svetlana; Wang, Nanbor

    2004-11-01

    Simulations and experiments in the fusion and plasma physics community generate large datasets at remote sites. Visualization and analysis of these datasets are difficult because of the incompatibility among the various data formats adopted by simulation, experiments, and analysis tools, and the large sizes of analyzed data. Grids and Web Services technologies are capable of providing solutions for such heterogeneous settings, but need to be customized to the field-specific needs and merged with distributed technologies currently used by the community. This paper describes how we are addressing these issues in the Fusion Grid Service under development. We also present performance results of relevant data transfer mechanisms including binary SOAP, DIME, GridFTP, MDSplus, and CORBA. We will describe the status of data converters (between HDF5 and MDSplus data types), developed in collaboration with MIT (J. Stillerman). Finally, we will analyze bottlenecks of the MDSplus data transfer mechanism (work performed in collaboration with General Atomics; D. Schissel and M. Qian).

  15. Projected strengthening of Amazonian dry season by constrained climate model simulations

    NASA Astrophysics Data System (ADS)

    Boisier, Juan P.; Ciais, Philippe; Ducharne, Agnès; Guimberteau, Matthieu

    2015-07-01

    The vulnerability of Amazonian rainforest, and the ecological services it provides, depends on an adequate supply of dry-season water, either as precipitation or stored soil moisture. How the rain-bearing South American monsoon will evolve across the twenty-first century is thus a question of major interest. Extensive savanization, with its loss of forest carbon stock and uptake capacity, is an extreme although very uncertain scenario. We show that the contrasting rainfall projections simulated for Amazonia by 36 global climate models (GCMs) can be reproduced with empirical precipitation models, calibrated with historical GCM data as functions of the large-scale circulation. A set of these simple models was therefore calibrated with observations and used to constrain the GCM simulations. In agreement with the current hydrologic trends, the resulting projection towards the end of the twenty-first century is for a strengthening of the monsoon seasonal cycle, and a dry-season lengthening in southern Amazonia. With this approach, the increase in the area subjected to lengthy--savannah-prone--dry seasons is substantially larger than the GCM-simulated one. Our results confirm the dominant picture shown by the state-of-the-art GCMs, but suggest that these impacts can be significantly underestimated in the 'model democracy' view.

  16. Projected changes in atmospheric river events in Arizona as simulated by global and regional climate models

    NASA Astrophysics Data System (ADS)

    Rivera, Erick R.; Dominguez, Francina

    2016-09-01

    Inland-penetrating atmospheric rivers (ARs) affect the United States Southwest and significantly contribute to cool season precipitation. In this study, we examine the results from an ensemble of dynamically downscaled simulations from the North American Regional Climate Change Assessment Program (NARCCAP) and their driving general circulation models (GCMs) in order to determine statistically significant changes in the intensity of the cool season ARs impacting Arizona and the associated precipitation. Future greenhouse gas emissions follow the A2 emission scenario from the Intergovernmental Panel on Climate Change Fourth Assessment Report simulations. We find that there is a consistent and clear intensification of the AR-related water vapor transport in both the global and regional simulations which reflects the increase in water vapor content due to warmer atmospheric temperatures, according to the Clausius-Clapeyron relationship. However, the response of AR-related precipitation intensity to increased moisture flux and column-integrated water vapor is weak and no significant changes are projected either by the GCMs or the NARCCAP models. This lack of robust precipitation variations can be explained in part by the absence of meaningful changes in both the large-scale water vapor flux convergence and the maximum positive relative vorticity in the GCMs. Additionally, some global models show a robust decrease in relative humidity which may also be responsible for the projected precipitation patterns.
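The Clausius-Clapeyron scaling invoked above sets the expected moistening rate: the fractional increase of saturation vapor pressure with temperature is d(ln e_s)/dT = L_v/(R_v T^2), roughly 6-7% per kelvin at typical lower-tropospheric temperatures. A short sketch (standard constants, not values from the paper):

```python
# Clausius-Clapeyron fractional increase of saturation vapor pressure.
L_V = 2.5e6   # latent heat of vaporization, J/kg
R_V = 461.5   # gas constant for water vapor, J/(kg K)

def cc_fractional_increase_per_k(temp_k):
    """d(ln e_s)/dT from the Clausius-Clapeyron relation, per kelvin."""
    return L_V / (R_V * temp_k ** 2)

print(f"{100 * cc_fractional_increase_per_k(285.0):.1f}% per K of warming")
```

This is why AR vapor transport strengthens robustly with warming even when the dynamical ingredients (vapor flux convergence, vorticity) show no significant change.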

  17. Numerical simulations of the ablative Rayleigh-Taylor instability in planar inertial-confinement-fusion targets using the FastRad3D code

    NASA Astrophysics Data System (ADS)

    Bates, J. W.; Schmitt, A. J.; Karasik, M.; Zalesak, S. T.

    2016-12-01

    The ablative Rayleigh-Taylor (RT) instability is a central issue in the performance of laser-accelerated inertial-confinement-fusion targets. Historically, the accurate numerical simulation of this instability has been a challenging task for many radiation hydrodynamics codes, particularly when it comes to capturing the ablatively stabilized region of the linear dispersion spectrum and modeling ab initio perturbations. Here, we present recent results from two-dimensional numerical simulations of the ablative RT instability in planar laser-ablated foils that were performed using the Eulerian code FastRad3D. Our study considers polystyrene, (cryogenic) deuterium-tritium, and beryllium target materials, quarter- and third-micron laser light, and low and high laser intensities. An initial single-mode surface perturbation is modeled in our simulations as a small modulation to the target mass density and the ablative RT growth-rate is calculated from the time history of areal-mass variations once the target reaches a steady-state acceleration. By performing a sequence of such simulations with different perturbation wavelengths, we generate a discrete dispersion spectrum for each of our examples and find that in all cases the linear RT growth-rate γ is well described by an expression of the form γ = α[kg/(1 + ɛkLm)]^(1/2) - βkVa, where k is the perturbation wavenumber, g is the acceleration of the target, Lm is the minimum density scale-length, Va is the ablation velocity, and ɛ is either one or zero. The dimensionless coefficients α and β in the above formula depend on the particular target and laser parameters and are determined from two-dimensional simulation results through the use of a nonlinear curve-fitting procedure. While our findings are generally consistent with those of Betti et al. (Phys. Plasmas 5, 1446 (1998)), the ablative RT growth-rates predicted in this investigation are somewhat smaller than the values previously reported for the
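The fitted dispersion relation γ = α[kg/(1 + ɛkLm)]^(1/2) - βkVa can be evaluated directly; the ablative term -βkVa dominates at large k and produces the stabilized short-wavelength cutoff. The coefficients and plasma parameters below are made-up illustrative values, not the paper's fits:

```python
# Evaluate the fitted ablative-RT dispersion relation at a few wavelengths.
import math

def rt_growth_rate(k, g, L_m, V_a, alpha=0.9, beta=3.0, eps=1.0):
    """gamma = alpha*sqrt(k*g/(1 + eps*k*L_m)) - beta*k*V_a (all CGS)."""
    return alpha * math.sqrt(k * g / (1.0 + eps * k * L_m)) - beta * k * V_a

# Hypothetical parameters: acceleration, density scale length, ablation velocity.
g, L_m, V_a = 1e16, 1e-4, 3e5   # cm/s^2, cm, cm/s

for wavelength_um in (10.0, 30.0, 100.0):
    k = 2.0 * math.pi / (wavelength_um * 1e-4)  # cm^-1
    print(f"{wavelength_um:>5.0f} um: gamma = {rt_growth_rate(k, g, L_m, V_a):.3e} 1/s")
```

With these (assumed) numbers the 10 um mode is ablatively stabilized (γ < 0) while the longer modes grow, mirroring the cutoff behavior the simulations are designed to capture.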

  18. A New Approach to Image Fusion Based on Cokriging

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; LeMoigne, Jacqueline; Mount, David M.; Morisette, Jeffrey T.

    2005-01-01

    We consider the image fusion problem involving remotely sensed data. We introduce cokriging as a method to perform fusion. We investigate the advantages of fusing Hyperion with ALI. The evaluation is performed by comparing the classification of the fused data with that of the input images and by calculating well-chosen quantitative fusion quality metrics. We consider the Invasive Species Forecasting System (ISFS) project as our fusion application. The fusion of ALI with Hyperion data is studied using PCA and wavelet-based fusion. We then propose utilizing a geostatistical interpolation method called cokriging as a new approach to image fusion.
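Cokriging itself requires fitting variograms and cross-variograms, which is beyond a short sketch; as a compact stand-in, here is the PCA substitution fusion the study compares against, using random arrays in place of coregistered ALI/Hyperion data (all shapes and values are assumptions):

```python
# PCA substitution fusion sketch: replace the leading principal component of a
# low-resolution multispectral cube with a histogram-matched high-res band.
import numpy as np

rng = np.random.default_rng(0)
ms = rng.random((3, 64, 64))    # stand-in 3-band multispectral cube
pan = rng.random((64, 64))      # stand-in high-resolution band

bands, h, w = ms.shape
X = ms.reshape(bands, -1)               # rows: bands, columns: pixels
mean = X.mean(axis=1, keepdims=True)
Xc = X - mean

# Principal components of the band-to-band covariance.
cov = Xc @ Xc.T / Xc.shape[1]
vals, vecs = np.linalg.eigh(cov)        # eigenvalues ascending
vecs = vecs[:, ::-1]                    # leading component first
pcs = vecs.T @ Xc                       # component scores

# Match the high-res band to PC1's mean/std, substitute it, invert the transform.
p = pan.ravel()
p = (p - p.mean()) / p.std() * pcs[0].std() + pcs[0].mean()
pcs[0] = p
fused = (vecs @ pcs + mean).reshape(bands, h, w)
```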

  19. Neutron shielding for a new projected proton therapy facility: A Geant4 simulation study.

    PubMed

    Cadini, Francesco; Bolst, David; Guatelli, Susanna; Beltran, Chris; Jackson, Michael; Rosenfeld, Anatoly B

    2016-12-01

    In this work, we used the Monte Carlo-based Geant4 simulation toolkit to calculate the ambient dose equivalents due to the secondary neutron field produced in a new projected proton therapy facility. In particular, the facility geometry was modeled in Geant4 based on the CAD design. Proton beams were generated with an energy of 250 MeV in the gantry rooms at different angles with respect to the patient; a fixed 250 MeV proton beam was also modeled. The ambient dose equivalent was calculated in several locations of interest inside and outside the facility, for different scenarios. The simulation results were compared qualitatively to previous work on an existing facility bearing some similarities with the design under study, showing that the ambient dose equivalent ranges obtained are reasonable. The ambient dose equivalents, calculated by means of the Geant4 simulation, were compared to the Australian regulatory limits and showed that the new facility will not pose health risks for the public or staff, with a maximum equivalent dose rate equal to 7.9 mSv/y in the control rooms and maze exit areas and 1.3·10⁻¹ mSv/y close to the walls, outside the facility, under very conservative assumptions. This work represents the first neutron shielding verification analysis of a new projected proton therapy facility and, as such, it may serve as a new source of comparison and validation for the international community, besides confirming the viability of the project from a radioprotection point of view.

  20. Spinal fusion

    MedlinePlus

    Liu G, Wong HK. Laminectomy and fusion. In: Shen FH, Samartzis D, Fessler RG, eds. Textbook of the Cervical Spine . Philadelphia, PA: Elsevier Saunders; 2015:chap 34. Wood GW. Arthrodesis of the spine. In: Canale ST, Beaty JH, eds. Campbell's Operative ...

  1. Changing Climate Extremes in the Northeast: CMIP5 Simulations and Projections

    NASA Astrophysics Data System (ADS)

    Thibeault, J. M.; Seth, A.

    2013-12-01

    Extreme climate events are known to have severe impacts on human and natural systems. As greenhouse warming progresses, a major concern is the potential for an increase in the frequency and intensity of extreme events. The Northeast (defined as the Northeast US, southern Quebec, and southeastern Ontario) is sensitive to climate extremes. The region is prone to flooding and drought, which poses challenges for infrastructure and water resource management, and increases risks to agriculture and forests. Extreme heat can be dangerous to human health, especially in the large urban centers of the Northeast. Annual average temperatures have steadily increased since the 1970s, accompanied by more frequent extremely hot weather, a longer growing season, and fewer frost days. Heavy precipitation events have become more frequent in recent decades. This research examines multi-model projections of annual and monthly extreme indices for the Northeast, using extreme indices computed by the Expert Team on Climate Change Detection and Indices (ETCCDI) for twenty-three global climate models participating in the Coupled Model Intercomparison Project Phase 5 (CMIP5) for the 20th century historical and RCP8.5 experiments. Model simulations are compared to HadEX2 and ERA-interim gridded observations. CMIP5 simulations are consistent with observations - conditions in the Northeast are already becoming warmer and wetter. Projections indicate significant shifts toward warmer and wetter conditions by the middle century (2041-2070). Most indices are projected to be largely outside their late 20th century ranges by the late century (2071-2099). These results provide important information to stakeholders developing plans to lessen the adverse impacts of a warmer and wetter climate in the Northeast.

  2. A NATIONAL COLLABORATORY TO ADVANCE THE SCIENCE OF HIGH TEMPERATURE PLASMA PHYSICS FOR MAGNETIC FUSION

    SciTech Connect

    Allen R. Sanderson; Christopher R. Johnson

    2006-08-01

    This report summarizes the work of the University of Utah, which was a member of the National Fusion Collaboratory (NFC) Project funded by the United States Department of Energy (DOE) under the Scientific Discovery through Advanced Computing Program (SciDAC) to develop a persistent infrastructure to enable scientific collaboration for magnetic fusion research. A five-year project initiated in 2001, the NFC built on past collaborative work performed within the U.S. fusion community and added a component of computer science research done with the USDOE Office of Science, Office of Advanced Scientific Computing Research. The project was itself a collaboration, uniting fusion scientists from General Atomics, MIT, and PPPL with computer scientists from ANL, LBNL, Princeton University, and the University of Utah to form a coordinated team. The group leveraged existing computer science technology where possible and extended or created new capabilities where required. The complete final report is attached as an addendum. In the collaboration, the primary technical responsibility of the University of Utah was to develop and deploy an advanced scientific visualization service. To achieve this goal, the SCIRun Problem Solving Environment (PSE) was used on FusionGrid as an advanced scientific visualization service. SCIRun is open source software that gives the user the ability to create complex 3D visualizations and 2D graphics. This capability allows for the exploration of complex simulation results and the comparison of simulation and experimental data. SCIRun on FusionGrid gives the scientist a no-license-cost visualization capability that rivals present day commercial visualization packages. To accelerate the usage of SCIRun within the fusion community, a stand-alone application built on top of SCIRun was developed and deployed. 
This application, FusionViewer, allows users who are unfamiliar with SCIRun to quickly create

  3. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    NASA Technical Reports Server (NTRS)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS/129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported astronauts "were awakened again", as they had been the day previously. Fearing something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Everywhere, major projects present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, with known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short-term and, especially, long-term activities have a persistent need for simulation from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunity for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. 
We anticipate more of them, as the results of global warming

  4. MEchatronic REspiratory System SImulator for Neonatal Applications (MERESSINA) project: a novel bioengineering goal

    PubMed Central

    Scaramuzzo, Rosa T; Ciantelli, Massimiliano; Baldoli, Ilaria; Bellanti, Lisa; Gentile, Marzia; Cecchi, Francesca; Sigali, Emilio; Tognarelli, Selene; Ghirri, Paolo; Mazzoleni, Stefano; Menciassi, Arianna; Cuttano, Armando; Boldrini, Antonio; Laschi, Cecilia; Dario, Paolo

    2013-01-01

    Respiratory function is mandatory for extrauterine life, but is sometimes impaired in newborns due to prematurity, congenital malformations, or acquired pathologies. Mechanical ventilation is standard care, but long-term complications, such as bronchopulmonary dysplasia, are still largely reported. Therefore, continuous medical education is mandatory to correctly manage devices for assistance. Commercially available breathing function simulators are rarely suitable for the anatomical and physiological realities. The aim of this study is to develop a high-fidelity mechatronic simulator of neonatal airways and lungs for staff training and mechanical ventilator testing. The project is divided into three different phases: (1) a review study on respiratory physiology and pathophysiology and on already available single and multi-compartment models; (2) the prototyping phase; and (3) the on-field system validation. PMID:23966804

  5. Validation of Solar Sail Simulations for the NASA Solar Sail Demonstration Project

    NASA Technical Reports Server (NTRS)

    Braafladt, Alexander C.; Artusio-Glimpse, Alexandra B.; Heaton, Andrew F.

    2014-01-01

    NASA's Solar Sail Demonstration project partner L'Garde is currently assembling a flight-like sail assembly for a series of ground demonstration tests beginning in 2015. For future missions of this sail that might validate solar sail technology, it is necessary to have an accurate sail thrust model. One of the primary requirements of a proposed potential technology validation mission will be to demonstrate solar sail thrust over a set time period, which for this project is nominally 30 days. This requirement would be met by comparing a L'Garde-developed trajectory simulation to the as-flown trajectory. The current sail simulation baseline for L'Garde is a Systems Tool Kit (STK) plug-in that includes a custom-designed model of the L'Garde sail. The STK simulation has been verified for a flat plate model by comparing it to the NASA-developed Solar Sail Spaceflight Simulation Software (S5). S5 matched STK with a high degree of accuracy, and the results of the validation indicate that the L'Garde STK model is accurate enough to meet the potential future mission requirements. Additionally, since the L'Garde sail deviates considerably from a flat plate, a force model for a non-flat sail provided by L'Garde was also tested and compared to a flat plate model in S5. This result will be used in the future as a basis of comparison to the non-flat sail model being developed for STK.
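The flat-plate model that S5 and the STK plug-in were verified against reduces to a one-line force law. A minimal sketch under assumed parameters (the sail area and reflectivity below are illustrative, not L'Garde values):

```python
# Ideal flat-plate solar-sail thrust: F = (1 + rho) * P(r) * A * cos^2(theta),
# with solar radiation pressure P falling off as 1/r^2 from 1 AU.
import math

SOLAR_PRESSURE_1AU = 4.563e-6  # N/m^2, radiation pressure on an absorber at 1 AU

def flat_plate_thrust(area_m2, sun_angle_rad, reflectivity=0.9, r_au=1.0):
    """Thrust magnitude along the sail normal for an idealized flat sail.

    reflectivity lumps specular reflection into a single factor; detailed
    models split specular/diffuse terms and non-flat geometry.
    """
    p = SOLAR_PRESSURE_1AU / r_au ** 2
    return (1.0 + reflectivity) * p * area_m2 * math.cos(sun_angle_rad) ** 2

# Assumed ~1200 m^2 sail, sun-facing at 1 AU: on the order of 10 mN.
f0 = flat_plate_thrust(1200.0, 0.0)
```

Even this toy model shows why the sun angle matters for a 30-day thrust demonstration: the cos² factor drives thrust to zero as the sail edges on to the Sun.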

  6. Simulations of Plasma-Liner Formation and Implosion for the PLX- α Project

    NASA Astrophysics Data System (ADS)

    Samulyak, Roman; Cassibry, Jason; Schillo, Kevin; Shih, Wen; Yates, Kevin; Hsu, Scott; PLX-Alpha Collaboration

    2016-10-01

    Detailed numerical studies of the propagation and merger of high-Mach-number plasma jets and the formation and implosion of plasma liners have been performed using the FronTier and SPH codes enhanced with radiation, physical diffusion, and plasma-EOS models. These simulations support the Plasma Liner Experiment-ALPHA (PLX- α) project (see S. Hsu's talk in this session). Simulations predict properties of plasma liners, in particular 4 π-averaged liner density, ram pressure, and Mach number, the degree of non-uniformity, strength of primary and secondary shock waves, and scalings with the number of plasma jets, initial jet parameters, and other input data. In addition to direct analysis of liner states, simulations also provide synthetic data for direct comparison to experimental data from a multi-chord interferometer and survey and high-resolution spectrometers. Code verification and comparisons as well as predictions for the first series of PLX- α experiments with 6 and 7 jets will be presented. Verified against experimental data, both codes will be used for predictive simulations of plasma liners for PLX- α experiments and potential scaled-up future experiments. Supported by the ARPA-E ALPHA program.

  7. Integrated computer control system CORBA-based simulator FY98 LDRD project final summary report

    SciTech Connect

    Bryant, R M; Holloway, F W; Van Arsdall, P J

    1999-01-15

    The CORBA-based Simulator was a Laboratory Directed Research and Development (LDRD) project that applied simulation techniques to explore critical questions about distributed control architecture. The simulator project used a three-pronged approach comprising a study of object-oriented distribution tools, computer network modeling, and simulation of key control system scenarios. This summary report highlights the findings of the team and provides the architectural context of the study. For the last several years LLNL has been developing the Integrated Computer Control System (ICCS), which is an abstract object-oriented software framework for constructing distributed systems. The framework is capable of implementing large event-driven control systems for mission-critical facilities such as the National Ignition Facility (NIF). Tools developed in this project were applied to the NIF example architecture in order to gain experience with a complex system and derive immediate benefits from this LDRD. The ICCS integrates data acquisition and control hardware with a supervisory system, and reduces the amount of new coding and testing necessary by providing prebuilt components that can be reused and extended to accommodate specific additional requirements. The framework integrates control point hardware with a supervisory system by providing the services needed for distributed control such as database persistence, system start-up and configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. The design is interoperable among computers of different kinds and provides plug-in software connections by leveraging a common object request brokering architecture (CORBA) to transparently distribute software objects across the network of computers. Because object broker distribution applied to control systems is relatively new and its inherent performance is roughly threefold less than traditional point

  8. Evaluation of Tropospheric Water Vapor Simulations from the Atmospheric Model Intercomparison Project

    NASA Technical Reports Server (NTRS)

    Gaffen, Dian J.; Rosen, Richard D.; Salstein, David A.; Boyle, James S.

    1997-01-01

    Simulations of humidity from 28 general circulation models for the period 1979-88 from the Atmospheric Model Intercomparison Project are compared with observations from radiosondes over North America and the globe and with satellite microwave observations over the Pacific basin. The simulations of decadal mean values of precipitable water (W) integrated over each of these regions tend to be less moist than the real atmosphere in all three cases; the median model values are approximately 5% less than the observed values. The spread among the simulations is larger over regions of high terrain, which suggests that differences in methods of resolving topographic features are important. The mean elevation of the North American continent is substantially higher in the models than is observed, which may contribute to the overall dry bias of the models over that area. The authors do not find a clear association between the mean topography of a model and its mean W simulation, however, which suggests that the bias over land is not purely a matter of orography. The seasonal cycle of W is reasonably well simulated by the models, although over North America they have a tendency to become moister more quickly in the spring than is observed. The interannual component of the variability of W is not well captured by the models over North America. Globally, the simulated W values show a signal correlated with the Southern Oscillation index but the observations do not. This discrepancy may be related to deficiencies in the radiosonde network, which does not sample the tropical ocean regions well. Overall, the interannual variability of W, as well as its climatology and mean seasonal cycle, are better described by the median of the 28 simulations than by individual members of the ensemble. Tests to learn whether simulated precipitable water, evaporation, and precipitation values may be related to aspects of model formulation yield few clear signals, although the authors find, for
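The quantity being compared, precipitable water W, is the mass-weighted vertical integral of specific humidity over pressure, W = (1/g) ∫ q dp. A minimal sketch with an assumed sounding (the profile values are illustrative, not AMIP data):

```python
# Precipitable water from a specific-humidity profile by trapezoidal integration.
import numpy as np

G = 9.81  # gravitational acceleration [m/s^2]

def precipitable_water(q, p):
    """W = (1/g) |integral of q dp|, in kg/m^2 (numerically equal to mm of water).

    q: specific humidity [kg/kg]; p: pressure levels [Pa], monotonic.
    """
    dp = np.diff(p)
    return abs(np.sum(0.5 * (q[:-1] + q[1:]) * dp)) / G

# Assumed moist sounding, surface (1000 hPa) to 100 hPa.
p = np.array([1000.0, 850.0, 700.0, 500.0, 300.0, 100.0]) * 100.0  # Pa
q = np.array([0.012, 0.008, 0.005, 0.002, 0.0004, 0.00002])        # kg/kg
w = precipitable_water(q, p)   # a few tens of mm for a moist column
```

A ~5% dry bias of the kind reported would correspond to roughly 1-2 mm for such a column, which indicates how small the model-observation differences being diagnosed are.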

  9. Review of alternative concepts for magnetic fusion

    SciTech Connect

    Krakowski, R.A.; Miller, R.L.; Hagenson, R.L.

    1980-01-01

    Although the Tokamak represents the mainstay of the world's quest for magnetic fusion power, with the tandem mirror serving as a primary backup concept in the US fusion program, a wide range of alternative fusion concepts (AFC's) have been and are being pursued. This review presents a summary of past and present reactor projections of a majority of AFC's. Whenever possible, quantitative results are given.

  10. Converting Snow Depth to SWE: The Fusion of Simulated Data with Remote Sensing Retrievals and the Airborne Snow Observatory

    NASA Astrophysics Data System (ADS)

    Bormann, K.; Marks, D. G.; Painter, T. H.; Hedrick, A. R.; Deems, J. S.

    2015-12-01

    Snow cover monitoring has greatly benefited from remote sensing technology but, despite their critical importance, spatially distributed measurements of snow water equivalent (SWE) in mountain terrain remain elusive. Current methods of monitoring SWE rely on point measurements and are insufficient for distributed snow science and effective management of water resources. Many studies have shown that the spatial variability in SWE is largely controlled by the spatial variability in snow depth. JPL's Airborne Snow Observatory mission (ASO) combines LiDAR and spectrometer instruments to retrieve accurate and very high-resolution snow depth measurements at the watershed scale, along with other products such as snow albedo. To make best use of these high-resolution snow depths, spatially distributed snow density data are required to derive SWE from the measured snow depths. Snow density is a spatially and temporally variable property that cannot yet be reliably extracted from remote sensing techniques, and is difficult to extrapolate to basin scales. However, some physically based snow models have shown skill in simulating bulk snow densities and therefore provide a pathway for snow depth to SWE conversion. Leveraging model ability where remote sensing options are non-existent, ASO employs a physically based snow model (iSnobal) to resolve distributed snow density dynamics across the basin. After an adjustment scheme guided by in-situ data, these density estimates are used to derive the elusive spatial distribution of SWE from the observed snow depth distributions from ASO. In this study, we describe how the process of fusing model data with remote sensing retrievals is undertaken in the context of ASO along with estimates of uncertainty in the final SWE volume products. This work will likely be of interest to those working in snow hydrology, water resource management and the broader remote sensing community.
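The depth-to-SWE conversion at the core of this fusion is a per-cell multiplication of measured depth by modeled bulk density. A sketch with assumed stand-in grids for the lidar depths and model densities (not ASO or iSnobal output):

```python
# Per-cell SWE from snow depth and bulk snow density:
# SWE [mm] = depth [m] * (rho_snow / rho_water) * 1000.
import numpy as np

RHO_WATER = 1000.0  # kg/m^3

def swe_mm(depth_m, density_kg_m3):
    """SWE in mm of water equivalent for gridded depth and density arrays."""
    return depth_m * density_kg_m3 / RHO_WATER * 1000.0

depth = np.array([[1.2, 0.8],
                  [0.0, 2.5]])          # stand-in lidar snow depths [m]
rho = np.array([[350.0, 300.0],
                [320.0, 400.0]])        # stand-in modeled bulk densities

swe = swe_mm(depth, rho)                # e.g. 1.2 m at 350 kg/m^3 -> 420 mm
```

The arithmetic is trivial; the science is in producing a credible distributed density field, which is why the model-plus-in-situ adjustment step carries the uncertainty discussed in the abstract.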

  11. The Geoengineering Model Intercomparison Project Phase 6 (GeoMIP6): simulation design and preliminary results

    NASA Astrophysics Data System (ADS)

    Kravitz, B.; Robock, A.; Tilmes, S.; Boucher, O.; English, J. M.; Irvine, P. J.; Jones, A.; Lawrence, M. G.; MacCracken, M.; Muri, H.; Moore, J. C.; Niemeier, U.; Phipps, S. J.; Sillmann, J.; Storelvmo, T.; Wang, H.; Watanabe, S.

    2015-06-01

    We present a suite of new climate model experiment designs for the Geoengineering Model Intercomparison Project (GeoMIP). This set of experiments, named GeoMIP6 (to be consistent with the Coupled Model Intercomparison Project Phase 6), builds on the previous GeoMIP simulations, and has been expanded to address several further important topics, including key uncertainties in extreme events, the use of geoengineering as part of a portfolio of responses to climate change, and the relatively new idea of cirrus cloud thinning to allow more longwave radiation to escape to space. We discuss experiment designs, as well as the rationale for those designs, showing preliminary results from individual models when available. We also introduce a new feature, called the GeoMIP Testbed, which provides a platform for simulations that will be performed with a few models and subsequently assessed to determine whether the proposed experiment designs will be adopted as core (Tier 1) GeoMIP experiments. This is meant to encourage various stakeholders to propose new targeted experiments that address their key open science questions, with the goal of making GeoMIP more relevant to a broader set of communities.

  12. Final Report on Project 01-ERD-017 ''Smart Nanostructures From Computer Simulations''

    SciTech Connect

    Grossman, J C; Williamson, A J

    2004-02-13

    This project had two main objectives. The first major goal was to develop new, powerful computational simulation capabilities. It was important that these tools combine the accuracy needed to describe the quantum mechanical nature of nanoscale systems with the efficiency required to be applied to realistic, experimentally derived materials. The second major goal was to apply these computational methods to calculate and predict the properties of quantum dots--initially composed of silicon, but then of other elements--which could be used to build novel nanotechnology devices. Our driving purpose has been that, through the development and successful application of these tools, we would generate a new capability at LLNL that could be used to make nanostructured materials "smarter", e.g., by selectively predicting how to engineer specific, desired properties. To carry out the necessary work to successfully complete this project and deliver on our goals, we established a two-pronged effort from the beginning: (1) to work on developing new, more efficient algorithms and quantum simulation tools, and (2) to solve problems and make predictions regarding properties of quantum dots which were being studied experimentally here at Livermore.

  13. Effects of baseline conditions on the simulated hydrologic response to projected climate change

    USGS Publications Warehouse

    Koczot, Kathryn M.; Markstrom, Steven L.; Hay, Lauren E.

    2011-01-01

    Changes in temperature and precipitation projected from five general circulation models, using one late-twentieth-century and three twenty-first-century emission scenarios, were downscaled to three different baseline conditions. Baseline conditions are periods of measured temperature and precipitation data selected to represent twentieth-century climate. The hydrologic effects of the climate projections are evaluated using the Precipitation-Runoff Modeling System (PRMS), which is a watershed hydrology simulation model. The Almanor Catchment in the North Fork of the Feather River basin, California, is used as a case study. Differences and similarities between PRMS simulations of hydrologic components (i.e., snowpack formation and melt, evapotranspiration, and streamflow) are examined, and results indicate that the selection of a specific time period used for baseline conditions has a substantial effect on some, but not all, hydrologic variables. This effect seems to be amplified in hydrologic variables, which accumulate over time, such as soil-moisture content. Results also indicate that uncertainty related to the selection of baseline conditions should be evaluated using a range of different baseline conditions. This is particularly important for studies in basins with highly variable climate, such as the Almanor Catchment.
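Applying projected changes to measured baseline periods is commonly done with a delta-change scheme: additive for temperature, multiplicative for precipitation. A minimal sketch under assumed values (this is a generic illustration, not the specific downscaling used in the study):

```python
# Delta-change downscaling sketch: perturb an observed baseline series with
# projected GCM changes. All series and deltas below are assumed numbers.
import numpy as np

baseline_tmax = np.array([28.0, 30.5, 29.0, 27.5])  # observed max temp [deg C]
baseline_prcp = np.array([90.0, 60.0, 45.0, 80.0])  # observed precipitation [mm]

delta_t = 2.1       # projected temperature change [deg C], additive
prcp_ratio = 0.92   # projected precipitation change, multiplicative

future_tmax = baseline_tmax + delta_t      # shift the whole baseline warmer
future_prcp = baseline_prcp * prcp_ratio   # scale the baseline drier/wetter
```

Because the perturbed series inherits the baseline's variability, the choice of baseline period directly shapes the simulated hydrologic response, which is the sensitivity the study quantifies.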

  14. Cold fusion, Alchemist's dream

    SciTech Connect

    Clayton, E.D.

    1989-09-01

    In this report the following topics relating to cold fusion are discussed: muon catalysed cold fusion; piezonuclear fusion; sundry explanations pertaining to cold fusion; cosmic ray muon catalysed cold fusion; vibrational mechanisms in excited states of D₂ molecules; barrier penetration probabilities within the hydrogenated metal lattice/piezonuclear fusion; branching ratios of D₂ fusion at low energies; fusion of deuterons into ⁴He; secondary D+T fusion within the hydrogenated metal lattice; the ³He to ⁴He ratio within the metal lattice; shock induced fusion; and anomalously high isotopic ratios of ³He/⁴He.

  15. Cirrus Parcel Model Comparison Project. Phase 1: The Critical Components to Simulate Cirrus Initiation Explicitly.

    NASA Astrophysics Data System (ADS)

    Lin, Ruei-Fong; O'C. Starr, David; Demott, Paul J.; Cotton, Richard; Sassen, Kenneth; Jensen, Eric; Kärcher, Bernd; Liu, Xiaohong

    2002-08-01

    The Cirrus Parcel Model Comparison Project, a project of the GCSS [Global Energy and Water Cycle Experiment (GEWEX) Cloud System Studies] Working Group on Cirrus Cloud Systems, involves the systematic comparison of current models of ice crystal nucleation and growth for specified, typical, cirrus cloud environments. In Phase 1 of the project reported here, simulated cirrus cloud microphysical properties from seven models are compared for "warm" (-40°C) and "cold" (-60°C) cirrus, each subject to updrafts of 0.04, 0.2, and 1 m s⁻¹. The models employ explicit microphysical schemes wherein the size distribution of each class of particles (aerosols and ice crystals) is resolved into bins or the evolution of each individual particle is traced. Simulations are made including both homogeneous and heterogeneous ice nucleation mechanisms (all-mode simulations). A single initial aerosol population of sulfuric acid particles is prescribed for all simulations. Heterogeneous nucleation is disabled for a second parallel set of simulations in order to isolate the treatment of the homogeneous freezing (of haze droplets) nucleation process. Analysis of these latter simulations is the primary focus of this paper. Qualitative agreement is found for the homogeneous-nucleation-only simulations; for example, the number density of nucleated ice crystals increases with the strength of the prescribed updraft. However, significant quantitative differences are found. Detailed analysis reveals that the homogeneous nucleation rate, haze particle solution concentration, and water vapor uptake rate by ice crystal growth (particularly as controlled by the deposition coefficient) are critical components that lead to differences in the predicted microphysics. Systematic differences exist between results based on a modified classical theory approach and models using an effective freezing temperature approach to the treatment of nucleation. Each method is constrained by critical freezing data from

  16. Massively parallel simulation with DOE's ASCI supercomputers : an overview of the Los Alamos Crestone project

    SciTech Connect

    Weaver, R. P.; Gittings, M. L.

    2004-01-01

    The Los Alamos Crestone Project is part of the Department of Energy's (DOE) Accelerated Strategic Computing Initiative, or ASCI Program. The main goal of this software development project is to investigate the use of continuous adaptive mesh refinement (CAMR) techniques for application to problems of interest to the Laboratory. There are many code development efforts in the Crestone Project, both unclassified and classified codes. In this overview I will discuss the unclassified SAGE and RAGE codes. The SAGE (SAIC adaptive grid Eulerian) code is a one-, two-, and three-dimensional multimaterial Eulerian massively parallel hydrodynamics code for use in solving a variety of high-deformation flow problems. The RAGE CAMR code is built from the SAGE code by adding various radiation packages, improved setup utilities and graphics packages, and is used for problems in which radiation transport of energy is important. The goal of these massively-parallel versions of the codes is to run extremely large problems in a reasonable amount of calendar time. Our target is scalable performance to ~10,000 processors on a 1 billion CAMR computational cell problem that requires hundreds of variables per cell, multiple physics packages (e.g. radiation and hydrodynamics), and implicit matrix solves for each cycle. A general description of the RAGE code has been published in [1], [2], [3] and [4]. Currently, the largest simulations we do are three-dimensional, using around 500 million computation cells and running for literally months of calendar time using ~2000 processors. Current ASCI platforms range from several 3-teraOPS supercomputers to one 12-teraOPS machine at Lawrence Livermore National Laboratory, the White machine, and one 20-teraOPS machine installed at Los Alamos, the Q machine. Each machine is a system comprised of many component parts that must perform in unity for the successful run of these simulations. Key features of any massively parallel system

  17. Simulation to Seismic Fluid Substitution Modeling at the Illinois Basin - Decatur Project

    NASA Astrophysics Data System (ADS)

    Will, R. A.

    2015-12-01

    The Illinois Basin - Decatur Project (IBDP) is one of the most advanced US Department of Energy-funded carbon dioxide (CO2) sequestration projects. The goal of injecting 1 million tonnes of CO2 over a three year period was reached in November 2014 and the project is now in the post injection site closure (PISC) phase. A number of seismic methods are being utilized in the IBDP PISC plume monitoring program. These include time lapse three-dimensional (3D) vertical seismic profile (VSP) surveys, time-lapse surface seismic surveys, and passive seismic monitoring. While each seismic monitoring method has inherent spatial resolution and imaging footprint characteristics, all fundamentally rely on variation of reservoir elastic properties in response to injection induced changes in saturation and pressure conditions. These variations in elastic properties, and the resulting time-lapse seismic response, are often subtle and non-unique with respect to saturation and pressure effects. Elastic properties of saturated porous media may be estimated using rock physics theory and fluid substitution methods; however, the complexity of typical reservoir rock and fluid systems under injection conditions, and the subtlety of the resulting changes in elastic properties, dictate the need for representative estimates of the reservoir geologic framework, reservoir rock physics, and the anticipated plume geometry. At IBDP a "simulation-to-seismic" workflow has been used to develop accurate estimates of 3D time-lapse elastic property and seismic signal responses for CO2 plumes generated using a calibrated compositional flow simulation model. The anticipated time-lapse response for the IBDP surface and VSP time-lapse surveys have been estimated using ranges of rock physics parameters derived from geophysical logs. These investigations highlight the importance of geologic controls on plume geometry in monitoring program design as well as during model-based interpretation of time
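Fluid substitution in a simulation-to-seismic workflow conventionally rests on the Gassmann equation for the saturated bulk modulus. A minimal sketch with assumed sandstone-like moduli, not IBDP log-derived parameters:

```python
# Gassmann fluid substitution sketch:
# K_sat = K_dry + (1 - K_dry/K_min)^2 /
#         (phi/K_fl + (1 - phi)/K_min - K_dry/K_min^2)
# All moduli in GPa; shear modulus is unchanged by the fluid in Gassmann theory.

def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """Saturated bulk modulus from dry-frame, mineral, and fluid moduli."""
    num = (1.0 - k_dry / k_min) ** 2
    den = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
    return k_dry + num / den

# Assumed values: dry frame 12 GPa, quartz mineral 37 GPa, porosity 0.22,
# brine ~2.8 GPa vs supercritical CO2 ~0.08 GPa fluid modulus.
k_brine = gassmann_ksat(k_dry=12.0, k_min=37.0, k_fl=2.8, phi=0.22)
k_co2 = gassmann_ksat(k_dry=12.0, k_min=37.0, k_fl=0.08, phi=0.22)
# CO2 replacing brine softens the rock (k_co2 < k_brine), which is the
# elastic contrast the time-lapse surveys are designed to detect.
```

The small size of this modulus change, propagated through velocity and reflectivity, is exactly why the abstract stresses that time-lapse responses are subtle and non-unique in saturation and pressure.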

  18. Dynamical simulation of the fission process and anisotropy of the fission fragment angular distributions of excited nuclei produced in fusion reactions

    NASA Astrophysics Data System (ADS)

    Eslamizadeh, H.

    2016-10-01

    A stochastic approach based on four-dimensional Langevin equations was applied to calculate the anisotropy of fission fragment angular distributions, the average prescission neutron multiplicity, and the fission probability over a wide range of fissility parameters for the compound nuclei 197Tl, 225Pa, 248Cf, and 264Rf produced in fusion reactions. Three collective shape coordinates plus the projection of the total spin of the compound nucleus onto the symmetry axis, K, were considered in the four-dimensional dynamical model. In the dynamical calculations, nuclear dissipation was generated through the chaos-weighted wall and window friction formula. Furthermore, the dissipation coefficient of K, γ_K, was treated as a free parameter, and its magnitude was inferred by fitting measured data on the anisotropy of fission fragment angular distributions for these compound nuclei. Comparison of the calculated anisotropies of fission fragment angular distributions with the experimental data showed good agreement using values of γ_K equal to 0.185-0.205, 0.175-0.192, 0.077-0.090, and 0.075-0.085 (MeV zs)^(-1/2) for the compound nuclei 197Tl, 225Pa, 248Cf, and 264Rf, respectively. It was also shown that the influence of γ_K on the calculated prescission neutron multiplicity and fission probability is small.
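
    The role of dissipation in such Langevin calculations can be illustrated with a much simpler one-dimensional overdamped analogue in a toy double-well "fission" potential V(q) = q^4/4 - q^2/2. The dimensionless friction and temperature below are illustrative, not the paper's values; the point is only that stronger friction suppresses barrier crossing:

```python
import math
import random

def escape_fraction(gamma, temp=1.0, n_traj=100, dt=1e-3, steps=10000):
    """Fraction of overdamped Langevin trajectories,
        dq = -V'(q)/gamma dt + sqrt(2*temp/gamma) dW,
    that cross the barrier of V(q) = q**4/4 - q**2/2 (a toy fission barrier).
    """
    random.seed(0)
    crossed = 0
    for _ in range(n_traj):
        q = -1.0  # start in the compound-nucleus well
        for _ in range(steps):
            drift = -(q ** 3 - q) / gamma
            q += drift * dt + math.sqrt(2.0 * temp / gamma * dt) * random.gauss(0.0, 1.0)
            if q > 1.0:  # reached the other well: counted as "fission"
                crossed += 1
                break
    return crossed / n_traj

# Higher friction -> slower diffusion over the barrier -> fewer fission events.
f_low = escape_fraction(gamma=5.0)
f_high = escape_fraction(gamma=20.0)
assert f_high < f_low
```

    In the overdamped limit the friction simply rescales time, so the barrier-crossing rate falls as 1/gamma, the same qualitative suppression that nuclear dissipation exerts on the fission rate.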

  19. Simulation technology used for risk assessment in a deep exploration project in China

    NASA Astrophysics Data System (ADS)

    Jiao, J.; Huang, D.; Liu, J.

    2013-12-01

    Deep exploration has been carried out in China for five years, employing various heavy-duty instruments and equipment for gravity, magnetic, seismic, and electromagnetic prospecting, as well as an ultra-deep drilling rig for obtaining deep samples. Deep exploration is a large and complex system-engineering effort crossing multiple disciplines and requiring great investment. It is necessary to employ advanced technical means for the verification, appraisal, and optimization of geophysical prospecting equipment under development. To reduce the risk of application and exploration, efficient and effective management concepts and skills have to be enhanced, in order to consolidate the management measures and workflows that benefit this ambitious project. Therefore, evidence, prediction, evaluation, and related decision strategies have to be taken into account simultaneously to meet practical scientific requirements, technical limits, and extendable goals. Simulation is then proposed as a tool that can be used to carry out dynamic tests on actual or imagined systems. In practice, it is necessary to combine simulation with the instruments and equipment to accomplish R&D tasks. In this paper, simulation technology is introduced into the R&D process for heavy-duty equipment and high-end engineering projects. Based on information recently provided by a drilling group, a digital model is constructed by combining geophysical data, 3D visualization, database management, and virtual reality technologies. This supports an R&D strategy in which data processing, instrument application, expected results and uncertainties, and even the effect of the operational workflow environment are simulated systematically or simultaneously, in order to obtain an optimal outcome as well as an equipment-updating strategy. The simulation technology is able to adjust, verify, appraise, and optimize the primary plan in response to changes in

  20. Investigating the potential of the Pan-Planets project using Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Koppenhoefer, J.; Afonso, C.; Saglia, R. P.; Henning, Th.

    2009-02-01

    Using Monte Carlo simulations we analyze the potential of the upcoming transit survey Pan-Planets. The analysis covers the simulation of realistic light curves (including the effects of ingress/egress and limb-darkening) with both correlated and uncorrelated noise, as well as the application of a box-fitting least-squares detection algorithm. In this work we show how simulations can be a powerful tool in defining and optimizing the survey strategy of a transiting planet survey. We find the Pan-Planets project to be competitive with all other existing and planned transit surveys, its main strength being the large 7 square degree field of view. In the first year we expect to find up to 25 Jupiter-sized planets with periods below 5 days around stars brighter than V = 16.5 mag. The survey will also be sensitive to planets with longer periods and planets with smaller radii. After the second year of the survey, we expect to find up to 9 Warm Jupiters with periods between 5 and 9 days and 7 Very Hot Saturns around stars brighter than V = 16.5 mag, as well as 9 Very Hot Neptunes with periods from 1 to 3 days around stars brighter than i' = 18.0 mag.
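
    The core of such a yield simulation can be sketched in a few lines: inject a box-shaped transit into a noisy synthetic light curve and recover it with a sliding-box search, a stripped-down stand-in for the box-fitting least-squares algorithm. All numbers below are illustrative, not Pan-Planets survey parameters:

```python
import random

random.seed(1)

# Synthetic light curve: 500 samples, 1% deep box transit starting at sample 200.
n_pts, depth, t0, dur = 500, 0.01, 200, 20
flux = [1.0 - (depth if t0 <= i < t0 + dur else 0.0) + random.gauss(0.0, 0.002)
        for i in range(n_pts)]

# Slide a box of the (assumed known) duration; score by the in-box flux deficit.
out_mean = sum(flux) / n_pts
best_start, best_score = 0, float("-inf")
for start in range(n_pts - dur):
    in_mean = sum(flux[start:start + dur]) / dur
    score = out_mean - in_mean
    if score > best_score:
        best_start, best_score = start, score

assert abs(best_start - t0) <= 3  # transit epoch recovered to within a few samples
```

    A production search such as BLS additionally scans trial periods and durations and folds the light curve, but the detection statistic is the same in-transit deficit scored here.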

  1. The LISA Pathfinder Simulator for the Science and Technology Operations Center: Simulator Reuse Across the Project Life-Cycle: Practical Experiences and Lessons Learned

    NASA Astrophysics Data System (ADS)

    van der Plas, Peter; Leorato, Christiano

    2010-08-01

    During the operational phase of the LISA Pathfinder (LPF) mission, the Science and Technology Operations Center (STOC) will be in charge of the operations of the LPF experiments. For the STOC to be able to perform its planning activities, an experiment simulator is required. The STOC simulator is based on the reuse of two simulators originally developed by EADS Astrium to support previous phases of the project life-cycle. This paper describes the STOC Simulator development approach, the technologies used, and the high-level design. It then focuses on the specific implications of reusing the existing simulators: relevant issues are highlighted, together with the adopted solutions. Finally, the paper reports the first feedback on the actual usage of the STOC Simulator and summarizes the lessons learned.

  2. Hardware-Accelerated Simulated Radiography

    SciTech Connect

    Laney, D; Callahan, S; Max, N; Silva, C; Langer, S; Frank, R

    2005-08-04

    We present the application of hardware accelerated volume rendering algorithms to the simulation of radiographs as an aid to scientists designing experiments, validating simulation codes, and understanding experimental data. The techniques presented take advantage of 32-bit floating point texture capabilities to obtain solutions to the radiative transport equation for X-rays. The hardware accelerated solutions are accurate enough to enable scientists to explore the experimental design space with greater efficiency than the methods currently in use. An unsorted hexahedron projection algorithm is presented for curvilinear hexahedral meshes that produces simulated radiographs in the absorption-only regime. A sorted tetrahedral projection algorithm is presented that simulates radiographs of emissive materials. We apply the tetrahedral projection algorithm to the simulation of experimental diagnostics for inertial confinement fusion experiments on a laser at the University of Rochester.
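
    In the absorption-only regime described above, each detector pixel reduces to a Beer-Lambert line integral of the attenuation coefficient along its ray. A minimal parallel-beam sketch on a toy 2D grid (the paper's GPU algorithms do the equivalent per pixel over curvilinear hexahedral meshes):

```python
import math

# Toy 2D attenuation map mu (1/cm) on an 8x8 grid, cell size 0.5 cm,
# with a dense inclusion embedded in a light background.
nx = ny = 8
cell = 0.5
mu = [[0.1] * nx for _ in range(ny)]
for row in range(3, 6):
    for col in range(2, 7):
        mu[row][col] = 1.2

# Parallel beam along +x, one detector pixel per row:
# I = I0 * exp(-sum(mu * path_length_per_cell)).
i0 = 1.0
radiograph = [i0 * math.exp(-sum(row) * cell) for row in mu]

# Rows passing through the dense inclusion come out darker (lower intensity).
assert radiograph[4] < radiograph[0]
```

    The emissive case adds a source term to the transport equation along each ray, which is why it requires the sorted (visibility-ordered) projection algorithm described in the abstract.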

  3. Sensor fusion for intelligent process control.

    SciTech Connect

    Connors, John J.; Hill, Kevin; Hanekamp, David; Haley, William F.; Gallagher, Robert J.; Gowin, Craig; Farrar, Arthur R.; Sheaffer, Donald A.; DeYoung, Mark A.; Bertram, Lee A.; Dodge, Craig; Binion, Bruce; Walsh, Peter M.; Houf, William G.; Desam, Padmabhushana R.; Tiwary, Rajiv; Stokes, Michael R.; Miller, Alan J.; Michael, Richard W.; Mayer, Raymond M.; Jiao, Yu; Smith, Philip J.; Arbab, Mehran; Hillaire, Robert G.

    2004-08-01

    An integrated system for the fusion of product and process sensors and controls for production of flat glass was envisioned, having as its objective the maximization of throughput and product quality subject to emission limits, furnace refractory wear, and other constraints. Although the project was prematurely terminated, stopping the work short of its goal, the tasks that were completed show the value of the approach and objectives. Though the demonstration was to have been done on a flat glass production line, the approach is applicable to control of production in the other sectors of the glass industry. Furthermore, the system architecture is also applicable in other industries utilizing processes in which product uniformity is determined by ability to control feed composition, mixing, heating and cooling, chemical reactions, and physical processes such as distillation, crystallization, drying, etc. The first phase of the project, with Visteon Automotive Systems as industrial partner, was focused on simulation and control of the glass annealing lehr. That work produced the analysis and computer code that provide the foundation for model-based control of annealing lehrs during steady state operation and through color and thickness changes. In the second phase of the work, with PPG Industries as the industrial partner, the emphasis was on control of temperature and combustion stoichiometry in the melting furnace, to provide a wider operating window, improve product yield, and increase energy efficiency. A program of experiments with the furnace, CFD modeling and simulation, flow measurements, and sensor fusion was undertaken to provide the experimental and theoretical basis for an integrated, model-based control system utilizing the new infrastructure installed at the demonstration site for the purpose. 
In spite of the fact that the project was terminated during the first year of the second phase of the work, the results of these first steps toward implementation

  4. Data requirements for EOR surfactant-polymer process simulation and analysis of El Dorado pilot-project simulation, Butler County, Kansas. Volume II. Appendices

    SciTech Connect

    Claridge, E.L.; Lohse, A.

    1983-01-01

    The results of computer simulation of the El Dorado surfactant-polymer EOR pilot project, Butler County, Kansas indicated that conventional data from the project and other data in the public domain were not adequate for geologic, reservoir and process characterizations in a complex numerical simulation. As used by GURC in geologic characterization, and by INTERCOMP in process characterization and input into the CFTE simulator, the collective body of field and chemical data and related assumptions necessary for simulator input was not sufficient to predict how the chemical flood would behave in the Admire 650-foot sandstone reservoir. Based upon this study, a comprehensive body of data requirements for EOR simulation is defined in detail. Geologic characterization includes descriptors for rock, interwell and intrasystem correlations; reservoir characterization includes descriptors for fluid/rock, production, and flow rate properties; process characterization includes descriptors for chemical properties, interactions and functions. Reservoir heterogeneity is a principal problem in EOR simulation. It can be overcome within reasonable economic limits by successive orders of descriptors from: microscale (rock), achieved through borehole and core analyses; to macroscale (interwell), achieved through multiple borehole correlations; to megascale (intrasystem), achieved through extrapolation of rock and correlative well data into a generic depositional model that contains a description of internal mass properties within a given external morphology. Volume II contains appendices for: flow chart for surfactant-polymer process simulation; INTERCOMP reports to GURC describing the CFTE simulator program used in this study.

  5. Data requirements for EOR surfactant-polymer process simulation and analysis of El Dorado pilot-project simulation, Butler County, Kansas. Volume I. Technical report

    SciTech Connect

    Claridge, E.L.; Lohse, A.

    1983-01-01

    The results of computer simulation of the El Dorado surfactant-polymer EOR pilot project, Butler County, Kansas indicated that conventional data from the project and other data in the public domain were not adequate for geologic, reservoir and process characterizations in a complex numerical simulation. As used by GURC in geologic characterization, and by INTERCOMP in process characterization and input into the CFTE simulator, the collective body of field and chemical data and related assumptions necessary for simulator input was not sufficient to predict how the chemical flood would behave in the Admire 650-foot sandstone reservoir. Based upon this study, a comprehensive body of data requirements for EOR simulation is defined in detail. Geologic characterization includes descriptors for rock, interwell and intrasystem correlations; reservoir characterization includes descriptors for fluid/rock, production, and flow rate properties; process characterization includes descriptors for chemical properties, interactions and functions. Reservoir heterogeneity is a principal problem in EOR simulation. It can be overcome within reasonable economic limits by successive orders of descriptors from: microscale (rock), achieved through borehole and core analyses; to macroscale (interwell), achieved through multiple borehole correlations; to megascale (intrasystem), achieved through extrapolation of rock and correlative well data into a generic depositional model that contains a description of internal mass properties within a given external morphology. Volume II contains appendices for: flow chart for surfactant-polymer process simulation; INTERCOMP reports to GURC describing the CFTE simulator program used in this study.

  6. POST2 End-To-End Descent and Landing Simulation for the Autonomous Landing and Hazard Avoidance Technology Project

    NASA Technical Reports Server (NTRS)

    Fisher, Jody L.; Striepe, Scott A.

    2007-01-01

    The Program to Optimize Simulated Trajectories II (POST2) is used as the basis for an end-to-end descent and landing trajectory simulation that is essential in determining the design and performance capability of lunar descent and landing system models and lunar environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. This POST2-based ALHAT simulation provides descent and landing simulation capability by integrating lunar environment and lander system models (including terrain, sensor, guidance, navigation, and control models), along with the data necessary to design and operate a landing system for robotic, human, and cargo lunar-landing success. This paper presents the current and planned development and model validation of the POST2-based end-to-end trajectory simulation used for testing, performance analysis, and evaluation of ALHAT project systems and models.

  7. To Create Space on Earth: The Space Environment Simulation Laboratory and Project Apollo

    NASA Technical Reports Server (NTRS)

    Walters, Lori C.

    2003-01-01

    Few undertakings in the history of humanity can compare to the great technological achievement known as Project Apollo. Among those who witnessed Armstrong's flickering television image were thousands of people who had directly contributed to this historic moment. Amongst those in this vast anonymous cadre were the personnel of the Space Environment Simulation Laboratory (SESL) at the Manned Spacecraft Center (MSC) in Houston, Texas. SESL houses two large thermal-vacuum chambers with solar simulation capabilities. At a time when NASA engineers had a limited understanding of the effects of the extremes of space on hardware and crews, SESL was designed to literally create the conditions of space on Earth. With interior dimensions of 90 feet in height and a 55-foot diameter, Chamber A dwarfed the Apollo command/service module (CSM) it was constructed to test. The chamber's vacuum pumping capacity of 1 x 10^-6 torr can simulate an altitude greater than 130 miles above the Earth. A "lunar plane" capable of rotating a 150,000-pound test vehicle 180 deg replicates the revolution of a craft in space. To reproduce the temperature extremes of space, the interior chamber walls cool to -280 F as two banks of carbon arc modules simulate the unfiltered light and heat of the Sun. With capabilities similar to those of Chamber A, early Chamber B tests included the Gemini modular maneuvering unit, the Apollo EVA mobility unit, and the lunar module. Since Gemini astronaut Charles Bassett first ventured into the chamber in 1966, Chamber B has assisted astronauts in testing hardware and preparing for work in the harsh extremes of space.

  8. Simulation of extreme reservoir level distribution with the SCHADEX method (EXTRAFLO project)

    NASA Astrophysics Data System (ADS)

    Paquet, Emmanuel; Penot, David; Garavaglia, Federico

    2013-04-01

    -to-volume ratios and hydrographs applied to each simulated event. This allows accounting for different flood dynamics, depending on the season, the generating precipitation event, the soil saturation state, etc. In both cases, a hydraulic simulation of dam operation is performed in order to compute the distribution of maximum reservoir levels. Results are detailed for an extreme return level, showing that a 1000-year return level of the reservoir level can be reached during flood events whose components (peaks, volumes) are not necessarily associated with such a return level. The presentation is illustrated by the example of a fictitious dam on the Tech River at Reynes (southern France, 477 km²). This study has been carried out within the EXTRAFLO project, Task 8 (https://extraflo.cemagref.fr/). References: Paquet, E., Gailhard, J. and Garçon, R. (2006), Evolution of the GRADEX method: improvement by atmospheric circulation classification and hydrological modeling, La Houille Blanche, 5, 80-90. doi:10.1051/lhb:2006091. Paquet, E., Garavaglia, F., Garçon, R. and Gailhard, J. (2012), The SCHADEX method: a semi-continuous rainfall-runoff simulation for extreme flood estimation, Journal of Hydrology, under revision.
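
    The final routing step of such a method can be sketched as a Monte Carlo loop: sample a flood peak, route a synthetic hydrograph through a level-pool reservoir with a free-overflow weir, record the maximum level, and read empirical quantiles off the simulated distribution. All numbers below (reservoir area, weir coefficient, peak-flow law) are illustrative, not SCHADEX parameters:

```python
import random

random.seed(0)

def max_level(peak_m3s, area=1e6, crest=10.0, weir_c=50.0, dt=3600.0, steps=72):
    """Peak reservoir level (m) for a triangular inflow hydrograph routed
    through a level pool with free-overflow weir Qout = c * head**1.5."""
    h = h_max = crest  # start full, at spillway crest
    for k in range(steps):
        t = k / steps
        q_in = peak_m3s * (t / 0.3 if t < 0.3 else max(0.0, (1.0 - t) / 0.7))
        q_out = weir_c * max(h - crest, 0.0) ** 1.5
        h += (q_in - q_out) * dt / area
        h_max = max(h_max, h)
    return h_max

# Exponentially distributed peak flows stand in for the simulated flood events.
levels = sorted(max_level(300.0 * random.expovariate(1.0)) for _ in range(2000))
median_level = levels[len(levels) // 2]
rare_level = levels[-2]  # an empirical high quantile (~99.9%)
assert 10.0 < median_level < rare_level
```

    The abstract's point appears naturally here: the rare maximum levels are produced jointly by peak, volume, and routing, so the extreme level quantile need not coincide with the same-return-level inflow peak.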

  9. Simulator Network Project Report: A tool for improvement of teaching materials and targeted resource usage in Skills Labs

    PubMed Central

    Damanakis, Alexander; Blaum, Wolf E.; Stosch, Christoph; Lauener, Hansjörg; Richter, Sabine; Schnabel, Kai P.

    2013-01-01

    During the last decade, medical education in the German-speaking world has been striving to become more practice-oriented. This is currently being achieved in many schools through the implementation of simulation-based instruction in Skills Labs. Simulators are thus an essential part of this type of medical training, and their acquisition and operation by a Skills Lab require a large outlay of resources. Therefore, the Practical Skills Committee of the Medical Education Society (GMA) introduced a new project, which aims to improve the flow of information between the Skills Labs and enable a transparent assessment of the simulators via an online database (the Simulator Network). PMID:23467581

  10. Simulator Network project report: a tool for improvement of teaching materials and targeted resource usage in Skills Labs.

    PubMed

    Damanakis, Alexander; Blaum, Wolf E; Stosch, Christoph; Lauener, Hansjörg; Richter, Sabine; Schnabel, Kai P

    2013-01-01

    During the last decade, medical education in the German-speaking world has been striving to become more practice-oriented. This is currently being achieved in many schools through the implementation of simulation-based instruction in Skills Labs. Simulators are thus an essential part of this type of medical training, and their acquisition and operation by a Skills Lab require a large outlay of resources. Therefore, the Practical Skills Committee of the Medical Education Society (GMA) introduced a new project, which aims to improve the flow of information between the Skills Labs and enable a transparent assessment of the simulators via an online database (the Simulator Network).

  11. Cirrus Parcel Model Comparison Project. Phase 1; The Critical Components to Simulate Cirrus Initiation Explicitly

    NASA Technical Reports Server (NTRS)

    Lin, Ruei-Fong; Starr, David OC; DeMott, Paul J.; Cotton, Richard; Sassen, Kenneth; Jensen, Eric; Einaudi, Franco (Technical Monitor)

    2001-01-01

    The Cirrus Parcel Model Comparison Project, a project of the GCSS (GEWEX Cloud System Study) Working Group on Cirrus Cloud Systems, involves the systematic comparison of current models of ice crystal nucleation and growth for specified, typical, cirrus cloud environments. In Phase 1 of the project, reported here, simulated cirrus cloud microphysical properties are compared for situations of "warm" (-40 C) and "cold" (-60 C) cirrus, both subject to updrafts of 4, 20 and 100 centimeters per second. Five models participated. The various models employ explicit microphysical schemes wherein the size distribution of each class of particles (aerosols and ice crystals) is resolved into bins or treated separately. Simulations are made including both the homogeneous and heterogeneous ice nucleation mechanisms. A single initial aerosol population of sulfuric acid particles is prescribed for all simulations. To isolate the treatment of the homogeneous freezing (of haze droplets) nucleation process, the heterogeneous nucleation mechanism is disabled for a second parallel set of simulations. Qualitative agreement is found for the homogeneous-nucleation-only simulations, e.g., the number density of nucleated ice crystals increases with the strength of the prescribed updraft. However, significant quantitative differences are found. Detailed analysis reveals that the homogeneous nucleation rate, haze particle solution concentration, and water vapor uptake rate by ice crystal growth (particularly as controlled by the deposition coefficient) are critical components that lead to differences in predicted microphysics. Systematic bias exists between results based on a modified classical theory approach and models using an effective freezing temperature approach to the treatment of nucleation. Each approach is constrained by critical freezing data from laboratory studies, but each includes assumptions that can only be justified by further laboratory research. Consequently, it is not yet
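
    The central qualitative result, that nucleated crystal number increases with updraft strength, can be reproduced with a drastically simplified parcel model: supersaturation grows with the updraft and is depleted by deposition onto crystals nucleated above a freezing threshold. All coefficients below are illustrative tuning constants, not the physical rates used by the five participating models:

```python
def toy_parcel(updraft, src=0.01, sink=5e-3, s_crit=0.45, dt=0.5, t_end=4000.0):
    """Return the (arbitrary-unit) crystal number nucleated in a toy parcel.

    ds/dt = src*updraft - sink*n*s   (cooling source vs. depositional growth)
    dn/dt = 10*(s - s_crit)          while s exceeds the freezing threshold
    """
    s = n = t = 0.0
    while t < t_end:
        s += (src * updraft - sink * n * s) * dt
        if s > s_crit:
            n += 10.0 * (s - s_crit) * dt
        t += dt
    return n

n_weak = toy_parcel(0.04)   # 4 cm/s updraft
n_strong = toy_parcel(1.0)  # 100 cm/s updraft
# Stronger updrafts drive supersaturation past the threshold faster than the
# growth sink can respond, so more crystals nucleate before quenching.
assert 0.0 < n_weak < n_strong
```

    In the quasi-steady limit the toy model gives n proportional to updraft (source balancing the growth sink at the threshold), which is the sign of the dependence the intercomparison found; the real models disagree precisely on the rates this sketch replaces with constants.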

  12. Simulation of strongly correlated fermions in two spatial dimensions with fermionic projected entangled-pair states

    NASA Astrophysics Data System (ADS)

    Corboz, Philippe; Orús, Román; Bauer, Bela; Vidal, Guifré

    2010-04-01

    We explain how to implement, in the context of projected entangled-pair states (PEPSs), the general procedure of fermionization of a tensor network introduced in P. Corboz and G. Vidal, Phys. Rev. B 80, 165129 (2009). The resulting fermionic PEPS, similar to previous proposals, can be used to study the ground state of interacting fermions on a two-dimensional lattice. As in the bosonic case, the cost of simulations depends on the amount of entanglement in the ground state and not directly on the strength of interactions. The present formulation of fermionic PEPS leads to a straightforward numerical implementation that allowed us to recycle much of the code for bosonic PEPS. We demonstrate that fermionic PEPS are a useful variational ansatz for interacting fermion systems by computing approximations to the ground state of several models on an infinite lattice. For a model of interacting spinless fermions, ground state energies lower than Hartree-Fock results are obtained, shifting the boundary between the metal and charge-density wave phases. For the t-J model, energies comparable with those of a specialized Gutzwiller-projected ansatz are also obtained.
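
    The sign bookkeeping that underlies any such fermionization scheme is visible in its smallest ingredient: the fermionic swap gate, which exchanges two modes and picks up a minus sign when both are occupied. A plain-Python sketch (basis state |i j> mapped to index 2*i + j); this illustrates only the sign rule, not the full PEPS contraction:

```python
# Fermionic swap on two modes with occupations i, j in {0, 1}:
# |i j> -> (-1)**(i*j) |j i>.
fswap = [[0.0] * 4 for _ in range(4)]
for i in range(2):
    for j in range(2):
        sign = -1.0 if i == 1 and j == 1 else 1.0
        fswap[2 * j + i][2 * i + j] = sign  # output |j i>, input |i j>

def matmul(a, b):
    return [[sum(a[r][k] * b[k][c] for k in range(4)) for c in range(4)]
            for r in range(4)]

# Swapping twice restores the state, sign included: fswap @ fswap == identity.
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
assert matmul(fswap, fswap) == identity
# Both modes occupied picks up the fermionic minus sign:
assert fswap[3][3] == -1.0
```

    In the fermionic PEPS construction, a tensor of this kind is inserted at every line crossing of the tensor network, which is what lets the bosonic contraction code be reused largely unchanged.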

  13. PROJECT CLEA: Two Decades of Astrophysics Research Simulations for Astronomy Education

    NASA Astrophysics Data System (ADS)

    Marschall, Laurence A.; Snyder, G.; Cooper, P.

    2013-01-01

    Since 1992, Project CLEA (Contemporary Laboratory Experiences in Astronomy) has been developing simulations for the astronomy laboratory that engage students in the experience of modern astrophysical research. Though designed for introductory undergraduate courses, CLEA software can be flexibly configured for use in high-school classes and in upper-level observational astronomy classes, and has found usage in a wide spectrum of classrooms and on-line courses throughout the world. Now at the two-decade mark, CLEA has produced 16 exercises covering a variety of planetary, stellar, and extragalactic research topics at wavelengths from radio to X-ray. Project CLEA’s most recent product, VIREO, the Virtual Educational Observatory, is a flexible all-sky environment for developing a variety of further exercises. We review the current CLEA offerings and look to the future, especially describing further challenges in developing and maintaining the functionality of CLEA and similar activities as the current investigators wind down the funded development process. This research was sponsored by the National Science Foundation, Gettysburg College, and NASA's XMM-Newton mission.

  14. Fusion: an energy source for synthetic fuels

    SciTech Connect

    Fillo, J A; Powell, J; Steinberg, M

    1980-01-01

    The decreasing availability of fossil fuels emphasizes the need to develop systems which will produce synthetic fuel to substitute for and supplement the natural supply. An important first step in the synthesis of liquid and gaseous fuels is the production of hydrogen. Thermonuclear fusion offers an inexhaustible source of energy for the production of hydrogen from water. Depending on design, electric generation efficiencies of approx. 40 to 60% and hydrogen production efficiencies by high temperature electrolysis of approx. 50 to 70% are projected for fusion reactors using high temperature blankets. Fusion/coal symbiotic systems appear economically promising for the first generation of commercial fusion synfuels plants. Coal production requirements and the environmental effects of large-scale coal usage would be greatly reduced by a fusion/coal system. In the long term, there could be a gradual transition to an inexhaustible energy system based solely on fusion.
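
    If the quoted electrolysis efficiencies are taken as referenced to electrical input (an assumption; high-temperature electrolysis also uses process heat directly, which would raise the combined figures), the implied overall thermal-to-hydrogen efficiency range is simply the product of the two quoted ranges:

```python
eta_electric = (0.40, 0.60)      # fusion heat -> electricity (from the abstract)
eta_electrolysis = (0.50, 0.70)  # electricity -> hydrogen (assumed basis)

overall = (eta_electric[0] * eta_electrolysis[0],
           eta_electric[1] * eta_electrolysis[1])
# Roughly 20% to 42% of fusion thermal energy ends up as hydrogen
# under this assumption.
assert abs(overall[0] - 0.20) < 1e-9
assert abs(overall[1] - 0.42) < 1e-9
```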

  15. Advanced fission and fossil plant economics-implications for fusion

    SciTech Connect

    Delene, J.G.

    1994-09-01

    In order for fusion energy to be a viable option for electric power generation, it must either directly compete with future alternatives or serve as a reasonable backup if the alternatives become unacceptable. This paper discusses projected costs for the most likely competitors with fusion power for baseload electric capacity and what these costs imply for fusion economics. The competitors examined include advanced nuclear fission and advanced fossil-fired plants. The projected costs and their basis are discussed. The estimates for these technologies are compared with cost estimates for magnetic and inertial confinement fusion plants. The conclusion of the analysis is that fusion faces formidable economic competition. Although the cost level for fusion appears greater than that for fission or fossil, the costs are not so high as to preclude fusion's potential competitiveness.

  16. Simulated lesion, human observer performance comparison between thin-section dedicated breast CT images versus computed thick-section simulated projection images of the breast.

    PubMed

    Chen, L; Boone, J M; Abbey, C K; Hargreaves, J; Bateni, C; Lindfors, K K; Yang, K; Nosratieh, A; Hernandez, A; Gazi, P

    2015-04-21

    The objective of this study was to compare the lesion detection performance of human observers between thin-section computed tomography images of the breast and thick-section (>40 mm) simulated projection images of the breast. Three radiologists and six physicists each executed a two-alternative forced-choice (2AFC) study involving simulated spherical lesions placed mathematically into breast images produced on a prototype dedicated breast CT scanner. The breast image data sets from 88 patients were used to create 352 pairs of image data. Spherical lesions with diameters of 1, 2, 3, 5, and 11 mm were simulated and adaptively positioned into 3D breast CT image data sets; the native thin-section (0.33 mm) images were averaged to produce images with different slice thicknesses; average section thicknesses of 0.33, 0.71, 1.5 and 2.9 mm were representative of breast CT, while the average 43 mm slice thickness served to simulate projection images of the breast. The percent correct of the human observers' responses was evaluated in the 2AFC experiments. Radiologists' lesion detection performance was significantly (p < 0.05) better with thin-section images than with thick-section images similar to mammography, for all but the 1 mm diameter lesions. For example, the three radiologists' average performance for 3 mm diameter lesions was 92% correct for thin-section breast CT images, while it was 67% for the simulated projection images. A gradual reduction in observer performance was observed as the section thickness increased beyond about 1 mm. While a performance difference based on breast density was seen in both the breast CT and projection image results, the average radiologist performance using breast CT images in dense breasts exceeded the performance using simulated projection images in fatty breasts for all lesion diameters except 11 mm.
The average radiologist performance exceeded that of the average physicist observer; however, trends
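
    The slab-thickness effect can be sketched with a toy signal-detection model: averaging T sections dilutes the contrast of a lesion spanning L < T sections by L/T while reducing noise by sqrt(T), so 2AFC detectability falls once the slab outgrows the lesion. The contrast, noise, and trial counts below are arbitrary, not fitted to the study:

```python
import math
import random

random.seed(2)

def percent_correct(t_sections, lesion_sections=3, contrast=1.0, sigma=1.0,
                    trials=4000):
    """Monte Carlo 2AFC percent correct for a slab averaging t_sections."""
    signal = contrast * min(lesion_sections, t_sections) / t_sections
    noise = sigma / math.sqrt(t_sections)  # section averaging reduces noise
    correct = 0
    for _ in range(trials):
        lesion_patch = signal + random.gauss(0.0, noise)
        blank_patch = random.gauss(0.0, noise)
        if lesion_patch > blank_patch:
            correct += 1
    return correct / trials

pc_thin = percent_correct(t_sections=3)    # slab matched to the lesion extent
pc_thick = percent_correct(t_sections=43)  # mammography-like thick slab
assert pc_thin > pc_thick > 0.5
```

    The toy model also reproduces the reported shape of the curve: performance peaks when the slab thickness matches the lesion extent and degrades gradually beyond it, as the contrast dilution outpaces the noise reduction.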

  17. Simulated lesion, human observer performance comparison between thin-section dedicated breast CT images versus computed thick-section simulated projection images of the breast

    NASA Astrophysics Data System (ADS)

    Chen, L.; Boone, J. M.; Abbey, C. K.; Hargreaves, J.; Bateni, C.; Lindfors, K. K.; Yang, K.; Nosratieh, A.; Hernandez, A.; Gazi, P.

    2015-04-01

    The objective of this study was to compare the lesion detection performance of human observers between thin-section computed tomography images of the breast and thick-section (>40 mm) simulated projection images of the breast. Three radiologists and six physicists each executed a two-alternative forced-choice (2AFC) study involving simulated spherical lesions placed mathematically into breast images produced on a prototype dedicated breast CT scanner. The breast image data sets from 88 patients were used to create 352 pairs of image data. Spherical lesions with diameters of 1, 2, 3, 5, and 11 mm were simulated and adaptively positioned into 3D breast CT image data sets; the native thin-section (0.33 mm) images were averaged to produce images with different slice thicknesses; average section thicknesses of 0.33, 0.71, 1.5 and 2.9 mm were representative of breast CT, while the average 43 mm slice thickness served to simulate projection images of the breast. The percent correct of the human observers’ responses was evaluated in the 2AFC experiments. Radiologists’ lesion detection performance was significantly (p < 0.05) better with thin-section images than with thick-section images similar to mammography, for all but the 1 mm diameter lesions. For example, the three radiologists’ average performance for 3 mm diameter lesions was 92% correct for thin-section breast CT images, while it was 67% for the simulated projection images. A gradual reduction in observer performance was observed as the section thickness increased beyond about 1 mm. While a performance difference based on breast density was seen in both the breast CT and projection image results, the average radiologist performance using breast CT images in dense breasts exceeded the performance using simulated projection images in fatty breasts for all lesion diameters except 11 mm. The average radiologist performance exceeded that of the average physicist

  18. Monte Carlo simulations for the space radiation superconducting shield project (SR2S).

    PubMed

    Vuolo, M; Giraudo, M; Musenich, R; Calvelli, V; Ambroglini, F; Burger, W J; Battiston, R

    2016-02-01

    Astronauts on deep-space long-duration missions will be exposed for long periods to galactic cosmic rays (GCR) and Solar Particle Events (SPE). Exposure to space radiation could lead to both acute and late effects in crew members, and well-defined countermeasures do not currently exist. The simplest solution, optimized passive shielding, is not able to reduce the dose deposited by GCRs below the current dose limits; therefore other solutions, such as active shielding employing superconducting magnetic fields, are under study. In the framework of the EU FP7 SR2S Project (Space Radiation Superconducting Shield), a toroidal magnetic system based on MgB2 superconductors has been analyzed through detailed Monte Carlo simulations using the Geant4 interface GRAS. Spacecraft and magnets were modeled together with a simplified mechanical structure supporting the coils. Radiation transport through magnetic fields and materials was simulated for a deep-space mission scenario, considering for the first time the effect of secondary particles produced in the passage of space radiation through the active shielding and spacecraft structures. When the structures supporting the active shielding systems and the habitat are modeled, the radiation protection efficiency of the magnetic field decreases severely compared to that reported in previous studies, in which only the magnetic field around the crew was modeled. This is due to the large production of secondary radiation taking place in the material surrounding the habitat.
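
    The core of simulating an active magnetic shield is charged-particle transport through a field. A toy sketch of that idea, nowhere near the fidelity of Geant4/GRAS and with an invented 1/r toroidal field and particle state, uses the standard Boris pusher:

```python
import numpy as np

# Toy sketch (not Geant4/GRAS): deflection of a nonrelativistic proton in a
# toroidal field B_phi ∝ 1/r, integrated with the Boris pusher.
Q, M = 1.602e-19, 1.673e-27          # proton charge (C) and mass (kg)

def b_field(pos, b0=1.0, r0=5.0):
    """Azimuthal field of magnitude b0*r0/r; b0 and r0 are invented values."""
    x, y, _ = pos
    r = np.hypot(x, y)
    return b0 * r0 / r * np.array([-y / r, x / r, 0.0])

def boris_step(x, v, dt):
    """One Boris rotation step (no electric field)."""
    t = (Q * dt / (2 * M)) * b_field(x)
    s = 2 * t / (1 + t @ t)
    v_prime = v + np.cross(v, t)
    v_new = v + np.cross(v_prime, s)
    return x + v_new * dt, v_new

x = np.array([6.0, 0.0, 0.0])        # start outside the coil bore (m)
v = np.array([-1e6, 0.0, 0.0])       # 1000 km/s, heading inward
for _ in range(2000):
    x, v = boris_step(x, v, 1e-9)
print(np.linalg.norm(v))             # speed is conserved by the Boris rotation
```

    A real shielding study adds material interactions and secondary-particle production, which is exactly where the abstract reports the efficiency losses.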

  19. Monte Carlo simulations for the space radiation superconducting shield project (SR2S)

    NASA Astrophysics Data System (ADS)

    Vuolo, M.; Giraudo, M.; Musenich, R.; Calvelli, V.; Ambroglini, F.; Burger, W. J.; Battiston, R.

    2016-02-01

    Astronauts on deep-space long-duration missions will be exposed for long periods to galactic cosmic rays (GCR) and Solar Particle Events (SPE). Exposure to space radiation could lead to both acute and late effects in crew members, and well-defined countermeasures do not currently exist. The simplest solution, optimized passive shielding, is not able to reduce the dose deposited by GCRs below the current dose limits; therefore other solutions, such as active shielding employing superconducting magnetic fields, are under study. In the framework of the EU FP7 SR2S Project (Space Radiation Superconducting Shield), a toroidal magnetic system based on MgB2 superconductors has been analyzed through detailed Monte Carlo simulations using the Geant4 interface GRAS. Spacecraft and magnets were modeled together with a simplified mechanical structure supporting the coils. Radiation transport through magnetic fields and materials was simulated for a deep-space mission scenario, considering for the first time the effect of secondary particles produced in the passage of space radiation through the active shielding and spacecraft structures. When the structures supporting the active shielding systems and the habitat are modeled, the radiation protection efficiency of the magnetic field decreases severely compared to that reported in previous studies, in which only the magnetic field around the crew was modeled. This is due to the large production of secondary radiation taking place in the material surrounding the habitat.

  20. Collaborative Simulation and Testing of the Superconducting Dipole Prototype Magnet for the FAIR Project

    NASA Astrophysics Data System (ADS)

    Zhu, Yinfeng; Zhu, Zhe; Xu, Houchang; Wu, Weiyue

    2012-08-01

    The superconducting dipole prototype magnet of the collector ring for the Facility for Antiproton and Ion Research (FAIR) is an international cooperation project. The collaborative simulation and testing of the developed prototype magnet are presented in this paper. To evaluate the mechanical strength of the coil case during quench, a 3-dimensional (3D) electromagnetic (EM) model was developed based on the solid97 magnetic vector element in the ANSYS commercial software, which includes the air region, coil and yoke. EM analysis was carried out with a peak operating current of 278 A. Then, the solid97 element was converted into the solid185 element, the coupled analysis was switched from electromagnetic to structural, and the finite element model of the coil case and glass-fiber reinforced composite (G10) spacers was established with the ANSYS Parametric Design Language, based on the 3D model from the CATIA V5 software. In addition, to simulate the friction characteristics inside the coil case, the conta173 surface-to-surface contact element was used. The results for the coil case and G10 spacers show that they are safe and have sufficient strength, on the basis of testing in discharge and quench scenarios.

  1. Terascale Optimal PDE Simulations

    SciTech Connect

    David Keyes

    2009-07-28

    The Terascale Optimal PDE Solvers (TOPS) Integrated Software Infrastructure Center (ISIC) was created to develop and implement algorithms and support scientific investigations performed by DOE-sponsored researchers. These simulations often involve the solution of partial differential equations (PDEs) on terascale computers. The TOPS Center researched, developed and deployed an integrated toolkit of open-source, optimal complexity solvers for the nonlinear partial differential equations that arise in many DOE application areas, including fusion, accelerator design, global climate change and reactive chemistry. The algorithms created as part of this project were also designed to reduce current computational bottlenecks by orders of magnitude on terascale computers, enabling scientific simulation on a scale heretofore impossible.
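
    The problem class TOPS targets, nonlinear PDEs solved by Newton-type methods, can be illustrated with a small sketch. This is not TOPS code: it solves an invented 1D boundary-value problem u'' = exp(u), u(0) = u(1) = 0, by plain Newton iteration with a dense linear solve, where a TOPS-style solver would use optimal multilevel methods at scale.

```python
import numpy as np

# Finite-difference Newton solve of u'' = exp(u) on [0, 1], u(0)=u(1)=0.
n = 100
h = 1.0 / (n + 1)
A = (np.diag(-2 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) / h**2        # 1D Laplacian, Dirichlet BCs

u = np.zeros(n)
for _ in range(20):
    F = A @ u - np.exp(u)                         # nonlinear residual
    J = A - np.diag(np.exp(u))                    # Jacobian of F
    du = np.linalg.solve(J, -F)                   # Newton correction
    u += du
    if np.abs(du).max() < 1e-10:                  # converged
        break
print(float(np.abs(A @ u - np.exp(u)).max()))     # small final residual
```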

  2. Extreme rainfall in Serbia, May 2014, simulation using WRF NMM and RainFARM: DRIHM project

    NASA Astrophysics Data System (ADS)

    Dekić, Ljiljana; Mihalović, Ana; Dimitrijević, Vladimir; Rebora, Nicola; Parodi, Antonio

    2015-04-01

    In May 2014 the Balkan region was affected by continuous heavy rainfall, the heaviest in 120 years of recorded observations, causing extensive flooding. Serbia suffered human casualties, huge infrastructure and industrial destruction, and agricultural damage. The cyclone's development and trajectory were well predicted by the RHMSS operational WRF NMM numerical model, but the extreme precipitation could not be predicted with sufficient precision. Simulating extreme rainfall situations with different numerical weather prediction models can reveal weaknesses of a model and highlight the importance of the chosen physical approach and parameterization schemes. The FP7 Distributed Research Infrastructure for Hydro-Meteorology (DRIHM) project provides a framework for using different models in forecasting extreme weather events. One of the DRIHM components is the Rainfall Filtered Autoregressive Model (RainFARM) for stochastic rainfall downscaling. An objective of the DRIHM project was the development of standards and conversion of data for seamless use of meteorological and hydrological models in flood prediction. This paper describes numerical tests and results of the WRF NMM nonhydrostatic model and of RainFARM downscaling applied to WRF NMM outputs. Different physics options in WRF NMM and their influence on precipitation amounts were investigated. RainFARM was applied to every physics option, downscaling from 4 km to 500 m and 100 m horizontal resolution with 100 ensemble members. We analyzed locations on the catchments in Serbia where flooding was the strongest and most destructive. Statistical evaluation of the ensemble output gives new insight into the sub-scale precipitation
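
    The central constraint in stochastic rainfall downscaling of the RainFARM kind can be sketched as follows. This is a hedged toy, not the actual RainFARM algorithm (which works with power-law spectra in Fourier space): it disaggregates a coarse rain field to a finer grid with random small-scale detail while preserving each coarse cell's rainfall.

```python
import numpy as np

rng = np.random.default_rng(1)

def downscale(coarse, factor):
    """Disaggregate a coarse field by `factor`, conserving coarse-cell means."""
    fine = np.repeat(np.repeat(coarse, factor, 0), factor, 1)
    noise = rng.lognormal(sigma=0.5, size=fine.shape)   # small-scale variability
    fine = fine * noise
    # renormalize so each coarse cell's mean rainfall is conserved
    block = fine.reshape(coarse.shape[0], factor, coarse.shape[1], factor)
    means = block.mean(axis=(1, 3))
    scale = np.where(means > 0, coarse / means, 0.0)
    return fine * np.repeat(np.repeat(scale, factor, 0), factor, 1)

coarse = rng.gamma(2.0, 5.0, size=(4, 4))      # mm of rain on a coarse grid
fine = downscale(coarse, 8)                    # 8x finer, e.g. 4 km -> 500 m
# every 8x8 block of the fine field averages back to its coarse cell
print(np.allclose(fine.reshape(4, 8, 4, 8).mean(axis=(1, 3)), coarse))  # True
```

    Running many such realizations (RainFARM used 100 members here) yields an ensemble from which sub-grid precipitation statistics can be evaluated.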

  3. The AGORA High-resolution Galaxy Simulations Comparison Project II: Isolated disk test

    SciTech Connect

    Kim, Ji-hoon; Agertz, Oscar; Teyssier, Romain; Butler, Michael J.; Ceverino, Daniel; Choi, Jun-Hwan; Feldmann, Robert; Keller, Ben W.; Lupi, Alessandro; Quinn, Thomas; Revaz, Yves; Wallace, Spencer; Gnedin, Nickolay Y.; Leitner, Samuel N.; Shen, Sijing; Smith, Britton D.; Thompson, Robert; Turk, Matthew J.; Abel, Tom; Arraki, Kenza S.; Benincasa, Samantha M.; Chakrabarti, Sukanya; DeGraf, Colin; Dekel, Avishai; Goldbaum, Nathan J.; Hopkins, Philip F.; Hummels, Cameron B.; Klypin, Anatoly; Li, Hui; Madau, Piero; Mandelker, Nir; Mayer, Lucio; Nagamine, Kentaro; Nickerson, Sarah; O’Shea, Brian W.; Primack, Joel R.; Roca-Fàbrega, Santi; Semenov, Vadim; Shimizu, Ikkoh; Simpson, Christine M.; Todoroki, Keita; Wadsley, James W.; Wise, John H.

    2016-12-20

    Using an isolated Milky Way-mass galaxy simulation, we compare results from 9 state-of-the-art gravito-hydrodynamics codes widely used in the numerical community. We utilize the infrastructure we have built for the AGORA High-resolution Galaxy Simulations Comparison Project. This includes the common disk initial conditions, common physics models (e.g., radiative cooling and UV background by the standardized package Grackle) and common analysis toolkit yt, all of which are publicly available. Subgrid physics models such as Jeans pressure floor, star formation, supernova feedback energy, and metal production are carefully constrained across code platforms. With numerical accuracy that resolves the disk scale height, we find that the codes overall agree well with one another in many dimensions including: gas and stellar surface densities, rotation curves, velocity dispersions, density and temperature distribution functions, disk vertical heights, stellar clumps, star formation rates, and Kennicutt-Schmidt relations. Quantities such as velocity dispersions are very robust (agreement within a few tens of percent at all radii) while measures like newly-formed stellar clump mass functions show more significant variation (difference by up to a factor of ~3). Systematic differences exist, for example, between mesh-based and particle-based codes in the low density region, and between more diffusive and less diffusive schemes in the high density tail of the density distribution. Yet intrinsic code differences are generally small compared to the variations in numerical implementations of the common subgrid physics such as supernova feedback. Lastly, our experiment reassures that, if adequately designed in accordance with our proposed common parameters, results of a modern high-resolution galaxy formation simulation are more sensitive to input physics than to intrinsic differences in numerical schemes.

  4. The AGORA High-resolution Galaxy Simulations Comparison Project II: Isolated disk test

    DOE PAGES

    Kim, Ji-hoon; Agertz, Oscar; Teyssier, Romain; ...

    2016-12-20

    Using an isolated Milky Way-mass galaxy simulation, we compare results from 9 state-of-the-art gravito-hydrodynamics codes widely used in the numerical community. We utilize the infrastructure we have built for the AGORA High-resolution Galaxy Simulations Comparison Project. This includes the common disk initial conditions, common physics models (e.g., radiative cooling and UV background by the standardized package Grackle) and common analysis toolkit yt, all of which are publicly available. Subgrid physics models such as Jeans pressure floor, star formation, supernova feedback energy, and metal production are carefully constrained across code platforms. With numerical accuracy that resolves the disk scale height, we find that the codes overall agree well with one another in many dimensions including: gas and stellar surface densities, rotation curves, velocity dispersions, density and temperature distribution functions, disk vertical heights, stellar clumps, star formation rates, and Kennicutt-Schmidt relations. Quantities such as velocity dispersions are very robust (agreement within a few tens of percent at all radii) while measures like newly-formed stellar clump mass functions show more significant variation (difference by up to a factor of ~3). Systematic differences exist, for example, between mesh-based and particle-based codes in the low density region, and between more diffusive and less diffusive schemes in the high density tail of the density distribution. Yet intrinsic code differences are generally small compared to the variations in numerical implementations of the common subgrid physics such as supernova feedback. Lastly, our experiment reassures that, if adequately designed in accordance with our proposed common parameters, results of a modern high-resolution galaxy formation simulation are more sensitive to input physics than to intrinsic differences in numerical schemes.

  5. The AGORA High-resolution Galaxy Simulations Comparison Project. II. Isolated Disk Test

    NASA Astrophysics Data System (ADS)

    Kim, Ji-hoon; Agertz, Oscar; Teyssier, Romain; Butler, Michael J.; Ceverino, Daniel; Choi, Jun-Hwan; Feldmann, Robert; Keller, Ben W.; Lupi, Alessandro; Quinn, Thomas; Revaz, Yves; Wallace, Spencer; Gnedin, Nickolay Y.; Leitner, Samuel N.; Shen, Sijing; Smith, Britton D.; Thompson, Robert; Turk, Matthew J.; Abel, Tom; Arraki, Kenza S.; Benincasa, Samantha M.; Chakrabarti, Sukanya; DeGraf, Colin; Dekel, Avishai; Goldbaum, Nathan J.; Hopkins, Philip F.; Hummels, Cameron B.; Klypin, Anatoly; Li, Hui; Madau, Piero; Mandelker, Nir; Mayer, Lucio; Nagamine, Kentaro; Nickerson, Sarah; O'Shea, Brian W.; Primack, Joel R.; Roca-Fàbrega, Santi; Semenov, Vadim; Shimizu, Ikkoh; Simpson, Christine M.; Todoroki, Keita; Wadsley, James W.; Wise, John H.; AGORA Collaboration

    2016-12-01

    Using an isolated Milky Way-mass galaxy simulation, we compare results from nine state-of-the-art gravito-hydrodynamics codes widely used in the numerical community. We utilize the infrastructure we have built for the AGORA High-resolution Galaxy Simulations Comparison Project. This includes the common disk initial conditions, common physics models (e.g., radiative cooling and UV background by the standardized package Grackle) and common analysis toolkit yt, all of which are publicly available. Subgrid physics models such as Jeans pressure floor, star formation, supernova feedback energy, and metal production are carefully constrained across code platforms. With numerical accuracy that resolves the disk scale height, we find that the codes overall agree well with one another in many dimensions including: gas and stellar surface densities, rotation curves, velocity dispersions, density and temperature distribution functions, disk vertical heights, stellar clumps, star formation rates, and Kennicutt-Schmidt relations. Quantities such as velocity dispersions are very robust (agreement within a few tens of percent at all radii) while measures like newly formed stellar clump mass functions show more significant variation (difference by up to a factor of ˜3). Systematic differences exist, for example, between mesh-based and particle-based codes in the low-density region, and between more diffusive and less diffusive schemes in the high-density tail of the density distribution. Yet intrinsic code differences are generally small compared to the variations in numerical implementations of the common subgrid physics such as supernova feedback. Our experiment reassures that, if adequately designed in accordance with our proposed common parameters, results of a modern high-resolution galaxy formation simulation are more sensitive to input physics than to intrinsic differences in numerical schemes.
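
    One AGORA-style diagnostic, comparing density distribution functions across codes, can be sketched in miniature. This is illustrative only, not the project's yt-based toolkit, and the "mesh" and "particle" density samples below are invented lognormal stand-ins:

```python
import numpy as np

rng = np.random.default_rng(3)
# stand-ins for gas density samples from a mesh code and a particle code
rho_mesh = rng.lognormal(mean=0.0, sigma=1.0, size=50_000)
rho_sph = rng.lognormal(mean=0.1, sigma=1.1, size=50_000)

# histogram both on identical logarithmic bins to compare distribution functions
bins = np.logspace(-3, 3, 61)
pdf_mesh, _ = np.histogram(rho_mesh, bins=bins, density=True)
pdf_sph, _ = np.histogram(rho_sph, bins=bins, density=True)

# systematic offsets appear in the high-density tail, as the paper reports
tail = bins[:-1] > 10
print(pdf_mesh[tail].sum(), pdf_sph[tail].sum())
```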

  6. The Fusion Energy Option

    NASA Astrophysics Data System (ADS)

    Dean, Stephen O.

    2004-06-01

    Presentations from a Fusion Power Associates symposium, The Fusion Energy Option, are summarized. The topics include perspectives on fossil fuel reserves, fusion as a source for hydrogen production, status and plans for the development of inertial fusion, planning for the construction of the International Thermonuclear Experimental Reactor, status and promise of alternate approaches to fusion and the need for R&D now on fusion technologies.

  7. Carbon Nanotubes Mediate Fusion of Lipid Vesicles.

    PubMed

    Bhaskara, Ramachandra M; Linker, Stephanie M; Vögele, Martin; Köfinger, Jürgen; Hummer, Gerhard

    2017-02-28

    The fusion of lipid membranes is opposed by high energetic barriers. In living organisms, complex protein machineries carry out this biologically essential process. Here we show that membrane-spanning carbon nanotubes (CNTs) can trigger spontaneous fusion of small lipid vesicles. In coarse-grained molecular dynamics simulations, we find that a CNT bridging between two vesicles locally perturbs their lipid structure. Their outer leaflets merge as the CNT pulls lipids out of the membranes, creating an hourglass-shaped fusion intermediate with still intact inner leaflets. As the CNT moves away from the symmetry axis connecting the vesicle centers, the inner leaflets merge, forming a pore that completes fusion. The distinct mechanism of CNT-mediated membrane fusion may be transferable, providing guidance in the development of fusion agents, e.g., for the targeted delivery of drugs or nucleic acids.

  8. The first fusion reactor: ITER

    NASA Astrophysics Data System (ADS)

    Campbell, D. J.

    2016-11-01

    Established by the signature of the ITER Agreement in November 2006 and currently under construction at St Paul-lez-Durance in southern France, the ITER project [1,2] involves the European Union (including Switzerland), China, India, Japan, the Russian Federation, South Korea and the United States. ITER (`the way' in Latin) is a critical step in the development of fusion energy. Its role is to provide an integrated demonstration of the physics and technology required for a fusion power plant based on magnetic confinement.

  9. SATSIM—A real-time multi-satellite simulator for test and validation in formation flying projects

    NASA Astrophysics Data System (ADS)

    Bodin, Per; Nylund, Matti; Battelino, Milan

    2012-05-01

    The satellite simulator SATSIM was developed during the experimental PRISMA multi-satellite formation flying project and was primarily aimed to validate the Guidance, Navigation and Control system (GNC) and the on-board software in a simulated real-time environment. The SATSIM system has as a main feature the ability to simulate sensors and actuators, spacecraft dynamics, intra-satellite communication protocols, environmental disturbances, solar illumination conditions as well as solar and lunar blinding. The core of the simulator consists of MATLAB/Simulink models of the spacecraft hardware and the space environment. The models run on a standard personal computer that in the simplest scenario may be connected to satellite controller boards through a CAN (Controller Area Network) data bus. SATSIM is, in conjunction with the RAMSES Test and Verification system, able to perform open-loop, hardware-in-the-loop as well as full-fledged closed-loop tests through the utilisation of peripheral sensor unit simulators. The PRISMA satellites were launched in June 2010 and the project is presently in its operational phase. This paper describes how a low cost but yet reliable simulator such as the SATSIM platform in different configurations has been used through the different phases of a multi-satellite project, from early test of onboard software running on satellite controller boards in a lab environment, to full-fledged closed-loop tests of satellite flight models.

  10. The EAGLE project: simulating the evolution and assembly of galaxies and their environments

    NASA Astrophysics Data System (ADS)

    Schaye, Joop; Crain, Robert A.; Bower, Richard G.; Furlong, Michelle; Schaller, Matthieu; Theuns, Tom; Dalla Vecchia, Claudio; Frenk, Carlos S.; McCarthy, I. G.; Helly, John C.; Jenkins, Adrian; Rosas-Guevara, Y. M.; White, Simon D. M.; Baes, Maarten; Booth, C. M.; Camps, Peter; Navarro, Julio F.; Qu, Yan; Rahmati, Alireza; Sawala, Till; Thomas, Peter A.; Trayford, James

    2015-01-01

    We introduce the Virgo Consortium's Evolution and Assembly of GaLaxies and their Environments (EAGLE) project, a suite of hydrodynamical simulations that follow the formation of galaxies and supermassive black holes in cosmologically representative volumes of a standard Λ cold dark matter universe. We discuss the limitations of such simulations in light of their finite resolution and poorly constrained subgrid physics, and how these affect their predictive power. One major improvement is our treatment of feedback from massive stars and active galactic nuclei (AGN) in which thermal energy is injected into the gas without the need to turn off cooling or decouple hydrodynamical forces, allowing winds to develop without predetermined speed or mass loading factors. Because the feedback efficiencies cannot be predicted from first principles, we calibrate them to the present-day galaxy stellar mass function and the amplitude of the galaxy-central black hole mass relation, also taking galaxy sizes into account. The observed galaxy stellar mass function is reproduced to ≲ 0.2 dex over the full resolved mass range, 10⁸ < M*/M⊙ ≲ 10¹¹, a level of agreement close to that attained by semi-analytic models, and unprecedented for hydrodynamical simulations. We compare our results to a representative set of low-redshift observables not considered in the calibration, and find good agreement with the observed galaxy specific star formation rates, passive fractions, Tully-Fisher relation, total stellar luminosities of galaxy clusters, and column density distributions of intergalactic C IV and O VI. While the mass-metallicity relations for gas and stars are consistent with observations for M* ≳ 10⁹ M⊙ (M* ≳ 10¹⁰ M⊙ at intermediate resolution), they are insufficiently steep at lower masses. For the reference model, the gas fractions and temperatures are too high for clusters of galaxies, but for galaxy groups these discrepancies can be resolved by adopting a higher

  11. Geophysical data fusion for subsurface imaging. Phase 1

    SciTech Connect

    Hoekstra, P.; Vandergraft, J.; Blohm, M.; Porter, D.

    1993-08-01

    A geophysical data fusion methodology is under development to combine data from complementary geophysical sensors and incorporate geophysical understanding to obtain three dimensional images of the subsurface. The research reported here is the first phase of a three phase project. The project focuses on the characterization of thin clay lenses (aquitards) in a highly stratified sand and clay coastal geology to depths of up to 300 feet. The sensor suite used in this work includes time-domain electromagnetic induction (TDEM) and near surface seismic techniques. During this first phase of the project, enhancements to the acquisition and processing of TDEM data were studied, by use of simulated data, to assess improvements for the detection of thin clay layers. Secondly, studies were made of the use of compressional wave and shear wave seismic reflection data by using state-of-the-art high frequency vibrator technology. Finally, a newly developed processing technique, called "data fusion," was implemented to process the geophysical data, and to incorporate a mathematical model of the subsurface strata. Examples are given of the results when applied to real seismic data collected at Hanford, WA, and for simulated data based on the geology of the Savannah River Site.
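
    The statistical idea behind combining complementary sensors can be sketched with classic inverse-variance weighting. This is not the project's actual algorithm; the estimates, variances, and sensor labels below are invented for illustration:

```python
# Fuse two independent estimates of the same quantity (say, the depth of a
# clay layer from TDEM and from seismic reflection), weighting by precision.
def fuse(est_a, var_a, est_b, var_b):
    w_a, w_b = 1 / var_a, 1 / var_b          # precision = inverse variance
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1 / (w_a + w_b)              # always below either input variance
    return fused, fused_var

# hypothetical: TDEM says 120 ft (var 25), seismic says 132 ft (var 100)
depth, var = fuse(est_a=120.0, var_a=25.0, est_b=132.0, var_b=100.0)
print(round(depth, 1), round(var, 1))        # 122.4 20.0
```

    The fused estimate leans toward the more precise sensor, and its variance is smaller than either input's, which is the basic payoff of fusing complementary data.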

  12. Establishment of an Institute for Fusion Studies. Technical progress report, November 1, 1994--October 31, 1995

    SciTech Connect

    1995-07-01

    The Institute for Fusion Studies is a national center for theoretical fusion plasma physics research. Its purposes are to (1) conduct research on theoretical questions concerning the achievement of controlled fusion energy by means of magnetic confinement--including both fundamental problems of long-range significance, as well as shorter-term issues; (2) serve as a national and international center for information exchange by hosting exchange visits, conferences, and workshops; and (3) train students and postdoctoral research personnel for the fusion energy program and plasma physics research areas. During FY 1995, a number of significant scientific advances were achieved at the IFS, both in long-range fundamental problems as well as in near-term strategic issues, consistent with the Institute's mandate. Examples of these achievements include, for example, tokamak edge physics, analytical and computational studies of ion-temperature-gradient-driven turbulent transport, alpha-particle-excited toroidal Alfven eigenmode nonlinear behavior, sophisticated simulations for the Numerical Tokamak Project, and a variety of non-tokamak and non-fusion basic plasma physics applications. Many of these projects were done in collaboration with scientists from other institutions. Research discoveries are briefly described in this report.

  13. Design, Results, Evolution and Status of the ATLAS Simulation at Point1 Project

    NASA Astrophysics Data System (ADS)

    Ballestrero, S.; Batraneanu, S. M.; Brasolin, F.; Contescu, C.; Fazio, D.; Di Girolamo, A.; Lee, C. J.; Pozo Astigarraga, M. E.; Scannicchio, D. A.; Sedov, A.; Twomey, M. S.; Wang, F.; Zaytsev, A.

    2015-12-01

    During the LHC Long Shutdown 1 (LS1) period, which started in 2013, the Simulation at Point1 (Sim@P1) project took advantage, in an opportunistic way, of the TDAQ (Trigger and Data Acquisition) HLT (High-Level Trigger) farm of the ATLAS experiment. This farm provides more than 1300 compute nodes, which are particularly suited for running event generation and Monte Carlo production jobs that are mostly CPU and not I/O bound. It is capable of running up to 2700 Virtual Machines (VMs) each with 8 CPU cores, for a total of up to 22000 parallel jobs. This contribution reviews the design, the results, and the evolution of the Sim@P1 project, which operates a large-scale OpenStack based virtualized platform deployed on top of the ATLAS TDAQ HLT farm computing resources. During LS1, Sim@P1 was one of the most productive ATLAS sites: it delivered more than 33 million CPU-hours and generated more than 1.1 billion Monte Carlo events. The design aspects are presented: the virtualization platform exploited by Sim@P1 avoids interference with TDAQ operations and guarantees the security and usability of the ATLAS private network. The cloud mechanism allows the separation of the needed support on both infrastructural (hardware, virtualization layer) and logical (Grid site support) levels. This paper focuses on the operational aspects of such a large system during the upcoming LHC Run 2 period: simple, reliable, and efficient tools are needed to quickly switch from Sim@P1 to TDAQ mode and back, to exploit the resources when they are not used for data acquisition, even for short periods. The evolution of the central OpenStack infrastructure is described, as it was upgraded from the Folsom to the Icehouse release, including the scalability issues addressed.

  14. UAS in the NAS Project: Large-Scale Communication Architecture Simulations with NASA GRC Gen5 Radio Model

    NASA Technical Reports Server (NTRS)

    Kubat, Gregory

    2016-01-01

    This report provides a description and performance characterization of the large-scale, relay-architecture UAS communications simulation capability developed for the NASA GRC UAS in the NAS Project. The system uses a validated model of the GRC Gen5 CNPC Flight-Test Radio. The report contains a description of the simulation system and its model components, recent changes made to the system to improve performance, descriptions and objectives of sample simulations used for test and verification, and a sample of results and performance data with observations.

  15. A Study of the Efficacy of Project-Based Learning Integrated with Computer-Based Simulation--STELLA

    ERIC Educational Resources Information Center

    Eskrootchi, Rogheyeh; Oskrochi, G. Reza

    2010-01-01

    Incorporating computer-simulation modelling into project-based learning may be effective but requires careful planning and implementation. Teachers, especially, need pedagogical content knowledge which refers to knowledge about how students learn from materials infused with technology. This study suggests that students learn best by actively…

  16. Projecting South Asian summer precipitation in CMIP3 models: A comparison of the simulations with and without black carbon

    NASA Astrophysics Data System (ADS)

    Li, Shuanglin; Mahmood, Rashed

    2017-02-01

    Considering the importance of black carbon (BC), this study first compared the 20th-century simulations of South Asian summer climate in IPCC CMIP3, grouping the models into those that include BC and those that do not. Generally, the multi-model mean of the models that include BC reproduced the observed climate better than that of those that did not. The 21st-century South Asian summer precipitation was then projected from the IPCC CMIP3 projection simulations. The precipitation projected by the present approach differed considerably from the multimodel ensemble mean (MME) of the IPCC AR4 projection simulations, and also from the MME of the models that ignore the effect of BC. In particular, the present projection exhibited a dry anomaly over the central Indian Peninsula, sandwiched between wet conditions on the southern and northern sides of Pakistan and India, rather than the homogeneous wet conditions seen in the MME of IPCC AR4. Thus, the spatial pattern of South Asian summer rainfall in the future may be more complicated than previously thought.
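
    The grouping step behind such a comparison, a multi-model ensemble mean computed separately for models with and without BC, can be sketched as follows. The model names and anomaly fields are invented placeholders, not CMIP3 output:

```python
import numpy as np

rng = np.random.default_rng(2)
# (includes_BC, summer precipitation anomaly field on a lat x lon grid)
models = {
    "model_A": (True, rng.normal(1.0, 0.3, (10, 12))),
    "model_B": (True, rng.normal(0.8, 0.3, (10, 12))),
    "model_C": (False, rng.normal(1.5, 0.3, (10, 12))),
    "model_D": (False, rng.normal(1.4, 0.3, (10, 12))),
}

def mme(models, with_bc):
    """Multi-model ensemble mean over the models in one BC group."""
    fields = [field for has_bc, field in models.values() if has_bc == with_bc]
    return np.mean(fields, axis=0)

# the field examined in such studies: how the BC group changes the projection
diff = mme(models, with_bc=True) - mme(models, with_bc=False)
print(diff.shape)   # (10, 12)
```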

  17. Accelerators for Fusion Materials Testing

    NASA Astrophysics Data System (ADS)

    Knaster, Juan; Okumura, Yoshikazu

    Fusion materials research is a worldwide endeavor as old as the parallel one working toward the long term stable confinement of ignited plasma. In a fusion reactor, the preservation of the required minimum thermomechanical properties of the in-vessel components exposed to the severe irradiation and heat flux conditions is an indispensable factor for safe operation; it is also an essential goal for the economic viability of fusion. Energy from fusion power will be extracted from the 14 MeV neutron freed as a product of the deuterium-tritium fusion reactions; thus, this kinetic energy must be absorbed and efficiently evacuated and electricity eventually generated by the conventional methods of a thermal power plant. Worldwide technological efforts to understand the degradation of materials exposed to 14 MeV neutron fluxes >10¹⁸ m⁻²s⁻¹, as expected in future fusion power plants, have been intense over the last four decades. Existing neutron sources can reach suitable dpa (“displacement-per-atom”, the figure of merit to assess materials degradation from being exposed to neutron irradiation), but the differences in the neutron spectrum of fission reactors and spallation sources do not allow one to unravel the physics and to anticipate the degradation of materials exposed to fusion neutrons. Fusion irradiation conditions can be achieved through Li (d, xn) nuclear reactions with suitable deuteron beam current and energy, and an adequate flowing lithium screen. This idea triggered in the late 1970s at Los Alamos National Laboratory (LANL) a campaign working toward the feasibility of continuous wave (CW) high current linacs framed by the Fusion Materials Irradiation Test (FMIT) project. These efforts continued with the Low Energy Demonstrating Accelerator (LEDA) (a validating prototype of the canceled Accelerator Production of Tritium (APT) project), which was proposed in 2002 to the fusion community as a 6.7 MeV, 100 mA CW beam injector for a Li (d, xn) source to bridge

  18. Accelerators for Fusion Materials Testing

    NASA Astrophysics Data System (ADS)

    Knaster, Juan; Okumura, Yoshikazu

    Fusion materials research is a worldwide endeavor as old as the parallel one working toward the long-term stable confinement of ignited plasma. In a fusion reactor, the preservation of the required minimum thermomechanical properties of the in-vessel components exposed to the severe irradiation and heat flux conditions is an indispensable factor for safe operation; it is also an essential goal for the economic viability of fusion. Energy from fusion power will be extracted from the 14 MeV neutron freed as a product of the deuterium-tritium fusion reactions; thus, this kinetic energy must be absorbed and efficiently evacuated, and electricity eventually generated by the conventional methods of a thermal power plant. Worldwide technological efforts to understand the degradation of materials exposed to 14 MeV neutron fluxes > 10^18 m^-2 s^-1, as expected in future fusion power plants, have been intense over the last four decades. Existing neutron sources can reach suitable dpa ("displacement per atom", the figure of merit used to assess materials degradation under neutron irradiation), but the differences in the neutron spectra of fission reactors and spallation sources do not allow one to unravel the physics or to anticipate the degradation of materials exposed to fusion neutrons. Fusion irradiation conditions can be achieved through Li (d, xn) nuclear reactions with suitable deuteron beam current and energy, and an adequate flowing lithium screen. This idea triggered, in the late 1970s at Los Alamos National Laboratory (LANL), a campaign working toward the feasibility of continuous-wave (CW) high-current linacs, framed by the Fusion Materials Irradiation Test (FMIT) project. These efforts continued with the Low Energy Demonstration Accelerator (LEDA) (a validating prototype of the canceled Accelerator Production of Tritium (APT) project), which was proposed in 2002 to the fusion community as a 6.7 MeV, 100 mA CW beam injector for a Li (d, xn) source to bridge
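
    As an order-of-magnitude illustration of the quoted flux, the sketch below integrates a constant 10^18 m^-2 s^-1 neutron flux over a year and converts the resulting fluence to a rough dpa figure. The effective displacement cross-section is an assumed illustrative value, not one taken from the abstract.

```python
# Rough fluence and dpa estimate for a fusion-relevant neutron flux.
# The displacement cross-section below is an illustrative assumption.

SECONDS_PER_YEAR = 3.156e7

def fluence(flux_m2s, seconds):
    """Integrated neutron fluence (n/m^2) for a constant flux."""
    return flux_m2s * seconds

def dpa_estimate(fluence_m2, sigma_dpa_m2):
    """dpa ~ fluence times an effective displacement cross-section (m^2)."""
    return fluence_m2 * sigma_dpa_m2

flux = 1e18                               # n m^-2 s^-1, as quoted for future plants
phi_t = fluence(flux, SECONDS_PER_YEAR)   # annual fluence, ~3.2e25 n/m^2
# assumed effective displacement cross-section of ~1000 barns = 1e-25 m^2
dpa_per_year = dpa_estimate(phi_t, 1e-25)
print(f"annual fluence = {phi_t:.2e} n/m^2, ~{dpa_per_year:.1f} dpa/yr")
```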

  19. Consequences of simulating terrestrial N dynamics for projecting future terrestrial C storage

    NASA Astrophysics Data System (ADS)

    Zaehle, S.; Friend, A. D.; Friedlingstein, P.

    2009-04-01

    We present results of a new land surface model, O-CN, which includes a process-based coupling between the terrestrial cycling of energy, water, carbon, and nitrogen. The model represents the controls of terrestrial nitrogen (N) cycling on carbon (C) pools and fluxes through photosynthesis, respiration, changes in allocation patterns, as well as soil organic matter decomposition, and explicitly accounts for N leaching and gaseous losses. O-CN has been shown to give realistic results in comparison to observations at a wide range of scales, including in situ flux measurements, productivity databases, and atmospheric CO2 concentration data. Notably, O-CN simulates realistic responses of net primary productivity, foliage area, and foliage N content to elevated atmospheric [CO2], as evidenced at free-air carbon dioxide enrichment (FACE) sites (Duke, Oak Ridge). We re-examine earlier model-based assessments of the terrestrial C sequestration potential using a global transient O-CN simulation driven by increases in atmospheric [CO2], N deposition, and climatic changes over the 21st century. We find that accounting for terrestrial N cycling about halves the potential to store C in response to increases in atmospheric CO2 concentrations, mainly due to a reduction of the net C uptake in temperate and boreal forests. Nitrogen deposition partially alleviates the effect of N limitation, but falls far short of compensating for the effect completely. These findings underline the importance of an accurate representation of nutrient limitations in future projections of the terrestrial net CO2 exchanges and therefore in land-climate feedback studies.
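
    A toy sketch (not the O-CN scheme) of why N cycling caps the CO2 response: carbon uptake is limited by the smaller of potential productivity and the carbon that available N can support at an assumed tissue C:N ratio. All numbers are illustrative.

```python
# Toy illustration of nitrogen limitation on carbon uptake (not O-CN):
# available N caps the carbon that can be fixed, via an assumed C:N ratio.

def realized_c_uptake(potential_c, n_supply, cn_ratio=30.0):
    """Carbon uptake limited by potential NPP or by available N * C:N."""
    n_limited_c = n_supply * cn_ratio
    return min(potential_c, n_limited_c)

# Elevated CO2 may double potential uptake, but a fixed N supply
# allows only a fraction of that gain to be realized:
baseline = realized_c_uptake(potential_c=500.0, n_supply=20.0)   # C-limited
elevated = realized_c_uptake(potential_c=1000.0, n_supply=20.0)  # N-limited
print(baseline, elevated)
```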

  20. Interannual tropical rainfall variability in general circulation model simulations associated with the atmospheric model intercomparison project

    SciTech Connect

    Sperber, K.R.; Palmer, T.N.

    1996-11-01

    The interannual variability of rainfall over the Indian subcontinent, the African Sahel, and the Nordeste region of Brazil has been evaluated in 32 models for the period 1979-88 as part of the Atmospheric Model Intercomparison Project (AMIP). The interannual variations of Nordeste rainfall are the most readily captured, owing to the intimate link with Pacific and Atlantic sea surface temperatures. The precipitation variations over India and the Sahel are less well simulated. Additionally, an Indian monsoon wind shear index was calculated for each model. The subset of models that captured the observed relationship between the wind shear index and rainfall also had a rainfall climatology that was in better agreement with observations, indicating a link between systematic model error and the ability to simulate interannual variations. A suite of six European Centre for Medium-Range Weather Forecasts (ECMWF) AMIP runs (differing only in their initial conditions) has also been examined. As observed, all-India rainfall was enhanced in 1988 relative to 1987 in each of these realizations. All-India rainfall variability during other years showed little or no predictability, possibly due to internal chaotic dynamics associated with intraseasonal monsoon fluctuations and/or unpredictable land surface process interactions. The interannual variations of Nordeste rainfall were best represented. The State University of New York at Albany/National Center for Atmospheric Research Genesis model was run in five initial-condition realizations. In this model, the Nordeste rainfall variability was also best reproduced. However, for all regions the skill was less than that of the ECMWF model. The relationships of the all-India and Sahel rainfall/SST teleconnections with horizontal resolution, convection scheme closure, and numerics have been evaluated. 64 refs., 13 figs., 3 tabs.
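
    The kind of skill measure behind such statements can be sketched as an anomaly correlation between simulated and observed yearly rainfall. The ten yearly values below are fabricated for illustration, not AMIP data.

```python
# Anomaly correlation of simulated vs observed interannual rainfall,
# a minimal skill metric of the kind used in AMIP-style comparisons.

import statistics

def anomalies(series):
    """Deviations from the series mean."""
    mean = statistics.fmean(series)
    return [x - mean for x in series]

def anomaly_correlation(sim, obs):
    """Pearson correlation of the two anomaly series."""
    sa, oa = anomalies(sim), anomalies(obs)
    num = sum(s * o for s, o in zip(sa, oa))
    den = (sum(s * s for s in sa) * sum(o * o for o in oa)) ** 0.5
    return num / den

obs = [850, 790, 900, 760, 880, 810, 870, 800, 920, 840]   # mm/yr (fabricated)
sim = [830, 800, 890, 770, 860, 820, 850, 810, 900, 850]
r = anomaly_correlation(sim, obs)
print(f"anomaly correlation r = {r:.2f}")
```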

  1. Superconducting magnets for fusion applications

    SciTech Connect

    Henning, C.D.

    1987-07-02

    Fusion magnet technology has made spectacular advances in the past decade; to wit, the Mirror Fusion Test Facility and the Large Coil Project. However, further advances are still required for advanced, economical fusion reactors. Higher fields up to 14 T and radiation-hardened superconductors and insulators will be necessary. Coupled with high rates of nuclear heating and pulsed losses, the next-generation magnets will need still higher current density, better stability, and quench protection. Cable-in-conduit conductors coupled with polyimide insulations and better steels seem to be the appropriate path. Neutron fluences up to 10^19 neutrons/cm^2 in niobium-tin are achievable. In the future, other amorphous superconductors could raise these limits further to extend reactor life or decrease the neutron shielding and corresponding reactor size.

  2. Laser fusion experiments at LLL

    SciTech Connect

    Ahlstrom, H.G.

    1980-06-16

    These notes present the experimental basis and status for laser fusion as developed at LLL. Two other chapters, one authored by K.A. Brueckner and the other by C. Max, present the theoretical implosion physics and laser plasma interaction physics. The notes consist of six sections. The first is an introductory section which provides some of the history of inertial fusion and a simple explanation of the concepts involved. The second section presents an extensive discussion of diagnostic instrumentation used in the LLL Laser Fusion Program. The third section is a presentation of laser facilities and capabilities at LLL. The purpose here is to define capability, not to derive how it was obtained. The fourth and fifth sections present the experimental data on laser-plasma interaction and implosion physics. The last chapter is a short projection of the future.

  3. Z-Pinch Fusion Propulsion

    NASA Technical Reports Server (NTRS)

    Miernik, Janie

    2011-01-01

    Fusion-based nuclear propulsion has the potential to enable fast interplanetary transportation. Shorter trips are better for humans in the harmful radiation environment of deep space. Nuclear propulsion and power plants can enable high Isp and payload mass fractions because they require less fuel mass. Fusion energy research has characterized the Z-Pinch dense plasma focus method. (1) Lightning is a form of pinched-plasma electrical discharge phenomenon. (2) Wire-array Z-Pinch experiments are commonly studied, and nuclear power plant configurations have been proposed. (3) Used in the field of Nuclear Weapons Effects (NWE) testing in the defense industry, nuclear weapon x-rays are simulated through Z-Pinch phenomena.

  4. Photo-fusion reactions in a new compact device for ELI

    SciTech Connect

    Moustaizis, S. D.; Auvray, P.; Hora, H.; Lalousis, P.; Larour, J.; Mourou, G.

    2012-07-09

    In the last few years, significant progress in technological, experimental, and numerical studies of fusion processes in high-density, high-temperature plasmas produced by high-intensity laser pulse interaction with clusters in a strong externally applied magnetic field has enabled us to propose a compact photo-fusion magnetic device for high neutron production. For the purposes of the project, a pulsed magnetic field driver with values up to 110 T has been developed, which allows increasing the trapping time of the high-density plasma in the device and improving the neutron yield. Numerical simulations show that the proposed device is capable of producing up to 10^9-10^10 neutrons per laser shot with an external magnetic field of 150 T. The proposed device can be used for experiments and numerical code validation concerning different conventional and/or exotic fusion fuels.

  5. Revised Comparisons of Simulated Hydrodynamics and Water Quality for Projected Demands in 2046, Pueblo Reservoir, Southeastern Colorado

    USGS Publications Warehouse

    Ortiz, Roderick F.; Miller, Lisa D.

    2009-01-01

    Pueblo Reservoir is one of southeastern Colorado's most valuable water resources. The reservoir provides irrigation, municipal, and industrial water to various entities throughout the region. The reservoir also provides flood control, recreational activities, sport fishing, and wildlife enhancement to the region. The Southern Delivery System (SDS) project is a regional water-delivery project that has been proposed to provide a safe, reliable, and sustainable water supply through the foreseeable future (2046) for Colorado Springs, Fountain, Security, and Pueblo West. Discussions with the Bureau of Reclamation and the U.S. Geological Survey led to a cooperative agreement to simulate the hydrodynamics and water quality of Pueblo Reservoir. This work has been completed and described in a previously published report, U.S. Geological Survey Scientific Investigations Report 2008-5056. Additionally, there was a need to make comparisons of simulated hydrodynamics and water quality for projected demands associated with the various Environmental Impact Statements (EIS) alternatives and plans by Pueblo West to discharge treated wastewater into the reservoir. Wastewater plans by Pueblo West are fully independent of the SDS project. This report compares simulated hydrodynamics and water quality for projected demands in Pueblo Reservoir resulting from changes in inflow and water quality entering the reservoir, and from changes to withdrawals from the reservoir as projected for the year 2046. Four of the seven EIS alternatives were selected for scenario simulations. The four U.S. Geological Survey simulation scenarios were the No Action scenario (EIS Alternative 1), the Downstream Diversion scenario (EIS Alternative 2), the Upstream Return-Flow scenario (EIS Alternative 4), and the Upstream Diversion scenario (EIS Alternative 7). 
Additionally, the results of an Existing Conditions scenario (year 2006 demand conditions) were compared to the No Action scenario (projected demands in

  6. Inertial confinement fusion

    SciTech Connect

    Powers, L.; Condouris, R.; Kotowski, M.; Murphy, P.W.

    1992-01-01

    This issue of the ICF Quarterly contains seven articles that describe recent progress in Lawrence Livermore National Laboratory's ICF program. The Department of Energy recently initiated an effort to design a 1--2 MJ glass laser, the proposed National Ignition Facility (NIF). These articles span various aspects of a program which is aimed at moving forward toward such a facility by continuing to use the Nova laser to gain understanding of NIF-relevant target physics, by developing concepts for an NIF laser driver, and by envisioning a variety of applications for larger ICF facilities. This report discusses research on the following topics: Stimulated Rotational Raman Scattering in Nitrogen; A Maxwell Equation Solver in LASNEX for the Simulation of Moderately Intense Ultrashort Pulse Experiments; Measurements of Radial Heat-Wave Propagation in Laser-Produced Plasmas; Laser-Seeded Modulation Growth on Directly Driven Foils; Stimulated Raman Scattering in Large-Aperture, High-Fluence Frequency-Conversion Crystals; Fission Product Hazard Reduction Using Inertial Fusion Energy; Use of Inertial Confinement Fusion for Nuclear Weapons Effects Simulations.

  7. The CMIP6 Sea-Ice Model Intercomparison Project (SIMIP): Understanding sea ice through climate-model simulations

    DOE PAGES

    Notz, Dirk; Jahn, Alexandra; Holland, Marika; ...

    2016-09-23

    A better understanding of the role of sea ice for the changing climate of our planet is the central aim of the diagnostic Coupled Model Intercomparison Project 6 (CMIP6)-endorsed Sea-Ice Model Intercomparison Project (SIMIP). To reach this aim, SIMIP requests sea-ice-related variables from climate-model simulations that allow for a better understanding and, ultimately, improvement of biases and errors in sea-ice simulations with large-scale climate models. This then allows us to better understand to what degree CMIP6 model simulations relate to reality, thus improving our confidence in answering sea-ice-related questions based on these simulations. Furthermore, the SIMIP protocol provides a standard for sea-ice model output that will streamline and hence simplify the analysis of the simulated sea-ice evolution in research projects independent of CMIP. To reach its aims, SIMIP provides a structured list of model output that allows for an examination of the three main budgets that govern the evolution of sea ice, namely the heat budget, the momentum budget, and the mass budget. Furthermore, we explain the aims of SIMIP in more detail and outline how its design allows us to answer some of the most pressing questions that sea ice still poses to the international climate-research community.

  8. The CMIP6 Sea-Ice Model Intercomparison Project (SIMIP): Understanding sea ice through climate-model simulations

    SciTech Connect

    Notz, Dirk; Jahn, Alexandra; Holland, Marika; Hunke, Elizabeth; Massonnet, François; Stroeve, Julienne; Tremblay, Bruno; Vancoppenolle, Martin

    2016-09-23

    A better understanding of the role of sea ice for the changing climate of our planet is the central aim of the diagnostic Coupled Model Intercomparison Project 6 (CMIP6)-endorsed Sea-Ice Model Intercomparison Project (SIMIP). To reach this aim, SIMIP requests sea-ice-related variables from climate-model simulations that allow for a better understanding and, ultimately, improvement of biases and errors in sea-ice simulations with large-scale climate models. This then allows us to better understand to what degree CMIP6 model simulations relate to reality, thus improving our confidence in answering sea-ice-related questions based on these simulations. Furthermore, the SIMIP protocol provides a standard for sea-ice model output that will streamline and hence simplify the analysis of the simulated sea-ice evolution in research projects independent of CMIP. To reach its aims, SIMIP provides a structured list of model output that allows for an examination of the three main budgets that govern the evolution of sea ice, namely the heat budget, the momentum budget, and the mass budget. Furthermore, we explain the aims of SIMIP in more detail and outline how its design allows us to answer some of the most pressing questions that sea ice still poses to the international climate-research community.

  9. The CMIP6 Sea-Ice Model Intercomparison Project (SIMIP): understanding sea ice through climate-model simulations

    NASA Astrophysics Data System (ADS)

    Notz, Dirk; Jahn, Alexandra; Holland, Marika; Hunke, Elizabeth; Massonnet, François; Stroeve, Julienne; Tremblay, Bruno; Vancoppenolle, Martin

    2016-09-01

    A better understanding of the role of sea ice for the changing climate of our planet is the central aim of the diagnostic Coupled Model Intercomparison Project 6 (CMIP6)-endorsed Sea-Ice Model Intercomparison Project (SIMIP). To reach this aim, SIMIP requests sea-ice-related variables from climate-model simulations that allow for a better understanding and, ultimately, improvement of biases and errors in sea-ice simulations with large-scale climate models. This then allows us to better understand to what degree CMIP6 model simulations relate to reality, thus improving our confidence in answering sea-ice-related questions based on these simulations. Furthermore, the SIMIP protocol provides a standard for sea-ice model output that will streamline and hence simplify the analysis of the simulated sea-ice evolution in research projects independent of CMIP. To reach its aims, SIMIP provides a structured list of model output that allows for an examination of the three main budgets that govern the evolution of sea ice, namely the heat budget, the momentum budget, and the mass budget. In this contribution, we explain the aims of SIMIP in more detail and outline how its design allows us to answer some of the most pressing questions that sea ice still poses to the international climate-research community.
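
    A hedged sketch of the kind of budget analysis the SIMIP output enables: check that individual mass-budget terms sum to the diagnosed thickness change. The term names and values below are illustrative, not SIMIP's actual variable list.

```python
# Closure check for a sea-ice mass budget: the sum of process terms
# should equal the total thickness change diagnosed from the ice state.
# Term names and numbers are illustrative.

budget_terms = {           # m of ice per month (made-up values)
    "basal_growth":   0.30,
    "frazil_growth":  0.05,
    "top_melt":      -0.10,
    "basal_melt":    -0.08,
    "dynamics":       0.02,   # ridging/advection
}

total_change = 0.19        # m/month, diagnosed from the thickness field

residual = total_change - sum(budget_terms.values())
assert abs(residual) < 1e-9, f"budget does not close: residual {residual}"
print(f"budget closes; residual = {residual:.2e} m/month")
```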

  10. The JUMP student project: two weeks of space simulation in a Mars-like environment.

    NASA Astrophysics Data System (ADS)

    de Crombrugghe, Guerric; de Lobkowicz, Ysaline; van Vynckt, Delphine; Reydams, Marc; Denies, Jonathan; Jago, Alban; Le Maire, Victor

    JUMP is a student initiative whose aim is to simulate, over two weeks, the life of astronauts in a Mars-like environment. The simulation will be held in the Mars Desert Research Station (MDRS), a habitat installed by the Mars Society (MS) in the Utah desert. The crew is composed of six students, helped by a remote support team of four students, all from different backgrounds (engineering, physics, mathematics, biology, and architecture) and degrees (bachelor, master, PhD), under the supervision of researchers from several institutes. Several research projects will be conducted during the simulation. We shall report on the scientific and technical results, and implications for Earth-Mars comparative studies. JASE: The Jump Astronaut Safety Experiment (JASE) consists of a deployable Yagi antenna with basic electronics, providing an extremely light and simple way to monitor solar flares and observe Jupiter bursts. JADE: The Jump Angular Detection Experiment (JADE) is an innovative angular particle detector used to determine the irradiation of the surface and monitor the charged particle distribution in Mars' atmosphere. Although its resolution is low, it is a very light solution compared to pixel detectors. JAPE: The Jump Astronaut Potatoes Experiment (JAPE) will try to grow and eat, in a space-like environment, high-performance potatoes developed by the Groupe de Recherche en Physiologie Végétale (GRPV) of the UCL in the frame of the Micro-Ecological Life Support System Alternative (MELiSSA) project of ESA. JABE: The Jump soil Analysis with a Backpack drill Experiment (JABE) aims to validate a sampling procedure, generate vertical humidity profiles with a MEMS sensor, and analyze soil samples with a spectrometer. The crew will therefore use a backpack drill, which is portable, fast, and easy to use. 
JARE: The goal of the Jump Astronaut-Rover interaction Experiment (JARE) is to determine how a rover can help an astronaut in his task, and how it is possible to improve this

  11. Comparison of projection skills of deterministic ensemble methods using pseudo-simulation data generated from multivariate Gaussian distribution

    NASA Astrophysics Data System (ADS)

    Oh, Seok-Geun; Suh, Myoung-Seok

    2016-03-01

    The projection skills of five ensemble methods were analyzed according to simulation skills, training period, and ensemble members, using 198 sets of pseudo-simulation data (PSD) produced by random number generation assuming the simulated temperature of regional climate models. The PSD sets were classified into 18 categories according to the relative magnitude of bias, variance ratio, and correlation coefficient, where each category had 11 sets (including 1 truth set) with 50 samples. The ensemble methods used were as follows: equal weighted averaging without bias correction (EWA_NBC), EWA with bias correction (EWA_WBC), weighted ensemble averaging based on root mean square errors and correlation (WEA_RAC), WEA based on the Taylor score (WEA_Tay), and multivariate linear regression (Mul_Reg). The projection skills of the ensemble methods generally improved compared with the best member for each category. However, their projection skills are significantly affected by the simulation skills of the ensemble members. The weighted ensemble methods showed better projection skills than non-weighted methods, in particular, for the PSD categories having systematic biases and various correlation coefficients. The EWA_NBC showed considerably lower projection skills than the other methods, in particular, for the PSD categories with systematic biases. Although Mul_Reg showed relatively good skills, it showed strong sensitivity to the PSD categories, training periods, and number of members. On the other hand, the WEA_Tay and WEA_RAC showed relatively superior skills in both accuracy and reliability for all the sensitivity experiments. This indicates that WEA_Tay and WEA_RAC are applicable even for simulation data with systematic biases, a short training period, and a small number of ensemble members.
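
    A minimal sketch in the spirit of the RMSE-based weighted methods: members with smaller training-period RMSE receive larger weights. This is an illustrative inverse-RMSE weighting, not the paper's exact WEA_RAC or WEA_Tay formulation, and the data are fabricated.

```python
# Weighted ensemble averaging: weight each member by inverse RMSE
# against observations over a training period (illustrative scheme).

def rmse(sim, obs):
    return (sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(obs)) ** 0.5

def weighted_ensemble(members, obs):
    """Return the weighted ensemble series and the normalized weights."""
    inv = [1.0 / max(rmse(m, obs), 1e-12) for m in members]
    total = sum(inv)
    weights = [w / total for w in inv]
    n = len(obs)
    ens = [sum(w * m[i] for w, m in zip(weights, members)) for i in range(n)]
    return ens, weights

obs = [15.0, 16.2, 14.8, 15.5]          # "observed" temperatures (fabricated)
members = [
    [15.1, 16.0, 14.9, 15.6],           # skillful member -> high weight
    [17.0, 18.5, 16.9, 17.8],           # biased member   -> low weight
]
ens, weights = weighted_ensemble(members, obs)
print(weights, ens)
```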

  12. Future projections of precipitation characteristics in East Asia simulated by the MRI CGCM2

    NASA Astrophysics Data System (ADS)

    Kitoh, Akio; Hosaka, Masahiro; Adachi, Yukimasa; Kamiguchi, Kenji

    2005-07-01

    Projected changes in precipitation characteristics around the mid-21st century and end-of-the-century are analyzed using the daily precipitation output of the 3-member ensemble Meteorological Research Institute global ocean-atmosphere coupled general circulation model (MRI-CGCM2) simulations under the Special Report on Emissions Scenarios (SRES) A2 and B2 scenarios. It is found that both the frequency and intensity increase in about 40% of the globe, while both the frequency and intensity decrease in about 20% of the globe. These numbers differ only a few percent from decade to decade of the 21st century and between the A2 and B2 scenarios. Over the rest of the globe (about one third), the precipitation frequency decreases but its intensity increases, suggesting a shift of precipitation distribution toward more intense events under global warming. South China is such a region, where the summertime wet-day frequency decreases but the precipitation intensity increases. This is related to increased atmospheric moisture content due to global warming and an intensified and more westwardly extended North Pacific subtropical anticyclone, which may be related to an El Niño-like mean sea surface temperature change. On the other hand, a decrease in summer precipitation is noted in North China, thus augmenting the south-to-north precipitation contrast in the future.
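
    The two precipitation characteristics analyzed, wet-day frequency and wet-day intensity, can be computed from a daily series as below. The 1 mm wet-day threshold is a common convention assumed here, and the daily series are fabricated.

```python
# Wet-day frequency and mean intensity on wet days from daily precipitation.
# The 1 mm threshold is an assumed convention; data are illustrative.

def precip_characteristics(daily_mm, wet_threshold=1.0):
    wet = [p for p in daily_mm if p >= wet_threshold]
    frequency = len(wet) / len(daily_mm)              # fraction of wet days
    intensity = sum(wet) / len(wet) if wet else 0.0   # mm per wet day
    return frequency, intensity

# Fewer wet days but heavier rain -> lower frequency, higher intensity:
present = [0.0, 2.0, 5.0, 0.2, 3.0, 0.0, 4.0, 1.0]
future  = [0.0, 0.0, 9.0, 0.0, 6.0, 0.0, 8.0, 0.5]
f0, i0 = precip_characteristics(present)
f1, i1 = precip_characteristics(future)
print(f0, i0, f1, i1)
```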

  13. Evaluation of Arctic Sea Ice Thickness Simulated by Arctic Ocean Model Intercomparison Project Models

    NASA Technical Reports Server (NTRS)

    Johnson, Mark; Proshuntinsky, Andrew; Aksenov, Yevgeny; Nguyen, An T.; Lindsay, Ron; Haas, Christian; Zhang, Jinlun; Diansky, Nikolay; Kwok, Ron; Maslowski, Wieslaw; Hakkinen, Sirpa; Ashik, Igor; De Cuevas, Beverly

    2012-01-01

    Six Arctic Ocean Model Intercomparison Project model simulations are compared with estimates of sea ice thickness derived from pan-Arctic satellite freeboard measurements (2004-2008); airborne electromagnetic measurements (2001-2009); ice draft data from moored instruments in Fram Strait, the Greenland Sea, and the Beaufort Sea (1992-2008) and from submarines (1975-2000); and drill hole data from the Arctic basin, Laptev, and East Siberian marginal seas (1982-1986) and coastal stations (1998-2009). Although the six models differ in numerical methods, resolution, domain, forcing, and boundary conditions, they generally overestimate the thickness of measured ice thinner than approximately 2 m and underestimate the thickness of ice measured thicker than approximately 2 m. In the regions of flat immobile landfast ice (shallow Siberian Seas with depths less than 25-30 m), the models generally overestimate both the total observed sea ice thickness and rates of September and October ice growth from observations by more than 4 times and more than one standard deviation, respectively. The models do not reproduce conditions of fast ice formation and growth. Instead, the modeled fast ice is replaced with pack ice which drifts, generating ridges of increasing ice thickness, in addition to thermodynamic ice growth. Considering all observational data sets, the better correlations and smaller differences from observations are from the Estimating the Circulation and Climate of the Ocean, Phase II and Pan-Arctic Ice Ocean Modeling and Assimilation System models.
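
    The thin-ice/thick-ice comparison described can be sketched as a mean model-minus-observation bias split at 2 m observed thickness. The paired thickness values below are fabricated for illustration.

```python
# Mean model-minus-observation thickness bias, split at an observed
# 2 m threshold (illustrative, fabricated co-located pairs).

def bias_by_regime(model, observed, threshold=2.0):
    thin  = [m - o for m, o in zip(model, observed) if o < threshold]
    thick = [m - o for m, o in zip(model, observed) if o >= threshold]
    mean = lambda xs: sum(xs) / len(xs) if xs else float("nan")
    return mean(thin), mean(thick)

obs   = [0.5, 1.2, 1.8, 2.5, 3.4, 4.0]   # m, observed thickness
model = [0.9, 1.5, 2.0, 2.2, 2.9, 3.3]   # m, simulated at same locations
thin_bias, thick_bias = bias_by_regime(model, obs)
print(f"thin-ice bias {thin_bias:+.2f} m, thick-ice bias {thick_bias:+.2f} m")
```

    A positive thin-ice bias with a negative thick-ice bias reproduces the qualitative pattern the abstract reports.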

  14. Fusion energy

    NASA Astrophysics Data System (ADS)

    1990-09-01

    The main purpose of the International Thermonuclear Experimental Reactor (ITER) is to develop an experimental fusion reactor through the united efforts of many technologically advanced countries. The ITER terms of reference, issued jointly by the European Community, Japan, the USSR, and the United States, call for an integrated international design activity and constitute the basis of current activities. Joint work on ITER is carried out under the auspices of the International Atomic Energy Agency (IAEA), according to the terms of a quadripartite agreement reached between the European Community, Japan, the USSR, and the United States. The site for joint technical work sessions is at the Max Planck Institute of Plasma Physics, Garching, Federal Republic of Germany. The ITER activities have two phases: a definition phase performed in 1988 and the present design phase (1989 to 1990). During the definition phase, a set of ITER technical characteristics and supporting research and development (R and D) activities were developed and reported. The present conceptual design phase of ITER lasts until the end of 1990. The objectives of this phase are to develop the design of ITER, perform a safety and environmental analysis, develop site requirements, define future R and D needs, and estimate cost, manpower, and schedule for construction and operation. A final report will be submitted at the end of 1990. This paper summarizes progress in the ITER program during the 1989 design phase.

  15. Fusion energy

    SciTech Connect

    Not Available

    1990-09-01

    The main purpose of the International Thermonuclear Experimental Reactor (ITER) is to develop an experimental fusion reactor through the united efforts of many technologically advanced countries. The ITER terms of reference, issued jointly by the European Community, Japan, the USSR, and the United States, call for an integrated international design activity and constitute the basis of current activities. Joint work on ITER is carried out under the auspices of the International Atomic Energy Agency (IAEA), according to the terms of a quadripartite agreement reached between the European Community, Japan, the USSR, and the United States. The site for joint technical work sessions is at the Max Planck Institute of Plasma Physics, Garching, Federal Republic of Germany. The ITER activities have two phases: a definition phase performed in 1988 and the present design phase (1989--1990). During the definition phase, a set of ITER technical characteristics and supporting research and development (R&D) activities were developed and reported. The present conceptual design phase of ITER lasts until the end of 1990. The objectives of this phase are to develop the design of ITER, perform a safety and environmental analysis, develop site requirements, define future R&D needs, and estimate cost, manpower, and schedule for construction and operation. A final report will be submitted at the end of 1990. This paper summarizes progress in the ITER program during the 1989 design phase.

  16. Inertial fusion experiments and theory

    NASA Astrophysics Data System (ADS)

    Mima, Kunioki; Tikhonchuk, V.; Perlado, M.

    2011-09-01

    Inertial fusion research is approaching a critical milestone, namely the demonstration of ignition and burn. The world's largest high-power laser, the National Ignition Facility (NIF), is under operation at the Lawrence Livermore National Laboratory (LLNL), in the USA. Another ignition machine, Laser Mega Joule (LMJ), is under construction at the CEA/CESTA research centre in France. In relation to the National Ignition Campaign (NIC) at LLNL, worldwide studies on inertial fusion applications to energy production are growing. Advanced ignition schemes such as fast ignition, shock ignition and impact ignition, and the inertial fusion energy (IFE) technology are under development. In particular, the Fast Ignition Realization Experiment (FIREX) at the Institute of Laser Engineering (ILE), Osaka University, the OMEGA-EP project at the Laboratory for Laser Energetics (LLE), University of Rochester, and the HiPER project in the European Union (EU) for fast ignition and shock ignition are progressing. The IFE technology research and development are advanced in the frameworks of the HiPER project in the EU and the LIFE project in the USA. Laser technology developments in the USA, EU, Japan and Korea were major highlights in the IAEA FEC 2010. In this paper, the status and prospects of IFE science and technology are described.

  17. Magnetic-Nozzle Studies for Fusion Propulsion Applications: Gigawatt Plasma Source Operation and Magnetic Nozzle Analysis

    NASA Technical Reports Server (NTRS)

    Gilland, James H.; Mikellides, Ioannis; Mikellides, Pavlos; Gregorek, Gerald; Marriott, Darin

    2004-01-01

    This project has been a multiyear effort to assess the feasibility of a key process inherent to virtually all fusion propulsion concepts: the expansion of a fusion-grade plasma through a diverging magnetic field. Current fusion energy research touches on this process only indirectly through studies of plasma divertors designed to remove the fusion products from a reactor. This project was aimed at directly addressing propulsion system issues, without the expense of constructing a fusion reactor. Instead, the program designed, constructed, and operated a facility suitable for simulating fusion reactor grade edge plasmas, and for examining their expansion in an expanding magnetic nozzle. The approach was to create and accelerate a dense (up to 10^20 m^-3) plasma, stagnate it in a converging magnetic field to convert kinetic energy to thermal energy, and examine the subsequent expansion of the hot (hundreds of eV) plasma in a subsequent magnetic nozzle. Throughout the project, there has been a parallel effort between theoretical and numerical design and modelling of the experiment and the experiment itself. In particular, the MACH2 code was used to design and predict the performance of the magnetoplasmadynamic (MPD) plasma accelerator, and to design and predict the expected behavior of the magnetic field coils that could be added later. Progress to date includes the theoretical accelerator design and construction, development of the power and vacuum systems to accommodate the powers and mass flow rates of interest to our research, operation of the accelerator and comparison to theoretical predictions, and computational analysis of future magnetic field coils and the expected performance of an integrated source-nozzle experiment.

  18. Multimodel simulations of carbon monoxide: Comparison with observations and projected near-future changes

    NASA Astrophysics Data System (ADS)

    Shindell, D. T.; Faluvegi, G.; Stevenson, D. S.; Krol, M. C.; Emmons, L. K.; Lamarque, J.-F.; Pétron, G.; Dentener, F. J.; Ellingsen, K.; Schultz, M. G.; Wild, O.; Amann, M.; Atherton, C. S.; Bergmann, D. J.; Bey, I.; Butler, T.; Cofala, J.; Collins, W. J.; Derwent, R. G.; Doherty, R. M.; Drevet, J.; Eskes, H. J.; Fiore, A. M.; Gauss, M.; Hauglustaine, D. A.; Horowitz, L. W.; Isaksen, I. S. A.; Lawrence, M. G.; Montanaro, V.; Müller, J.-F.; Pitari, G.; Prather, M. J.; Pyle, J. A.; Rast, S.; Rodriguez, J. M.; Sanderson, M. G.; Savage, N. H.; Strahan, S. E.; Sudo, K.; Szopa, S.; Unger, N.; van Noije, T. P. C.; Zeng, G.

    2006-10-01

    We analyze present-day and future carbon monoxide (CO) simulations in 26 state-of-the-art atmospheric chemistry models run to study future air quality and climate change. In comparison with near-global satellite observations from the MOPITT instrument and local surface measurements, the models show large underestimates of Northern Hemisphere (NH) extratropical CO, while typically performing reasonably well elsewhere. The results suggest that year-round emissions, probably from fossil fuel burning in east Asia and seasonal biomass burning emissions in south-central Africa, are greatly underestimated in current inventories such as IIASA and EDGAR3.2. Variability among models is large, likely resulting primarily from intermodel differences in representations and emissions of nonmethane volatile organic compounds (NMVOCs) and in hydrologic cycles, which affect OH and soluble hydrocarbon intermediates. Global mean projections of the 2030 CO response to emissions changes are quite robust. Global mean midtropospheric (500 hPa) CO increases by 12.6 ± 3.5 ppbv (16%) for the high-emissions (A2) scenario, by 1.7 ± 1.8 ppbv (2%) for the midrange (CLE) scenario, and decreases by 8.1 ± 2.3 ppbv (11%) for the low-emissions (MFR) scenario. Projected 2030 climate changes decrease global 500 hPa CO by 1.4 ± 1.4 ppbv. Local changes can be much larger. In response to climate change, substantial effects are seen in the tropics, but intermodel variability is quite large. The regional CO responses to emissions changes are robust across models, however. These range from decreases of 10-20 ppbv over much of the industrialized NH for the CLE scenario to CO increases worldwide and year-round under A2, with the largest changes over central Africa (20-30 ppbv), southern Brazil (20-35 ppbv) and south and east Asia (30-70 ppbv). The trajectory of future emissions thus has the potential to profoundly affect air quality over most of the world's populated areas.

  19. SU-E-T-58: A Novel Monte Carlo Photon Transport Simulation Scheme and Its Application in Cone Beam CT Projection Simulation

    SciTech Connect

    Xu, Y; Tian, Z; Jiang, S; Jia, X; Zhou, L

    2015-06-15

    Purpose: Monte Carlo (MC) simulation is an important tool for solving radiotherapy and medical imaging problems, but low computational efficiency hinders its wide application. Conventionally, MC is performed in a particle-by-particle fashion. The lack of control over particle trajectories is a main cause of low efficiency in some applications. Take cone beam CT (CBCT) projection simulation as an example: a significant amount of computation is wasted transporting photons that never reach the detector. To solve this problem, we propose an innovative MC simulation scheme with a path-by-path sampling method. Methods: Consider a photon path starting at the x-ray source. After going through a set of interactions, it ends at the detector. In the proposed scheme, we sampled an entire photon path each time. The Metropolis-Hastings algorithm was employed to accept or reject a sampled path based on a calculated acceptance probability, in order to maintain the correct relative probabilities among different paths, which are governed by photon transport physics. We developed a GPU package, gMMC, with this new scheme implemented. The performance of gMMC was tested on a sample problem of CBCT projection simulation for a homogeneous object. The results were compared to those obtained using gMCDRR, a GPU-based MC tool with the conventional particle-by-particle simulation scheme. Results: Calculated scattered photon signals in gMMC agreed with those from gMCDRR to within a relative difference of 3%. It took 3.1 hr for gMCDRR to simulate 7.8e11 photons and 246.5 sec for gMMC to simulate 1.4e10 paths. Under this setting, both results attained the same ∼2% statistical uncertainty. Hence, a speed-up factor of ∼45.3 was achieved by the new path-by-path simulation scheme, in which all computation is spent on photons that contribute to the detector signal. Conclusion: We propose a novel path-by-path simulation scheme that enables a significant efficiency enhancement for MC particle transport simulation.
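
    The accept/reject step at the heart of such a path-by-path scheme is ordinary Metropolis-Hastings. The gMMC implementation itself is not described in enough detail here to reproduce, so the following toy sketch (the exponential free-path target and all names are illustrative assumptions) only shows how a symmetric-proposal Metropolis-Hastings sampler maintains the correct relative probabilities among sampled states:

```python
import math
import random

def metropolis_hastings(log_prob, propose, x0, n_samples, seed=0):
    """Generic Metropolis-Hastings sampler with a symmetric proposal."""
    rng = random.Random(seed)
    x, lp = x0, log_prob(x0)
    samples = []
    for _ in range(n_samples):
        x_new = propose(x, rng)
        lp_new = log_prob(x_new)
        # Accept with probability min(1, p(new)/p(old)): this is what keeps
        # the relative probabilities among sampled states correct.
        if rng.random() < math.exp(min(0.0, lp_new - lp)):
            x, lp = x_new, lp_new
        samples.append(x)
    return samples

# Toy target: exponential free-path-length law p(x) ~ exp(-mu * x), x > 0
mu = 2.0
log_prob = lambda x: -mu * x if x > 0 else float("-inf")
propose = lambda x, rng: x + rng.gauss(0.0, 0.5)

paths = metropolis_hastings(log_prob, propose, x0=1.0, n_samples=200_000)
mean_free_path = sum(paths[10_000:]) / len(paths[10_000:])  # approaches 1/mu = 0.5
```

    In gMMC the state would be an entire multi-interaction photon path and the target probability would come from photon transport physics; the acceptance rule is the same.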

  20. The accomplishment of the Engineering Design Activities of IFMIF/EVEDA: The European-Japanese project towards a Li(d,xn) fusion relevant neutron source

    NASA Astrophysics Data System (ADS)

    Knaster, J.; Ibarra, A.; Abal, J.; Abou-Sena, A.; Arbeiter, F.; Arranz, F.; Arroyo, J. M.; Bargallo, E.; Beauvais, P.-Y.; Bernardi, D.; Casal, N.; Carmona, J. M.; Chauvin, N.; Comunian, M.; Delferriere, O.; Delgado, A.; Diaz-Arocas, P.; Fischer, U.; Frisoni, M.; Garcia, A.; Garin, P.; Gobin, R.; Gouat, P.; Groeschel, F.; Heidinger, R.; Ida, M.; Kondo, K.; Kikuchi, T.; Kubo, T.; Le Tonqueze, Y.; Leysen, W.; Mas, A.; Massaut, V.; Matsumoto, H.; Micciche, G.; Mittwollen, M.; Mora, J. C.; Mota, F.; Nghiem, P. A. P.; Nitti, F.; Nishiyama, K.; Ogando, F.; O'hira, S.; Oliver, C.; Orsini, F.; Perez, D.; Perez, M.; Pinna, T.; Pisent, A.; Podadera, I.; Porfiri, M.; Pruneri, G.; Queral, V.; Rapisarda, D.; Roman, R.; Shingala, M.; Soldaini, M.; Sugimoto, M.; Theile, J.; Tian, K.; Umeno, H.; Uriot, D.; Wakai, E.; Watanabe, K.; Weber, M.; Yamamoto, M.; Yokomine, T.

    2015-08-01

    The International Fusion Materials Irradiation Facility (IFMIF), presently in its Engineering Validation and Engineering Design Activities (EVEDA) phase under the frame of the Broader Approach Agreement between Europe and Japan, accomplished its EDA phase on schedule in summer 2013 with the release of the engineering design report of the IFMIF plant, which is described here. Many improvements of the design over former phases have been implemented: a reduction of beam losses and operational costs thanks to the superconducting accelerator concept; the relocation of the quench tank outside the test cell (TC), with a reduction of tritium inventory and a simplification of its replacement in case of failure; the separation of the irradiation modules from the shielding block, gaining irradiation flexibility, enhanced remote handling equipment reliability, and cost reduction; and the water cooling of the liner and biological shielding of the TC, enhancing the efficiency and economy of the related sub-systems. In addition, the maintenance strategy has been modified to allow a shorter yearly stop of irradiation operations and more careful management of the irradiated samples. The design of the IFMIF plant is intimately linked with the EVA phase carried out since the entry into force of IFMIF/EVEDA in June 2007. These activities and their on-going accomplishment have been thoroughly described elsewhere (Knaster J et al. [19]); combined with the present paper, they allow a clear understanding of the maturity of the European-Japanese international efforts. This released IFMIF Intermediate Engineering Design Report (IIEDR), which could be complemented if required concurrently with the outcome of the on-going EVA, will allow decision making on construction and/or serve as the basis for defining the next step, aligned with the evolving needs of our fusion community.

  1. Projected axis ratios of galaxy clusters in the Horizon-AGN simulation: Impact of baryon physics and comparison with observations

    NASA Astrophysics Data System (ADS)

    Suto, Daichi; Peirani, Sébastien; Dubois, Yohan; Kitayama, Tetsu; Nishimichi, Takahiro; Sasaki, Shin; Suto, Yasushi

    2017-02-01

    We characterize the non-sphericity of galaxy clusters by the projected axis ratios of the spatial distributions of stars, dark matter, and X-ray surface brightness (XSB). We select 40 simulated groups and clusters of galaxies with mass larger than 5 × 10^13 M⊙ from the Horizon-AGN simulation, which fully incorporates the relevant baryon physics, in particular active galactic nucleus feedback. We find that the baryonic physics around the central region of galaxy clusters significantly affects the non-sphericity of the dark matter distribution even beyond the central region, approximately up to half of the virial radius. It is therefore very difficult to predict the probability density function (PDF) of the projected axis ratio of the XSB from dark-matter-only N-body simulations, as attempted in previous studies. Indeed, we find that the PDF derived from our simulated clusters exhibits much better agreement with that from observed X-ray clusters. This indicates that our present methodology of estimating the non-sphericity directly from the Horizon-AGN simulation is useful and promising. Further improvements in both numerical modeling and observational data will establish the non-sphericity of clusters as a cosmological test complementary to more conventional statistics based on spherically averaged quantities.
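
    A projected axis ratio of this kind is commonly obtained from the eigenvalues of the 2-D second-moment tensor of the projected particle (or surface-brightness) distribution. A minimal sketch, assuming this standard moment-based definition rather than the authors' exact pipeline:

```python
import numpy as np

def projected_axis_ratio(x, y, weights=None):
    """Axis ratio q = b/a from eigenvalues of the weighted 2-D second-moment tensor."""
    w = np.ones_like(x) if weights is None else weights
    xc = x - np.average(x, weights=w)
    yc = y - np.average(y, weights=w)
    # Second-moment (quadrupole) tensor of the projected distribution
    M = np.array([[np.average(xc * xc, weights=w), np.average(xc * yc, weights=w)],
                  [np.average(xc * yc, weights=w), np.average(yc * yc, weights=w)]])
    lam = np.linalg.eigvalsh(M)               # eigenvalues in ascending order
    return float(np.sqrt(lam[0] / lam[1]))    # b/a, in (0, 1]

# Example: particles drawn from an elliptical Gaussian with true b/a = 0.5
rng = np.random.default_rng(1)
pts = rng.normal(size=(100_000, 2)) * [2.0, 1.0]
q = projected_axis_ratio(pts[:, 0], pts[:, 1])
```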

  2. A Reliability-Based Track Fusion Algorithm

    PubMed Central

    Xu, Li; Pan, Liqiang; Jin, Shuilin; Liu, Haibo; Yin, Guisheng

    2015-01-01

    The common track fusion algorithms in multi-sensor systems have some defects, such as serious imbalances between accuracy and computational cost, identical treatment of all sensor information regardless of quality, and high fusion errors at inflection points. To address these defects, a track fusion algorithm based on reliability (TFR) is presented for multi-sensor, multi-target environments. To improve information quality, outliers in the local tracks are first eliminated. The reliability of each local track is then calculated, and the local tracks with high reliability are chosen for state estimation fusion. In contrast to existing methods, TFR reduces high fusion errors at the inflection points of system tracks and obtains high accuracy at less computational cost. Simulation results verify the effectiveness and superiority of the algorithm in dense sensor environments. PMID:25950174
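
    The abstract does not give TFR's equations, but reliability-gated, information-weighted track fusion can be sketched generically. In the toy below, the function names, the threshold r_min, and the reliability-scaled inverse-covariance weighting are all illustrative assumptions, not the published algorithm: low-reliability local tracks are discarded, and the rest are combined by their information content.

```python
import numpy as np

def fuse_tracks(states, covariances, reliabilities, r_min=0.5):
    """Fuse local track states, keeping only tracks whose reliability exceeds r_min.

    Each kept track contributes a reliability-scaled inverse covariance
    (information matrix); the fused state is the information-weighted mean.
    """
    kept = [(s, np.linalg.inv(P) * r)
            for s, P, r in zip(states, covariances, reliabilities) if r >= r_min]
    if not kept:
        raise ValueError("no track meets the reliability threshold")
    W = sum(w for _, w in kept)               # total information
    fused_cov = np.linalg.inv(W)
    fused_state = fused_cov @ sum(w @ s for s, w in kept)
    return fused_state, fused_cov

# Two reliable sensors and one unreliable outlier tracking a 2-D position
states = [np.array([1.0, 2.0]), np.array([1.2, 1.9]), np.array([5.0, -3.0])]
covs = [np.eye(2) * 0.5, np.eye(2) * 0.8, np.eye(2) * 0.1]
rel = [0.9, 0.8, 0.2]                         # third track falls below r_min
x, P = fuse_tracks(states, covs, rel)         # x stays near [1.1, 1.95]
```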

  3. Projected Global Hydrologic Cycles Using New COMBINE Earth System Models from Multi-Model Multi-Scenario Simulation

    NASA Astrophysics Data System (ADS)

    Shadkam Torbati, S.; Kabat, P.; Ludwig, F.; Beyene, T.

    2011-12-01

    Simulating land surface hydrological states, fluxes, and drought requires a comprehensive set of atmospheric forcing data at consistent temporal and spatial scales that can be used to evaluate changes in the global hydrological cycle. The European integrating project COMBINE brings together research groups to advance Earth system models (ESMs) for more accurate climate projections and reduced uncertainty in climate prediction, by including key physical and biogeochemical processes. We report the current state of the art of the multi-scenario sensitivity of the global hydrological cycle, using available EU-WATCH historical data and future climate projections generated by COMBINE following the specifications of the Coupled Model Intercomparison Project (CMIP5) protocol for IPCC AR5. The choice of scenarios was made on the basis of the CMIP5 protocol, which recommends Representative Concentration Pathways 4.5 (RCP4.5) and 8.5 (RCP8.5) for the core climate projections to 2100 and RCP4.5 for the core decadal climate predictions to 2035. A detailed description of the bias-correction and spatial downscaling method used is given, and the data set will be evaluated by driving land surface hydrological models globally and at specific river basins as case studies. The project will be able to contribute to the IPCC AR5 data archives.

  4. The New England Climate Adaptation Project: Enhancing Local Readiness to Adapt to Climate Change through Role-Play Simulations

    NASA Astrophysics Data System (ADS)

    Rumore, D.; Kirshen, P. H.; Susskind, L.

    2014-12-01

    Despite scientific consensus that the climate is changing, local efforts to prepare for and manage climate change risks remain limited. How can we raise concern about climate change risks and enhance local readiness to adapt to climate change's effects? In this presentation, we will share the lessons learned from the New England Climate Adaptation Project (NECAP), a participatory action research project that tested science-based role-play simulations as a tool for educating the public about climate change risks and simulating collective risk management efforts. NECAP was a two-year effort involving the Massachusetts Institute of Technology, the Consensus Building Institute, the National Estuarine Research Reserve System, and four coastal New England municipalities. During 2012-2013, the NECAP team produced downscaled climate change projections, a summary risk assessment, and a stakeholder assessment for each partner community. Working with local partners, we used these assessments to create a tailored, science-based role-play simulation for each site. Through a series of workshops in 2013, NECAP engaged between 115 and 170 diverse stakeholders and members of the public in each partner municipality in playing the simulation and holding a follow-up conversation about local climate change risks and possible adaptation strategies. Data were collected through before-and-after surveys administered to all workshop participants, follow-up interviews with 25 percent of workshop participants, public opinion polls conducted before and after our intervention, and meetings with public officials. This presentation will report our research findings and explain how science-based role-play simulations can be used to communicate local climate change risks and enhance local readiness to adapt.

  5. Simulation and projection of summer surface air temperature over China: a comparison between a RCM and the driving global model

    NASA Astrophysics Data System (ADS)

    Li, Donghuan; Zhou, Tianjun; Zou, Liwei

    2016-04-01

    The regional climate model RegCM3 (version 3), with a horizontal resolution of 50 km, was employed to downscale historical and projected climate changes over the CORDEX East Asia domain, nested within the global climate system model FGOALS-g2 (Flexible Global Ocean-Atmosphere-Land System Model: Grid-point Version 2). Simulated (1986-2005) and projected (2046-2065) summer surface air temperature changes under the RCP8.5 scenario over China were compared between RegCM3 and FGOALS-g2. The air temperature indices used in this study were tmx (daily maximum temperature), t2m (daily average temperature), and tmn (daily minimum temperature); the extreme high-temperature indices were TXx (max tmx), TX90p (warm days), and WSDI (warm spell duration). Results indicated that both models could reasonably reproduce the climatological distribution of surface air temperature and extreme high-temperature events. Compared to the driving global climate model, the detailed characteristics of summer surface air temperature were better simulated in RegCM3 owing to its higher horizontal resolution. Under the RCP8.5 scenario, summer surface air temperature over China will increase significantly during the middle of the 21st century. RegCM3 projected a larger increase of tmx than of tmn over most regions of China, but in the western Tibetan Plateau the increase of tmn was larger. In the FGOALS-g2 projection, the changes of the three temperature indices (t2m, tmn, and tmx) were similar, with larger increases over northeastern China and the Tibetan Plateau. Extreme high-temperature events were projected to increase significantly in both models: TX90p is projected to increase by more than 60% relative to the present day, while WSDI is projected to double. Key words: Summer surface air temperature; Extreme high-temperature events; Regional climate model; Climate change

  6. Projecting Wind Energy Potential Under Climate Change with Ensemble of Climate Model Simulations

    NASA Astrophysics Data System (ADS)

    Jain, A.; Shashikanth, K.; Ghosh, S.; Mukherjee, P. P.

    2013-12-01

    Recent years have witnessed increasing global concern over energy sustainability and security, triggered by a number of issues such as (though not limited to) fossil fuel depletion, energy resource geopolitics, the economic efficiency versus population growth debate, environmental concerns, and climate change. Wind energy is a renewable and sustainable form of energy in which wind turbines convert the kinetic energy of wind into electrical energy. Global warming and differential surface heating may significantly impact wind velocity and hence wind energy potential. Sustainable design of wind mills requires understanding the impacts of climate change on wind energy potential, which we evaluate here with multiple General Circulation Models (GCMs). GCMs simulate climate variables globally under the greenhouse emission scenarios provided as Representative Concentration Pathways (RCPs). Here we use new-generation climate model outputs obtained from the Coupled Model Intercomparison Project 5 (CMIP5). We first compute the wind energy potential with reanalysis data (NCEP/NCAR) at a spatial resolution of 2.5°, where the gridded data are fitted to a Weibull distribution and, with the Weibull parameters, the wind energy densities are computed at different grids. The same methodology is then applied to the CMIP5 outputs (the resultant of U-wind and V-wind) of MRI, CMCC, BCC, CanESM, and INMCM4 for historical runs. This is performed separately for four seasons globally: MAM, JJA, SON, and DJF. We observe that the multi-model average of wind energy density for the historic period has significant bias with respect to that of the reanalysis product. Here we develop a quantile-based superensemble approach in which GCM quantiles corresponding to selected CDF values are regressed to reanalysis data. It is observed that this regression approach takes care of both the bias in the GCMs and the combination of GCMs. With the superensemble, we observe that the historical wind energy density resembles that of the reanalysis product quite well.
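
    The Weibull step described above can be sketched compactly. Assuming the standard mean-power-density formula for a two-parameter Weibull, E = ½ρc³Γ(1 + 3/k), and the common moment-based (Justus) shape estimator — the abstract does not state which fitting method the authors used:

```python
import numpy as np
from math import gamma

RHO = 1.225  # air density, kg/m^3

def wind_energy_density(speeds):
    """Mean wind power density (W/m^2) from a Weibull fit to wind speeds.

    Moment-based estimators: k ~ (sigma/mean)^-1.086 (Justus approximation)
    and c = mean / Gamma(1 + 1/k); then E[v^3] = c^3 * Gamma(1 + 3/k).
    """
    mean, sigma = speeds.mean(), speeds.std()
    k = (sigma / mean) ** -1.086          # Weibull shape
    c = mean / gamma(1.0 + 1.0 / k)       # Weibull scale (m/s)
    return 0.5 * RHO * c**3 * gamma(1.0 + 3.0 / k)

# Synthetic wind record for one grid cell: Weibull(k=2, c=8 m/s) by inverse transform
rng = np.random.default_rng(0)
u = rng.random(50_000)
v = 8.0 * (-np.log1p(-u)) ** 0.5
E = wind_energy_density(v)   # analytic value for k=2, c=8: about 417 W/m^2
```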

  7. Kinematic Sunyaev-Zel'dovich effect with projected fields. II. Prospects, challenges, and comparison with simulations

    NASA Astrophysics Data System (ADS)

    Ferraro, Simone; Hill, J. Colin; Battaglia, Nick; Liu, Jia; Spergel, David N.

    2016-12-01

    The kinematic Sunyaev-Zel'dovich (kSZ) signal is a powerful probe of the cosmic baryon distribution. The kSZ signal is proportional to the integrated free electron momentum rather than the electron pressure (which sources the thermal SZ signal). Since velocities should be unbiased on large scales, the kSZ signal is an unbiased tracer of the large-scale electron distribution, and thus can be used to detect the "missing baryons" that evade most observational techniques. While most current methods for kSZ extraction rely on the availability of very accurate redshifts, we revisit a method that allows measurements even in the absence of redshift information for individual objects. It involves cross-correlating the square of an appropriately filtered cosmic microwave background (CMB) temperature map with a projected density map constructed from a sample of large-scale structure tracers. We show that this method will achieve high signal-to-noise when applied to the next generation of high-resolution CMB experiments, provided that component separation is sufficiently effective at removing foreground contamination. Considering statistical errors only, we forecast that this estimator can yield S/N ≈ 3, 120, and over 150 for Planck, Advanced ACTPol, and a hypothetical Stage IV CMB experiment, respectively, in combination with a galaxy catalog from WISE, and about 20% larger S/N for a galaxy catalog from the proposed SPHEREx experiment. We show that the basic estimator receives a contribution due to leakage from CMB lensing, but that this term can be effectively removed by either direct measurement or marginalization, with little effect on the kSZ significance. We discuss possible sources of systematic contamination and propose mitigation strategies for future surveys. We compare the theoretical predictions to numerical simulations and validate the approximations in our analytic approach. This work serves as a companion paper to the first kSZ measurement with this method.

  8. Viral membrane fusion.

    PubMed

    Harrison, Stephen C

    2015-05-01

    Membrane fusion is an essential step when enveloped viruses enter cells. Lipid bilayer fusion requires catalysis to overcome a high kinetic barrier; viral fusion proteins are the agents that fulfill this catalytic function. Despite a variety of molecular architectures, these proteins facilitate fusion by essentially the same generic mechanism. Stimulated by a signal associated with arrival at the cell to be infected (e.g., receptor or co-receptor binding, proton binding in an endosome), they undergo a series of conformational changes. A hydrophobic segment (a "fusion loop" or "fusion peptide") engages the target-cell membrane, and collapse of the bridging intermediate thus formed draws the two membranes (virus and cell) together. We know of three structural classes for viral fusion proteins. Structures for both pre- and postfusion conformations illustrate the beginning and end points of a process that can be probed by single-virion measurements of fusion kinetics.

  9. Viral membrane fusion

    PubMed Central

    Harrison, Stephen C.

    2015-01-01

    Membrane fusion is an essential step when enveloped viruses enter cells. Lipid bilayer fusion requires catalysis to overcome a high kinetic barrier; viral fusion proteins are the agents that fulfill this catalytic function. Despite a variety of molecular architectures, these proteins facilitate fusion by essentially the same generic mechanism. Stimulated by a signal associated with arrival at the cell to be infected (e.g., receptor or co-receptor binding, proton binding in an endosome), they undergo a series of conformational changes. A hydrophobic segment (a “fusion loop” or “fusion peptide”) engages the target-cell membrane, and collapse of the bridging intermediate thus formed draws the two membranes (virus and cell) together. We know of three structural classes for viral fusion proteins. Structures for both pre- and postfusion conformations illustrate the beginning and end points of a process that can be probed by single-virion measurements of fusion kinetics. PMID:25866377

  10. Polarimeter for the General Fusion SPECTOR machine

    NASA Astrophysics Data System (ADS)

    Carle, Patrick; Froese, Aaron; Wong, Adrian; Howard, Stephen; O'Shea, Peter; Laberge, Michel

    2016-11-01

    A polarimeter has been designed to measure Faraday rotation and help determine the safety factor (q) profile on the recently built SPECTOR magnetized target fusion machine at General Fusion. The polarimeter uses two counter-rotating, circularly polarized, 118.8 μm beams to probe the plasma. Grad-Shafranov simulations have been used to investigate the effects of measurement error and chord geometry.

  11. The potential effects of tobacco control in China: projections from the China SimSmoke simulation model

    PubMed Central

    Rodríguez-Buño, Ricardo L; Hu, Teh-Wei; Moran, Andrew E

    2014-01-01

    Objective To use a computer simulation model to project the potential impact in China on smoking of full implementation of the tobacco control measures recommended by the World Health Organization Framework Convention on Tobacco Control (FCTC). Design Modelling study. Setting China. Population Males and females aged 15-74 years. Intervention The incremental impact of more complete implementation of WHO FCTC policies was simulated using SimSmoke, a Markov computer simulation model of tobacco smoking prevalence, smoking-attributable deaths, and the impact of tobacco control policies. Data on China's adult population, current and former smoking prevalence, initiation and cessation rates, and past policy levels were entered into SimSmoke in order to predict past smoking rates and to project future status quo rates. The model was validated by comparing predicted smoking prevalence with smoking prevalence measured in tobacco surveys from 1996-2010. Main outcome measures Projected future smoking prevalence and smoking-attributable deaths from 2013-50. Results Status quo tobacco policy simulations projected a decline in smoking prevalence from 51.3% in 2015 to 46.5% by 2050 in males and from 2.1% to 1.3% in females. Of the individual FCTC-recommended tobacco control policies, increasing the tobacco excise tax to 75% of the retail price was projected to be the most effective, incrementally reducing current smoking compared with the status quo by 12.9% by 2050. Complete and simultaneous implementation of all FCTC policies was projected to incrementally reduce smoking by about 40% relative to 2050 status quo levels and to prevent approximately 12.8 million smoking-attributable deaths and 154 million life years lost by 2050. Conclusions Complete implementation of WHO FCTC-recommended policies would prevent more than 12.8 million smoking-attributable deaths in China by 2050. Implementation of FCTC policies would alleviate a substantial portion of the tobacco-related health burden.
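
    SimSmoke is a Markov model of prevalence, and the cohort bookkeeping behind such projections can be illustrated with a toy two-state (smoker/nonsmoker) sketch. The initiation and cessation rates below are hypothetical placeholders, not SimSmoke's calibrated, age-stratified parameters:

```python
def project_prevalence(p0, init_rate, cess_rate, years):
    """Toy Markov projection of smoking prevalence (all rates are annual fractions).

    Each year, a share cess_rate of smokers quit and a share init_rate of
    nonsmokers start; prevalence relaxes toward init/(init + cess).
    """
    p = p0
    trajectory = [p]
    for _ in range(years):
        p = p * (1.0 - cess_rate) + (1.0 - p) * init_rate
        trajectory.append(p)
    return trajectory

# Hypothetical male parameters: 51.3% initial prevalence, 1% initiation, 1.5% cessation
traj = project_prevalence(0.513, 0.01, 0.015, years=35)  # slow decline, as in status quo runs
```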

  12. On pigeons and people: A preliminary look at the columban simulation project

    PubMed Central

    Epstein, Robert

    1981-01-01

    Simulations of complex human behaviors with pigeons are providing plausible environmental accounts of such behaviors, as well as data-based commentaries on non-behavioristic psychology. Behaviors said to show “symbolic communication,” “insight,” “self-awareness,” and the “spontaneous use of memoranda” have thus far been simulated, and other simulations are in progress. PMID:22478538

  13. Cold fusion research

    SciTech Connect

    1989-11-01

    I am pleased to forward to you the Final Report of the Cold Fusion Panel. This report reviews the current status of cold fusion and includes major chapters on Calorimetry and Excess Heat, Fusion Products and Materials Characterization. In addition, the report makes a number of conclusions and recommendations, as requested by the Secretary of Energy.

  14. Magneto-Inertial Fusion

    SciTech Connect

    Wurden, G. A.; Hsu, S. C.; Intrator, T. P.; Grabowski, T. C.; Degnan, J. H.; Domonkos, M.; Turchi, P. J.; Campbell, E. M.; Sinars, D. B.; Herrmann, M. C.; Betti, R.; Bauer, B. S.; Lindemuth, I. R.; Siemon, R. E.; Miller, R. L.; Laberge, M.; Delage, M.

    2015-11-17

    In this community white paper, we describe an approach to achieving fusion which employs a hybrid of elements from the traditional magnetic and inertial fusion concepts, called magneto-inertial fusion (MIF). The status of MIF research in North America at multiple institutions is summarized including recent progress, research opportunities, and future plans.

  15. Virtual Airspace Modeling and Simulation (VAMS) Project First Technical Interchange Meeting

    NASA Technical Reports Server (NTRS)

    Beard, Robert; Kille, Robert; Kirsten, Richard; Rigterink, Paul; Sielski, Henry; Gratteau, Melinda F. (Editor)

    2002-01-01

    A three-day NASA Virtual Airspace Modeling and Simulation (VAMS) Project Technical Interchange Meeting (TIM) was held at the NASA Ames Research Center in Mountain View, CA, from May 21 through May 23, 2002. The purpose of this meeting was to share initial concept information sponsored by the VAMS Project. An overall goal of the VAMS Project is to develop validated, blended, robust, and transitionable air transportation system concepts over the next five years that will achieve NASA's long-term Enterprise Aviation Capacity goals. This document describes the presentations at the TIM and their related questions and answers, and presents the TIM recommendations.

  16. High-Fidelity Simulation Meets Athletic Training Education: An Innovative Collaborative Teaching Project

    ERIC Educational Resources Information Center

    Palmer, Elizabeth; Edwards, Taylor; Racchini, James

    2014-01-01

    High-fidelity simulation is frequently used in nursing education to provide students with simulated experiences prior to and throughout clinical coursework that involves direct patient care. These high-tech exercises take advantage of the benefits of a standardized patient or mock patient encounter, while eliminating some of the drawbacks…

  17. Incorporating Reflective Practice into Team Simulation Projects for Improved Learning Outcomes

    ERIC Educational Resources Information Center

    Wills, Katherine V.; Clerkin, Thomas A.

    2009-01-01

    The use of simulation games in business courses is a popular method for providing undergraduate students with experiences similar to those they might encounter in the business world. As such, in 2003 the authors were pleased to find a classroom simulation tool that combined the decision-making and team experiences of a senior management group with…

  18. Two algorithms to compute projected correlation functions in molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Carof, Antoine; Vuilleumier, Rodolphe; Rotenberg, Benjamin

    2014-03-01

    An explicit derivation of the Mori-Zwanzig orthogonal dynamics of observables is presented and leads to two practical algorithms to compute exactly projected observables (e.g., random noise) and projected correlation function (e.g., memory kernel) from a molecular dynamics trajectory. The algorithms are then applied to study the diffusive dynamics of a tagged particle in a Lennard-Jones fluid, the properties of the associated random noise, and a decomposition of the corresponding memory kernel.
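
    Both algorithms build on correlation functions estimated along a trajectory. As background (this is a standard FFT-based autocorrelation estimator, not the authors' projected-dynamics algorithms), applied to a synthetic velocity series:

```python
import numpy as np

def autocorrelation(x):
    """Normalized autocorrelation <x(0) x(t)> of a 1-D time series via FFT.

    Zero-padding to length 2n avoids the circular wrap-around of the plain
    FFT product; dividing by the number of overlapping pairs unbiases each lag.
    """
    n = len(x)
    x = x - x.mean()
    f = np.fft.rfft(x, n=2 * n)               # zero-padded transform
    acf = np.fft.irfft(f * np.conj(f))[:n]
    acf /= np.arange(n, 0, -1)                # unbiased: divide by pair counts
    return acf / acf[0]

# Ornstein-Uhlenbeck (AR(1)) velocity series: expected ACF decays as (1 - g*dt)**lag
rng = np.random.default_rng(0)
g, dt, nstep = 0.1, 1.0, 200_000
v = np.zeros(nstep)
for i in range(1, nstep):
    v[i] = v[i - 1] * (1.0 - g * dt) + rng.normal(0.0, np.sqrt(dt))
acf = autocorrelation(v)                      # acf[1] should be close to 0.9
```

    Once a correlation function (and the relevant cross-correlations) are in hand, a memory kernel can be extracted by numerically inverting the associated Volterra equation, or computed directly with the orthogonal-dynamics algorithms derived in the paper.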

  19. Magnetized target fusion and fusion propulsion

    NASA Astrophysics Data System (ADS)

    Kirkpatrick, Ronald C.

    2002-01-01

    Magnetized target fusion (MTF) is a thermonuclear fusion concept intermediate between the two mainline approaches, magnetic confinement fusion and inertial confinement fusion (MCF and ICF). MTF incorporates aspects of each and offers advantages over both. First, it reduces the driver power requirements, thereby admitting a wider range of drivers than ICF. Second, the magnetic field is used only for insulation, not confinement; the plasma is wall confined, so that plasma instabilities are traded for hydrodynamic instabilities. However, the degree of compression required to reach fusion conditions is lower than for ICF, so hydrodynamic instabilities are much less threatening. The standoff driver innovation proposes to dynamically form the target plasma and a gaseous shell that compresses and confines it. Fusion target fabrication is thus traded for a multiplicity of plasma guns, which must work in synchrony. The standoff driver embodiment of MTF leads to a fusion propulsion system concept that is potentially compact and lightweight. We discuss the underlying physics of MTF, details of the fusion propulsion concept using the standoff driver approach, and the optimization of an MTF target design for space propulsion.

  20. New Capabilities for Modeling Intense Beams in Heavy Ion Fusion Drivers

    SciTech Connect

    Friedman, A; Barnard, J J; Bieniosek, F M; Celata, C M; Cohen, R H; Davidson, R C; Grote, D P; Haber, I; Henestroza, E; Lee, E P; Lund, S M; Qin, H; Sharp, W M; Startsev, E; Vay, J L

    2003-09-09

    Significant advances have been made in modeling the intense beams of heavy-ion beam-driven Inertial Fusion Energy (Heavy Ion Fusion). In this paper, a roadmap for a validated, predictive driver simulation capability, building on improved codes and experimental diagnostics, is presented, as are examples of progress. The Mesh Refinement and Particle-in-Cell methods were integrated in the WARP code; this capability supported an injector experiment that determined the achievable current rise time, in good agreement with calculations. In a complementary effort, a new injector approach based on the merging of {approx}100 small beamlets was simulated, its basic feasibility established, and an experimental test designed. Time-dependent 3D simulations of the High Current Experiment (HCX) were performed, yielding voltage waveforms for an upcoming study of bunch-end control. Studies of collective beam modes which must be taken into account in driver designs were carried out. The value of using experimental data to tomographically ''synthesize'' a 4D beam particle distribution and so initialize a simulation was established; this work motivated further development of new diagnostics which yield 3D projections of the beam phase space. Other developments, including improved modeling of ion beam focusing and transport through the fusion chamber environment and onto the target, and of stray electrons and their effects on ion beams, are briefly noted.

  1. Phenomenology-Based Inverse Scattering for Sensor Information Fusion

    DTIC Science & Technology

    2006-09-15

    Phenomenology-Based Inverse Scattering for Sensor Information Fusion. Kung-Hau Ding. Final report, 15 September 2006. Hanscom AFB, MA 01731-2909. Approved for public release; distribution unlimited (Statement A). Program element 61102F; project 2304.

  2. Modeling and Simulation of Longitudinal Dynamics for Low Energy Ring_High Energy Ring at the Positron-Electron Project

    SciTech Connect

    Rivetta, Claudio; Mastorides, T.; Fox, J.D.; Teytelman, D.; Van Winkle, D.; /SLAC

    2007-03-06

    A time domain dynamic modeling and simulation tool for beam-cavity interactions in the Low Energy Ring (LER) and High Energy Ring (HER) at the Positron-Electron Project (PEP-II) is presented. Dynamic simulation results for PEP-II are compared to measurements of the actual machine. The motivation for this tool is to explore the stability margins and performance limits of PEP-II radio-frequency (RF) systems at future higher currents and upgraded RF configurations. It also serves as a test bed for new control algorithms and can define the ultimate limits of the low-level RF (LLRF) architecture. The time domain program captures the dynamic behavior of the beam-cavity-LLRF interaction based on a reduced model. The ring current is represented by macrobunches. Multiple RF stations in the ring are represented via one or two macrocavities. Each macrocavity captures the overall behavior of all the two- or four-cavity RF stations. Station models include nonlinear elements in the klystron and signal processing. This enables modeling the principal longitudinal impedance control loops interacting via the longitudinal beam model. The dynamics of the simulation model are validated by comparing the measured growth rates for the LER with simulation results. The simulated behavior of the LER at increased operation currents is presented via low-mode instability growth rates. Different control strategies are compared, and the effects of both the imperfections in the LLRF signal processing and the nonlinear drivers and klystrons are explored.

  3. Evaluation of a Pilot Project to Introduce Simulation-Based Team Training to Pediatric Surgery Trauma Room Care

    PubMed Central

    Heimberg, Ellen; Hoffmann, Florian; Heinzel, Oliver; Kirschner, Hans-Joachim; Heinrich, Martina

    2017-01-01

    Introduction. Several studies in pediatric trauma care have demonstrated substantial deficits in both prehospital and emergency department management. Methods. In February 2015 the PAEDSIM collaborative conducted a one-and-a-half-day interdisciplinary, simulation-based team-training course in a simulated pediatric emergency department. Fourteen physicians from the medical fields of pediatric surgery, pediatric intensive care and emergency medicine, and anesthesia participated, as well as four pediatric nurses. After a theoretical introduction and familiarization with the simulator, course attendees alternately participated in six simulation scenarios and debriefings. Each scenario incorporated elements of pediatric trauma management as well as Crew Resource Management (CRM) educational objectives. Participants completed anonymous pre- and postcourse questionnaires and rated the course itself as well as their own medical qualification and knowledge of CRM. Results. Participants found the course very realistic and the selected scenarios highly relevant to their daily work. They reported a feeling of improved medical and nontechnical skills, and no discomfort during scenarios or debriefings. Conclusion. To our knowledge this pilot project represents the first successful implementation of a simulation-based team-training course focused on pediatric trauma care in German-speaking countries, with good acceptance. PMID:28286528

  4. Viral membrane fusion

    SciTech Connect

    Harrison, Stephen C.

    2015-05-15

    Membrane fusion is an essential step when enveloped viruses enter cells. Lipid bilayer fusion requires catalysis to overcome a high kinetic barrier; viral fusion proteins are the agents that fulfill this catalytic function. Despite a variety of molecular architectures, these proteins facilitate fusion by essentially the same generic mechanism. Stimulated by a signal associated with arrival at the cell to be infected (e.g., receptor or co-receptor binding, proton binding in an endosome), they undergo a series of conformational changes. A hydrophobic segment (a “fusion loop” or “fusion peptide”) engages the target-cell membrane, and collapse of the bridging intermediate thus formed draws the two membranes (virus and cell) together. We know of three structural classes for viral fusion proteins. Structures of both pre- and postfusion conformations illustrate the beginning and end points of a process that can be probed by single-virion measurements of fusion kinetics. - Highlights: • Viral fusion proteins overcome the high energy barrier to lipid bilayer merger. • Different molecular structures but the same catalytic mechanism. • Review describes properties of three known fusion-protein structural classes. • Single-virion fusion experiments elucidate mechanism.

  5. Image computing techniques to extrapolate data for dust tracking in case of an experimental accident simulation in a nuclear fusion plant

    NASA Astrophysics Data System (ADS)

    Camplani, M.; Malizia, A.; Gelfusa, M.; Barbato, F.; Antonelli, L.; Poggi, L. A.; Ciparisse, J. F.; Salgado, L.; Richetta, M.; Gaudio, P.

    2016-01-01

    In this paper, a preliminary shadowgraph-based analysis of dust particle re-suspension due to a loss of vacuum accident (LOVA) in ITER-like nuclear fusion reactors is presented. Dust particles are produced through different mechanisms in nuclear fusion devices, and a key issue is that they can be re-suspended by events such as a LOVA. Shadowgraphy is based on an expanded collimated beam of light, emitted by a laser or a lamp, that propagates transversely to the flow direction. In the STARDUST facility, dust moving in the flow causes variations of refractive index that can be detected using a CCD camera. The STARDUST fast-camera setup makes it possible to detect and track dust particles moving in the vessel and thereby obtain information about the velocity field of the mobilized dust. In particular, the acquired images are processed so that, in each frame, moving dust particles are detected by applying a background subtraction technique based on the mixture-of-Gaussians algorithm. The resulting foreground masks are then filtered with morphological operations. Finally, a multi-object tracking algorithm is used to track the detected particles throughout the experiment. For each particle, a Kalman-filter-based tracker is applied; the particle dynamics are described using position, velocity, and acceleration as state variables. The results demonstrate that it is possible to obtain the dust particles' velocity field during a LOVA by automatically processing the data obtained with the shadowgraph approach.
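    As a rough illustration of the tracking stage described in this abstract (not the STARDUST code itself), a per-particle Kalman filter with position, velocity, and acceleration as state variables can be sketched in NumPy. The noise covariances, initialization, and function names below are assumed values chosen for illustration.

```python
import numpy as np

def make_kalman(dt):
    """Constant-acceleration Kalman model; state [x, y, vx, vy, ax, ay],
    measurements [x, y]. Noise covariances are illustrative assumptions."""
    F = np.eye(6)
    for i in range(2):
        F[i, i + 2] = dt            # position += velocity * dt
        F[i, i + 4] = 0.5 * dt * dt # position += 0.5 * accel * dt^2
        F[i + 2, i + 4] = dt        # velocity += accel * dt
    H = np.zeros((2, 6))
    H[0, 0] = H[1, 1] = 1.0         # we observe position only
    Q = 1e-3 * np.eye(6)            # process noise (assumed)
    R = 1e-2 * np.eye(2)            # measurement noise (assumed)
    return F, H, Q, R

def track(measurements, dt):
    """Run predict/update steps over one particle's detections."""
    F, H, Q, R = make_kalman(dt)
    x = np.zeros(6)
    x[:2] = measurements[0]         # initialize at the first detection
    P = np.eye(6)
    out = []
    for z in measurements:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the new detection
        S = H @ P @ H.T + R
        Kg = P @ H.T @ np.linalg.inv(S)
        x = x + Kg @ (z - H @ x)
        P = (np.eye(6) - Kg @ H) @ P
        out.append(x.copy())
    return np.array(out)
```

    Fed a straight-line track, the filter's velocity estimate converges to the true particle velocity after a few tens of frames.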

  6. Image computing techniques to extrapolate data for dust tracking in case of an experimental accident simulation in a nuclear fusion plant

    SciTech Connect

    Camplani, M.; Malizia, A.; Gelfusa, M.; Poggi, L. A.; Ciparisse, J. F.; Richetta, M.; Gaudio, P.; Barbato, F.; Antonelli, L.; Salgado, L.

    2016-01-15

    In this paper, a preliminary shadowgraph-based analysis of dust particle re-suspension due to a loss of vacuum accident (LOVA) in ITER-like nuclear fusion reactors is presented. Dust particles are produced through different mechanisms in nuclear fusion devices, and a key issue is that they can be re-suspended by events such as a LOVA. Shadowgraphy is based on an expanded collimated beam of light, emitted by a laser or a lamp, that propagates transversely to the flow direction. In the STARDUST facility, dust moving in the flow causes variations of refractive index that can be detected using a CCD camera. The STARDUST fast-camera setup makes it possible to detect and track dust particles moving in the vessel and thereby obtain information about the velocity field of the mobilized dust. In particular, the acquired images are processed so that, in each frame, moving dust particles are detected by applying a background subtraction technique based on the mixture-of-Gaussians algorithm. The resulting foreground masks are then filtered with morphological operations. Finally, a multi-object tracking algorithm is used to track the detected particles throughout the experiment. For each particle, a Kalman-filter-based tracker is applied; the particle dynamics are described using position, velocity, and acceleration as state variables. The results demonstrate that it is possible to obtain the dust particles’ velocity field during a LOVA by automatically processing the data obtained with the shadowgraph approach.

  7. Image computing techniques to extrapolate data for dust tracking in case of an experimental accident simulation in a nuclear fusion plant.

    PubMed

    Camplani, M; Malizia, A; Gelfusa, M; Barbato, F; Antonelli, L; Poggi, L A; Ciparisse, J F; Salgado, L; Richetta, M; Gaudio, P

    2016-01-01

    In this paper, a preliminary shadowgraph-based analysis of dust particle re-suspension due to a loss of vacuum accident (LOVA) in ITER-like nuclear fusion reactors is presented. Dust particles are produced through different mechanisms in nuclear fusion devices, and a key issue is that they can be re-suspended by events such as a LOVA. Shadowgraphy is based on an expanded collimated beam of light, emitted by a laser or a lamp, that propagates transversely to the flow direction. In the STARDUST facility, dust moving in the flow causes variations of refractive index that can be detected using a CCD camera. The STARDUST fast-camera setup makes it possible to detect and track dust particles moving in the vessel and thereby obtain information about the velocity field of the mobilized dust. In particular, the acquired images are processed so that, in each frame, moving dust particles are detected by applying a background subtraction technique based on the mixture-of-Gaussians algorithm. The resulting foreground masks are then filtered with morphological operations. Finally, a multi-object tracking algorithm is used to track the detected particles throughout the experiment. For each particle, a Kalman-filter-based tracker is applied; the particle dynamics are described using position, velocity, and acceleration as state variables. The results demonstrate that it is possible to obtain the dust particles' velocity field during a LOVA by automatically processing the data obtained with the shadowgraph approach.

  8. Three-dimensional numerical reservoir simulation of the EGS Demonstration Project at The Geysers geothermal field

    NASA Astrophysics Data System (ADS)

    Borgia, Andrea; Rutqvist, Jonny; Oldenburg, Curt M.; Hutchings, Lawrence; Garcia, Julio; Walters, Mark; Hartline, Craig; Jeanne, Pierre; Dobson, Patrick; Boyle, Katie

    2013-04-01

    The Enhanced Geothermal System (EGS) Demonstration Project, currently underway at the Northwest Geysers, California, aims to demonstrate the feasibility of stimulating a deep high-temperature reservoir (up to 400 °C) through water injection over a 2-year period. On October 6, 2011, injection of 25 l/s started from the Prati 32 well at a depth interval of 1850-2699 m below sea level. After a period of almost 2 months, the injection rate was raised to 63 l/s. The flow rate was then decreased to 44 l/s after an additional 3.5 months and maintained at 25 l/s up to August 20, 2012. Significant well-head pressure changes were recorded at Prati State 31 well, which is separated from Prati 32 by about 500 m at reservoir level. More subdued pressure increases occur at greater distances. The water injection caused induced seismicity in the reservoir in the vicinity of the well. Microseismic monitoring and interpretation shows that the cloud of seismic events is mainly located in the granitic intrusion below the injection zone, forming a cluster elongated SSE-NNW (azimuth 170°) that dips steeply to the west. In general, the magnitude of the events increases with depth and the hypocenter depth increases with time. This seismic cloud is hypothesized to correlate with enhanced permeability in the high-temperature reservoir and its variation with time. Based on the existing borehole data, we use the GMS™ GUI to construct a realistic three-dimensional (3D) geologic model of the Northwest Geysers geothermal field. This model includes, from the top down, a low permeability graywacke layer that forms the caprock for the reservoir, an isothermal steam zone (known as the normal temperature reservoir) within metagraywacke, a hornfels zone (where the high-temperature reservoir is located), and a felsite layer that is assumed to extend downward to the magmatic heat source. We then map this model onto a rectangular grid for use with the TOUGH2 multiphase, multicomponent, non

  9. The Virtual ChemLab Project: A Realistic and Sophisticated Simulation of Inorganic Qualitative Analysis

    NASA Astrophysics Data System (ADS)

    Woodfield, Brian F.; Catlin, Heidi R.; Waddoups, Gregory L.; Moore, Melissa S.; Swan, Richard; Allen, Rob; Bodily, Greg

    2004-11-01

    We have created a set of sophisticated and realistic laboratory simulations for use in freshman- and sophomore-level chemistry classes and laboratories called Virtual ChemLab. We have completed simulations for Inorganic Qualitative Analysis, Organic Synthesis and Organic Qualitative Analysis, Experiments in Quantum Chemistry, Gas Properties, Titration Experiments, and Calorimetric and Thermochemical Experiments. The purpose of our simulations is to reinforce concepts taught in the classroom, provide an environment for creative learning, and emphasize the thinking behind instructional laboratory experiments. We have used the inorganic simulation extensively with thousands of students in our department at Brigham Young University. We have learned from our evaluation that: (i) students enjoy using these simulations and find them to be an asset in learning effective problem-solving strategies, (ii) students like the fact that they can both reproduce experimental procedures and explore various topics in ways they choose, and (iii) students naturally divide themselves into two groups: creative learners, who excel in an open-ended environment of virtual laboratories, and structured learners, who struggle in this same environment. In this article, we describe the Inorganic Qualitative Analysis simulation; we also share specific evaluation findings from using the inorganic simulation in classroom and laboratory settings.

  10. Magnetic mirror fusion: status and prospects

    SciTech Connect

    Post, R.F.

    1980-02-11

    Two improved mirror systems, the tandem mirror (TM) and the field-reversed mirror (FRM), are being intensively studied. The twin practical aims of these studies are to improve the economic prospects for mirror fusion power plants and to reduce the size and/or complexity of such plants relative to earlier approaches to magnetic fusion. While the program emphasis is at present still strongly oriented toward answering scientific questions, the emphasis is shifting as data accumulate and as larger facilities, ones with a heavy technological and engineering orientation, are prepared. The experimental and theoretical progress that led to the new look in mirror fusion research is briefly reviewed, the new TM and FRM ideas are outlined, and the projected future course of mirror fusion research is discussed.

  11. INTRODUCTION: Status report on fusion research

    NASA Astrophysics Data System (ADS)

    Burkart, Werner

    2005-10-01

    members' personal views on the latest achievements in fusion research, including magnetic and inertial confinement scenarios. The report describes fusion fundamentals and progress in fusion science and technology, with ITER as a possible partner in the realization of self-sustainable burning plasma. The importance of the socio-economic aspects of energy production using fusion power plants is also covered. Noting that applications of plasma science are of broad interest to the Member States, the report addresses the topic of plasma physics to assist in understanding the achievements of better coatings, cheaper light sources, improved heat-resistant materials and other high-technology materials. Nuclear fusion energy production is intrinsically safe, but for ITER the full range of hazards will need to be addressed, including minimising radiation exposure, to accomplish the goal of a sustainable and environmentally acceptable production of energy. We anticipate that the role of the Agency will in future evolve from supporting scientific projects and fostering information exchange to the preparation of safety principles and guidelines for the operation of burning fusion plasmas with a Q > 1. Technical progress in inertial and magnetic confinement, as well as in alternative concepts, will lead to a further increase in international cooperation. New means of communication will be needed, utilizing the best resources of modern information technology to advance interest in fusion. However, today the basis of scientific progress is still through journal publications and, with this in mind, we trust that this report will find an interested readership. We acknowledge with thanks the support of the members of the IFRC as an advisory body to the Agency. 
Seven chairmen have presided over the IFRC since its first meeting in 1971 in Madison, USA, ensuring that the IAEA fusion efforts were based on the best professional advice possible, and that information on fusion developments has

  12. Timber assessment market model (1993): Structure, projections and policy simulations. Forest Service general technical report

    SciTech Connect

    Adams, D.M.; Haynes, R.W.

    1996-11-01

    The 1993 timber assessment market model (TAMM) is a spatial model of the solid-wood and timber inventory elements of the U.S. forest products sector. The TAMM model provides annual projections of volumes and prices in the solid-wood products and sawtimber stumpage markets and estimates of total timber harvest and inventory by geographic region for periods of up to 50 years. TAMM and its companion models that project pulpwood and fuelwood use were developed to support the quinquennial Resource Planning Act (RPA) timber assessments and assessment updates conducted by the USDA Forest Service. The report summarizes the methods used to develop the various components of TAMM and the estimates of key behavioral parameters used in the TAMM structure, and also illustrates the use of TAMM with a base and several scenario projections.

  13. The drive-wise project: driving simulator training increases real driving performance in healthy older drivers

    PubMed Central

    Casutt, Gianclaudio; Theill, Nathan; Martin, Mike; Keller, Martin; Jäncke, Lutz

    2014-01-01

    Background: Age-related cognitive decline is often associated with unsafe driving behavior. We hypothesized that 10 active training sessions in a driving simulator increase cognitive and on-road driving performance. In addition, driving simulator training should outperform cognitive training. Methods: Ninety-one healthy active drivers (62–87 years) were randomly assigned to one of three groups: (1) a driving simulator training group, (2) an attention training group (vigilance and selective attention), or (3) a control group. The main outcome variables were on-road driving and cognitive performance. Seventy-seven participants (85%) completed the training and were included in the analyses. Training gains were analyzed using a multiple regression analysis with planned orthogonal comparisons. Results: The driving simulator training group showed an improvement in on-road driving performance compared to the attention training group. In addition, both training groups increased cognitive performance compared to the control group. Conclusion: Driving simulator training offers the potential to enhance driving skills in older drivers. Compared to attention training, simulator training seems to be a more powerful program for increasing older drivers' safety on the road. PMID:24860497

  14. Engineering Challenges in Antiproton Triggered Fusion Propulsion

    SciTech Connect

    Cassenti, Brice; Kammash, Terry

    2008-01-21

    During the last decade antiproton-triggered fusion propulsion has been investigated as a method for achieving high specific impulse and high thrust in a nuclear pulse propulsion system. In general, the antiprotons are injected into a pellet containing fusion fuel with a small amount of fissionable material (i.e., an amount less than the critical mass), where the products from the fission are then used to trigger a fusion reaction. Initial calculations and simulations indicate that, if magnetically insulated inertial confinement fusion is used, the pellets should yield a specific impulse of between 100,000 and 300,000 seconds at high thrust. The engineering challenges associated with this propulsion system are significant. For example, the antiprotons must be precisely focused. The pellet must be designed to contain the fission and initial fusion products, which will require strong magnetic fields. The fusion fuel must be contained for a sufficiently long time to effectively release the fusion energy, and the payload must be shielded from the radiation, especially the excess neutrons and the many other particles emitted. We review the recent progress, possible engineering solutions, and the potential performance of these systems.

  15. Simulation of five ground-water withdrawal projections for the Black Mesa area, Navajo and Hopi Indian Reservations, Arizona

    USGS Publications Warehouse

    Brown, J.G.; Eychaner, J.H.

    1988-01-01

    The N Aquifer is the main source of water in the 5,400 sq mi Black Mesa area in the Navajo and Hopi Indian Reservations in northeastern Arizona. Water in the aquifer is under confined conditions in the central 3,300 sq mi of the area. Maximum saturated thickness is about 1,050 ft. Annual groundwater withdrawals from 1972 through 1986 averaged 5,480 acre-ft and included 3,820 acre-ft used to operate a coal mine on Black Mesa. As a result, water levels have declined in a large part of the aquifer. The coal company has applied for a permanent permit under the Surface Mining Control and Reclamation Act of 1977. An existing mathematical model of the aquifer in the Black Mesa area was converted to a newer model program and recalibrated by using revised estimates of selected aquifer parameters and a finer spatial grid. The model was used to simulate four groundwater withdrawal alternatives that combined the existing and proposed mining plans with projected constant or increasing pumpage for nearby communities. A fifth alternative combined increasing community pumpage with no mine withdrawals and was used as a basis for comparison. Simulated water levels for the year 2031 in the coal-lease area are projected to be 60 ft lower than in 1985 for the proposed mining plan combined with growing community pumpage, and > 100 ft lower than predevelopment water levels over an area of 1,660 sq mi. Groundwater would rise to within 100 ft of predevelopment levels < 10 yr after mine withdrawals cease. Withdrawals at the mine were a minor factor in determining simulated water levels at most communities in the study area. Water levels at Tuba City were not affected by mine pumpage in any projection. (Author's abstract)

  16. Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-95, with projections to 2020

    USGS Publications Warehouse

    Kernodle, J.M.

    1998-01-01

    The ground-water-flow model of the Albuquerque Basin (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.) was updated to include new information on the hydrogeologic framework (Hawley, J.W., Haase, C.S., and Lozinsky, R.P., 1995, An underground view of the Albuquerque Basin: Proceedings of the 39th Annual New Mexico Water Conference, November 3-4, 1994, p. 37-55). An additional year of ground-water-withdrawal data was appended to the simulation of the historical period and incorporated into the base for future projections to the year 2020. The revised model projects the simulated ground-water levels associated with an areally enlarged occurrence of the relatively high hydraulic conductivity in the upper part of the Santa Fe Group east and west of the Rio Grande in the Albuquerque area and north to Bernalillo. Although the differences between the two model versions are substantial, the revised model does not contradict any previous conclusions about the effect of City of Albuquerque ground-water withdrawals on flow in the Rio Grande or the net benefits of an effort to conserve ground water. Recent revisions to the hydrogeologic model (Hawley, J.W., Haneberg, W.C., and Whitworth, P.M., in press, Hydrogeologic investigations in the Albuquerque Basin, central New Mexico, 1992-1995: Socorro, New Mexico Bureau of Mines and Mineral Resources Open-File Report 402) of the Albuquerque Basin eventually will require that this model version also be revised and updated.

  17. US fusion in crisis as ITER costs soar

    NASA Astrophysics Data System (ADS)

    Clery, Daniel

    2013-04-01

    The future of magnetic fusion research in the US is looking increasingly bleak, as government funding chaos caused by a budget stalemate in Congress, together with the rising cost of building the ITER international fusion project in Cadarache, France, puts a squeeze on the country's domestic programme.

  18. The Fight for Fusion: A Modern Nuclear War.

    ERIC Educational Resources Information Center

    Rogers, Adam; Sereda, David

    1992-01-01

    Describes the work of Bogdan Maglich with helium-based fusion and barriers to its development resulting from lack of government support, competition for funding, and political pet projects. Compares tritium-based to helium-based fusion and the potential for nonradioactive nuclear power to supply the world's energy requirements with no negative…

  19. Materials research for fusion

    NASA Astrophysics Data System (ADS)

    Knaster, J.; Moeslang, A.; Muroga, T.

    2016-05-01

    Fusion materials research started in the early 1970s following the observation of the degradation of irradiated materials used in the first commercial fission reactors. The technological challenges of fusion energy are intimately linked with the availability of suitable materials capable of reliably withstanding the extremely severe operational conditions of fusion reactors. Although fission and fusion materials exhibit common features, fusion materials research is broader. The harder mono-energetic spectrum associated with the deuterium-tritium fusion neutrons (14.1 MeV compared to <2 MeV on average for fission neutrons) releases significant amounts of hydrogen and helium as transmutation products that might lead to a (at present undetermined) degradation of structural materials after a few years of operation. Overcoming the historical lack of a fusion-relevant neutron source for materials testing is an essential pending step in fusion roadmaps. Structural materials development, together with research on functional materials capable of sustaining unprecedented power densities during plasma operation in a fusion reactor, have been the subject of decades of worldwide research efforts underpinning the present maturity of the fusion materials research programme.

  20. Winter warming and summer monsoon reduction after volcanic eruptions in Coupled Model Intercomparison Project 5 (CMIP5) simulations

    NASA Astrophysics Data System (ADS)

    Zambri, Brian; Robock, Alan

    2016-10-01

    Though previous studies have shown that state-of-the-art climate models are rather imperfect in their simulations of the climate response to large volcanic eruptions, the results depend on how the analyses were done. Observations show that all recent large tropical eruptions were followed by winter warming in the first Northern Hemisphere (NH) winter after the eruption, with little such response in the second winter, yet a number of the evaluations have combined the first and second winters. We have looked at just the first winter after large eruptions since 1850 in the Coupled Model Intercomparison Project 5 historical simulations and find that most models do produce a winter warming signal, with warmer temperatures over NH continents and a stronger polar vortex in the lower stratosphere. We also examined NH summer precipitation responses in the first year after these large volcanic eruptions and find clear reductions of summer monsoon rainfall.
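    The compositing choice the abstract emphasizes, averaging only the first post-eruption NH winter rather than the first and second together, reduces to a simple superposed-epoch calculation. A minimal sketch (function name and data layout are assumptions, not the paper's code):

```python
import numpy as np

def first_winter_composite(years, anomaly, eruption_years):
    """Average the anomaly in the first winter following each eruption,
    i.e. a superposed-epoch analysis restricted to lag +1 year."""
    years = np.asarray(years)
    anomaly = np.asarray(anomaly)
    picks = [anomaly[np.where(years == ey + 1)[0][0]]
             for ey in eruption_years if ey + 1 in years]
    return float(np.mean(picks))
```

    Applied per model and per variable (surface temperature, polar-vortex strength, monsoon rainfall), this isolates the first-winter signal that second-winter averaging would dilute.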

  1. Validation of CME Detection Software (CACTus) by Means of Simulated Data, and Analysis of Projection Effects on CME Velocity Measurements

    NASA Astrophysics Data System (ADS)

    Bonte, K.; Jacobs, C.; Robbrecht, E.; de Groof, A.; Berghmans, D.; Poedts, S.

    2011-05-01

    In the context of space weather forecasting, automated detection of coronal mass ejections (CMEs) becomes more and more important for efficiently handling the large data flow expected from recently launched and future solar missions. In this paper we validate the detection software package "CACTus" by applying the program to synthetic data from our 3D time-dependent CME simulations instead of observational data. The main strength of this study is that we know in advance what should be detected. We describe the sensitivities and strengths of automated detection, specifically for the CACTus program, resulting in a better understanding of CME detection on the one hand and the calibration of the CACTus software on the other, suggesting possible improvements to the package. In addition, the simulation is an ideal tool to investigate projection effects on CME velocity measurements.
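    The projection effect on CME velocity measurements has a simple first-order geometry: for a CME propagating radially, a coronagraph measures only the plane-of-sky component of the velocity, which underestimates the true speed by the sine of the angle between the CME direction and the Sun-observer line. A minimal sketch of this relation (assuming purely radial propagation):

```python
import math

def plane_of_sky_speed(v_true, angle_deg):
    """Apparent CME speed in a coronagraph image for a radially
    propagating CME, where angle_deg is the angle between the CME
    direction and the Sun-observer line (90 deg = at the limb)."""
    return v_true * math.sin(math.radians(angle_deg))
```

    A halo CME aimed near the observer (small angle) thus appears much slower than its true radial speed, which is exactly the bias that synthetic data with a known input velocity can quantify.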

  2. Reconfigurable computing for Monte Carlo simulations: Results and prospects of the Janus project

    NASA Astrophysics Data System (ADS)

    Baity-Jesi, M.; Baños, R. A.; Cruz, A.; Fernandez, L. A.; Gil-Narvion, J. M.; Gordillo-Guerrero, A.; Guidetti, M.; Iñiguez, D.; Maiorano, A.; Mantovani, F.; Marinari, E.; Martin-Mayor, V.; Monforte-Garcia, J.; Muñoz Sudupe, A.; Navarro, D.; Parisi, G.; Pivanti, M.; Perez-Gaviro, S.; Ricci-Tersenghi, F.; Ruiz-Lorenzo, J. J.; Schifano, S. F.; Seoane, B.; Tarancon, A.; Tellez, P.; Tripiccione, R.; Yllanes, D.

    2012-08-01

    We describe Janus, a massively parallel FPGA-based computer optimized for the simulation of spin glasses, theoretical models for the behavior of glassy materials. FPGAs (as compared to GPUs or many-core processors) provide a complementary approach to massively parallel computing. In particular, our model problem is formulated in terms of binary variables, and floating-point operations can be (almost) completely avoided. The FPGA architecture allows us to run many independent threads with almost no latency in memory access, updating up to 1024 spins per cycle. We describe Janus in detail and summarize the physics results obtained in four years of operation of this machine; we discuss two types of physics applications: long simulations on very large systems (which try to mimic, and provide understanding of, the experimental non-equilibrium dynamics), and low-temperature equilibrium simulations using an artificial parallel tempering dynamics. The time scale of our non-equilibrium simulations spans eleven orders of magnitude (from picoseconds to a tenth of a second). Our equilibrium simulations, in turn, are unprecedented both for the low temperatures reached and for the large systems brought to equilibrium. A finite-time scaling ansatz emerges from the detailed comparison of the two sets of simulations. Janus has made it possible to perform spin-glass simulations that would take several decades on more conventional architectures. The paper ends with an assessment of the potential of possible future versions of the Janus architecture, based on state-of-the-art technology.
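The model class Janus targets, an Edwards-Anderson spin glass, is defined entirely by binary spins and binary couplings, which is what lets FPGAs avoid floating point. The sketch below is a plain-Python Metropolis update for the 2D case, an illustration of the update rule only, not the Janus FPGA implementation (and vastly slower):

```python
import numpy as np

def metropolis_sweep(spins, Jh, Jv, beta, rng):
    """One Metropolis sweep of a 2D Edwards-Anderson spin glass with
    periodic boundaries. spins: LxL array of +/-1; Jh[i, j] couples
    site (i, j) to its right neighbour, Jv[i, j] to its lower neighbour."""
    L = spins.shape[0]
    for i in range(L):
        for j in range(L):
            # local field on site (i, j) from its four bonds
            h = (Jh[i, j] * spins[i, (j + 1) % L]
                 + Jh[i, (j - 1) % L] * spins[i, (j - 1) % L]
                 + Jv[i, j] * spins[(i + 1) % L, j]
                 + Jv[(i - 1) % L, j] * spins[(i - 1) % L, j])
            dE = 2.0 * spins[i, j] * h  # energy change if this spin flips
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] = -spins[i, j]

rng = np.random.default_rng(42)
L = 16
spins = rng.choice(np.array([-1, 1]), size=(L, L))
Jh = rng.choice(np.array([-1, 1]), size=(L, L))  # quenched random bonds
Jv = rng.choice(np.array([-1, 1]), size=(L, L))
for _ in range(10):
    metropolis_sweep(spins, Jh, Jv, beta=0.5, rng=rng)
```

Because spins and couplings are single bits, a hardware implementation can pack many of them per word and evaluate the accept/reject step with lookup tables, which is the essence of the 1024-spins-per-cycle figure quoted above.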

  3. Plasma asymmetry due to the magnetic filter in fusion-type negative ion sources: Comparisons between two and three-dimensional particle-in-cell simulations

    SciTech Connect

    Fubiani, G.; Boeuf, J. P.

    2014-07-15

    Previously reported 2D Particle-In-Cell Monte Carlo Collisions (PIC-MCC) simulations of negative ion sources under conditions similar to those of the ITER neutral beam injection system have shown that the presence of the magnetic filter tends to generate asymmetry in the plasma properties in the extraction region. In this paper, we show that these conclusions are confirmed by 3D PIC-MCC simulations and we provide quantitative comparisons between the 2D and 3D model predictions.

  4. Simulation of a Forensic Chemistry Problem: A Multidisciplinary Project for Secondary School Chemistry Students.

    ERIC Educational Resources Information Center

    Long, G. A.

    1995-01-01

    Describes a project that uses a multidisciplinary approach to problem solving in analyzing a crime scene and suspect evidence. Requires each student to work effectively in a team, communicate in both written and oral forms, perform hands-on laboratory manipulations, and realize that the entire class was depending on their individual contributions…

  5. A Tire Gasification Senior Design Project That Integrates Laboratory Experiments and Computer Simulation

    ERIC Educational Resources Information Center

    Weiss, Brian; Castaldi, Marco J.

    2006-01-01

    A reactor to convert waste rubber tires to useful products, such as CO and H2, was investigated in a university undergraduate design project. The student worked individually, with mentorship from a faculty professor who provided professional critique. The student was able to research the background of the field and conceive of a novel…

  6. The QuakeSim Project: Numerical Simulations for Active Tectonic Processes

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay; Lyzenga, Greg; Granat, Robert; Fox, Geoffrey; Pierce, Marlon; Rundle, John; McLeod, Dennis; Grant, Lisa; Tullis, Terry

    2004-01-01

    In order to develop a solid earth science framework for understanding and studying active tectonic and earthquake processes, this task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling, data manipulation, and pattern recognition technologies. We develop clearly defined, accessible data formats and code protocols as inputs to the simulations. These are adapted to high-performance computers because the solid earth system is extremely complex and nonlinear, resulting in computationally intensive problems with millions of unknowns. With these tools it will be possible to construct the more complex models and simulations necessary to develop hazard assessment systems critical for reducing future losses from major earthquakes.

  7. Muon Catalyzed Fusion

    NASA Technical Reports Server (NTRS)

    Armour, Edward A.G.

    2007-01-01

    Muon catalyzed fusion is a process in which a negatively charged muon combines with two nuclei of isotopes of hydrogen, e.g., a proton and a deuteron or a deuteron and a triton, to form a muonic molecular ion in which the binding is so tight that nuclear fusion occurs. The muon is normally released after fusion has taken place and so can catalyze further fusions. As the muon has a mean lifetime of 2.2 microseconds, this is the maximum period over which a muon can participate in this process. This article gives an outline of the history of muon catalyzed fusion from 1947, when it was first realised that such a process might occur, to the present day. It includes a description of the contribution that Drachman has made to the theory of muon catalyzed fusion and the influence this has had on the author's research.
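The 2.2-microsecond lifetime caps how many fusions a single muon can catalyze: in each catalysis cycle the muon is lost either by decay or by sticking to the alpha particle produced. A back-of-the-envelope estimate follows; the cycle time and sticking probability are assumed round numbers for illustration, not values from the article:

```python
MUON_LIFETIME = 2.2e-6   # s, mean muon lifetime (given in the article)
cycle_time = 1.0e-8      # s, assumed time per d-t catalysis cycle (illustrative)
sticking = 0.005         # assumed probability the muon sticks to the alpha

# Per cycle the muon is lost by sticking (probability `sticking`) or by
# decay (probability ~ cycle_time / MUON_LIFETIME); the expected number
# of fusions is the inverse of the total per-cycle loss probability.
loss_per_cycle = sticking + cycle_time / MUON_LIFETIME
expected_fusions = 1.0 / loss_per_cycle
```

With these assumed parameters the muon catalyzes on the order of a hundred fusions before it is lost, which is why the sticking probability, not just the lifetime, limits the energy yield per muon.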

  8. Multi-Scale Simulations of Past and Future Projections of Hydrology in Lake Tahoe Basin, California-Nevada (Invited)

    NASA Astrophysics Data System (ADS)

    Niswonger, R. G.; Huntington, J. L.; Dettinger, M. D.; Rajagopal, S.; Gardner, M.; Morton, C. G.; Reeves, D. M.; Pohll, G. M.

    2013-12-01

    Water resources in the Tahoe basin are susceptible to long-term climate change and extreme events because it is a middle-altitude, snow-dominated basin that experiences large inter-annual climate variations. Lake Tahoe provides critical water supply for its basin and downstream populations, but changes in water supply are obscured by complex climatic and hydrologic gradients across the high-relief, geologically complex basin. An integrated surface-water and groundwater model of the Lake Tahoe basin has been developed using GSFLOW to assess the effects of climate change and extreme events on surface and groundwater resources. Key hydrologic mechanisms identified with this model explain recent changes in the water resources of the region. Critical vulnerabilities of regional water supplies and hazards were also explored. Maintaining a balance between (a) accurate representation of spatial features (e.g., geology, streams, and topography) and hydrologic response (i.e., groundwater, stream, lake, and wetland flows and storages) and (b) computational efficiency is a necessity for the desired model applications. Potential climatic influences on water resources are analyzed here in simulations of long-term water availability and flood responses to selected 100-year climate-model projections. GSFLOW is also used to simulate a scenario depicting an especially extreme storm event, constructed from a combination of two historical atmospheric-river storms as part of the USGS MultiHazards Demonstration Project. Simulated groundwater levels, streamflow, wetlands, and lake levels compare well with measured values over a 30-year historical simulation period. Results are consistent for both small and large model grid-cell sizes, owing to the model's ability to represent water-table altitude, streams, and other hydrologic features at the sub-grid scale. Simulated hydrologic responses are affected by climate change, where less groundwater resources will be

  9. A Simulation-Based LED Design Project in Photonics Instruction Based on Industry-University Collaboration

    ERIC Educational Resources Information Center

    Chang, S. -H.; Chen, M. -L.; Kuo, Y. -K.; Shen, Y. -C.

    2011-01-01

    In response to the growing industrial demand for light-emitting diode (LED) design professionals, based on industry-university collaboration in Taiwan, this paper develops a novel instructional approach: a simulation-based learning course with peer assessment to develop students' professional skills in LED design as required by industry as well as…

  10. Compositional simulation and performance analysis of the Prudhoe Bay miscible gas project

    SciTech Connect

    McGuire, P.L.; Moritz, A.L., Jr.

    1992-08-01

    This paper reports that a pseudocomponent method was developed to use fully compositional reservoir simulation results in the interpretation of separator gas samples. The interpretation provided insight into actual EOR performance by quantifying solvent breakthrough and production rates. Field examples of various reservoir mechanisms affecting the efficiency of Prudhoe Bay EOR are examined.

  11. The Search and Screen Committee: A Simulation. Equity for Women in Higher Education Project.

    ERIC Educational Resources Information Center

    Carroll, Mary R.; And Others

    A simulated search and screen committee activity for selecting three candidates for final interviews for the position of assistant professor of higher education in a College of Education is presented. The institution, "Metropolitan State University," and the position are briefly described, and the process of the appointment of the…

  12. The Numerical Tokamak Project (NTP) simulation of turbulent transport in the core plasma: A grand challenge in plasma physics

    SciTech Connect

    Not Available

    1993-12-01

    The long-range goal of the Numerical Tokamak Project (NTP) is the reliable prediction of tokamak performance using physics-based numerical tools describing tokamak physics. The NTP is developing the most advanced particle and extended-fluid models on massively parallel processing (MPP) environments as part of a multi-institutional, multi-disciplinary numerical study of tokamak core fluctuations. The NTP is a continuing focus of the Office of Fusion Energy's theory and computation program. Near-term HPCC work concentrates on developing a predictive numerical description of core plasma transport in tokamaks driven by low-frequency collective fluctuations. This work addresses one of the greatest intellectual challenges to our understanding of the physics of tokamak performance and needs the most advanced computational resources to progress. We are conducting detailed comparisons of kinetic and fluid numerical models of tokamak turbulence. These comparisons are stimulating the improvement of each and the development of hybrid models that embody aspects of both. The combination of emerging massively parallel processing hardware and algorithmic improvements will result in an estimated 10^2-10^6 performance increase. Development of information processing and visualization tools is accelerating our comparison of computational models to one another, to experimental data, and to analytical theory, providing a bootstrap effect in our understanding of the target physics. The measure of success is the degree to which the experimentally observed scaling of fluctuation-driven transport may be predicted numerically. The NTP is advancing the HPCC Initiative through its state-of-the-art computational work. We are pushing the capability of high-performance computing through our efforts, which are strongly leveraged by OFE support.

  13. FANS Simulation of Propeller Wash at Navy Harbors (ESTEP Project ER-201031)

    DTIC Science & Technology

    2016-08-01

    support of the Environmental Security Technology Certification Program (ESTEP) Project ER-201-031 by the Environmental Sciences Branch (Code 71750) of ... the Advanced Systems and Applied Sciences Division (Code 71700), Space and Naval Warfare Systems Center Pacific (SSC Pacific), San Diego, CA; and the ... this study, the Finite-Analytic Navier–Stokes code was employed to solve the Reynolds-Averaged Navier–Stokes equations in conjunction with advanced ...

  14. Community Project for Accelerator Science and Simulation (ComPASS) Final Report

    SciTech Connect

    Cary, John R.; Cowan, Benjamin M.; Veitzer, S. A.

    2016-03-04

    Tech-X participated across the full range of ComPASS activities, with efforts in the Energy Frontier primarily through modeling of laser plasma accelerators and dielectric laser acceleration, in the Intensity Frontier primarily through electron cloud modeling, and in Uncertainty Quantification being applied to dielectric laser acceleration. In the following we present the progress and status of our activities for the entire period of the ComPASS project for the different areas of Energy Frontier, Intensity Frontier and Uncertainty Quantification.

  15. Simulation of the Ground-Water Flow System in 1992, and Simulated Effects of Projected Ground-Water Withdrawals in 2020 in the New Jersey Coastal Plain

    USGS Publications Warehouse

    Gordon, Alison D.

    2003-01-01

    In 1992, ground-water withdrawals from the unconfined and confined aquifers in the New Jersey Coastal Plain totaled about 300 million gallons per day, and about 70 percent (200 million gallons per day) of this water was pumped from confined aquifers. The withdrawals have created large cones of depression in several Coastal Plain aquifers near populated areas, particularly in Camden and Ocean Counties. The continued decline of water levels in confined aquifers could cause saltwater intrusion, reduction of stream discharge near the outcrop areas of these aquifers, and depletion of the ground-water supply. Because of this, withdrawals from wells located within these critical areas have been reduced in the Potomac-Raritan-Magothy aquifer system, the Englishtown aquifer system, and the Wenonah-Mount Laurel aquifer. A computer-based model that simulates freshwater and saltwater flow was used to simulate transient ground-water flow conditions and the location of the freshwater-saltwater interface during 1989-92 in the New Jersey Coastal Plain. This simulation was used as the baseline for comparison of water levels and flow budgets. Four hypothetical withdrawal scenarios were simulated in which ground-water withdrawals were either increased or decreased. In scenario 1, withdrawals from wells located within critical area 2 in the Potomac-Raritan-Magothy aquifer system were reduced by amounts ranging from 0 to 35 percent of withdrawals prior to 1992. Critical area 2 is mainly located in Camden County and most of Burlington and Gloucester Counties. With the reductions, water levels recovered about 30 feet in the regional cone of depression centered in Camden County in the Upper Potomac-Raritan-Magothy aquifer and by 20 feet in the Lower and Middle Potomac-Raritan-Magothy aquifers. In scenarios 2 to 4, withdrawals projected for 2020 were input to the model. In scenario 2, withdrawal restrictions within the critical areas were imposed in the Potomac-Raritan-Magothy aquifer

  16. Simulating the Cranfield geological carbon sequestration project with high-resolution static models and an accurate equation of state

    SciTech Connect

    Soltanian, Mohamad Reza; Amooie, Mohammad Amin; Cole, David R.; Graham, David E.; Hosseini, Seyyed Abolfazl; Hovorka, Susan; Pfiffner, Susan M.; Phelps, Tommy Joe; Moortgat, Joachim

    2016-10-11

    In this study, a field-scale carbon dioxide (CO2) injection pilot project was conducted as part of the Southeast Regional Sequestration Partnership (SECARB) at Cranfield, Mississippi. We present higher-order finite element simulations of the compositional two-phase CO2-brine flow and transport during the experiment. High-resolution static models of the formation geology in the Detailed Area Study (DAS), located below the oil-water contact (brine saturated), are used to capture the impact of connected flow paths on breakthrough times in two observation wells. Phase behavior is described by the cubic-plus-association (CPA) equation of state, which takes into account the polar nature of water molecules. Parameter studies are performed to investigate the importance of Fickian diffusion, permeability heterogeneity, relative permeabilities, and capillarity. Simulation results for the pressure response in the injection well and the CO2 breakthrough times at the observation wells show good agreement with the field data. For the high injection rates and short duration of the experiment, diffusion is relatively unimportant (high Péclet numbers), while relative permeabilities have a profound impact on the pressure response. High-permeability pathways, created by fluvial deposits, strongly affect the CO2 transport and highlight the importance of properly characterizing the formation heterogeneity in future carbon sequestration projects.
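The Péclet-number argument above (advection dominating diffusion at high injection rates) is easy to illustrate; the velocity, length scale, and diffusivity below are assumed round numbers for a high-rate injection, not values from the study:

```python
def peclet(velocity, length, diffusivity):
    """Peclet number Pe = v * L / D: ratio of advective to diffusive
    transport rates over the length scale L."""
    return velocity * length / diffusivity

# assumed values: ~0.1 mm/s interstitial velocity, ~10 m inter-well
# scale, ~1e-9 m^2/s molecular diffusivity in brine
pe = peclet(1e-4, 10.0, 1e-9)
```

With these assumptions Pe is of order 10^6, so Fickian diffusion contributes negligibly to breakthrough times, consistent with the parameter study's conclusion.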

  17. Simulating the Cranfield geological carbon sequestration project with high-resolution static models and an accurate equation of state

    DOE PAGES

    Soltanian, Mohamad Reza; Amooie, Mohammad Amin; Cole, David R.; ...

    2016-10-11

    In this study, a field-scale carbon dioxide (CO2) injection pilot project was conducted as part of the Southeast Regional Sequestration Partnership (SECARB) at Cranfield, Mississippi. We present higher-order finite element simulations of the compositional two-phase CO2-brine flow and transport during the experiment. High-resolution static models of the formation geology in the Detailed Area Study (DAS), located below the oil-water contact (brine saturated), are used to capture the impact of connected flow paths on breakthrough times in two observation wells. Phase behavior is described by the cubic-plus-association (CPA) equation of state, which takes into account the polar nature of water molecules. Parameter studies are performed to investigate the importance of Fickian diffusion, permeability heterogeneity, relative permeabilities, and capillarity. Simulation results for the pressure response in the injection well and the CO2 breakthrough times at the observation wells show good agreement with the field data. For the high injection rates and short duration of the experiment, diffusion is relatively unimportant (high Péclet numbers), while relative permeabilities have a profound impact on the pressure response. High-permeability pathways, created by fluvial deposits, strongly affect the CO2 transport and highlight the importance of properly characterizing the formation heterogeneity in future carbon sequestration projects.

  18. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    SciTech Connect

    Spentzouris, Panagiotis; Cary, John; Mcinnes, Lois Curfman; Mori, Warren; Ng, Cho; Ng, Esmond; Ryne, Robert; /LBL, Berkeley

    2008-07-01

    The design and performance optimization of particle accelerators is essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC1 Accelerator Science and Technology project, the SciDAC2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multi-physics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  19. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    SciTech Connect

    Spentzouris, Panagiotis; Cary, John; Mcinnes, Lois Curfman; Mori, Warren; Ng, Cho; Ng, Esmond; Ryne, Robert; /LBL, Berkeley

    2011-10-21

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  20. The Dust Management Project: Characterizing Lunar Environments and Dust, Developing Regolith Mitigation Technology and Simulants

    NASA Technical Reports Server (NTRS)

    Hyatt, Mark J.; Straka, Sharon A.

    2010-01-01

    A return to the Moon to extend human presence, pursue scientific activities, use the Moon to prepare for future human missions to Mars, and expand Earth's economic sphere will require investment in developing new technologies and capabilities to achieve affordable and sustainable human exploration. From the operational experience gained and lessons learned during the Apollo missions, conducting long-term operations in the lunar environment will be a particular challenge, given the difficulties presented by the unique physical properties and other characteristics of lunar regolith, including dust. The Apollo missions and other lunar explorations have identified significant lunar dust-related problems that will challenge future mission success. Composed of regolith particles ranging in size from tens of nanometers to microns, lunar dust is a manifestation of the complex interaction of the lunar soil with multiple mechanical, electrical, and gravitational effects. The environmental and anthropogenic factors affecting the perturbation, transport, and deposition of lunar dust must be studied in order to mitigate its potentially harmful effects on exploration systems and human explorers. The Dust Management Project (DMP) is tasked with the evaluation of lunar dust effects, assessment of the resulting risks, and development of mitigation and management strategies and technologies related to Exploration Systems architectures. To this end, the DMP supports the overall goal of the Exploration Technology Development Program (ETDP) of addressing the relevant high-priority technology needs of multiple elements within the Constellation Program (CxP) and sister ETDP projects. Project scope, plans, and accomplishments will be presented.

  1. Magnetic-confinement fusion

    NASA Astrophysics Data System (ADS)

    Ongena, J.; Koch, R.; Wolf, R.; Zohm, H.

    2016-05-01

    Our modern society requires environmentally friendly solutions for energy production. Energy can be released not only from the fission of heavy nuclei but also from the fusion of light nuclei. Nuclear fusion is an important option for a clean and safe solution for our long-term energy needs. The extremely high temperatures required for the fusion reaction are routinely realized in several magnetic-fusion machines. Since the early 1990s, up to 16 MW of fusion power has been released in pulses of a few seconds, corresponding to a power multiplication close to break-even. Our understanding of the very complex behaviour of a magnetized plasma at temperatures between 150 and 200 million °C surrounded by cold walls has also advanced substantially. This steady progress has resulted in the construction of ITER, a fusion device with a planned fusion power output of 500 MW in pulses of 400 s. ITER should provide answers to remaining important questions on the integration of physics and technology, through a full-size demonstration of a tenfold power multiplication, and on nuclear safety aspects. Here we review the basic physics underlying magnetic fusion: past achievements, present efforts and the prospects for future production of electrical energy. We also discuss questions related to the safety, waste management and decommissioning of a future fusion power plant.

  2. Downscaling seasonal to centennial simulations on distributed computing infrastructures using WRF model. The WRF4G project

    NASA Astrophysics Data System (ADS)

    Cofino, A. S.; Fernández Quiruelas, V.; Blanco Real, J. C.; García Díez, M.; Fernández, J.

    2013-12-01

    Nowadays, Grid Computing is a powerful computational tool that is ready to be used by the scientific community in different areas (such as biomedicine, astrophysics, climate, etc.). However, the use of these distributed computing infrastructures (DCIs) is not yet common practice in climate research, and only a few teams and applications in this area take advantage of them. Thus, the objective of the WRF4G project is to popularize the use of this technology in the atmospheric sciences. To achieve this objective, one of the most widely used applications has been taken (WRF, a limited-area model and successor of the MM5 model), which has a user community of more than 8000 researchers worldwide. This community develops its research activity in different areas and could benefit from the advantages of Grid resources (case-study simulations, regional hind-casts/forecasts, sensitivity studies, etc.). The WRF model is used by many groups in the climate research community to carry out downscaling simulations, so this community will also benefit. However, Grid infrastructures have some drawbacks for the execution of applications that make intensive use of CPU and memory for long periods of time, which makes it necessary to develop a specific framework (middleware). This middleware encapsulates the application and provides appropriate services for monitoring and managing the simulations and the data. Thus, another objective of the WRF4G project is the development of a generic adaptation of WRF to DCIs. It should simplify access to the DCIs for researchers and free them from the technical and computational aspects of using these DCIs. Finally, in order to demonstrate the ability of WRF4G to solve actual scientific challenges of interest and relevance to climate science (implying a high computational cost), we show results from different kinds of downscaling experiments, like ERA-Interim re-analysis, CMIP5 models

  3. Overview of Theory and Modeling in the Heavy Ion Fusion Virtual National Laboratory

    SciTech Connect

    Davidson, R. C.; Kaganovich, I. D.; Lee, W. W.; Qin, H.; Startsev, E. A.; Tzenov, S; Friedman, A; Barnard, J J; Cohen, R H; Grote, D P; Lund, S M; Sharp, W M; Henestroza, E; Lee, E P; Yu, S S; Vay, J -L; Welch, D R; Rose, D V; Olson, C L; Celata, C. M.

    2003-04-09

    This paper presents analytical and simulation studies of intense heavy ion beam propagation, including the injection, acceleration, transport and compression phases, and beam transport and focusing in background plasma in the target chamber. Analytical theory and simulations that support the High Current Experiment (HCX), the Neutralized Transport Experiment (NTX), and the advanced injector development program are being used to provide a basic understanding of the nonlinear beam dynamics and collective processes, and to develop design concepts for the next-step Integrated Beam Experiment (IBX), an Integrated Research Experiment (IRE), and a heavy ion fusion driver. Three-dimensional (3-D) nonlinear perturbative simulations have been applied to collective instabilities driven by beam temperature anisotropy and to two-stream interactions between the beam ions and any unwanted background electrons. Three-dimensional particle-in-cell simulations of the 2 MV Electrostatic Quadrupole (ESQ) injector have clarified the influence of pulse rise time. Analytical studies and simulations of the drift compression process have been carried out. Syntheses of a four-dimensional (4-D) particle distribution function from phase-space projections have been developed. Studies of the generation and trapping of stray electrons in the beam self-fields have also been performed. Particle-in-cell simulations, involving preformed plasma, are being used to study the influence of charge and current neutralization on the focusing of the ion beam in the Neutralized Transport Experiment and in a fusion chamber.

  4. Overview of theory and modeling in the Heavy Ion Fusion Virtual National Laboratory

    SciTech Connect

    Davidson, R.C.; Kaganovich, I.D.; Lee, W.W.; Qin, H.; Startsev, E.A.; Tzenov, S.; Friedman, A.; Barnard, J.J.; Cohen, R.H.; Grote, D.P.; Lund, S.M.; Sharp, W.M.; Celata, C.M.; de Hoon, M.; Henestroza, E.; Lee, E.P.; Yu, S.S.; Vay, J-L.; Welch, D.R.; Rose, D.V.; Olson, C.L.

    2002-05-01

    This paper presents analytical and simulation studies of intense heavy ion beam propagation, including the injection, acceleration, transport and compression phases, and beam transport and focusing in background plasma in the target chamber. Analytical theory and simulations that support the High Current Experiment (HCX), the Neutralized Transport Experiment (NTX), and the advanced injector development program, are being used to provide a basic understanding of the nonlinear beam dynamics and collective processes, and to develop design concepts for the next-step Integrated Beam Experiment (IBX), an Integrated Research Experiment (IRE), and a heavy ion fusion driver. 3-D nonlinear perturbative simulations have been applied to collective instabilities driven by beam temperature anisotropy, and to two-stream interactions between the beam ions and any unwanted background electrons; 3-D particle-in-cell simulations of the 2 MV Electrostatic Quadrupole (ESQ) injector have clarified the influence of pulse rise time; analytical studies and simulations of the drift compression process have been carried out; syntheses of a 4-D particle distribution function from phase-space projections have been developed; and studies of the generation and trapping of stray electrons in the beam self fields have been performed. Particle-in-cell simulations, involving pre-formed plasma, are being used to study the influence of charge and current neutralization on the focusing of the ion beam in NTX and in a fusion chamber.

  5. Acoustically Driven Magnetized Target Fusion At General Fusion: An Overview

    NASA Astrophysics Data System (ADS)

    O'Shea, Peter; Laberge, M.; Donaldson, M.; Delage, M.; the General Fusion Team

    2016-10-01

    Magnetized Target Fusion (MTF) involves compressing an initial magnetically confined plasma of about 1e23 m-3, 100 eV, 7 T, 20 cm radius, and >100 μs lifetime by a factor of 1000 in volume in 100 microseconds. If near-adiabatic compression is achieved, the final plasma of 1e26 m-3, 10 keV, 700 T, 2 cm radius, confined for 10 μs, would produce interesting fusion energy gain. General Fusion (GF) is developing an acoustic compression system that uses pneumatic pistons to focus a shock wave on the CT plasma at the center of a 3 m diameter sphere filled with liquid lead-lithium. A low-cost driver, straightforward heat extraction, a good tritium breeding ratio, and excellent neutron protection could lead to a practical power plant. GF (65 employees) has an active plasma R&D program that includes both full-scale and reduced-scale plasma experiments and simulation of both. Although acoustically driven compression of full-scale plasmas is the end goal, present compression studies use reduced-scale plasmas and chemically accelerated aluminum liners. We will review results from our plasma target development, motivate and review the results of dynamic compression field tests, and briefly describe the work to date on the acoustic driver front.
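The quoted initial and final states are mutually consistent under simple near-adiabatic scaling. A minimal sketch of that check, assuming an ideal monatomic plasma (gamma = 5/3), density scaling as 1/V, and frozen-in magnetic flux (B scaling as 1/r^2) — assumptions of this sketch, not statements from the abstract:

```python
# Check near-adiabatic MTF compression scaling against the quoted numbers.
# Assumptions: ideal monatomic plasma (gamma = 5/3), n ~ 1/V, flux-frozen B ~ 1/r^2.
GAMMA = 5.0 / 3.0

n0, T0_eV, B0_T, r0_cm = 1e23, 100.0, 7.0, 20.0  # quoted initial state
vol_ratio = 1000.0                                # quoted volume compression

lin_ratio = vol_ratio ** (1.0 / 3.0)              # linear compression: 10x
n1 = n0 * vol_ratio                               # density ~ 1/V
T1_eV = T0_eV * vol_ratio ** (GAMMA - 1.0)        # T * V^(gamma-1) = const
B1_T = B0_T * lin_ratio ** 2                      # flux conservation through cross-section
r1_cm = r0_cm / lin_ratio

print(n1, T1_eV, B1_T, r1_cm)  # ~1e26 m^-3, ~10 keV (10000 eV), ~700 T, ~2 cm
```

All four final-state values quoted in the abstract (1e26 m-3, 10 keV, 700 T, 2 cm) fall out of the three scaling laws, which is what "near adiabatic compression" buys.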

  6. The Change of First-flowering Date over South Korea Projected from Downscaled IPCC AR5 Simulation: Peach and Pear

    NASA Astrophysics Data System (ADS)

    Ahn, J. B.; Hur, J.

    2014-12-01

    The variations in the first-flowering date (FFD) of peach (Prunus persica) and pear (Pyrus pyrifolia) under future climate change in South Korea are investigated using simulations obtained from five models of phase five of the Coupled Model Intercomparison Project (CMIP5). For the study, daily temperature simulations under the Historical (1986-2005) and the RCP4.5 and RCP8.5 (2071-2090) scenarios are statistically downscaled to 50 peach and pear FFD (FFDpeach and FFDpear, respectively) observation sites over South Korea. The number of days transformed to standard temperature (DTS) method is selected as the phenological model and applied to the simulations to estimate FFDpeach and FFDpear over South Korea, because of its superior performance for the target plants and region compared with the growing degree days (GDD) and chill days (CD) methods. In the analysis, mean temperatures for early spring (February to April) over South Korea in 2090 under the RCP4.5 and 8.5 scenarios are projected to increase by 1.9 K and 3.3 K, respectively. Among the early spring months of February to April, February shows the largest temperature increase, of 2.1 K and 3.7 K under the RCP4.5 and 8.5 scenarios, respectively. The increased temperature during February and March accelerates plant growth and thereby advances FFDpeach by 7.0 and 12.7 days and FFDpear by 6.1 and 10.7 days, respectively. These results imply that the present mid-April flowering of peach and pear will have advanced to late March or early April by the end of this century. Acknowledgements: This work was carried out with the support of the Rural Development Administration Cooperative Research Program for Agriculture Science and Technology Development under Grant Project No. PJ009953, Republic of Korea.
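For context, the growing degree days (GDD) baseline mentioned above accumulates daily mean temperature above a base threshold; a warmer late winter reaches a crop's heat-sum threshold sooner, which is the mechanism behind the projected earlier flowering. A minimal sketch — the 5 °C base temperature and the example daily temperatures are illustrative assumptions, not values from the study:

```python
def growing_degree_days(t_max, t_min, t_base=5.0):
    """Accumulate daily heat units: daily mean temperature above a base threshold.

    t_max, t_min: sequences of daily maximum/minimum temperatures (deg C).
    """
    return sum(max(0.0, (hi + lo) / 2.0 - t_base) for hi, lo in zip(t_max, t_min))

# The same three days, with a uniform +2 K warming, accumulate heat units
# several times faster -- so a fixed flowering threshold is reached earlier.
cool = growing_degree_days([8, 9, 10], [0, 1, 2])
warm = growing_degree_days([10, 11, 12], [2, 3, 4])
print(cool, warm)  # 1.0 6.0
```

The DTS method used in the study weights each day by an exponential temperature response rather than this linear excess, but the accumulate-until-threshold logic is the same.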

  7. Control Room Training for the Hyper-X Project Utilizing Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Lux-Baumann, Jesica; Dees, Ray; Fratello, David

    2006-01-01

    The NASA Dryden Flight Research Center flew two Hyper-X research vehicles and achieved hypersonic speeds over the Pacific Ocean in March and November 2004. To train the flight and mission control room crew, the NASA Dryden simulation capability was utilized to generate telemetry and radar data, which was used in nominal and emergency mission scenarios. During these control room training sessions personnel were able to evaluate and refine data displays, flight cards, mission parameter allowable limits, and emergency procedure checklists. Practice in the mission control room ensured that all primary and backup Hyper-X staff were familiar with the nominal mission and knew how to respond to anomalous conditions quickly and successfully. This report describes the technology in the simulation environment and the Mission Control Center, the need for and benefit of control room training, and the rationale and results of specific scenarios unique to the Hyper-X research missions.

  8. GPS Radiation Measurements: Instrument Modeling and Simulation (Project w14_gpsradiation)

    SciTech Connect

    Sullivan, John P.

    2016-11-29

    The following topics are covered: electron response simulations and typical calculated response. Monte Carlo calculations of the response of future charged particle instruments (dosimeters) intended to measure the flux of charged particles in space were performed. The electron channels are called E1-E11, each of which is intended to detect a different range of electron energies. These instruments are on current and future GPS satellites.

  9. Simulation of Plasma Jet Merger and Liner Formation within the PLX-α Project

    NASA Astrophysics Data System (ADS)

    Samulyak, Roman; Chen, Hsin-Chiang; Shih, Wen; Hsu, Scott

    2015-11-01

    Detailed numerical studies of the propagation and merger of high-Mach-number argon plasma jets and the formation of plasma liners have been performed using the newly developed method of Lagrangian particles (LP). The LP method significantly improves the accuracy and mathematical rigor of common particle-based numerical methods such as smoothed particle hydrodynamics while preserving their main advantages compared with grid-based methods. A brief overview of the LP method will be presented. The Lagrangian particle code implements the main relevant physics models, such as an equation of state for argon undergoing atomic physics transformations, radiation losses in the optically thin limit, and heat conduction. Simulations of the merger of two plasma jets are compared with experimental data from past PLX experiments. The simulations quantify the effect of oblique shock waves, ionization, and radiation processes on the jet merger process. Results of preliminary simulations of future PLX-α experiments involving the ~π/2 solid-angle plasma-liner configuration with 9 guns will also be presented. Partially supported by ARPA-E's ALPHA program.

  10. Project DANA: multiagent simulation and fuzzy rules for international crisis detection--can we forestall wars?

    NASA Astrophysics Data System (ADS)

    Cozien, Roger F.; Colautti, Andre

    1999-11-01

    Assessing the conflict potential of an international situation is very important in the exercise of Defence duties. Mastering a formal method that allows the detection of risky situations is a necessity. Our aim was to develop a highly operational method, twinned with a computer simulation tool, that can explore a huge number of potential war zones and test many hypotheses with high accuracy within a reasonable time. We use a multi-agent system to describe an international situation. The agent coding allows us to give computational existence to very abstract concepts such as a government, the economy, the armed forces, and foreign policy. We give these agents fuzzy rules of behavior; those rules represent human expertise. To benchmark our model, we used the Falklands War for our first simulations. The main distortion between the historical reality and our simulations comes from our fuzzy controller, which causes a great loss of information. We are going to replace it with a more efficient one in order to fit the historical reality. Agent coding with fuzzy rules allows human experts to stay close to their statements and expertise, and they can handle this kind of tool quite easily.
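As an illustration of what a fuzzy behavior rule looks like in code, here is a minimal sketch of a single Mamdani-style rule with triangular membership functions. The variables ("tension", "mobilization") and all breakpoints are hypothetical, invented for illustration; they are not taken from the DANA model:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 at a and c, peaking at 1 when x == b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical expert rule: IF diplomatic tension is HIGH AND troop
# mobilization is HIGH THEN conflict risk is HIGH (min = fuzzy AND).
def conflict_risk(tension, mobilization):
    tension_high = tri(tension, 0.4, 1.0, 1.6)
    mobilization_high = tri(mobilization, 0.4, 1.0, 1.6)
    return min(tension_high, mobilization_high)

print(conflict_risk(0.9, 0.7))  # ~0.5: moderately risky situation
```

The appeal for domain experts, as the abstract notes, is that each rule reads like a natural-language statement while still driving the simulation numerically.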

  11. Simulation of disruptions on C-Mod in support of the new outer divertor project

    NASA Astrophysics Data System (ADS)

    Poli, F.; Kessel, C.; Titus, P.; Zhang, H.; Doody, J.; Granetz, R.; Lipschultz, B.

    2011-10-01

    Disruptions in C-Mod lead to large forces on structures inside the vacuum vessel and can be grouped in two classes, depending on whether they begin with a thermal quench (midplane disruptions) or not (vertical displacement events, VDEs). VDEs induce the largest currents in the lower divertor, which is being redesigned to be toroidally continuous and to allow operation at high temperatures (up to 600°C). Both types of disruptions have been simulated with TSC, and the vector potential has been integrated in the ANSYS code (ANSYS® Multiphysics, Release 12.1) to calculate magnetic fields, induced currents in the structures of interest, and forces. These forces are then used to calculate stress and deformation in the part. The TSC simulations are adjusted (thermal quench time, halo temperature and width, etc.) to match the plasma characteristics as closely as possible to experiments. The results of these simulations will be shown, and the dependence of disruption time scales and characteristics on these plasma parameters and the new outer divertor structures will be discussed. This work is supported by the US Department of Energy under DE-AC02-CH0911466 and DE-FC02-99ER54512.

  12. The Living Heart Project: A robust and integrative simulator for human heart function.

    PubMed

    Baillargeon, Brian; Rebelo, Nuno; Fox, David D; Taylor, Robert L; Kuhl, Ellen

    2014-11-01

    The heart is not only our most vital, but also our most complex organ: Precisely controlled by the interplay of electrical and mechanical fields, it consists of four chambers and four valves, which act in concert to regulate its filling, ejection, and overall pump function. While numerous computational models exist to study either the electrical or the mechanical response of its individual chambers, the integrative electro-mechanical response of the whole heart remains poorly understood. Here we present a proof-of-concept simulator for a four-chamber human heart model created from computed tomography and magnetic resonance images. We illustrate the governing equations of excitation-contraction coupling and discretize them using a single, unified finite element environment. To illustrate the basic features of our model, we visualize the electrical potential and the mechanical deformation across the human heart throughout its cardiac cycle. To compare our simulation against common metrics of cardiac function, we extract the pressure-volume relationship and show that it agrees well with clinical observations. Our prototype model allows us to explore and understand the key features, physics, and technologies to create an integrative, predictive model of the living human heart. Ultimately, our simulator will open opportunities to probe landscapes of clinical parameters, and guide device design and treatment planning in cardiac diseases such as stenosis, regurgitation, or prolapse of the aortic, pulmonary, tricuspid, or mitral valve.
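The pressure-volume comparison mentioned above boils down to integrating around the simulated PV loop; the area it encloses is the ventricular stroke work. A minimal sketch of extracting that area from sampled loop points — the rectangular loop values below are textbook-style illustrations, not data from the paper:

```python
def stroke_work(pressures, volumes):
    """Area enclosed by a sampled, closed pressure-volume loop.

    Trapezoid rule around the polygon; output units follow the inputs
    (e.g. mmHg * mL). Points must be listed in order around the loop.
    """
    n = len(pressures)
    area = 0.0
    for i in range(n):
        j = (i + 1) % n  # wrap around to close the loop
        area += 0.5 * (pressures[i] + pressures[j]) * (volumes[j] - volumes[i])
    return abs(area)

# Idealized rectangular loop: filling at 10 mmHg, ejection at 120 mmHg,
# volume swinging between 50 and 120 mL -> 110 mmHg * 70 mL = 7700 mmHg*mL.
print(stroke_work([10, 10, 120, 120], [50, 120, 120, 50]))  # 7700.0
```

A simulated loop would supply many ordered (P, V) samples per cycle instead of four corners, but the integral is the same.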

  13. The Living Heart Project: A robust and integrative simulator for human heart function

    PubMed Central

    Baillargeon, Brian; Rebelo, Nuno; Fox, David D.; Taylor, Robert L.; Kuhl, Ellen

    2014-01-01

    The heart is not only our most vital, but also our most complex organ: Precisely controlled by the interplay of electrical and mechanical fields, it consists of four chambers and four valves, which act in concert to regulate its filling, ejection, and overall pump function. While numerous computational models exist to study either the electrical or the mechanical response of its individual chambers, the integrative electro-mechanical response of the whole heart remains poorly understood. Here we present a proof-of-concept simulator for a four-chamber human heart model created from computed tomography and magnetic resonance images. We illustrate the governing equations of excitation-contraction coupling and discretize them using a single, unified finite element environment. To illustrate the basic features of our model, we visualize the electrical potential and the mechanical deformation across the human heart throughout its cardiac cycle. To compare our simulation against common metrics of cardiac function, we extract the pressure-volume relationship and show that it agrees well with clinical observations. Our prototype model allows us to explore and understand the key features, physics, and technologies to create an integrative, predictive model of the living human heart. Ultimately, our simulator will open opportunities to probe landscapes of clinical parameters, and guide device design and treatment planning in cardiac diseases such as stenosis, regurgitation, or prolapse of the aortic, pulmonary, tricuspid, or mitral valve. PMID:25267880

  14. Computational problems in magnetic fusion research

    SciTech Connect

    Killeen, J.

    1981-08-31

    Numerical calculations have had an important role in fusion research since its beginning, but the application of computers to plasma physics has advanced rapidly in the last few years. One reason for this is the increasing sophistication of the mathematical models of plasma behavior, and another is the increased speed and memory of the computers which made it reasonable to consider numerical simulation of fusion devices. The behavior of a plasma is simulated by a variety of numerical models. Some models used for short times give detailed knowledge of the plasma on a microscopic scale, while other models used for much longer times compute macroscopic properties of the plasma dynamics. The computer models used in fusion research are surveyed. One of the most active areas of research is in time-dependent, three-dimensional, resistive magnetohydrodynamic models. These codes are reviewed briefly.

  15. The WASCAL regional climate simulations for West Africa - how to add value to existing climate projections

    NASA Astrophysics Data System (ADS)

    Arnault, J.; Heinzeller, D.; Klein, C.; Dieng, D.; Smiatek, G.; Bliefernicht, J.; Sylla, M. B.; Kunstmann, H.

    2015-12-01

    With climate change being one of the most severe challenges to rural Africa in the 21st century, West Africa is facing an urgent need to develop effective adaptation and mitigation measures to protect its constantly growing population. WASCAL (West African Science Service Center on Climate Change and Adapted Land Use) is a large-scale research-focused program designed to enhance the resilience of human and environmental systems to climate change and increased variability. An integral part of its climate services is the provisioning of a new set of high resolution, ensemble-based regional climate change scenarios for the region of West Africa. In this contribution, we present the overall concept of the WASCAL regional climate projections and provide information on the dissemination of the data. We discuss the model performance over the validation period for two of the three regional climate models employed, the Weather Research and Forecasting (WRF) model and the Consortium for Small-scale Modeling model in Climate Mode (COSMO-CLM), and give details about a novel precipitation database used to verify the models. Particular attention is paid to the representation of the dynamics of the West African Summer Monsoon and to the added value of our high resolution models over existing data sets. We further present results on the climate change signal obtained from the WRF model runs for the periods 2020-2050 and 2070-2100 and compare them to current state-of-the-art projections from the CORDEX project. As an example, the figure shows the different climate change signals obtained for the total annual rainfall with respect to the 1980-2010 mean (WRF-E: WASCAL 12km high-resolution run MPI-ESM + WRFV3.5.1, CORDEX-E: 50km medium-resolution run MPI-ESM + RCA4, CORDEX-G: 50km medium-resolution run GFDL-ESM + RCA4).

  16. Analyzing and Projecting U.S. Wildfire Potential Based on NARCCAP Regional Climate Simulations

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Mearns, L. O.

    2012-12-01

    Wildfires usually ignite and spread under hot, dry, and windy conditions. Wildfires, especially catastrophic mega-fires, have increased in recent decades in the United States and other parts of the world. Among the converging factors were extreme weather events such as extended drought. Furthermore, climate is projected to become warmer worldwide and drier, with more frequent droughts, in many subtropical and mid-latitude regions including parts of the U.S., due to the greenhouse effect. As a result, wildfires are expected to increase in the future. This study analyzes current features and projects future trends of wildfire potential in the continental United States. Fire potential is measured by fire indices including the Keetch-Byram Drought Index and the Fosberg Fire Weather Index. The meteorological data used to calculate the fire indices are the dynamical downscaling produced by the North American Regional Climate Change Assessment Program (NARCCAP). Current fire potential generally increases from the east coast to the west coast and from the cool to the warm season. Fire potential has large seasonal and inter-annual variability and spatial connections. Fire potential has shown overall increasing trends in recent decades. The trends are projected to continue this century due to the greenhouse effect. Future fire potential will increase significantly in the Rocky Mountains in all seasons and in the Southeast during summer and autumn. Future climate change will also reduce the windows for prescribed burning, which is one of the forest management tools for reducing wildfire risks. The research results are expected to provide useful information for assessing the ecological, environmental, and social impacts of future wildfires and for developing mitigation strategies.
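Of the two indices, the Fosberg Fire Weather Index has a compact closed form; a sketch under its standard formulation (moisture damping times a wind term) is below. This is my paraphrase of the published formula, not code from the study; computing the equilibrium moisture content m from temperature and relative humidity involves piecewise fits and is deliberately left as an input here:

```python
def fosberg_ffwi(wind_mph, emc_percent):
    """Fosberg Fire Weather Index (standard formulation, assumed here).

    wind_mph: 10-m wind speed in miles per hour.
    emc_percent: equilibrium fuel moisture content in percent (0-30).
    """
    m = emc_percent / 30.0
    eta = 1.0 - 2.0 * m + 1.5 * m ** 2 - 0.5 * m ** 3  # moisture damping term
    return eta * (1.0 + wind_mph ** 2) ** 0.5 / 0.3002

# Hot/dry/windy conditions score far higher than cool/moist/calm ones:
print(fosberg_ffwi(20.0, 5.0))  # dry, windy
print(fosberg_ffwi(5.0, 25.0))  # moist, calm
```

The structure makes the abstract's point concrete: the index rises with wind and falls steeply as fuel moisture approaches saturation (it is exactly zero at 30% moisture), so a warmer, drier, windier climate pushes it up.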

  17. Intermediates and kinetics of membrane fusion.

    PubMed Central

    Bentz, J

    1992-01-01

    microscopic activation energies used to simulate the data. Overall, these results clearly show that the intermediates of protein mediated fusion can be studied only by using assays sensitive to the formation of each proposed intermediate. PMID:1420890

  18. Acoustic project for installation of motor generator group by means of computer simulation

    NASA Astrophysics Data System (ADS)

    Ferreira, Jose C.; Zannin, Paulo T.

    2004-05-01

    This work presents an acoustical project for the installation of a motor generator group for electricity in a hotel by means of computer modeling. The noise levels at the site have been obtained without the motor generator group, and via computer modeling it has been predicted what these levels would be after the installation of the equipment. A possible solution to mitigate the noise impact the equipment would cause on the neighborhood has been indicated, and it has been predicted how the impact would be reduced after the implementation of this solution.
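Predicting the post-installation level comes down to the energetic (logarithmic) addition of the measured ambient level and the modeled generator contribution at each receiver point. A minimal sketch of that standard combination — the level values below are illustrative, not measurements from the study:

```python
import math

def combine_db(levels):
    """Energetic sum of sound pressure levels given in dB."""
    return 10.0 * math.log10(sum(10.0 ** (lvl / 10.0) for lvl in levels))

# Illustrative only: 55 dB ambient plus a 60 dB generator contribution.
print(combine_db([55.0, 60.0]))   # ~61.2 dB
# Sanity check: two equal sources always combine to +3 dB.
print(combine_db([60.0, 60.0]))   # ~63.0 dB
```

Note the asymmetry: the louder source dominates, which is why mitigating the generator's contribution (rather than the ambient) is what brings the total back down.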

  19. AeroCom INSITU Project: Comparison of Aerosol Optical Properties from In-situ Surface Measurements and Model Simulations

    NASA Astrophysics Data System (ADS)

    Schmeisser, L.; Andrews, E.; Schulz, M.; Fiebig, M.; Zhang, K.; Randles, C. A.; Myhre, G.; Chin, M.; Stier, P.; Takemura, T.; Krol, M. C.; Bian, H.; Skeie, R. B.; da Silva, A. M., Jr.; Kokkola, H.; Laakso, A.; Ghan, S.; Easter, R. C.

    2015-12-01

    AeroCom, an open international collaboration of scientists seeking to improve global aerosol models, recently initiated a project comparing model output to in-situ, surface-based measurements of aerosol optical properties. The model/measurement comparison project, called INSITU, aims to evaluate the performance of a suite of AeroCom aerosol models against site-specific observational data in order to inform iterative improvements to model aerosol modules. Surface in-situ data have the unique property of being traceable to physical standards, which is a major asset in accomplishing the overarching goal of improving the accuracy of aerosol processes and the predictive capability of global climate models. The INSITU project examines how well models reproduce aerosol climatologies on a variety of time scales, aerosol characteristics and behaviors (e.g., aerosol persistence and the systematic relationships between aerosol optical properties), and aerosol trends. Though INSITU is a multi-year endeavor, preliminary phases of the analysis, using GOCART and other models participating in this AeroCom project, show substantial model biases in absorption and scattering coefficients compared to surface measurements, though the sign and magnitude of the bias vary with location and optical property. Spatial patterns in the biases highlight model weaknesses, e.g., the inability of models to properly simulate aerosol characteristics at sites with complex topography (see Figure 1). Additionally, differences in modeled and measured systematic variability of aerosol optical properties suggest that some models are not accurately capturing specific aerosol co-dependencies, for example, the tendency of in-situ surface single scattering albedo to decrease with decreasing aerosol extinction coefficient. This study elucidates specific problems with current aerosol models and suggests additional model runs and perturbations that could further evaluate the discrepancies between measured and modeled
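The single scattering albedo mentioned above is defined directly from the two measured in-situ quantities (scattering and absorption coefficients), which is what makes it a convenient model-measurement comparison variable. A minimal sketch — the coefficient values are illustrative, in arbitrary but matching units (e.g. inverse megameters):

```python
def single_scattering_albedo(scattering, absorption):
    """Fraction of extinction due to scattering: SSA = s / (s + a)."""
    return scattering / (scattering + absorption)

# Illustrative: a strongly scattering (dust-like) aerosol vs. a sootier mix.
print(single_scattering_albedo(50.0, 2.0))   # ~0.96
print(single_scattering_albedo(20.0, 5.0))   # 0.8
```

Since SSA is a ratio, a model can get both coefficients wrong and still match it, which is why the project compares absorption and scattering biases separately as well.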

  20. Cell fusion and nuclear fusion in plants.

    PubMed

    Maruyama, Daisuke; Ohtsu, Mina; Higashiyama, Tetsuya

    2016-12-01

    Eukaryotic cells are surrounded by a plasma membrane and have a large nucleus containing the genomic DNA, which is enclosed by a nuclear envelope consisting of the outer and inner nuclear membranes. Although these membranes maintain the identity of cells, they sometimes fuse to each other, such as to produce a zygote during sexual reproduction or to give rise to other characteristically polyploid tissues. Recent studies have demonstrated that the mechanisms of plasma membrane or nuclear membrane fusion in plants are shared to some extent with those of yeasts and animals, despite the unique features of plant cells including thick cell walls and intercellular connections. Here, we summarize the key factors in the fusion of these membranes during plant reproduction, and also focus on "non-gametic cell fusion," which was thought to be rare in plant tissue, in which each cell is separated by a cell wall.

  1. A methodology for hard/soft information fusion in the condition monitoring of aircraft

    NASA Astrophysics Data System (ADS)

    Bernardo, Joseph T.

    2013-05-01

    Condition-based maintenance (CBM) refers to the philosophy of performing maintenance when the need arises, based upon indicators of deterioration in the condition of the machinery. Traditionally, CBM involves equipping machinery with electronic sensors that continuously monitor components and collect data for analysis. The addition of the multisensory capability of human cognitive functions (i.e., sensemaking, problem detection, planning, adaptation, coordination, naturalistic decision making) to traditional CBM may create a fuller picture of machinery condition. Cognitive systems engineering techniques provide an opportunity to utilize a dynamic resource: people acting as soft sensors. The literature is extensive on techniques to fuse data from electronic sensors, but little work exists on fusing data from humans with that from electronic sensors (i.e., hard/soft fusion). The purpose of my research is to explore, observe, investigate, analyze, and evaluate the fusion of pilot and maintainer knowledge, experiences, and sensory perceptions with digital maintenance resources. Hard/soft information fusion has the potential to increase problem detection capability, improve flight safety, and increase mission readiness. This proposed project consists of the creation of a methodology based upon the Living Laboratories framework, a research methodology built upon cognitive engineering principles [1]. This study performs a critical assessment of the concept, which will support development of activities to demonstrate hard/soft information fusion in operationally relevant scenarios of aircraft maintenance. It consists of fieldwork and knowledge elicitation to inform a simulation and a prototype.
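One common way to combine a "hard" sensor indication with a "soft" human report is independent-evidence Bayesian fusion in odds form. The sketch below is a generic illustration of that idea, not the methodology proposed in the abstract; the fault scenario, prior, and likelihood ratios are all hypothetical:

```python
def fuse_evidence(prior, likelihood_ratios):
    """Fuse independent evidence sources about a binary hypothesis.

    posterior odds = prior odds * product of per-source likelihood ratios,
    where each ratio is P(evidence | fault) / P(evidence | no fault).
    """
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Hypothetical gearbox-fault diagnosis: 5% prior; a vibration sensor reading
# (LR = 4, hard evidence) and a pilot's "felt a shudder on rollout" report
# (LR = 3, soft evidence) each multiply the odds.
print(fuse_evidence(0.05, [4.0, 3.0]))  # ~0.39: worth pulling the part
```

The appeal for hard/soft fusion is that a calibrated human report enters the calculation through exactly the same multiplicative channel as an electronic sensor; the hard part, which the proposed methodology addresses through knowledge elicitation, is assigning credible likelihood ratios to human observations.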

  2. The Ohio River Valley CO2 Storage Project AEP Mountaineer Plant, West Virginia Numerical Simulation and Risk Assessment Report

    SciTech Connect

    Neeraj Gupta

    2008-03-31

    A series of numerical simulations of carbon dioxide (CO2) injection were conducted as part of a program to assess the potential for geologic sequestration in deep geologic reservoirs (the Rose Run and Copper Ridge formations) at the American Electric Power (AEP) Mountaineer Power Plant outside of New Haven, West Virginia. The simulations were executed using the H2O-CO2-NaCl operational mode of the Subsurface Transport Over Multiple Phases (STOMP) simulator (White and Oostrom, 2006). The objective of the Rose Run formation modeling was to predict CO2 injection rates using data from the core analysis conducted on the samples. A systematic screening procedure was applied to the Ohio River Valley CO2 storage site utilizing the Features, Events, and Processes (FEP) database for geological storage of CO2 (Savage et al., 2004). The objective of the screening was to identify potential risk categories for the long-term geological storage of CO2 at the Mountaineer Power Plant in New Haven, West Virginia. Over 130 FEPs in seven main classes were assessed for the project, based on site characterization information gathered in a geological background study, testing in a deep well drilled on the site, and general site conditions. In evaluating the database, it was apparent that many of the items were not applicable to the Mountaineer site based on its geologic framework and environmental setting. Nine FEPs were identified for further consideration for the site. These FEPs generally fell into categories related to variations in subsurface geology, well completion materials, and the behavior of CO2 in the subsurface. Results from the screening were used to provide guidance on injection system design, developing a monitoring program, performing reservoir simulations, and other risk assessment efforts. Initial work indicates that the significant FEPs may be accounted for by focusing the storage program on these potential issues. The

  3. Controlled Nuclear Fusion.

    ERIC Educational Resources Information Center

    Glasstone, Samuel

    This publication is one of a series of information booklets for the general public published by The United States Atomic Energy Commission. Among the topics discussed are: Importance of Fusion Energy; Conditions for Nuclear Fusion; Thermonuclear Reactions in Plasmas; Plasma Confinement by Magnetic Fields; Experiments With Plasmas; High-Temperature…

  4. Antiproton catalyzed fusion

    SciTech Connect

    Morgan, D.L. Jr.; Perkins, L.J.; Haney, S.W.

    1995-05-15

    Because of the potential application to power production, it is important to investigate a wide range of possible means to achieve nuclear fusion, even those that may appear initially to be infeasible. In antiproton catalyzed fusion, the negative antiproton shields the repulsion between the positively charged nuclei of hydrogen isotopes, thus allowing a much higher level of penetration through the repulsive Coulomb barrier and thereby greatly enhancing the fusion cross section. Because of their more compact wave function, the more massive antiprotons offer considerably more shielding than do negative muons. The effects of the shielding on fusion cross sections are most pronounced at low energies. If the antiproton could exist in the ground state with a nucleus for a sufficient time without annihilating, the fusion cross sections would be so enhanced that, at room-temperature energies, values up to about 1,000 barns (that for d+t) would be possible. Unfortunately, the cross section for antiproton annihilation with the incoming nucleus is even higher. A model that provides an upper bound on the ratio of the fusion to annihilation cross sections for all relevant energies indicates that each antiproton will catalyze no more than about one fusion. Because the energy required to make one antiproton greatly exceeds the fusion energy that is released, this level of catalysis is far from adequate for power production.
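The scale of the Coulomb-barrier suppression that shielding must overcome can be illustrated with the Gamow penetration factor, P ~ exp(-sqrt(E_G/E)) for bare (unshielded) nuclei. A sketch using the textbook Gamow energy of roughly 1.18 MeV for d+t (an assumed standard value, not a number from this report; the thermal energies are illustrative):

```python
import math

# Gamow penetration factor for bare nuclei: P ~ exp(-sqrt(E_G / E)).
E_GAMOW_DT_EV = 1.18e6  # assumed textbook Gamow energy for d+t, ~1.18 MeV

def gamow_exponent(energy_ev):
    """Magnitude of the bare-nucleus tunneling exponent sqrt(E_G / E)."""
    return math.sqrt(E_GAMOW_DT_EV / energy_ev)

# At room-temperature energies (~0.025 eV) the exponent is ~7000, i.e.
# P ~ exp(-7000): bare-nucleus fusion is utterly negligible without shielding.
print(gamow_exponent(0.025))
# At ~10 keV (magnetic-fusion temperatures) it drops to ~11.
print(gamow_exponent(1.0e4))
```

A bound negative particle that cancels most of the Coulomb repulsion effectively removes this exponential suppression, which is why the abstract can speak of hundreds-of-barns cross sections at room temperature; the catch, as the abstract explains, is that annihilation wins the competition.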

  5. Fusion Science Education Outreach

    NASA Astrophysics Data System (ADS)

    Danielson, C. A.; DIII-D Education Group

    1996-11-01

    This presentation will focus on education outreach activities at General Atomics that have been expanded to include the general population on science education with a focus on fusion energy. Outreach materials are distributed upon request both nationally and internationally. These materials include a notebook containing copies of DIII-D tour panels, fusion poster, new fusion energy video, new fusion energy brochure, and the electromagnetic spectrum curriculum. The 1996 Fusion Forum (held in the House Caucus Room) included a student/teacher lunch with Energy Secretary Hazel O'Leary and a private visit to the Forum exhibits. The continuing partnership with Kearny High School includes lectures, job shadowing, internship, equipment donations and an award-winning electric car-racing program. Development of distribution by CD of the existing interactive fusion energy kiosk and a virtual reality tour of the DIII-D facility are underway. The DIII-D fusion education WWW site includes e-mail addresses to "Ask the Wizard," and/or receive GA's outreach materials. Steve Rodecker, a local science teacher, aided by DIII-D fusion staff, won his second Tapestry Award; he also was named the "1995 National Science Teacher of the Year" and will be present to share his experiences with the DIII-D educational outreach program.

  6. Two Horizons of Fusion

    ERIC Educational Resources Information Center

    Lo, Mun Ling; Chik, Pakey Pui Man

    2016-01-01

    In this paper, we aim to differentiate the internal and external horizons of "fusion." "Fusion" in the internal horizon relates to the structure and meaning of the object of learning as experienced by the learner. It clarifies the interrelationships among an object's critical features and aspects. It also illuminates the…

  7. Fusion Power Deployment

    SciTech Connect

    J.A. Schmidt; J.M. Ogden

    2002-02-06

    Fusion power plants could be part of a future portfolio of non-carbon dioxide producing energy supplies such as wind, solar, biomass, advanced fission power, and fossil energy with carbon dioxide sequestration. In this paper, we discuss key issues that could impact fusion energy deployment during the last half of this century. These include geographic issues such as resource availability, scale issues, energy storage requirements, and waste issues. The resource needs and waste production associated with fusion deployment in the U.S. should not pose serious problems. One important feature of fusion power is the fact that a fusion power plant should be locatable within most local or regional electrical distribution systems. For this reason, fusion power plants should not increase the burden of long distance power transmission to our distribution system. In contrast to fusion power, regional factors could play an important role in the deployment of renewable resources such as wind, solar and biomass or fossil energy with CO2 sequestration. We examine the role of these regional factors and their implications for fusion power deployment.

  8. InFusion: Advancing Discovery of Fusion Genes and Chimeric Transcripts from Deep RNA-Sequencing Data

    PubMed Central

    Okonechnikov, Konstantin; Imai-Matsushima, Aki; Seitz, Alexander; Meyer, Thomas F.; Garcia-Alcalde, Fernando

    2016-01-01

    Analysis of fusion transcripts has become increasingly important due to their link with cancer development. Since high-throughput sequencing approaches survey fusion events exhaustively, several computational methods for the detection of gene fusions from RNA-seq data have been developed. This kind of analysis, however, is complicated by native trans-splicing events, the splicing-induced complexity of the transcriptome, and biases and artefacts introduced in experiments and data analysis. There are a number of tools available for the detection of fusions from RNA-seq data; however, certain differences in specificity and sensitivity between commonly used approaches have been found. The ability to detect gene fusions of different types, including isoform fusions and fusions involving non-coding regions, has not been thoroughly studied yet. Here, we propose a novel computational toolkit called InFusion for fusion gene detection from RNA-seq data. InFusion introduces several unique features, such as discovery of fusions involving intergenic regions, and detection of anti-sense transcription in chimeric RNAs based on strand-specificity. Our approach demonstrates superior detection accuracy on simulated data and several public RNA-seq datasets. This improved performance was also evident when evaluating data from RNA deep-sequencing of two well-established prostate cancer cell lines. InFusion identified 26 novel fusion events that were validated in vitro, including alternatively spliced gene fusion isoforms and chimeric transcripts that include intergenic regions. The toolkit is freely available to download from http://bitbucket.org/kokonech/infusion. PMID:27907167
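The core signal behind fusion detection from paired-end RNA-seq, as the abstract describes, is reads whose evidence spans two distinct genes (or a gene and an intergenic region). A minimal sketch of that idea, using invented data structures rather than InFusion's actual internals:

```python
def fusion_supporting_pairs(read_pairs, gene_of):
    """Flag read pairs whose two mates align to different genes.

    `read_pairs` is an iterable of (mate1_locus, mate2_locus) tuples and
    `gene_of` maps a locus to a gene name, or None for intergenic
    regions (which InFusion also considers).  These names and the data
    model are illustrative assumptions, not InFusion's API.
    """
    supporting = []
    for m1, m2 in read_pairs:
        g1, g2 = gene_of(m1), gene_of(m2)
        if g1 != g2:  # discordant pair: candidate fusion evidence
            supporting.append((m1, m2, g1, g2))
    return supporting
```

Real tools must additionally filter trans-splicing, alignment artefacts, and library biases, which is where the differences in specificity and sensitivity noted above arise.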

  9. Using Discrete Event Simulation to predict KPI's at a Projected Emergency Room.

    PubMed

    Concha, Pablo; Neriz, Liliana; Parada, Danilo; Ramis, Francisco

    2015-01-01

    Discrete Event Simulation (DES) is a powerful tool in the design of clinical facilities. DES enables facilities to be built or adapted to achieve expected Key Performance Indicators (KPIs), such as average waiting times by acuity, average stay times, and others. Our computational model was built and validated using expert judgment and supporting statistical data. One scenario studied resulted in a 50% decrease in the average patient cycle time compared to the original model, achieved mainly by modifying the patient attention model.
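The kind of KPI prediction described above can be illustrated with a minimal discrete-event model: patients arrive at random, queue for a limited number of treatment bays, and the simulation reports the average waiting time. All parameter values and names below are illustrative assumptions, not the authors' model:

```python
import heapq
import random

def simulate_er(n_patients=1000, mean_interarrival=10.0,
                mean_service=25.0, servers=3, seed=42):
    """Toy DES of an emergency room: exponential interarrival and
    service times, `servers` treatment bays, FIFO queueing.
    Returns the average waiting time (one example KPI)."""
    rng = random.Random(seed)
    free_at = [0.0] * servers          # time each bay next becomes free
    heapq.heapify(free_at)
    t, waits = 0.0, []
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)  # next arrival
        bay_free = heapq.heappop(free_at)              # earliest free bay
        start = max(t, bay_free)                       # wait if bay busy
        waits.append(start - t)
        heapq.heappush(free_at, start + rng.expovariate(1.0 / mean_service))
    return sum(waits) / len(waits)
```

Rerunning such a model with a modified attention model (e.g., more bays or different routing) is how before/after KPI comparisons like the 50% cycle-time reduction are obtained.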

  10. Simulation of wave interactions with MHD

    SciTech Connect

    Batchelor, Donald B; Abla, G; Bateman, Glenn; Bernholdt, David E; Berry, Lee A; Bonoli, P.; Bramley, R; Breslau, J.; Chance, M.; Chen, J.; Choi, M.; Elwasif, Wael R; Fu, GuoYong; Harvey, R. W.; Jaeger, Erwin Frederick; Jardin, S. C.; Jenkins, T; Keyes, David E; Klasky, Scott A; Kruger, Scott; Ku, Long-Poe; Lynch, Vickie E; McCune, Douglas; Ramos, J.; Schissel, D.; Schnack; Wright, J.

    2008-07-01

    The broad scientific objectives of the SWIM (Simulation of Wave Interaction with MHD) project are twofold: (1) improve our understanding of the interactions that both radio frequency (RF) wave and particle sources have on extended-MHD phenomena, and substantially improve our capability for predicting and optimizing the performance of burning plasmas in devices such as ITER; and (2) develop an integrated computational system for treating multiphysics phenomena with the required flexibility and extensibility to serve as a prototype for the Fusion Simulation Project. The Integrated Plasma Simulator (IPS) has been implemented. Presented here are initial physics results on RF effects on MHD instabilities in tokamaks as well as simulation results for tokamak discharge evolution using the IPS.

  11. Using simulated historical time series to prioritize fuel treatments on landscapes across the United States: The LANDFIRE prototype project

    USGS Publications Warehouse

    Keane, R.E.; Rollins, M.; Zhu, Z.-L.

    2007-01-01

    Canopy and surface fuels in many fire-prone forests of the United States have increased over the last 70 years as a result of modern fire exclusion policies, grazing, and other land management activities. The Healthy Forest Restoration Act and National Fire Plan establish a national commitment to reduce fire hazard and restore fire-adapted ecosystems across the USA. The primary index used to prioritize treatment areas across the nation is Fire Regime Condition Class (FRCC), computed as the departure of current conditions from historical fire and landscape conditions. This paper describes a process that uses an extensive set of ecological models to map FRCC from a departure statistic computed from simulated time series of historical landscape composition. This mapping process uses a data-driven, biophysical approach where georeferenced field data, biogeochemical simulation models, and spatial data libraries are integrated using spatial statistical modeling to map environmental gradients that are then used to predict vegetation and fuels characteristics over space. These characteristics are then fed into a landscape fire and succession simulation model to simulate a time series of historical landscape compositions, which are compared to the composition of current landscapes to compute departure and the FRCC values. Intermediate products from this process are then used to create ancillary vegetation, fuels, and fire regime layers that are useful in the eventual planning and implementation of fuel and restoration treatments at local scales. The complex integration of varied ecological models at different scales is described, and problems encountered during the implementation of this process in the LANDFIRE prototype project are addressed. © 2007 Elsevier B.V. All rights reserved.
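The departure statistic at the heart of this process compares the current landscape composition against the simulated historical range. One common formulation (illustrative here; not necessarily the exact LANDFIRE implementation) takes similarity as the summed class-by-class overlap of the two percent compositions, with departure as its complement:

```python
def frcc_departure(current, historical):
    """Departure (0-100) of a current landscape from a reference
    historical composition.

    `current` and `historical` map vegetation/succession classes to
    percent of landscape area (each summing to ~100).  Similarity is
    the summed per-class overlap min(current, historical); departure
    is 100 minus similarity.  Sketch only, assuming this common
    overlap formulation.
    """
    classes = set(current) | set(historical)
    similarity = sum(min(current.get(c, 0.0), historical.get(c, 0.0))
                     for c in classes)
    return 100.0 - similarity
```

A landscape identical to its historical composition scores 0 departure; one sharing no classes scores 100, and FRCC classes are then assigned by binning this value.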

  12. The BAHAMAS project: calibrated hydrodynamical simulations for large-scale structure cosmology

    NASA Astrophysics Data System (ADS)

    McCarthy, Ian G.; Schaye, Joop; Bird, Simeon; Le Brun, Amandine M. C.

    2017-03-01

    The evolution of the large-scale distribution of matter is sensitive to a variety of fundamental parameters that characterize the dark matter, dark energy, and other aspects of our cosmological framework. Since the majority of the mass density is in the form of dark matter that cannot be directly observed, to do cosmology with large-scale structure, one must use observable (baryonic) quantities that trace the underlying matter distribution in a (hopefully) predictable way. However, recent numerical studies have demonstrated that the mapping between observable and total mass, as well as the total mass itself, are sensitive to unresolved feedback processes associated with galaxy formation, motivating explicit calibration of the feedback efficiencies. Here, we construct a new suite of large-volume cosmological hydrodynamical simulations (called BAHAMAS, for BAryons and HAloes of MAssive Systems), where subgrid models of stellar and active galactic nucleus feedback have been calibrated to reproduce the present-day galaxy stellar mass function and the hot gas mass fractions of groups and clusters in order to ensure the effects of feedback on the overall matter distribution are broadly correct. We show that the calibrated simulations reproduce an unprecedentedly wide range of properties of massive systems, including the various observed mappings between galaxies, hot gas, total mass, and black holes, and represent a significant advance in our ability to mitigate the primary systematic uncertainty in most present large-scale structure tests.

  13. Fusion reactors for hydrogen production via electrolysis

    NASA Astrophysics Data System (ADS)

    Fillo, J. A.; Powell, J. R.; Steinberg, M.

    The decreasing availability of fossil fuels emphasizes the need to develop systems which will produce synthetic fuel to substitute for and supplement the natural supply. An important first step in the synthesis of liquid and gaseous fuels is the production of hydrogen. Thermonuclear fusion offers an inexhaustible source of energy for the production of hydrogen from water. Depending on design, electric generation efficiencies of 40 to 60% and hydrogen production efficiencies by high temperature electrolysis of 50 to 70% are projected for fusion reactors using high temperature blankets.
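If the electrolysis stage were driven purely by the generated electricity, the overall fusion-heat-to-hydrogen efficiency would be the product of the two stage efficiencies quoted above (high-temperature electrolysis also uses direct process heat, so the product is really a lower bound). A small arithmetic check on the quoted ranges:

```python
def hydrogen_chain_efficiency(electric_eff, electrolysis_eff):
    """Overall efficiency of a two-stage chain: fusion heat -> electricity
    -> hydrogen, assuming the electrolysis stage is driven entirely by
    the generated electricity.  Illustrative arithmetic only."""
    return electric_eff * electrolysis_eff

# Bounds implied by the abstract's ranges
# (40-60% electric generation, 50-70% electrolysis):
low = hydrogen_chain_efficiency(0.40, 0.50)   # 20% overall
high = hydrogen_chain_efficiency(0.60, 0.70)  # 42% overall
```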

  14. Fusion looks to the future - again

    SciTech Connect

    Waldrop, M.M.

    1984-11-02

    The $46 million budget cut in the US magnetic fusion program introduced a new approach that abandons the race to build a working power reactor in favor of a long-term emphasis on science, technology, and international cooperation. Administration policies which favor private funding for demonstration projects